21 changes: 21 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -6,6 +6,27 @@ The format is inspired by Keep a Changelog and follows semantic versioning.

## [Unreleased]

## [0.5.0] - 2026-03-15

### Added
- Higeco OEM transform spec, runtime transformer, and harness fixture (SEP-019 Phase 4).
- Regulatory event normalization contract with unified transform spec and `odse.regulatory` module.
- ERP enrichment JSON schemas: equipment register, equipment ID map, failure taxonomy, maintenance history, spare parts, procurement context, alarm frequency profile.
- ERP enrichment starter notebook with SCADA alarm triage workflow and visualizations.
- IFS Cloud ERP transform spec and alarm frequency computation spec.
- CLI interface (`odse transform`, `odse validate`) with JSON/CSV/Parquet output formats (SEP-015).
- Output serialization module (`odse.io`) with `to_json`, `to_csv`, `to_parquet`, `to_dataframe` (SEP-016).
- Batch validation helper (`odse.validate_batch`) with summary reporting (SEP-018).
- Generic CSV column-mapping transformer (`source="csv"`) with kW-to-kWh fallback (SEP-020).
- SDK usage examples and fixture library: basic transform, batch directory, generic CSV, full pipeline (SEP-019).
- 60-second quickstart guide with sample CSV (SEP-028).
- Sample data fixtures for tutorials and QA: Huawei 24h, Enphase 24h, SolarEdge 24h, generic historian 7d (SEP-037).
- Winter Storm Fern analysis notebook and SMA CSV demo.
- Test scaffold for ERP enrichment schemas (52 tests).

### Fixed
- Version mismatch: synced `__version__` to 0.4.0 and removed phantom dependencies (SEP-017).

## [0.4.0] - 2026-02-19

### Added
13 changes: 13 additions & 0 deletions spec/inverter-api-access.md
@@ -15,6 +15,7 @@ Last reviewed: 2026-02-09
- FIMER: `transforms/fimer-auroravision-api.yaml`
- Solis: `transforms/soliscloud-api.yaml`
- SolaX: `transforms/solaxcloud-api-v2.yaml`
- Higeco: `transforms/higeco-api.yaml`

## Runtime Support (Python `odse.transform`)

@@ -30,6 +31,7 @@ Last reviewed: 2026-02-09
| `fronius` | Implemented |
| `sma` | Implemented (normalized contract input) |
| `solis`, `soliscloud` | Implemented (normalized contract input) |
| `higeco` | Implemented (normalized contract input) |

## Runtime Verification Harness

@@ -98,6 +100,7 @@ Troubleshooting:
| FIMER Aurora Vision | Included (Spec) | Cloud API | Aurora Vision account with required role; request API enablement via FIMER support | Vendor-issued credentials per Aurora Vision API docs |
| SolisCloud | Included (Spec) | Cloud API | Complete Solis cooperation/application process and receive API activation materials | OAuth2 with AppKey/AppSecret |
| SolaX Cloud | Included (Spec) | Cloud API | Generate API token in SolaX Cloud third-party ecosystem settings | API token |
| Higeco | Included (Spec) | Cloud API (docAPI) | Obtain API credentials from Higeco for target instance | Bearer token (POST /authenticate) |

## Setup Instructions By OEM

@@ -190,6 +193,16 @@ Official references:
- https://global.solaxcloud.com/blue/4/user_api/2024/SolaXCloud_User_API_V2.pdf
- https://doc.solaxcloud.com/en/inst-w/service/

### Higeco (Included (Spec))

1. Obtain API credentials (username/password or apiToken) from the Higeco instance administrator.
2. Authenticate via POST `https://{instance}.higeco.com/docapi/authenticate` to receive a bearer token.
3. Use the bearer token in subsequent requests to list plants, devices, and retrieve log data.
4. Note the 100,000-sample cap on `log_data` queries; use pagination or narrower time ranges for larger datasets.

Official references:
- Higeco docAPI endpoint hierarchy is instance-specific; consult your Higeco account representative for documentation access.
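
The auth flow in the steps above can be sketched with the standard library. This is a minimal sketch, not a definitive client: the `INSTANCE` name is hypothetical, and the response field names (`"token"`) are assumptions, since the docAPI response shape is instance-specific.

```python
# Sketch of the Higeco docAPI flow: POST /authenticate for a bearer
# token, then use it on subsequent GET requests. Field names such as
# "token" are assumptions; check your instance's docAPI documentation.
import json
import urllib.request

INSTANCE = "example"  # hypothetical Higeco instance name
BASE = f"https://{INSTANCE}.higeco.com/docapi"


def authenticate(username: str, password: str) -> str:
    """POST credentials to /authenticate and return the bearer token."""
    body = json.dumps({"username": username, "password": password}).encode()
    req = urllib.request.Request(
        f"{BASE}/authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["token"]  # assumed response field name


def get_json(path: str, token: str):
    """GET a docAPI resource (plants, devices, log data) with the token."""
    req = urllib.request.Request(
        f"{BASE}/{path}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because of the 100,000-sample cap on `log_data`, large extractions should be split into several `get_json` calls over narrower time windows rather than one wide query.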

## Implementation Notes For ODS-E

- Treat cloud APIs as rate-limited and implement retries with backoff.
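
As a sketch of the retry guidance above, one common pattern is jittered exponential backoff; the retry counts and delays here are illustrative defaults, not values mandated by this spec.

```python
# Generic retry-with-backoff wrapper for rate-limited cloud API calls.
# Doubles the delay after each failure, adds jitter to avoid thundering
# herds, and re-raises once the retry budget is exhausted.
import random
import time


def with_backoff(call, retries=5, base_delay=1.0, max_delay=60.0):
    """Invoke `call`, retrying failures with jittered exponential backoff."""
    for attempt in range(retries):
        try:
            return call()
        except Exception:
            if attempt == retries - 1:
                raise  # retry budget exhausted; surface the error
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay * random.uniform(0.5, 1.0))
```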
1 change: 1 addition & 0 deletions spec/launch-kit.md
@@ -62,6 +62,7 @@ Publish this matrix in docs and outreach posts.
| Solis | Yes | Yes | Partner-gated | SolisCloud onboarding |
| SolaX | Yes | Yes | Account-required | SolaX tokenId |
| Solarman | Yes | Yes | Account/file feed | Logger exports/API |
| Higeco | Yes | Yes | Partner-gated | Higeco docAPI bearer token |

### Consumption & Net Metering Sources (Schema Ready)

2 changes: 1 addition & 1 deletion src/python/odse/__init__.py
@@ -5,7 +5,7 @@
using the ODS-E specification.
"""

__version__ = "0.4.0"
__version__ = "0.5.0"

from .validator import (
validate,
77 changes: 77 additions & 0 deletions src/python/odse/transformer.py
@@ -81,6 +81,7 @@ def _get_transformer(source: str):
"sma": SMATransformer(),
"solis": SolisTransformer(),
"soliscloud": SolisTransformer(),
"higeco": HigecoTransformer(),
"csv": GenericCSVTransformer(),
"generic_csv": GenericCSVTransformer(),
"generic": GenericCSVTransformer(),
@@ -890,6 +891,82 @@ def transform(self, data: Union[str, Path], **kwargs) -> List[Dict[str, Any]]:
return out


class HigecoTransformer(BaseTransformer):
"""Transform normalized Higeco docAPI records to ODS-E."""

CONNECTION_STATUS_MAPPING = {
"CONNECTED": "normal",
"DISCONNECTED": "offline",
}

POWER_STATUS_MAPPING = {
"ON": "normal",
"OFF": "standby",
"FAULT": "fault",
"WARNING": "warning",
}

def _resolve_error_type(self, normalized: Dict[str, Any], power_w: Optional[float]) -> str:
conn = str(normalized.get("connectionStatus") or "").upper()
if conn in self.CONNECTION_STATUS_MAPPING:
mapped = self.CONNECTION_STATUS_MAPPING[conn]
if mapped != "normal":
return mapped

pstat = str(normalized.get("powerStatus") or "").upper()
if pstat in self.POWER_STATUS_MAPPING:
return self.POWER_STATUS_MAPPING[pstat]

if power_w is not None:
return "standby" if power_w == 0 else "normal"

return "unknown"

def transform(self, data: Union[str, Path], **kwargs) -> List[Dict[str, Any]]:
payload = self._parse_json(data)
timezone = kwargs.get("timezone")
interval_hours = (kwargs.get("interval_minutes", 5) or 5) / 60.0
asset_id = kwargs.get("asset_id")
records_in = _extract_records(payload)

out: List[Dict[str, Any]] = []
for r in records_in:
normalized = r.get("normalized") if isinstance(r.get("normalized"), dict) else r
ts = _to_iso8601(normalized.get("timestamp"), timezone=timezone)
if not ts:
continue

p_w = _to_float(normalized.get("active_power_w"))
e_wh = _to_float(normalized.get("active_energy_wh"))

kwh = (e_wh / 1000.0) if e_wh is not None else max(((p_w or 0.0) / 1000.0) * interval_hours, 0.0)
error_type = self._resolve_error_type(normalized, p_w)

rec = _base_record(
timestamp=ts,
kwh=kwh,
error_type=error_type,
error_code=normalized.get("status_code"),
asset_id=asset_id,
)
if p_w is not None:
rec["kW"] = p_w / 1000.0
for src, dst in [
("voltage_dc_v", "voltage_dc"),
("current_dc_a", "current_dc"),
("temperature_c", "temperature"),
("voltage_v", "voltage_ac"),
("current_a", "current_ac"),
("frequency_hz", "frequency"),
]:
val = _to_float(normalized.get(src))
if val is not None:
rec[dst] = val
out.append(rec)

return out


class GenericCSVTransformer(BaseTransformer):
"""Transform arbitrary CSV data to ODS-E using a column mapping."""

4 changes: 2 additions & 2 deletions src/python/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "odse"
version = "0.4.0"
version = "0.5.0"
description = "Open Data Schema for Energy - validation and transformation library"
readme = "README.md"
license = {text = "Apache-2.0"}
@@ -41,7 +41,7 @@ dataframe = [
]

[project.urls]
Homepage = "https://github.com/AsobaCloud/odse"
Homepage = "https://opendataschema.energy"
Documentation = "https://opendataschema.energy/docs/"
Repository = "https://github.com/AsobaCloud/odse"

61 changes: 61 additions & 0 deletions src/python/tests/test_transformer_runtime.py
@@ -202,6 +202,67 @@ def test_solis_normalized_mapping(self):
self.assertEqual(rows[0]["error_code"], "200")
self.assertAlmostEqual(rows[0]["kW"], 4.6)

def test_higeco_normalized_log_data_mapping(self):
payload = """
{
"records": [
{
"normalized": {
"timestamp": "2026-03-15T10:00:00Z",
"active_power_w": 5200,
"active_energy_wh": 1300,
"temperature_c": 38.5,
"connectionStatus": "CONNECTED",
"powerStatus": "ON"
}
}
]
}
"""
rows = transform(payload, source="higeco")
self.assertEqual(len(rows), 1)
self.assertEqual(rows[0]["error_type"], "normal")
self.assertAlmostEqual(rows[0]["kWh"], 1.3)
self.assertAlmostEqual(rows[0]["kW"], 5.2)
self.assertEqual(rows[0]["temperature"], 38.5)

def test_higeco_disconnected_maps_offline(self):
payload = """
{
"records": [
{
"normalized": {
"timestamp": "2026-03-15T10:00:00Z",
"active_power_w": 0,
"connectionStatus": "DISCONNECTED",
"powerStatus": "OFF"
}
}
]
}
"""
rows = transform(payload, source="higeco")
self.assertEqual(len(rows), 1)
self.assertEqual(rows[0]["error_type"], "offline")

def test_higeco_fault_status_mapping(self):
payload = """
{
"data": [
{
"normalized": {
"timestamp": "2026-03-15T10:00:00Z",
"active_power_w": 0,
"powerStatus": "FAULT"
}
}
]
}
"""
rows = transform(payload, source="higeco")
self.assertEqual(len(rows), 1)
self.assertEqual(rows[0]["error_type"], "fault")


class EnergyTimeseriesValidationTests(unittest.TestCase):
"""Tests for the extended energy-timeseries schema fields."""
15 changes: 15 additions & 0 deletions tools/transform_harness.py
@@ -30,6 +30,7 @@
"fimer",
"solis",
"solaxcloud",
"higeco",
]

FIXTURES: Dict[str, str] = {
@@ -100,6 +101,20 @@
}
]
}),
"higeco": json.dumps({
"records": [
{
"normalized": {
"timestamp": "2026-03-15T10:00:00Z",
"active_power_w": 5200,
"active_energy_wh": 1300,
"temperature_c": 38.5,
"connectionStatus": "CONNECTED",
"powerStatus": "ON",
}
}
]
}),
"solaxcloud": json.dumps({
"success": True,
"code": 0,