mirror of
https://github.com/jpawlowski/hass.tibber_prices.git
synced 2026-03-29 21:03:40 +00:00
refactor(sensors): Transform sensor platform into package
parent d2d07d5e16
commit fa40c00f67
16 changed files with 3039 additions and 2401 deletions

AGENTS.md | 30
@@ -4,8 +4,8 @@ This is a **Home Assistant custom component** for Tibber electricity price data,
 
 ## Documentation Metadata
 
-- **Last Major Update**: 2025-11-15
-- **Last Architecture Review**: 2025-11-15 (Sensor.py refactoring completed - unified handler methods for interval/rolling hour/daily stats/24h windows. Sensor organization changed from feature-type to calculation-method grouping. Common Tasks section updated with new patterns.)
+- **Last Major Update**: 2025-11-17
+- **Last Architecture Review**: 2025-11-17 (Module splitting refactoring completed - sensor.py split into sensor/ package with core.py, definitions.py, helpers.py, attributes.py. Created entity_utils/ package for shared icon/color/attribute logic. All phases complete.)
 - **Documentation Status**: ✅ Current (verified against codebase)
 
 _Note: When proposing significant updates to this file, update the metadata above with the new date and brief description of changes._
@@ -112,7 +112,7 @@ This ensures the documentation stays accurate and useful as the codebase evolves
 1. `TibberPricesApiClient` (`api.py`) queries Tibber's GraphQL API with `resolution:QUARTER_HOURLY` for user data and prices (yesterday/today/tomorrow - 192 intervals total)
 2. `TibberPricesDataUpdateCoordinator` (`coordinator.py`) orchestrates updates every 15 minutes, manages persistent storage via `Store`, and schedules quarter-hour entity refreshes
 3. Price enrichment functions (`price_utils.py`, `average_utils.py`) calculate trailing/leading 24h averages, price differences, and rating levels for each 15-minute interval
-4. Entity platforms (`sensor.py`, `binary_sensor.py`) expose enriched data as Home Assistant entities
+4. Entity platforms (`sensor/` package, `binary_sensor.py`) expose enriched data as Home Assistant entities
 5. Custom services (`services.py`) provide API endpoints for integrations like ApexCharts
 
 **Key Patterns:**
@@ -179,7 +179,7 @@ This ensures the documentation stays accurate and useful as the codebase evolves
     "rating_level": "NORMAL"  # Added: LOW/NORMAL/HIGH based on thresholds
 }
 ```
-- **Sensor organization (refactored Nov 2025)**: Sensors in `sensor.py` are grouped by **calculation method** rather than feature type, enabling code reuse through unified handler methods:
+- **Sensor organization (refactored Nov 2025)**: The `sensor/` package is organized by **calculation method** rather than feature type, enabling code reuse through unified handler methods:
   - **Interval-based sensors**: Use `_get_interval_value(interval_offset, value_type)` for current/next/previous interval data
   - **Rolling hour sensors**: Use `_get_rolling_hour_value(hour_offset, value_type)` for 5-interval windows
   - **Daily statistics**: Use `_get_daily_stat_value(day, stat_func)` for calendar day min/max/avg
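The calculation-method grouping described above can be sketched in a minimal standalone class. This is an illustration of the pattern, not the integration's actual implementation: the interval storage, `current_index` bookkeeping, and 5-interval window size are assumptions made for the sketch.

```python
INTERVALS_PER_HOUR = 4  # 15-minute intervals


class PriceSensorSketch:
    """Illustrates grouping sensor values by calculation method."""

    def __init__(self, intervals: list[dict]) -> None:
        # intervals: chronologically ordered 15-minute price entries
        self.intervals = intervals
        self.current_index = 0

    def _get_interval_value(self, interval_offset: int, value_type: str):
        """One handler shared by current/next/previous interval sensors."""
        idx = self.current_index + interval_offset
        if 0 <= idx < len(self.intervals):
            return self.intervals[idx].get(value_type)
        return None

    def _get_rolling_hour_value(self, hour_offset: int, value_type: str):
        """Average over a 5-interval window, shared by all rolling-hour sensors."""
        start = self.current_index + hour_offset * INTERVALS_PER_HOUR
        window = self.intervals[max(start, 0) : start + 5]
        values = [i[value_type] for i in window if value_type in i]
        return sum(values) / len(values) if values else None
```

Each concrete sensor then reduces to a key plus a call such as `_get_interval_value(1, "price")`, which is what makes the grouping pay off.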
@@ -204,9 +204,19 @@ custom_components/tibber_prices/
 ├── price_utils.py       # Price enrichment, level/rating calculations
 ├── average_utils.py     # Trailing/leading average utilities
 ├── services.py          # Custom services (get_price, ApexCharts, etc.)
-├── sensor.py            # Price/stats/diagnostic sensors
+├── sensor/              # Sensor platform (package)
+│   ├── __init__.py      # Platform setup (async_setup_entry)
+│   ├── core.py          # TibberPricesSensor class
+│   ├── definitions.py   # ENTITY_DESCRIPTIONS
+│   ├── helpers.py       # Pure helper functions
+│   └── attributes.py    # Attribute builders
 ├── binary_sensor.py     # Peak/best hour binary sensors
 ├── entity.py            # Base TibberPricesEntity class
+├── entity_utils/        # Shared entity helpers (both platforms)
+│   ├── __init__.py      # Package exports
+│   ├── icons.py         # Icon mapping logic
+│   ├── colors.py        # Color mapping logic
+│   └── attributes.py    # Common attribute builders
 ├── data.py              # @dataclass TibberPricesData
 ├── const.py             # Constants, translation loaders, currency helpers
 ├── config_flow.py       # UI configuration flow
@@ -1768,7 +1778,7 @@ After the sensor.py refactoring (completed Nov 2025), sensors are organized by *
    - **Volatility**: Statistical analysis of price variation
    - **Diagnostic**: System information and metadata
 
-2. **Add entity description** to appropriate sensor group in `sensor.py`:
+2. **Add entity description** to appropriate sensor group in `sensor/definitions.py`:
 
    - `INTERVAL_PRICE_SENSORS`, `INTERVAL_LEVEL_SENSORS`, or `INTERVAL_RATING_SENSORS`
    - `ROLLING_HOUR_PRICE_SENSORS`, `ROLLING_HOUR_LEVEL_SENSORS`, or `ROLLING_HOUR_RATING_SENSORS`
@@ -1778,7 +1788,7 @@ After the sensor.py refactoring (completed Nov 2025), sensors are organized by *
    - `VOLATILITY_SENSORS`
    - `DIAGNOSTIC_SENSORS`
 
-3. **Add handler mapping** in `_get_value_getter()` method:
+3. **Add handler mapping** in `sensor/core.py` → `_get_value_getter()` method:
 
    - For interval-based: Use `_get_interval_value(interval_offset, value_type)`
    - For rolling hour: Use `_get_rolling_hour_value(hour_offset, value_type)`
@@ -1793,7 +1803,7 @@ After the sensor.py refactoring (completed Nov 2025), sensors are organized by *
 **Example - Adding a "2 hours ago" interval sensor:**
 
 ```python
-# 1. Add to INTERVAL_PRICE_SENSORS group in sensor.py
+# 1. Add to INTERVAL_PRICE_SENSORS group in sensor/definitions.py
 SensorEntityDescription(
     key="two_hours_ago_price",
     translation_key="two_hours_ago_price",
@@ -1804,7 +1814,7 @@ SensorEntityDescription(
     suggested_display_precision=2,
 )
 
-# 2. Add handler in _get_value_getter()
+# 2. Add handler in sensor/core.py → _get_value_getter()
 "two_hours_ago_price": lambda: self._get_interval_value(
     interval_offset=-8,  # 2 hours = 8 intervals (15 min each)
     value_type="price",
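The offset arithmetic in that handler (2 hours ago → `interval_offset=-8`) follows directly from the 15-minute interval size. A small hypothetical helper (not part of the integration) makes the conversion explicit:

```python
MINUTES_PER_INTERVAL = 15
INTERVALS_PER_HOUR = 60 // MINUTES_PER_INTERVAL  # 4


def hours_to_interval_offset(hours: float) -> int:
    """Convert an hour offset (negative = past) to a 15-minute interval offset."""
    return int(hours * INTERVALS_PER_HOUR)
```

So a "2 hours ago" sensor passes `hours_to_interval_offset(-2)`, i.e. `-8`, while a "next hour" sensor would pass `4`.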
@@ -1834,7 +1844,7 @@ SensorEntityDescription(
 
 **Unified Handler Methods (Post-Refactoring):**
 
-The refactoring consolidated duplicate logic into unified methods:
+The refactoring consolidated duplicate logic into unified methods in `sensor/core.py`:
 
 - **`_get_interval_value(interval_offset, value_type, in_euro=False)`**
 
custom_components/tibber_prices/binary_sensor.py

@@ -16,6 +16,7 @@ from homeassistant.util import dt as dt_util
 
 from .coordinator import TIME_SENSITIVE_ENTITY_KEYS
 from .entity import TibberPricesEntity
+from .entity_utils import add_icon_color_attribute, get_binary_sensor_icon
 
 if TYPE_CHECKING:
     from collections.abc import Callable
@@ -28,8 +29,6 @@ if TYPE_CHECKING:
     from .data import TibberPricesConfigEntry
 
 from .const import (
-    BINARY_SENSOR_COLOR_MAPPING,
-    BINARY_SENSOR_ICON_MAPPING,
     CONF_EXTENDED_DESCRIPTIONS,
     DEFAULT_EXTENDED_DESCRIPTIONS,
     async_get_entity_description,
@@ -446,24 +445,15 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
         """Return the icon based on binary sensor state."""
         key = self.entity_description.key
 
-        # Dynamic icons for best/peak price period sensors
-        if key in BINARY_SENSOR_ICON_MAPPING:
-            if self.is_on:
-                # Sensor is ON - use "on" icon
-                icon = BINARY_SENSOR_ICON_MAPPING[key].get("on")
-            else:
-                # Sensor is OFF - check if future periods exist
-                has_future_periods = self._has_future_periods()
-                if has_future_periods:
-                    icon = BINARY_SENSOR_ICON_MAPPING[key].get("off")
-                else:
-                    icon = BINARY_SENSOR_ICON_MAPPING[key].get("off_no_future")
-
-            if icon:
-                return icon
-
-        # For all other sensors, use static icon from entity description
-        return self.entity_description.icon
+        # Use shared icon utility
+        icon = get_binary_sensor_icon(
+            key,
+            is_on=self.is_on,
+            has_future_periods_callback=self._has_future_periods,
+        )
+
+        # Fall back to static icon from entity description
+        return icon or self.entity_description.icon
 
     def _has_future_periods(self) -> bool:
         """
@@ -514,15 +504,10 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
         clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
         attributes.update(clean_attrs)
 
-        # Add icon_color for best/peak price period sensors
-        key = self.entity_description.key
-        if key in BINARY_SENSOR_COLOR_MAPPING:
-            state = "on" if self.is_on else "off"
-            color = BINARY_SENSOR_COLOR_MAPPING[key].get(state)
-            if color:
-                attributes["icon_color"] = color
+        # Add icon_color for best/peak price period sensors using shared utility
+        add_icon_color_attribute(attributes, self.entity_description.key, is_on=self.is_on)
 
-        # Add descriptions from the custom translations file
+        # Add description from the custom translations file
         if self.entity_description.translation_key and self.hass is not None:
             # Get user's language preference
             language = self.hass.config.language if self.hass.config.language else "en"
@@ -596,13 +581,8 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
         clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
         attributes.update(clean_attrs)
 
-        # Add icon_color for best/peak price period sensors
-        key = self.entity_description.key
-        if key in BINARY_SENSOR_COLOR_MAPPING:
-            state = "on" if self.is_on else "off"
-            color = BINARY_SENSOR_COLOR_MAPPING[key].get(state)
-            if color:
-                attributes["icon_color"] = color
+        # Add icon_color for best/peak price period sensors using shared utility
+        add_icon_color_attribute(attributes, self.entity_description.key, is_on=self.is_on)
 
         # Add descriptions from the cache (non-blocking)
         if self.entity_description.translation_key and self.hass is not None:
custom_components/tibber_prices/config_flow.py

@@ -526,7 +526,7 @@ class TibberPricesOptionsFlowHandler(OptionsFlow):
 
         if user_input is not None:
             self._options.update(user_input)
-            return await self.async_step_price_rating()
+            return await self.async_step_current_interval_price_rating()
 
         return self.async_show_form(
             step_id="init",
@@ -546,7 +546,9 @@ class TibberPricesOptionsFlowHandler(OptionsFlow):
             },
         )
 
-    async def async_step_price_rating(self, user_input: dict[str, Any] | None = None) -> ConfigFlowResult:
+    async def async_step_current_interval_price_rating(
+        self, user_input: dict[str, Any] | None = None
+    ) -> ConfigFlowResult:
         """Configure price rating thresholds."""
        if user_input is not None:
             self._options.update(user_input)
custom_components/tibber_prices/const.py

@@ -183,8 +183,8 @@ VOLATILITY_VERY_HIGH = "VERY_HIGH"
 
 # Sensor options (lowercase versions for ENUM device class)
 # NOTE: These constants define the valid enum options, but they are not used directly
-# in sensor.py due to import timing issues. Instead, the options are defined inline
-# in the SensorEntityDescription objects. Keep these in sync with sensor.py!
+# in sensor/definitions.py due to import timing issues. Instead, the options are defined inline
+# in the SensorEntityDescription objects. Keep these in sync with sensor/definitions.py!
 PRICE_LEVEL_OPTIONS = [
     PRICE_LEVEL_VERY_CHEAP.lower(),
     PRICE_LEVEL_CHEAP.lower(),
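The lowercase convention the comment describes (ENUM device-class sensors expect lowercase option strings while the API-facing constants are uppercase) can be shown in miniature. The constant values below are assumptions following the visible `VOLATILITY_VERY_HIGH = "VERY_HIGH"` naming pattern:

```python
# Assumed values, mirroring the naming pattern in const.py
PRICE_LEVEL_VERY_CHEAP = "VERY_CHEAP"
PRICE_LEVEL_CHEAP = "CHEAP"

# ENUM device-class sensors expect lowercase options
PRICE_LEVEL_OPTIONS = [
    PRICE_LEVEL_VERY_CHEAP.lower(),
    PRICE_LEVEL_CHEAP.lower(),
]
```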
custom_components/tibber_prices/entity_utils/__init__.py | 31 (new file)

@@ -0,0 +1,31 @@
+"""Entity utilities for Tibber Prices integration."""
+
+from __future__ import annotations
+
+from .attributes import build_period_attributes, build_timestamp_attribute
+from .colors import add_icon_color_attribute, get_icon_color
+from .icons import (
+    get_binary_sensor_icon,
+    get_dynamic_icon,
+    get_level_sensor_icon,
+    get_price_level_for_icon,
+    get_price_sensor_icon,
+    get_rating_sensor_icon,
+    get_trend_icon,
+    get_volatility_sensor_icon,
+)
+
+__all__ = [
+    "add_icon_color_attribute",
+    "build_period_attributes",
+    "build_timestamp_attribute",
+    "get_binary_sensor_icon",
+    "get_dynamic_icon",
+    "get_icon_color",
+    "get_level_sensor_icon",
+    "get_price_level_for_icon",
+    "get_price_sensor_icon",
+    "get_rating_sensor_icon",
+    "get_trend_icon",
+    "get_volatility_sensor_icon",
+]
custom_components/tibber_prices/entity_utils/attributes.py | 42 (new file)

@@ -0,0 +1,42 @@
+"""Common attribute utilities for Tibber Prices entities."""
+
+from __future__ import annotations
+
+
+def build_timestamp_attribute(interval_data: dict | None) -> str | None:
+    """
+    Build timestamp attribute from interval data.
+
+    Extracts startsAt field consistently across all sensors.
+
+    Args:
+        interval_data: Interval data dictionary containing startsAt field
+
+    Returns:
+        ISO format timestamp string or None
+
+    """
+    if not interval_data:
+        return None
+    return interval_data.get("startsAt")
+
+
+def build_period_attributes(period_data: dict) -> dict:
+    """
+    Build common period attributes (start, end, duration, timestamp).
+
+    Used by binary sensors for period-based entities.
+
+    Args:
+        period_data: Period data dictionary
+
+    Returns:
+        Dictionary with common period attributes
+
+    """
+    return {
+        "start": period_data.get("start"),
+        "end": period_data.get("end"),
+        "duration_minutes": period_data.get("duration_minutes"),
+        "timestamp": period_data.get("start"),  # Timestamp = period start
+    }
custom_components/tibber_prices/entity_utils/colors.py | 79 (new file)

@@ -0,0 +1,79 @@
+"""Color utilities for Tibber Prices entities."""
+
+from __future__ import annotations
+
+from typing import Any
+
+from custom_components.tibber_prices.const import (
+    BINARY_SENSOR_COLOR_MAPPING,
+    PRICE_LEVEL_COLOR_MAPPING,
+    PRICE_RATING_COLOR_MAPPING,
+    VOLATILITY_COLOR_MAPPING,
+)
+
+
+def add_icon_color_attribute(
+    attributes: dict,
+    key: str,
+    state_value: Any = None,
+    *,
+    is_on: bool | None = None,
+) -> None:
+    """
+    Add icon_color attribute if color mapping exists.
+
+    Used by both sensor and binary_sensor platforms.
+
+    Args:
+        attributes: Attribute dictionary to update
+        key: Entity description key
+        state_value: Sensor value (for sensors) or None (for binary sensors)
+        is_on: Binary sensor state (for binary sensors) or None (for sensors)
+
+    """
+    color = get_icon_color(key, state_value, is_on=is_on)
+    if color:
+        attributes["icon_color"] = color
+
+
+def get_icon_color(
+    key: str,
+    state_value: Any = None,
+    *,
+    is_on: bool | None = None,
+) -> str | None:
+    """
+    Get icon color from various mappings.
+
+    Args:
+        key: Entity description key
+        state_value: Sensor value (for sensors)
+        is_on: Binary sensor state (for binary sensors)
+
+    Returns:
+        CSS color variable string or None
+
+    """
+    # Binary sensor colors (based on on/off state)
+    if key in BINARY_SENSOR_COLOR_MAPPING and is_on is not None:
+        state_key = "on" if is_on else "off"
+        return BINARY_SENSOR_COLOR_MAPPING[key].get(state_key)
+
+    # Trend sensor colors (based on trend state)
+    if key.startswith("price_trend_") and isinstance(state_value, str):
+        trend_colors = {
+            "rising": "var(--error-color)",  # Red/Orange for rising prices
+            "falling": "var(--success-color)",  # Green for falling prices
+            "stable": "var(--state-icon-color)",  # Default gray for stable
+        }
+        return trend_colors.get(state_value)
+
+    # Price level/rating/volatility colors (based on uppercase value)
+    if isinstance(state_value, str):
+        return (
+            PRICE_LEVEL_COLOR_MAPPING.get(state_value.upper())
+            or PRICE_RATING_COLOR_MAPPING.get(state_value.upper())
+            or VOLATILITY_COLOR_MAPPING.get(state_value.upper())
+        )
+
+    return None
custom_components/tibber_prices/entity_utils/icons.py | 206 (new file)

@@ -0,0 +1,206 @@
+"""Icon utilities for Tibber Prices entities."""
+
+from __future__ import annotations
+
+from datetime import timedelta
+from typing import TYPE_CHECKING, Any
+
+from custom_components.tibber_prices.const import (
+    BINARY_SENSOR_ICON_MAPPING,
+    PRICE_LEVEL_CASH_ICON_MAPPING,
+    PRICE_LEVEL_ICON_MAPPING,
+    PRICE_RATING_ICON_MAPPING,
+    VOLATILITY_ICON_MAPPING,
+)
+from custom_components.tibber_prices.price_utils import find_price_data_for_interval
+from homeassistant.util import dt as dt_util
+
+if TYPE_CHECKING:
+    from collections.abc import Callable
+
+# Constants imported from price_utils
+MINUTES_PER_INTERVAL = 15
+
+
+def get_dynamic_icon(
+    key: str,
+    value: Any,
+    *,
+    is_on: bool | None = None,
+    coordinator_data: dict | None = None,
+    has_future_periods_callback: Callable[[], bool] | None = None,
+) -> str | None:
+    """
+    Get dynamic icon based on sensor state.
+
+    Unified function for both sensor and binary_sensor platforms.
+
+    Args:
+        key: Entity description key
+        value: Native value of the sensor
+        is_on: Binary sensor state (None for regular sensors)
+        coordinator_data: Coordinator data for price level lookups
+        has_future_periods_callback: Callback to check if future periods exist (binary sensors)
+
+    Returns:
+        Icon string or None if no dynamic icon applies
+
+    """
+    # Try various icon sources in order
+    return (
+        get_trend_icon(key, value)
+        or get_price_sensor_icon(key, coordinator_data)
+        or get_level_sensor_icon(key, value)
+        or get_rating_sensor_icon(key, value)
+        or get_volatility_sensor_icon(key, value)
+        or get_binary_sensor_icon(key, is_on=is_on, has_future_periods_callback=has_future_periods_callback)
+    )
+
+
+def get_trend_icon(key: str, value: Any) -> str | None:
+    """Get icon for trend sensors."""
+    if not key.startswith("price_trend_") or not isinstance(value, str):
+        return None
+
+    trend_icons = {
+        "rising": "mdi:trending-up",
+        "falling": "mdi:trending-down",
+        "stable": "mdi:trending-neutral",
+    }
+    return trend_icons.get(value)
+
+
+def get_price_sensor_icon(key: str, coordinator_data: dict | None) -> str | None:
+    """
+    Get icon for current price sensors (dynamic based on price level).
+
+    Only current_interval_price and current_hour_average have dynamic icons.
+    Other price sensors (next/previous) use static icons from entity description.
+
+    Args:
+        key: Entity description key
+        coordinator_data: Coordinator data for price level lookups
+
+    Returns:
+        Icon string or None if not a current price sensor
+
+    """
+    if not coordinator_data:
+        return None
+
+    # Only current price sensors get dynamic icons
+    if key == "current_interval_price":
+        level = get_price_level_for_icon(coordinator_data, interval_offset=0)
+        if level:
+            return PRICE_LEVEL_CASH_ICON_MAPPING.get(level.upper())
+    elif key == "current_hour_average":
+        # For hour average, we cannot use this helper (needs sensor rolling hour logic)
+        # Return None and let sensor handle it
+        return None
+
+    # For all other price sensors, let entity description handle the icon
+    return None
+
+
+def get_level_sensor_icon(key: str, value: Any) -> str | None:
+    """Get icon for price level sensors."""
+    if key not in [
+        "current_interval_price_level",
+        "next_interval_price_level",
+        "previous_interval_price_level",
+        "current_hour_price_level",
+        "next_hour_price_level",
+    ] or not isinstance(value, str):
+        return None
+
+    return PRICE_LEVEL_ICON_MAPPING.get(value.upper())
+
+
+def get_rating_sensor_icon(key: str, value: Any) -> str | None:
+    """Get icon for price rating sensors."""
+    if key not in [
+        "current_interval_price_rating",
+        "next_interval_price_rating",
+        "previous_interval_price_rating",
+        "current_hour_price_rating",
+        "next_hour_price_rating",
+    ] or not isinstance(value, str):
+        return None
+
+    return PRICE_RATING_ICON_MAPPING.get(value.upper())
+
+
+def get_volatility_sensor_icon(key: str, value: Any) -> str | None:
+    """Get icon for volatility sensors."""
+    if not key.endswith("_volatility") or not isinstance(value, str):
+        return None
+
+    return VOLATILITY_ICON_MAPPING.get(value.upper())
+
+
+def get_binary_sensor_icon(
+    key: str,
+    *,
+    is_on: bool | None,
+    has_future_periods_callback: Callable[[], bool] | None = None,
+) -> str | None:
+    """
+    Get icon for binary sensors with dynamic state-based icons.
+
+    Args:
+        key: Entity description key
+        is_on: Binary sensor state
+        has_future_periods_callback: Callback to check if future periods exist
+
+    Returns:
+        Icon string or None if not a binary sensor with dynamic icons
+
+    """
+    if key not in BINARY_SENSOR_ICON_MAPPING or is_on is None:
+        return None
+
+    if is_on:
+        # Sensor is ON - use "on" icon
+        return BINARY_SENSOR_ICON_MAPPING[key].get("on")
+
+    # Sensor is OFF - check if future periods exist
+    has_future_periods = has_future_periods_callback() if has_future_periods_callback else False
+
+    if has_future_periods:
+        return BINARY_SENSOR_ICON_MAPPING[key].get("off")
+
+    return BINARY_SENSOR_ICON_MAPPING[key].get("off_no_future")
+
+
+def get_price_level_for_icon(
+    coordinator_data: dict,
+    *,
+    interval_offset: int | None = None,
+) -> str | None:
+    """
+    Get the price level for icon determination.
+
+    Supports interval-based lookups (current/next/previous interval).
+
+    Args:
+        coordinator_data: Coordinator data
+        interval_offset: Interval offset (0=current, 1=next, -1=previous)
+
+    Returns:
+        Price level string or None if not found
+
+    """
+    if not coordinator_data or interval_offset is None:
+        return None
+
+    price_info = coordinator_data.get("priceInfo", {})
+    now = dt_util.now()
+
+    # Interval-based lookup
+    target_time = now + timedelta(minutes=MINUTES_PER_INTERVAL * interval_offset)
+    interval_data = find_price_data_for_interval(price_info, target_time)
+
+    if not interval_data or "level" not in interval_data:
+        return None
+
+    return interval_data["level"]
File diff suppressed because it is too large
custom_components/tibber_prices/sensor/__init__.py | 30 (new file)

@@ -0,0 +1,30 @@
+"""Sensor platform for Tibber Prices integration."""
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+
+from .core import TibberPricesSensor
+from .definitions import ENTITY_DESCRIPTIONS
+
+if TYPE_CHECKING:
+    from custom_components.tibber_prices.data import TibberPricesConfigEntry
+    from homeassistant.core import HomeAssistant
+    from homeassistant.helpers.entity_platform import AddEntitiesCallback
+
+
+async def async_setup_entry(
+    _hass: HomeAssistant,
+    entry: TibberPricesConfigEntry,
+    async_add_entities: AddEntitiesCallback,
+) -> None:
+    """Set up Tibber Prices sensor based on a config entry."""
+    coordinator = entry.runtime_data.coordinator
+
+    async_add_entities(
+        TibberPricesSensor(
+            coordinator=coordinator,
+            entity_description=entity_description,
+        )
+        for entity_description in ENTITY_DESCRIPTIONS
+    )
767
custom_components/tibber_prices/sensor/attributes.py
Normal file
767
custom_components/tibber_prices/sensor/attributes.py
Normal file
@@ -0,0 +1,767 @@
"""
Attribute builders for Tibber Prices sensors.

This module contains all the attribute-building logic extracted from TibberPricesSensor.
Each function takes explicit parameters instead of accessing instance variables.
"""

from __future__ import annotations

from datetime import datetime, timedelta
from typing import TYPE_CHECKING, Any

from custom_components.tibber_prices.const import (
    PRICE_LEVEL_MAPPING,
    PRICE_RATING_MAPPING,
)
from custom_components.tibber_prices.entity_utils import add_icon_color_attribute
from custom_components.tibber_prices.price_utils import (
    MINUTES_PER_INTERVAL,
    calculate_volatility_level,
    find_price_data_for_interval,
)
from homeassistant.const import PERCENTAGE
from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator import (
        TibberPricesDataUpdateCoordinator,
    )

# Constants
MAX_FORECAST_INTERVALS = 8  # Show up to 8 future intervals (2 hours with 15-min intervals)


def build_sensor_attributes(
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
    cached_data: dict,
) -> dict | None:
    """
    Build attributes for a sensor based on its key.

    Args:
        key: The sensor entity key
        coordinator: The data update coordinator
        native_value: The current native value of the sensor
        cached_data: Dictionary containing cached sensor data
            (_last_extreme_interval, _trend_attributes, _volatility_attributes, etc.)

    Returns:
        Dictionary of attributes or None if no attributes should be added

    """
    if not coordinator.data:
        return None

    try:
        attributes: dict[str, Any] = {}

        # For trend sensors, use the cached _trend_attributes
        if key.startswith("price_trend_") and cached_data.get("trend_attributes"):
            attributes.update(cached_data["trend_attributes"])

        # Group sensors by type and delegate to specific handlers
        if key in [
            "current_interval_price",
            "current_interval_price_level",
            "next_interval_price",
            "previous_interval_price",
            "current_hour_average",
            "next_hour_average",
            "next_interval_price_level",
            "previous_interval_price_level",
            "current_hour_price_level",
            "next_hour_price_level",
            "next_interval_price_rating",
            "previous_interval_price_rating",
            "current_hour_price_rating",
            "next_hour_price_rating",
        ]:
            add_current_interval_price_attributes(
                attributes=attributes,
                key=key,
                coordinator=coordinator,
                native_value=native_value,
                cached_data=cached_data,
            )
        elif key in [
            "trailing_price_average",
            "leading_price_average",
            "trailing_price_min",
            "trailing_price_max",
            "leading_price_min",
            "leading_price_max",
        ]:
            add_average_price_attributes(attributes=attributes, key=key, coordinator=coordinator)
        elif key.startswith("next_avg_"):
            add_next_avg_attributes(attributes=attributes, key=key, coordinator=coordinator)
        elif any(pattern in key for pattern in ["_price_today", "_price_tomorrow", "rating", "data_timestamp"]):
            add_statistics_attributes(
                attributes=attributes,
                key=key,
                coordinator=coordinator,
                cached_data=cached_data,
            )
        elif key == "price_forecast":
            add_price_forecast_attributes(attributes=attributes, coordinator=coordinator)
        elif key.endswith("_volatility"):
            add_volatility_attributes(attributes=attributes, cached_data=cached_data)

        # For price_level, add the original level as attribute
        if key == "current_interval_price_level" and cached_data.get("last_price_level") is not None:
            attributes["level_id"] = cached_data["last_price_level"]

    except (KeyError, ValueError, TypeError) as ex:
        coordinator.logger.exception(
            "Error getting sensor attributes",
            extra={
                "error": str(ex),
                "entity": key,
            },
        )
        return None
    else:
        return attributes if attributes else None
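The dispatch in `build_sensor_attributes` is a routing table over key prefixes, suffixes, and substrings. A stripped-down sketch of the same pattern (handler names hypothetical, check order simplified relative to the real function):

```python
def route(key: str) -> str:
    """Pick a handler name from sensor-key patterns, mirroring the dispatch above."""
    if key.startswith("next_avg_"):
        return "next_avg"
    if key.endswith("_volatility"):
        return "volatility"
    if key == "price_forecast":
        return "forecast"
    if any(p in key for p in ["_price_today", "_price_tomorrow", "rating", "data_timestamp"]):
        return "statistics"
    # Everything else falls through to the interval/rolling-hour handler group
    return "interval"
```

Grouping sensors by how they calculate rather than what they display keeps each handler small and lets one function serve a dozen entity keys.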
def add_current_interval_price_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
    cached_data: dict,
) -> None:
    """
    Add attributes for current interval price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator
        native_value: The current native value of the sensor
        cached_data: Dictionary containing cached sensor data

    """
    price_info = coordinator.data.get("priceInfo", {}) if coordinator.data else {}
    now = dt_util.now()

    # Determine which interval to use based on sensor type
    next_interval_sensors = [
        "next_interval_price",
        "next_interval_price_level",
        "next_interval_price_rating",
    ]
    previous_interval_sensors = [
        "previous_interval_price",
        "previous_interval_price_level",
        "previous_interval_price_rating",
    ]
    next_hour_sensors = [
        "next_hour_average",
        "next_hour_price_level",
        "next_hour_price_rating",
    ]
    current_hour_sensors = [
        "current_hour_average",
        "current_hour_price_level",
        "current_hour_price_rating",
    ]

    # Set timestamp and interval data based on sensor type
    interval_data = None
    if key in next_interval_sensors:
        target_time = now + timedelta(minutes=MINUTES_PER_INTERVAL)
        interval_data = find_price_data_for_interval(price_info, target_time)
        attributes["timestamp"] = interval_data["startsAt"] if interval_data else None
    elif key in previous_interval_sensors:
        target_time = now - timedelta(minutes=MINUTES_PER_INTERVAL)
        interval_data = find_price_data_for_interval(price_info, target_time)
        attributes["timestamp"] = interval_data["startsAt"] if interval_data else None
    elif key in next_hour_sensors:
        target_time = now + timedelta(hours=1)
        interval_data = find_price_data_for_interval(price_info, target_time)
        attributes["timestamp"] = interval_data["startsAt"] if interval_data else None
    elif key in current_hour_sensors:
        current_interval_data = get_current_interval_data(coordinator)
        attributes["timestamp"] = current_interval_data["startsAt"] if current_interval_data else None
    else:
        current_interval_data = get_current_interval_data(coordinator)
        attributes["timestamp"] = current_interval_data["startsAt"] if current_interval_data else None

    # Add icon_color for price sensors (based on their price level)
    if key in ["current_interval_price", "next_interval_price", "previous_interval_price"]:
        # For interval-based price sensors, get level from interval_data
        if interval_data and "level" in interval_data:
            level = interval_data["level"]
            add_icon_color_attribute(attributes, key="price_level", state_value=level)
    elif key in ["current_hour_average", "next_hour_average"]:
        # For hour-based price sensors, get level from cached_data
        level = cached_data.get("rolling_hour_level")
        if level:
            add_icon_color_attribute(attributes, key="price_level", state_value=level)

    # Add price level attributes for all level sensors
    add_level_attributes_for_sensor(
        attributes=attributes,
        key=key,
        interval_data=interval_data,
        coordinator=coordinator,
        native_value=native_value,
    )

    # Add price rating attributes for all rating sensors
    add_rating_attributes_for_sensor(
        attributes=attributes,
        key=key,
        interval_data=interval_data,
        coordinator=coordinator,
        native_value=native_value,
    )
def add_level_attributes_for_sensor(
    attributes: dict,
    key: str,
    interval_data: dict | None,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
) -> None:
    """
    Add price level attributes based on sensor type.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        interval_data: Interval data for next/previous sensors
        coordinator: The data update coordinator
        native_value: The current native value of the sensor

    """
    # For interval-based level sensors (next/previous), use interval data
    if key in ["next_interval_price_level", "previous_interval_price_level"]:
        if interval_data and "level" in interval_data:
            add_price_level_attributes(attributes, interval_data["level"])
    # For hour-aggregated level sensors, use native_value
    elif key in ["current_hour_price_level", "next_hour_price_level"]:
        level_value = native_value
        if level_value and isinstance(level_value, str):
            add_price_level_attributes(attributes, level_value.upper())
    # For current price level sensor
    elif key == "current_interval_price_level":
        current_interval_data = get_current_interval_data(coordinator)
        if current_interval_data and "level" in current_interval_data:
            add_price_level_attributes(attributes, current_interval_data["level"])


def add_price_level_attributes(attributes: dict, level: str) -> None:
    """
    Add price level specific attributes.

    Args:
        attributes: Dictionary to add attributes to
        level: The price level value (e.g., VERY_CHEAP, NORMAL, etc.)

    """
    if level in PRICE_LEVEL_MAPPING:
        attributes["level_value"] = PRICE_LEVEL_MAPPING[level]
        attributes["level_id"] = level

        # Add icon_color for dynamic styling
        add_icon_color_attribute(attributes, key="price_level", state_value=level)


def add_rating_attributes_for_sensor(
    attributes: dict,
    key: str,
    interval_data: dict | None,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
) -> None:
    """
    Add price rating attributes based on sensor type.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        interval_data: Interval data for next/previous sensors
        coordinator: The data update coordinator
        native_value: The current native value of the sensor

    """
    # For interval-based rating sensors (next/previous), use interval data
    if key in ["next_interval_price_rating", "previous_interval_price_rating"]:
        if interval_data and "rating_level" in interval_data:
            add_price_rating_attributes(attributes, interval_data["rating_level"])
    # For hour-aggregated rating sensors, use native_value
    elif key in ["current_hour_price_rating", "next_hour_price_rating"]:
        rating_value = native_value
        if rating_value and isinstance(rating_value, str):
            add_price_rating_attributes(attributes, rating_value.upper())
    # For current price rating sensor
    elif key == "current_interval_price_rating":
        current_interval_data = get_current_interval_data(coordinator)
        if current_interval_data and "rating_level" in current_interval_data:
            add_price_rating_attributes(attributes, current_interval_data["rating_level"])


def add_price_rating_attributes(attributes: dict, rating: str) -> None:
    """
    Add price rating specific attributes.

    Args:
        attributes: Dictionary to add attributes to
        rating: The price rating value (e.g., LOW, NORMAL, HIGH)

    """
    if rating in PRICE_RATING_MAPPING:
        attributes["rating_value"] = PRICE_RATING_MAPPING[rating]
        attributes["rating_id"] = rating

        # Add icon_color for dynamic styling
        add_icon_color_attribute(attributes, key="price_rating", state_value=rating)
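Both `add_price_level_attributes` and `add_price_rating_attributes` follow the same guarded-lookup pattern. Assuming `PRICE_LEVEL_MAPPING` resembles the sketch below (the real table lives in `const.py`, and the numeric values here are illustrative), the enrichment reduces to:

```python
# Hypothetical shape of the mapping; the actual values are defined in const.py.
PRICE_LEVEL_MAPPING = {
    "VERY_CHEAP": -2,
    "CHEAP": -1,
    "NORMAL": 0,
    "EXPENSIVE": 1,
    "VERY_EXPENSIVE": 2,
}


def add_level(attributes: dict, level: str) -> None:
    """Mirror add_price_level_attributes: only annotate levels the mapping knows."""
    if level in PRICE_LEVEL_MAPPING:
        attributes["level_value"] = PRICE_LEVEL_MAPPING[level]
        attributes["level_id"] = level


attrs: dict = {}
add_level(attrs, "CHEAP")
```

Unknown levels leave the attribute dict untouched, so a malformed API value never produces half-filled attributes.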
def add_statistics_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
    cached_data: dict,
) -> None:
    """
    Add attributes for statistics and rating sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator
        cached_data: Dictionary containing cached sensor data

    """
    price_info = coordinator.data.get("priceInfo", {})
    now = dt_util.now()

    if key == "data_timestamp":
        # For data_timestamp sensor, use the latest timestamp from cached_data
        latest_timestamp = cached_data.get("data_timestamp")
        if latest_timestamp:
            attributes["timestamp"] = latest_timestamp.isoformat()
    elif key == "current_interval_price_rating":
        interval_data = find_price_data_for_interval(price_info, now)
        attributes["timestamp"] = interval_data["startsAt"] if interval_data else None
        if cached_data.get("last_rating_difference") is not None:
            attributes["diff_" + PERCENTAGE] = cached_data["last_rating_difference"]
        if cached_data.get("last_rating_level") is not None:
            attributes["level_id"] = cached_data["last_rating_level"]
            attributes["level_value"] = PRICE_RATING_MAPPING.get(
                cached_data["last_rating_level"], cached_data["last_rating_level"]
            )
    elif key in [
        "lowest_price_today",
        "highest_price_today",
        "lowest_price_tomorrow",
        "highest_price_tomorrow",
    ]:
        # Use the timestamp from the interval that has the extreme price
        if cached_data.get("last_extreme_interval"):
            attributes["timestamp"] = cached_data["last_extreme_interval"].get("startsAt")
        else:
            # Fallback: use the first timestamp of the appropriate day
            day_key = "tomorrow" if "tomorrow" in key else "today"
            day_data = price_info.get(day_key, [])
            if day_data:
                attributes["timestamp"] = day_data[0].get("startsAt")
    else:
        # Fallback: use the first timestamp of the appropriate day
        day_key = "tomorrow" if "tomorrow" in key else "today"
        day_data = price_info.get(day_key, [])
        if day_data:
            attributes["timestamp"] = day_data[0].get("startsAt")
def add_average_price_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add attributes for trailing and leading average price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator

    """
    now = dt_util.now()

    # Determine if this is trailing or leading
    is_trailing = "trailing" in key

    # Get all price intervals
    price_info = coordinator.data.get("priceInfo", {})
    yesterday_prices = price_info.get("yesterday", [])
    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = yesterday_prices + today_prices + tomorrow_prices

    if not all_prices:
        return

    # Calculate the time window
    if is_trailing:
        window_start = now - timedelta(hours=24)
        window_end = now
    else:
        window_start = now
        window_end = now + timedelta(hours=24)

    # Find all intervals in the window and get first/last timestamps
    intervals_in_window = []
    for price_data in all_prices:
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue
        starts_at = dt_util.as_local(starts_at)
        if window_start <= starts_at < window_end:
            intervals_in_window.append(price_data)

    # Add timestamp attribute (first interval in the window)
    if intervals_in_window:
        attributes["timestamp"] = intervals_in_window[0].get("startsAt")
        attributes["interval_count"] = len(intervals_in_window)


def add_next_avg_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add attributes for next N hours average price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator

    """
    now = dt_util.now()

    # Extract hours from sensor key (e.g., "next_avg_3h" -> 3)
    try:
        hours = int(key.replace("next_avg_", "").replace("h", ""))
    except (ValueError, AttributeError):
        return

    # Get next interval start time (this is where the calculation begins)
    next_interval_start = now + timedelta(minutes=MINUTES_PER_INTERVAL)

    # Calculate the end of the time window
    window_end = next_interval_start + timedelta(hours=hours)

    # Get all price intervals
    price_info = coordinator.data.get("priceInfo", {})
    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = today_prices + tomorrow_prices

    if not all_prices:
        return

    # Find all intervals in the window
    intervals_in_window = []
    for price_data in all_prices:
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue
        starts_at = dt_util.as_local(starts_at)
        if next_interval_start <= starts_at < window_end:
            intervals_in_window.append(price_data)

    # Add timestamp attribute (start of next interval - where calculation begins)
    if intervals_in_window:
        attributes["timestamp"] = intervals_in_window[0].get("startsAt")
        attributes["interval_count"] = len(intervals_in_window)
        attributes["hours"] = hours
def add_price_forecast_attributes(
    attributes: dict,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add forecast attributes for the price forecast sensor.

    Args:
        attributes: Dictionary to add attributes to
        coordinator: The data update coordinator

    """
    future_prices = get_future_prices(coordinator, max_intervals=MAX_FORECAST_INTERVALS)
    if not future_prices:
        attributes["intervals"] = []
        attributes["intervals_by_hour"] = []
        attributes["data_available"] = False
        return

    # Add timestamp attribute (first future interval)
    attributes["timestamp"] = future_prices[0]["interval_start"]

    attributes["intervals"] = future_prices
    attributes["data_available"] = True

    # Group by hour for easier consumption in dashboards
    hours: dict[str, Any] = {}
    for interval in future_prices:
        starts_at = datetime.fromisoformat(interval["interval_start"])
        hour_key = starts_at.strftime("%Y-%m-%d %H")

        if hour_key not in hours:
            hours[hour_key] = {
                "hour": starts_at.hour,
                "day": interval["day"],
                "date": starts_at.date().isoformat(),
                "intervals": [],
                "min_price": None,
                "max_price": None,
                "avg_price": 0,
                "avg_rating": None,  # Initialize rating tracking
                "ratings_available": False,  # Track if any ratings are available
            }

        # Create interval data with both price and rating info
        interval_data = {
            "minute": starts_at.minute,
            "price": interval["price"],
            "price_minor": interval["price_minor"],
            "level": interval["level"],  # Price level from priceInfo
            "time": starts_at.strftime("%H:%M"),
        }

        # Add rating data if available
        if interval["rating"] is not None:
            interval_data["rating"] = interval["rating"]
            interval_data["rating_level"] = interval["rating_level"]
            hours[hour_key]["ratings_available"] = True

        hours[hour_key]["intervals"].append(interval_data)

        # Track min/max for the hour (averages are computed afterwards)
        price = interval["price"]
        if hours[hour_key]["min_price"] is None or price < hours[hour_key]["min_price"]:
            hours[hour_key]["min_price"] = price
        if hours[hour_key]["max_price"] is None or price > hours[hour_key]["max_price"]:
            hours[hour_key]["max_price"] = price

    # Calculate averages
    for hour_data in hours.values():
        prices = [interval["price"] for interval in hour_data["intervals"]]
        if prices:
            hour_data["avg_price"] = sum(prices) / len(prices)

        # Calculate average rating if ratings are available
        if hour_data["ratings_available"]:
            ratings = [interval.get("rating") for interval in hour_data["intervals"] if "rating" in interval]
            if ratings:
                hour_data["avg_rating"] = sum(ratings) / len(ratings)

    # Convert to list sorted by hour
    attributes["intervals_by_hour"] = [hour_data for _, hour_data in sorted(hours.items())]


def add_volatility_attributes(
    attributes: dict,
    cached_data: dict,
) -> None:
    """
    Add attributes for volatility sensors.

    Args:
        attributes: Dictionary to add attributes to
        cached_data: Dictionary containing cached sensor data

    """
    if cached_data.get("volatility_attributes"):
        attributes.update(cached_data["volatility_attributes"])
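The per-hour aggregation in `add_price_forecast_attributes` can be sketched independently of Home Assistant: bucket 15-minute intervals by hour, then compute min/max/avg per bucket (the `hour`/`price` input shape here is a simplified stand-in for the real interval dicts):

```python
from collections import defaultdict


def group_by_hour(intervals: list[dict]) -> dict[int, dict]:
    """Bucket intervals by hour and summarize each bucket's prices."""
    buckets: dict[int, list[float]] = defaultdict(list)
    for item in intervals:
        buckets[item["hour"]].append(item["price"])
    return {
        hour: {
            "min_price": min(prices),
            "max_price": max(prices),
            "avg_price": sum(prices) / len(prices),
        }
        for hour, prices in buckets.items()
    }


intervals = [
    {"hour": 10, "price": 0.20},
    {"hour": 10, "price": 0.30},
    {"hour": 11, "price": 0.25},
]
summary = group_by_hour(intervals)
```

The two-pass shape (collect, then summarize) mirrors the production code, which also appends intervals first and fills in averages in a second loop.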
def get_prices_for_volatility(
    volatility_type: str,
    price_info: dict,
) -> list[float]:
    """
    Get price list for volatility calculation based on type.

    Args:
        volatility_type: One of "today", "tomorrow", "next_24h", "today_tomorrow"
        price_info: Price information dictionary from coordinator data

    Returns:
        List of prices to analyze

    """
    if volatility_type == "today":
        return [float(p["total"]) for p in price_info.get("today", []) if "total" in p]

    if volatility_type == "tomorrow":
        return [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]

    if volatility_type == "next_24h":
        # Rolling 24h from now
        now = dt_util.now()
        end_time = now + timedelta(hours=24)
        prices = []

        for day_key in ["today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                starts_at = dt_util.parse_datetime(price_data.get("startsAt"))
                if starts_at is None:
                    continue
                starts_at = dt_util.as_local(starts_at)

                if now <= starts_at < end_time and "total" in price_data:
                    prices.append(float(price_data["total"]))
        return prices

    if volatility_type == "today_tomorrow":
        # Combined today + tomorrow
        prices = []
        for day_key in ["today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                if "total" in price_data:
                    prices.append(float(price_data["total"]))
        return prices

    return []


def add_volatility_type_attributes(
    volatility_attributes: dict,
    volatility_type: str,
    price_info: dict,
    thresholds: dict,
) -> None:
    """
    Add type-specific attributes for volatility sensors.

    Args:
        volatility_attributes: Dictionary to add type-specific attributes to
        volatility_type: Type of volatility calculation
        price_info: Price information dictionary from coordinator data
        thresholds: Volatility thresholds configuration

    """
    if volatility_type == "today_tomorrow":
        # Add breakdown for today vs tomorrow
        today_prices = [float(p["total"]) for p in price_info.get("today", []) if "total" in p]
        tomorrow_prices = [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]

        if today_prices:
            today_vol = calculate_volatility_level(today_prices, **thresholds)
            today_spread = (max(today_prices) - min(today_prices)) * 100
            volatility_attributes["today_spread"] = round(today_spread, 2)
            volatility_attributes["today_volatility"] = today_vol
            volatility_attributes["interval_count_today"] = len(today_prices)

        if tomorrow_prices:
            tomorrow_vol = calculate_volatility_level(tomorrow_prices, **thresholds)
            tomorrow_spread = (max(tomorrow_prices) - min(tomorrow_prices)) * 100
            volatility_attributes["tomorrow_spread"] = round(tomorrow_spread, 2)
            volatility_attributes["tomorrow_volatility"] = tomorrow_vol
            volatility_attributes["interval_count_tomorrow"] = len(tomorrow_prices)

    elif volatility_type == "next_24h":
        # Add time window info
        now = dt_util.now()
        volatility_attributes["timestamp"] = now.isoformat()
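The today/tomorrow breakdown above reduces each day's price list to a spread in minor currency units (cents/öre). The core arithmetic in isolation:

```python
def spread_minor(prices: list[float]) -> float:
    """Spread between max and min price, converted to minor units (×100) and rounded."""
    return round((max(prices) - min(prices)) * 100, 2)


spread = spread_minor([0.20, 0.35, 0.28])
```

Rounding after the ×100 conversion keeps floating-point residue (e.g. `14.999999…`) out of the exposed attribute values.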
def get_future_prices(
    coordinator: TibberPricesDataUpdateCoordinator,
    max_intervals: int | None = None,
) -> list[dict] | None:
    """
    Get future price data for multiple upcoming intervals.

    Args:
        coordinator: The data update coordinator
        max_intervals: Maximum number of future intervals to return

    Returns:
        List of upcoming price intervals with timestamps and prices

    """
    if not coordinator.data:
        return None

    price_info = coordinator.data.get("priceInfo", {})

    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = today_prices + tomorrow_prices

    if not all_prices:
        return None

    now = dt_util.now()

    # Initialize the result list
    future_prices = []

    # Track the maximum intervals to return
    intervals_to_return = MAX_FORECAST_INTERVALS if max_intervals is None else max_intervals

    for day_key in ["today", "tomorrow"]:
        for price_data in price_info.get(day_key, []):
            starts_at = dt_util.parse_datetime(price_data["startsAt"])
            if starts_at is None:
                continue

            starts_at = dt_util.as_local(starts_at)
            interval_end = starts_at + timedelta(minutes=MINUTES_PER_INTERVAL)

            if starts_at > now:
                future_prices.append(
                    {
                        "interval_start": starts_at.isoformat(),
                        "interval_end": interval_end.isoformat(),
                        "price": float(price_data["total"]),
                        "price_minor": round(float(price_data["total"]) * 100, 2),
                        "level": price_data.get("level", "NORMAL"),
                        "rating": price_data.get("difference", None),
                        "rating_level": price_data.get("rating_level"),
                        "day": day_key,
                    }
                )

    # Sort by start time
    future_prices.sort(key=lambda x: x["interval_start"])

    # Limit to the requested number of intervals
    return future_prices[:intervals_to_return] if future_prices else None


def get_current_interval_data(
    coordinator: TibberPricesDataUpdateCoordinator,
) -> dict | None:
    """
    Get the current interval data from coordinator.

    Args:
        coordinator: The data update coordinator

    Returns:
        Current interval data dictionary or None

    """
    if not coordinator.data:
        return None

    price_info = coordinator.data.get("priceInfo", {})
    now = dt_util.now()
    return find_price_data_for_interval(price_info, now)
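`get_future_prices` is a filter-sort-truncate pipeline. The same shape, stripped of the coordinator and timezone handling (the `future_intervals` helper and sample data are illustrative):

```python
from datetime import datetime


def future_intervals(all_prices: list[dict], now: datetime, limit: int) -> list[dict]:
    """Keep intervals strictly after `now`, sorted by start time, capped at `limit`."""
    future = [p for p in all_prices if datetime.fromisoformat(p["startsAt"]) > now]
    future.sort(key=lambda p: p["startsAt"])  # ISO-8601 strings sort chronologically
    return future[:limit]


prices = [
    {"startsAt": "2025-11-17T10:30:00", "total": 0.19},
    {"startsAt": "2025-11-17T10:00:00", "total": 0.21},
    {"startsAt": "2025-11-17T10:15:00", "total": 0.24},
]
now = datetime.fromisoformat("2025-11-17T10:05:00")
upcoming = future_intervals(prices, now, limit=2)
```

Sorting on the ISO timestamp string works because same-offset ISO-8601 strings order lexicographically the same as chronologically, which is also why the production code can sort on `interval_start` directly.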
1035  custom_components/tibber_prices/sensor/core.py  Normal file
File diff suppressed because it is too large

564  custom_components/tibber_prices/sensor/definitions.py  Normal file
@@ -0,0 +1,564 @@
"""
Sensor entity definitions for Tibber Prices.

This module contains all SensorEntityDescription definitions organized by
calculation method. Sensor definitions are declarative and independent of
the implementation logic.

Organization by calculation pattern:
1. Interval-based: Time offset from current interval
2. Rolling hour: 5-interval aggregation windows
3. Daily statistics: Calendar day min/max/avg
4. 24h windows: Trailing/leading statistics
5. Future forecast: N-hour windows from next interval
6. Volatility: Price variation analysis
7. Diagnostic: System metadata
"""

from __future__ import annotations

from homeassistant.components.sensor import (
    SensorDeviceClass,
    SensorEntityDescription,
)
from homeassistant.const import EntityCategory

# ============================================================================
# SENSOR DEFINITIONS - Grouped by calculation method
# ============================================================================
#
# Sensors are organized by HOW they calculate values, not WHAT they display.
# This groups sensors that share common logic and enables code reuse through
# unified handler methods.
#
# Calculation patterns:
# 1. Interval-based: Use time offset from current interval
# 2. Rolling hour: Aggregate 5-interval window (2 before + center + 2 after)
# 3. Daily statistics: Min/max/avg within calendar day boundaries
# 4. 24h windows: Trailing/leading from current interval
# 5. Future forecast: N-hour windows starting from next interval
# 6. Volatility: Statistical analysis of price variation
# 7. Diagnostic: System information and metadata
# ============================================================================

# ----------------------------------------------------------------------------
# 1. INTERVAL-BASED SENSORS (offset: -1, 0, +1 from current interval)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# All use find_price_data_for_interval() with time offset
|
||||||
|
# Shared handler: _get_interval_value(interval_offset, value_type)
|
||||||
|
|
||||||
|
INTERVAL_PRICE_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_interval_price",
|
||||||
|
translation_key="current_interval_price",
|
||||||
|
name="Current Electricity Price",
|
||||||
|
icon="mdi:cash", # Dynamic: will show cash-multiple/plus/cash/minus/remove based on level
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=2,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_interval_price",
|
||||||
|
translation_key="next_interval_price",
|
||||||
|
name="Next Price",
|
||||||
|
icon="mdi:cash-fast", # Static: motion lines indicate "coming soon"
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=2,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="previous_interval_price",
|
||||||
|
translation_key="previous_interval_price",
|
||||||
|
name="Previous Electricity Price",
|
||||||
|
icon="mdi:cash-refund", # Static: arrow back indicates "past"
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
suggested_display_precision=2,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# NOTE: Enum options are defined inline (not imported from const.py) to avoid
|
||||||
|
# import timing issues with Home Assistant's entity platform initialization.
|
||||||
|
# Keep in sync with PRICE_LEVEL_OPTIONS in const.py!
|
||||||
|
INTERVAL_LEVEL_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_interval_price_level",
|
||||||
|
translation_key="current_interval_price_level",
|
||||||
|
name="Current Price Level",
|
||||||
|
icon="mdi:gauge",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["very_cheap", "cheap", "normal", "expensive", "very_expensive"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_interval_price_level",
|
||||||
|
translation_key="next_interval_price_level",
|
||||||
|
name="Next Price Level",
|
||||||
|
icon="mdi:gauge-empty",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["very_cheap", "cheap", "normal", "expensive", "very_expensive"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="previous_interval_price_level",
|
||||||
|
translation_key="previous_interval_price_level",
|
||||||
|
name="Previous Price Level",
|
||||||
|
icon="mdi:gauge-empty",
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["very_cheap", "cheap", "normal", "expensive", "very_expensive"],
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# NOTE: Enum options are defined inline (not imported from const.py) to avoid
|
||||||
|
# import timing issues with Home Assistant's entity platform initialization.
|
||||||
|
# Keep in sync with PRICE_RATING_OPTIONS in const.py!
|
||||||
|
INTERVAL_RATING_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_interval_price_rating",
|
||||||
|
translation_key="current_interval_price_rating",
|
||||||
|
name="Current Price Rating",
|
||||||
|
icon="mdi:star-outline",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "normal", "high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_interval_price_rating",
|
||||||
|
translation_key="next_interval_price_rating",
|
||||||
|
name="Next Price Rating",
|
||||||
|
icon="mdi:star-half-full",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "normal", "high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="previous_interval_price_rating",
|
||||||
|
translation_key="previous_interval_price_rating",
|
||||||
|
name="Previous Price Rating",
|
||||||
|
icon="mdi:star-half-full",
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "normal", "high"],
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 2. ROLLING HOUR SENSORS (5-interval window: 2 before + center + 2 after)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# All aggregate data from rolling 5-interval window around a specific hour
|
||||||
|
# Shared handler: _get_rolling_hour_value(hour_offset, value_type)
|
||||||
|
|
||||||
|
ROLLING_HOUR_PRICE_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_hour_average",
|
||||||
|
translation_key="current_hour_average",
|
||||||
|
name="Current Hour Average Price",
|
||||||
|
icon="mdi:cash", # Dynamic: will show cash-multiple/plus/cash/minus/remove based on level
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_hour_average",
|
||||||
|
translation_key="next_hour_average",
|
||||||
|
name="Next Hour Average Price",
|
||||||
|
icon="mdi:clock-fast", # Static: clock indicates "next time period"
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# NOTE: Enum options are defined inline (not imported from const.py) to avoid
|
||||||
|
# import timing issues with Home Assistant's entity platform initialization.
|
||||||
|
# Keep in sync with PRICE_LEVEL_OPTIONS in const.py!
|
||||||
|
ROLLING_HOUR_LEVEL_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_hour_price_level",
|
||||||
|
translation_key="current_hour_price_level",
|
||||||
|
name="Current Hour Price Level",
|
||||||
|
icon="mdi:gauge",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["very_cheap", "cheap", "normal", "expensive", "very_expensive"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_hour_price_level",
|
||||||
|
translation_key="next_hour_price_level",
|
||||||
|
name="Next Hour Price Level",
|
||||||
|
icon="mdi:gauge-empty",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["very_cheap", "cheap", "normal", "expensive", "very_expensive"],
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# NOTE: Enum options are defined inline (not imported from const.py) to avoid
|
||||||
|
# import timing issues with Home Assistant's entity platform initialization.
|
||||||
|
# Keep in sync with PRICE_RATING_OPTIONS in const.py!
|
||||||
|
ROLLING_HOUR_RATING_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="current_hour_price_rating",
|
||||||
|
translation_key="current_hour_price_rating",
|
||||||
|
name="Current Hour Price Rating",
|
||||||
|
icon="mdi:star-outline",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "normal", "high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_hour_price_rating",
|
||||||
|
translation_key="next_hour_price_rating",
|
||||||
|
name="Next Hour Price Rating",
|
||||||
|
icon="mdi:star-half-full",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "normal", "high"],
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 3. DAILY STATISTICS SENSORS (min/max/avg for calendar day boundaries)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# Calculate statistics for specific calendar days (today/tomorrow)
|
||||||
|
|
||||||
|
DAILY_STAT_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="lowest_price_today",
|
||||||
|
translation_key="lowest_price_today",
|
||||||
|
name="Today's Lowest Price",
|
||||||
|
icon="mdi:arrow-collapse-down",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="highest_price_today",
|
||||||
|
translation_key="highest_price_today",
|
||||||
|
name="Today's Highest Price",
|
||||||
|
icon="mdi:arrow-collapse-up",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="average_price_today",
|
||||||
|
translation_key="average_price_today",
|
||||||
|
name="Today's Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="lowest_price_tomorrow",
|
||||||
|
translation_key="lowest_price_tomorrow",
|
||||||
|
name="Tomorrow's Lowest Price",
|
||||||
|
icon="mdi:arrow-collapse-down",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="highest_price_tomorrow",
|
||||||
|
translation_key="highest_price_tomorrow",
|
||||||
|
name="Tomorrow's Highest Price",
|
||||||
|
icon="mdi:arrow-collapse-up",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="average_price_tomorrow",
|
||||||
|
translation_key="average_price_tomorrow",
|
||||||
|
name="Tomorrow's Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 4. 24H WINDOW SENSORS (trailing/leading from current interval)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# Calculate statistics over sliding 24-hour windows
|
||||||
|
|
||||||
|
WINDOW_24H_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="trailing_price_average",
|
||||||
|
translation_key="trailing_price_average",
|
||||||
|
name="Trailing 24h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="leading_price_average",
|
||||||
|
translation_key="leading_price_average",
|
||||||
|
name="Leading 24h Average Price",
|
||||||
|
icon="mdi:chart-line-variant",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="trailing_price_min",
|
||||||
|
translation_key="trailing_price_min",
|
||||||
|
name="Trailing 24h Minimum Price",
|
||||||
|
icon="mdi:arrow-collapse-down",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="trailing_price_max",
|
||||||
|
translation_key="trailing_price_max",
|
||||||
|
name="Trailing 24h Maximum Price",
|
||||||
|
icon="mdi:arrow-collapse-up",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="leading_price_min",
|
||||||
|
translation_key="leading_price_min",
|
||||||
|
name="Leading 24h Minimum Price",
|
||||||
|
icon="mdi:arrow-collapse-down",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="leading_price_max",
|
||||||
|
translation_key="leading_price_max",
|
||||||
|
name="Leading 24h Maximum Price",
|
||||||
|
icon="mdi:arrow-collapse-up",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 5. FUTURE FORECAST SENSORS (N-hour windows starting from next interval)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# Calculate averages and trends for upcoming time windows
|
||||||
|
|
||||||
|
FUTURE_AVG_SENSORS = (
|
||||||
|
# Default enabled: 1h-5h
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_1h",
|
||||||
|
translation_key="next_avg_1h",
|
||||||
|
name="Next 1h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_2h",
|
||||||
|
translation_key="next_avg_2h",
|
||||||
|
name="Next 2h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_3h",
|
||||||
|
translation_key="next_avg_3h",
|
||||||
|
name="Next 3h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_4h",
|
||||||
|
translation_key="next_avg_4h",
|
||||||
|
name="Next 4h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_5h",
|
||||||
|
translation_key="next_avg_5h",
|
||||||
|
name="Next 5h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
# Disabled by default: 6h, 8h, 12h (advanced use cases)
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_6h",
|
||||||
|
translation_key="next_avg_6h",
|
||||||
|
name="Next 6h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_8h",
|
||||||
|
translation_key="next_avg_8h",
|
||||||
|
name="Next 8h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_avg_12h",
|
||||||
|
translation_key="next_avg_12h",
|
||||||
|
name="Next 12h Average Price",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
device_class=SensorDeviceClass.MONETARY,
|
||||||
|
suggested_display_precision=1,
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
FUTURE_TREND_SENSORS = (
|
||||||
|
# Default enabled: 1h-5h
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_1h",
|
||||||
|
translation_key="price_trend_1h",
|
||||||
|
name="Price Trend (1h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_2h",
|
||||||
|
translation_key="price_trend_2h",
|
||||||
|
name="Price Trend (2h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_3h",
|
||||||
|
translation_key="price_trend_3h",
|
||||||
|
name="Price Trend (3h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_4h",
|
||||||
|
translation_key="price_trend_4h",
|
||||||
|
name="Price Trend (4h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_5h",
|
||||||
|
translation_key="price_trend_5h",
|
||||||
|
name="Price Trend (5h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=True,
|
||||||
|
),
|
||||||
|
# Disabled by default: 6h, 8h, 12h
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_6h",
|
||||||
|
translation_key="price_trend_6h",
|
||||||
|
name="Price Trend (6h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_8h",
|
||||||
|
translation_key="price_trend_8h",
|
||||||
|
name="Price Trend (8h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_trend_12h",
|
||||||
|
translation_key="price_trend_12h",
|
||||||
|
name="Price Trend (12h)",
|
||||||
|
icon="mdi:trending-up",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["rising", "falling", "stable"],
|
||||||
|
entity_registry_enabled_default=False,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 6. VOLATILITY SENSORS (coefficient of variation analysis)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# NOTE: Enum options are defined inline (not imported from const.py) to avoid
|
||||||
|
# import timing issues with Home Assistant's entity platform initialization.
|
||||||
|
# Keep in sync with VOLATILITY_OPTIONS in const.py!
|
||||||
|
|
||||||
|
VOLATILITY_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="today_volatility",
|
||||||
|
translation_key="today_volatility",
|
||||||
|
name="Today's Price Volatility",
|
||||||
|
icon="mdi:chart-bell-curve-cumulative",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "moderate", "high", "very_high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="tomorrow_volatility",
|
||||||
|
translation_key="tomorrow_volatility",
|
||||||
|
name="Tomorrow's Price Volatility",
|
||||||
|
icon="mdi:chart-bell-curve-cumulative",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "moderate", "high", "very_high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="next_24h_volatility",
|
||||||
|
translation_key="next_24h_volatility",
|
||||||
|
name="Next 24h Price Volatility",
|
||||||
|
icon="mdi:chart-bell-curve-cumulative",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "moderate", "high", "very_high"],
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="today_tomorrow_volatility",
|
||||||
|
translation_key="today_tomorrow_volatility",
|
||||||
|
name="Today + Tomorrow Price Volatility",
|
||||||
|
icon="mdi:chart-bell-curve-cumulative",
|
||||||
|
device_class=SensorDeviceClass.ENUM,
|
||||||
|
options=["low", "moderate", "high", "very_high"],
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# 7. DIAGNOSTIC SENSORS (data availability and metadata)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
DIAGNOSTIC_SENSORS = (
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="data_timestamp",
|
||||||
|
translation_key="data_timestamp",
|
||||||
|
name="Data Expiration",
|
||||||
|
icon="mdi:clock-check",
|
||||||
|
device_class=SensorDeviceClass.TIMESTAMP,
|
||||||
|
entity_category=EntityCategory.DIAGNOSTIC,
|
||||||
|
),
|
||||||
|
SensorEntityDescription(
|
||||||
|
key="price_forecast",
|
||||||
|
translation_key="price_forecast",
|
||||||
|
name="Price Forecast",
|
||||||
|
icon="mdi:chart-line",
|
||||||
|
entity_category=EntityCategory.DIAGNOSTIC,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# COMBINED SENSOR DEFINITIONS
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
ENTITY_DESCRIPTIONS = (
|
||||||
|
*INTERVAL_PRICE_SENSORS,
|
||||||
|
*INTERVAL_LEVEL_SENSORS,
|
||||||
|
*INTERVAL_RATING_SENSORS,
|
||||||
|
*ROLLING_HOUR_PRICE_SENSORS,
|
||||||
|
*ROLLING_HOUR_LEVEL_SENSORS,
|
||||||
|
*ROLLING_HOUR_RATING_SENSORS,
|
||||||
|
*DAILY_STAT_SENSORS,
|
||||||
|
*WINDOW_24H_SENSORS,
|
||||||
|
*FUTURE_AVG_SENSORS,
|
||||||
|
*FUTURE_TREND_SENSORS,
|
||||||
|
*VOLATILITY_SENSORS,
|
||||||
|
*DIAGNOSTIC_SENSORS,
|
||||||
|
)
|
||||||
custom_components/tibber_prices/sensor/helpers.py (new file, 178 lines)
@@ -0,0 +1,178 @@
"""Helper functions for sensor platform."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import timedelta
|
||||||
|
from typing import TYPE_CHECKING
|
||||||
|
|
||||||
|
from custom_components.tibber_prices.const import get_price_level_translation
|
||||||
|
from custom_components.tibber_prices.price_utils import (
|
||||||
|
aggregate_price_levels,
|
||||||
|
aggregate_price_rating,
|
||||||
|
)
|
||||||
|
from homeassistant.util import dt as dt_util
|
||||||
|
|
||||||
|
if TYPE_CHECKING:
|
||||||
|
from datetime import datetime
|
||||||
|
|
||||||
|
from homeassistant.core import HomeAssistant
|
||||||
|
|
||||||
|
|
||||||
|
def aggregate_price_data(window_data: list[dict]) -> float | None:
|
||||||
|
"""
|
||||||
|
Calculate average price from window data.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
window_data: List of price interval dictionaries with 'total' key
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Average price in minor currency units (cents/øre), or None if no prices
|
||||||
|
|
||||||
|
"""
|
||||||
|
prices = [float(i["total"]) for i in window_data if "total" in i]
|
||||||
|
if not prices:
|
||||||
|
return None
|
||||||
|
# Return in minor currency units (cents/øre)
|
||||||
|
return round((sum(prices) / len(prices)) * 100, 2)
|
||||||
|
|
||||||
|
|
||||||
|
def aggregate_level_data(window_data: list[dict]) -> str | None:
|
||||||
|
"""
|
||||||
|
Aggregate price levels from window data.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
window_data: List of price interval dictionaries with 'level' key
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Aggregated price level (lowercase), or None if no levels
|
||||||
|
|
||||||
|
"""
|
||||||
|
levels = [i["level"] for i in window_data if "level" in i]
|
||||||
|
if not levels:
|
||||||
|
return None
|
||||||
|
aggregated = aggregate_price_levels(levels)
|
||||||
|
return aggregated.lower() if aggregated else None
|
||||||
|
|
||||||
|
|
||||||
|
def aggregate_rating_data(
|
||||||
|
window_data: list[dict],
|
||||||
|
threshold_low: float,
|
||||||
|
threshold_high: float,
|
||||||
|
) -> str | None:
|
||||||
|
"""
|
||||||
|
Aggregate price ratings from window data.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
window_data: List of price interval dictionaries with 'difference' and 'rating_level'
|
||||||
|
threshold_low: Low threshold for rating calculation
|
||||||
|
threshold_high: High threshold for rating calculation
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Aggregated price rating (lowercase), or None if no ratings
|
||||||
|
|
||||||
|
"""
|
||||||
|
differences = [i["difference"] for i in window_data if "difference" in i and "rating_level" in i]
|
||||||
|
if not differences:
|
||||||
|
return None
|
||||||
|
|
||||||
|
aggregated, _ = aggregate_price_rating(differences, threshold_low, threshold_high)
|
||||||
|
return aggregated.lower() if aggregated else None
|
||||||
|
|
||||||
|
|
||||||
|
def find_rolling_hour_center_index(
|
||||||
|
all_prices: list[dict],
|
||||||
|
current_time: datetime,
|
||||||
|
hour_offset: int,
|
||||||
|
) -> int | None:
|
||||||
|
"""
|
||||||
|
Find the center index for the rolling hour window.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
all_prices: List of all price interval dictionaries with 'startsAt' key
|
||||||
|
current_time: Current datetime to find the current interval
|
||||||
|
hour_offset: Number of hours to offset from current interval (can be negative)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Index of the center interval for the rolling hour window, or None if not found
|
||||||
|
|
||||||
|
"""
|
||||||
|
current_idx = None
|
||||||
|
|
||||||
|
for idx, price_data in enumerate(all_prices):
|
||||||
|
starts_at = dt_util.parse_datetime(price_data["startsAt"])
|
||||||
|
if starts_at is None:
|
||||||
|
continue
|
||||||
|
starts_at = dt_util.as_local(starts_at)
|
||||||
|
interval_end = starts_at + timedelta(minutes=15)
|
||||||
|
|
||||||
|
if starts_at <= current_time < interval_end:
|
||||||
|
current_idx = idx
|
||||||
|
break
|
||||||
|
|
||||||
|
if current_idx is None:
|
||||||
|
return None
|
||||||
|
|
||||||
|
return current_idx + (hour_offset * 4)
|
||||||
|
|
||||||
|
|
||||||
|
def translate_level(hass: HomeAssistant, level: str) -> str:
|
||||||
|
"""
|
||||||
|
Translate price level to the user's language.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
hass: HomeAssistant instance for language configuration
|
||||||
|
level: Price level to translate (e.g., VERY_CHEAP, NORMAL, etc.)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Translated level string, or original level if translation not found
|
||||||
|
|
||||||
|
"""
|
||||||
|
if not hass:
|
||||||
|
return level
|
||||||
|
|
||||||
|
language = hass.config.language or "en"
|
||||||
|
translated = get_price_level_translation(level, language)
|
||||||
|
if translated:
|
||||||
|
return translated
|
||||||
|
|
||||||
|
if language != "en":
|
||||||
|
fallback = get_price_level_translation(level, "en")
|
||||||
|
if fallback:
|
||||||
|
return fallback
|
||||||
|
|
||||||
|
return level
|
||||||
|
|
||||||
|
|
||||||
|
def translate_rating_level(rating: str) -> str:
|
||||||
|
"""
|
||||||
|
Translate price rating level to the user's language.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
rating: Price rating to translate (e.g., LOW, NORMAL, HIGH)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Translated rating string, or original rating if translation not found
|
||||||
|
|
||||||
|
Note:
|
||||||
|
Currently returns the rating as-is. Translation mapping for ratings
|
||||||
|
can be added here when needed, similar to translate_level().
|
||||||
|
|
||||||
|
"""
|
||||||
|
# For now, ratings are returned as-is
|
||||||
|
# Add translation mapping here when needed
|
||||||
|
return rating
|
||||||
|
|
||||||
|
|
||||||
|
def get_price_value(price: float, *, in_euro: bool) -> float:
|
||||||
|
"""
|
||||||
|
Convert price based on unit.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
price: Price value to convert
|
||||||
|
in_euro: If True, return price in euros; if False, return in cents/øre
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Price in requested unit (euros or minor currency units)
|
||||||
|
|
||||||
|
"""
|
||||||
|
return price if in_euro else round((price * 100), 2)
|
||||||
|
|
@@ -4,21 +4,21 @@ This section contains documentation for contributors and maintainers of the Tibb

## 📚 Developer Guides

- **[Setup](setup.md)** - DevContainer, environment setup, and dependencies
- **[Architecture](architecture.md)** - Code structure, patterns, and conventions
- **[Testing](testing.md)** - How to run tests and write new test cases
- **[Release Management](release-management.md)** - Release workflow and versioning process
- **[Coding Guidelines](coding-guidelines.md)** - Style guide, linting, and best practices

## 🤖 AI Documentation

The main AI/Copilot documentation is in [`AGENTS.md`](../../AGENTS.md). This file serves as long-term memory for AI assistants and contains:

- Detailed architectural patterns
- Code quality rules and conventions
- Development workflow guidance
- Common pitfalls and anti-patterns
- Project-specific patterns and utilities

**Important:** When proposing changes to patterns or conventions, always update [`AGENTS.md`](../../AGENTS.md) to keep AI guidance consistent.
@@ -26,29 +26,32 @@ The main AI/Copilot documentation is in [`AGENTS.md`](../../AGENTS.md). This fil

This integration is developed with extensive AI assistance (GitHub Copilot, Claude, and other AI tools). The AI handles:

- **Pattern Recognition**: Understanding and applying Home Assistant best practices
- **Code Generation**: Implementing features with proper type hints, error handling, and documentation
- **Refactoring**: Maintaining consistency across the codebase during structural changes
- **Translation Management**: Keeping 5 language files synchronized
- **Documentation**: Generating and maintaining comprehensive documentation

**Quality Assurance:**
+
- Automated linting with Ruff (120-char line length, max complexity 25)
- Home Assistant's type checking and validation
- Real-world testing in development environment
- Code review by maintainer before merging

**Benefits:**
+
- Rapid feature development while maintaining quality
- Consistent code patterns across all modules
- Comprehensive documentation maintained alongside code
- Quick bug fixes with proper understanding of context

**Limitations:**
+
- AI may occasionally miss edge cases or subtle bugs
- Some complex Home Assistant patterns may need human review
- Translation quality depends on AI's understanding of target language
- User feedback is crucial for discovering real-world issues

If you're working with AI tools on this project, the [`AGENTS.md`](../../AGENTS.md) file provides the context and patterns that ensure consistency.
@@ -68,13 +71,13 @@ If you're working with AI tools on this project, the [`AGENTS.md`](../../AGENTS.

The project includes several helper scripts in `./scripts/`:

- `bootstrap` - Initial setup of dependencies
- `develop` - Start Home Assistant in debug mode
- `lint` - Auto-fix code issues with ruff
- `lint-check` - Check code without modifications (CI mode)
- `setup` - Install development tools (git-cliff, @github/copilot)
- `prepare-release` - Prepare a new release (bump version, create tag)
- `generate-release-notes` - Generate release notes from commits

## 📦 Project Structure

@@ -85,8 +88,17 @@ custom_components/tibber_prices/
├── api.py               # Tibber GraphQL API client
├── price_utils.py       # Price enrichment functions
├── average_utils.py     # Average calculation utilities
-├── sensor.py            # Sensor platform
+├── sensor/              # Sensor platform (package)
+│   ├── __init__.py      # Platform setup
+│   ├── core.py          # TibberPricesSensor class
+│   ├── definitions.py   # Entity descriptions
+│   ├── helpers.py       # Pure helper functions
+│   └── attributes.py    # Attribute builders
├── binary_sensor.py     # Binary sensor platform
+├── entity_utils/        # Shared entity helpers
+│   ├── icons.py         # Icon mapping logic
+│   ├── colors.py        # Color mapping logic
+│   └── attributes.py    # Common attribute builders
├── services.py          # Custom services
├── config_flow.py       # UI configuration flow
├── const.py             # Constants and helpers
@@ -97,21 +109,24 @@ custom_components/tibber_prices/

## 🔍 Key Concepts

**DataUpdateCoordinator Pattern:**
+
- Centralized data fetching and caching
- Automatic entity updates on data changes
- Persistent storage via `Store`
- Quarter-hour boundary refresh scheduling

**Price Data Enrichment:**
+
- Raw API data is enriched with statistical analysis
- Trailing/leading 24h averages calculated per interval
- Price differences and ratings added
- All via pure functions in `price_utils.py`

**Translation System:**
+
- Dual system: `/translations/` (HA schema) + `/custom_translations/` (extended)
- Both must stay in sync across all languages (de, en, nb, nl, sv)
- Async loading at integration setup

## 🧪 Testing

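The trailing-24h enrichment described in this hunk can be sketched as a pure function over quarter-hourly intervals. This is a simplified, self-contained illustration, not the integration's actual `price_utils.py` code; the function and key names are assumptions:

```python
from statistics import mean

INTERVALS_PER_24H = 96  # quarter-hourly resolution: 4 intervals/hour * 24 h


def with_trailing_avg(prices: list[float]) -> list[dict[str, float]]:
    """Attach the mean of the trailing 24h window to each interval.

    Early intervals use a shorter window until 24h of history exists.
    """
    enriched = []
    for i, price in enumerate(prices):
        window = prices[max(0, i - INTERVALS_PER_24H + 1) : i + 1]
        enriched.append({"price": price, "trailing_avg": round(mean(window), 4)})
    return enriched


# Three intervals: trailing averages grow as the window fills.
result = with_trailing_avg([1.0, 2.0, 3.0])
```

Keeping enrichment in pure functions like this makes it trivially unit-testable without a running Home Assistant instance, which is the point of the `price_utils.py` design.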
@@ -128,11 +143,11 @@ pytest --cov=custom_components.tibber_prices tests/

## 📝 Documentation Standards

- **User-facing docs** go in `docs/user/`
- **Developer docs** go in `docs/development/`
- **AI guidance** goes in `AGENTS.md`
- Use clear examples and code snippets
- Keep docs up-to-date with code changes

## 🤝 Contributing

@@ -201,7 +201,7 @@ generate_with_copilot() {
   - Bug fixes that users experience (translations, api, coordinator, etc.)

8. **Understanding the file paths (use this to assess importance):**
-   - custom_components/tibber_prices/sensor.py = User-facing sensors (HIGH priority for title)
+   - custom_components/tibber_prices/sensor/ = User-facing sensors (HIGH priority for title)
   - custom_components/tibber_prices/binary_sensor.py = User-facing binary sensors (HIGH priority)
   - custom_components/tibber_prices/config_flow.py = User-facing configuration (HIGH priority)
   - custom_components/tibber_prices/services.py = User-facing services (HIGH priority)