mirror of
https://github.com/jpawlowski/hass.tibber_prices.git
synced 2026-03-29 21:03:40 +00:00
refactor(sensor): implement Calculator Pattern with specialized modules
Massive refactoring of the sensor platform, reducing core.py from 2,170 to 909 lines (58% reduction). Extracted business logic into specialized calculators and attribute builders, following separation-of-concerns principles.

Changes:

- Created sensor/calculators/ package (8 specialized calculators, 1,838 lines):
  * base.py: Abstract BaseCalculator with coordinator access
  * interval.py: Single interval calculations (current/next/previous)
  * rolling_hour.py: 5-interval rolling windows
  * daily_stat.py: Calendar day min/max/avg statistics
  * window_24h.py: Trailing/leading 24h windows
  * volatility.py: Price volatility analysis
  * trend.py: Complex trend analysis with caching (640 lines)
  * timing.py: Best/peak price period timing
  * metadata.py: Home/metering metadata
- Created sensor/attributes/ package (8 specialized modules, 1,209 lines):
  * Modules match calculator types for consistent organization
  * __init__.py: Routing logic + unified builders
  * Handles state presentation separately from business logic
- Created sensor/chart_data.py (144 lines):
  * Extracted chart data export functionality from the entity class
  * YAML parsing, service calls, metadata formatting
- Created sensor/value_getters.py (276 lines):
  * Centralized handler mapping for all 80+ sensor types
  * Single source of truth for sensor routing
- Extended sensor/helpers.py (+88 lines):
  * Added aggregate_window_data() unified aggregator
  * Added get_hourly_price_value() for backward compatibility
  * Consolidated sensor-specific helper functions
- Refactored sensor/core.py (909 lines, was 2,170):
  * Instantiates all calculators in __init__
  * Delegates value calculations to the appropriate calculator
  * Uses unified handler methods via the value_getters mapping
  * Minimal platform-specific logic remains (icon callbacks, entity lifecycle)
- Deleted sensor/attributes.py (1,106 lines):
  * Functionality split into the attributes/ package (8 modules)
- Updated AGENTS.md:
  * Documented the Calculator Pattern architecture
  * Added guidance for adding new sensors with calculation groups
  * Updated file organization with the new package structure

Architecture Benefits:

- Clear separation: Calculators (business logic) vs Attributes (presentation)
- Improved testability: each calculator is independently testable
- Better maintainability: 21 focused modules vs one monolithic file
- Easy extensibility: add sensors by choosing a calculation pattern
- Reusable components: calculators and attribute builders shared across sensors

Impact: Significantly improved code organization and maintainability while preserving all functionality. All 80+ sensor types continue working with a cleaner, more modular architecture. Developer experience is improved by the logical file structure and clear separation of concerns.
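The commit message names the pattern but not its shape. A minimal sketch of the Calculator Pattern it describes might look like the following; only BaseCalculator and the idea of coordinator access come from the commit, while the data layout, sensor keys, and the `Sensor` wrapper are hypothetical:

```python
from __future__ import annotations

from abc import ABC, abstractmethod
from typing import Any


class BaseCalculator(ABC):
    """Base class: every calculator gets coordinator access and nothing more."""

    def __init__(self, coordinator: Any) -> None:
        self.coordinator = coordinator

    @abstractmethod
    def value(self, key: str) -> Any:
        """Compute the native value for a sensor key."""


class IntervalCalculator(BaseCalculator):
    """Handles current/next/previous interval lookups (hypothetical layout)."""

    OFFSETS = {
        "previous_interval_price": -1,
        "current_interval_price": 0,
        "next_interval_price": 1,
    }

    def value(self, key: str) -> float | None:
        intervals = self.coordinator.data["today"]
        idx = self.coordinator.current_index + self.OFFSETS[key]
        if 0 <= idx < len(intervals):
            return intervals[idx]["total"]
        return None


class Sensor:
    """Entity delegates to calculators via a handler mapping, as in the refactor."""

    def __init__(self, coordinator: Any) -> None:
        interval = IntervalCalculator(coordinator)
        # key -> bound handler; the entity keeps no calculation logic itself
        self._handlers = {key: interval.value for key in IntervalCalculator.OFFSETS}

    def native_value(self, key: str) -> float | None:
        return self._handlers[key](key)
```

Adding a new sensor then means picking (or writing) a calculator and registering its key in the mapping, which is the extensibility benefit the commit claims.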
parent df075ae56a
commit a962289682
25 changed files with 3698 additions and 2449 deletions
AGENTS.md (58 changed lines)
@@ -5,7 +5,7 @@ This is a **Home Assistant custom component** for Tibber electricity price data,
 ## Documentation Metadata
 
 - **Last Major Update**: 2025-11-18
-- **Last Architecture Review**: 2025-11-18 (Created /utils/ package, moved average_utils.py→utils/average.py and price_utils.py→utils/price.py. Added file organization policy to prevent root clutter.)
+- **Last Architecture Review**: 2025-11-18 (Completed sensor/core.py refactoring: Calculator Pattern implementation with 8 specialized calculators and 8 attribute modules. Reduced core.py from 2,170 → 1,268 lines (42% reduction). Total 3,047 lines extracted to specialized packages.)
 - **Last Code Example Cleanup**: 2025-11-18 (Removed redundant implementation details from AGENTS.md, added guidelines for when to include code examples)
 - **Documentation Status**: ✅ Current (verified against codebase)
 
@@ -337,10 +337,13 @@ After successful refactoring:
   - **Pattern**: Coordinator-specific implementations
 
 4. **`/sensor/`** - Sensor platform package
-   - `core.py` - Entity class
+   - `core.py` - Entity class (1,268 lines - manages 80+ sensor types)
    - `definitions.py` - Entity descriptions
-   - `attributes.py` - Attribute builders
    - `helpers.py` - Sensor-specific helpers
+   - `calculators/` - Value calculation package (8 specialized calculators, 1,838 lines)
+   - `attributes/` - Attribute builders package (8 specialized modules, 1,209 lines)
+   - **Pattern**: Calculator Pattern (business logic separated from presentation)
+   - **Architecture**: Two-tier (Calculators handle computation → Attributes handle state presentation)
 
 5. **`/binary_sensor/`** - Binary sensor platform package
    - Same structure as `/sensor/`
@@ -393,11 +396,26 @@ After successful refactoring:
 See `config_flow/schemas.py` for implementation examples.
 
 - **Price data enrichment**: All quarter-hourly price intervals get augmented with `trailing_avg_24h`, `difference`, and `rating_level` fields via `enrich_price_info_with_differences()` in `utils/price.py`. This adds statistical analysis (24h trailing average, percentage difference from average, rating classification) to each 15-minute interval. See `utils/price.py` for enrichment logic.
-- **Sensor organization (refactored Nov 2025)**: The `sensor/` package is organized by **calculation method** rather than feature type, enabling code reuse through unified handler methods:
-  - **Interval-based sensors**: Use `_get_interval_value(interval_offset, value_type)` for current/next/previous interval data
-  - **Rolling hour sensors**: Use `_get_rolling_hour_value(hour_offset, value_type)` for 5-interval windows
-  - **Daily statistics**: Use `_get_daily_stat_value(day, stat_func)` for calendar day min/max/avg
-  - **24h windows**: Use `_get_24h_window_value(stat_func)` for trailing/leading statistics
+- **Sensor organization (refactored Nov 2025)**: The `sensor/` package uses **Calculator Pattern** for separation of concerns:
+  - **Calculator Package** (`sensor/calculators/`): 8 specialized calculators handle business logic (1,838 lines total)
+    - `base.py` - Abstract BaseCalculator with coordinator access
+    - `interval.py` - Single interval calculations (current/next/previous)
+    - `rolling_hour.py` - 5-interval rolling windows
+    - `daily_stat.py` - Calendar day min/max/avg statistics
+    - `window_24h.py` - Trailing/leading 24h windows
+    - `volatility.py` - Price volatility analysis
+    - `trend.py` - Complex trend analysis with caching (640 lines)
+    - `timing.py` - Best/peak price period timing
+    - `metadata.py` - Home/metering metadata
+  - **Attributes Package** (`sensor/attributes/`): 8 specialized modules handle state presentation (1,209 lines total)
+    - Modules match calculator types: `interval.py`, `daily_stat.py`, `window_24h.py`, `volatility.py`, `trend.py`, `timing.py`, `future.py`, `metadata.py`
+    - `__init__.py` - Routing logic + unified builders (`build_sensor_attributes`, `build_extra_state_attributes`)
+  - **Core Entity** (`sensor/core.py`): 1,268 lines managing 80+ sensor types
+    - Instantiates all calculators in `__init__`
+    - Delegates value calculations to appropriate calculator
+    - Uses unified handler methods: `_get_interval_value()`, `_get_rolling_hour_value()`, `_get_daily_stat_value()`, `_get_24h_window_value()`
+    - Handler mapping dictionary routes entity keys to value getters
+  - **Architecture Benefits**: 42% line reduction in core.py (2,170 → 1,268 lines), clear separation of concerns, improved testability, reusable components
 - **See "Common Tasks" section** for detailed patterns and examples
 - **Quarter-hour precision**: Entities update on 00/15/30/45-minute boundaries via `schedule_quarter_hour_refresh()` in `coordinator/listeners.py`, not just on data fetch intervals. Uses `async_track_utc_time_change(minute=[0, 15, 30, 45], second=0)` for absolute-time scheduling. Smart boundary tolerance (±2 seconds) in `sensor/helpers.py` → `round_to_nearest_quarter_hour()` handles HA scheduling jitter: if HA triggers at 14:59:58 → rounds to 15:00:00 (next interval), if HA restarts at 14:59:30 → stays at 14:45:00 (current interval). This ensures current price sensors update without waiting for the next API poll, while preventing premature data display during normal operation.
 - **Currency handling**: Multi-currency support with major/minor units (e.g., EUR/ct, NOK/øre) via `get_currency_info()` and `format_price_unit_*()` in `const.py`.
@@ -458,10 +476,30 @@ custom_components/tibber_prices/
 ├── services.py                  # Custom services (get_price, ApexCharts, etc.)
 ├── sensor/                      # Sensor platform (package)
 │   ├── __init__.py              # Platform setup (async_setup_entry)
-│   ├── core.py                  # TibberPricesSensor class
+│   ├── core.py                  # TibberPricesSensor class (1,268 lines)
 │   ├── definitions.py           # ENTITY_DESCRIPTIONS
 │   ├── helpers.py               # Pure helper functions (incl. smart boundary tolerance)
-│   └── attributes.py            # Attribute builders
+│   ├── calculators/             # Value calculation package (1,838 lines)
+│   │   ├── __init__.py          # Package exports
+│   │   ├── base.py              # Abstract BaseCalculator (57 lines)
+│   │   ├── interval.py          # Single interval calculations (206 lines)
+│   │   ├── rolling_hour.py      # 5-interval rolling windows (123 lines)
+│   │   ├── daily_stat.py        # Daily min/max/avg (211 lines)
+│   │   ├── window_24h.py        # Trailing/leading 24h (53 lines)
+│   │   ├── volatility.py        # Price volatility (113 lines)
+│   │   ├── trend.py             # Trend analysis with caching (640 lines)
+│   │   ├── timing.py            # Best/peak price timing (246 lines)
+│   │   └── metadata.py          # Home/metering metadata (123 lines)
+│   └── attributes/              # Attribute builders package (1,209 lines)
+│       ├── __init__.py          # Routing + unified builders (267 lines)
+│       ├── interval.py          # Interval attributes (228 lines)
+│       ├── daily_stat.py        # Statistics attributes (124 lines)
+│       ├── window_24h.py        # 24h window attributes (106 lines)
+│       ├── timing.py            # Period timing attributes (64 lines)
+│       ├── volatility.py        # Volatility attributes (128 lines)
+│       ├── trend.py             # Trend attribute routing (34 lines)
+│       ├── future.py            # Forecast attributes (223 lines)
+│       └── metadata.py          # Current interval helper (35 lines)
 ├── binary_sensor/               # Binary sensor platform (package)
 │   ├── __init__.py              # Platform setup (async_setup_entry)
 │   ├── core.py                  # TibberPricesBinarySensor class
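The smart boundary tolerance that AGENTS.md describes for `round_to_nearest_quarter_hour()` can be sketched as below. The ±2-second tolerance and the two worked examples come from the documentation; the exact rounding implementation in `sensor/helpers.py` is assumed:

```python
from datetime import datetime, timedelta

BOUNDARY_TOLERANCE_S = 2      # documented HA scheduling jitter allowance
QUARTER_S = 15 * 60           # one quarter-hour interval in seconds


def round_to_nearest_quarter_hour(now: datetime) -> datetime:
    """Snap a timestamp to a 00/15/30/45 boundary with a small forward tolerance."""
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    elapsed = (now - midnight).total_seconds()
    within = elapsed % QUARTER_S
    if within >= QUARTER_S - BOUNDARY_TOLERANCE_S:
        # e.g. 14:59:58 -> treat as the 15:00:00 boundary (HA fired slightly early)
        elapsed += QUARTER_S - within
    else:
        # e.g. a restart at 14:59:30 -> stay in the current 14:45:00 interval
        elapsed -= within
    return midnight + timedelta(seconds=elapsed)
```

This is why current-price sensors flip to the next interval exactly at the boundary even when the scheduler fires a second or two early, without jumping ahead during an ordinary mid-interval restart.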
File diff suppressed because it is too large
custom_components/tibber_prices/sensor/attributes/__init__.py (new file, 267 lines)
@@ -0,0 +1,267 @@
"""
Attribute builders for Tibber Prices sensors.

This package contains attribute building functions organized by sensor calculation type.
The main entry point is build_sensor_attributes() which routes to the appropriate
specialized attribute builder.
"""

from __future__ import annotations

from typing import TYPE_CHECKING, Any

from custom_components.tibber_prices.entity_utils import (
    add_description_attributes,
    add_icon_color_attribute,
)
from custom_components.tibber_prices.utils.average import round_to_nearest_quarter_hour
from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.core import (
        TibberPricesDataUpdateCoordinator,
    )
    from custom_components.tibber_prices.data import TibberPricesConfigEntry
    from homeassistant.core import HomeAssistant

# Import from specialized modules
from .daily_stat import add_statistics_attributes
from .future import add_next_avg_attributes, add_price_forecast_attributes, get_future_prices
from .interval import add_current_interval_price_attributes
from .timing import _is_timing_or_volatility_sensor
from .trend import _add_cached_trend_attributes, _add_timing_or_volatility_attributes
from .volatility import add_volatility_type_attributes, get_prices_for_volatility
from .window_24h import add_average_price_attributes

__all__ = [
    "add_volatility_type_attributes",
    "build_extra_state_attributes",
    "build_sensor_attributes",
    "get_future_prices",
    "get_prices_for_volatility",
]


def build_sensor_attributes(
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
    cached_data: dict,
) -> dict | None:
    """
    Build attributes for a sensor based on its key.

    Routes to specialized attribute builders based on sensor type.

    Args:
        key: The sensor entity key
        coordinator: The data update coordinator
        native_value: The current native value of the sensor
        cached_data: Dictionary containing cached sensor data

    Returns:
        Dictionary of attributes or None if no attributes should be added

    """
    if not coordinator.data:
        return None

    try:
        attributes: dict[str, Any] = {}

        # For trend sensors, use cached attributes
        _add_cached_trend_attributes(attributes, key, cached_data)

        # Group sensors by type and delegate to specific handlers
        if key in [
            "current_interval_price",
            "current_interval_price_level",
            "next_interval_price",
            "previous_interval_price",
            "current_hour_average_price",
            "next_hour_average_price",
            "next_interval_price_level",
            "previous_interval_price_level",
            "current_hour_price_level",
            "next_hour_price_level",
            "next_interval_price_rating",
            "previous_interval_price_rating",
            "current_hour_price_rating",
            "next_hour_price_rating",
        ]:
            add_current_interval_price_attributes(
                attributes=attributes,
                key=key,
                coordinator=coordinator,
                native_value=native_value,
                cached_data=cached_data,
            )
        elif key in [
            "trailing_price_average",
            "leading_price_average",
            "trailing_price_min",
            "trailing_price_max",
            "leading_price_min",
            "leading_price_max",
        ]:
            add_average_price_attributes(attributes=attributes, key=key, coordinator=coordinator)
        elif key.startswith("next_avg_"):
            add_next_avg_attributes(attributes=attributes, key=key, coordinator=coordinator)
        elif any(
            pattern in key
            for pattern in [
                "_price_today",
                "_price_tomorrow",
                "_price_yesterday",
                "yesterday_price_level",
                "today_price_level",
                "tomorrow_price_level",
                "yesterday_price_rating",
                "today_price_rating",
                "tomorrow_price_rating",
                "rating",
                "data_timestamp",
            ]
        ):
            add_statistics_attributes(
                attributes=attributes,
                key=key,
                cached_data=cached_data,
            )
        elif key == "price_forecast":
            add_price_forecast_attributes(attributes=attributes, coordinator=coordinator)
        elif _is_timing_or_volatility_sensor(key):
            _add_timing_or_volatility_attributes(attributes, key, cached_data, native_value)

        # For current_interval_price_level, add the original level as attribute
        if key == "current_interval_price_level" and cached_data.get("last_price_level") is not None:
            attributes["level_id"] = cached_data["last_price_level"]

        # Add icon_color for daily level and rating sensors (uses native_value)
        if key in [
            "yesterday_price_level",
            "today_price_level",
            "tomorrow_price_level",
            "yesterday_price_rating",
            "today_price_rating",
            "tomorrow_price_rating",
        ]:
            add_icon_color_attribute(attributes, key=key, state_value=native_value)

    except (KeyError, ValueError, TypeError) as ex:
        coordinator.logger.exception(
            "Error getting sensor attributes",
            extra={
                "error": str(ex),
                "entity": key,
            },
        )
        return None
    else:
        return attributes if attributes else None


def build_extra_state_attributes(  # noqa: PLR0913
    entity_key: str,
    translation_key: str | None,
    hass: HomeAssistant,
    *,
    config_entry: TibberPricesConfigEntry,
    coordinator_data: dict,
    sensor_attrs: dict | None = None,
) -> dict[str, Any] | None:
    """
    Build extra state attributes for sensors.

    This function implements the unified attribute building pattern:
    1. Generate default timestamp (current time rounded to nearest quarter hour)
    2. Merge sensor-specific attributes (may override timestamp)
    3. Preserve timestamp ordering (always FIRST in dict)
    4. Add description attributes (always LAST)

    Args:
        entity_key: Entity key (e.g., "current_interval_price")
        translation_key: Translation key for entity
        hass: Home Assistant instance
        config_entry: Config entry with options (keyword-only)
        coordinator_data: Coordinator data dict (keyword-only)
        sensor_attrs: Sensor-specific attributes (keyword-only)

    Returns:
        Complete attributes dict or None if no data available

    """
    if not coordinator_data:
        return None

    # Calculate default timestamp: current time rounded to nearest quarter hour
    # This ensures all sensors have a consistent reference time for when calculations were made
    # Individual sensors can override this if they need a different timestamp
    now = dt_util.now()
    default_timestamp = round_to_nearest_quarter_hour(now)

    # Special handling for chart_data_export: metadata → descriptions → service data
    if entity_key == "chart_data_export":
        attributes: dict[str, Any] = {
            "timestamp": default_timestamp.isoformat(),
        }

        # Step 1: Add metadata (timestamp + error if present)
        if sensor_attrs:
            if "timestamp" in sensor_attrs and sensor_attrs["timestamp"] is not None:
                # Chart data has its own timestamp (when service was last called)
                attributes["timestamp"] = sensor_attrs["timestamp"]

            if "error" in sensor_attrs:
                attributes["error"] = sensor_attrs["error"]

        # Step 2: Add descriptions before service data (via central utility)
        add_description_attributes(
            attributes,
            "sensor",
            translation_key,
            hass,
            config_entry,
            position="before_service_data",
        )

        # Step 3: Add service data (everything except metadata)
        if sensor_attrs:
            attributes.update({k: v for k, v in sensor_attrs.items() if k not in ("timestamp", "error")})

        return attributes if attributes else None

    # For all other sensors: standard behavior
    # Start with default timestamp
    attributes = {
        "timestamp": default_timestamp.isoformat(),
    }

    # Add sensor-specific attributes (may override timestamp)
    if sensor_attrs:
        # Extract timestamp override if present
        timestamp_override = sensor_attrs.pop("timestamp", None)

        # Add all other sensor attributes
        attributes.update(sensor_attrs)

        # If sensor wants to override timestamp, rebuild dict with timestamp FIRST
        if timestamp_override is not None:
            temp_attrs = dict(attributes)
            attributes.clear()
            attributes["timestamp"] = timestamp_override
            for key, value in temp_attrs.items():
                if key != "timestamp":
                    attributes[key] = value

    # Add description attributes (always last, via central utility)
    add_description_attributes(
        attributes,
        "sensor",
        translation_key,
        hass,
        config_entry,
        position="end",
    )

    return attributes if attributes else None
custom_components/tibber_prices/sensor/attributes/daily_stat.py (new file, 124 lines)
@@ -0,0 +1,124 @@
"""Daily statistics attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from datetime import timedelta

from custom_components.tibber_prices.const import PRICE_RATING_MAPPING
from homeassistant.const import PERCENTAGE
from homeassistant.util import dt as dt_util


def _get_day_midnight_timestamp(key: str) -> str:
    """Get midnight timestamp for a given day sensor key."""
    now = dt_util.now()
    local_midnight = dt_util.start_of_local_day(now)

    if key.startswith("yesterday") or key == "average_price_yesterday":
        local_midnight = local_midnight - timedelta(days=1)
    elif key.startswith("tomorrow") or key == "average_price_tomorrow":
        local_midnight = local_midnight + timedelta(days=1)

    return local_midnight.isoformat()


def _get_day_key_from_sensor_key(key: str) -> str:
    """
    Extract day key (yesterday/today/tomorrow) from sensor key.

    Args:
        key: The sensor entity key

    Returns:
        Day key: "yesterday", "today", or "tomorrow"

    """
    if "yesterday" in key:
        return "yesterday"
    if "tomorrow" in key:
        return "tomorrow"
    return "today"


def _add_fallback_timestamp(attributes: dict, key: str, price_info: dict) -> None:
    """
    Add fallback timestamp to attributes based on the day in the sensor key.

    Args:
        attributes: Dictionary to add timestamp to
        key: The sensor entity key
        price_info: Price info dictionary from coordinator data

    """
    day_key = _get_day_key_from_sensor_key(key)
    day_data = price_info.get(day_key, [])
    if day_data:
        attributes["timestamp"] = day_data[0].get("startsAt")


def add_statistics_attributes(
    attributes: dict,
    key: str,
    cached_data: dict,
) -> None:
    """
    Add attributes for statistics and rating sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        cached_data: Dictionary containing cached sensor data

    """
    # Data timestamp sensor - shows API fetch time
    if key == "data_timestamp":
        latest_timestamp = cached_data.get("data_timestamp")
        if latest_timestamp:
            attributes["timestamp"] = latest_timestamp.isoformat()
        return

    # Current interval price rating - add rating attributes
    if key == "current_interval_price_rating":
        if cached_data.get("last_rating_difference") is not None:
            attributes["diff_" + PERCENTAGE] = cached_data["last_rating_difference"]
        if cached_data.get("last_rating_level") is not None:
            attributes["level_id"] = cached_data["last_rating_level"]
            attributes["level_value"] = PRICE_RATING_MAPPING.get(
                cached_data["last_rating_level"], cached_data["last_rating_level"]
            )
        return

    # Extreme value sensors - show when the extreme occurs
    extreme_sensors = {
        "lowest_price_today",
        "highest_price_today",
        "lowest_price_tomorrow",
        "highest_price_tomorrow",
    }
    if key in extreme_sensors:
        if cached_data.get("last_extreme_interval"):
            extreme_starts_at = cached_data["last_extreme_interval"].get("startsAt")
            if extreme_starts_at:
                attributes["timestamp"] = extreme_starts_at
        return

    # Daily average sensors - show midnight to indicate whole day
    daily_avg_sensors = {"average_price_today", "average_price_tomorrow"}
    if key in daily_avg_sensors:
        attributes["timestamp"] = _get_day_midnight_timestamp(key)
        return

    # Daily aggregated level/rating sensors - show midnight to indicate whole day
    daily_aggregated_sensors = {
        "yesterday_price_level",
        "today_price_level",
        "tomorrow_price_level",
        "yesterday_price_rating",
        "today_price_rating",
        "tomorrow_price_rating",
    }
    if key in daily_aggregated_sensors:
        attributes["timestamp"] = _get_day_midnight_timestamp(key)
        return

    # All other statistics sensors - keep default timestamp (when calculation was made)
custom_components/tibber_prices/sensor/attributes/future.py (new file, 223 lines)
@ -0,0 +1,223 @@
|
||||||
|
"""Future price/trend attribute builders for Tibber Prices sensors."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from typing import TYPE_CHECKING, Any
|
||||||
|
|
||||||
|
from custom_components.tibber_prices.const import MINUTES_PER_INTERVAL
|
||||||
|
from homeassistant.util import dt as dt_util
|
||||||
|
|
||||||
|
if TYPE_CHECKING:
|
||||||
|
from custom_components.tibber_prices.coordinator.core import (
|
||||||
|
TibberPricesDataUpdateCoordinator,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Constants
MAX_FORECAST_INTERVALS = 8  # Show up to 8 future intervals (2 hours with 15-min intervals)


def add_next_avg_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add attributes for next N hours average price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator

    """
    now = dt_util.now()

    # Extract hours from sensor key (e.g., "next_avg_3h" -> 3)
    try:
        hours = int(key.replace("next_avg_", "").replace("h", ""))
    except (ValueError, AttributeError):
        return

    # Get next interval start time (this is where the calculation begins)
    next_interval_start = now + timedelta(minutes=MINUTES_PER_INTERVAL)

    # Calculate the end of the time window
    window_end = next_interval_start + timedelta(hours=hours)

    # Get all price intervals
    price_info = coordinator.data.get("priceInfo", {})
    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = today_prices + tomorrow_prices

    if not all_prices:
        return

    # Find all intervals in the window
    intervals_in_window = []
    for price_data in all_prices:
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue
        starts_at = dt_util.as_local(starts_at)
        if next_interval_start <= starts_at < window_end:
            intervals_in_window.append(price_data)

    # Add timestamp attribute (start of next interval - where calculation begins)
    if intervals_in_window:
        attributes["timestamp"] = intervals_in_window[0].get("startsAt")
        attributes["interval_count"] = len(intervals_in_window)
        attributes["hours"] = hours

def add_price_forecast_attributes(
    attributes: dict,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add forecast attributes for the price forecast sensor.

    Args:
        attributes: Dictionary to add attributes to
        coordinator: The data update coordinator

    """
    future_prices = get_future_prices(coordinator, max_intervals=MAX_FORECAST_INTERVALS)
    if not future_prices:
        attributes["intervals"] = []
        attributes["intervals_by_hour"] = []
        attributes["data_available"] = False
        return

    # Add timestamp attribute (first future interval)
    attributes["timestamp"] = future_prices[0]["interval_start"]

    attributes["intervals"] = future_prices
    attributes["data_available"] = True

    # Group by hour for easier consumption in dashboards
    hours: dict[str, Any] = {}
    for interval in future_prices:
        starts_at = datetime.fromisoformat(interval["interval_start"])
        hour_key = starts_at.strftime("%Y-%m-%d %H")

        if hour_key not in hours:
            hours[hour_key] = {
                "hour": starts_at.hour,
                "day": interval["day"],
                "date": starts_at.date().isoformat(),
                "intervals": [],
                "min_price": None,
                "max_price": None,
                "avg_price": 0,
                "avg_rating": None,  # Initialize rating tracking
                "ratings_available": False,  # Track if any ratings are available
            }

        # Create interval data with both price and rating info
        interval_data = {
            "minute": starts_at.minute,
            "price": interval["price"],
            "price_minor": interval["price_minor"],
            "level": interval["level"],  # Price level from priceInfo
            "time": starts_at.strftime("%H:%M"),
        }

        # Add rating data if available
        if interval["rating"] is not None:
            interval_data["rating"] = interval["rating"]
            interval_data["rating_level"] = interval["rating_level"]
            hours[hour_key]["ratings_available"] = True

        hours[hour_key]["intervals"].append(interval_data)

        # Track min/max for the hour
        price = interval["price"]
        if hours[hour_key]["min_price"] is None or price < hours[hour_key]["min_price"]:
            hours[hour_key]["min_price"] = price
        if hours[hour_key]["max_price"] is None or price > hours[hour_key]["max_price"]:
            hours[hour_key]["max_price"] = price

    # Calculate averages
    for hour_data in hours.values():
        prices = [interval["price"] for interval in hour_data["intervals"]]
        if prices:
            hour_data["avg_price"] = sum(prices) / len(prices)

        # Calculate average rating if ratings are available
        if hour_data["ratings_available"]:
            ratings = [interval.get("rating") for interval in hour_data["intervals"] if "rating" in interval]
            if ratings:
                hour_data["avg_rating"] = sum(ratings) / len(ratings)

    # Convert to list sorted by hour
    attributes["intervals_by_hour"] = [hour_data for _, hour_data in sorted(hours.items())]

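The per-hour grouping performed above can be sketched independently of Home Assistant. The example below keeps only `interval_start` and `price` and the min/max/avg aggregation; field names are simplified from the full attribute set and this is not the integration's actual helper:

```python
from datetime import datetime


def group_by_hour(intervals: list[dict]) -> dict[str, dict]:
    """Group 15-min intervals into per-hour buckets with min/max/avg price (simplified sketch)."""
    hours: dict[str, dict] = {}
    for interval in intervals:
        starts_at = datetime.fromisoformat(interval["interval_start"])
        # Same bucket key format as the function above: "YYYY-MM-DD HH"
        hour_key = starts_at.strftime("%Y-%m-%d %H")
        bucket = hours.setdefault(hour_key, {"prices": []})
        bucket["prices"].append(interval["price"])
    for bucket in hours.values():
        prices = bucket["prices"]
        bucket["min_price"] = min(prices)
        bucket["max_price"] = max(prices)
        bucket["avg_price"] = sum(prices) / len(prices)
    return hours


sample = [
    {"interval_start": "2025-01-01T10:00:00", "price": 0.20},
    {"interval_start": "2025-01-01T10:15:00", "price": 0.30},
    {"interval_start": "2025-01-01T11:00:00", "price": 0.25},
]
grouped = group_by_hour(sample)
```

Two 10:00-hour intervals collapse into one bucket; the lone 11:00 interval gets its own.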
def get_future_prices(
    coordinator: TibberPricesDataUpdateCoordinator,
    max_intervals: int | None = None,
) -> list[dict] | None:
    """
    Get future price data for multiple upcoming intervals.

    Args:
        coordinator: The data update coordinator
        max_intervals: Maximum number of future intervals to return

    Returns:
        List of upcoming price intervals with timestamps and prices

    """
    if not coordinator.data:
        return None

    price_info = coordinator.data.get("priceInfo", {})

    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = today_prices + tomorrow_prices

    if not all_prices:
        return None

    now = dt_util.now()

    # Initialize the result list
    future_prices = []

    # Track the maximum intervals to return
    intervals_to_return = MAX_FORECAST_INTERVALS if max_intervals is None else max_intervals

    for day_key in ["today", "tomorrow"]:
        for price_data in price_info.get(day_key, []):
            starts_at = dt_util.parse_datetime(price_data["startsAt"])
            if starts_at is None:
                continue

            starts_at = dt_util.as_local(starts_at)
            interval_end = starts_at + timedelta(minutes=MINUTES_PER_INTERVAL)

            if starts_at > now:
                future_prices.append(
                    {
                        "interval_start": starts_at.isoformat(),
                        "interval_end": interval_end.isoformat(),
                        "price": float(price_data["total"]),
                        "price_minor": round(float(price_data["total"]) * 100, 2),
                        "level": price_data.get("level", "NORMAL"),
                        "rating": price_data.get("difference", None),
                        "rating_level": price_data.get("rating_level"),
                        "day": day_key,
                    }
                )

    # Sort by start time
    future_prices.sort(key=lambda x: x["interval_start"])

    # Limit to the requested number of intervals
    return future_prices[:intervals_to_return] if future_prices else None
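A stand-alone sketch of the filtering this function performs, with a fixed `now` and plain `datetime.fromisoformat` in place of the `dt_util` helpers (simplified field set; an illustrative helper, not the integration's code):

```python
from datetime import datetime


def future_intervals(price_info: dict, now: datetime, limit: int) -> list[dict]:
    """Collect intervals starting strictly after `now` from today's and tomorrow's lists."""
    result = []
    for day_key in ["today", "tomorrow"]:
        for price_data in price_info.get(day_key, []):
            starts_at = datetime.fromisoformat(price_data["startsAt"])
            if starts_at > now:
                result.append(
                    {
                        "interval_start": starts_at.isoformat(),
                        "price": float(price_data["total"]),
                        "day": day_key,
                    }
                )
    # ISO-8601 strings sort chronologically, matching the function above
    result.sort(key=lambda x: x["interval_start"])
    return result[:limit]


now = datetime(2025, 1, 1, 23, 50)
info = {
    "today": [{"startsAt": "2025-01-01T23:45:00", "total": 0.31}],
    "tomorrow": [
        {"startsAt": "2025-01-02T00:00:00", "total": 0.28},
        {"startsAt": "2025-01-02T00:15:00", "total": 0.26},
    ],
}
upcoming = future_intervals(info, now, limit=8)
```

The 23:45 interval has already started at 23:50, so only the two tomorrow intervals survive the filter.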
228 custom_components/tibber_prices/sensor/attributes/interval.py Normal file

@@ -0,0 +1,228 @@
"""Interval attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from datetime import timedelta
from typing import TYPE_CHECKING, Any

from custom_components.tibber_prices.const import (
    MINUTES_PER_INTERVAL,
    PRICE_LEVEL_MAPPING,
    PRICE_RATING_MAPPING,
)
from custom_components.tibber_prices.entity_utils import add_icon_color_attribute
from custom_components.tibber_prices.utils.price import find_price_data_for_interval
from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.core import (
        TibberPricesDataUpdateCoordinator,
    )

from .metadata import get_current_interval_data


def add_current_interval_price_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
    cached_data: dict,
) -> None:
    """
    Add attributes for current interval price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator
        native_value: The current native value of the sensor
        cached_data: Dictionary containing cached sensor data

    """
    price_info = coordinator.data.get("priceInfo", {}) if coordinator.data else {}
    now = dt_util.now()

    # Determine which interval to use based on sensor type
    next_interval_sensors = [
        "next_interval_price",
        "next_interval_price_level",
        "next_interval_price_rating",
    ]
    previous_interval_sensors = [
        "previous_interval_price",
        "previous_interval_price_level",
        "previous_interval_price_rating",
    ]
    next_hour_sensors = [
        "next_hour_average_price",
        "next_hour_price_level",
        "next_hour_price_rating",
    ]
    current_hour_sensors = [
        "current_hour_average_price",
        "current_hour_price_level",
        "current_hour_price_rating",
    ]

    # Set interval data based on sensor type
    # For sensors showing data from OTHER intervals (next/previous), override timestamp with that interval's startsAt
    # For current interval sensors, keep the default platform timestamp (calculation time)
    interval_data = None
    if key in next_interval_sensors:
        target_time = now + timedelta(minutes=MINUTES_PER_INTERVAL)
        interval_data = find_price_data_for_interval(price_info, target_time)
        # Override timestamp with the NEXT interval's startsAt (when that interval starts)
        if interval_data:
            attributes["timestamp"] = interval_data["startsAt"]
    elif key in previous_interval_sensors:
        target_time = now - timedelta(minutes=MINUTES_PER_INTERVAL)
        interval_data = find_price_data_for_interval(price_info, target_time)
        # Override timestamp with the PREVIOUS interval's startsAt
        if interval_data:
            attributes["timestamp"] = interval_data["startsAt"]
    elif key in next_hour_sensors:
        target_time = now + timedelta(hours=1)
        interval_data = find_price_data_for_interval(price_info, target_time)
        # Override timestamp with the center of the next rolling hour window
        if interval_data:
            attributes["timestamp"] = interval_data["startsAt"]
    elif key in current_hour_sensors:
        current_interval_data = get_current_interval_data(coordinator)
        # Keep default timestamp (when calculation was made) for current hour sensors
    else:
        current_interval_data = get_current_interval_data(coordinator)
        interval_data = current_interval_data  # Use current_interval_data as interval_data for current_interval_price
        # Keep default timestamp (current calculation time) for current interval sensors

    # Add icon_color for price sensors (based on their price level)
    if key in ["current_interval_price", "next_interval_price", "previous_interval_price"]:
        # For interval-based price sensors, get level from interval_data
        if interval_data and "level" in interval_data:
            level = interval_data["level"]
            add_icon_color_attribute(attributes, key="price_level", state_value=level)
    elif key in ["current_hour_average_price", "next_hour_average_price"]:
        # For hour-based price sensors, get level from cached_data
        level = cached_data.get("rolling_hour_level")
        if level:
            add_icon_color_attribute(attributes, key="price_level", state_value=level)

    # Add price level attributes for all level sensors
    add_level_attributes_for_sensor(
        attributes=attributes,
        key=key,
        interval_data=interval_data,
        coordinator=coordinator,
        native_value=native_value,
    )

    # Add price rating attributes for all rating sensors
    add_rating_attributes_for_sensor(
        attributes=attributes,
        key=key,
        interval_data=interval_data,
        coordinator=coordinator,
        native_value=native_value,
    )


def add_level_attributes_for_sensor(
    attributes: dict,
    key: str,
    interval_data: dict | None,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
) -> None:
    """
    Add price level attributes based on sensor type.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        interval_data: Interval data for next/previous sensors
        coordinator: The data update coordinator
        native_value: The current native value of the sensor

    """
    # For interval-based level sensors (next/previous), use interval data
    if key in ["next_interval_price_level", "previous_interval_price_level"]:
        if interval_data and "level" in interval_data:
            add_price_level_attributes(attributes, interval_data["level"])
    # For hour-aggregated level sensors, use native_value
    elif key in ["current_hour_price_level", "next_hour_price_level"]:
        level_value = native_value
        if level_value and isinstance(level_value, str):
            add_price_level_attributes(attributes, level_value.upper())
    # For current price level sensor
    elif key == "current_interval_price_level":
        current_interval_data = get_current_interval_data(coordinator)
        if current_interval_data and "level" in current_interval_data:
            add_price_level_attributes(attributes, current_interval_data["level"])


def add_price_level_attributes(attributes: dict, level: str) -> None:
    """
    Add price level specific attributes.

    Args:
        attributes: Dictionary to add attributes to
        level: The price level value (e.g., VERY_CHEAP, NORMAL, etc.)

    """
    if level in PRICE_LEVEL_MAPPING:
        attributes["level_value"] = PRICE_LEVEL_MAPPING[level]
        attributes["level_id"] = level

    # Add icon_color for dynamic styling
    add_icon_color_attribute(attributes, key="price_level", state_value=level)


def add_rating_attributes_for_sensor(
    attributes: dict,
    key: str,
    interval_data: dict | None,
    coordinator: TibberPricesDataUpdateCoordinator,
    native_value: Any,
) -> None:
    """
    Add price rating attributes based on sensor type.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        interval_data: Interval data for next/previous sensors
        coordinator: The data update coordinator
        native_value: The current native value of the sensor

    """
    # For interval-based rating sensors (next/previous), use interval data
    if key in ["next_interval_price_rating", "previous_interval_price_rating"]:
        if interval_data and "rating_level" in interval_data:
            add_price_rating_attributes(attributes, interval_data["rating_level"])
    # For hour-aggregated rating sensors, use native_value
    elif key in ["current_hour_price_rating", "next_hour_price_rating"]:
        rating_value = native_value
        if rating_value and isinstance(rating_value, str):
            add_price_rating_attributes(attributes, rating_value.upper())
    # For current price rating sensor
    elif key == "current_interval_price_rating":
        current_interval_data = get_current_interval_data(coordinator)
        if current_interval_data and "rating_level" in current_interval_data:
            add_price_rating_attributes(attributes, current_interval_data["rating_level"])


def add_price_rating_attributes(attributes: dict, rating: str) -> None:
    """
    Add price rating specific attributes.

    Args:
        attributes: Dictionary to add attributes to
        rating: The price rating value (e.g., LOW, NORMAL, HIGH)

    """
    if rating in PRICE_RATING_MAPPING:
        attributes["rating_value"] = PRICE_RATING_MAPPING[rating]
        attributes["rating_id"] = rating

    # Add icon_color for dynamic styling
    add_icon_color_attribute(attributes, key="price_rating", state_value=rating)
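The branch selection in `add_current_interval_price_attributes` effectively maps a sensor key to a time offset before the interval lookup. A minimal sketch of that routing, with an illustrative helper name and the 15-minute grid as an assumption:

```python
from datetime import datetime, timedelta

MINUTES_PER_INTERVAL = 15  # assumption: 15-minute price grid


def target_time_for_key(key: str, now: datetime) -> datetime:
    """Pick the lookup time for a sensor key, mirroring the branch logic above (sketch)."""
    if key.startswith("next_interval_"):
        return now + timedelta(minutes=MINUTES_PER_INTERVAL)
    if key.startswith("previous_interval_"):
        return now - timedelta(minutes=MINUTES_PER_INTERVAL)
    if key.startswith("next_hour_"):
        return now + timedelta(hours=1)
    # current interval / current hour sensors look up "now"
    return now


now = datetime(2025, 1, 1, 12, 0)
```

In the real module the resulting time is passed to `find_price_data_for_interval`; the sketch only covers the offset selection.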
35 custom_components/tibber_prices/sensor/attributes/metadata.py Normal file

@@ -0,0 +1,35 @@
"""Metadata attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from typing import TYPE_CHECKING

from custom_components.tibber_prices.utils.price import find_price_data_for_interval
from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.core import (
        TibberPricesDataUpdateCoordinator,
    )


def get_current_interval_data(
    coordinator: TibberPricesDataUpdateCoordinator,
) -> dict | None:
    """
    Get the current price interval data.

    Args:
        coordinator: The data update coordinator

    Returns:
        Current interval data dict, or None if unavailable

    """
    if not coordinator.data:
        return None

    price_info = coordinator.data.get("priceInfo", {})
    now = dt_util.now()

    return find_price_data_for_interval(price_info, now)
64 custom_components/tibber_prices/sensor/attributes/timing.py Normal file

@@ -0,0 +1,64 @@
"""Period timing attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from typing import Any

from custom_components.tibber_prices.entity_utils import add_icon_color_attribute
from homeassistant.util import dt as dt_util


def _is_timing_or_volatility_sensor(key: str) -> bool:
    """Check if sensor is a timing or volatility sensor."""
    return key.endswith("_volatility") or (
        key.startswith(("best_price_", "peak_price_"))
        and any(
            suffix in key
            for suffix in [
                "end_time",
                "remaining_minutes",
                "progress",
                "next_start_time",
                "next_in_minutes",
            ]
        )
    )


def add_period_timing_attributes(
    attributes: dict,
    key: str,
    state_value: Any = None,
) -> None:
    """
    Add timestamp and icon_color attributes for best_price/peak_price timing sensors.

    The timestamp indicates when the sensor value was calculated:
    - Quarter-hour sensors (end_time, next_start_time): Timestamp of current 15-min interval
    - Minute-update sensors (remaining_minutes, progress, next_in_minutes): Current minute with :00 seconds

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key (e.g., "best_price_end_time")
        state_value: Current sensor value for icon_color calculation

    """
    # Determine if this is a quarter-hour or minute-update sensor
    is_quarter_hour_sensor = key.endswith(("_end_time", "_next_start_time"))

    now = dt_util.now()

    if is_quarter_hour_sensor:
        # Quarter-hour sensors: Use timestamp of current 15-minute interval
        # Round down to the nearest quarter hour (:00, :15, :30, :45)
        minute = (now.minute // 15) * 15
        timestamp = now.replace(minute=minute, second=0, microsecond=0)
    else:
        # Minute-update sensors: Use current minute with :00 seconds
        # This ensures clean timestamps despite timer fluctuations
        timestamp = now.replace(second=0, microsecond=0)

    attributes["timestamp"] = timestamp.isoformat()

    # Add icon_color for dynamic styling
    add_icon_color_attribute(attributes, key=key, state_value=state_value)
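The quarter-hour rounding used above is easy to verify on its own; a small stand-alone sketch (the helper name is illustrative):

```python
from datetime import datetime


def quarter_hour_floor(now: datetime) -> datetime:
    """Round a timestamp down to the nearest quarter hour (:00, :15, :30, :45)."""
    # Integer division by 15 drops the remainder, flooring to the grid
    return now.replace(minute=(now.minute // 15) * 15, second=0, microsecond=0)


stamp = quarter_hour_floor(datetime(2025, 1, 1, 10, 37, 42, 123456))
```

Timestamps already on the grid are unchanged; everything else snaps backward, never forward.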
34 custom_components/tibber_prices/sensor/attributes/trend.py Normal file

@@ -0,0 +1,34 @@
"""Trend attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from typing import Any

from .timing import add_period_timing_attributes
from .volatility import add_volatility_attributes


def _add_timing_or_volatility_attributes(
    attributes: dict,
    key: str,
    cached_data: dict,
    native_value: Any = None,
) -> None:
    """Add attributes for timing or volatility sensors."""
    if key.endswith("_volatility"):
        add_volatility_attributes(attributes=attributes, cached_data=cached_data)
    else:
        add_period_timing_attributes(attributes=attributes, key=key, state_value=native_value)


def _add_cached_trend_attributes(attributes: dict, key: str, cached_data: dict) -> None:
    """Add cached trend attributes if available."""
    if key.startswith("price_trend_") and cached_data.get("trend_attributes"):
        attributes.update(cached_data["trend_attributes"])
    elif key == "current_price_trend" and cached_data.get("current_trend_attributes"):
        # Add cached attributes (timestamp already set by platform)
        attributes.update(cached_data["current_trend_attributes"])
    elif key == "next_price_trend_change" and cached_data.get("trend_change_attributes"):
        # Add cached attributes (timestamp already set by platform)
        # State contains the timestamp of the trend change itself
        attributes.update(cached_data["trend_change_attributes"])
128 custom_components/tibber_prices/sensor/attributes/volatility.py Normal file

@@ -0,0 +1,128 @@
"""Volatility attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from datetime import timedelta

from custom_components.tibber_prices.utils.price import calculate_volatility_level
from homeassistant.util import dt as dt_util


def add_volatility_attributes(
    attributes: dict,
    cached_data: dict,
) -> None:
    """
    Add attributes for volatility sensors.

    Args:
        attributes: Dictionary to add attributes to
        cached_data: Dictionary containing cached sensor data

    """
    if cached_data.get("volatility_attributes"):
        attributes.update(cached_data["volatility_attributes"])


def get_prices_for_volatility(
    volatility_type: str,
    price_info: dict,
) -> list[float]:
    """
    Get price list for volatility calculation based on type.

    Args:
        volatility_type: One of "today", "tomorrow", "next_24h", "today_tomorrow"
        price_info: Price information dictionary from coordinator data

    Returns:
        List of prices to analyze

    """
    if volatility_type == "today":
        return [float(p["total"]) for p in price_info.get("today", []) if "total" in p]

    if volatility_type == "tomorrow":
        return [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]

    if volatility_type == "next_24h":
        # Rolling 24h from now
        now = dt_util.now()
        end_time = now + timedelta(hours=24)
        prices = []

        for day_key in ["today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                starts_at = dt_util.parse_datetime(price_data.get("startsAt"))
                if starts_at is None:
                    continue
                starts_at = dt_util.as_local(starts_at)

                if now <= starts_at < end_time and "total" in price_data:
                    prices.append(float(price_data["total"]))
        return prices

    if volatility_type == "today_tomorrow":
        # Combined today + tomorrow
        prices = []
        for day_key in ["today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                if "total" in price_data:
                    prices.append(float(price_data["total"]))
        return prices

    return []


def add_volatility_type_attributes(
    volatility_attributes: dict,
    volatility_type: str,
    price_info: dict,
    thresholds: dict,
) -> None:
    """
    Add type-specific attributes for volatility sensors.

    Args:
        volatility_attributes: Dictionary to add type-specific attributes to
        volatility_type: Type of volatility calculation
        price_info: Price information dictionary from coordinator data
        thresholds: Volatility thresholds configuration

    """
    # Add timestamp for calendar day volatility sensors (midnight of the day)
    if volatility_type == "today":
        today_data = price_info.get("today", [])
        if today_data:
            volatility_attributes["timestamp"] = today_data[0].get("startsAt")
    elif volatility_type == "tomorrow":
        tomorrow_data = price_info.get("tomorrow", [])
        if tomorrow_data:
            volatility_attributes["timestamp"] = tomorrow_data[0].get("startsAt")
    elif volatility_type == "today_tomorrow":
        # For combined today+tomorrow, use today's midnight
        today_data = price_info.get("today", [])
        if today_data:
            volatility_attributes["timestamp"] = today_data[0].get("startsAt")

        # Add breakdown for today vs tomorrow
        today_prices = [float(p["total"]) for p in price_info.get("today", []) if "total" in p]
        tomorrow_prices = [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]

        if today_prices:
            today_vol = calculate_volatility_level(today_prices, **thresholds)
            today_spread = (max(today_prices) - min(today_prices)) * 100
            volatility_attributes["today_spread"] = round(today_spread, 2)
            volatility_attributes["today_volatility"] = today_vol
            volatility_attributes["interval_count_today"] = len(today_prices)

        if tomorrow_prices:
            tomorrow_vol = calculate_volatility_level(tomorrow_prices, **thresholds)
            tomorrow_spread = (max(tomorrow_prices) - min(tomorrow_prices)) * 100
            volatility_attributes["tomorrow_spread"] = round(tomorrow_spread, 2)
            volatility_attributes["tomorrow_volatility"] = tomorrow_vol
            volatility_attributes["interval_count_tomorrow"] = len(tomorrow_prices)
    elif volatility_type == "next_24h":
        # Add time window info
        now = dt_util.now()
        volatility_attributes["timestamp"] = now.isoformat()
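The per-day spread computed above is simply (max - min) scaled to minor currency units (cents/øre). A stand-alone sketch of just that step, with an illustrative helper name (`calculate_volatility_level` itself is out of scope here):

```python
def price_spread_minor(prices: list[float]) -> float:
    """Spread between the highest and lowest price, in minor units (sketch)."""
    # Multiply by 100 to convert major units (EUR/NOK) to minor units, as above
    return round((max(prices) - min(prices)) * 100, 2)


today = [0.18, 0.25, 0.42, 0.31]
```

With the sample prices the spread is 0.42 - 0.18 = 0.24 in major units, i.e. 24.0 minor units.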
106 custom_components/tibber_prices/sensor/attributes/window_24h.py Normal file

@@ -0,0 +1,106 @@
"""24-hour window attribute builders for Tibber Prices sensors."""

from __future__ import annotations

from datetime import timedelta
from typing import TYPE_CHECKING

from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.core import (
        TibberPricesDataUpdateCoordinator,
    )


def _update_extreme_interval(extreme_interval: dict | None, price_data: dict, key: str) -> dict:
    """
    Update extreme interval for min/max sensors.

    Args:
        extreme_interval: Current extreme interval or None
        price_data: New price data to compare
        key: Sensor key to determine if min or max

    Returns:
        Updated extreme interval

    """
    if extreme_interval is None:
        return price_data

    price = price_data.get("total")
    extreme_price = extreme_interval.get("total")

    if price is None or extreme_price is None:
        return extreme_interval

    is_new_extreme = ("min" in key and price < extreme_price) or ("max" in key and price > extreme_price)

    return price_data if is_new_extreme else extreme_interval


def add_average_price_attributes(
    attributes: dict,
    key: str,
    coordinator: TibberPricesDataUpdateCoordinator,
) -> None:
    """
    Add attributes for trailing and leading average/min/max price sensors.

    Args:
        attributes: Dictionary to add attributes to
        key: The sensor entity key
        coordinator: The data update coordinator

    """
    now = dt_util.now()

    # Determine if this is trailing or leading
    is_trailing = "trailing" in key

    # Get all price intervals
    price_info = coordinator.data.get("priceInfo", {})
    yesterday_prices = price_info.get("yesterday", [])
    today_prices = price_info.get("today", [])
    tomorrow_prices = price_info.get("tomorrow", [])
    all_prices = yesterday_prices + today_prices + tomorrow_prices

    if not all_prices:
        return

    # Calculate the time window
    if is_trailing:
        window_start = now - timedelta(hours=24)
        window_end = now
    else:
        window_start = now
        window_end = now + timedelta(hours=24)

    # Find all intervals in the window
    intervals_in_window = []
    extreme_interval = None  # Track interval with min/max for min/max sensors
    is_min_max_sensor = "min" in key or "max" in key

    for price_data in all_prices:
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue
        starts_at = dt_util.as_local(starts_at)
        if window_start <= starts_at < window_end:
            intervals_in_window.append(price_data)

            # Track extreme interval for min/max sensors
            if is_min_max_sensor:
                extreme_interval = _update_extreme_interval(extreme_interval, price_data, key)

    # Add timestamp attribute
    if intervals_in_window:
        # For min/max sensors: use the timestamp of the interval with extreme price
        # For average sensors: use first interval in the window
        if extreme_interval and is_min_max_sensor:
            attributes["timestamp"] = extreme_interval.get("startsAt")
        else:
            attributes["timestamp"] = intervals_in_window[0].get("startsAt")

        attributes["interval_count"] = len(intervals_in_window)
@@ -0,0 +1,33 @@
"""
Calculator classes for Tibber Prices sensor value calculations.

This package contains specialized calculator classes that handle different types
of sensor value calculations. Each calculator focuses on one calculation pattern
(interval-based, rolling hour, daily statistics, etc.).

All calculators inherit from BaseCalculator and have access to coordinator data.
"""

from __future__ import annotations

from .base import BaseCalculator
from .daily_stat import DailyStatCalculator
from .interval import IntervalCalculator
from .metadata import MetadataCalculator
from .rolling_hour import RollingHourCalculator
from .timing import TimingCalculator
from .trend import TrendCalculator
from .volatility import VolatilityCalculator
from .window_24h import Window24hCalculator

__all__ = [
    "BaseCalculator",
    "DailyStatCalculator",
    "IntervalCalculator",
    "MetadataCalculator",
    "RollingHourCalculator",
    "TimingCalculator",
    "TrendCalculator",
    "VolatilityCalculator",
    "Window24hCalculator",
]
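Per the commit message, `sensor/core.py` now instantiates one calculator of each type in `__init__` and delegates value calculations to them. A minimal, dependency-free sketch of that wiring — the `SensorCore` name and the stub coordinator are illustrative, not the actual classes:

```python
from types import SimpleNamespace


class BaseCalculator:
    """Simplified stand-in for calculators/base.py."""

    def __init__(self, coordinator):
        self._coordinator = coordinator


class IntervalCalculator(BaseCalculator): ...
class RollingHourCalculator(BaseCalculator): ...


class SensorCore:
    """Sketch: every calculator shares the same coordinator instance."""

    def __init__(self, coordinator):
        self.interval = IntervalCalculator(coordinator)
        self.rolling_hour = RollingHourCalculator(coordinator)


coordinator = SimpleNamespace(data={"priceInfo": {}})
core = SensorCore(coordinator)
print(type(core.interval).__name__)  # IntervalCalculator
```

The point of the pattern is that the entity class keeps no calculation state of its own; each specialized calculator owns one calculation concern and reads shared data through the coordinator.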
71 custom_components/tibber_prices/sensor/calculators/base.py Normal file
@@ -0,0 +1,71 @@
"""Base calculator class for all Tibber Prices sensor calculators."""

from __future__ import annotations

from typing import TYPE_CHECKING, Any

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator import (
        TibberPricesDataUpdateCoordinator,
    )
    from custom_components.tibber_prices.data import TibberPricesConfigEntry
    from homeassistant.core import HomeAssistant


class BaseCalculator:
    """
    Base class for all sensor value calculators.

    Provides common access patterns to coordinator data and configuration.
    All specialized calculators should inherit from this class.
    """

    def __init__(self, coordinator: TibberPricesDataUpdateCoordinator) -> None:
        """
        Initialize the calculator.

        Args:
            coordinator: The data update coordinator providing price and user data.

        """
        self._coordinator = coordinator

    @property
    def coordinator(self) -> TibberPricesDataUpdateCoordinator:
        """Get the coordinator instance."""
        return self._coordinator

    @property
    def hass(self) -> HomeAssistant:
        """Get Home Assistant instance."""
        return self._coordinator.hass

    @property
    def config_entry(self) -> TibberPricesConfigEntry:
        """Get config entry."""
        return self._coordinator.config_entry

    @property
    def config(self) -> Any:
        """Get configuration options."""
        return self.config_entry.options

    @property
    def coordinator_data(self) -> dict[str, Any]:
        """Get full coordinator data."""
        return self._coordinator.data

    @property
    def price_info(self) -> dict[str, Any]:
        """Get price information from coordinator data."""
        return self.coordinator_data.get("priceInfo", {})

    @property
    def user_data(self) -> dict[str, Any]:
        """Get user data from coordinator data."""
        return self.coordinator_data.get("user_data", {})

    @property
    def currency(self) -> str:
        """Get currency code from price info."""
        return self.price_info.get("currency", "EUR")
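The property chain above can be exercised without Home Assistant by feeding a stub coordinator whose `data` is a plain dict — a sketch for illustration (the `StubBaseCalculator` class is a simplified stand-in, not the real one):

```python
from types import SimpleNamespace


class StubBaseCalculator:
    """Simplified stand-in for BaseCalculator, without HA imports."""

    def __init__(self, coordinator):
        self._coordinator = coordinator

    @property
    def coordinator_data(self):
        return self._coordinator.data

    @property
    def price_info(self):
        return self.coordinator_data.get("priceInfo", {})

    @property
    def currency(self):
        # Falls back to "EUR" when the API response omits the currency field
        return self.price_info.get("currency", "EUR")


coordinator = SimpleNamespace(data={"priceInfo": {"currency": "NOK", "today": []}})
calc = StubBaseCalculator(coordinator)
print(calc.currency)  # NOK
print(StubBaseCalculator(SimpleNamespace(data={})).currency)  # EUR (fallback)
```

Every `.get(..., default)` step in the chain tolerates missing keys, so calculators never raise on partially loaded coordinator data.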
206 custom_components/tibber_prices/sensor/calculators/daily_stat.py Normal file
@@ -0,0 +1,206 @@
"""Calculator for daily statistics (min/max/avg within calendar day)."""

from __future__ import annotations

from datetime import timedelta
from typing import TYPE_CHECKING

from custom_components.tibber_prices.const import (
    CONF_PRICE_RATING_THRESHOLD_HIGH,
    CONF_PRICE_RATING_THRESHOLD_LOW,
    DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
    DEFAULT_PRICE_RATING_THRESHOLD_LOW,
)
from custom_components.tibber_prices.entity_utils import get_price_value
from custom_components.tibber_prices.sensor.helpers import (
    aggregate_level_data,
    aggregate_rating_data,
)
from homeassistant.util import dt as dt_util

from .base import BaseCalculator

if TYPE_CHECKING:
    from collections.abc import Callable

    from custom_components.tibber_prices.coordinator import (
        TibberPricesDataUpdateCoordinator,
    )


class DailyStatCalculator(BaseCalculator):
    """
    Calculator for daily statistics.

    Handles sensors that calculate min/max/avg prices or aggregate level/rating
    for entire calendar days (yesterday/today/tomorrow).
    """

    def __init__(self, coordinator: TibberPricesDataUpdateCoordinator) -> None:
        """
        Initialize calculator.

        Args:
            coordinator: The data update coordinator.

        """
        super().__init__(coordinator)
        self._last_extreme_interval: dict | None = None

    def get_daily_stat_value(
        self,
        *,
        day: str = "today",
        stat_func: Callable[[list[float]], float],
    ) -> float | None:
        """
        Unified method for daily statistics (min/max/avg within calendar day).

        Calculates statistics for a specific calendar day using local timezone
        boundaries. Stores the extreme interval for use in attributes.

        Args:
            day: "today" or "tomorrow" - which calendar day to calculate for.
            stat_func: Statistical function (min, max, or lambda for avg).

        Returns:
            Price value in minor currency units (cents/øre), or None if unavailable.

        """
        if not self.coordinator_data:
            return None

        price_info = self.price_info

        # Get local midnight boundaries based on the requested day
        local_midnight = dt_util.as_local(dt_util.start_of_local_day(dt_util.now()))
        if day == "tomorrow":
            local_midnight = local_midnight + timedelta(days=1)
        local_midnight_next_day = local_midnight + timedelta(days=1)

        # Collect all prices and their intervals from both today and tomorrow data
        # that fall within the target day's local date boundaries
        price_intervals = []
        for day_key in ["today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                starts_at_str = price_data.get("startsAt")
                if not starts_at_str:
                    continue

                starts_at = dt_util.parse_datetime(starts_at_str)
                if starts_at is None:
                    continue

                # Convert to local timezone for comparison
                starts_at = dt_util.as_local(starts_at)

                # Include price if it starts within the target day's local date boundaries
                if local_midnight <= starts_at < local_midnight_next_day:
                    total_price = price_data.get("total")
                    if total_price is not None:
                        price_intervals.append(
                            {
                                "price": float(total_price),
                                "interval": price_data,
                            }
                        )

        if not price_intervals:
            return None

        # Find the extreme value and store its interval for later use in attributes
        prices = [pi["price"] for pi in price_intervals]
        value = stat_func(prices)

        # Store the interval with the extreme price for use in attributes
        for pi in price_intervals:
            if pi["price"] == value:
                self._last_extreme_interval = pi["interval"]
                break

        # Always return in minor currency units (cents/øre) with 2 decimals
        result = get_price_value(value, in_euro=False)
        return round(result, 2)

    def get_daily_aggregated_value(
        self,
        *,
        day: str = "today",
        value_type: str = "level",
    ) -> str | None:
        """
        Get aggregated price level or rating for a specific calendar day.

        Aggregates all intervals within a calendar day using the same logic
        as rolling hour sensors, but for the entire day.

        Args:
            day: "yesterday", "today", or "tomorrow" - which calendar day to calculate for.
            value_type: "level" or "rating" - type of aggregation to perform.

        Returns:
            Aggregated level/rating value (lowercase), or None if unavailable.

        """
        if not self.coordinator_data:
            return None

        price_info = self.price_info

        # Get local midnight boundaries based on the requested day
        local_midnight = dt_util.as_local(dt_util.start_of_local_day(dt_util.now()))
        if day == "tomorrow":
            local_midnight = local_midnight + timedelta(days=1)
        elif day == "yesterday":
            local_midnight = local_midnight - timedelta(days=1)
        local_midnight_next_day = local_midnight + timedelta(days=1)

        # Collect all intervals from yesterday, today, and tomorrow data
        # that fall within the target day's local date boundaries
        day_intervals = []
        for day_key in ["yesterday", "today", "tomorrow"]:
            for price_data in price_info.get(day_key, []):
                starts_at_str = price_data.get("startsAt")
                if not starts_at_str:
                    continue

                starts_at = dt_util.parse_datetime(starts_at_str)
                if starts_at is None:
                    continue

                # Convert to local timezone for comparison
                starts_at = dt_util.as_local(starts_at)

                # Include interval if it starts within the target day's local date boundaries
                if local_midnight <= starts_at < local_midnight_next_day:
                    day_intervals.append(price_data)

        if not day_intervals:
            return None

        # Use the same aggregation logic as rolling hour sensors
        if value_type == "level":
            return aggregate_level_data(day_intervals)
        if value_type == "rating":
            # Get thresholds from config
            threshold_low = self.config.get(
                CONF_PRICE_RATING_THRESHOLD_LOW,
                DEFAULT_PRICE_RATING_THRESHOLD_LOW,
            )
            threshold_high = self.config.get(
                CONF_PRICE_RATING_THRESHOLD_HIGH,
                DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
            )
            return aggregate_rating_data(day_intervals, threshold_low, threshold_high)

        return None

    def get_last_extreme_interval(self) -> dict | None:
        """
        Get the last stored extreme interval (from min/max calculation).

        Returns:
            Dictionary with interval data, or None if no extreme interval stored.

        """
        return self._last_extreme_interval
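The calendar-day filtering above hinges on comparing interval start times against local midnight boundaries rather than UTC dates, which matters because Tibber timestamps carry offsets. A standalone sketch of the same boundary logic using only the stdlib — the timezone and sample data are assumptions for illustration:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TZ = ZoneInfo("Europe/Berlin")  # assumed local timezone for the example

intervals = [
    {"startsAt": "2024-06-01T22:00:00+00:00", "total": 0.30},  # 00:00 local June 2
    {"startsAt": "2024-06-02T10:00:00+00:00", "total": 0.18},  # 12:00 local June 2
    {"startsAt": "2024-06-02T22:00:00+00:00", "total": 0.25},  # 00:00 local June 3
]


def daily_min(intervals, day_start):
    """Min price among intervals starting within [day_start, day_start + 1 day)."""
    day_end = day_start + timedelta(days=1)
    prices = [
        iv["total"]
        for iv in intervals
        if day_start <= datetime.fromisoformat(iv["startsAt"]).astimezone(TZ) < day_end
    ]
    return min(prices) if prices else None


midnight = datetime(2024, 6, 2, tzinfo=TZ)
print(round(daily_min(intervals, midnight) * 100, 2))  # 18.0 (minor units)
```

Note how the first interval (22:00 UTC) lands on June 2 locally while the last one slips into June 3, exactly the half-open `[midnight, next_midnight)` behavior the calculator relies on.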
182 custom_components/tibber_prices/sensor/calculators/interval.py Normal file
@@ -0,0 +1,182 @@
"""Calculator for interval-based sensors (current/next/previous interval values)."""

from __future__ import annotations

from datetime import timedelta
from typing import TYPE_CHECKING

from custom_components.tibber_prices.const import MINUTES_PER_INTERVAL
from custom_components.tibber_prices.utils.price import find_price_data_for_interval
from homeassistant.util import dt as dt_util

from .base import BaseCalculator

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator import (
        TibberPricesDataUpdateCoordinator,
    )


class IntervalCalculator(BaseCalculator):
    """
    Calculator for interval-based sensors.

    Handles sensors that retrieve values (price/level/rating) for specific intervals
    relative to the current time (current, next, previous).
    """

    def __init__(self, coordinator: TibberPricesDataUpdateCoordinator) -> None:
        """
        Initialize calculator.

        Args:
            coordinator: The data update coordinator.

        """
        super().__init__(coordinator)
        # State attributes for specific sensors
        self._last_price_level: str | None = None
        self._last_rating_level: str | None = None
        self._last_rating_difference: float | None = None

    def get_interval_value(
        self,
        *,
        interval_offset: int,
        value_type: str,
        in_euro: bool = False,
    ) -> str | float | None:
        """
        Unified method to get values (price/level/rating) for intervals with offset.

        Args:
            interval_offset: Offset from current interval (0=current, 1=next, -1=previous).
            value_type: Type of value to retrieve ("price", "level", "rating").
            in_euro: For prices only - return in EUR if True, cents if False.

        Returns:
            For "price": float in EUR or cents.
            For "level" or "rating": lowercase enum string.
            None if data unavailable.

        """
        if not self.coordinator_data:
            return None

        price_info = self.price_info
        now = dt_util.now()
        target_time = now + timedelta(minutes=MINUTES_PER_INTERVAL * interval_offset)

        interval_data = find_price_data_for_interval(price_info, target_time)
        if not interval_data:
            return None

        # Extract value based on type
        if value_type == "price":
            price = interval_data.get("total")
            if price is None:
                return None
            price = float(price)
            return price if in_euro else round(price * 100, 2)

        if value_type == "level":
            level = interval_data.get("level")
            return level.lower() if level else None

        # For rating: extract rating_level
        rating = interval_data.get("rating_level")
        return rating.lower() if rating else None

    def get_price_level_value(self) -> str | None:
        """
        Get the current price level value as enum string for the state.

        Stores the level in internal state for attribute building.

        Returns:
            Price level (lowercase), or None if unavailable.

        """
        current_interval_data = self.get_current_interval_data()
        if not current_interval_data or "level" not in current_interval_data:
            return None
        level = current_interval_data["level"]
        self._last_price_level = level
        # Convert API level (e.g., "NORMAL") to lowercase enum value (e.g., "normal")
        return level.lower() if level else None

    def get_rating_value(self, *, rating_type: str) -> str | None:
        """
        Get the price rating level from the current price interval in priceInfo.

        Returns the rating level enum value, and stores the original
        level and percentage difference as attributes.

        Args:
            rating_type: Must be "current" (other values return None).

        Returns:
            Rating level (lowercase), or None if unavailable.

        """
        if not self.coordinator_data or rating_type != "current":
            self._last_rating_difference = None
            self._last_rating_level = None
            return None

        now = dt_util.now()
        price_info = self.price_info
        current_interval = find_price_data_for_interval(price_info, now)

        if current_interval:
            rating_level = current_interval.get("rating_level")
            difference = current_interval.get("difference")
            if rating_level is not None:
                self._last_rating_difference = float(difference) if difference is not None else None
                self._last_rating_level = rating_level
                # Convert API rating (e.g., "NORMAL") to lowercase enum value (e.g., "normal")
                return rating_level.lower() if rating_level else None

        self._last_rating_difference = None
        self._last_rating_level = None
        return None

    def get_current_interval_data(self) -> dict | None:
        """
        Get the price data for the current interval using coordinator utility.

        Returns:
            Dictionary with interval data, or None if unavailable.

        """
        return self.coordinator.get_current_interval()

    def get_last_price_level(self) -> str | None:
        """
        Get the last stored price level (from get_price_level_value call).

        Returns:
            Price level string, or None if no level stored.

        """
        return self._last_price_level

    def get_last_rating_level(self) -> str | None:
        """
        Get the last stored rating level (from get_rating_value call).

        Returns:
            Rating level string, or None if no level stored.

        """
        return self._last_rating_level

    def get_last_rating_difference(self) -> float | None:
        """
        Get the last stored rating difference (from get_rating_value call).

        Returns:
            Rating difference percentage, or None if no difference stored.

        """
        return self._last_rating_difference
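The offset lookup in `get_interval_value` reduces to simple interval arithmetic: the target time is `now + MINUTES_PER_INTERVAL * offset` minutes, and prices are optionally converted to minor units. A sketch assuming 15-minute intervals (the actual `MINUTES_PER_INTERVAL` constant lives in `const.py` and is not shown in this diff):

```python
from datetime import datetime, timedelta

MINUTES_PER_INTERVAL = 15  # assumption for this example


def target_interval_time(now, interval_offset):
    """Lookup time for an interval relative to now (0=current, 1=next, -1=previous)."""
    return now + timedelta(minutes=MINUTES_PER_INTERVAL * interval_offset)


def price_value(total, *, in_euro=False):
    """Mirror the EUR-vs-minor-units conversion used by get_interval_value."""
    price = float(total)
    return price if in_euro else round(price * 100, 2)


now = datetime(2024, 6, 2, 12, 7)
print(target_interval_time(now, 1))   # 2024-06-02 12:22:00 - inside the next interval
print(price_value(0.2345))            # 23.45 (cents)
print(price_value(0.2345, in_euro=True))  # 0.2345
```

Because the offset shifts the lookup *time* rather than indexing a list, the same code works across day boundaries as long as `find_price_data_for_interval` can see yesterday's and tomorrow's data.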
115 custom_components/tibber_prices/sensor/calculators/metadata.py Normal file
@@ -0,0 +1,115 @@
"""Calculator for home metadata, metering point, and subscription data."""

from __future__ import annotations

from .base import BaseCalculator


class MetadataCalculator(BaseCalculator):
    """
    Calculator for home metadata, metering point, and subscription data.

    Handles sensors that expose static or slowly-changing user data from the
    Tibber API, such as home characteristics, metering point information, and
    subscription details.
    """

    def get_home_metadata_value(self, field: str) -> str | int | None:
        """
        Get home metadata value from user data.

        String values are converted to lowercase for ENUM device_class compatibility.

        Args:
            field: The metadata field name (e.g., "type", "size", "mainFuseSize").

        Returns:
            The field value, or None if not available.

        """
        user_homes = self.coordinator.get_user_homes()
        if not user_homes:
            return None

        # Find the home matching this sensor's home_id
        home_id = self.config_entry.data.get("home_id")
        if not home_id:
            return None

        home_data = next((home for home in user_homes if home.get("id") == home_id), None)
        if not home_data:
            return None

        value = home_data.get(field)

        # Convert string to lowercase for ENUM device_class
        if isinstance(value, str):
            return value.lower()

        return value

    def get_metering_point_value(self, field: str) -> str | int | None:
        """
        Get metering point data value from user data.

        Args:
            field: The metering point field name (e.g., "gridCompany", "priceAreaCode").

        Returns:
            The field value, or None if not available.

        """
        user_homes = self.coordinator.get_user_homes()
        if not user_homes:
            return None

        home_id = self.config_entry.data.get("home_id")
        if not home_id:
            return None

        home_data = next((home for home in user_homes if home.get("id") == home_id), None)
        if not home_data:
            return None

        metering_point = home_data.get("meteringPointData")
        if not metering_point:
            return None

        return metering_point.get(field)

    def get_subscription_value(self, field: str) -> str | None:
        """
        Get subscription value from user data.

        String values are converted to lowercase for ENUM device_class compatibility.

        Args:
            field: The subscription field name (e.g., "status").

        Returns:
            The field value, or None if not available.

        """
        user_homes = self.coordinator.get_user_homes()
        if not user_homes:
            return None

        home_id = self.config_entry.data.get("home_id")
        if not home_id:
            return None

        home_data = next((home for home in user_homes if home.get("id") == home_id), None)
        if not home_data:
            return None

        subscription = home_data.get("currentSubscription")
        if not subscription:
            return None

        value = subscription.get(field)

        # Convert string to lowercase for ENUM device_class
        if isinstance(value, str):
            return value.lower()

        return value
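All three getters above repeat the same lookup sequence (`get_user_homes()` → match `home_id` → drill into a sub-dict). One way that shared step could be factored out — the `find_home` helper below is hypothetical, not part of the actual module:

```python
def find_home(user_homes, home_id):
    """Shared lookup: first home whose id matches, else None."""
    if not user_homes or not home_id:
        return None
    return next((home for home in user_homes if home.get("id") == home_id), None)


homes = [{"id": "a1", "type": "HOUSE"}, {"id": "b2", "type": "APARTMENT"}]

home = find_home(homes, "b2")
value = home.get("type") if home else None
# Same lowercase conversion the real getters apply for ENUM device_class
print(value.lower() if isinstance(value, str) else value)  # apartment
```

With such a helper, each getter would shrink to "find home, pick sub-dict, normalize value", at the cost of one extra indirection per call.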
@@ -0,0 +1,116 @@
"""Calculator for rolling hour average values (5-interval windows)."""

from __future__ import annotations

from custom_components.tibber_prices.const import (
    CONF_PRICE_RATING_THRESHOLD_HIGH,
    CONF_PRICE_RATING_THRESHOLD_LOW,
    DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
    DEFAULT_PRICE_RATING_THRESHOLD_LOW,
)
from custom_components.tibber_prices.entity_utils import find_rolling_hour_center_index
from custom_components.tibber_prices.sensor.helpers import (
    aggregate_level_data,
    aggregate_price_data,
    aggregate_rating_data,
)
from homeassistant.util import dt as dt_util

from .base import BaseCalculator


class RollingHourCalculator(BaseCalculator):
    """
    Calculator for rolling hour values (5-interval windows).

    Handles sensors that aggregate data from a 5-interval window (60 minutes):
    2 intervals before + center interval + 2 intervals after.
    """

    def get_rolling_hour_value(
        self,
        *,
        hour_offset: int = 0,
        value_type: str = "price",
    ) -> str | float | None:
        """
        Unified method to get aggregated values from 5-interval rolling window.

        Window: 2 before + center + 2 after = 5 intervals (60 minutes total).

        Args:
            hour_offset: 0 (current hour), 1 (next hour), etc.
            value_type: "price" | "level" | "rating".

        Returns:
            Aggregated value based on type:
            - "price": float (average price in minor currency units)
            - "level": str (aggregated level: "very_cheap", "cheap", etc.)
            - "rating": str (aggregated rating: "low", "normal", "high")

        """
        if not self.coordinator_data:
            return None

        # Get all available price data
        price_info = self.price_info
        all_prices = price_info.get("yesterday", []) + price_info.get("today", []) + price_info.get("tomorrow", [])

        if not all_prices:
            return None

        # Find center index for the rolling window
        now = dt_util.now()
        center_idx = find_rolling_hour_center_index(all_prices, now, hour_offset)
        if center_idx is None:
            return None

        # Collect data from 5-interval window (-2, -1, 0, +1, +2)
        window_data = []
        for offset in range(-2, 3):
            idx = center_idx + offset
            if 0 <= idx < len(all_prices):
                window_data.append(all_prices[idx])

        if not window_data:
            return None

        return self._aggregate_window_data(window_data, value_type)

    def _aggregate_window_data(
        self,
        window_data: list[dict],
        value_type: str,
    ) -> str | float | None:
        """
        Aggregate data from multiple intervals based on value type.

        Args:
            window_data: List of price interval dictionaries.
            value_type: "price" | "level" | "rating".

        Returns:
            Aggregated value based on type.

        """
        # Get thresholds from config for rating aggregation
        threshold_low = self.config.get(
            CONF_PRICE_RATING_THRESHOLD_LOW,
            DEFAULT_PRICE_RATING_THRESHOLD_LOW,
        )
        threshold_high = self.config.get(
            CONF_PRICE_RATING_THRESHOLD_HIGH,
            DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
        )

        # Map value types to aggregation functions
        aggregators = {
            "price": lambda data: aggregate_price_data(data),
            "level": lambda data: aggregate_level_data(data),
            "rating": lambda data: aggregate_rating_data(data, threshold_low, threshold_high),
        }

        aggregator = aggregators.get(value_type)
        if aggregator:
            return aggregator(window_data)
        return None
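The 5-interval window collection above clips at the edges of the available data rather than failing, so a window near the start or end of `all_prices` may contain fewer than 5 intervals. A minimal sketch of that clipping behavior with plain numbers standing in for interval dicts:

```python
def rolling_window(all_prices, center_idx, radius=2):
    """Collect up to 2*radius+1 items around center_idx, clipped at the edges."""
    return [
        all_prices[i]
        for i in range(center_idx - radius, center_idx + radius + 1)
        if 0 <= i < len(all_prices)
    ]


prices = [10, 12, 14, 16, 18, 20]
print(rolling_window(prices, 2))  # [10, 12, 14, 16, 18] - full 5-item window
print(rolling_window(prices, 0))  # [10, 12, 14] - clipped at the start
window = rolling_window(prices, 2)
print(sum(window) / len(window))  # 14.0 - the "price" aggregation is an average
```

Clipping keeps the sensor available at the data boundaries (e.g. just after midnight before yesterday's data is loaded), at the cost of averaging over a shorter effective window there.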
246 custom_components/tibber_prices/sensor/calculators/timing.py Normal file
@@ -0,0 +1,246 @@
"""
|
||||||
|
Timing calculator for best/peak price period timing sensors.
|
||||||
|
|
||||||
|
This module handles all timing-related calculations for period-based sensors:
|
||||||
|
- Period end times (when does current/next period end?)
|
||||||
|
- Period start times (when does next period start?)
|
||||||
|
- Remaining minutes (how long until period ends?)
|
||||||
|
- Progress (how far through the period are we?)
|
||||||
|
- Next period timing (when does the next period start?)
|
||||||
|
|
||||||
|
The calculator provides smart defaults:
|
||||||
|
- Active period → show current period timing
|
||||||
|
- No active → show next period timing
|
||||||
|
- No more periods → 0 for numeric values, None for timestamps
|
||||||
|
"""
|
||||||
|
|
||||||
|
from datetime import datetime
|
||||||
|
|
||||||
|
from homeassistant.util import dt as dt_util
|
||||||
|
|
||||||
|
from .base import BaseCalculator
|
||||||
|
|
||||||
|
# Constants
|
||||||
|
PROGRESS_GRACE_PERIOD_SECONDS = 60 # Show 100% for 1 minute after period ends
|
||||||
|
|
||||||
|
|
||||||
|
class TimingCalculator(BaseCalculator):
|
||||||
|
"""
|
||||||
|
Calculator for period timing sensors.
|
||||||
|
|
||||||
|
Handles timing information for best_price and peak_price periods:
|
||||||
|
- Active period timing (end time, remaining minutes, progress)
|
||||||
|
- Next period timing (start time, minutes until start)
|
||||||
|
- Period duration (total length in minutes)
|
||||||
|
|
||||||
|
Period states:
|
||||||
|
- ACTIVE: A period is currently running
|
||||||
|
    - GRACE: Period just ended (within 60s), still showing 100% progress
    - IDLE: No active period, waiting for next one
    """

    def get_period_timing_value(
        self,
        *,
        period_type: str,
        value_type: str,
    ) -> datetime | float | None:
        """
        Get timing-related values for best_price/peak_price periods.

        This method provides timing information based on whether a period is currently
        active or not, ensuring sensors always provide useful information.

        Value types behavior:
        - end_time: Active period → current end | No active → next period end | None if no periods
        - next_start_time: Active period → next-next start | No active → next start | None if no more
        - remaining_minutes: Active period → minutes to end | No active → 0
        - progress: Active period → 0-100% | No active → 0
        - next_in_minutes: Active period → minutes to next-next | No active → minutes to next | None if no more
        - period_duration: Active period → its total length | No active → next period length | None if no periods

        Args:
            period_type: "best_price" or "peak_price"
            value_type: "end_time", "remaining_minutes", "progress", "next_start_time",
                "next_in_minutes", "period_duration"

        Returns:
            - datetime for end_time/next_start_time
            - float for remaining_minutes/next_in_minutes/progress (or 0 when not active)
            - float for period_duration (or None if no periods)
            - None if no relevant period data available

        """
        if not self.coordinator.data:
            return None

        # Get period data from coordinator
        periods_data = self.coordinator.data.get("periods", {})
        period_data = periods_data.get(period_type)

        if not period_data or not period_data.get("periods"):
            # No periods available - return 0 for numeric sensors, None for timestamps
            return 0 if value_type in ("remaining_minutes", "progress", "next_in_minutes") else None

        period_summaries = period_data["periods"]
        now = dt_util.now()

        # Find current, previous and next periods
        current_period = self._find_active_period(period_summaries, now)
        previous_period = self._find_previous_period(period_summaries, now)
        next_period = self._find_next_period(period_summaries, now, skip_current=bool(current_period))

        # Delegate to specific calculators
        return self._calculate_timing_value(value_type, current_period, previous_period, next_period, now)

    def _calculate_timing_value(
        self,
        value_type: str,
        current_period: dict | None,
        previous_period: dict | None,
        next_period: dict | None,
        now: datetime,
    ) -> datetime | float | None:
        """Calculate specific timing value based on type and available periods."""
        # Define calculation strategies for each value type
        calculators = {
            "end_time": lambda: (
                current_period.get("end") if current_period else (next_period.get("end") if next_period else None)
            ),
            "period_duration": lambda: self._calc_period_duration(current_period, next_period),
            "next_start_time": lambda: next_period.get("start") if next_period else None,
            "remaining_minutes": lambda: (self._calc_remaining_minutes(current_period, now) if current_period else 0),
            "progress": lambda: self._calc_progress_with_grace_period(current_period, previous_period, now),
            "next_in_minutes": lambda: (self._calc_next_in_minutes(next_period, now) if next_period else None),
        }

        calculator = calculators.get(value_type)
        return calculator() if calculator else None
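The dispatch table above maps each `value_type` to a zero-argument lambda, so only the requested branch is ever evaluated. A minimal standalone sketch of the same pattern (the period dict and `now` here are invented for illustration):

```python
from datetime import datetime, timedelta

# Hypothetical stand-in for a coordinator period summary.
now = datetime(2025, 1, 1, 12, 0)
current = {"start": now - timedelta(minutes=30), "end": now + timedelta(minutes=30)}

# Same dispatch-table idea as _calculate_timing_value: each entry is a
# lazy callable, looked up by value_type and only then invoked.
calculators = {
    "end_time": lambda: current["end"],
    "remaining_minutes": lambda: (current["end"] - now).total_seconds() / 60,
}

calculator = calculators.get("remaining_minutes")
print(calculator() if calculator else None)  # 30.0
```

An unknown `value_type` simply misses the dict and falls through to `None`, which keeps the sensor unavailable instead of raising.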

    def _find_active_period(self, periods: list, now: datetime) -> dict | None:
        """Find currently active period."""
        for period in periods:
            start = period.get("start")
            end = period.get("end")
            if start and end and start <= now < end:
                return period
        return None

    def _find_previous_period(self, periods: list, now: datetime) -> dict | None:
        """Find the most recent period that has already ended."""
        past_periods = [p for p in periods if p.get("end") and p.get("end") <= now]

        if not past_periods:
            return None

        # Sort by end time descending to get the most recent one
        past_periods.sort(key=lambda p: p["end"], reverse=True)
        return past_periods[0]

    def _find_next_period(self, periods: list, now: datetime, *, skip_current: bool = False) -> dict | None:
        """
        Find next future period.

        Args:
            periods: List of period dictionaries
            now: Current time
            skip_current: If True, skip the first future period (to get next-next)

        Returns:
            Next period dict or None if no future periods

        """
        future_periods = [p for p in periods if p.get("start") and p.get("start") > now]

        if not future_periods:
            return None

        # Sort by start time to ensure correct order
        future_periods.sort(key=lambda p: p["start"])

        # Return second period if skip_current=True (next-next), otherwise first (next)
        if skip_current and len(future_periods) > 1:
            return future_periods[1]
        if not skip_current and future_periods:
            return future_periods[0]

        return None
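The `skip_current` flag is what lets an active period report the start of the period *after* the current one. A self-contained sketch of the same lookup, with made-up periods that are deliberately out of order to show why the sort matters:

```python
from datetime import datetime, timedelta

now = datetime(2025, 1, 1, 12, 0)
# Two hypothetical future periods, deliberately unsorted.
periods = [
    {"start": now + timedelta(hours=3)},
    {"start": now + timedelta(hours=1)},
]

def find_next(periods, now, *, skip_current=False):
    # Filter to future periods, then sort by start time.
    future = sorted(
        (p for p in periods if p.get("start") and p["start"] > now),
        key=lambda p: p["start"],
    )
    if skip_current:
        return future[1] if len(future) > 1 else None
    return future[0] if future else None

print(find_next(periods, now)["start"].hour)                     # 13
print(find_next(periods, now, skip_current=True)["start"].hour)  # 15
```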

    def _calc_remaining_minutes(self, period: dict, now: datetime) -> float:
        """Calculate minutes until period ends."""
        end = period.get("end")
        if not end:
            return 0
        delta = end - now
        return max(0, delta.total_seconds() / 60)

    def _calc_next_in_minutes(self, period: dict, now: datetime) -> float:
        """Calculate minutes until period starts."""
        start = period.get("start")
        if not start:
            return 0
        delta = start - now
        return max(0, delta.total_seconds() / 60)

    def _calc_period_duration(self, current_period: dict | None, next_period: dict | None) -> float | None:
        """
        Calculate total duration of active or next period in minutes.

        Returns duration of current period if active, otherwise duration of next period.
        This gives users a consistent view of period length regardless of timing.

        Args:
            current_period: Currently active period (if any)
            next_period: Next upcoming period (if any)

        Returns:
            Duration in minutes, or None if no periods available

        """
        period = current_period or next_period
        if not period:
            return None

        start = period.get("start")
        end = period.get("end")
        if not start or not end:
            return None

        duration = (end - start).total_seconds() / 60
        return max(0, duration)

    def _calc_progress(self, period: dict, now: datetime) -> float:
        """Calculate progress percentage (0-100) of current period."""
        start = period.get("start")
        end = period.get("end")
        if not start or not end:
            return 0
        total_duration = (end - start).total_seconds()
        if total_duration <= 0:
            return 0
        elapsed = (now - start).total_seconds()
        progress = (elapsed / total_duration) * 100
        return min(100, max(0, progress))

    def _calc_progress_with_grace_period(
        self, current_period: dict | None, previous_period: dict | None, now: datetime
    ) -> float:
        """
        Calculate progress with grace period after period end.

        Shows 100% for 1 minute after period ends to allow triggers on 100% completion.
        This prevents the progress from jumping directly from ~99% to 0% without ever
        reaching 100%, which would make automations like "when progress = 100%" impossible.
        """
        # If we have an active period, calculate normal progress
        if current_period:
            return self._calc_progress(current_period, now)

        # No active period - check if we just finished one (within grace period)
        if previous_period:
            previous_end = previous_period.get("end")
            if previous_end:
                seconds_since_end = (now - previous_end).total_seconds()
                # Grace period: Show 100% for defined time after period ended
                if 0 <= seconds_since_end <= PROGRESS_GRACE_PERIOD_SECONDS:
                    return 100

        # No active period and either no previous period or grace period expired
        return 0
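The grace-period behaviour can be verified with plain datetimes. A minimal sketch of the same logic, assuming `PROGRESS_GRACE_PERIOD_SECONDS` is 60 (the docstring above says 1 minute) and using invented period dicts:

```python
from datetime import datetime, timedelta

GRACE_SECONDS = 60  # assumed value of PROGRESS_GRACE_PERIOD_SECONDS

def progress_with_grace(current, previous, now):
    # Active period: normal 0-100% progress, clamped.
    if current:
        total = (current["end"] - current["start"]).total_seconds()
        elapsed = (now - current["start"]).total_seconds()
        return min(100, max(0, elapsed / total * 100))
    # Just ended: hold 100% during the grace window so automations
    # triggering on "progress == 100" can fire.
    if previous and 0 <= (now - previous["end"]).total_seconds() <= GRACE_SECONDS:
        return 100
    return 0

end = datetime(2025, 1, 1, 12, 0)
prev = {"start": end - timedelta(hours=1), "end": end}
print(progress_with_grace(None, prev, end + timedelta(seconds=30)))  # 100
print(progress_with_grace(None, prev, end + timedelta(seconds=90)))  # 0
```

Without the grace window, a sensor polled once a minute could easily skip from ~98% straight to 0%.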

706 custom_components/tibber_prices/sensor/calculators/trend.py (Normal file)

@@ -0,0 +1,706 @@

"""
Trend calculator for price trend analysis sensors.

This module handles all trend-related calculations:
- Simple price trends (1h-12h future comparison)
- Current trend with momentum analysis
- Next trend change prediction
- Trend duration tracking

Caching strategy:
- Simple trends: Cached per sensor update to ensure consistency between state and attributes
- Current trend + next change: Cached centrally for 60s to avoid duplicate calculations
"""

from datetime import datetime, timedelta
from typing import TYPE_CHECKING, Any

from custom_components.tibber_prices.const import MINUTES_PER_INTERVAL
from custom_components.tibber_prices.utils.average import calculate_next_n_hours_avg
from custom_components.tibber_prices.utils.price import (
    calculate_price_trend,
    find_price_data_for_interval,
)
from homeassistant.util import dt as dt_util

from .base import BaseCalculator

if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator import (
        TibberPricesDataUpdateCoordinator,
    )

# Constants
MIN_HOURS_FOR_LATER_HALF = 3  # Minimum hours needed to calculate later half average


class TrendCalculator(BaseCalculator):
    """
    Calculator for price trend sensors.

    Handles three types of trend analysis:
    1. Simple trends (price_trend_1h-12h): Current vs next N hours average
    2. Current trend (current_price_trend): Momentum + 3h outlook with volatility adjustment
    3. Next change (next_price_trend_change): Scan forward for trend reversal

    Caching:
    - Simple trends: Per-sensor cache (_cached_trend_value, _trend_attributes)
    - Current/Next: Centralized cache (_trend_calculation_cache) with 60s TTL
    """

    def __init__(self, coordinator: "TibberPricesDataUpdateCoordinator") -> None:
        """Initialize trend calculator with caching state."""
        super().__init__(coordinator)
        # Per-sensor trend caches (for price_trend_Nh sensors)
        self._cached_trend_value: str | None = None
        self._trend_attributes: dict[str, Any] = {}
        # Centralized trend calculation cache (for current_price_trend + next_price_trend_change)
        self._trend_calculation_cache: dict[str, Any] | None = None
        self._trend_calculation_timestamp: datetime | None = None
        # Separate attribute storage for current_price_trend and next_price_trend_change
        self._current_trend_attributes: dict[str, Any] | None = None
        self._trend_change_attributes: dict[str, Any] | None = None

    def get_price_trend_value(self, *, hours: int) -> str | None:
        """
        Calculate price trend comparing current interval vs next N hours average.

        This is for simple trend sensors (price_trend_1h through price_trend_12h).
        Results are cached per sensor to ensure consistency between state and attributes.

        Args:
            hours: Number of hours to look ahead for trend calculation

        Returns:
            Trend state: "rising" | "falling" | "stable", or None if unavailable

        """
        # Return cached value if available to ensure consistency between
        # native_value and extra_state_attributes
        if self._cached_trend_value is not None and self._trend_attributes:
            return self._cached_trend_value

        if not self.coordinator.data:
            return None

        # Get current interval price and timestamp
        current_interval = self.coordinator.get_current_interval()
        if not current_interval or "total" not in current_interval:
            return None

        current_interval_price = float(current_interval["total"])
        current_starts_at = dt_util.parse_datetime(current_interval["startsAt"])
        if current_starts_at is None:
            return None
        current_starts_at = dt_util.as_local(current_starts_at)

        # Get next interval timestamp (basis for calculation)
        next_interval_start = current_starts_at + timedelta(minutes=MINUTES_PER_INTERVAL)

        # Get future average price
        future_avg = calculate_next_n_hours_avg(self.coordinator.data, hours)
        if future_avg is None:
            return None

        # Get configured thresholds from options
        threshold_rising = self.config.get("price_trend_threshold_rising", 5.0)
        threshold_falling = self.config.get("price_trend_threshold_falling", -5.0)
        volatility_threshold_moderate = self.config.get("volatility_threshold_moderate", 15.0)
        volatility_threshold_high = self.config.get("volatility_threshold_high", 30.0)

        # Prepare data for volatility-adaptive thresholds
        price_info = self.coordinator.data.get("priceInfo", {})
        today_prices = price_info.get("today", [])
        tomorrow_prices = price_info.get("tomorrow", [])
        all_intervals = today_prices + tomorrow_prices
        lookahead_intervals = hours * 4  # Convert hours to 15-minute intervals

        # Calculate trend with volatility-adaptive thresholds
        trend_state, diff_pct = calculate_price_trend(
            current_interval_price,
            future_avg,
            threshold_rising=threshold_rising,
            threshold_falling=threshold_falling,
            volatility_adjustment=True,  # Always enabled
            lookahead_intervals=lookahead_intervals,
            all_intervals=all_intervals,
            volatility_threshold_moderate=volatility_threshold_moderate,
            volatility_threshold_high=volatility_threshold_high,
        )

        # Determine icon color based on trend state
        icon_color = {
            "rising": "var(--error-color)",  # Red/Orange for rising prices (expensive)
            "falling": "var(--success-color)",  # Green for falling prices (cheaper)
            "stable": "var(--state-icon-color)",  # Default gray for stable prices
        }.get(trend_state, "var(--state-icon-color)")

        # Store attributes in sensor-specific dictionary AND cache the trend value
        self._trend_attributes = {
            "timestamp": next_interval_start.isoformat(),
            f"trend_{hours}h_%": round(diff_pct, 1),
            f"next_{hours}h_avg": round(future_avg * 100, 2),
            "interval_count": hours * 4,
            "threshold_rising": threshold_rising,
            "threshold_falling": threshold_falling,
            "icon_color": icon_color,
        }

        # Calculate additional attributes for better granularity
        if hours > MIN_HOURS_FOR_LATER_HALF:
            # Get second half average for longer periods
            later_half_avg = self._calculate_later_half_average(hours, next_interval_start)
            if later_half_avg is not None:
                self._trend_attributes[f"second_half_{hours}h_avg"] = round(later_half_avg * 100, 2)

                # Calculate incremental change: how much does the later half differ from current?
                if current_interval_price > 0:
                    later_half_diff = ((later_half_avg - current_interval_price) / current_interval_price) * 100
                    self._trend_attributes[f"second_half_{hours}h_diff_from_current_%"] = round(later_half_diff, 1)

        # Cache the trend value for consistency
        self._cached_trend_value = trend_state

        return trend_state

    def get_current_trend_value(self) -> str | None:
        """
        Get the current price trend that is valid until the next change.

        Uses centralized _calculate_trend_info() for consistency with next_price_trend_change sensor.

        Returns:
            Current trend state: "rising", "falling", or "stable"

        """
        trend_info = self._calculate_trend_info()

        if not trend_info:
            return None

        # Set attributes for this sensor
        self._current_trend_attributes = {
            "from_direction": trend_info["from_direction"],
            "trend_duration_minutes": trend_info["trend_duration_minutes"],
        }

        return trend_info["current_trend_state"]

    def get_next_trend_change_value(self) -> datetime | None:
        """
        Calculate when the next price trend change will occur.

        Uses centralized _calculate_trend_info() for consistency with current_price_trend sensor.

        Returns:
            Timestamp of next trend change, or None if no change expected in next 24h

        """
        trend_info = self._calculate_trend_info()

        if not trend_info:
            return None

        # Set attributes for this sensor
        self._trend_change_attributes = trend_info["trend_change_attributes"]

        return trend_info["next_change_time"]

    def get_trend_attributes(self) -> dict[str, Any]:
        """Get cached trend attributes for simple trend sensors (price_trend_Nh)."""
        return self._trend_attributes

    def get_current_trend_attributes(self) -> dict[str, Any] | None:
        """Get cached attributes for current_price_trend sensor."""
        return self._current_trend_attributes

    def get_trend_change_attributes(self) -> dict[str, Any] | None:
        """Get cached attributes for next_price_trend_change sensor."""
        return self._trend_change_attributes

    def clear_trend_cache(self) -> None:
        """Clear simple trend cache (called on coordinator update)."""
        self._cached_trend_value = None
        self._trend_attributes = {}

    def clear_calculation_cache(self) -> None:
        """Clear centralized trend calculation cache (called on coordinator update)."""
        self._trend_calculation_cache = None
        self._trend_calculation_timestamp = None

    # ========================================================================
    # PRIVATE HELPER METHODS
    # ========================================================================

    def _calculate_later_half_average(self, hours: int, next_interval_start: datetime) -> float | None:
        """
        Calculate average price for the later half of the future time window.

        This provides additional granularity by showing what happens in the second half
        of the prediction window, helping distinguish between near-term and far-term trends.

        Args:
            hours: Total hours in the prediction window
            next_interval_start: Start timestamp of the next interval

        Returns:
            Average price for the later half intervals, or None if insufficient data

        """
        if not self.coordinator.data:
            return None

        price_info = self.coordinator.data.get("priceInfo", {})
        today_prices = price_info.get("today", [])
        tomorrow_prices = price_info.get("tomorrow", [])
        all_prices = today_prices + tomorrow_prices

        if not all_prices:
            return None

        # Calculate which intervals belong to the later half
        total_intervals = hours * 4
        first_half_intervals = total_intervals // 2
        later_half_start = next_interval_start + timedelta(minutes=MINUTES_PER_INTERVAL * first_half_intervals)
        later_half_end = next_interval_start + timedelta(minutes=MINUTES_PER_INTERVAL * total_intervals)

        # Collect prices in the later half
        later_prices = []
        for price_data in all_prices:
            starts_at = dt_util.parse_datetime(price_data["startsAt"])
            if starts_at is None:
                continue
            starts_at = dt_util.as_local(starts_at)

            if later_half_start <= starts_at < later_half_end:
                price = price_data.get("total")
                if price is not None:
                    later_prices.append(float(price))

        if later_prices:
            return sum(later_prices) / len(later_prices)

        return None
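The later-half window arithmetic can be checked with synthetic data. This sketch assumes 15-minute intervals (the integration's `MINUTES_PER_INTERVAL`) and an invented price shape where the second half of a 4h window is clearly more expensive:

```python
from datetime import datetime, timedelta

MINUTES_PER_INTERVAL = 15  # assumed, matching the integration constant
hours = 4
start = datetime(2025, 1, 1, 13, 0)  # hypothetical next interval start

# Synthetic 15-min prices: 0.20 EUR for the first 2h, 0.30 EUR for the last 2h.
prices = [
    {"startsAt": start + timedelta(minutes=15 * i), "total": 0.20 if i < 8 else 0.30}
    for i in range(hours * 4)
]

# Same window boundaries as _calculate_later_half_average.
total = hours * 4
half = total // 2
lo = start + timedelta(minutes=MINUTES_PER_INTERVAL * half)
hi = start + timedelta(minutes=MINUTES_PER_INTERVAL * total)

later = [p["total"] for p in prices if lo <= p["startsAt"] < hi]
print(round(sum(later) / len(later), 2))  # 0.3
```

Only the last eight intervals fall inside `[lo, hi)`, so the later-half average reflects the expensive tail rather than the whole window.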

    def _calculate_trend_info(self) -> dict[str, Any] | None:
        """
        Centralized trend calculation for current_price_trend and next_price_trend_change sensors.

        This method calculates all trend-related information in one place to avoid duplication
        and ensure consistency between the two sensors. Results are cached per coordinator update.

        Returns:
            Dictionary with trend information for both sensors.

        """
        trend_cache_duration_seconds = 60  # Cache for 1 minute

        # Check if we have a valid cache
        now = dt_util.now()
        if (
            self._trend_calculation_cache is not None
            and self._trend_calculation_timestamp is not None
            and (now - self._trend_calculation_timestamp).total_seconds() < trend_cache_duration_seconds
        ):
            return self._trend_calculation_cache

        # Validate coordinator data
        if not self.coordinator.data:
            return None

        price_info = self.coordinator.data.get("priceInfo", {})
        all_intervals = price_info.get("today", []) + price_info.get("tomorrow", [])
        current_interval = find_price_data_for_interval(price_info, now)

        if not all_intervals or not current_interval:
            return None

        current_interval_start = dt_util.parse_datetime(current_interval["startsAt"])
        current_interval_start = dt_util.as_local(current_interval_start) if current_interval_start else None

        if not current_interval_start:
            return None

        current_index = self._find_current_interval_index(all_intervals, current_interval_start)
        if current_index is None:
            return None

        # Get configured thresholds
        thresholds = self._get_thresholds_config()

        # Step 1: Calculate current momentum from trailing data (1h weighted)
        current_price = float(current_interval["total"])
        current_momentum = self._calculate_momentum(current_price, all_intervals, current_index)

        # Step 2: Calculate 3h baseline trend for comparison
        current_trend_3h = self._calculate_standard_trend(all_intervals, current_index, current_interval, thresholds)

        # Step 3: Calculate final trend FIRST (momentum + future outlook)
        min_intervals_for_trend = 4
        standard_lookahead = 12  # 3 hours
        lookahead_intervals = standard_lookahead

        # Get future data
        future_intervals = all_intervals[current_index + 1 : current_index + lookahead_intervals + 1]
        future_prices = [float(fi["total"]) for fi in future_intervals if "total" in fi]

        # Combine momentum + future outlook to get ACTUAL current trend
        if len(future_intervals) >= min_intervals_for_trend and future_prices:
            future_avg = sum(future_prices) / len(future_prices)
            current_trend_state = self._combine_momentum_with_future(
                current_momentum=current_momentum,
                current_price=current_price,
                future_avg=future_avg,
                context={
                    "all_intervals": all_intervals,
                    "current_index": current_index,
                    "lookahead_intervals": lookahead_intervals,
                    "thresholds": thresholds,
                },
            )
        else:
            # Not enough future data - use 3h baseline as fallback
            current_trend_state = current_trend_3h

        # Step 4: Find next trend change FROM the current trend state (not momentum!)
        scan_params = {
            "current_index": current_index,
            "current_trend_state": current_trend_state,  # Use FINAL trend, not momentum
            "current_interval": current_interval,
            "now": now,
        }

        next_change_time = self._scan_for_trend_change(all_intervals, scan_params, thresholds)

        # Step 5: Find when current trend started (scan backward)
        trend_start_time, from_direction = self._find_trend_start_time(
            all_intervals, current_index, current_trend_state, thresholds
        )

        # Calculate duration of current trend
        trend_duration_minutes = None
        if trend_start_time:
            duration = now - trend_start_time
            trend_duration_minutes = int(duration.total_seconds() / 60)

        # Calculate minutes until change
        minutes_until_change = None
        if next_change_time:
            time_diff = next_change_time - now
            minutes_until_change = int(time_diff.total_seconds() / 60)

        result = {
            "current_trend_state": current_trend_state,
            "next_change_time": next_change_time,
            "trend_change_attributes": self._trend_change_attributes,
            "trend_start_time": trend_start_time,
            "from_direction": from_direction,
            "trend_duration_minutes": trend_duration_minutes,
            "minutes_until_change": minutes_until_change,
        }

        # Cache the result
        self._trend_calculation_cache = result
        self._trend_calculation_timestamp = now

        return result

    def _get_thresholds_config(self) -> dict[str, float]:
        """Get configured thresholds for trend calculation."""
        return {
            "rising": self.config.get("price_trend_threshold_rising", 5.0),
            "falling": self.config.get("price_trend_threshold_falling", -5.0),
            "moderate": self.config.get("volatility_threshold_moderate", 15.0),
            "high": self.config.get("volatility_threshold_high", 30.0),
        }

    def _calculate_momentum(self, current_price: float, all_intervals: list, current_index: int) -> str:
        """
        Calculate price momentum from weighted trailing average (last 1h).

        Args:
            current_price: Current interval price
            all_intervals: All price intervals
            current_index: Index of current interval

        Returns:
            Momentum direction: "rising", "falling", or "stable"

        """
        # Look back 1 hour (4 intervals) for quick reaction
        lookback_intervals = 4
        min_intervals = 2  # Need at least 30 minutes of history

        trailing_intervals = all_intervals[max(0, current_index - lookback_intervals) : current_index]

        if len(trailing_intervals) < min_intervals:
            return "stable"  # Not enough history

        # Weighted average: newer intervals count more
        # Weights: [0.5, 0.75, 1.0, 1.25] for 4 intervals (grows linearly)
        weights = [0.5 + 0.25 * i for i in range(len(trailing_intervals))]
        trailing_prices = [float(interval["total"]) for interval in trailing_intervals if "total" in interval]

        if not trailing_prices or len(trailing_prices) != len(weights):
            return "stable"

        weighted_sum = sum(price * weight for price, weight in zip(trailing_prices, weights, strict=True))
        weighted_avg = weighted_sum / sum(weights)

        # Calculate momentum with 3% threshold
        momentum_threshold = 0.03
        diff = (current_price - weighted_avg) / weighted_avg

        if diff > momentum_threshold:
            return "rising"
        if diff < -momentum_threshold:
            return "falling"
        return "stable"
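The weighted trailing average biases the momentum toward the most recent 15-minute intervals. A small worked example with invented prices, using the same weight formula and 3% threshold as `_calculate_momentum`:

```python
# Linearly growing weights favour the newest trailing intervals.
trailing_prices = [0.20, 0.21, 0.22, 0.23]  # oldest -> newest (invented)
weights = [0.5 + 0.25 * i for i in range(len(trailing_prices))]  # [0.5, 0.75, 1.0, 1.25]

weighted_avg = sum(p * w for p, w in zip(trailing_prices, weights)) / sum(weights)
current_price = 0.24
diff = (current_price - weighted_avg) / weighted_avg

# Same 3% momentum threshold as in the method above.
momentum = "rising" if diff > 0.03 else "falling" if diff < -0.03 else "stable"
print(momentum)  # rising
```

With these numbers the weighted average is about 0.2186, so a current price of 0.24 sits roughly 9.8% above it, comfortably past the rising threshold; a plain unweighted mean (0.215) would have reacted more sluggishly to the recent climb.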

    def _combine_momentum_with_future(
        self,
        *,
        current_momentum: str,
        current_price: float,
        future_avg: float,
        context: dict,
    ) -> str:
        """
        Combine momentum analysis with future outlook to determine final trend.

        Args:
            current_momentum: Current momentum direction (rising/falling/stable)
            current_price: Current interval price
            future_avg: Average price in future window
            context: Dict with all_intervals, current_index, lookahead_intervals, thresholds

        Returns:
            Final trend direction: "rising", "falling", or "stable"

        """
        if current_momentum == "rising":
            # We're in an uptrend - does it continue?
            return "rising" if future_avg >= current_price * 0.98 else "falling"

        if current_momentum == "falling":
            # We're in a downtrend - does it continue?
            return "falling" if future_avg <= current_price * 1.02 else "rising"

        # current_momentum == "stable" - what's coming?
        all_intervals = context["all_intervals"]
        current_index = context["current_index"]
        lookahead_intervals = context["lookahead_intervals"]
        thresholds = context["thresholds"]

        lookahead_for_volatility = all_intervals[current_index : current_index + lookahead_intervals]
        trend_state, _ = calculate_price_trend(
            current_price,
            future_avg,
            threshold_rising=thresholds["rising"],
            threshold_falling=thresholds["falling"],
            volatility_adjustment=True,
            lookahead_intervals=lookahead_intervals,
            all_intervals=lookahead_for_volatility,
            volatility_threshold_moderate=thresholds["moderate"],
            volatility_threshold_high=thresholds["high"],
        )
        return trend_state

    def _calculate_standard_trend(
        self,
        all_intervals: list,
        current_index: int,
        current_interval: dict,
        thresholds: dict,
    ) -> str:
        """Calculate standard 3h trend as baseline."""
        min_intervals_for_trend = 4
        standard_lookahead = 12  # 3 hours

        standard_future_intervals = all_intervals[current_index + 1 : current_index + standard_lookahead + 1]

        if len(standard_future_intervals) < min_intervals_for_trend:
            return "stable"

        standard_future_prices = [float(fi["total"]) for fi in standard_future_intervals if "total" in fi]
        if not standard_future_prices:
            return "stable"

        standard_future_avg = sum(standard_future_prices) / len(standard_future_prices)
        current_price = float(current_interval["total"])

        standard_lookahead_volatility = all_intervals[current_index : current_index + standard_lookahead]
        current_trend_3h, _ = calculate_price_trend(
            current_price,
            standard_future_avg,
            threshold_rising=thresholds["rising"],
            threshold_falling=thresholds["falling"],
            volatility_adjustment=True,
            lookahead_intervals=standard_lookahead,
            all_intervals=standard_lookahead_volatility,
            volatility_threshold_moderate=thresholds["moderate"],
            volatility_threshold_high=thresholds["high"],
        )

        return current_trend_3h

    def _find_current_interval_index(self, all_intervals: list, current_interval_start: datetime) -> int | None:
        """Find the index of current interval in all_intervals list."""
        for idx, interval in enumerate(all_intervals):
            interval_start = dt_util.parse_datetime(interval["startsAt"])
            if interval_start and dt_util.as_local(interval_start) == current_interval_start:
                return idx
        return None

    def _find_trend_start_time(
        self,
        all_intervals: list,
        current_index: int,
        current_trend_state: str,
        thresholds: dict,
    ) -> tuple[datetime | None, str | None]:
        """
        Find when the current trend started by scanning backward.

        Args:
            all_intervals: List of all price intervals
            current_index: Index of current interval
            current_trend_state: Current trend state ("rising", "falling", "stable")
            thresholds: Threshold configuration

        Returns:
            Tuple of (start_time, from_direction):
            - start_time: When current trend began, or None if at data boundary
            - from_direction: Previous trend direction, or None if unknown

        """
        intervals_in_3h = 12  # 3 hours = 12 intervals @ 15min each

        # Scan backward to find when trend changed TO current state
        for i in range(current_index - 1, max(-1, current_index - 97), -1):
            if i < 0:
                break

            interval = all_intervals[i]
            interval_start = dt_util.parse_datetime(interval["startsAt"])
            if not interval_start:
                continue
            interval_start = dt_util.as_local(interval_start)

            # Calculate trend at this past interval
            future_intervals = all_intervals[i + 1 : i + intervals_in_3h + 1]
            if len(future_intervals) < intervals_in_3h:
                break  # Not enough data to calculate trend

            future_prices = [float(fi["total"]) for fi in future_intervals if "total" in fi]
|
||||||
|
if not future_prices:
|
||||||
|
continue
|
||||||
|
|
||||||
|
future_avg = sum(future_prices) / len(future_prices)
|
||||||
|
price = float(interval["total"])
|
||||||
|
|
||||||
|
# Calculate trend at this past point
|
||||||
|
lookahead_for_volatility = all_intervals[i : i + intervals_in_3h]
|
||||||
|
trend_state, _ = calculate_price_trend(
|
||||||
|
price,
|
||||||
|
future_avg,
|
||||||
|
threshold_rising=thresholds["rising"],
|
||||||
|
threshold_falling=thresholds["falling"],
|
||||||
|
volatility_adjustment=True,
|
||||||
|
lookahead_intervals=intervals_in_3h,
|
||||||
|
all_intervals=lookahead_for_volatility,
|
||||||
|
volatility_threshold_moderate=thresholds["moderate"],
|
||||||
|
volatility_threshold_high=thresholds["high"],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check if trend was different from current trend state
|
||||||
|
if trend_state != current_trend_state:
|
||||||
|
# Found the change point - the NEXT interval is where current trend started
|
||||||
|
next_interval = all_intervals[i + 1]
|
||||||
|
trend_start = dt_util.parse_datetime(next_interval["startsAt"])
|
||||||
|
if trend_start:
|
||||||
|
return dt_util.as_local(trend_start), trend_state
|
||||||
|
|
||||||
|
# Reached data boundary - current trend extends beyond available data
|
||||||
|
return None, None
|
||||||
|
|
||||||
|
def _scan_for_trend_change(
|
||||||
|
self,
|
||||||
|
all_intervals: list,
|
||||||
|
scan_params: dict,
|
||||||
|
thresholds: dict,
|
||||||
|
) -> datetime | None:
|
||||||
|
"""
|
||||||
|
Scan future intervals for trend change.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
all_intervals: List of all price intervals
|
||||||
|
scan_params: Dict with current_index, current_trend_state, current_interval, now
|
||||||
|
thresholds: Dict with rising, falling, moderate, high threshold values
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Timestamp of next trend change, or None if no change in next 24h
|
||||||
|
|
||||||
|
"""
|
||||||
|
intervals_in_3h = 12 # 3 hours = 12 intervals @ 15min each
|
||||||
|
current_index = scan_params["current_index"]
|
||||||
|
current_trend_state = scan_params["current_trend_state"]
|
||||||
|
current_interval = scan_params["current_interval"]
|
||||||
|
now = scan_params["now"]
|
||||||
|
|
||||||
|
for i in range(current_index + 1, min(current_index + 97, len(all_intervals))):
|
||||||
|
interval = all_intervals[i]
|
||||||
|
interval_start = dt_util.parse_datetime(interval["startsAt"])
|
||||||
|
if not interval_start:
|
||||||
|
continue
|
||||||
|
interval_start = dt_util.as_local(interval_start)
|
||||||
|
|
||||||
|
# Skip if this interval is in the past
|
||||||
|
if interval_start <= now:
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Calculate trend at this future interval
|
||||||
|
future_intervals = all_intervals[i + 1 : i + intervals_in_3h + 1]
|
||||||
|
if len(future_intervals) < intervals_in_3h:
|
||||||
|
break # Not enough data to calculate trend
|
||||||
|
|
||||||
|
future_prices = [float(fi["total"]) for fi in future_intervals if "total" in fi]
|
||||||
|
if not future_prices:
|
||||||
|
continue
|
||||||
|
|
||||||
|
future_avg = sum(future_prices) / len(future_prices)
|
||||||
|
current_price = float(interval["total"])
|
||||||
|
|
||||||
|
# Calculate trend at this future point
|
||||||
|
lookahead_for_volatility = all_intervals[i : i + intervals_in_3h]
|
||||||
|
trend_state, _ = calculate_price_trend(
|
||||||
|
current_price,
|
||||||
|
future_avg,
|
||||||
|
threshold_rising=thresholds["rising"],
|
||||||
|
threshold_falling=thresholds["falling"],
|
||||||
|
volatility_adjustment=True,
|
||||||
|
lookahead_intervals=intervals_in_3h,
|
||||||
|
all_intervals=lookahead_for_volatility,
|
||||||
|
volatility_threshold_moderate=thresholds["moderate"],
|
||||||
|
volatility_threshold_high=thresholds["high"],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check if trend changed from current trend state
|
||||||
|
# We want to find ANY change from current state, including changes to/from stable
|
||||||
|
if trend_state != current_trend_state:
|
||||||
|
# Store details for attributes
|
||||||
|
time_diff = interval_start - now
|
||||||
|
minutes_until = int(time_diff.total_seconds() / 60)
|
||||||
|
|
||||||
|
self._trend_change_attributes = {
|
||||||
|
"direction": trend_state,
|
||||||
|
"from_direction": current_trend_state,
|
||||||
|
"minutes_until_change": minutes_until,
|
||||||
|
"current_price_now": round(float(current_interval["total"]) * 100, 2),
|
||||||
|
"price_at_change": round(current_price * 100, 2),
|
||||||
|
"avg_after_change": round(future_avg * 100, 2),
|
||||||
|
"trend_diff_%": round((future_avg - current_price) / current_price * 100, 1),
|
||||||
|
}
|
||||||
|
return interval_start
|
||||||
|
|
||||||
|
return None
|
||||||
custom_components/tibber_prices/sensor/calculators/volatility.py (new file, 111 lines)
@@ -0,0 +1,111 @@
"""Calculator for price volatility analysis."""

from __future__ import annotations

from typing import TYPE_CHECKING

from custom_components.tibber_prices.entity_utils import add_icon_color_attribute
from custom_components.tibber_prices.sensor.attributes import (
    add_volatility_type_attributes,
    get_prices_for_volatility,
)
from custom_components.tibber_prices.utils.price import calculate_volatility_level

from .base import BaseCalculator

if TYPE_CHECKING:
    from typing import Any


class VolatilityCalculator(BaseCalculator):
    """
    Calculator for price volatility analysis.

    Calculates volatility levels (low, moderate, high, very_high) using the coefficient
    of variation for different time periods (today, tomorrow, next 24h, today+tomorrow).
    """

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        """
        Initialize the calculator.

        Args:
            *args: Positional arguments passed to BaseCalculator.
            **kwargs: Keyword arguments passed to BaseCalculator.

        """
        super().__init__(*args, **kwargs)
        self._last_volatility_attributes: dict[str, Any] = {}

    def get_volatility_value(self, *, volatility_type: str) -> str | None:
        """
        Calculate price volatility using the coefficient of variation for different time periods.

        Also stores detailed attributes in self._last_volatility_attributes for use in
        extra_state_attributes.

        Args:
            volatility_type: One of "today", "tomorrow", "next_24h", "today_tomorrow".

        Returns:
            Volatility level: "low", "moderate", "high", "very_high", or None if unavailable.

        """
        if not self.coordinator_data:
            return None

        price_info = self.price_info

        # Get volatility thresholds from config
        thresholds = {
            "threshold_moderate": self.config.get("volatility_threshold_moderate", 5.0),
            "threshold_high": self.config.get("volatility_threshold_high", 15.0),
            "threshold_very_high": self.config.get("volatility_threshold_very_high", 30.0),
        }

        # Get prices based on the volatility type
        prices_to_analyze = get_prices_for_volatility(volatility_type, price_info)

        if not prices_to_analyze:
            return None

        # Calculate spread and basic statistics
        price_min = min(prices_to_analyze)
        price_max = max(prices_to_analyze)
        spread = price_max - price_min
        price_avg = sum(prices_to_analyze) / len(prices_to_analyze)

        # Convert to minor currency units (ct/øre) for display
        spread_minor = spread * 100

        # Calculate the volatility level with custom thresholds (pass the price list, not the spread)
        volatility = calculate_volatility_level(prices_to_analyze, **thresholds)

        # Store attributes for this sensor
        self._last_volatility_attributes = {
            "price_spread": round(spread_minor, 2),
            "price_volatility": volatility,
            "price_min": round(price_min * 100, 2),
            "price_max": round(price_max * 100, 2),
            "price_avg": round(price_avg * 100, 2),
            "interval_count": len(prices_to_analyze),
        }

        # Add icon_color for dynamic styling
        add_icon_color_attribute(self._last_volatility_attributes, key="volatility", state_value=volatility)

        # Add type-specific attributes
        add_volatility_type_attributes(self._last_volatility_attributes, volatility_type, price_info, thresholds)

        # Return lowercase for the ENUM device class
        return volatility.lower()

    def get_volatility_attributes(self) -> dict[str, Any]:
        """
        Get the stored volatility attributes from the last calculation.

        Returns:
            Dictionary of volatility attributes, or an empty dict if no calculation yet.

        """
        return self._last_volatility_attributes
custom_components/tibber_prices/sensor/calculators/window_24h.py (new file, 52 lines)
@@ -0,0 +1,52 @@
"""Calculator for 24-hour sliding window statistics."""

from __future__ import annotations

from typing import TYPE_CHECKING

from custom_components.tibber_prices.entity_utils import get_price_value

from .base import BaseCalculator

if TYPE_CHECKING:
    from collections.abc import Callable


class Window24hCalculator(BaseCalculator):
    """
    Calculator for 24-hour sliding window statistics.

    Handles sensors that calculate statistics over a 24-hour window relative to
    the current interval (trailing = previous 24h, leading = next 24h).
    """

    def get_24h_window_value(
        self,
        *,
        stat_func: Callable,
    ) -> float | None:
        """
        Unified method for 24-hour sliding window statistics.

        Calculates statistics over a 24-hour window relative to the current interval:
        - "trailing": Previous 24 hours (96 intervals before current)
        - "leading": Next 24 hours (96 intervals after current)

        Args:
            stat_func: Function from average_utils (e.g., calculate_current_trailing_avg).

        Returns:
            Price value in minor currency units (cents/øre), or None if unavailable.

        """
        if not self.coordinator_data:
            return None

        value = stat_func(self.coordinator_data)

        if value is None:
            return None

        # Always return in minor currency units (cents/øre) with 2 decimals
        result = get_price_value(value, in_euro=False)
        return round(result, 2)
custom_components/tibber_prices/sensor/chart_data.py (new file, 144 lines)
@@ -0,0 +1,144 @@
"""Chart data export functionality for Tibber Prices sensors."""

from __future__ import annotations

from typing import TYPE_CHECKING

import yaml

from custom_components.tibber_prices.const import CONF_CHART_DATA_CONFIG, DOMAIN

if TYPE_CHECKING:
    from datetime import datetime

    from custom_components.tibber_prices.coordinator import TibberPricesDataUpdateCoordinator
    from custom_components.tibber_prices.data import TibberPricesConfigEntry
    from homeassistant.core import HomeAssistant


async def call_chartdata_service_async(
    hass: HomeAssistant,
    coordinator: TibberPricesDataUpdateCoordinator,
    config_entry: TibberPricesConfigEntry,
) -> tuple[dict | None, str | None]:
    """
    Call the get_chartdata service with user-configured YAML (async).

    Returns:
        Tuple of (response, error_message).
        If successful: (response_dict, None)
        If failed: (None, error_string)

    """
    # Get the user-configured YAML
    yaml_config = config_entry.options.get(CONF_CHART_DATA_CONFIG, "")

    # Parse the YAML if provided, otherwise use an empty dict (service defaults)
    service_params = {}
    if yaml_config and yaml_config.strip():
        try:
            parsed = yaml.safe_load(yaml_config)
            # Ensure we have a dict (yaml.safe_load can return str, int, etc.)
            if isinstance(parsed, dict):
                service_params = parsed
            else:
                coordinator.logger.warning(
                    "YAML configuration must be a dictionary, got %s. Using service defaults.",
                    type(parsed).__name__,
                )
                service_params = {}
        except yaml.YAMLError as err:
            coordinator.logger.warning(
                "Invalid chart data YAML configuration: %s. Using service defaults.",
                err,
            )
            service_params = {}  # Fall back to service defaults

    # Add the required entry_id parameter
    service_params["entry_id"] = config_entry.entry_id

    # Call the get_chartdata service using the official HA service system
    try:
        response = await hass.services.async_call(
            DOMAIN,
            "get_chartdata",
            service_params,
            blocking=True,
            return_response=True,
        )
    except Exception as ex:
        coordinator.logger.exception("Chart data service call failed")
        return None, str(ex)
    else:
        return response, None


def get_chart_data_state(
    chart_data_response: dict | None,
    chart_data_error: str | None,
) -> str | None:
    """
    Return the state for the chart_data_export sensor.

    Args:
        chart_data_response: Last service response (or None)
        chart_data_error: Last error message (or None)

    Returns:
        "error" if an error occurred
        "ready" if data is available
        "pending" if no data yet

    """
    if chart_data_error:
        return "error"
    if chart_data_response:
        return "ready"
    return "pending"


def build_chart_data_attributes(
    chart_data_response: dict | None,
    chart_data_last_update: datetime | None,
    chart_data_error: str | None,
) -> dict[str, object] | None:
    """
    Return chart data from the last service call as attributes with metadata.

    Attribute order: timestamp, error (if any), service data (at the end).

    Args:
        chart_data_response: Last service response
        chart_data_last_update: Timestamp of the last update
        chart_data_error: Error message if the service call failed

    Returns:
        Dict with timestamp, optional error, and service response data.

    """
    # Build base attributes with metadata FIRST
    attributes: dict[str, object] = {
        "timestamp": chart_data_last_update.isoformat() if chart_data_last_update else None,
    }

    # Add the error message if the service call failed
    if chart_data_error:
        attributes["error"] = chart_data_error

    if not chart_data_response:
        # No data - only metadata (timestamp, error)
        return attributes

    # Service data goes LAST - after the metadata
    if isinstance(chart_data_response, dict):
        if len(chart_data_response) > 1:
            # Multiple keys -> wrap to prevent collision with metadata
            attributes["data"] = chart_data_response
        else:
            # Single key -> safe to merge directly
            attributes.update(chart_data_response)
    else:
        # If the response is an array/list/primitive, wrap it in the "data" key
        attributes["data"] = chart_data_response

    return attributes
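The `isinstance` check after `yaml.safe_load` matters because syntactically valid YAML need not be a mapping: a bare scalar or a list parses without error but cannot be passed as service parameters. A minimal standalone sketch of the same parse-and-fallback behavior (no Home Assistant dependencies, logging omitted):

```python
import yaml


def parse_service_params(yaml_config: str) -> dict:
    """Parse user-supplied YAML into service parameters, falling back to an empty dict."""
    if not yaml_config or not yaml_config.strip():
        return {}
    try:
        parsed = yaml.safe_load(yaml_config)
    except yaml.YAMLError:
        return {}  # malformed YAML -> service defaults
    # yaml.safe_load can return str, int, list, None, etc. - only accept a mapping
    return parsed if isinstance(parsed, dict) else {}
```

Both failure modes (malformed YAML and non-mapping YAML) collapse to the same fallback, which is what the warning branches above implement.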
File diff suppressed because it is too large.
@@ -5,6 +5,8 @@ This module contains helper functions specific to the sensor platform:
- aggregate_price_data: Calculate average price from window data
- aggregate_level_data: Aggregate price levels from intervals
- aggregate_rating_data: Aggregate price ratings from intervals
- aggregate_window_data: Unified aggregation based on value type
- get_hourly_price_value: Get price for a specific hour with offset

For shared helper functions (used by both sensor and binary_sensor platforms),
see entity_utils/helpers.py:

@@ -16,10 +18,18 @@ see entity_utils/helpers.py:

from __future__ import annotations

from datetime import timedelta
from typing import TYPE_CHECKING

from custom_components.tibber_prices.entity_utils.helpers import get_price_value
from custom_components.tibber_prices.utils.price import (
    aggregate_price_levels,
    aggregate_price_rating,
)
from homeassistant.util import dt as dt_util

if TYPE_CHECKING:
    from collections.abc import Callable


def aggregate_price_data(window_data: list[dict]) -> float | None:

@@ -81,3 +91,98 @@ def aggregate_rating_data(

    aggregated, _ = aggregate_price_rating(differences, threshold_low, threshold_high)
    return aggregated.lower() if aggregated else None


def aggregate_window_data(
    window_data: list[dict],
    value_type: str,
    threshold_low: float,
    threshold_high: float,
) -> str | float | None:
    """
    Aggregate data from multiple intervals based on value type.

    Unified helper that routes to the appropriate aggregation function.

    Args:
        window_data: List of price interval dictionaries
        value_type: Type of value to aggregate ('price', 'level', or 'rating')
        threshold_low: Low threshold for rating calculation
        threshold_high: High threshold for rating calculation

    Returns:
        Aggregated value (price as float, level/rating as str), or None if no data

    """
    # Map value types to aggregation functions
    aggregators: dict[str, Callable] = {
        "price": lambda data: aggregate_price_data(data),
        "level": lambda data: aggregate_level_data(data),
        "rating": lambda data: aggregate_rating_data(data, threshold_low, threshold_high),
    }

    aggregator = aggregators.get(value_type)
    if aggregator:
        return aggregator(window_data)
    return None


def get_hourly_price_value(
    price_info: dict,
    *,
    hour_offset: int,
    in_euro: bool,
) -> float | None:
    """
    Get the price for the current hour or with an offset.

    Legacy helper for hourly price access (not used by the Calculator Pattern).
    Kept for potential backward compatibility.

    Args:
        price_info: Price information dict with 'today' and 'tomorrow' keys
        hour_offset: Hour offset from the current time (positive = future, negative = past)
        in_euro: If True, return the price in major currency (EUR), else minor (cents/øre)

    Returns:
        Price value, or None if not found

    """
    # Use Home Assistant's dt_util to get the current time in the user's timezone
    now = dt_util.now()

    # Calculate the exact target datetime (not just the hour).
    # This properly handles day boundaries.
    target_datetime = now.replace(microsecond=0) + timedelta(hours=hour_offset)
    target_hour = target_datetime.hour
    target_date = target_datetime.date()

    # Determine which day's data we need
    day_key = "tomorrow" if target_date > now.date() else "today"

    for price_data in price_info.get(day_key, []):
        # Parse the timestamp and convert to local time
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue

        # Make sure it's in the local timezone for proper comparison
        starts_at = dt_util.as_local(starts_at)

        # Compare using both hour and date for accuracy
        if starts_at.hour == target_hour and starts_at.date() == target_date:
            return get_price_value(float(price_data["total"]), in_euro=in_euro)

    # If we didn't find the price in the expected day's data, check the other day.
    # This is a fallback for potential edge cases.
    other_day_key = "today" if day_key == "tomorrow" else "tomorrow"
    for price_data in price_info.get(other_day_key, []):
        starts_at = dt_util.parse_datetime(price_data["startsAt"])
        if starts_at is None:
            continue

        starts_at = dt_util.as_local(starts_at)
        if starts_at.hour == target_hour and starts_at.date() == target_date:
            return get_price_value(float(price_data["total"]), in_euro=in_euro)

    return None
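The day-boundary handling in `get_hourly_price_value` hinges on computing the full target datetime before splitting it into a day key and an hour. That arithmetic can be isolated into a small sketch (hypothetical `resolve_target` helper, naive datetimes for brevity where the real code uses `dt_util` timezone-aware values):

```python
from datetime import datetime, timedelta


def resolve_target(now: datetime, hour_offset: int) -> tuple[str, int]:
    """Return (day_key, target_hour) for an hour offset, handling day boundaries."""
    target = now + timedelta(hours=hour_offset)
    # Crossing midnight forward moves the lookup into the "tomorrow" price list
    day_key = "tomorrow" if target.date() > now.date() else "today"
    return day_key, target.hour
```

At 23:30, an offset of +1 correctly resolves to hour 0 of tomorrow rather than a nonexistent hour 24 of today.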
custom_components/tibber_prices/sensor/value_getters.py (new file, 276 lines)
@@ -0,0 +1,276 @@
"""Value getter mapping for Tibber Prices sensors."""

from __future__ import annotations

from typing import TYPE_CHECKING

from custom_components.tibber_prices.utils.average import (
    calculate_current_leading_avg,
    calculate_current_leading_max,
    calculate_current_leading_min,
    calculate_current_trailing_avg,
    calculate_current_trailing_max,
    calculate_current_trailing_min,
)

if TYPE_CHECKING:
    from collections.abc import Callable

    from custom_components.tibber_prices.sensor.calculators.daily_stat import DailyStatCalculator
    from custom_components.tibber_prices.sensor.calculators.interval import IntervalCalculator
    from custom_components.tibber_prices.sensor.calculators.metadata import MetadataCalculator
    from custom_components.tibber_prices.sensor.calculators.rolling_hour import RollingHourCalculator
    from custom_components.tibber_prices.sensor.calculators.timing import TimingCalculator
    from custom_components.tibber_prices.sensor.calculators.trend import TrendCalculator
    from custom_components.tibber_prices.sensor.calculators.volatility import VolatilityCalculator
    from custom_components.tibber_prices.sensor.calculators.window_24h import Window24hCalculator


def get_value_getter_mapping(  # noqa: PLR0913 - needs all calculators as parameters
    interval_calculator: IntervalCalculator,
    rolling_hour_calculator: RollingHourCalculator,
    daily_stat_calculator: DailyStatCalculator,
    window_24h_calculator: Window24hCalculator,
    trend_calculator: TrendCalculator,
    timing_calculator: TimingCalculator,
    volatility_calculator: VolatilityCalculator,
    metadata_calculator: MetadataCalculator,
    get_next_avg_n_hours_value: Callable[[int], float | None],
    get_price_forecast_value: Callable[[], str | None],
    get_data_timestamp: Callable[[], str | None],
    get_chart_data_export_value: Callable[[], str | None],
) -> dict[str, Callable]:
    """
    Build a mapping from entity key to value getter callable.

    This function centralizes the handler mapping logic, making it easier to maintain
    and to understand the relationship between sensor types and their calculation methods.

    Args:
        interval_calculator: Calculator for current/next/previous interval values
        rolling_hour_calculator: Calculator for 5-interval rolling windows
        daily_stat_calculator: Calculator for daily min/max/avg statistics
        window_24h_calculator: Calculator for trailing/leading 24h windows
        trend_calculator: Calculator for price trend analysis
        timing_calculator: Calculator for best/peak price period timing
        volatility_calculator: Calculator for price volatility analysis
        metadata_calculator: Calculator for home/metering metadata
        get_next_avg_n_hours_value: Method for next N-hour average forecasts
        get_price_forecast_value: Method for the price forecast sensor
        get_data_timestamp: Method for the data timestamp sensor
        get_chart_data_export_value: Method for the chart data export sensor

    Returns:
        Dictionary mapping entity keys to their value getter callables.

    """
    return {
        # ================================================================
        # INTERVAL-BASED SENSORS - via IntervalCalculator
        # ================================================================
        # Price level sensors
        "current_interval_price_level": interval_calculator.get_price_level_value,
        "next_interval_price_level": lambda: interval_calculator.get_interval_value(
            interval_offset=1, value_type="level"
        ),
        "previous_interval_price_level": lambda: interval_calculator.get_interval_value(
            interval_offset=-1, value_type="level"
        ),
        # Price sensors (in cents)
        "current_interval_price": lambda: interval_calculator.get_interval_value(
            interval_offset=0, value_type="price", in_euro=False
        ),
        "current_interval_price_major": lambda: interval_calculator.get_interval_value(
            interval_offset=0, value_type="price", in_euro=True
        ),
        "next_interval_price": lambda: interval_calculator.get_interval_value(
            interval_offset=1, value_type="price", in_euro=False
        ),
        "previous_interval_price": lambda: interval_calculator.get_interval_value(
            interval_offset=-1, value_type="price", in_euro=False
        ),
        # Rating sensors
        "current_interval_price_rating": lambda: interval_calculator.get_rating_value(rating_type="current"),
        "next_interval_price_rating": lambda: interval_calculator.get_interval_value(
            interval_offset=1, value_type="rating"
        ),
        "previous_interval_price_rating": lambda: interval_calculator.get_interval_value(
            interval_offset=-1, value_type="rating"
        ),
        # ================================================================
        # ROLLING HOUR SENSORS (5-interval windows) - via RollingHourCalculator
        # ================================================================
        "current_hour_price_level": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=0, value_type="level"
        ),
        "next_hour_price_level": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=1, value_type="level"
        ),
        # Rolling hour average (5 intervals: 2 before + current + 2 after)
        "current_hour_average_price": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=0, value_type="price"
        ),
        "next_hour_average_price": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=1, value_type="price"
        ),
        "current_hour_price_rating": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=0, value_type="rating"
        ),
        "next_hour_price_rating": lambda: rolling_hour_calculator.get_rolling_hour_value(
            hour_offset=1, value_type="rating"
        ),
        # ================================================================
        # DAILY STATISTICS SENSORS - via DailyStatCalculator
        # ================================================================
        "lowest_price_today": lambda: daily_stat_calculator.get_daily_stat_value(day="today", stat_func=min),
        "highest_price_today": lambda: daily_stat_calculator.get_daily_stat_value(day="today", stat_func=max),
        "average_price_today": lambda: daily_stat_calculator.get_daily_stat_value(
            day="today",
            stat_func=lambda prices: sum(prices) / len(prices),
        ),
        # Tomorrow statistics sensors
        "lowest_price_tomorrow": lambda: daily_stat_calculator.get_daily_stat_value(day="tomorrow", stat_func=min),
        "highest_price_tomorrow": lambda: daily_stat_calculator.get_daily_stat_value(day="tomorrow", stat_func=max),
        "average_price_tomorrow": lambda: daily_stat_calculator.get_daily_stat_value(
            day="tomorrow",
            stat_func=lambda prices: sum(prices) / len(prices),
        ),
        # Daily aggregated level sensors
        "yesterday_price_level": lambda: daily_stat_calculator.get_daily_aggregated_value(
            day="yesterday", value_type="level"
        ),
        "today_price_level": lambda: daily_stat_calculator.get_daily_aggregated_value(day="today", value_type="level"),
        "tomorrow_price_level": lambda: daily_stat_calculator.get_daily_aggregated_value(
            day="tomorrow", value_type="level"
        ),
        # Daily aggregated rating sensors
        "yesterday_price_rating": lambda: daily_stat_calculator.get_daily_aggregated_value(
            day="yesterday", value_type="rating"
        ),
        "today_price_rating": lambda: daily_stat_calculator.get_daily_aggregated_value(
            day="today", value_type="rating"
        ),
        "tomorrow_price_rating": lambda: daily_stat_calculator.get_daily_aggregated_value(
|
||||||
|
day="tomorrow", value_type="rating"
|
||||||
|
),
|
||||||
|
# ================================================================
|
||||||
|
# 24H WINDOW SENSORS (trailing/leading from current) - via Window24hCalculator
|
||||||
|
# ================================================================
|
||||||
|
# Trailing and leading average sensors
|
||||||
|
"trailing_price_average": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_trailing_avg,
|
||||||
|
),
|
||||||
|
"leading_price_average": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_leading_avg,
|
||||||
|
),
|
||||||
|
# Trailing and leading min/max sensors
|
||||||
|
"trailing_price_min": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_trailing_min,
|
||||||
|
),
|
||||||
|
"trailing_price_max": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_trailing_max,
|
||||||
|
),
|
||||||
|
"leading_price_min": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_leading_min,
|
||||||
|
),
|
||||||
|
"leading_price_max": lambda: window_24h_calculator.get_24h_window_value(
|
||||||
|
stat_func=calculate_current_leading_max,
|
||||||
|
),
|
||||||
|
# ================================================================
|
||||||
|
# FUTURE FORECAST SENSORS
|
||||||
|
# ================================================================
|
||||||
|
# Future average sensors (next N hours from next interval)
|
||||||
|
"next_avg_1h": lambda: get_next_avg_n_hours_value(hours=1),
|
||||||
|
"next_avg_2h": lambda: get_next_avg_n_hours_value(hours=2),
|
||||||
|
"next_avg_3h": lambda: get_next_avg_n_hours_value(hours=3),
|
||||||
|
"next_avg_4h": lambda: get_next_avg_n_hours_value(hours=4),
|
||||||
|
"next_avg_5h": lambda: get_next_avg_n_hours_value(hours=5),
|
||||||
|
"next_avg_6h": lambda: get_next_avg_n_hours_value(hours=6),
|
||||||
|
"next_avg_8h": lambda: get_next_avg_n_hours_value(hours=8),
|
||||||
|
"next_avg_12h": lambda: get_next_avg_n_hours_value(hours=12),
|
||||||
|
# Current and next trend change sensors
|
||||||
|
"current_price_trend": trend_calculator.get_current_trend_value,
|
||||||
|
"next_price_trend_change": trend_calculator.get_next_trend_change_value,
|
||||||
|
# Price trend sensors
|
||||||
|
"price_trend_1h": lambda: trend_calculator.get_price_trend_value(hours=1),
|
||||||
|
"price_trend_2h": lambda: trend_calculator.get_price_trend_value(hours=2),
|
||||||
|
"price_trend_3h": lambda: trend_calculator.get_price_trend_value(hours=3),
|
||||||
|
"price_trend_4h": lambda: trend_calculator.get_price_trend_value(hours=4),
|
||||||
|
"price_trend_5h": lambda: trend_calculator.get_price_trend_value(hours=5),
|
||||||
|
"price_trend_6h": lambda: trend_calculator.get_price_trend_value(hours=6),
|
||||||
|
"price_trend_8h": lambda: trend_calculator.get_price_trend_value(hours=8),
|
||||||
|
"price_trend_12h": lambda: trend_calculator.get_price_trend_value(hours=12),
|
||||||
|
# Diagnostic sensors
|
||||||
|
"data_timestamp": get_data_timestamp,
|
||||||
|
# Price forecast sensor
|
||||||
|
"price_forecast": get_price_forecast_value,
|
||||||
|
# Home metadata sensors (via MetadataCalculator)
|
||||||
|
"home_type": lambda: metadata_calculator.get_home_metadata_value("type"),
|
||||||
|
"home_size": lambda: metadata_calculator.get_home_metadata_value("size"),
|
||||||
|
"main_fuse_size": lambda: metadata_calculator.get_home_metadata_value("mainFuseSize"),
|
||||||
|
"number_of_residents": lambda: metadata_calculator.get_home_metadata_value("numberOfResidents"),
|
||||||
|
"primary_heating_source": lambda: metadata_calculator.get_home_metadata_value("primaryHeatingSource"),
|
||||||
|
# Metering point sensors (via MetadataCalculator)
|
||||||
|
"grid_company": lambda: metadata_calculator.get_metering_point_value("gridCompany"),
|
||||||
|
"grid_area_code": lambda: metadata_calculator.get_metering_point_value("gridAreaCode"),
|
||||||
|
"price_area_code": lambda: metadata_calculator.get_metering_point_value("priceAreaCode"),
|
||||||
|
"consumption_ean": lambda: metadata_calculator.get_metering_point_value("consumptionEan"),
|
||||||
|
"production_ean": lambda: metadata_calculator.get_metering_point_value("productionEan"),
|
||||||
|
"energy_tax_type": lambda: metadata_calculator.get_metering_point_value("energyTaxType"),
|
||||||
|
"vat_type": lambda: metadata_calculator.get_metering_point_value("vatType"),
|
||||||
|
"estimated_annual_consumption": lambda: metadata_calculator.get_metering_point_value(
|
||||||
|
"estimatedAnnualConsumption"
|
||||||
|
),
|
||||||
|
# Subscription sensors (via MetadataCalculator)
|
||||||
|
"subscription_status": lambda: metadata_calculator.get_subscription_value("status"),
|
||||||
|
# Volatility sensors (via VolatilityCalculator)
|
||||||
|
"today_volatility": lambda: volatility_calculator.get_volatility_value(volatility_type="today"),
|
||||||
|
"tomorrow_volatility": lambda: volatility_calculator.get_volatility_value(volatility_type="tomorrow"),
|
||||||
|
"next_24h_volatility": lambda: volatility_calculator.get_volatility_value(volatility_type="next_24h"),
|
||||||
|
"today_tomorrow_volatility": lambda: volatility_calculator.get_volatility_value(
|
||||||
|
volatility_type="today_tomorrow"
|
||||||
|
),
|
||||||
|
# ================================================================
|
||||||
|
# BEST/PEAK PRICE TIMING SENSORS - via TimingCalculator
|
||||||
|
# ================================================================
|
||||||
|
# Best Price timing sensors
|
||||||
|
"best_price_end_time": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="end_time"
|
||||||
|
),
|
||||||
|
"best_price_period_duration": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="period_duration"
|
||||||
|
),
|
||||||
|
"best_price_remaining_minutes": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="remaining_minutes"
|
||||||
|
),
|
||||||
|
"best_price_progress": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="progress"
|
||||||
|
),
|
||||||
|
"best_price_next_start_time": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="next_start_time"
|
||||||
|
),
|
||||||
|
"best_price_next_in_minutes": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="best_price", value_type="next_in_minutes"
|
||||||
|
),
|
||||||
|
# Peak Price timing sensors
|
||||||
|
"peak_price_end_time": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="end_time"
|
||||||
|
),
|
||||||
|
"peak_price_period_duration": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="period_duration"
|
||||||
|
),
|
||||||
|
"peak_price_remaining_minutes": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="remaining_minutes"
|
||||||
|
),
|
||||||
|
"peak_price_progress": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="progress"
|
||||||
|
),
|
||||||
|
"peak_price_next_start_time": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="next_start_time"
|
||||||
|
),
|
||||||
|
"peak_price_next_in_minutes": lambda: timing_calculator.get_period_timing_value(
|
||||||
|
period_type="peak_price", value_type="next_in_minutes"
|
||||||
|
),
|
||||||
|
# Chart data export sensor
|
||||||
|
"chart_data_export": get_chart_data_export_value,
|
||||||
|
}
|
||||||