Mirror of https://github.com/jpawlowski/hass.tibber_prices.git (synced 2026-03-30 05:13:40 +00:00)
refactor(price_info): rework price data handling to use unified interval retrieval
- Introduced `get_intervals_for_day_offsets` helper to streamline access to price intervals for yesterday, today, and tomorrow.
- Updated various components to replace direct access to `priceInfo` with the new helper, ensuring a flat structure for price intervals.
- Adjusted calculations and data processing methods to accommodate the new data structure.
- Enhanced documentation to reflect changes in caching strategy and data structure.
This commit is contained in:
parent 294d84128b
commit 981fb08a69

35 changed files with 726 additions and 736 deletions
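The commit message centers on the new `get_intervals_for_day_offsets` helper over a flat `priceInfo` list. A minimal sketch of what such a helper could look like; the signature, the explicit `today` parameter, and the sample data are illustrative assumptions, not the project's actual code (the real helper presumably derives "today" from the coordinator's time service):

```python
from datetime import date, timedelta


def get_intervals_for_day_offsets(data: dict, offsets: list[int], *, today: date) -> list[dict]:
    """Return intervals from the flat priceInfo list whose date matches any offset.

    Offsets are relative to today: -1 = yesterday, 0 = today, 1 = tomorrow.
    `today` is passed explicitly here only to keep the sketch deterministic.
    """
    wanted = {today + timedelta(days=off) for off in offsets}
    return [
        interval
        for interval in data.get("priceInfo", [])
        # startsAt is an ISO timestamp; its first 10 chars are the local date
        if date.fromisoformat(interval["startsAt"][:10]) in wanted
    ]
```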
@@ -375,7 +375,7 @@ After successful refactoring:

 **Core Data Flow:**

-1. `TibberPricesApiClient` (`api.py`) queries Tibber's GraphQL API with `resolution:QUARTER_HOURLY` for user data and prices (yesterday/today/tomorrow - 192 intervals total)
+1. `TibberPricesApiClient` (`api.py`) queries Tibber's GraphQL API with `resolution:QUARTER_HOURLY` for user data and prices (day before yesterday/yesterday/today/tomorrow - 384 intervals total, ensuring trailing 24h averages are accurate for all intervals)
 2. `TibberPricesDataUpdateCoordinator` (`coordinator.py`) orchestrates updates every 15 minutes, manages persistent storage via `Store`, and schedules quarter-hour entity refreshes
 3. Price enrichment functions (`utils/price.py`, `utils/average.py`) calculate trailing/leading 24h averages, price differences, and rating levels for each 15-minute interval
 4. Entity platforms (`sensor/` package, `binary_sensor/` package) expose enriched data as Home Assistant entities
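The 384-interval figure in the updated line follows directly from the quarter-hourly resolution; a quick sanity check in plain arithmetic (not project code):

```python
# Quarter-hourly resolution: 4 intervals per hour, 24 hours per day.
INTERVALS_PER_DAY = 4 * 24  # 96

# The refactor fetches four days: day before yesterday, yesterday, today, tomorrow.
DAYS_FETCHED = 4
TOTAL_INTERVALS = INTERVALS_PER_DAY * DAYS_FETCHED
```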
@@ -1749,7 +1749,7 @@ Never use raw API price data directly. Always enrich via `enrich_price_info_with

 Always use `dt_util` from `homeassistant.util` instead of Python's `datetime` module for timezone-aware operations. **Critical:** Use `dt_util.as_local()` when comparing API timestamps to local time. Import datetime types only for type hints: `from datetime import date, datetime, timedelta`.

 **4. Coordinator Data Structure**

-Coordinator data follows structure: `coordinator.data = {"user_data": {...}, "priceInfo": {"yesterday": [...], "today": [...], "tomorrow": [...], "currency": "EUR"}}`. Each price list contains enriched interval dicts. See `coordinator/core.py` for data management.
+Coordinator data follows structure: `coordinator.data = {"user_data": {...}, "priceInfo": [...], "currency": "EUR"}`. The `priceInfo` is a flat list containing all enriched interval dicts (yesterday + today + tomorrow). Currency is stored at top level for easy access. See `coordinator/core.py` for data management.

 **5. Service Response Pattern**

 Services returning data must declare `supports_response=SupportsResponse.ONLY` in registration. See `services.py` for implementation patterns.
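The refactored flat shape can be illustrated with a toy payload; the values are invented, but the keys mirror the structure described in the hunk above:

```python
# Hypothetical coordinator payload in the refactored flat shape
coordinator_data = {
    "user_data": {"viewer": {"homes": []}},
    "priceInfo": [
        # one flat, chronologically ordered list instead of per-day keys
        {"startsAt": "2025-01-01T00:00:00+01:00", "total": 0.25, "level": "NORMAL"},
        {"startsAt": "2025-01-01T00:15:00+01:00", "total": 0.27, "level": "NORMAL"},
    ],
    "currency": "EUR",  # moved to the top level
}

first_total = coordinator_data["priceInfo"][0]["total"]
currency = coordinator_data["currency"]
```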
@@ -2257,7 +2257,7 @@ df = (

 # ✅ Annotate function signatures (public functions)
 def get_current_interval_price(coordinator: DataUpdateCoordinator) -> float:
     """Get current price from coordinator."""
-    return coordinator.data["priceInfo"]["today"][0]["total"]
+    return coordinator.data["priceInfo"][0]["total"]

 # ✅ Use modern type syntax (Python 3.13)
 def process_prices(prices: list[dict[str, Any]]) -> dict[str, float]:
@@ -21,7 +21,7 @@ A Home Assistant integration that provides advanced price information and rating

 ## ✨ Features

-- **Quarter-Hourly Price Data**: Access detailed 15-minute interval pricing (192 data points across yesterday/today/tomorrow)
+- **Quarter-Hourly Price Data**: Access detailed 15-minute interval pricing (384 data points across 4 days: day before yesterday/yesterday/today/tomorrow)
 - **Current and Next Interval Prices**: Get real-time price data in both major currency (€, kr) and minor units (ct, øre)
 - **Multi-Currency Support**: Automatic detection and formatting for EUR, NOK, SEK, DKK, USD, and GBP
 - **Price Level Indicators**: Know when you're in a VERY_CHEAP, CHEAP, NORMAL, EXPENSIVE, or VERY_EXPENSIVE period
@@ -3,14 +3,18 @@

 from __future__ import annotations

 import asyncio
+import base64
 import logging
 import re
 import socket
 from datetime import timedelta
 from typing import TYPE_CHECKING, Any
+from zoneinfo import ZoneInfo

 import aiohttp

+from homeassistant.util import dt as dt_utils
+
 from .exceptions import (
     TibberPricesApiClientAuthenticationError,
     TibberPricesApiClientCommunicationError,
@@ -19,7 +23,6 @@ from .exceptions import (
 )
 from .helpers import (
     flatten_price_info,
-    flatten_price_rating,
     prepare_headers,
     verify_graphql_response,
     verify_response_or_raise,
@@ -133,48 +136,65 @@ class TibberPricesApiClient:
             query_type=TibberPricesQueryType.USER,
         )

-    async def async_get_price_info(self, home_ids: set[str]) -> dict:
+    async def async_get_price_info(self, home_ids: set[str], user_data: dict[str, Any]) -> dict:
         """
-        Get price info data in flat format for specified homes.
+        Get price info for specific homes using GraphQL aliases.
+
+        Uses timezone-aware cursor calculation per home based on the home's actual timezone
+        from Tibber API (not HA system timezone). This ensures correct "day before yesterday
+        midnight" calculation for homes in different timezones.

         Args:
-            home_ids: Set of home IDs to fetch data for.
+            home_ids: Set of home IDs to fetch price data for.
+            user_data: User data dict containing home metadata (including timezone).
+                REQUIRED - must be fetched before calling this method.

         Returns:
-            Dictionary with homes data keyed by home_id.
+            Dict with "homes" key containing home_id -> price_data mapping.
+
+        Raises:
+            TibberPricesApiClientError: If TimeService not initialized or user_data missing.

         """
-        return await self._get_price_info_for_specific_homes(home_ids)
-
-    async def _get_price_info_for_specific_homes(self, home_ids: set[str]) -> dict:
-        """Get price info for specific homes using GraphQL aliases."""
         if not self.time:
             msg = "TimeService not initialized - required for price info processing"
             raise TibberPricesApiClientError(msg)

+        if not user_data:
+            msg = "User data required for timezone-aware price fetching - fetch user data first"
+            raise TibberPricesApiClientError(msg)
+
         if not home_ids:
             return {"homes": {}}

+        # Build home_id -> timezone mapping from user_data
+        home_timezones = self._extract_home_timezones(user_data)
+
         # Build query with aliases for each home
-        # Example: home1: home(id: "abc") { ... }
+        # Each home gets its own cursor based on its timezone
         home_queries = []
         for idx, home_id in enumerate(sorted(home_ids)):
             alias = f"home{idx}"

+            # Get timezone for this home (fallback to HA system timezone)
+            home_tz = home_timezones.get(home_id)
+
+            # Calculate cursor: day before yesterday midnight in home's timezone
+            cursor = self._calculate_cursor_for_home(home_tz)
+
             home_query = f"""
             {alias}: home(id: "{home_id}") {{
                 id
-                consumption(resolution:DAILY,last:1) {{
-                    pageInfo{{currency}}
-                }}
                 currentSubscription {{
-                    priceInfoRange(resolution:QUARTER_HOURLY,last:192) {{
+                    priceInfoRange(resolution:QUARTER_HOURLY, first:192, after: "{cursor}") {{
+                        pageInfo{{ count }}
                         edges{{node{{
-                            startsAt total energy tax level
+                            startsAt total level
                         }}}}
                     }}
                     priceInfo(resolution:QUARTER_HOURLY) {{
-                        today{{startsAt total energy tax level}}
-                        tomorrow{{startsAt total energy tax level}}
+                        today{{startsAt total level}}
+                        tomorrow{{startsAt total level}}
                     }}
                 }}
             }}
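The per-home cursor logic added in this hunk can be condensed as follows. This is a standalone sketch: `cursor_for_home` and the fixed `now` are illustrative, and the commit's actual method additionally falls back to the HA system timezone when the zone name is missing or invalid:

```python
import base64
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo


def cursor_for_home(now: datetime, tz_name: str) -> str:
    """Base64-encoded ISO timestamp of 'day before yesterday midnight' in the home's zone."""
    local_now = now.astimezone(ZoneInfo(tz_name))
    midnight = (local_now - timedelta(days=2)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return base64.b64encode(midnight.isoformat().encode()).decode()
```

Fetching `first:192` quarter-hourly intervals from this cursor yields exactly the two days of history before today that the trailing 24h averages need.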
@@ -204,18 +224,7 @@ class TibberPricesApiClient:
                     continue

                 if "currentSubscription" in home and home["currentSubscription"] is not None:
-                    # Extract currency from consumption data if available
-                    currency = None
-                    if home.get("consumption"):
-                        page_info = home["consumption"].get("pageInfo")
-                        if page_info:
-                            currency = page_info.get("currency")
-
-                    homes_data[home_id] = flatten_price_info(
-                        home["currentSubscription"],
-                        currency,
-                        time=self.time,
-                    )
+                    homes_data[home_id] = flatten_price_info(home["currentSubscription"])
                 else:
                     _LOGGER.debug(
                         "Home %s has no active subscription - price data will be unavailable",
@@ -226,101 +235,76 @@ class TibberPricesApiClient:
         data["homes"] = homes_data
         return data

-    async def async_get_daily_price_rating(self) -> dict:
-        """Get daily price rating data in flat format for all homes."""
-        data = await self._api_wrapper(
-            data={
-                "query": """
-                {viewer{homes{id,currentSubscription{priceRating{
-                daily{
-                currency
-                entries{time total energy tax difference level}
-                }
-                }}}}}"""
-            },
-            query_type=TibberPricesQueryType.DAILY_RATING,
-        )
-        homes = data.get("viewer", {}).get("homes", [])
-
-        homes_data = {}
-        for home in homes:
-            home_id = home.get("id")
-            if home_id:
-                if "currentSubscription" in home and home["currentSubscription"] is not None:
-                    homes_data[home_id] = flatten_price_rating(home["currentSubscription"])
-                else:
-                    _LOGGER.debug(
-                        "Home %s has no active subscription - daily rating data will be unavailable",
-                        home_id,
-                    )
-                    homes_data[home_id] = {}
-
-        data["homes"] = homes_data
-        return data
-
-    async def async_get_hourly_price_rating(self) -> dict:
-        """Get hourly price rating data in flat format for all homes."""
-        data = await self._api_wrapper(
-            data={
-                "query": """
-                {viewer{homes{id,currentSubscription{priceRating{
-                hourly{
-                currency
-                entries{time total energy tax difference level}
-                }
-                }}}}}"""
-            },
-            query_type=TibberPricesQueryType.HOURLY_RATING,
-        )
-        homes = data.get("viewer", {}).get("homes", [])
-
-        homes_data = {}
-        for home in homes:
-            home_id = home.get("id")
-            if home_id:
-                if "currentSubscription" in home and home["currentSubscription"] is not None:
-                    homes_data[home_id] = flatten_price_rating(home["currentSubscription"])
-                else:
-                    _LOGGER.debug(
-                        "Home %s has no active subscription - hourly rating data will be unavailable",
-                        home_id,
-                    )
-                    homes_data[home_id] = {}
-
-        data["homes"] = homes_data
-        return data
-
-    async def async_get_monthly_price_rating(self) -> dict:
-        """Get monthly price rating data in flat format for all homes."""
-        data = await self._api_wrapper(
-            data={
-                "query": """
-                {viewer{homes{id,currentSubscription{priceRating{
-                monthly{
-                currency
-                entries{time total energy tax difference level}
-                }
-                }}}}}"""
-            },
-            query_type=TibberPricesQueryType.MONTHLY_RATING,
-        )
-        homes = data.get("viewer", {}).get("homes", [])
-
-        homes_data = {}
-        for home in homes:
-            home_id = home.get("id")
-            if home_id:
-                if "currentSubscription" in home and home["currentSubscription"] is not None:
-                    homes_data[home_id] = flatten_price_rating(home["currentSubscription"])
-                else:
-                    _LOGGER.debug(
-                        "Home %s has no active subscription - monthly rating data will be unavailable",
-                        home_id,
-                    )
-                    homes_data[home_id] = {}
-
-        data["homes"] = homes_data
-        return data
+    def _extract_home_timezones(self, user_data: dict[str, Any]) -> dict[str, str]:
+        """
+        Extract home_id -> timezone mapping from user_data.
+
+        Args:
+            user_data: User data dict from async_get_viewer_details() (required).
+
+        Returns:
+            Dict mapping home_id to timezone string (e.g., "Europe/Oslo").
+
+        """
+        home_timezones = {}
+        viewer = user_data.get("viewer", {})
+        homes = viewer.get("homes", [])
+        for home in homes:
+            home_id = home.get("id")
+            timezone = home.get("timeZone")
+            if home_id and timezone:
+                home_timezones[home_id] = timezone
+                _LOGGER.debug("Extracted timezone %s for home %s", timezone, home_id)
+            elif home_id:
+                _LOGGER.warning("Home %s has no timezone in user data, will use fallback", home_id)
+
+        return home_timezones
+
+    def _calculate_cursor_for_home(self, home_timezone: str | None) -> str:
+        """
+        Calculate cursor (day before yesterday midnight) for a home's timezone.
+
+        Args:
+            home_timezone: Timezone string (e.g., "Europe/Oslo", "America/New_York").
+                If None, falls back to HA system timezone.
+
+        Returns:
+            Base64-encoded ISO timestamp string for use as GraphQL cursor.
+
+        """
+        if not self.time:
+            msg = "TimeService not initialized"
+            raise TibberPricesApiClientError(msg)
+
+        # Get current time
+        now = self.time.now()
+
+        # Convert to home's timezone or fallback to HA system timezone
+        if home_timezone:
+            try:
+                tz = ZoneInfo(home_timezone)
+                now_in_home_tz = now.astimezone(tz)
+            except (KeyError, ValueError, OSError) as error:
+                _LOGGER.warning(
+                    "Invalid timezone %s (%s), falling back to HA system timezone",
+                    home_timezone,
+                    error,
+                )
+                now_in_home_tz = dt_utils.as_local(now)
+        else:
+            # Fallback to HA system timezone
+            now_in_home_tz = dt_utils.as_local(now)
+
+        # Calculate day before yesterday midnight in home's timezone
+        day_before_yesterday_midnight = (now_in_home_tz - timedelta(days=2)).replace(
+            hour=0, minute=0, second=0, microsecond=0
+        )
+
+        # Convert to ISO format and base64 encode
+        iso_string = day_before_yesterday_midnight.isoformat()
+        return base64.b64encode(iso_string.encode()).decode()

     async def _make_request(
         self,
@@ -3,7 +3,6 @@

 from __future__ import annotations

 import logging
-from datetime import timedelta
 from typing import TYPE_CHECKING

 from homeassistant.const import __version__ as ha_version
@@ -11,8 +10,6 @@ from homeassistant.const import __version__ as ha_version
 if TYPE_CHECKING:
     import aiohttp

-    from custom_components.tibber_prices.coordinator.time_service import TibberPricesTimeService
-
 from .queries import TibberPricesQueryType

 from .exceptions import (
@@ -120,10 +117,6 @@ def is_data_empty(data: dict, query_type: str) -> bool:
     - Must have range data
     - Must have today data
    - tomorrow can be empty if we have valid historical and today data
-
-    For rating data:
-    - Must have thresholdPercentages
-    - Must have non-empty entries for the specific rating type
     """
     _LOGGER.debug("Checking if data is empty for query_type %s", query_type)
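The rules kept in the docstring above boil down to a small predicate. A hedged sketch of that idea; the function name and exact shape are assumptions, not the module's actual `is_data_empty`:

```python
def price_info_is_empty(subscription: dict) -> bool:
    """Empty when range (historical) or today data is missing; tomorrow may be empty."""
    price_range = subscription.get("priceInfoRange") or {}
    has_range = bool(price_range.get("edges"))

    price_info = subscription.get("priceInfo") or {}
    has_today = bool(price_info.get("today"))

    # tomorrow can legitimately be empty: it is typically only published in the afternoon
    return not (has_range and has_today)
```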
@@ -174,8 +167,8 @@ def is_data_empty(data: dict, query_type: str) -> bool:
     else:
         subscription = first_home["currentSubscription"]

-        # Check priceInfoRange (192 quarter-hourly intervals)
-        has_historical = (
+        # Check priceInfoRange (96 quarter-hourly intervals)
+        has_yesterday = (
             "priceInfoRange" in subscription
             and subscription["priceInfoRange"] is not None
             and "edges" in subscription["priceInfoRange"]
@@ -192,47 +185,15 @@ def is_data_empty(data: dict, query_type: str) -> bool:
         )

         # Data is empty if we don't have historical data or today's data
-        is_empty = not has_historical or not has_today
+        is_empty = not has_yesterday or not has_today

         _LOGGER.debug(
             "Price info check - priceInfoRange: %s, today: %s, is_empty: %s",
-            bool(has_historical),
+            bool(has_yesterday),
             bool(has_today),
             is_empty,
         )

-    elif query_type in ["daily", "hourly", "monthly"]:
-        # Check for homes existence and non-emptiness before accessing
-        if (
-            "viewer" not in data
-            or "homes" not in data["viewer"]
-            or not isinstance(data["viewer"]["homes"], list)
-            or len(data["viewer"]["homes"]) == 0
-            or "currentSubscription" not in data["viewer"]["homes"][0]
-            or data["viewer"]["homes"][0]["currentSubscription"] is None
-            or "priceRating" not in data["viewer"]["homes"][0]["currentSubscription"]
-        ):
-            _LOGGER.debug("Missing homes/currentSubscription/priceRating in rating check")
-            is_empty = True
-        else:
-            rating = data["viewer"]["homes"][0]["currentSubscription"]["priceRating"]
-
-            # Check rating entries
-            has_entries = (
-                query_type in rating
-                and rating[query_type] is not None
-                and "entries" in rating[query_type]
-                and rating[query_type]["entries"] is not None
-                and len(rating[query_type]["entries"]) > 0
-            )
-
-            is_empty = not has_entries
-            _LOGGER.debug(
-                "%s rating check - entries count: %d, is_empty: %s",
-                query_type,
-                len(rating[query_type]["entries"]) if has_entries else 0,
-                is_empty,
-            )
     else:
         _LOGGER.debug("Unknown query type %s, treating as non-empty", query_type)
         is_empty = False
@@ -252,23 +213,29 @@ def prepare_headers(access_token: str, version: str) -> dict[str, str]:
     }


-def flatten_price_info(subscription: dict, currency: str | None = None, *, time: TibberPricesTimeService) -> dict:
+def flatten_price_info(subscription: dict) -> list[dict]:
     """
     Transform and flatten priceInfo from full API data structure.

-    Now handles priceInfoRange (192 quarter-hourly intervals) separately from
-    priceInfo (today and tomorrow data). Currency is stored as a separate attribute.
+    Returns a flat list of all price intervals ordered as:
+    [day_before_yesterday_prices, yesterday_prices, today_prices, tomorrow_prices]
+
+    priceInfoRange fetches 192 quarter-hourly intervals starting from the day before
+    yesterday midnight (2 days of historical data), which provides sufficient data
+    for calculating trailing 24h averages for all intervals including yesterday.
+
+    Args:
+        subscription: The currentSubscription dictionary from API response.
+
+    Returns:
+        A flat list containing all price dictionaries (startsAt, total, level).
+
     """
-    price_info = subscription.get("priceInfo", {})
     price_info_range = subscription.get("priceInfoRange", {})

-    # Get today and yesterday dates using TimeService
-    today_local = time.now().date()
-    yesterday_local = today_local - timedelta(days=1)
-    _LOGGER.debug("Processing data for yesterday's date: %s", yesterday_local)
-
-    # Transform priceInfoRange edges data (extract yesterday's quarter-hourly prices)
-    yesterday_prices = []
+    # Transform priceInfoRange edges data (extract historical quarter-hourly prices)
+    # This contains 192 intervals (2 days) starting from day before yesterday midnight
+    historical_prices = []
     if "edges" in price_info_range:
         edges = price_info_range["edges"]
@@ -276,47 +243,11 @@ def flatten_price_info(subscription: dict, currency: str | None = None, *, time:
             if "node" not in edge:
                 _LOGGER.debug("Skipping edge without node: %s", edge)
                 continue
-
-            price_data = edge["node"]
-            # Parse timestamp using TimeService for proper timezone handling
-            starts_at = time.get_interval_time(price_data)
-            if starts_at is None:
-                _LOGGER.debug("Could not parse timestamp: %s", price_data["startsAt"])
-                continue
-
-            price_date = starts_at.date()
-
-            # Only include prices from yesterday
-            if price_date == yesterday_local:
-                yesterday_prices.append(price_data)
-
-    _LOGGER.debug("Found %d price entries for yesterday", len(yesterday_prices))
-
-    return {
-        "yesterday": yesterday_prices,
-        "today": price_info.get("today", []),
-        "tomorrow": price_info.get("tomorrow", []),
-        "currency": currency,
-    }
-
-
-def flatten_price_rating(subscription: dict) -> dict:
-    """Extract and flatten priceRating from subscription, including currency."""
-    price_rating = subscription.get("priceRating", {})
-
-    def extract_entries_and_currency(rating: dict) -> tuple[list, str | None]:
-        if rating is None:
-            return [], None
-        return rating.get("entries", []), rating.get("currency")
-
-    hourly_entries, hourly_currency = extract_entries_and_currency(price_rating.get("hourly"))
-    daily_entries, daily_currency = extract_entries_and_currency(price_rating.get("daily"))
-    monthly_entries, monthly_currency = extract_entries_and_currency(price_rating.get("monthly"))
-    # Prefer hourly, then daily, then monthly for top-level currency
-    currency = hourly_currency or daily_currency or monthly_currency
-    return {
-        "hourly": hourly_entries,
-        "daily": daily_entries,
-        "monthly": monthly_entries,
-        "currency": currency,
-    }
+            historical_prices.append(edge["node"])
+
+    # Return all intervals as a single flattened array
+    return (
+        historical_prices
+        + subscription.get("priceInfo", {}).get("today", [])
+        + subscription.get("priceInfo", {}).get("tomorrow", [])
+    )
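The new return path of `flatten_price_info` can be exercised in isolation. This is a simplified restatement of the logic in the hunk above, without the logging; the sample subscription data is invented and includes only the keys the function reads:

```python
def flatten_price_info(subscription: dict) -> list[dict]:
    """Flatten priceInfoRange edges plus today/tomorrow lists into one ordered list."""
    historical = [
        edge["node"]
        for edge in subscription.get("priceInfoRange", {}).get("edges", [])
        if "node" in edge  # edges without a node are skipped
    ]
    price_info = subscription.get("priceInfo", {})
    return historical + price_info.get("today", []) + price_info.get("tomorrow", [])
```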
@@ -6,10 +6,32 @@ from enum import Enum


 class TibberPricesQueryType(Enum):
-    """Types of queries that can be made to the API."""
+    """
+    Types of queries that can be made to the API.
+
+    CRITICAL: Query type selection is dictated by Tibber's API design and caching strategy.
+
+    PRICE_INFO:
+    - Used for current day-relative data (day before yesterday/yesterday/today/tomorrow)
+    - API automatically determines "today" and "tomorrow" based on current time
+    - MUST be used when querying any data from these 4 days, even if you only need
+      specific intervals, because Tibber's API requires this endpoint for current data
+    - Provides the core dataset needed for live data, recent historical context
+      (important until tomorrow's data arrives), and tomorrow's forecast
+    - Tibber likely has optimized caching for this frequently-accessed data range
+
+    PRICE_INFO_RANGE:
+    - Used for historical data older than day before yesterday
+    - Allows flexible date range queries with cursor-based pagination
+    - Required for any intervals beyond the 4-day window of PRICE_INFO
+    - Use this for historical analysis, comparisons, or trend calculations
+
+    USER:
+    - Fetches user account data and home metadata
+    - Separate from price data queries
+
+    """

     PRICE_INFO = "price_info"
-    DAILY_RATING = "daily"
-    HOURLY_RATING = "hourly"
-    MONTHLY_RATING = "monthly"
+    PRICE_INFO_RANGE = "price_info_range"
     USER = "user"
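The selection rule spelled out in the new docstring (PRICE_INFO for the rolling 4-day window, PRICE_INFO_RANGE for anything older) could be encoded like this. The helper is hypothetical and not part of the commit; it only illustrates the documented rule:

```python
from datetime import date, timedelta
from enum import Enum


class QueryType(Enum):
    PRICE_INFO = "price_info"
    PRICE_INFO_RANGE = "price_info_range"


def select_query_type(requested_start: date, today: date) -> QueryType:
    """PRICE_INFO covers day before yesterday .. tomorrow; older data needs the range query."""
    window_start = today - timedelta(days=2)
    if requested_start >= window_start:
        return QueryType.PRICE_INFO
    return QueryType.PRICE_INFO_RANGE
```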
@@ -4,6 +4,7 @@ from __future__ import annotations

 from typing import TYPE_CHECKING

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.entity_utils import add_icon_color_attribute

 # Import TypedDict definitions for documentation (not used in signatures)
@@ -39,12 +40,12 @@ def get_tomorrow_data_available_attributes(
     if not coordinator_data:
         return None

-    price_info = coordinator_data.get("priceInfo", {})
-    tomorrow_prices = price_info.get("tomorrow", [])
+    # Use helper to get tomorrow's intervals
+    tomorrow_prices = get_intervals_for_day_offsets(coordinator_data, [1])
+    tomorrow_date = time.get_local_date(offset_days=1)
     interval_count = len(tomorrow_prices)

     # Get expected intervals for tomorrow (handles DST)
-    tomorrow_date = time.get_local_date(offset_days=1)
     expected_intervals = time.get_expected_intervals_for_day(tomorrow_date)

     if interval_count == 0:
@@ -6,6 +6,7 @@ from typing import TYPE_CHECKING

 from custom_components.tibber_prices.coordinator import TIME_SENSITIVE_ENTITY_KEYS
 from custom_components.tibber_prices.coordinator.core import get_connection_state
+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.entity import TibberPricesEntity
 from custom_components.tibber_prices.entity_utils import get_binary_sensor_icon
 from homeassistant.components.binary_sensor import (
@@ -134,12 +135,11 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
             return None
 
         # Check tomorrow data availability (normal operation)
-        price_info = self.coordinator.data.get("priceInfo", {})
-        tomorrow_prices = price_info.get("tomorrow", [])
+        tomorrow_prices = get_intervals_for_day_offsets(self.coordinator.data, [1])
+        tomorrow_date = self.coordinator.time.get_local_date(offset_days=1)
         interval_count = len(tomorrow_prices)
 
         # Get expected intervals for tomorrow (handles DST)
-        tomorrow_date = self.coordinator.time.get_local_date(offset_days=1)
         expected_intervals = self.coordinator.time.get_expected_intervals_for_day(tomorrow_date)
 
         if interval_count == expected_intervals:
@@ -210,7 +210,6 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
         self._data_transformer = TibberPricesDataTransformer(
             config_entry=config_entry,
             log_prefix=self._log_prefix,
-            perform_turnover_fn=self._perform_midnight_turnover,
            calculate_periods_fn=lambda price_info: self._period_calculator.calculate_periods_for_price_info(
                 price_info
             ),
@@ -436,26 +435,18 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
             current_date,
         )
 
-        # Perform rotation on cached data if available
-        if self._cached_price_data and "homes" in self._cached_price_data:
-            for home_id, home_data in self._cached_price_data["homes"].items():
-                if "price_info" in home_data:
-                    price_info = home_data["price_info"]
-                    rotated = self._perform_midnight_turnover(price_info)
-                    home_data["price_info"] = rotated
-                    self._log("debug", "Rotated price data for home %s", home_id)
-
-        # Update coordinator's data with enriched rotated data
-        if self.data:
-            # Re-transform data to ensure enrichment is applied to rotated data
+        # With flat interval list architecture, no rotation needed!
+        # get_intervals_for_day_offsets() automatically filters by date.
+        # Just update coordinator's data to trigger entity updates.
+        if self.data and self._cached_price_data:
+            # Re-transform data to ensure enrichment is refreshed
             if self.is_main_entry():
                 self.data = self._transform_data_for_main_entry(self._cached_price_data)
             else:
-                # For subentry, get fresh data from main coordinator after rotation
-                # Main coordinator will have performed rotation already
-                self.data["timestamp"] = now
+                # For subentry, get fresh data from main coordinator
+                pass
 
-        # CRITICAL: Update _last_price_update to current time after turnover
+        # CRITICAL: Update _last_price_update to current time after midnight
         # This prevents cache_validity from showing "date_mismatch" after midnight
         # The data is still valid (just rotated today→yesterday, tomorrow→today)
         # Update timestamp to reflect that the data is current for the new day
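Aside (reviewer note): the hunk above claims that with a flat, timestamped interval list no midnight rotation is needed. That claim can be checked in isolation — a day bucket becomes a pure date filter over `startsAt`, so crossing midnight changes which intervals a filter returns without mutating any data. The names and sample data below are illustrative, not the integration's actual API:

```python
from datetime import date, datetime, timedelta

def intervals_for_offset(intervals: list[dict], today: date, offset: int) -> list[dict]:
    """Select intervals belonging to the calendar day 'today + offset'."""
    target = today + timedelta(days=offset)
    return [i for i in intervals if i["startsAt"].date() == target]

# One flat list spanning the midnight boundary -- nothing is ever moved or cleared.
intervals = [
    {"startsAt": datetime(2024, 5, 1, 23, 45), "total": 0.30},
    {"startsAt": datetime(2024, 5, 2, 0, 0), "total": 0.28},
]

# Before midnight the 00:00 interval is "tomorrow"; after midnight it is "today".
assert intervals_for_offset(intervals, date(2024, 5, 1), 1) == [intervals[1]]
assert intervals_for_offset(intervals, date(2024, 5, 2), 0) == [intervals[1]]
```

This is why the old `today → yesterday, tomorrow → today` rotation (and its enrichment re-apply step) can be deleted: only the reference date moves.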
@@ -719,25 +710,6 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
         # Restore handler state: mark today's midnight as last turnover
         self._midnight_handler.mark_turnover_done(today_midnight)
 
-    def _perform_midnight_turnover(self, price_info: dict[str, Any]) -> dict[str, Any]:
-        """
-        Perform midnight turnover on price data.
-
-        Moves: today → yesterday, tomorrow → today, clears tomorrow.
-
-        This handles cases where:
-        - Server was running through midnight
-        - Cache is being refreshed and needs proper day rotation
-
-        Args:
-            price_info: The price info dict with 'today', 'tomorrow', 'yesterday' keys
-
-        Returns:
-            Updated price_info with rotated day data
-
-        """
-        return helpers.perform_midnight_turnover(price_info, time=self.time)
-
     async def _store_cache(self) -> None:
         """Store cache data."""
         await self._data_fetcher.store_cache(self._midnight_handler.last_check_time)
@@ -796,22 +768,11 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
         if not self.data:
             return None
 
-        price_info = self.data.get("priceInfo", {})
-        if not price_info:
+        if not self.data:
             return None
 
         now = self.time.now()
-        return find_price_data_for_interval(price_info, now, time=self.time)
-
-    def get_all_intervals(self) -> list[dict[str, Any]]:
-        """Get all price intervals (today + tomorrow)."""
-        if not self.data:
-            return []
-
-        price_info = self.data.get("priceInfo", {})
-        today_prices = price_info.get("today", [])
-        tomorrow_prices = price_info.get("tomorrow", [])
-        return today_prices + tomorrow_prices
+        return find_price_data_for_interval(self.data, now, time=self.time)
 
     async def refresh_user_data(self) -> bool:
         """Force refresh of user data and return True if data was updated."""
@@ -184,19 +184,66 @@ class TibberPricesDataFetcher:
                 "homes": {},
             }
 
+        # Ensure we have user_data before fetching price data
+        # This is critical for timezone-aware cursor calculation
+        if not self._cached_user_data:
+            self._log("info", "User data not cached, fetching before price data")
+            try:
+                user_data = await self.api.async_get_viewer_details()
+                self._cached_user_data = user_data
+                self._last_user_update = current_time
+            except (
+                TibberPricesApiClientError,
+                TibberPricesApiClientCommunicationError,
+            ) as ex:
+                msg = f"Failed to fetch user data (required for price fetching): {ex}"
+                self._log("error", msg)
+                raise TibberPricesApiClientError(msg) from ex
+
         # Get price data for configured homes only (API call with specific home_ids)
+        # Pass user_data for timezone-aware cursor calculation per home
+        # At this point, _cached_user_data is guaranteed to be not None (checked above)
+        if not self._cached_user_data:
+            msg = "User data unexpectedly None after fetch attempt"
+            raise TibberPricesApiClientError(msg)
+
         self._log("debug", "Fetching price data for %d configured home(s)", len(configured_home_ids))
-        price_data = await self.api.async_get_price_info(home_ids=configured_home_ids)
+        price_data = await self.api.async_get_price_info(
+            home_ids=configured_home_ids,
+            user_data=self._cached_user_data,
+        )
 
         all_homes_data = {}
         homes_list = price_data.get("homes", {})
 
+        # Build home_id -> currency mapping from user_data
+        currency_map = {}
+        if self._cached_user_data:
+            viewer = self._cached_user_data.get("viewer", {})
+            homes = viewer.get("homes", [])
+            for home in homes:
+                home_id = home.get("id")
+                if home_id:
+                    # Extract currency from nested structure (with fallback to EUR)
+                    currency = (
+                        home.get("currentSubscription", {})
+                        .get("priceInfo", {})
+                        .get("current", {})
+                        .get("currency", "EUR")
+                    )
+                    currency_map[home_id] = currency
+                    self._log("debug", "Extracted currency %s for home %s", currency, home_id)
+
         # Process returned data
         for home_id, home_price_data in homes_list.items():
-            # Store raw price data without enrichment
+            # Get currency from user_data (cached)
+            currency = currency_map.get(home_id, "EUR")
+
+            # Store raw price data with currency from user_data
             # Enrichment will be done dynamically when data is transformed
             home_data = {
                 "price_info": home_price_data,
+                "currency": currency,
             }
             all_homes_data[home_id] = home_data
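Aside: the chained `.get(..., {})` lookup used for the currency extraction above degrades gracefully when any level of the viewer structure is missing. A self-contained sketch of that pattern — the dict shape mirrors the hunk, while `extract_currency` is an illustrative name, not part of the integration:

```python
def extract_currency(home: dict) -> str:
    """Walk currentSubscription -> priceInfo -> current, falling back to EUR
    whenever an intermediate key is absent."""
    return (
        home.get("currentSubscription", {})
        .get("priceInfo", {})
        .get("current", {})
        .get("currency", "EUR")
    )

assert extract_currency({}) == "EUR"
assert extract_currency({"currentSubscription": {}}) == "EUR"
assert extract_currency(
    {"currentSubscription": {"priceInfo": {"current": {"currency": "NOK"}}}}
) == "NOK"
```

One caveat worth noting in review: the fallback only covers *missing* keys. If the API ever returns an explicit `None` for `currentSubscription`, `.get("currentSubscription", {})` yields `None` and the next `.get` raises `AttributeError`.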
@@ -283,21 +330,6 @@ class TibberPricesDataFetcher:
             msg = f"Error communicating with API: {error}"
             raise UpdateFailed(msg) from error
 
-    def perform_midnight_turnover(self, price_info: dict[str, Any]) -> dict[str, Any]:
-        """
-        Perform midnight turnover on price data.
-
-        Moves: today → yesterday, tomorrow → today, clears tomorrow.
-
-        Args:
-            price_info: The price info dict with 'today', 'tomorrow', 'yesterday' keys
-
-        Returns:
-            Updated price_info with rotated day data
-
-        """
-        return helpers.perform_midnight_turnover(price_info, time=self.time)
-
     @property
     def cached_price_data(self) -> dict[str, Any] | None:
         """Get cached price data."""
@@ -26,14 +26,12 @@ class TibberPricesDataTransformer:
         self,
         config_entry: ConfigEntry,
         log_prefix: str,
-        perform_turnover_fn: Callable[[dict[str, Any]], dict[str, Any]],
         calculate_periods_fn: Callable[[dict[str, Any]], dict[str, Any]],
         time: TibberPricesTimeService,
     ) -> None:
         """Initialize the data transformer."""
         self.config_entry = config_entry
         self._log_prefix = log_prefix
-        self._perform_turnover_fn = perform_turnover_fn
         self._calculate_periods_fn = calculate_periods_fn
         self.time: TibberPricesTimeService = time
@@ -184,30 +182,26 @@ class TibberPricesDataTransformer:
 
         # Use the first home's data as the main entry's data
         first_home_data = next(iter(homes_data.values()))
-        price_info = first_home_data.get("price_info", {})
+        all_intervals = first_home_data.get("price_info", [])
 
-        # Perform midnight turnover if needed (handles day transitions)
-        price_info = self._perform_turnover_fn(price_info)
-
-        # Ensure all required keys exist (API might not return tomorrow data yet)
-        price_info.setdefault("yesterday", [])
-        price_info.setdefault("today", [])
-        price_info.setdefault("tomorrow", [])
-        price_info.setdefault("currency", "EUR")
+        # Extract currency from home_data (populated from user_data)
+        currency = first_home_data.get("currency", "EUR")
 
         # Enrich price info dynamically with calculated differences and rating levels
-        # This ensures enrichment is always up-to-date, especially after midnight turnover
+        # (Modifies all_intervals in-place, returns same list)
         thresholds = self.get_threshold_percentages()
-        price_info = enrich_price_info_with_differences(
-            price_info,
+        enriched_intervals = enrich_price_info_with_differences(
+            all_intervals,
             threshold_low=thresholds["low"],
             threshold_high=thresholds["high"],
+            time=self.time,
         )
 
+        # Store enriched intervals directly as priceInfo (flat list)
         transformed_data = {
-            "timestamp": raw_data.get("timestamp"),
             "homes": homes_data,
-            "priceInfo": price_info,
+            "priceInfo": enriched_intervals,
+            "currency": currency,
         }
 
         # Calculate periods (best price and peak price)
@@ -249,29 +243,25 @@ class TibberPricesDataTransformer:
                 "priceInfo": {},
             }
 
-        price_info = home_data.get("price_info", {})
+        all_intervals = home_data.get("price_info", [])
 
-        # Perform midnight turnover if needed (handles day transitions)
-        price_info = self._perform_turnover_fn(price_info)
-
-        # Ensure all required keys exist (API might not return tomorrow data yet)
-        price_info.setdefault("yesterday", [])
-        price_info.setdefault("today", [])
-        price_info.setdefault("tomorrow", [])
-        price_info.setdefault("currency", "EUR")
+        # Extract currency from home_data (populated from user_data)
+        currency = home_data.get("currency", "EUR")
 
         # Enrich price info dynamically with calculated differences and rating levels
-        # This ensures enrichment is always up-to-date, especially after midnight turnover
+        # (Modifies all_intervals in-place, returns same list)
         thresholds = self.get_threshold_percentages()
-        price_info = enrich_price_info_with_differences(
-            price_info,
+        enriched_intervals = enrich_price_info_with_differences(
+            all_intervals,
             threshold_low=thresholds["low"],
             threshold_high=thresholds["high"],
+            time=self.time,
        )
 
+        # Store enriched intervals directly as priceInfo (flat list)
         transformed_data = {
-            "timestamp": main_data.get("timestamp"),
-            "priceInfo": price_info,
+            "priceInfo": enriched_intervals,
+            "currency": currency,
         }
 
         # Calculate periods (best price and peak price)
@@ -3,8 +3,12 @@
 from __future__ import annotations
 
 import logging
+from datetime import timedelta
 from typing import TYPE_CHECKING, Any
 
+from custom_components.tibber_prices.const import DOMAIN
+from homeassistant.util import dt as dt_util
+
 if TYPE_CHECKING:
     from datetime import date
@@ -12,8 +16,6 @@ if TYPE_CHECKING:
 
     from .time_service import TibberPricesTimeService
 
-from custom_components.tibber_prices.const import DOMAIN
-
 _LOGGER = logging.getLogger(__name__)
@@ -29,80 +31,125 @@ def get_configured_home_ids(hass: HomeAssistant) -> set[str]:
     return home_ids
 
 
+def get_intervals_for_day_offsets(
+    coordinator_data: dict[str, Any] | None,
+    offsets: list[int],
+) -> list[dict[str, Any]]:
+    """
+    Get intervals for specific day offsets from coordinator data.
+
+    This is the core function for filtering intervals by date offset.
+    Abstracts the data structure - callers don't need to know where intervals are stored.
+
+    Performance optimized:
+    - Date comparison using .date() on datetime objects (fast)
+    - Single pass through intervals with date caching
+    - Only processes requested offsets
+
+    Args:
+        coordinator_data: Coordinator data dict (typically coordinator.data).
+        offsets: List of day offsets relative to today (e.g., [0, 1] for today and tomorrow).
+            Range: -374 to +1 (allows historical comparisons up to one year + one week).
+            0 = today, -1 = yesterday, +1 = tomorrow, -7 = one week ago, etc.
+
+    Returns:
+        List of intervals matching the requested day offsets, in chronological order.
+
+    Example:
+        # Get only today's intervals
+        today_intervals = get_intervals_for_day_offsets(coordinator.data, [0])
+
+        # Get today and tomorrow
+        future_intervals = get_intervals_for_day_offsets(coordinator.data, [0, 1])
+
+        # Get all available intervals
+        all = get_intervals_for_day_offsets(coordinator.data, [-1, 0, 1])
+
+        # Compare last week with same week one year ago
+        comparison = get_intervals_for_day_offsets(coordinator.data, [-7, -371])
+
+    """
+    if not coordinator_data:
+        return []
+
+    # Validate offsets are within acceptable range
+    min_offset = -374  # One year + one week for comparisons
+    max_offset = 1  # Tomorrow (we don't have data further in the future)
+
+    # Extract intervals from coordinator data structure (priceInfo is now a list)
+    all_intervals = coordinator_data.get("priceInfo", [])
+
+    if not all_intervals:
+        return []
+
+    # Get current local date for comparison (no TimeService needed - use dt_util directly)
+    now_local = dt_util.now()
+    today_date = now_local.date()
+
+    # Build set of target dates based on requested offsets
+    target_dates = set()
+    for offset in offsets:
+        # Silently clamp offsets to valid range (don't fail on invalid input)
+        if offset < min_offset or offset > max_offset:
+            continue
+        target_date = today_date + timedelta(days=offset)
+        target_dates.add(target_date)
+
+    if not target_dates:
+        return []
+
+    # Filter intervals matching target dates
+    # Optimized: single pass, date() called once per interval
+    result = []
+    for interval in all_intervals:
+        starts_at = interval.get("startsAt")
+        if not starts_at:
+            continue
+
+        # Handle both datetime objects and strings (for flexibility)
+        if isinstance(starts_at, str):
+            # Parse if string (should be rare after parse_all_timestamps)
+            starts_at = dt_util.parse_datetime(starts_at)
+            if not starts_at:
+                continue
+            starts_at = dt_util.as_local(starts_at)
+
+        # Fast date comparison using datetime.date()
+        interval_date = starts_at.date()
+        if interval_date in target_dates:
+            result.append(interval)
+
+    return result
+
+
 def needs_tomorrow_data(
     cached_price_data: dict[str, Any] | None,
     tomorrow_date: date,
 ) -> bool:
-    """Check if tomorrow data is missing or invalid."""
+    """Check if tomorrow data is missing or invalid in flat interval list."""
     if not cached_price_data or "homes" not in cached_price_data:
         return False
 
-    # Use provided TimeService or create new one
+    # Check each home's intervals for tomorrow's date
     for home_data in cached_price_data["homes"].values():
-        price_info = home_data.get("price_info", {})
-        tomorrow_prices = price_info.get("tomorrow", [])
+        all_intervals = home_data.get("price_info", [])
 
-        # Check if tomorrow data is missing
-        if not tomorrow_prices:
-            return True
+        # Check if any interval exists for tomorrow's date
+        has_tomorrow = False
+        for interval in all_intervals:
+            if starts_at := interval.get("startsAt"):  # Already datetime in local timezone
+                interval_date = starts_at.date()
+                if interval_date == tomorrow_date:
+                    has_tomorrow = True
+                    break
 
-        # Check if tomorrow data is actually for tomorrow (validate date)
-        first_price = tomorrow_prices[0]
-        if starts_at := first_price.get("startsAt"):  # Already datetime in local timezone
-            price_date = starts_at.date()
-            if price_date != tomorrow_date:
-                return True
+        # If no interval for tomorrow found, we need tomorrow data
+        if not has_tomorrow:
+            return True
 
     return False
 
 
-def perform_midnight_turnover(price_info: dict[str, Any], *, time: TibberPricesTimeService) -> dict[str, Any]:
-    """
-    Perform midnight turnover on price data.
-
-    Moves: today → yesterday, tomorrow → today, clears tomorrow.
-
-    This handles cases where:
-    - Server was running through midnight
-    - Cache is being refreshed and needs proper day rotation
-
-    Args:
-        price_info: The price info dict with 'today', 'tomorrow', 'yesterday' keys
-        time: TibberPricesTimeService instance (required)
-
-    Returns:
-        Updated price_info with rotated day data
-
-    """
-    # Use provided TimeService or create new one
-
-    current_local_date = time.now().date()
-
-    # Extract current data
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    # Check if any of today's prices are from the previous day
-    prices_need_rotation = False
-    if today_prices:
-        first_today_price = today_prices[0].get("startsAt")  # Already datetime in local timezone
-        if first_today_price:
-            first_today_price_date = first_today_price.date()
-            prices_need_rotation = first_today_price_date < current_local_date
-
-    if prices_need_rotation:
-        return {
-            "yesterday": today_prices,
-            "today": tomorrow_prices,
-            "tomorrow": [],
-            "currency": price_info.get("currency", "EUR"),
-        }
-
-    # No rotation needed, return original
-    return price_info
-
-
 def parse_all_timestamps(price_data: dict[str, Any], *, time: TibberPricesTimeService) -> dict[str, Any]:
     """
     Parse all API timestamp strings to datetime objects.
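Aside: the offset handling in the new `get_intervals_for_day_offsets` silently drops out-of-range offsets instead of raising. The date-set construction step can be exercised in isolation like this (a standalone reimplementation of just that step for illustration, not the helper itself, which also needs Home Assistant's `dt_util`):

```python
from datetime import date, timedelta

MIN_OFFSET, MAX_OFFSET = -374, 1  # one year + one week back; tomorrow forward

def build_target_dates(today: date, offsets: list[int]) -> set[date]:
    """Translate day offsets into concrete dates, skipping invalid offsets."""
    return {
        today + timedelta(days=o)
        for o in offsets
        if MIN_OFFSET <= o <= MAX_OFFSET
    }

today = date(2024, 5, 2)
assert build_target_dates(today, [0, 1]) == {date(2024, 5, 2), date(2024, 5, 3)}
assert build_target_dates(today, [-7]) == {date(2024, 4, 25)}
# +2 is beyond tomorrow, so it is dropped rather than raising
assert build_target_dates(today, [2]) == set()
```

The silent-drop choice keeps entity code from crashing on a bad offset, at the cost of making typos return empty lists instead of errors.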
@@ -113,7 +160,7 @@ def parse_all_timestamps(price_data: dict[str, Any], *, time: TibberPricesTimeSe
     Performance: ~200 timestamps parsed ONCE instead of multiple times per update cycle.
 
     Args:
-        price_data: Raw API data with string timestamps
+        price_data: Raw API data with string timestamps (flat interval list)
         time: TibberPricesTimeService for parsing
 
     Returns:
@@ -125,17 +172,18 @@ def parse_all_timestamps(price_data: dict[str, Any], *, time: TibberPricesTimeSe
 
     # Process each home
     for home_data in price_data["homes"].values():
-        price_info = home_data.get("price_info", {})
+        # price_info is now a flat list of intervals
+        price_info = home_data.get("price_info", [])
 
-        # Process each day's intervals
-        for day_key in ["yesterday", "today", "tomorrow"]:
-            intervals = price_info.get(day_key, [])
-            for interval in intervals:
+        # Skip if price_info is not a list (empty or invalid)
+        if not isinstance(price_info, list):
+            continue
+
+        # Parse timestamps in flat interval list
+        for interval in price_info:
             if (starts_at_str := interval.get("startsAt")) and isinstance(starts_at_str, str):
                 # Parse once, convert to local timezone, store as datetime object
                 interval["startsAt"] = time.parse_and_localize(starts_at_str)
             # If already datetime (e.g., from cache), skip parsing
 
     return price_data
-
-    return price_info
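Aside: the parse loop above normalizes `startsAt` in place and is idempotent — already-parsed datetimes are left untouched, so running it again over cached data is safe. A minimal sketch of that walrus-guarded pattern, using stdlib `datetime.fromisoformat` as a stand-in for the integration's `time.parse_and_localize`:

```python
from datetime import datetime, timezone

def normalize_timestamps(intervals: list[dict]) -> None:
    """Parse 'startsAt' strings to aware datetimes in place; skip non-strings."""
    for interval in intervals:
        if (raw := interval.get("startsAt")) and isinstance(raw, str):
            interval["startsAt"] = datetime.fromisoformat(raw)

intervals = [
    {"startsAt": "2024-05-02T00:00:00+02:00"},
    {"startsAt": datetime(2024, 5, 2, 0, 15, tzinfo=timezone.utc)},  # already parsed
]
normalize_timestamps(intervals)
normalize_timestamps(intervals)  # second pass is a no-op
assert all(isinstance(i["startsAt"], datetime) for i in intervals)
```

The `isinstance(raw, str)` check is what makes re-entry cheap: after the first pass every entry fails the guard and the loop does no work.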
@@ -14,15 +14,14 @@ from custom_components.tibber_prices import const as _const
 
 if TYPE_CHECKING:
     from custom_components.tibber_prices.coordinator.time_service import TibberPricesTimeService
+    from homeassistant.config_entries import ConfigEntry
 
+from .helpers import get_intervals_for_day_offsets
 from .period_handlers import (
     TibberPricesPeriodConfig,
     calculate_periods_with_relaxation,
 )
 
-if TYPE_CHECKING:
-    from homeassistant.config_entries import ConfigEntry
-
 _LOGGER = logging.getLogger(__name__)
@@ -72,8 +71,10 @@ class TibberPricesPeriodCalculator:
            Hash string for cache key comparison.
 
        """
-        # Get relevant price data
-        today = price_info.get("today", [])
+        # Get relevant price data from flat interval list
+        # Build minimal coordinator_data structure for get_intervals_for_day_offsets
+        coordinator_data = {"priceInfo": price_info}
+        today = get_intervals_for_day_offsets(coordinator_data, [0])
         today_signature = tuple((interval.get("startsAt"), interval.get("rating_level")) for interval in today)
 
         # Get period configs (both best and peak)
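Aside: the cache-key signature built above only needs to change when something that affects period calculation changes. The tuple-of-pairs idea can be sketched standalone (illustrative, not the class's actual hashing code):

```python
def day_signature(intervals: list[dict]) -> tuple:
    """Cheap change-detection key: one (startsAt, rating_level) pair per interval."""
    return tuple((i.get("startsAt"), i.get("rating_level")) for i in intervals)

a = [{"startsAt": "2024-05-02T00:00", "rating_level": "LOW"}]
b = [{"startsAt": "2024-05-02T00:00", "rating_level": "HIGH"}]

assert day_signature(a) == day_signature(list(a))  # equal content -> equal key
assert day_signature(a) != day_signature(b)        # rating change -> new key
assert hash(day_signature(a)) == hash(day_signature(a))  # tuples of strings are hashable
```

Using a tuple (immutable, hashable) rather than the interval dicts themselves lets equal content produce an equal cache key regardless of object identity.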
@@ -491,8 +492,10 @@ class TibberPricesPeriodCalculator:
         if level_config == "any":
             return True
 
-        # Get today's intervals
-        today_intervals = price_info.get("today", [])
+        # Get today's intervals from flat list
+        # Build minimal coordinator_data structure for get_intervals_for_day_offsets
+        coordinator_data = {"priceInfo": price_info}
+        today_intervals = get_intervals_for_day_offsets(coordinator_data, [0])
 
         if not today_intervals:
             return True  # If no data, don't filter
@@ -555,9 +558,12 @@ class TibberPricesPeriodCalculator:
 
         self._log("debug", "Calculating periods (cache miss or hash mismatch)")
 
-        yesterday_prices = price_info.get("yesterday", [])
-        today_prices = price_info.get("today", [])
-        tomorrow_prices = price_info.get("tomorrow", [])
+        # Get intervals by day from flat list
+        # Build minimal coordinator_data structure for get_intervals_for_day_offsets
+        coordinator_data = {"priceInfo": price_info}
+        yesterday_prices = get_intervals_for_day_offsets(coordinator_data, [-1])
+        today_prices = get_intervals_for_day_offsets(coordinator_data, [0])
+        tomorrow_prices = get_intervals_for_day_offsets(coordinator_data, [1])
         all_prices = yesterday_prices + today_prices + tomorrow_prices
 
         # Get rating thresholds from config
@@ -17,6 +17,7 @@ from custom_components.tibber_prices.const import (
     PRICE_RATING_ICON_MAPPING,
     VOLATILITY_ICON_MAPPING,
 )
+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.entity_utils.helpers import find_rolling_hour_center_index
 from custom_components.tibber_prices.sensor.helpers import aggregate_level_data
 from custom_components.tibber_prices.utils.price import find_price_data_for_interval
@ -319,12 +320,11 @@ def get_price_level_for_icon(
|
||||||
if not coordinator_data or interval_offset is None:
|
if not coordinator_data or interval_offset is None:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
price_info = coordinator_data.get("priceInfo", {})
|
|
||||||
now = time.now()
|
now = time.now()
|
||||||
|
|
||||||
# Interval-based lookup
|
# Interval-based lookup
|
||||||
target_time = now + timedelta(minutes=_INTERVAL_MINUTES * interval_offset)
|
target_time = now + timedelta(minutes=_INTERVAL_MINUTES * interval_offset)
|
||||||
interval_data = find_price_data_for_interval(price_info, target_time, time=time)
|
interval_data = find_price_data_for_interval(coordinator_data, target_time, time=time)
|
||||||
|
|
||||||
if not interval_data or "level" not in interval_data:
|
if not interval_data or "level" not in interval_data:
|
||||||
return None
|
return None
|
||||||
|
|
@@ -358,8 +358,8 @@ def get_rolling_hour_price_level_for_icon(
     if not coordinator_data:
         return None

-    price_info = coordinator_data.get("priceInfo", {})
-    all_prices = price_info.get("yesterday", []) + price_info.get("today", []) + price_info.get("tomorrow", [])
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])

     if not all_prices:
         return None

@@ -5,6 +5,9 @@ from __future__ import annotations
 from typing import TYPE_CHECKING

 from custom_components.tibber_prices.const import PRICE_RATING_MAPPING
+from custom_components.tibber_prices.coordinator.helpers import (
+    get_intervals_for_day_offsets,
+)
 from homeassistant.const import PERCENTAGE

 if TYPE_CHECKING:
@@ -46,20 +49,32 @@ def _get_day_key_from_sensor_key(key: str) -> str:
     return "today"


-def _add_fallback_timestamp(attributes: dict, key: str, price_info: dict) -> None:
+def _add_fallback_timestamp(
+    attributes: dict,
+    key: str,
+    price_info: dict,
+) -> None:
     """
     Add fallback timestamp to attributes based on the day in the sensor key.

     Args:
         attributes: Dictionary to add timestamp to
         key: The sensor entity key
-        price_info: Price info dictionary from coordinator data
+        price_info: Price info dictionary from coordinator data (flat structure)

     """
     day_key = _get_day_key_from_sensor_key(key)
-    day_data = price_info.get(day_key, [])
-    if day_data:
-        attributes["timestamp"] = day_data[0].get("startsAt")
+    # Use helper to get intervals for this day
+    # Build minimal coordinator_data structure for helper
+    coordinator_data = {"priceInfo": price_info}
+    # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+    day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[day_key]
+    day_intervals = get_intervals_for_day_offsets(coordinator_data, [day_offset])
+
+    # Use first interval's timestamp if available
+    if day_intervals:
+        attributes["timestamp"] = day_intervals[0].get("startsAt")


 def add_statistics_attributes(
@@ -4,6 +4,8 @@ from __future__ import annotations

 from typing import TYPE_CHECKING

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
+
 if TYPE_CHECKING:
     from custom_components.tibber_prices.coordinator.core import (
         TibberPricesDataUpdateCoordinator,

@@ -40,15 +42,11 @@ def add_next_avg_attributes(
     # Use TimeService to get the N-hour window starting from next interval
     next_interval_start, window_end = time.get_next_n_hours_window(hours)

-    # Get all price intervals
-    price_info = coordinator.data.get("priceInfo", {})
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-    all_prices = today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator.data, [-1, 0, 1])

     if not all_prices:
         return

     # Find all intervals in the window
     intervals_in_window = []
     for price_data in all_prices:
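The selection loop of `add_next_avg_attributes` is only partially visible in this hunk; the sketch below assumes half-open window bounds (start inclusive, end exclusive) over 15-minute intervals, with hypothetical timestamps:

```python
from datetime import datetime, timedelta

# Stand-ins for time.get_next_n_hours_window(3)
next_interval_start = datetime(2024, 1, 2, 12, 15)
window_end = next_interval_start + timedelta(hours=3)

# 15-minute intervals from 30 minutes before the window to well past its end
all_prices = [
    {"startsAt": next_interval_start + timedelta(minutes=15 * i), "total": 0.20 + 0.01 * i}
    for i in range(-2, 16)
]

# Keep only intervals inside the N-hour window
intervals_in_window = [
    p for p in all_prices if next_interval_start <= p["startsAt"] < window_end
]
```

With quarter-hourly resolution, a 3-hour window selects exactly twelve intervals, regardless of which day buckets they originally came from.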
@@ -86,11 +84,8 @@ def get_future_prices(
     if not coordinator.data:
         return None

-    price_info = coordinator.data.get("priceInfo", {})
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-    all_prices = today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator.data, [-1, 0, 1])

     if not all_prices:
         return None

@@ -101,8 +96,12 @@ def get_future_prices(
     # Track the maximum intervals to return
     intervals_to_return = MAX_FORECAST_INTERVALS if max_intervals is None else max_intervals

-    for day_key in ["today", "tomorrow"]:
-        for price_data in price_info.get(day_key, []):
+    # Get current date for day key determination
+    now = time.now()
+    today_date = now.date()
+    tomorrow_date = time.get_local_date(offset_days=1)
+
+    for price_data in all_prices:
         starts_at = time.get_interval_time(price_data)
         if starts_at is None:
             continue

@@ -111,6 +110,15 @@ def get_future_prices(

         # Use TimeService to check if interval is in future
         if time.is_in_future(starts_at):
+            # Determine which day this interval belongs to
+            interval_date = starts_at.date()
+            if interval_date == today_date:
+                day_key = "today"
+            elif interval_date == tomorrow_date:
+                day_key = "tomorrow"
+            else:
+                day_key = "unknown"
+
             future_prices.append(
                 {
                     "interval_start": starts_at,
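With the flat list, `get_future_prices` derives each interval's day key from its date instead of from the bucket it came out of. The classification branch from the hunk, lifted into a standalone function with hypothetical dates:

```python
from datetime import date, timedelta

today_date = date(2024, 1, 2)
tomorrow_date = today_date + timedelta(days=1)

def classify_day(interval_date: date) -> str:
    # Determine which day this interval belongs to
    if interval_date == today_date:
        return "today"
    if interval_date == tomorrow_date:
        return "tomorrow"
    return "unknown"
```

The `"unknown"` branch is new: with per-day buckets an interval's day was implied by its bucket, but a flat list can in principle contain intervals outside the today/tomorrow horizon.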
@@ -42,7 +42,6 @@ def add_current_interval_price_attributes(  # noqa: PLR0913
         time: TibberPricesTimeService instance (required)

     """
-    price_info = coordinator.data.get("priceInfo", {}) if coordinator.data else {}
     now = time.now()

     # Determine which interval to use based on sensor type

@@ -73,19 +72,19 @@ def add_current_interval_price_attributes(  # noqa: PLR0913
     interval_data = None
     if key in next_interval_sensors:
         target_time = time.get_next_interval_start()
-        interval_data = find_price_data_for_interval(price_info, target_time, time=time)
+        interval_data = find_price_data_for_interval(coordinator.data, target_time, time=time)
         # Override timestamp with the NEXT interval's startsAt (when that interval starts)
         if interval_data:
             attributes["timestamp"] = interval_data["startsAt"]
     elif key in previous_interval_sensors:
         target_time = time.get_interval_offset_time(-1)
-        interval_data = find_price_data_for_interval(price_info, target_time, time=time)
+        interval_data = find_price_data_for_interval(coordinator.data, target_time, time=time)
         # Override timestamp with the PREVIOUS interval's startsAt
         if interval_data:
             attributes["timestamp"] = interval_data["startsAt"]
     elif key in next_hour_sensors:
         target_time = now + timedelta(hours=1)
-        interval_data = find_price_data_for_interval(price_info, target_time, time=time)
+        interval_data = find_price_data_for_interval(coordinator.data, target_time, time=time)
         # Override timestamp with the center of the next rolling hour window
         if interval_data:
             attributes["timestamp"] = interval_data["startsAt"]
@@ -62,9 +62,9 @@ def build_lifecycle_attributes(
     data_completeness = lifecycle_calculator.get_data_completeness_status()
     attributes["data_completeness"] = data_completeness

-    attributes["yesterday_available"] = lifecycle_calculator.is_data_available("yesterday")
-    attributes["today_available"] = lifecycle_calculator.is_data_available("today")
-    attributes["tomorrow_available"] = lifecycle_calculator.is_data_available("tomorrow")
+    attributes["yesterday_available"] = lifecycle_calculator.is_data_available(-1)
+    attributes["today_available"] = lifecycle_calculator.is_data_available(0)
+    attributes["tomorrow_available"] = lifecycle_calculator.is_data_available(1)
     attributes["tomorrow_expected_after"] = "13:00"

     # Next Actions (only show if meaningful)

@@ -32,7 +32,6 @@ def get_current_interval_data(
     if not coordinator.data:
         return None

-    price_info = coordinator.data.get("priceInfo", {})
     now = time.now()

-    return find_price_data_for_interval(price_info, now, time=time)
+    return find_price_data_for_interval(coordinator.data, now, time=time)

@@ -5,6 +5,7 @@ from __future__ import annotations
 from datetime import timedelta
 from typing import TYPE_CHECKING

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.utils.price import calculate_volatility_level

 if TYPE_CHECKING:

@@ -32,7 +33,7 @@ def add_volatility_attributes(

 def get_prices_for_volatility(
     volatility_type: str,
-    price_info: dict,
+    coordinator_data: dict,
     *,
     time: TibberPricesTimeService,
 ) -> list[float]:
@@ -41,18 +42,33 @@ def get_prices_for_volatility(

     Args:
         volatility_type: One of "today", "tomorrow", "next_24h", "today_tomorrow"
-        price_info: Price information dictionary from coordinator data
+        coordinator_data: Coordinator data dict
         time: TibberPricesTimeService instance (required)

     Returns:
         List of prices to analyze

     """
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_intervals = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
+
     if volatility_type == "today":
-        return [float(p["total"]) for p in price_info.get("today", []) if "total" in p]
+        # Filter for today's intervals
+        today_date = time.now().date()
+        return [
+            float(p["total"])
+            for p in all_intervals
+            if "total" in p and p.get("startsAt") and p["startsAt"].date() == today_date
+        ]

     if volatility_type == "tomorrow":
-        return [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]
+        # Filter for tomorrow's intervals
+        tomorrow_date = (time.now() + timedelta(days=1)).date()
+        return [
+            float(p["total"])
+            for p in all_intervals
+            if "total" in p and p.get("startsAt") and p["startsAt"].date() == tomorrow_date
+        ]

     if volatility_type == "next_24h":
         # Rolling 24h from now

@@ -60,8 +76,7 @@ get_prices_for_volatility(
         end_time = now + timedelta(hours=24)
         prices = []

-        for day_key in ["today", "tomorrow"]:
-            for price_data in price_info.get(day_key, []):
+        for price_data in all_intervals:
            starts_at = price_data.get("startsAt")  # Already datetime in local timezone
            if starts_at is None:
                continue

@@ -72,10 +87,12 @@ get_prices_for_volatility(

     if volatility_type == "today_tomorrow":
         # Combined today + tomorrow
+        today_date = time.now().date()
+        tomorrow_date = (time.now() + timedelta(days=1)).date()
         prices = []
-        for day_key in ["today", "tomorrow"]:
-            for price_data in price_info.get(day_key, []):
-                if "total" in price_data:
+        for price_data in all_intervals:
+            starts_at = price_data.get("startsAt")
+            if starts_at and starts_at.date() in [today_date, tomorrow_date] and "total" in price_data:
                prices.append(float(price_data["total"]))
         return prices
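The volatility helpers now filter one combined interval list by local date instead of reading per-day buckets. The filtering pattern from the hunks above, demonstrated on hypothetical data:

```python
from datetime import datetime, timedelta

now = datetime(2024, 1, 2, 12, 0)  # stand-in for time.now()
today_date = now.date()
tomorrow_date = (now + timedelta(days=1)).date()

all_intervals = [
    {"startsAt": datetime(2024, 1, 1, 23, 45), "total": 0.31},  # yesterday: excluded
    {"startsAt": datetime(2024, 1, 2, 0, 0), "total": 0.28},
    {"startsAt": datetime(2024, 1, 2, 0, 15)},                  # no "total": excluded
    {"startsAt": datetime(2024, 1, 3, 0, 0), "total": 0.22},
]

today_prices = [
    float(p["total"])
    for p in all_intervals
    if "total" in p and p.get("startsAt") and p["startsAt"].date() == today_date
]
tomorrow_prices = [
    float(p["total"])
    for p in all_intervals
    if "total" in p and p.get("startsAt") and p["startsAt"].date() == tomorrow_date
]
```

Note the guards: an interval lacking `startsAt` or `total` is skipped rather than raising, matching the defensive style of the diff.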
@@ -85,7 +102,7 @@ get_prices_for_volatility(
 def add_volatility_type_attributes(
     volatility_attributes: dict,
     volatility_type: str,
-    price_info: dict,
+    coordinator_data: dict,
     thresholds: dict,
     *,
     time: TibberPricesTimeService,

@@ -96,29 +113,43 @@ def add_volatility_type_attributes(
     Args:
         volatility_attributes: Dictionary to add type-specific attributes to
         volatility_type: Type of volatility calculation
-        price_info: Price information dictionary from coordinator data
+        coordinator_data: Coordinator data dict
         thresholds: Volatility thresholds configuration
         time: TibberPricesTimeService instance (required)

     """
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_intervals = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
+    now = time.now()
+    today_date = now.date()
+    tomorrow_date = (now + timedelta(days=1)).date()
+
     # Add timestamp for calendar day volatility sensors (midnight of the day)
     if volatility_type == "today":
-        today_data = price_info.get("today", [])
+        today_data = [p for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == today_date]
         if today_data:
             volatility_attributes["timestamp"] = today_data[0].get("startsAt")
     elif volatility_type == "tomorrow":
-        tomorrow_data = price_info.get("tomorrow", [])
+        tomorrow_data = [p for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == tomorrow_date]
         if tomorrow_data:
             volatility_attributes["timestamp"] = tomorrow_data[0].get("startsAt")
     elif volatility_type == "today_tomorrow":
         # For combined today+tomorrow, use today's midnight
-        today_data = price_info.get("today", [])
+        today_data = [p for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == today_date]
         if today_data:
             volatility_attributes["timestamp"] = today_data[0].get("startsAt")

         # Add breakdown for today vs tomorrow
-        today_prices = [float(p["total"]) for p in price_info.get("today", []) if "total" in p]
-        tomorrow_prices = [float(p["total"]) for p in price_info.get("tomorrow", []) if "total" in p]
+        today_prices = [
+            float(p["total"])
+            for p in all_intervals
+            if "total" in p and p.get("startsAt") and p["startsAt"].date() == today_date
+        ]
+        tomorrow_prices = [
+            float(p["total"])
+            for p in all_intervals
+            if "total" in p and p.get("startsAt") and p["startsAt"].date() == tomorrow_date
+        ]

         if today_prices:
             today_vol = calculate_volatility_level(today_prices, **thresholds)
@@ -4,6 +4,8 @@ from __future__ import annotations

 from typing import TYPE_CHECKING

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
+
 if TYPE_CHECKING:
     from custom_components.tibber_prices.coordinator.core import (
         TibberPricesDataUpdateCoordinator,

@@ -58,12 +60,8 @@ def add_average_price_attributes(
     # Determine if this is trailing or leading
     is_trailing = "trailing" in key

-    # Get all price intervals
-    price_info = coordinator.data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator.data, [-1, 0, 1])

     if not all_prices:
         return

@@ -4,6 +4,10 @@ from __future__ import annotations

 from typing import TYPE_CHECKING, Any

+from custom_components.tibber_prices.coordinator.helpers import (
+    get_intervals_for_day_offsets,
+)
+
 if TYPE_CHECKING:
     from custom_components.tibber_prices.coordinator import (
         TibberPricesDataUpdateCoordinator,
@@ -56,9 +60,9 @@ class TibberPricesBaseCalculator:
         return self._coordinator.data

     @property
-    def price_info(self) -> dict[str, Any]:
-        """Get price information from coordinator data."""
-        return self.coordinator_data.get("priceInfo", {})
+    def price_info(self) -> list[dict[str, Any]]:
+        """Get price info (intervals list) from coordinator data."""
+        return self.coordinator_data.get("priceInfo", [])

     @property
     def user_data(self) -> dict[str, Any]:

@@ -67,17 +71,19 @@ class TibberPricesBaseCalculator:

     @property
     def currency(self) -> str:
-        """Get currency code from price info."""
-        return self.price_info.get("currency", "EUR")
+        """Get currency code from coordinator data."""
+        return self.coordinator_data.get("currency", "EUR")

     # Smart data access methods with built-in None-safety

-    def get_intervals(self, day: str) -> list[dict]:
+    def get_intervals(self, day_offset: int) -> list[dict]:
         """
         Get price intervals for a specific day with None-safety.

+        Uses get_intervals_for_day_offsets() to abstract data structure access.
+
         Args:
-            day: Day key ("yesterday", "today", "tomorrow").
+            day_offset: Day offset (-1=yesterday, 0=today, 1=tomorrow).

         Returns:
             List of interval dictionaries, empty list if unavailable.
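The base-calculator hunks change `priceInfo` from a per-day dict to a flat list and move `currency` up to the top level of the coordinator data. The resulting access pattern, on a hypothetical payload:

```python
from datetime import datetime

coordinator_data = {
    "priceInfo": [  # flat list of interval dicts, no per-day keys
        {"startsAt": datetime(2024, 1, 2, 0, 0), "total": 0.28},
    ],
    "currency": "NOK",
}

price_info = coordinator_data.get("priceInfo", [])   # default is now a list, not {}
currency = coordinator_data.get("currency", "EUR")   # no longer nested inside priceInfo
```

Changing the default from `{}` to `[]` matters: callers that iterate `price_info` directly still get an empty iteration when data is missing, but any leftover `.get("today")` call on it would now fail loudly instead of silently returning nothing.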
@@ -85,36 +91,22 @@ class TibberPricesBaseCalculator:
         """
         if not self.coordinator_data:
             return []
-        return self.price_info.get(day, [])
+        return get_intervals_for_day_offsets(self.coordinator_data, [day_offset])

     @property
     def intervals_today(self) -> list[dict]:
         """Get today's intervals with None-safety."""
-        return self.get_intervals("today")
+        return self.get_intervals(0)

     @property
     def intervals_tomorrow(self) -> list[dict]:
         """Get tomorrow's intervals with None-safety."""
-        return self.get_intervals("tomorrow")
+        return self.get_intervals(1)

     @property
     def intervals_yesterday(self) -> list[dict]:
         """Get yesterday's intervals with None-safety."""
-        return self.get_intervals("yesterday")
+        return self.get_intervals(-1)

-    def get_all_intervals(self) -> list[dict]:
-        """
-        Get all available intervals (yesterday + today + tomorrow).
-
-        Returns:
-            Combined list of all interval dictionaries.
-
-        """
-        return [
-            *self.intervals_yesterday,
-            *self.intervals_today,
-            *self.intervals_tomorrow,
-        ]
-
     def find_interval_at_offset(self, offset: int) -> dict | None:
         """

@@ -136,7 +128,7 @@ class TibberPricesBaseCalculator:

         time = self.coordinator.time
         target_time = time.get_interval_offset_time(offset)
-        return find_price_data_for_interval(self.price_info, target_time, time=time)
+        return find_price_data_for_interval(self.coordinator.data, target_time, time=time)

     def safe_get_from_interval(
         self,

@@ -178,17 +170,17 @@ class TibberPricesBaseCalculator:
         """
         return bool(self.price_info)

-    def get_day_intervals(self, day: str) -> list[dict]:
+    def get_day_intervals(self, day_offset: int) -> list[dict]:
         """
         Get intervals for a specific day from coordinator data.

         This is an alias for get_intervals() with consistent naming.

         Args:
-            day: Day key ("yesterday", "today", "tomorrow").
+            day_offset: Day offset (-1=yesterday, 0=today, 1=tomorrow).

         Returns:
             List of interval dictionaries, empty list if unavailable.

         """
-        return self.get_intervals(day)
+        return self.get_intervals(day_offset)
@@ -75,8 +75,8 @@ class TibberPricesDailyStatCalculator(TibberPricesBaseCalculator):
         # Collect all prices and their intervals from both today and tomorrow data
         # that fall within the target day's local date boundaries
         price_intervals = []
-        for day_key in ["today", "tomorrow"]:
-            for price_data in self.get_intervals(day_key):
+        for day_offset in [0, 1]:  # today=0, tomorrow=1
+            for price_data in self.get_intervals(day_offset):
                 starts_at = price_data.get("startsAt")  # Already datetime in local timezone
                 if not starts_at:
                     continue

@@ -139,8 +139,8 @@ class TibberPricesDailyStatCalculator(TibberPricesBaseCalculator):
         # Collect all intervals from both today and tomorrow data
         # that fall within the target day's local date boundaries
         day_intervals = []
-        for day_key in ["yesterday", "today", "tomorrow"]:
-            for price_data in self.get_intervals(day_key):
+        for day_offset in [-1, 0, 1]:  # yesterday=-1, today=0, tomorrow=1
+            for price_data in self.get_intervals(day_offset):
                 starts_at = price_data.get("startsAt")  # Already datetime in local timezone
                 if not starts_at:
                     continue

@@ -190,12 +190,12 @@ class TibberPricesLifecycleCalculator(TibberPricesBaseCalculator):
         # Next midnight
         return now_local.replace(hour=0, minute=0, second=0, microsecond=0) + timedelta(days=1)

-    def is_data_available(self, day: str) -> bool:
+    def is_data_available(self, day_offset: int) -> bool:
         """
         Check if data is available for a specific day.

         Args:
-            day: "yesterday", "today", or "tomorrow"
+            day_offset: Day offset (-1=yesterday, 0=today, 1=tomorrow)

         Returns:
             True if data exists and is not empty

@@ -204,7 +204,7 @@ class TibberPricesLifecycleCalculator(TibberPricesBaseCalculator):
         if not self.has_data():
             return False

-        day_data = self.get_intervals(day)
+        day_data = self.get_intervals(day_offset)
         return bool(day_data)

     def get_data_completeness_status(self) -> str:

@@ -219,9 +219,9 @@ class TibberPricesLifecycleCalculator(TibberPricesBaseCalculator):
            'no_data': No data available at all

         """
-        yesterday_available = self.is_data_available("yesterday")
-        today_available = self.is_data_available("today")
-        tomorrow_available = self.is_data_available("tomorrow")
+        yesterday_available = self.is_data_available(-1)
+        today_available = self.is_data_available(0)
+        tomorrow_available = self.is_data_available(1)

         if yesterday_available and today_available and tomorrow_available:
             return "complete"
@@ -8,6 +8,7 @@ from custom_components.tibber_prices.const import (
    DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
    DEFAULT_PRICE_RATING_THRESHOLD_LOW,
 )
+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.entity_utils import find_rolling_hour_center_index
 from custom_components.tibber_prices.sensor.helpers import (
    aggregate_level_data,
@@ -51,8 +52,8 @@ class TibberPricesRollingHourCalculator(TibberPricesBaseCalculator):
        if not self.has_data():
            return None

-        # Get all available price data
-        all_prices = self.get_all_intervals()
+        # Get all available price data (yesterday, today, tomorrow)
+        all_prices = get_intervals_for_day_offsets(self.coordinator_data, [-1, 0, 1])

        if not all_prices:
            return None
@@ -15,6 +15,7 @@ Caching strategy:
 from datetime import datetime
 from typing import TYPE_CHECKING, Any

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.utils.average import calculate_next_n_hours_avg
 from custom_components.tibber_prices.utils.price import (
    calculate_price_trend,
@@ -308,8 +309,8 @@ class TibberPricesTrendCalculator(TibberPricesBaseCalculator):
        if not self.has_data():
            return None

-        all_intervals = self.get_all_intervals()
-        current_interval = find_price_data_for_interval(self.price_info, now, time=time)
+        all_intervals = get_intervals_for_day_offsets(self.coordinator_data, [-1, 0, 1])
+        current_interval = find_price_data_for_interval(self.coordinator.data, now, time=time)

        if not all_intervals or not current_interval:
            return None
@@ -62,7 +62,9 @@ class TibberPricesVolatilityCalculator(TibberPricesBaseCalculator):
        }

        # Get prices based on volatility type
-        prices_to_analyze = get_prices_for_volatility(volatility_type, self.price_info, time=self.coordinator.time)
+        prices_to_analyze = get_prices_for_volatility(
+            volatility_type, self.coordinator.data, time=self.coordinator.time
+        )

        if not prices_to_analyze:
            return None
@@ -94,7 +96,11 @@ class TibberPricesVolatilityCalculator(TibberPricesBaseCalculator):

        # Add type-specific attributes
        add_volatility_type_attributes(
-            self._last_volatility_attributes, volatility_type, self.price_info, thresholds, time=self.coordinator.time
+            self._last_volatility_attributes,
+            volatility_type,
+            self.coordinator.data,
+            thresholds,
+            time=self.coordinator.time,
        )

        # Return lowercase for ENUM device class
@@ -21,6 +21,9 @@ from custom_components.tibber_prices.coordinator import (
    MINUTE_UPDATE_ENTITY_KEYS,
    TIME_SENSITIVE_ENTITY_KEYS,
 )
+from custom_components.tibber_prices.coordinator.helpers import (
+    get_intervals_for_day_offsets,
+)
 from custom_components.tibber_prices.entity import TibberPricesEntity
 from custom_components.tibber_prices.entity_utils import (
    add_icon_color_attribute,
@@ -253,9 +256,8 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if not self.coordinator.data:
            return None

-        # Get all available price data
-        price_info = self.coordinator.data.get("priceInfo", {})
-        all_prices = price_info.get("yesterday", []) + price_info.get("today", []) + price_info.get("tomorrow", [])
+        # Get all available price data (yesterday, today, tomorrow) via helper
+        all_prices = get_intervals_for_day_offsets(self.coordinator.data, [-1, 0, 1])

        if not all_prices:
            return None
@@ -313,25 +315,13 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if not self.coordinator.data:
            return None

-        price_info = self.coordinator.data.get("priceInfo", {})
-
-        # Get TimeService from coordinator
-        time = self.coordinator.time
-
-        # Get local midnight boundaries based on the requested day using TimeService
-        local_midnight, local_midnight_next_day = time.get_day_boundaries(day)
-
-        # Collect all prices and their intervals from both today and tomorrow data
-        # that fall within the target day's local date boundaries
+        # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+        day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[day]
+        day_intervals = get_intervals_for_day_offsets(self.coordinator.data, [day_offset])
+
+        # Collect all prices and their intervals
        price_intervals = []
-        for day_key in ["today", "tomorrow"]:
-            for price_data in price_info.get(day_key, []):
-                starts_at = price_data.get("startsAt")  # Already datetime in local timezone
-                if not starts_at:
-                    continue
-
-                # Include price if it starts within the target day's local date boundaries
-                if local_midnight <= starts_at < local_midnight_next_day:
-                    total_price = price_data.get("total")
-                    if total_price is not None:
-                        price_intervals.append(
+        for price_data in day_intervals:
+            total_price = price_data.get("total")
+            if total_price is not None:
+                price_intervals.append(
@@ -381,24 +371,9 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if not self.coordinator.data:
            return None

-        price_info = self.coordinator.data.get("priceInfo", {})
-
-        # Get local midnight boundaries based on the requested day using TimeService
-        time = self.coordinator.time
-        local_midnight, local_midnight_next_day = time.get_day_boundaries(day)
-
-        # Collect all intervals from both today and tomorrow data
-        # that fall within the target day's local date boundaries
-        day_intervals = []
-        for day_key in ["yesterday", "today", "tomorrow"]:
-            for price_data in price_info.get(day_key, []):
-                starts_at = price_data.get("startsAt")  # Already datetime in local timezone
-                if not starts_at:
-                    continue
-
-                # Include interval if it starts within the target day's local date boundaries
-                if local_midnight <= starts_at < local_midnight_next_day:
-                    day_intervals.append(price_data)
+        # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+        day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[day]
+        day_intervals = get_intervals_for_day_offsets(self.coordinator.data, [day_offset])

        if not day_intervals:
            return None
@@ -512,11 +487,12 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if not self.coordinator.data:
            return None

-        price_info = self.coordinator.data.get("priceInfo", {})
+        # Use helper to get all intervals (today and tomorrow)
+        all_intervals = get_intervals_for_day_offsets(self.coordinator.data, [0, 1])
        latest_timestamp = None

-        for day in ["today", "tomorrow"]:
-            for price_data in price_info.get(day, []):
+        # Search through intervals to find latest timestamp
+        for price_data in all_intervals:
            starts_at = price_data.get("startsAt")  # Already datetime in local timezone
            if not starts_at:
                continue
@@ -540,8 +516,6 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if not self.coordinator.data:
            return None

-        price_info = self.coordinator.data.get("priceInfo", {})
-
        # Get volatility thresholds from config
        thresholds = {
            "threshold_moderate": self.coordinator.config_entry.options.get("volatility_threshold_moderate", 5.0),
@@ -550,7 +524,9 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        }

        # Get prices based on volatility type
-        prices_to_analyze = get_prices_for_volatility(volatility_type, price_info, time=self.coordinator.time)
+        prices_to_analyze = get_prices_for_volatility(
+            volatility_type, self.coordinator.data, time=self.coordinator.time
+        )

        if not prices_to_analyze:
            return None
@@ -582,7 +558,11 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):

        # Add type-specific attributes
        add_volatility_type_attributes(
-            self._last_volatility_attributes, volatility_type, price_info, thresholds, time=self.coordinator.time
+            self._last_volatility_attributes,
+            volatility_type,
+            self.coordinator.data,
+            thresholds,
+            time=self.coordinator.time,
        )

        # Return lowercase for ENUM device class
@@ -722,8 +702,7 @@ class TibberPricesSensor(TibberPricesEntity, SensorEntity):
        if self.entity_description.device_class == SensorDeviceClass.MONETARY:
            currency = None
            if self.coordinator.data:
-                price_info = self.coordinator.data.get("priceInfo", {})
-                currency = price_info.get("currency")
+                currency = self.coordinator.data.get("currency")

            # Use major currency unit for Energy Dashboard sensor
            if self.entity_description.key == "current_interval_price_major":
@@ -24,6 +24,7 @@ from typing import TYPE_CHECKING
 if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.time_service import TibberPricesTimeService

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
 from custom_components.tibber_prices.entity_utils.helpers import get_price_value
 from custom_components.tibber_prices.utils.price import (
    aggregate_price_levels,
@@ -130,7 +131,7 @@ def aggregate_window_data(


 def get_hourly_price_value(
-    price_info: dict,
+    coordinator_data: dict,
    *,
    hour_offset: int,
    in_euro: bool,
@@ -143,7 +144,7 @@ def get_hourly_price_value(
    Kept for potential backward compatibility.

    Args:
-        price_info: Price information dict with 'today' and 'tomorrow' keys
+        coordinator_data: Coordinator data dict
        hour_offset: Hour offset from current time (positive=future, negative=past)
        in_euro: If True, return price in major currency (EUR), else minor (cents/øre)
        time: TibberPricesTimeService instance (required)
@@ -161,30 +162,18 @@ def get_hourly_price_value(
    target_hour = target_datetime.hour
    target_date = target_datetime.date()

-    # Determine which day's data we need
-    day_key = "tomorrow" if target_date > now.date() else "today"
-
-    for price_data in price_info.get(day_key, []):
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_intervals = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
+
+    # Search through all intervals to find the matching hour
+    for price_data in all_intervals:
        # Parse the timestamp and convert to local time
        starts_at = time.get_interval_time(price_data)
        if starts_at is None:
            continue

-        # Make sure it's in the local timezone for proper comparison
-
        # Compare using both hour and date for accuracy
        if starts_at.hour == target_hour and starts_at.date() == target_date:
            return get_price_value(float(price_data["total"]), in_euro=in_euro)

-    # If we didn't find the price in the expected day's data, check the other day
-    # This is a fallback for potential edge cases
-    other_day_key = "today" if day_key == "tomorrow" else "tomorrow"
-    for price_data in price_info.get(other_day_key, []):
-        starts_at = time.get_interval_time(price_data)
-        if starts_at is None:
-            continue
-
-        if starts_at.hour == target_hour and starts_at.date() == target_date:
-            return get_price_value(float(price_data["total"]), in_euro=in_euro)
-
    return None
@@ -95,8 +95,8 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]:

    # Get coordinator to access price data (for currency)
    _, coordinator, _ = get_entry_and_data(hass, entry_id)
-    price_info = coordinator.data.get("priceInfo", {})
-    currency = price_info.get("currency", "EUR")
+    # Get currency from coordinator data
+    currency = coordinator.data.get("currency", "EUR")
    price_unit = format_price_unit_minor(currency)

    # Get a sample entity_id for the series (first sensor from this entry)
@@ -42,6 +42,9 @@ from custom_components.tibber_prices.const import (
    PRICE_RATING_LOW,
    PRICE_RATING_NORMAL,
 )
+from custom_components.tibber_prices.coordinator.helpers import (
+    get_intervals_for_day_offsets,
+)
 from homeassistant.exceptions import ServiceValidationError

 from .formatters import aggregate_hourly_exact, get_period_data, normalize_level_filter, normalize_rating_level_filter
@@ -205,7 +208,6 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091

    # === NORMAL HANDLING: Interval Data ===
    # Get price data for all requested days
-    price_info = coordinator.data.get("priceInfo", {})
    chart_data = []

    # Build set of timestamps that match period_filter if specified
@@ -231,21 +233,27 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091
    # Collect all timestamps if insert_nulls='all' (needed to insert NULLs for missing filter matches)
    all_timestamps = set()
    if insert_nulls == "all" and (level_filter or rating_level_filter):
-        for day in days:
-            day_prices = price_info.get(day, [])
-            for interval in day_prices:
-                start_time = interval.get("startsAt")
-                if start_time:
-                    all_timestamps.add(start_time)
+        # Use helper to get intervals for requested days
+        # Map day keys to offsets: yesterday=-1, today=0, tomorrow=1
+        day_offset_map = {"yesterday": -1, "today": 0, "tomorrow": 1}
+        offsets = [day_offset_map[day] for day in days]
+        day_intervals = get_intervals_for_day_offsets(coordinator.data, offsets)
+        all_timestamps = {interval["startsAt"] for interval in day_intervals if interval.get("startsAt")}
        all_timestamps = sorted(all_timestamps)

    # Calculate average if requested
    day_averages = {}
    if include_average:
        for day in days:
-            day_prices = price_info.get(day, [])
-            if day_prices:
-                prices = [p["total"] for p in day_prices if p.get("total") is not None]
+            # Use helper to get intervals for this day
+            # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+            day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[day]
+            day_intervals = get_intervals_for_day_offsets(coordinator.data, [day_offset])
+
+            # Collect prices from intervals
+            prices = [p["total"] for p in day_intervals if p.get("total") is not None]
+
            if prices:
                avg = sum(prices) / len(prices)
                # Apply same transformations as to regular prices
@@ -255,7 +263,10 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091
                day_averages[day] = avg

    for day in days:
-        day_prices = price_info.get(day, [])
+        # Use helper to get intervals for this day
+        # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+        day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[day]
+        day_prices = get_intervals_for_day_offsets(coordinator.data, [day_offset])

        if resolution == "interval":
            # Original 15-minute intervals
@@ -331,7 +342,7 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091
                next_value = next_interval.get(filter_field)

                # Check if current interval matches filter
-                if interval_value in filter_values:
+                if interval_value in filter_values:  # type: ignore[operator]
                    # Convert price
                    converted_price = round(price * 100, 2) if minor_currency else round(price, 4)
                    if round_decimals is not None:
@@ -372,7 +383,7 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091
                last_price = last_interval.get("total")
                last_value = last_interval.get(filter_field)

-                if last_start_time and last_price is not None and last_value in filter_values:
+                if last_start_time and last_price is not None and last_value in filter_values:  # type: ignore[operator]
                    # Timestamp is already datetime in local timezone
                    last_dt = last_start_time  # Already datetime object
                    if last_dt:
@@ -393,9 +404,12 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]:  # noqa: PLR091
                    midnight_interval = None

                    if next_day_name:
-                        next_day_prices = price_info.get(next_day_name, [])
-                        if next_day_prices:
-                            first_next = next_day_prices[0]
+                        # Use helper to get first interval of next day
+                        # Map day key to offset: yesterday=-1, today=0, tomorrow=1
+                        next_day_offset = {"yesterday": -1, "today": 0, "tomorrow": 1}[next_day_name]
+                        next_day_intervals = get_intervals_for_day_offsets(coordinator.data, [next_day_offset])
+                        if next_day_intervals:
+                            first_next = next_day_intervals[0]
                            first_next_value = first_next.get(filter_field)
                            # Only use tomorrow's price if it matches the same filter
                            if first_next_value == last_value:
@@ -27,6 +27,9 @@ from custom_components.tibber_prices.const import (
    DEFAULT_PRICE_RATING_THRESHOLD_LOW,
    get_translation,
 )
+from custom_components.tibber_prices.coordinator.helpers import (
+    get_intervals_for_day_offsets,
+)
 from custom_components.tibber_prices.sensor.helpers import aggregate_level_data, aggregate_rating_data


@@ -256,18 +259,12 @@ def get_period_data(  # noqa: PLR0913, PLR0912, PLR0915
    # Filter periods by day if requested
    filtered_periods = []
    if days:
-        # Build set of allowed dates
-        allowed_dates = set()
-        for day in days:
-            # Map day names to actual dates from coordinator
-            price_info = coordinator.data.get("priceInfo", {})
-            day_prices = price_info.get(day, [])
-            if day_prices:
-                # Extract date from first interval (already datetime in local timezone)
-                first_interval = day_prices[0]
-                starts_at = first_interval.get("startsAt")  # Already datetime object
-                if starts_at:
-                    allowed_dates.add(starts_at.date())
+        # Use helper to get intervals for requested days, extract their dates
+        # Map day keys to offsets: yesterday=-1, today=0, tomorrow=1
+        day_offset_map = {"yesterday": -1, "today": 0, "tomorrow": 1}
+        offsets = [day_offset_map[day] for day in days]
+        day_intervals = get_intervals_for_day_offsets(coordinator.data, offsets)
+        allowed_dates = {interval["startsAt"].date() for interval in day_intervals if interval.get("startsAt")}

    # Filter periods to those within allowed dates
    for period in period_summaries:
@@ -5,6 +5,8 @@ from __future__ import annotations
 from datetime import datetime, timedelta
 from typing import TYPE_CHECKING

+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets
+
 if TYPE_CHECKING:
    from custom_components.tibber_prices.coordinator.time_service import TibberPricesTimeService

@@ -98,17 +100,13 @@ def calculate_current_trailing_avg(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

    now = time.now()
    return calculate_trailing_24h_avg(all_prices, now)


 def calculate_current_leading_avg(
@@ -130,17 +128,13 @@ def calculate_current_leading_avg(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

    now = time.now()
    return calculate_leading_24h_avg(all_prices, now)


 def calculate_trailing_24h_min(
@@ -322,12 +316,8 @@ def calculate_current_trailing_min(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

@@ -354,12 +344,8 @@ def calculate_current_trailing_max(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

@@ -386,17 +372,13 @@ def calculate_current_leading_min(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

    now = time.now()
    return calculate_leading_24h_min(all_prices, now, time=time)


 def calculate_current_leading_max(
@@ -418,12 +400,8 @@ def calculate_current_leading_max(
    if not coordinator_data:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

@@ -455,12 +433,8 @@ def calculate_next_n_hours_avg(
    if not coordinator_data or hours <= 0:
        return None

-    price_info = coordinator_data.get("priceInfo", {})
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_prices = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
    if not all_prices:
        return None

@@ -22,6 +22,7 @@ from custom_components.tibber_prices.const import (
    VOLATILITY_MODERATE,
    VOLATILITY_VERY_HIGH,
 )
+from custom_components.tibber_prices.coordinator.helpers import get_intervals_for_day_offsets

 _LOGGER = logging.getLogger(__name__)

@@ -294,24 +295,26 @@ def _process_price_interval(


 def enrich_price_info_with_differences(
-    price_info: dict[str, Any],
+    all_intervals: list[dict[str, Any]],
+    *,
    threshold_low: float | None = None,
    threshold_high: float | None = None,
-) -> dict[str, Any]:
+    time: TibberPricesTimeService,
+) -> list[dict[str, Any]]:
    """
-    Enrich price info with calculated 'difference' and 'rating_level' values.
+    Enrich price intervals with calculated 'difference' and 'rating_level' values.

    Computes the trailing 24-hour average, difference percentage, and rating level
-    for each interval in today and tomorrow (excluding yesterday since it's historical).
+    for each interval in the flat list (in-place modification).

    Args:
-        price_info: Dictionary with 'yesterday', 'today', 'tomorrow' keys
+        all_intervals: Flat list of all price intervals.
        threshold_low: Low threshold percentage for rating_level (defaults to -10)
        threshold_high: High threshold percentage for rating_level (defaults to 10)
        time: TibberPricesTimeService instance (required)

    Returns:
-        Updated price_info dict with 'difference' and 'rating_level' added
+        Same list (modified in-place) with 'difference' and 'rating_level' added

    """
    if threshold_low is None:
@@ -319,36 +322,52 @@ def enrich_price_info_with_differences(
     if threshold_high is None:
         threshold_high = 10
 
-    yesterday_prices = price_info.get("yesterday", [])
-    today_prices = price_info.get("today", [])
-    tomorrow_prices = price_info.get("tomorrow", [])
-
-    # Combine all prices for lookback calculation
-    all_prices = yesterday_prices + today_prices + tomorrow_prices
+    if not all_intervals:
+        return all_intervals
+
+    # Determine day keys for logging
+    now_date = time.now().date()
+    yesterday_date = now_date - timedelta(days=1)
+    tomorrow_date = now_date + timedelta(days=1)
+
+    yesterday_count = sum(1 for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == yesterday_date)
+    today_count = sum(1 for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == now_date)
+    tomorrow_count = sum(1 for p in all_intervals if p.get("startsAt") and p["startsAt"].date() == tomorrow_date)
 
     _LOGGER.debug(
         "Enriching price info with differences and rating levels: "
         "yesterday=%d, today=%d, tomorrow=%d, thresholds: low=%.2f, high=%.2f",
-        len(yesterday_prices),
-        len(today_prices),
-        len(tomorrow_prices),
+        yesterday_count,
+        today_count,
+        tomorrow_count,
         threshold_low,
         threshold_high,
     )
 
-    # Process today's prices
-    for price_interval in today_prices:
-        _process_price_interval(price_interval, all_prices, threshold_low, threshold_high, "today")
-
-    # Process tomorrow's prices
-    for price_interval in tomorrow_prices:
-        _process_price_interval(price_interval, all_prices, threshold_low, threshold_high, "tomorrow")
-
-    return price_info
+    # Process all intervals (modifies in-place)
+    for price_interval in all_intervals:
+        starts_at = price_interval.get("startsAt")
+        if not starts_at:
+            continue
+
+        # Determine day key for this interval
+        interval_date = starts_at.date()
+        if interval_date == yesterday_date:
+            day_key = "yesterday"
+        elif interval_date == now_date:
+            day_key = "today"
+        elif interval_date == tomorrow_date:
+            day_key = "tomorrow"
+        else:
+            day_key = "unknown"
+
+        _process_price_interval(price_interval, all_intervals, threshold_low, threshold_high, day_key)
+
+    return all_intervals
 
 
 def find_price_data_for_interval(
-    price_info: Any,
+    coordinator_data: dict,
     target_time: datetime,
     *,
     time: TibberPricesTimeService,
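With the flat list, the day label is no longer implied by which dictionary key an interval came from, so the loop above derives it from each interval's `startsAt` date. That classification step can be isolated as a small pure function (a sketch for illustration; `classify_day_key` is a hypothetical name, not part of the integration):

```python
from datetime import date, timedelta


def classify_day_key(interval_date: date, today: date) -> str:
    """Map an interval's date to the day-key label used in the debug logging above."""
    if interval_date == today - timedelta(days=1):
        return "yesterday"
    if interval_date == today:
        return "today"
    if interval_date == today + timedelta(days=1):
        return "tomorrow"
    # Dates outside the yesterday..tomorrow window (e.g. day before yesterday)
    return "unknown"


today = date(2024, 1, 2)
labels = [classify_day_key(d, today) for d in (date(2024, 1, 1), today, date(2024, 1, 3), date(2023, 12, 31))]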
@@ -357,7 +376,7 @@ def find_price_data_for_interval(
     Find the price data for a specific 15-minute interval timestamp.
 
     Args:
-        price_info: The price info dictionary from Tibber API
+        coordinator_data: The coordinator data dict
         target_time: The target timestamp to find price data for
         time: TibberPricesTimeService instance (required)
@@ -368,22 +387,19 @@ def find_price_data_for_interval(
     # Round to nearest quarter-hour to handle edge cases where we're called
     # slightly before the boundary (e.g., 14:59:59.999 → 15:00:00)
     rounded_time = time.round_to_nearest_quarter(target_time)
+    rounded_date = rounded_time.date()
 
-    day_key = "tomorrow" if rounded_time.date() > time.now().date() else "today"
-    search_days = [day_key, "tomorrow" if day_key == "today" else "today"]
+    # Get all intervals (yesterday, today, tomorrow) via helper
+    all_intervals = get_intervals_for_day_offsets(coordinator_data, [-1, 0, 1])
 
-    for search_day in search_days:
-        day_prices = price_info.get(search_day, [])
-        if not day_prices:
-            continue
-
-        for price_data in day_prices:
-            starts_at = time.get_interval_time(price_data)
-            if starts_at is None:
-                continue
-
-            # Exact match after rounding
-            if starts_at == rounded_time and starts_at.date() == rounded_time.date():
-                return price_data
+    # Search for matching interval
+    for price_data in all_intervals:
+        starts_at = time.get_interval_time(price_data)
+        if starts_at is None:
+            continue
+
+        # Exact match after rounding (both time and date must match)
+        if starts_at == rounded_time and starts_at.date() == rounded_date:
+            return price_data
 
     return None
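The lookup above depends on `time.round_to_nearest_quarter` snapping a timestamp like 14:59:59.999 onto the 15:00:00 boundary before the exact-match comparison. The service's real implementation is not in this diff; a plausible standalone sketch of that rounding behavior (hypothetical function, not the integration's own code):

```python
from datetime import datetime, timedelta


def round_to_nearest_quarter(dt: datetime) -> datetime:
    """Round a timestamp to the nearest 15-minute boundary.

    Note: Python's round() uses ties-to-even at exact midpoints (xx:07:30).
    """
    midnight = dt.replace(hour=0, minute=0, second=0, microsecond=0)
    quarter = 15 * 60  # seconds per quarter-hour
    rounded_seconds = round((dt - midnight).total_seconds() / quarter) * quarter
    return midnight + timedelta(seconds=rounded_seconds)


# 14:59:59.999 rounds forward onto the 15:00 boundary; 14:08 rounds up to 14:15.
boundary = round_to_nearest_quarter(datetime(2024, 1, 1, 14, 59, 59, 999000))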
@@ -20,7 +20,7 @@ The integration uses **4 distinct caching layers** with different purposes and l
 **Purpose:** Reduce API calls to Tibber by caching user data and price data between HA restarts.
 
 **What is cached:**
-- **Price data** (`price_data`): Yesterday/today/tomorrow price intervals with enriched fields
+- **Price data** (`price_data`): Day before yesterday/yesterday/today/tomorrow price intervals with enriched fields (384 intervals total)
 - **User data** (`user_data`): Homes, subscriptions, features from Tibber GraphQL `viewer` query
 - **Timestamps**: Last update times for validation
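The 384-interval figure in the updated documentation follows directly from the quarter-hourly resolution:

```python
# 15-minute resolution yields 4 intervals per hour.
intervals_per_day = 24 * 4  # 96 intervals per day
days_cached = 4  # day before yesterday, yesterday, today, tomorrow
total_intervals = days_cached * intervals_per_day  # 384
```

Fetching the extra day of history is what lets the trailing 24h average be computed for every interval of yesterday, not just those with a full day of data behind them.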
@@ -185,12 +185,8 @@ These patterns were analyzed and classified as **not critical**:
 ```python
 coordinator.data = {
     "user_data": {...},
-    "priceInfo": {
-        "yesterday": [...],
-        "today": [...],
-        "tomorrow": [...],
-        "currency": "EUR"
-    }
+    "priceInfo": [...],  # Flat list of all enriched intervals
+    "currency": "EUR"  # Top-level for easy access
 }
 ```
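The documentation change above captures the core of the refactor: consumers now iterate one flat `priceInfo` list instead of traversing per-day keys, and `currency` moves to the top level. A minimal illustration with made-up interval values:

```python
coordinator_data = {
    "user_data": {"homes": []},
    "priceInfo": [
        # Flat list: enriched intervals for all cached days, in chronological order
        {"startsAt": "2024-01-02T00:00:00+00:00", "total": 0.25, "rating_level": "normal"},
        {"startsAt": "2024-01-02T00:15:00+00:00", "total": 0.24, "rating_level": "low"},
    ],
    "currency": "EUR",  # Top-level, no longer nested inside priceInfo
}

# One pass over all intervals, no "yesterday"/"today"/"tomorrow" key juggling.
totals = [interval["total"] for interval in coordinator_data["priceInfo"]]
currency = coordinator_data["currency"]
```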