feat(chart_export): add Chart Data Export diagnostic sensor

Added optional diagnostic binary sensor that exposes get_chartdata
service results as entity attributes for legacy dashboard tools.

Key features:
- Entity: binary_sensor.tibber_home_NAME_chart_data_export
- Configurable via Options Flow Step 7 (YAML parameters)
- Calls get_chartdata service with user configuration
- Exposes response as attributes for chart cards
- Disabled by default (opt-in)
- Refreshes on initial activation and on config changes (routine coordinator updates are skipped)
- Manual refresh via homeassistant.update_entity

Implementation details:
- Added chart_data_export entity description to definitions.py
- Implemented state/attribute logic in binary_sensor/core.py
- Added YAML configuration schema in schemas.py
- Added validation in options_flow.py (Step 7)
- Service call validation with detailed error messages
- Attribute ordering: metadata first, descriptions next, service data last
- Dynamic icon mapping (database-export/database-alert)
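The attribute-ordering rule above (metadata first, descriptions next, service data last) can be sketched as a stand-alone helper. `build_ordered_attributes` and its inputs are illustrative names, not the integration's actual API:

```python
def build_ordered_attributes(dynamic_attrs: dict, descriptions: dict) -> dict:
    """Order chart_data_export attributes: metadata, then descriptions, then service data."""
    metadata = {k: v for k, v in dynamic_attrs.items() if k in ("timestamp", "error")}
    service_data = {k: v for k, v in dynamic_attrs.items() if k not in ("timestamp", "error")}
    ordered: dict = {}
    ordered.update(metadata)      # e.g. timestamp, error
    ordered.update(descriptions)  # description, long_description, usage_tips
    ordered.update(service_data)  # raw get_chartdata response goes last
    return ordered


attrs = build_ordered_attributes(
    {"timestamp": "2025-11-17T03:14:02+00:00", "data": [1, 2, 3]},
    {"description": "Data export for dashboard integrations"},
)
print(list(attrs))  # dicts preserve insertion order in Python 3.7+
```

Because plain dicts preserve insertion order, the three `update` calls are enough to guarantee the order dashboard cards see.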

Translations:
- Added chart_data_export_config to all 5 languages
- Added Step 7 descriptions with legacy warning
- Added invalid_yaml_syntax/invalid_yaml_structure error messages
- Added custom_translations for sensor descriptions

Documentation:
- Added Chart Data Export section to sensors.md
- Added comprehensive service guide to services.md
- Migration path from sensor to service
- Configuration instructions via Options Flow

Impact: Provides backward compatibility for dashboard tools that can
only read entity attributes (e.g., older ApexCharts versions). New
integrations should use tibber_prices.get_chartdata service directly.
Commit 38ce1c4c50 by Julian Pawlowski, 2025-11-17 03:14:02 +00:00 (parent 0bf810f0d5).
20 changed files with 1049 additions and 89 deletions


@ -274,7 +274,7 @@ def build_final_attributes_simple(
}
async def build_async_extra_state_attributes( # noqa: PLR0913
async def build_async_extra_state_attributes( # noqa: PLR0913, PLR0912
entity_key: str,
translation_key: str | None,
hass: HomeAssistant,
@ -302,11 +302,31 @@ async def build_async_extra_state_attributes( # noqa: PLR0913
"""
attributes = {}
# Add dynamic attributes if available
# For chart_data_export: Add metadata first, descriptions next, service data last
# For other sensors: Follow normal order (dynamic_attrs first, then descriptions)
is_chart_export = entity_key == "chart_data_export"
# Extract metadata and service data for chart_data_export
chart_metadata = {}
chart_service_data = {}
if is_chart_export and dynamic_attrs:
# Separate metadata (timestamp, error) from service data
for key, value in dynamic_attrs.items():
if key in ("timestamp", "error"):
chart_metadata[key] = value
else:
chart_service_data[key] = value
# Add dynamic attributes in correct order
if dynamic_attrs:
# Copy and remove internal fields before exposing to user
clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
attributes.update(clean_attrs)
if is_chart_export:
# For chart_data_export: Start with metadata only
attributes.update(chart_metadata)
else:
# For other sensors: Add all dynamic attributes first
# Copy and remove internal fields before exposing to user
clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
attributes.update(clean_attrs)
# Add icon_color for best/peak price period sensors using shared utility
add_icon_color_attribute(attributes, entity_key, is_on=is_on)
@ -357,10 +377,14 @@ async def build_async_extra_state_attributes( # noqa: PLR0913
if usage_tips:
attributes["usage_tips"] = usage_tips
# For chart_data_export: Add service data at the END (after descriptions)
if is_chart_export and chart_service_data:
attributes.update(chart_service_data)
return attributes if attributes else None
def build_sync_extra_state_attributes( # noqa: PLR0913
def build_sync_extra_state_attributes( # noqa: PLR0913, PLR0912
entity_key: str,
translation_key: str | None,
hass: HomeAssistant,
@ -388,11 +412,31 @@ def build_sync_extra_state_attributes( # noqa: PLR0913
"""
attributes = {}
# Add dynamic attributes if available
# For chart_data_export: Add metadata first, descriptions next, service data last
# For other sensors: Follow normal order (dynamic_attrs first, then descriptions)
is_chart_export = entity_key == "chart_data_export"
# Extract metadata and service data for chart_data_export
chart_metadata = {}
chart_service_data = {}
if is_chart_export and dynamic_attrs:
# Separate metadata (timestamp, error) from service data
for key, value in dynamic_attrs.items():
if key in ("timestamp", "error"):
chart_metadata[key] = value
else:
chart_service_data[key] = value
# Add dynamic attributes in correct order
if dynamic_attrs:
# Copy and remove internal fields before exposing to user
clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
attributes.update(clean_attrs)
if is_chart_export:
# For chart_data_export: Start with metadata only
attributes.update(chart_metadata)
else:
# For other sensors: Add all dynamic attributes first
# Copy and remove internal fields before exposing to user
clean_attrs = {k: v for k, v in dynamic_attrs.items() if not k.startswith("_")}
attributes.update(clean_attrs)
# Add icon_color for best/peak price period sensors using shared utility
add_icon_color_attribute(attributes, entity_key, is_on=is_on)
@ -440,4 +484,8 @@ def build_sync_extra_state_attributes( # noqa: PLR0913
if usage_tips:
attributes["usage_tips"] = usage_tips
# For chart_data_export: Add service data at the END (after descriptions)
if is_chart_export and chart_service_data:
attributes.update(chart_service_data)
return attributes if attributes else None


@ -5,6 +5,9 @@ from __future__ import annotations
from datetime import timedelta
from typing import TYPE_CHECKING
import yaml
from custom_components.tibber_prices.const import CONF_CHART_DATA_CONFIG, DOMAIN
from custom_components.tibber_prices.coordinator import TIME_SENSITIVE_ENTITY_KEYS
from custom_components.tibber_prices.entity import TibberPricesEntity
from custom_components.tibber_prices.entity_utils import get_binary_sensor_icon
@ -49,6 +52,8 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
self._state_getter: Callable | None = self._get_state_getter()
self._attribute_getter: Callable | None = self._get_attribute_getter()
self._time_sensitive_remove_listener: Callable | None = None
self._chart_data_last_update = None # Track last service call timestamp
self._chart_data_error = None # Track last service call error
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
@ -60,6 +65,10 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
self._handle_time_sensitive_update
)
# For chart_data_export, trigger initial service call
if self.entity_description.key == "chart_data_export":
await self._refresh_chart_data()
async def async_will_remove_from_hass(self) -> None:
"""When entity will be removed from hass."""
await super().async_will_remove_from_hass()
@ -85,6 +94,7 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
"tomorrow_data_available": self._tomorrow_data_available_state,
"has_ventilation_system": self._has_ventilation_system_state,
"realtime_consumption_enabled": self._realtime_consumption_enabled_state,
"chart_data_export": self._chart_data_export_state,
}
return state_getters.get(key)
@ -174,6 +184,79 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
value = features.get("realTimeConsumptionEnabled")
return value if isinstance(value, bool) else None
def _chart_data_export_state(self) -> bool | None:
"""Return True if chart data export was successful."""
if not self.coordinator.data:
return None
# Try to fetch chart data - state is ON if successful
# Note: This is called in property context, so we can't use async
# We'll check if data was cached from last async call
chart_data = self._get_cached_chart_data()
return chart_data is not None
def _get_cached_chart_data(self) -> dict | None:
"""Get cached chart data from last service call."""
# Store service response in instance variable for reuse
if not hasattr(self, "_chart_data_cache"):
self._chart_data_cache = None
return self._chart_data_cache
async def _call_chartdata_service_async(self) -> dict | None:
"""Call get_chartdata service with user-configured YAML (async)."""
# Get user-configured YAML
yaml_config = self.coordinator.config_entry.options.get(CONF_CHART_DATA_CONFIG, "")
# Parse YAML if provided, otherwise use empty dict (service defaults)
service_params = {}
if yaml_config and yaml_config.strip():
try:
parsed = yaml.safe_load(yaml_config)
# Ensure we have a dict (yaml.safe_load can return str, int, etc.)
if isinstance(parsed, dict):
service_params = parsed
else:
self.coordinator.logger.warning(
"YAML configuration must be a dictionary, got %s. Using service defaults.",
type(parsed).__name__,
extra={"entity": self.entity_description.key},
)
service_params = {}
except yaml.YAMLError as err:
self.coordinator.logger.warning(
"Invalid chart data YAML configuration: %s. Using service defaults.",
err,
extra={"entity": self.entity_description.key},
)
service_params = {} # Fall back to service defaults
# Add required entry_id parameter
service_params["entry_id"] = self.coordinator.config_entry.entry_id
# Call get_chartdata service using official HA service system
try:
response = await self.hass.services.async_call(
DOMAIN,
"get_chartdata",
service_params,
blocking=True,
return_response=True,
)
except Exception as ex:
self.coordinator.logger.exception(
"Chart data service call failed",
extra={"entity": self.entity_description.key},
)
self._chart_data_cache = None
self._chart_data_last_update = dt_util.now()
self._chart_data_error = str(ex)
return None
else:
self._chart_data_cache = response
self._chart_data_last_update = dt_util.now()
self._chart_data_error = None
return response
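The YAML-parsing fallback used in `_call_chartdata_service_async` can be isolated into a small helper for clarity; `parse_chart_config` is an illustrative name, not part of the integration:

```python
import yaml  # PyYAML, the parser Home Assistant itself ships with


def parse_chart_config(yaml_config: str) -> dict:
    """Return the user's YAML mapping, or {} (service defaults) on any problem."""
    if not yaml_config or not yaml_config.strip():
        return {}
    try:
        parsed = yaml.safe_load(yaml_config)
    except yaml.YAMLError:
        return {}  # invalid syntax -> fall back to service defaults
    return parsed if isinstance(parsed, dict) else {}  # non-mapping input -> defaults
```

Collapsing empty input, syntax errors, and non-mapping documents to the same empty dict means the caller only ever merges a mapping into the service parameters.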
def _get_tomorrow_data_available_attributes(self) -> dict | None:
"""Return attributes for tomorrow_data_available binary sensor."""
return get_tomorrow_data_available_attributes(self.coordinator.data)
@ -188,9 +271,81 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
return lambda: get_price_intervals_attributes(self.coordinator.data, reverse_sort=False)
if key == "tomorrow_data_available":
return self._get_tomorrow_data_available_attributes
if key == "chart_data_export":
return self._get_chart_data_export_attributes
return None
def _get_chart_data_export_attributes(self) -> dict[str, object] | None:
"""
Return chart data from service call as attributes with metadata.
Strategy to avoid attribute name collisions:
- If service returns dict with SINGLE top-level key use directly
- If service returns dict with MULTIPLE top-level keys wrap in {"data": {...}}
- If service returns array/primitive wrap in {"data": <response>}
Attribute order: timestamp, error (if any), descriptions, service data (at the end).
"""
chart_data = self._get_cached_chart_data()
# Build base attributes with metadata
# timestamp = when service was last called (not current interval)
attributes: dict[str, object] = {
"timestamp": self._chart_data_last_update.isoformat() if self._chart_data_last_update else None,
}
# Add error message if service call failed
if self._chart_data_error:
attributes["error"] = self._chart_data_error
# Note: descriptions will be added by build_async_extra_state_attributes
# and will appear before service data because we return attributes first,
# then they get merged with descriptions, then service data is appended
if not chart_data:
# No data - only metadata (timestamp, error)
return attributes
# Service data goes at the END - append after metadata
# If response is a dict with multiple top-level keys, wrap it
# to avoid collision with our own attributes (timestamp, error, etc.)
if isinstance(chart_data, dict):
if len(chart_data) > 1:
# Multiple keys → wrap to prevent collision
attributes["data"] = chart_data
else:
# Single key → safe to merge directly
attributes.update(chart_data)
else:
# If response is array/list/primitive, wrap it in "data" key
attributes["data"] = chart_data
return attributes
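The collision-avoidance strategy described in the docstring above can be distilled into a minimal sketch (the function name is hypothetical):

```python
def wrap_chart_response(response: object) -> dict:
    """Merge a single-key dict directly; wrap anything else under a "data" key."""
    if isinstance(response, dict):
        if len(response) > 1:
            return {"data": response}  # multiple top-level keys -> wrap to avoid collisions
        return dict(response)          # single key -> safe to merge with metadata
    return {"data": response}          # list / primitive -> always wrap


merged = wrap_chart_response({"chart": [0.31, 0.29]})
wrapped = wrap_chart_response({"today": [], "tomorrow": []})
```

Wrapping multi-key responses keeps service keys like `timestamp` from shadowing the sensor's own metadata attributes.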
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
# Chart data export: No automatic refresh needed.
# Data only refreshes on:
# 1. Initial sensor activation (async_added_to_hass)
# 2. Config changes via Options Flow (triggers re-add)
# Hourly coordinator updates don't change the chart data content.
super()._handle_coordinator_update()
async def _refresh_chart_data(self) -> None:
"""
Refresh chart data by calling service.
Called only on:
- Initial sensor activation (async_added_to_hass)
- Config changes via Options Flow (triggers re-add async_added_to_hass)
NOT called on routine coordinator updates to avoid unnecessary service calls.
"""
await self._call_chartdata_service_async()
self.async_write_ha_state()
@property
def is_on(self) -> bool | None:
"""Return true if the binary_sensor is on."""
@ -262,6 +417,19 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
async def async_extra_state_attributes(self) -> dict | None:
"""Return additional state attributes asynchronously."""
try:
# For chart_data_export, use custom attribute builder with descriptions
if self.entity_description.key == "chart_data_export":
chart_attrs = self._get_chart_data_export_attributes()
# Add descriptions like other sensors
return await build_async_extra_state_attributes(
self.entity_description.key,
self.entity_description.translation_key,
self.hass,
config_entry=self.coordinator.config_entry,
dynamic_attrs=chart_attrs,
is_on=self.is_on,
)
# Get the dynamic attributes if the getter is available
if not self.coordinator.data:
return None
@ -294,7 +462,20 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
def extra_state_attributes(self) -> dict | None:
"""Return additional state attributes synchronously."""
try:
# Start with dynamic attributes if available
# For chart_data_export, use custom attribute builder with descriptions
if self.entity_description.key == "chart_data_export":
chart_attrs = self._get_chart_data_export_attributes()
# Add descriptions like other sensors
return build_sync_extra_state_attributes(
self.entity_description.key,
self.entity_description.translation_key,
self.hass,
config_entry=self.coordinator.config_entry,
dynamic_attrs=chart_attrs,
is_on=self.is_on,
)
# Get the dynamic attributes if the getter is available
if not self.coordinator.data:
return None
@ -324,4 +505,9 @@ class TibberPricesBinarySensor(TibberPricesEntity, BinarySensorEntity):
async def async_update(self) -> None:
"""Force a refresh when homeassistant.update_entity is called."""
# Always refresh coordinator data
await self.coordinator.async_request_refresh()
# For chart_data_export, also refresh the service call
if self.entity_description.key == "chart_data_export":
await self._refresh_chart_data()


@ -62,4 +62,13 @@ ENTITY_DESCRIPTIONS = (
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
),
BinarySensorEntityDescription(
key="chart_data_export",
translation_key="chart_data_export",
name="Chart Data Export",
icon="mdi:database-export",
device_class=BinarySensorDeviceClass.CONNECTIVITY, # ON = data available
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False, # Opt-in
),
)


@ -2,10 +2,14 @@
from __future__ import annotations
import logging
from typing import Any, ClassVar
import yaml
from custom_components.tibber_prices.config_flow_handlers.schemas import (
get_best_price_schema,
get_chart_data_export_schema,
get_options_init_schema,
get_peak_price_schema,
get_price_rating_schema,
@ -15,12 +19,14 @@ from custom_components.tibber_prices.config_flow_handlers.schemas import (
from custom_components.tibber_prices.const import DOMAIN
from homeassistant.config_entries import ConfigFlowResult, OptionsFlow
_LOGGER = logging.getLogger(__name__)
class TibberPricesOptionsFlowHandler(OptionsFlow):
"""Handle options for tibber_prices entries."""
# Step progress tracking
_TOTAL_STEPS: ClassVar[int] = 6
_TOTAL_STEPS: ClassVar[int] = 7
_STEP_INFO: ClassVar[dict[str, int]] = {
"init": 1,
"current_interval_price_rating": 2,
@ -28,6 +34,7 @@ class TibberPricesOptionsFlowHandler(OptionsFlow):
"best_price": 4,
"peak_price": 5,
"price_trend": 6,
"chart_data_export": 7,
}
def __init__(self) -> None:
@ -114,7 +121,7 @@ class TibberPricesOptionsFlowHandler(OptionsFlow):
"""Configure price trend thresholds."""
if user_input is not None:
self._options.update(user_input)
return self.async_create_entry(title="", data=self._options)
return await self.async_step_chart_data_export()
return self.async_show_form(
step_id="price_trend",
@ -122,6 +129,61 @@ class TibberPricesOptionsFlowHandler(OptionsFlow):
description_placeholders=self._get_step_description_placeholders("price_trend"),
)
async def async_step_chart_data_export(self, user_input: dict[str, Any] | None = None) -> ConfigFlowResult:
"""Configure chart data export sensor."""
errors: dict[str, str] = {}
if user_input is not None:
# Get YAML configuration (default to empty string if not provided)
yaml_config = user_input.get("chart_data_config", "")
parsed = None # Remains None if no YAML is provided or parsing fails
if yaml_config.strip(): # Only validate if not empty
try:
parsed = yaml.safe_load(yaml_config)
if parsed is not None and not isinstance(parsed, dict):
errors["base"] = "invalid_yaml_structure"
except yaml.YAMLError:
errors["base"] = "invalid_yaml_syntax"
# Test service call with parsed parameters
if not errors and parsed:
try:
# Add entry_id to service call data
service_data = {**parsed, "entry_id": self.config_entry.entry_id}
# Call the service to validate parameters
await self.hass.services.async_call(
domain="tibber_prices",
service="get_chartdata",
service_data=service_data,
blocking=True,
return_response=True,
)
except Exception as ex: # noqa: BLE001
# Set error with detailed message directly (no translation key)
error_msg = str(ex)
_LOGGER.warning(
"Service validation failed for chart_data_export: %s",
error_msg,
)
# Use field-level error to show detailed message
errors["chart_data_config"] = error_msg
if not errors:
# Explicitly store chart_data_config (including empty string to allow clearing)
self._options.update(user_input)
# Ensure the key exists even if empty
if "chart_data_config" not in user_input:
self._options["chart_data_config"] = ""
return self.async_create_entry(title="", data=self._options)
return self.async_show_form(
step_id="chart_data_export",
data_schema=get_chart_data_export_schema(self.config_entry.options),
description_placeholders=self._get_step_description_placeholders("chart_data_export"),
errors=errors,
)
async def async_step_volatility(self, user_input: dict[str, Any] | None = None) -> ConfigFlowResult:
"""Configure volatility thresholds and period filtering."""
if user_input is not None:


@ -16,6 +16,7 @@ from custom_components.tibber_prices.const import (
CONF_BEST_PRICE_MAX_LEVEL_GAP_COUNT,
CONF_BEST_PRICE_MIN_DISTANCE_FROM_AVG,
CONF_BEST_PRICE_MIN_PERIOD_LENGTH,
CONF_CHART_DATA_CONFIG,
CONF_ENABLE_MIN_PERIODS_BEST,
CONF_ENABLE_MIN_PERIODS_PEAK,
CONF_EXTENDED_DESCRIPTIONS,
@ -575,3 +576,26 @@ def get_price_trend_schema(options: Mapping[str, Any]) -> vol.Schema:
),
}
)
def get_chart_data_export_schema(options: Mapping[str, Any]) -> vol.Schema:
"""Return schema for chart data export configuration."""
default_yaml = options.get(
CONF_CHART_DATA_CONFIG,
# Default: Empty string - uses service defaults (today, interval resolution)
"",
)
return vol.Schema(
{
vol.Optional(
CONF_CHART_DATA_CONFIG,
description={"suggested_value": default_yaml},
): TextSelector(
TextSelectorConfig(
multiline=True,
type=TextSelectorType.TEXT,
)
),
}
)


@ -43,6 +43,7 @@ CONF_ENABLE_MIN_PERIODS_PEAK = "enable_min_periods_peak"
CONF_MIN_PERIODS_PEAK = "min_periods_peak"
CONF_RELAXATION_STEP_PEAK = "relaxation_step_peak"
CONF_RELAXATION_ATTEMPTS_PEAK = "relaxation_attempts_peak"
CONF_CHART_DATA_CONFIG = "chart_data_config" # YAML config for chart data export
ATTRIBUTION = "Data provided by Tibber"
@ -328,6 +329,10 @@ BINARY_SENSOR_ICON_MAPPING = {
"off": "mdi:shield-check", # Has future periods
"off_no_future": "mdi:sleep", # No future periods in next 6h
},
"chart_data_export": {
"on": "mdi:database-export", # Data available
"off": "mdi:database-alert", # Service call failed or no config
},
}
# Color mapping for binary sensors (CSS variables for theme compatibility)


@ -464,6 +464,11 @@
"description": "Ob die Echtzeit-Verbrauchsüberwachung aktiv ist",
"long_description": "Zeigt an, ob die Echtzeit-Stromverbrauchsüberwachung für dein Tibber-Zuhause aktiviert und aktiv ist. Dies erfordert kompatible Messhardware (z. B. Tibber Pulse) und ein aktives Abonnement.",
"usage_tips": "Verwende dies, um zu überprüfen, ob Echtzeit-Verbrauchsdaten verfügbar sind. Aktiviere Benachrichtigungen, falls dies unerwartet auf 'Aus' wechselt, was auf potenzielle Hardware- oder Verbindungsprobleme hinweist."
},
"chart_data_export": {
"description": "Datenexport für Dashboard-Integrationen",
"long_description": "Dieser Binärsensor ruft den get_chartdata-Service auf, um Daten für externe Dashboard-Integrationen wie ApexCharts bereitzustellen. Wird verwendet, um Preisdaten in benutzerdefinierten Visualisierungen anzuzeigen.",
"usage_tips": "Konfiguriere die YAML-Parameter in den Integrationsoptionen, um Datenquellen, Zeiträume und Aggregationsmethoden festzulegen. Verwende dies mit benutzerdefinierten Karten oder ApexCharts-Dashboards zur Visualisierung von Preistrends und Verbrauchsdaten."
}
},
"home_types": {


@ -464,6 +464,11 @@
"description": "Whether realtime consumption monitoring is active",
"long_description": "Indicates if realtime electricity consumption monitoring is enabled and active for your Tibber home. This requires compatible metering hardware (e.g., Tibber Pulse) and an active subscription.",
"usage_tips": "Use this to verify that realtime consumption data is available. Enable notifications if this changes to 'off' unexpectedly, indicating potential hardware or connectivity issues."
},
"chart_data_export": {
"description": "Data export for dashboard integrations",
"long_description": "This binary sensor calls the get_chartdata service with your configured YAML parameters and exposes the result as entity attributes. The state is 'on' when the service call succeeds and data is available, 'off' when the call fails or no configuration is set. Perfect for dashboard integrations like ApexCharts that need to read price data from entity attributes.",
"usage_tips": "Configure the YAML parameters in the integration options to match your get_chartdata service call. The sensor refreshes on startup and whenever you change its configuration via the Options Flow; call homeassistant.update_entity for an on-demand refresh. Access the service response data directly from the entity's attributes - the structure matches exactly what get_chartdata returns."
}
},
"home_types": {


@ -464,6 +464,11 @@
"description": "Om sanntidsforbruksovervåking er aktiv",
"long_description": "Indikerer om sanntidsovervåking av strømforbruk er aktivert og aktiv for ditt Tibber-hjem. Dette krever kompatibel målehardware (f.eks. Tibber Pulse) og et aktivt abonnement.",
"usage_tips": "Bruk dette for å bekrefte at sanntidsforbruksdata er tilgjengelig. Aktiver varsler hvis dette endres til 'av' uventet, noe som indikerer potensielle maskinvare- eller tilkoblingsproblemer."
},
"chart_data_export": {
"description": "Dataeksport for dashboardintegrasjoner",
"long_description": "Denne binærsensoren kaller get_chartdata-tjenesten for å eksportere prisdata i formater som er kompatible med ApexCharts og andre dashboardverktøy. Dataeksporten inkluderer historiske og fremtidsrettede prisdata strukturert for visualisering.",
"usage_tips": "Konfigurer YAML-parametrene i integrasjonsalternativene. Sensoren eksponerer tjenestesvaret som attributter for dashboardverktøy som ApexCharts; tilstanden er 'på' når tjenestekallet lykkes. Bruk homeassistant.update_entity for manuell oppdatering."
}
},
"home_types": {


@ -464,6 +464,11 @@
"description": "Of realtime verbruiksmonitoring actief is",
"long_description": "Geeft aan of realtime elektriciteitsverbruikmonitoring is ingeschakeld en actief voor je Tibber-woning. Dit vereist compatibele meethardware (bijv. Tibber Pulse) en een actief abonnement.",
"usage_tips": "Gebruik dit om te verifiëren dat realtimeverbruiksgegevens beschikbaar zijn. Schakel meldingen in als dit onverwacht verandert naar 'uit', wat wijst op mogelijke hardware- of verbindingsproblemen."
},
"chart_data_export": {
"description": "Gegevensexport voor dashboardintegraties",
"long_description": "Deze binaire sensor roept de get_chartdata-service aan om gegevens voor dashboard-widgets te exporteren. Ondersteunt ApexCharts en andere dashboardoplossingen die prijsgegevens willen visualiseren.",
"usage_tips": "Configureer de YAML-parameters in de integratieopties onder 'Geavanceerd'. Deze sensor biedt meestal geen praktische waarde in automatiseringen - hij dient hoofdzakelijk als servicecontainer voor dashboardgebruik. Raadpleeg de documentatie voor specifieke parameterformat."
}
},
"home_types": {


@ -464,6 +464,11 @@
"description": "Om realtidsförbrukningsövervakning är aktiv",
"long_description": "Indikerar om realtidsövervakning av elförbrukning är aktiverad och aktiv för ditt Tibber-hem. Detta kräver kompatibel mätutrustning (t.ex. Tibber Pulse) och en aktiv prenumeration.",
"usage_tips": "Använd detta för att verifiera att realtidsförbrukningen är tillgänglig. Aktivera meddelanden om detta oväntat ändras till 'av', vilket indikerar potentiella hårdvaru- eller anslutningsproblem."
},
"chart_data_export": {
"description": "Dataexport för instrumentpanelsintegrationer",
"long_description": "Denna binär sensor anropar tjänsten get_chartdata för att exportera prissensordata i format som är kompatibelt med ApexCharts och andra instrumentpanelskomponenter. Använd denna tillsammans med custom:apexcharts-card för att visa prissensorer på din instrumentpanel.",
"usage_tips": "Konfigurera YAML-parametrarna i integrationens alternativ under 'ApexCharts-datakonfiguration'. Tjänsten kräver en giltig sensorenhet och returnerar formaterad data för kartrendring. Se dokumentationen för tillgängliga parametrar och anpassningsalternativ."
}
},
"home_types": {


@ -100,7 +100,9 @@ CHARTDATA_SERVICE_SCHEMA: Final = vol.Schema(
),
vol.Optional("insert_nulls", default="none"): vol.In(["none", "segments", "all"]),
vol.Optional("add_trailing_null", default=False): bool,
vol.Optional("timestamp_field", default="start_time"): str,
vol.Optional("period_filter"): vol.In(["best_price", "peak_price"]),
vol.Optional("start_time_field", default="start_time"): str,
vol.Optional("end_time_field", default="end_time"): str,
vol.Optional("price_field", default="price_per_kwh"): str,
vol.Optional("level_field", default="level"): str,
vol.Optional("rating_level_field", default="rating_level"): str,
@ -118,9 +120,9 @@ REFRESH_USER_DATA_SERVICE_SCHEMA: Final = vol.Schema(
# --- Entry point: Service handler ---
def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912
def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912, PLR0915
intervals: list[dict],
timestamp_field: str,
start_time_field: str,
price_field: str,
*,
use_minor_currency: bool = False,
@ -136,6 +138,7 @@ def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912
day_average: float | None = None,
threshold_low: float = DEFAULT_PRICE_RATING_THRESHOLD_LOW,
threshold_high: float = DEFAULT_PRICE_RATING_THRESHOLD_HIGH,
period_timestamps: set[str] | None = None,
) -> list[dict]:
"""
Aggregate 15-minute intervals to exact hourly averages.
@ -175,6 +178,12 @@ def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912
if i + j < len(intervals):
interval = intervals[i + j]
# Apply period filter if specified (check startsAt timestamp)
if period_timestamps is not None:
interval_start = interval.get("startsAt")
if interval_start and interval_start not in period_timestamps:
continue
# Apply level filter if specified
if level_filter is not None and "level" in interval and interval["level"] not in level_filter:
continue
@ -203,7 +212,7 @@ def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912
if round_decimals is not None:
avg_price = round(avg_price, round_decimals)
data_point = {timestamp_field: start_time_str, price_field: avg_price}
data_point = {start_time_field: start_time_str, price_field: avg_price}
# Add aggregated level using same logic as sensors
if include_level and hour_interval_data:
@ -229,6 +238,165 @@ def _aggregate_hourly_exact( # noqa: PLR0913, PLR0912
return hourly_data
def _get_period_data( # noqa: PLR0913, PLR0912, PLR0915
*,
coordinator: Any,
period_filter: str,
days: list[str],
output_format: str,
minor_currency: bool,
round_decimals: int | None,
level_filter: list[str] | None,
rating_level_filter: list[str] | None,
include_level: bool,
include_rating_level: bool,
start_time_field: str,
end_time_field: str,
price_field: str,
level_field: str,
rating_level_field: str,
data_key: str,
add_trailing_null: bool,
) -> dict[str, Any]:
"""
Get period summary data instead of interval data.
When period_filter is specified, returns the precomputed period summaries
from the coordinator instead of filtering intervals.
Note: Period prices (price_avg) are stored in minor currency units (ct/øre).
They are converted to major currency unless minor_currency=True.
Args:
coordinator: Data coordinator with period summaries
period_filter: "best_price" or "peak_price"
days: List of days to include
output_format: "array_of_objects" or "array_of_arrays"
minor_currency: If False, convert prices from minor to major units
round_decimals: Optional decimal rounding
level_filter: Optional level filter
rating_level_filter: Optional rating level filter
include_level: Whether to include level field in output
include_rating_level: Whether to include rating_level field in output
start_time_field: Custom name for start time field
end_time_field: Custom name for end time field
price_field: Custom name for price field
level_field: Custom name for level field
rating_level_field: Custom name for rating_level field
data_key: Top-level key name in response
add_trailing_null: Whether to add trailing null point
Returns:
Dictionary with period data in requested format
"""
periods_data = coordinator.data.get("periods", {})
period_data = periods_data.get(period_filter)
if not period_data:
return {data_key: []}
period_summaries = period_data.get("periods", [])
if not period_summaries:
return {data_key: []}
chart_data = []
# Filter periods by day if requested
filtered_periods = []
if days:
# Build set of allowed dates
allowed_dates = set()
for day in days:
# Map day names to actual dates from coordinator
price_info = coordinator.data.get("priceInfo", {})
day_prices = price_info.get(day, [])
if day_prices:
# Extract date from first interval
first_interval = day_prices[0]
starts_at = first_interval.get("startsAt")
if starts_at:
dt = dt_util.parse_datetime(starts_at)
if dt:
dt = dt_util.as_local(dt)
allowed_dates.add(dt.date())
# Filter periods to those within allowed dates
for period in period_summaries:
start = period.get("start")
if start and start.date() in allowed_dates:
filtered_periods.append(period)
else:
filtered_periods = period_summaries
# Apply level and rating_level filters
for period in filtered_periods:
# Apply level filter (normalize to uppercase for comparison)
if level_filter and "level" in period and period["level"].upper() not in level_filter:
continue
# Apply rating_level filter (normalize to uppercase for comparison)
if (
rating_level_filter
and "rating_level" in period
and period["rating_level"].upper() not in rating_level_filter
):
continue
# Build data point based on output format
if output_format == "array_of_objects":
# Map period fields to custom field names
# Period has: start, end, level, rating_level, price_avg, price_min, price_max
data_point = {}
# Start time
data_point[start_time_field] = period["start"]
# End time
data_point[end_time_field] = period.get("end")
# Price (use price_avg from period, stored in minor units)
price_avg = period.get("price_avg", 0.0)
# Convert to major currency unless minor_currency=True
if not minor_currency:
price_avg = price_avg / 100
if round_decimals is not None:
price_avg = round(price_avg, round_decimals)
data_point[price_field] = price_avg
# Level (only if requested and present)
if include_level and "level" in period:
data_point[level_field] = period["level"].upper()
# Rating level (only if requested and present)
if include_rating_level and "rating_level" in period:
data_point[rating_level_field] = period["rating_level"].upper()
chart_data.append(data_point)
else: # array_of_arrays
# For array_of_arrays, include: [start, price_avg]
price_avg = period.get("price_avg", 0.0)
# Convert to major currency unless minor_currency=True
if not minor_currency:
price_avg = price_avg / 100
if round_decimals is not None:
price_avg = round(price_avg, round_decimals)
chart_data.append([period["start"], price_avg])
# Add trailing null point if requested
if add_trailing_null and chart_data:
if output_format == "array_of_objects":
null_point = {start_time_field: None, end_time_field: None}
for field in [price_field, level_field, rating_level_field]:
null_point[field] = None
chart_data.append(null_point)
else: # array_of_arrays
chart_data.append([None, None])
return {data_key: chart_data}
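The period-to-chart-point mapping above (minor-unit conversion, optional rounding, two output shapes) can be exercised standalone. This is a hedged sketch, not the integration's function: field names are fixed here for brevity, whereas `_get_period_data` takes them as parameters.

```python
def to_chart_points(periods, *, minor_currency=False, round_decimals=None,
                    output_format="array_of_objects"):
    """Sketch of the period -> chart-point mapping (illustrative field names)."""
    points = []
    for p in periods:
        price = p.get("price_avg", 0.0)
        if not minor_currency:
            price = price / 100  # prices stored in minor units (cents/øre)
        if round_decimals is not None:
            price = round(price, round_decimals)
        if output_format == "array_of_objects":
            points.append({"start_time": p["start"], "end_time": p.get("end"),
                           "price": price})
        else:  # array_of_arrays
            points.append([p["start"], price])
    return points

periods = [{"start": "2025-11-17T08:00:00+01:00",
            "end": "2025-11-17T10:00:00+01:00", "price_avg": 2345.0}]
objects = to_chart_points(periods, round_decimals=2)
arrays = to_chart_points(periods, output_format="array_of_arrays")
# objects[0]["price"] == 23.45; arrays[0] == ["2025-11-17T08:00:00+01:00", 23.45]
```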
async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912, PLR0915, C901
"""Return price data in a simple chart-friendly format similar to Tibber Core integration."""
hass = call.hass
@ -247,7 +415,8 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
else:
days = days_raw
timestamp_field = call.data.get("timestamp_field", "start_time")
start_time_field = call.data.get("start_time_field", "start_time")
end_time_field = call.data.get("end_time_field", "end_time")
price_field = call.data.get("price_field", "price_per_kwh")
level_field = call.data.get("level_field", "level")
rating_level_field = call.data.get("rating_level_field", "rating_level")
@ -262,6 +431,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
include_average = call.data.get("include_average", False)
insert_nulls = call.data.get("insert_nulls", "none")
add_trailing_null = call.data.get("add_trailing_null", False)
period_filter = call.data.get("period_filter")
# Filter values are already normalized to uppercase by schema validators
level_filter = call.data.get("level_filter")
rating_level_filter = call.data.get("rating_level_filter")
@ -286,10 +456,55 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
CONF_PRICE_RATING_THRESHOLD_HIGH, DEFAULT_PRICE_RATING_THRESHOLD_HIGH
)
# === SPECIAL HANDLING: Period Filter ===
# When period_filter is set, return period summaries instead of interval data
# Period summaries are already complete objects with aggregated data
if period_filter:
return _get_period_data(
coordinator=coordinator,
period_filter=period_filter,
days=days,
output_format=output_format,
minor_currency=minor_currency,
round_decimals=round_decimals,
level_filter=level_filter,
rating_level_filter=rating_level_filter,
include_level=include_level,
include_rating_level=include_rating_level,
start_time_field=start_time_field,
end_time_field=end_time_field,
price_field=price_field,
level_field=level_field,
rating_level_field=rating_level_field,
data_key=data_key,
add_trailing_null=add_trailing_null,
)
# === NORMAL HANDLING: Interval Data ===
# Get price data for all requested days
price_info = coordinator.data.get("priceInfo", {})
chart_data = []
# Build set of timestamps that match period_filter if specified
period_timestamps = None
if period_filter:
period_timestamps = set()
periods_data = coordinator.data.get("periods", {})
period_data = periods_data.get(period_filter)
if period_data:
period_summaries = period_data.get("periods", [])
# Period summaries don't contain intervals, only start/end timestamps
# Build set of all 15-minute intervals within period ranges
for period_summary in period_summaries:
start = period_summary.get("start")
end = period_summary.get("end")
if start and end:
# Generate all 15-minute timestamps within this period
current = start
while current < end:
period_timestamps.add(current.isoformat())
current = current + timedelta(minutes=15)
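The expansion of a period's `[start, end)` range into 15-minute timestamps is plain datetime arithmetic; a standalone sketch without Home Assistant dependencies (`expand_quarter_hours` is an illustrative name):

```python
from datetime import datetime, timedelta

def expand_quarter_hours(start, end):
    """Generate ISO timestamps for every 15-minute step in [start, end) (sketch)."""
    stamps = set()
    current = start
    while current < end:
        stamps.add(current.isoformat())
        current += timedelta(minutes=15)
    return stamps

stamps = expand_quarter_hours(datetime(2025, 11, 17, 8, 0),
                              datetime(2025, 11, 17, 9, 0))
# four intervals: 08:00, 08:15, 08:30, 08:45 (end is exclusive)
```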
# Collect all timestamps if insert_nulls='all' (needed to insert NULLs for missing filter matches)
all_timestamps = set()
if insert_nulls == "all" and (level_filter or rating_level_filter):
@ -357,7 +572,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
if round_decimals is not None:
price = round(price, round_decimals)
data_point = {timestamp_field: start_time, price_field: price}
data_point = {start_time_field: start_time, price_field: price}
# Add level if requested (only when price is not NULL)
if include_level and "level" in interval and price is not None:
@ -400,7 +615,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
converted_price = round(converted_price, round_decimals)
# Add current point
data_point = {timestamp_field: start_time, price_field: converted_price}
data_point = {start_time_field: start_time, price_field: converted_price}
if include_level and "level" in interval:
data_point[level_field] = interval["level"]
@ -414,7 +629,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
# Check if next interval is different level (segment boundary)
if next_value != interval_value:
# Hold current price until next timestamp (stepline effect)
hold_point = {timestamp_field: next_start_time, price_field: converted_price}
hold_point = {start_time_field: next_start_time, price_field: converted_price}
if include_level and "level" in interval:
hold_point[level_field] = interval["level"]
if include_rating_level and "rating_level" in interval:
@ -424,7 +639,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
chart_data.append(hold_point)
# Add NULL point to create gap
null_point = {timestamp_field: next_start_time, price_field: None}
null_point = {start_time_field: next_start_time, price_field: None}
chart_data.append(null_point)
# Handle last interval of the day - extend to midnight
@ -478,7 +693,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
converted_price = round(converted_price, round_decimals)
# Add point at midnight with appropriate price (extends graph to end of day)
end_point = {timestamp_field: midnight_timestamp, price_field: converted_price}
end_point = {start_time_field: midnight_timestamp, price_field: converted_price}
if midnight_interval is not None:
if include_level and "level" in midnight_interval:
end_point[level_field] = midnight_interval["level"]
@ -494,6 +709,14 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
price = interval.get("total")
if start_time is not None and price is not None:
# Apply period filter if specified
if (
period_filter is not None
and period_timestamps is not None
and start_time not in period_timestamps
):
continue
# Apply level filter if specified
if level_filter is not None and "level" in interval and interval["level"] not in level_filter:
continue
@ -513,7 +736,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
if round_decimals is not None:
price = round(price, round_decimals)
data_point = {timestamp_field: start_time, price_field: price}
data_point = {start_time_field: start_time, price_field: price}
# Add level if requested
if include_level and "level" in interval:
@ -534,7 +757,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
chart_data.extend(
_aggregate_hourly_exact(
day_prices,
timestamp_field,
start_time_field,
price_field,
use_minor_currency=minor_currency,
round_decimals=round_decimals,
@ -548,6 +771,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
average_field=average_field,
day_average=day_averages.get(day),
threshold_low=threshold_low,
period_timestamps=period_timestamps,
threshold_high=threshold_high,
)
)
@ -558,7 +782,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
# Default: only timestamp and price
if not array_fields_template:
array_fields_template = f"{{{timestamp_field}}}, {{{price_field}}}"
array_fields_template = f"{{{start_time_field}}}, {{{price_field}}}"
# Parse template to extract field names
field_pattern = re.compile(r"\{([^}]+)\}")
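The `array_fields_template` grammar is just `{name}` placeholders; the regex pulls out the field names in template order, which can be checked in isolation:

```python
import re

# Same placeholder pattern as used for array_fields_template parsing
field_pattern = re.compile(r"\{([^}]+)\}")

template = "{start_time}, {price_per_kwh}, {level}"
fields = field_pattern.findall(template)
# → ['start_time', 'price_per_kwh', 'level']
```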
@ -593,7 +817,7 @@ async def _get_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR0912,
if add_trailing_null and chart_data:
# Create a null point with only timestamp from last item, all other fields as None
last_item = chart_data[-1]
null_point = {timestamp_field: last_item.get(timestamp_field)}
null_point = {start_time_field: last_item.get(start_time_field)}
# Set all other potential fields to None
for field in [price_field, level_field, rating_level_field, average_field]:
if field in last_item:
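The trailing-null construction above keeps the last timestamp and nulls only fields that the last data point actually carried. A standalone sketch of that guard (function and sample data are illustrative):

```python
def make_trailing_null(last_item, time_key, other_keys):
    """Build a null point: keep the timestamp, null out present fields (sketch)."""
    null_point = {time_key: last_item.get(time_key)}
    for key in other_keys:
        if key in last_item:  # don't invent fields the data never had
            null_point[key] = None
    return null_point

last = {"start_time": "2025-11-17T23:45:00", "price_per_kwh": 0.31, "level": "NORMAL"}
result = make_trailing_null(last, "start_time",
                            ["price_per_kwh", "level", "rating_level"])
# rating_level is absent from the last point, so it is not added
```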


@ -42,7 +42,7 @@ get_chartdata:
description: >-
Returns price data in a simple chart-friendly format compatible with the Tibber Core integration output structure. Perfect for use with popular chart cards like ha-price-timeline-card, ApexCharts Card, Plotly Graph Card, Mini Graph Card, or the built-in History Graph Card. Field names and data structure can be customized to match your specific chart requirements.
fields:
# === Basic Settings (for all formats) ===
# === REQUIRED ===
entry_id:
name: Entry ID
description: The config entry ID for the Tibber integration.
@ -51,6 +51,7 @@ get_chartdata:
selector:
config_entry:
integration: tibber_prices
# === DATA SELECTION ===
day:
name: Day
description: Which day(s) to fetch prices for. You can select multiple days. If not specified, returns all available data (today + tomorrow if available).
@ -76,27 +77,7 @@ get_chartdata:
- interval
- hourly
translation_key: resolution
minor_currency:
name: Minor Currency
description: >-
Return prices in minor currency units (cents for EUR, øre for NOK/SEK) instead of major currency units. Disabled by default.
required: false
default: false
example: true
selector:
boolean:
round_decimals:
name: Round Decimals
description: >-
Number of decimal places to round prices to (0-10). If not specified, uses default precision (4 decimals for major currency, 2 for minor currency).
required: false
example: 2
selector:
number:
min: 0
max: 10
mode: box
# === Filters (for all formats) ===
# === FILTERS ===
level_filter:
name: Level Filter
description: >-
@ -125,7 +106,18 @@ get_chartdata:
- high
multiple: true
translation_key: rating_level_filter
# === NULL Insertion ===
period_filter:
name: Period Filter
description: >-
Filter intervals to include only those within Best Price or Peak Price periods. Options: 'best_price' (only intervals in Best Price periods), 'peak_price' (only intervals in Peak Price periods). If not specified, all intervals are included. This uses the precomputed period data from binary sensors (binary_sensor.best_price_period / binary_sensor.peak_price_period).
required: false
selector:
select:
options:
- best_price
- peak_price
translation_key: period_filter
# === FILTER BEHAVIOR ===
insert_nulls:
name: Insert NULL Values
description: >-
@ -139,7 +131,7 @@ get_chartdata:
- segments
- all
translation_key: insert_nulls
# === Output Format Selection ===
# === OUTPUT FORMAT ===
output_format:
name: Output Format
description: >-
@ -153,7 +145,28 @@ get_chartdata:
- array_of_objects
- array_of_arrays
translation_key: output_format
# === For array_of_arrays format ===
# === CURRENCY & PRECISION ===
minor_currency:
name: Minor Currency
description: >-
Return prices in minor currency units (cents for EUR, øre for NOK/SEK) instead of major currency units. Disabled by default.
required: false
default: false
example: true
selector:
boolean:
round_decimals:
name: Round Decimals
description: >-
Number of decimal places to round prices to (0-10). If not specified, uses default precision (4 decimals for major currency, 2 for minor currency).
required: false
example: 2
selector:
number:
min: 0
max: 10
mode: box
# === ARRAY OF ARRAYS OPTIONS ===
add_trailing_null:
name: Add Trailing Null Point
description: >-
@ -169,7 +182,7 @@ get_chartdata:
required: false
selector:
text:
# === For array_of_objects format - Include Fields ===
# === ARRAY OF OBJECTS OPTIONS ===
include_level:
name: Include Level (array_of_objects only)
description: >-
@ -196,15 +209,23 @@ get_chartdata:
default: false
selector:
boolean:
# === For array_of_objects format - Customize Field Names ===
timestamp_field:
name: Timestamp Field Name (array_of_objects only)
# === ARRAY OF OBJECTS - FIELD NAMES ===
start_time_field:
name: Start Time Field Name (array_of_objects only)
description: >-
[ONLY FOR array_of_objects FORMAT] Custom name for the timestamp field in the output. Defaults to "start_time" if not specified.
[ONLY FOR array_of_objects FORMAT] Custom name for the start time field in the output. Defaults to "start_time" if not specified.
required: false
example: time
selector:
text:
end_time_field:
name: End Time Field Name (array_of_objects only)
description: >-
[ONLY FOR array_of_objects FORMAT] Custom name for the end time field in the output. Defaults to "end_time" if not specified. Only used with period_filter.
required: false
example: end
selector:
text:
price_field:
name: Price Field Name (array_of_objects only)
description: >-


@ -39,7 +39,10 @@
"unknown": "Ein unerwarteter Fehler ist aufgetreten. Bitte überprüfe die Logs für Details.",
"cannot_connect": "Verbindung fehlgeschlagen",
"invalid_access_token": "Ungültiges Zugriffstoken",
"missing_homes": "Der neue Zugriffstoken hat keinen Zugriff auf alle konfigurierten Zuhause. Bitte verwende einen Zugriffstoken, der Zugriff auf die gleichen Tibber-Zuhause hat."
"missing_homes": "Der neue Zugriffstoken hat keinen Zugriff auf alle konfigurierten Zuhause. Bitte verwende einen Zugriffstoken, der Zugriff auf die gleichen Tibber-Zuhause hat.",
"invalid_yaml_syntax": "Ungültige YAML-Syntax. Bitte prüfe Einrückung, Doppelpunkte und Sonderzeichen.",
"invalid_yaml_structure": "YAML muss ein Dictionary/Objekt sein (Schlüssel: Wert-Paare), keine Liste oder reiner Text.",
"service_call_failed": "Service-Aufruf-Validierung fehlgeschlagen: {error_detail}"
},
"abort": {
"already_configured": "Integration ist bereits konfiguriert",
@ -152,7 +155,7 @@
"price_trend_threshold_rising": "Steigender Schwellenwert (über dem aktuellen Preis)",
"price_trend_threshold_falling": "Fallender Schwellenwert (unter dem aktuellen Preis, negativer Wert)"
},
"submit": "Konfiguration abschließen"
"submit": "Weiter zu Schritt 7"
},
"volatility": {
"title": "Volatilität Schwellenwerte",
@ -163,6 +166,17 @@
"volatility_threshold_very_high": "Sehr hohe Schwelle (VK ≥ dieser %, z.B. 50)"
},
"submit": "Weiter zu Schritt 4"
},
"chart_data_export": {
"title": "Diagramm-Datenexport",
"description": "{step_progress}\n\n⚠ **Dieser Schritt ist optional:** Konfiguriere nur, wenn du einen diagnostischen Sensor für ältere Dashboard-Tools benötigst, die KEINE Services aufrufen können.\n\n⚠ **WICHTIG: Legacy-Feature**\nDieser Sensor existiert nur für Abwärtskompatibilität mit älteren Tools, die Daten nur aus Entity-Attributen lesen können.\n\n**Für neue Integrationen nutze stattdessen direkt den `tibber_prices.get_chartdata` Service** - er ist flexibler, effizienter und der empfohlene Home Assistant Ansatz.\n\n**Wann diesen Sensor nutzen:**\n✅ NUR wenn dein Dashboard-Tool NUR Attribute lesen kann (z.B. ältere Versionen)\n❌ NICHT für neue Integrationen - nutze den Service direkt\n\n**Was ist dieser Sensor?**\nEin diagnostischer Sensor, der den `tibber_prices.get_chartdata` Service mit deinen YAML-Parametern aufruft und das Ergebnis als Attribute bereitstellt.\n\n**Wie nutzen:**\n1. Öffne Settings → Devices & Services → Tibber Prices\n2. Wähle dein Home\n3. Finde 'Diagramm-Datenexport' im Diagnose-Bereich\n4. Aktiviere den Sensor (standardmäßig deaktiviert)\n5. Nutze `homeassistant.update_entity` um Daten manuell zu aktualisieren\n\n**Konfigurationsanleitung:**\n1. Teste deinen Service-Aufruf in Entwicklerwerkzeuge → Services → `tibber_prices.get_chartdata`\n2. Kopiere das YAML aus dem `data:` Bereich (ohne `entry_id`)\n3. Füge es unten ein (nutze Shift+Enter oder Strg+Enter für neue Zeilen)\n\n**Beispiel-YAML:**\n```yaml\nday:\n - today\n - tomorrow\ninclude_level: true\ninclude_rating_level: true\n```\n\n**Hinweis:** Leer lassen oder diesen Schritt überspringen, wenn du den Service direkt nutzt (empfohlen).",
"data": {
"chart_data_config": "Service-Konfiguration (YAML)"
},
"data_description": {
"chart_data_config": "YAML-Konfiguration für get_chartdata Service. Unterstützte Parameter: day (Liste aus yesterday/today/tomorrow), resolution (interval/hourly), output_format, include_level, include_rating_level, include_average, und viele mehr. Siehe Service-Dokumentation für vollständige Liste. Leer lassen für Standard-Einstellungen (today, interval resolution, array of objects)."
},
"submit": "Konfiguration abschließen"
}
},
"error": {
@ -654,6 +668,9 @@
},
"realtime_consumption_enabled": {
"name": "Echtzeitverbrauch aktiviert"
},
"chart_data_export": {
"name": "Diagramm-Datenexport"
}
}
},
@ -738,17 +755,25 @@
"name": "Preisbewertungs-Filter",
"description": "Intervalle filtern, um nur bestimmte Preisbewertungen einzuschließen (LOW, NORMAL, HIGH). Falls nicht angegeben, werden alle Bewertungen eingeschlossen."
},
"period_filter": {
"name": "Perioden-Filter",
"description": "Intervalle filtern, um nur solche innerhalb von Bestpreis- oder Spitzenpreis-Zeiträumen einzuschließen. Optionen: 'best_price' (nur Intervalle in Bestpreis-Zeiträumen), 'peak_price' (nur Intervalle in Spitzenpreis-Zeiträumen). Falls nicht angegeben, werden alle Intervalle eingeschlossen. Verwendet vorberechnete Periodendaten von Binär-Sensoren."
},
"insert_nulls": {
"name": "NULL-Werte einfügen",
"description": "Steuert das Einfügen von NULL-Werten für gefilterte Daten. 'none' (Standard): Keine NULL-Werte, nur passende Intervalle. 'segments': NULL-Punkte an Segmentgrenzen für saubere Lücken in Diagrammen hinzufügen (empfohlen für Stufenliniendiagramme). 'all': NULL für alle Zeitstempel einfügen, bei denen der Filter nicht übereinstimmt (nützlich für kontinuierliche Zeitreihenvisualisierung)."
},
"add_trailing_null": {
"name": "Abschließenden Null-Punkt hinzufügen",
"description": "[BEIDE FORMATE] Füge einen finalen Datenpunkt mit Null-Werten (außer Zeitstempel) am Ende hinzu. Manche Diagramm-Bibliotheken benötigen dies, um Extrapolation/Interpolation bis zum Rand des Anzeigebereichs bei Stufenliniendarstellung zu verhindern. Deaktiviert lassen, es sei denn, dein Diagramm erfordert es."
"description": "[BEIDE FORMATE] Füge einen finalen Datenpunkt mit Nullwerten (außer Zeitstempel) am Ende hinzu. Einige Diagrammbibliotheken benötigen dies, um Extrapolation/Interpolation zum Viewport-Rand bei Verwendung von Stufendarstellung zu verhindern. Deaktiviert lassen, es sei denn, dein Diagramm benötigt es."
},
"timestamp_field": {
"name": "Zeitstempel-Feldname (nur Array von Objekten)",
"description": "[NUR FÜR Array von Objekten FORMAT] Benutzerdefinierter Name für das Zeitstempel-Feld in der Ausgabe. Standard ist 'start_time', falls nicht angegeben."
"start_time_field": {
"name": "Startzeit-Feldname (nur Array von Objekten)",
"description": "[NUR FÜR Array von Objekten FORMAT] Benutzerdefinierter Name für das Startzeit-Feld in der Ausgabe. Standardmäßig 'start_time', wenn nicht angegeben."
},
"end_time_field": {
"name": "Endzeit-Feldname (nur Array von Objekten)",
"description": "[NUR FÜR Array von Objekten FORMAT] Benutzerdefinierter Name für das Endzeit-Feld in der Ausgabe. Standardmäßig 'end_time', wenn nicht angegeben. Nur verwendet mit period_filter."
},
"price_field": {
"name": "Preis-Feldname (nur Array von Objekten)",
@ -832,6 +857,12 @@
"all": "Alle"
}
},
"period_filter": {
"options": {
"best_price": "Bestpreis-Zeiträume",
"peak_price": "Spitzenpreis-Zeiträume"
}
},
"volatility": {
"options": {
"low": "Niedrig",


@ -39,7 +39,10 @@
"unknown": "Unexpected error",
"cannot_connect": "Failed to connect",
"invalid_access_token": "Invalid access token",
"missing_homes": "The new access token does not have access to all configured homes. Please use an access token that has access to the same Tibber homes."
"missing_homes": "The new access token does not have access to all configured homes. Please use an access token that has access to the same Tibber homes.",
"invalid_yaml_syntax": "Invalid YAML syntax. Please check indentation, colons, and special characters.",
"invalid_yaml_structure": "YAML must be a dictionary/object (key: value pairs), not a list or plain text.",
"service_call_failed": "Service call validation failed: {error_detail}"
},
"abort": {
"already_configured": "Integration is already configured",
@ -152,6 +155,17 @@
"price_trend_threshold_rising": "Rising Threshold (above current price)",
"price_trend_threshold_falling": "Falling Threshold (below current price, negative value)"
},
"submit": "Continue to Step 7"
},
"chart_data_export": {
"title": "Chart Data Export",
"description": "{step_progress}\n\n**⚠️ This step is optional** and only relevant if you plan to enable the **Chart Data Export** diagnostic sensor.\n\n**⚠️ IMPORTANT: Legacy Feature**\nThis sensor exists for **backwards compatibility** with tools that can only read entity attributes (e.g., older ApexCharts versions). **For new integrations, use the `tibber_prices.get_chartdata` service directly** - it's more flexible, efficient, and the recommended Home Assistant approach.\n\n**When to use this sensor:**\n- ✅ Your dashboard tool can ONLY read attributes (not call services)\n- ✅ You need static data that updates automatically\n- ❌ NOT recommended for new integrations that support service calls\n\n**What is this sensor?**\nThis sensor calls the `tibber_prices.get_chartdata` service with your custom YAML parameters and exposes the result as entity attributes.\n\n**How to use it:**\n1. **Complete this configuration** (or leave empty for service defaults)\n2. Go to **Settings → Devices & Services → Tibber Prices**\n3. Click on your home → Find **'Chart Data Export'** in the **Diagnostic** section\n4. Enable the sensor - it will automatically call the service with your configuration\n5. Use the sensor's attributes in your dashboard cards\n\n**Configuration guide:**\n1. Test your service call in **Developer Tools → Services → `tibber_prices.get_chartdata`**\n2. Copy the YAML from the `data:` section (without `entry_id`)\n3. Paste it below (use **Shift+Enter** or **Ctrl+Enter** for new lines, NOT just Enter)\n\n**Example YAML:**\n```yaml\nday:\n - today\n - tomorrow\ninclude_level: true\ninclude_rating_level: true\n```\n\n**Note:** Leave empty to use service defaults or skip this feature if you don't need the sensor.",
"data": {
"chart_data_config": "Service Configuration (YAML)"
},
"data_description": {
"chart_data_config": "YAML configuration for get_chartdata service. Supported parameters: day (list of yesterday/today/tomorrow), resolution (interval/hourly), output_format, include_level, include_rating_level, include_average, and many more. See service documentation for full list. Leave empty to use service defaults (today, interval resolution, array of objects)."
},
"submit": "Complete Configuration"
},
"volatility": {
@ -650,6 +664,9 @@
},
"realtime_consumption_enabled": {
"name": "Realtime Consumption Enabled"
},
"chart_data_export": {
"name": "Chart Data Export"
}
}
},
@ -734,6 +751,10 @@
"name": "Rating Level Filter",
"description": "Filter intervals to include only specific rating levels (LOW, NORMAL, HIGH). If not specified, all rating levels are included."
},
"period_filter": {
"name": "Period Filter",
"description": "Filter intervals to include only those within Best Price or Peak Price periods. Options: 'best_price' (only intervals in Best Price periods), 'peak_price' (only intervals in Peak Price periods). If not specified, all intervals are included. This uses the precomputed period data from binary sensors."
},
"insert_nulls": {
"name": "Insert NULL Values",
"description": "Control NULL value insertion for filtered data. 'none' (default): No NULL values, only matching intervals. 'segments': Add NULL points at segment boundaries for clean gaps in charts (recommended for stepline charts). 'all': Insert NULL for all timestamps where filter doesn't match (useful for continuous time series visualization)."
@ -742,9 +763,13 @@
"name": "Add Trailing Null Point",
"description": "[BOTH FORMATS] Add a final data point with null values (except timestamp) at the end. Some chart libraries need this to prevent extrapolation/interpolation to the viewport edge when using stepline rendering. Leave disabled unless your chart requires it."
},
"timestamp_field": {
"name": "Timestamp Field Name (Array of Objects only)",
"description": "[ONLY FOR Array of Objects FORMAT] Custom name for the timestamp field in the output. Defaults to 'start_time' if not specified."
"start_time_field": {
"name": "Start Time Field Name (Array of Objects only)",
"description": "[ONLY FOR Array of Objects FORMAT] Custom name for the start time field in the output. Defaults to 'start_time' if not specified."
},
"end_time_field": {
"name": "End Time Field Name (Array of Objects only)",
"description": "[ONLY FOR Array of Objects FORMAT] Custom name for the end time field in the output. Defaults to 'end_time' if not specified. Only used with period_filter."
},
"price_field": {
"name": "Price Field Name (Array of Objects only)",
@ -828,6 +853,12 @@
"all": "All"
}
},
"period_filter": {
"options": {
"best_price": "Best Price Periods",
"peak_price": "Peak Price Periods"
}
},
"volatility": {
"options": {
"low": "Low",


@ -39,7 +39,10 @@
"unknown": "Uventet feil",
"cannot_connect": "Kunne ikke koble til",
"invalid_access_token": "Ugyldig tilgangstoken",
"missing_homes": "Det nye tilgangstokenet har ikke tilgang til alle konfigurerte hjem. Vennligst bruk et tilgangstoken som har tilgang til de samme Tibber-hjemmene."
"missing_homes": "Det nye tilgangstokenet har ikke tilgang til alle konfigurerte hjem. Vennligst bruk et tilgangstoken som har tilgang til de samme Tibber-hjemmene.",
"invalid_yaml_syntax": "Ugyldig YAML-syntaks. Vennligst sjekk innrykk, kolon og spesialtegn.",
"invalid_yaml_structure": "YAML må være en ordbok/objekt (nøkkel: verdi-par), ikke en liste eller ren tekst.",
"service_call_failed": "Service-kall validering feilet: {error_detail}"
},
"abort": {
"already_configured": "Integrasjonen er allerede konfigurert",
@ -152,7 +155,7 @@
"price_trend_threshold_rising": "Stigende terskel (% over nåværende pris)",
"price_trend_threshold_falling": "Fallende terskel (% under nåværende pris, negativ verdi)"
},
"submit": "Fullfør konfigurasjon"
"submit": "Neste til trinn 7"
},
"volatility": {
"title": "Volatilitet Terskler",
@ -163,6 +166,17 @@
"volatility_threshold_very_high": "Veldig høy terskel (VK ≥ denne %, f.eks. 50)"
},
"submit": "Neste til steg 4"
},
"chart_data_export": {
"title": "Diagramdataeksport",
"description": "{step_progress}\n\n⚠ **Dette trinnet er valgfritt:** Konfigurer bare hvis du trenger en diagnostisk sensor for eldre dashboardverktøy som IKKE kan kalle tjenester.\n\n⚠ **VIKTIG: Legacy-funksjon**\nDenne sensoren finnes kun for bakoverkompatibilitet med eldre verktøy som bare kan lese data fra entitetsattributter.\n\n**For nye integrasjoner, bruk `tibber_prices.get_chartdata` tjenesten direkte** - den er mer fleksibel, effektiv og den anbefalte Home Assistant-tilnærmingen.\n\n**Når bruke denne sensoren:**\n✅ KUN hvis dashboardverktøyet ditt KUN kan lese attributter (f.eks. eldre versjoner)\n❌ IKKE for nye integrasjoner - bruk tjenesten direkte\n\n**Hva er denne sensoren?**\nEn diagnostisk sensor som kaller `tibber_prices.get_chartdata` tjenesten med dine YAML-parametere og gir resultatet som attributter.\n\n**Hvordan bruke:**\n1. Åpne Innstillinger → Enheter og tjenester → Tibber Prices\n2. Velg ditt hjem\n3. Finn 'Diagramdataeksport' i Diagnostikk-seksjonen\n4. Aktiver sensoren (deaktivert som standard)\n5. Bruk `homeassistant.update_entity` for å oppdatere data manuelt\n\n**Konfigurasjonsveiledning:**\n1. Test tjenestekallet ditt i Utviklerverktøy → Tjenester → `tibber_prices.get_chartdata`\n2. Kopier YAML fra `data:` seksjonen (uten `entry_id`)\n3. Lim det inn nedenfor (bruk Shift+Enter eller Ctrl+Enter for nye linjer)\n\n**Eksempel YAML:**\n```yaml\nday:\n - today\n - tomorrow\ninclude_level: true\ninclude_rating_level: true\n```\n\n**Merk:** La være tomt eller hopp over dette trinnet hvis du bruker tjenesten direkte (anbefalt).",
"data": {
"chart_data_config": "Tjenestekonfigurasjon (YAML)"
},
"data_description": {
"chart_data_config": "YAML-konfigurasjon for `get_chartdata` tjeneste. Støttede parametere: day (liste med yesterday/today/tomorrow), resolution (interval/hourly), output_format, include_level, include_rating_level, include_average, og mange flere. Se tjenestens dokumentasjon for full liste. La være tomt for å bruke standardinnstillinger (today, interval resolution, array of objects)."
},
"submit": "Fullfør konfigurasjon"
}
},
"error": {
@ -650,6 +664,9 @@
},
"realtime_consumption_enabled": {
"name": "Sanntidsforbruk aktivert"
},
"chart_data_export": {
"name": "Diagramdataeksport"
}
}
},
@ -734,17 +751,25 @@
"name": "Prisvurderingsfilter",
"description": "Filtrer intervaller for å inkludere bare spesifikke prisvurderinger (LOW, NORMAL, HIGH). Hvis ikke spesifisert, inkluderes alle vurderinger."
},
"period_filter": {
"name": "Periodefilter",
"description": "Filtrer intervaller for å inkludere kun de innenfor Beste pris- eller Topp pris-perioder. Alternativer: 'best_price' (kun intervaller i Beste pris-perioder), 'peak_price' (kun intervaller i Topp pris-perioder). Hvis ikke spesifisert, inkluderes alle intervaller. Bruker forhåndsberegnede periodedata fra binære sensorer."
},
"insert_nulls": {
"name": "Sett inn NULL-verdier",
"description": "Kontroller innsetting av NULL-verdier for filtrerte data. 'none' (standard): Ingen NULL-verdier, bare matchende intervaller. 'segments': Legg til NULL-punkter ved segmentgrenser for rene hull i diagrammer (anbefalt for trinnlinjediagrammer). 'all': Sett inn NULL for alle tidsstempler der filteret ikke samsvarer (nyttig for kontinuerlig tidsserievisualisering)."
},
"add_trailing_null": {
"name": "Legg til avsluttende null-punkt",
"description": "[BEGGE FORMATER] Legg til et siste datapunkt med null-verdier (unntatt tidsstempel) på slutten. Noen diagrambiblioteker trenger dette for å forhindre ekstrapolering/interpolering til kanten av visningsområdet ved stepline-gjengivelse. La være deaktivert med mindre diagrammet ditt krever det."
"description": "[BEGGE FORMATER] Legg til et siste datapunkt med nullverdier (unntatt tidsstempel) på slutten. Noen diagrambiblioteker trenger dette for å forhindre ekstrapolering/interpolering til visningsportens kant ved bruk av trinnlinje-rendering. La være deaktivert med mindre diagrammet ditt krever det."
},
"timestamp_field": {
"name": "Tidsstempel-feltnavn (kun Array av objekter)",
"description": "[KUN FOR Array av objekter FORMAT] Tilpasset navn for tidsstempel-feltet i utdata. Standard er 'start_time' hvis ikke angitt."
"start_time_field": {
"name": "Starttid-feltnavn (kun Array av objekter)",
"description": "[KUN FOR Array av objekter FORMAT] Egendefinert navn for starttid-feltet i utdata. Standard er 'start_time' hvis ikke angitt."
},
"end_time_field": {
"name": "Sluttid-feltnavn (kun Array av objekter)",
"description": "[KUN FOR Array av objekter FORMAT] Egendefinert navn for sluttid-feltet i utdata. Standard er 'end_time' hvis ikke angitt. Brukes kun med period_filter."
},
"price_field": {
"name": "Prisfelt-navn (kun Array av objekter)",
@ -828,6 +853,12 @@
"all": "Alle"
}
},
"period_filter": {
"options": {
"best_price": "Beste prisperioder",
"peak_price": "Topp prisperioder"
}
},
"volatility": {
"options": {
"low": "Lav",

View file

@ -39,7 +39,10 @@
"unknown": "Onverwachte fout",
"cannot_connect": "Verbinding mislukt",
"invalid_access_token": "Ongeldig toegangstoken",
"missing_homes": "Het nieuwe toegangstoken heeft geen toegang tot alle geconfigureerde huizen. Gebruik een toegangstoken dat toegang heeft tot dezelfde Tibber-huizen."
"missing_homes": "Het nieuwe toegangstoken heeft geen toegang tot alle geconfigureerde huizen. Gebruik een toegangstoken dat toegang heeft tot dezelfde Tibber-huizen.",
"invalid_yaml_syntax": "Ongeldige YAML-syntaxis. Controleer inspringen, dubbele punten en speciale tekens.",
"invalid_yaml_structure": "YAML moet een woordenboek/object zijn (sleutel: waarde-paren), geen lijst of platte tekst.",
"service_call_failed": "Service-aanroep validatie mislukt: {error_detail}"
},
"abort": {
"already_configured": "Integratie is al geconfigureerd",
@ -152,7 +155,7 @@
"price_trend_threshold_rising": "Stijgende drempel (% boven huidige prijs)",
"price_trend_threshold_falling": "Dalende drempel (% onder huidige prijs, negatieve waarde)"
},
"submit": "Configuratie voltooien"
"submit": "Verder naar stap 7"
},
"volatility": {
"title": "Volatiliteit Drempels",
@ -163,6 +166,17 @@
"volatility_threshold_very_high": "Zeer hoge drempel (VC ≥ deze %, bijv. 50)"
},
"submit": "Volgende naar stap 4"
},
"chart_data_export": {
"title": "Grafiek Data Export",
"description": "{step_progress}\n\n⚠ **Deze stap is optioneel:** Configureer alleen als je een diagnostische sensor nodig hebt voor oudere dashboardtools die GEEN services kunnen aanroepen.\n\n⚠ **BELANGRIJK: Legacy-functie**\nDeze sensor bestaat alleen voor achterwaartse compatibiliteit met oudere tools die alleen data kunnen lezen uit entiteitsattributen.\n\n**Voor nieuwe integraties, gebruik de `tibber_prices.get_chartdata` service direct** - deze is flexibeler, efficiënter en de aanbevolen Home Assistant-aanpak.\n\n**Wanneer deze sensor gebruiken:**\n✅ ALLEEN als jouw dashboardtool ALLEEN attributen kan lezen (bijv. oudere versies)\n❌ NIET voor nieuwe integraties - gebruik de service direct\n\n**Wat is deze sensor?**\nEen diagnostische sensor die de `tibber_prices.get_chartdata` service aanroept met jouw YAML-parameters en het resultaat beschikbaar stelt als attributen.\n\n**Hoe te gebruiken:**\n1. Open Instellingen → Apparaten en services → Tibber Prices\n2. Selecteer jouw woning\n3. Vind 'Grafiek Data Export' in de Diagnostiek-sectie\n4. Activeer de sensor (standaard uitgeschakeld)\n5. Gebruik `homeassistant.update_entity` om data handmatig bij te werken\n\n**Configuratiehandleiding:**\n1. Test je service-aanroep in Ontwikkelaarstools → Services → `tibber_prices.get_chartdata`\n2. Kopieer de YAML uit de `data:` sectie (zonder `entry_id`)\n3. Plak het hieronder (gebruik Shift+Enter of Ctrl+Enter voor nieuwe regels)\n\n**Voorbeeld YAML:**\n```yaml\nday:\n - today\n - tomorrow\ninclude_level: true\ninclude_rating_level: true\n```\n\n**Opmerking:** Laat leeg of sla deze stap over als je de service direct gebruikt (aanbevolen).",
"data": {
"chart_data_config": "Serviceconfiguratie (YAML)"
},
"data_description": {
"chart_data_config": "YAML-configuratie voor `get_chartdata` service. Ondersteunde parameters: day (lijst met yesterday/today/tomorrow), resolution (interval/hourly), output_format, include_level, include_rating_level, include_average, en nog veel meer. Zie servicedocumentatie voor volledige lijst. Laat leeg om standaardinstellingen te gebruiken (today, interval resolution, array of objects)."
},
"submit": "Configuratie voltooien"
}
},
"error": {
@ -650,6 +664,9 @@
},
"realtime_consumption_enabled": {
"name": "Realtime verbruik ingeschakeld"
},
"chart_data_export": {
"name": "Grafiek Data Export"
}
}
},
@ -734,6 +751,10 @@
"name": "Prijsbeoordeling Filter",
"description": "Filter intervallen om alleen specifieke prijsbeoordelingen op te nemen (LOW, NORMAL, HIGH). Als dit niet is opgegeven, worden alle beoordelingen opgenomen."
},
"period_filter": {
"name": "Periode Filter",
"description": "Filter intervallen om alleen die binnen Beste prijs- of Piek prijs-perioden op te nemen. Opties: 'best_price' (alleen intervallen in Beste prijs-perioden), 'peak_price' (alleen intervallen in Piek prijs-perioden). Als niet opgegeven, worden alle intervallen opgenomen. Gebruikt vooraf berekende periodegegevens van binaire sensoren."
},
"insert_nulls": {
"name": "NULL-waarden invoegen",
"description": "Beheer het invoegen van NULL-waarden voor gefilterde gegevens. 'none' (standaard): Geen NULL-waarden, alleen overeenkomende intervallen. 'segments': Voeg NULL-punten toe bij segmentgrenzen voor schone gaten in grafieken (aanbevolen voor traplijngrafieken). 'all': Voeg NULL in voor alle tijdstempels waarbij het filter niet overeenkomt (handig voor continue tijdreeksvisualisatie)."
@ -742,9 +763,13 @@
"name": "Voeg afsluitend null-punt toe",
"description": "[BEIDE FORMATEN] Voeg een laatste datapunt met null-waarden (behalve tijdstempel) toe aan het einde. Sommige diagrambibliotheken hebben dit nodig om extrapolatie/interpolatie naar de rand van het viewport te voorkomen bij stepline-weergave. Laat uitgeschakeld tenzij je diagram dit vereist."
},
"timestamp_field": {
"name": "Tijdstempel-veldnaam (alleen Array van objecten)",
"description": "[ALLEEN VOOR Array van objecten FORMAAT] Aangepaste naam voor het tijdstempel-veld in de uitvoer. Standaard is 'start_time' als niet opgegeven."
"start_time_field": {
"name": "Starttijd-veldnaam (alleen Array van objecten)",
"description": "[ALLEEN VOOR Array van objecten FORMAAT] Aangepaste naam voor het starttijd-veld in de uitvoer. Standaard is 'start_time' als niet opgegeven."
},
"end_time_field": {
"name": "Eindtijd-veldnaam (alleen Array van objecten)",
"description": "[ALLEEN VOOR Array van objecten FORMAAT] Aangepaste naam voor het eindtijd-veld in de uitvoer. Standaard is 'end_time' als niet opgegeven. Alleen gebruikt met period_filter."
},
"price_field": {
"name": "Prijsveld-naam (alleen Array van objecten)",
@ -828,6 +853,12 @@
"all": "Alle"
}
},
"period_filter": {
"options": {
"best_price": "Beste prijsperioden",
"peak_price": "Piek prijsperioden"
}
},
"volatility": {
"options": {
"low": "Laag",

View file

@ -39,7 +39,10 @@
"unknown": "Oväntat fel",
"cannot_connect": "Kunde inte ansluta",
"invalid_access_token": "Ogiltig åtkomsttoken",
"missing_homes": "Den nya åtkomsttoken har inte åtkomst till alla konfigurerade hem. Vänligen använd en åtkomsttoken som har åtkomst till samma Tibber-hem."
"missing_homes": "Den nya åtkomsttoken har inte åtkomst till alla konfigurerade hem. Vänligen använd en åtkomsttoken som har åtkomst till samma Tibber-hem.",
"invalid_yaml_syntax": "Ogiltig YAML-syntax. Kontrollera indrag, kolon och specialtecken.",
"invalid_yaml_structure": "YAML måste vara en ordbok/objekt (nyckel: värde-par), inte en lista eller ren text.",
"service_call_failed": "Service-anrop validering misslyckades: {error_detail}"
},
"abort": {
"already_configured": "Integrationen är redan konfigurerad",
@ -152,7 +155,7 @@
"price_trend_threshold_rising": "Stigande tröskel (% över aktuellt pris)",
"price_trend_threshold_falling": "Fallande tröskel (% under aktuellt pris, negativt värde)"
},
"submit": "Slutför konfigurationen"
"submit": "Nästa till steg 7"
},
"volatility": {
"title": "Volatilitet Trösklar",
@ -163,6 +166,17 @@
"volatility_threshold_very_high": "Mycket hög tröskel (VK ≥ detta %, t.ex. 50)"
},
"submit": "Nästa till steg 4"
},
"chart_data_export": {
"title": "Diagramdataexport",
"description": "{step_progress}\n\n⚠ **Detta steg är valfritt:** Konfigurera bara om du behöver en diagnostisk sensor för äldre instrumentpanelsverktyg som INTE kan anropa tjänster.\n\n⚠ **VIKTIGT: Legacy-funktion**\nDenna sensor finns endast för bakåtkompatibilitet med äldre verktyg som bara kan läsa data från entitetsattribut.\n\n**För nya integrationer, använd `tibber_prices.get_chartdata` tjänsten direkt** - den är mer flexibel, effektiv och den rekommenderade Home Assistant-metoden.\n\n**När använda denna sensor:**\n✅ ENDAST om ditt instrumentpanelsverktyg ENDAST kan läsa attribut (t.ex. äldre versioner)\n❌ INTE för nya integrationer - använd tjänsten direkt\n\n**Vad är denna sensor?**\nEn diagnostisk sensor som anropar `tibber_prices.get_chartdata` tjänsten med dina YAML-parametrar och tillhandahåller resultatet som attribut.\n\n**Hur använda:**\n1. Öppna Inställningar → Enheter och tjänster → Tibber Prices\n2. Välj ditt hem\n3. Hitta 'Diagramdataexport' i Diagnostik-sektionen\n4. Aktivera sensorn (inaktiverad som standard)\n5. Använd `homeassistant.update_entity` för att uppdatera data manuellt\n\n**Konfigurationsguide:**\n1. Testa ditt tjänsteanrop i Utvecklarverktyg → Tjänster → `tibber_prices.get_chartdata`\n2. Kopiera YAML från `data:` sektionen (utan `entry_id`)\n3. Klistra in det nedan (använd Shift+Enter eller Ctrl+Enter för nya rader)\n\n**Exempel YAML:**\n```yaml\nday:\n - today\n - tomorrow\ninclude_level: true\ninclude_rating_level: true\n```\n\n**Observera:** Lämna tomt eller hoppa över detta steg om du använder tjänsten direkt (rekommenderat).",
"data": {
"chart_data_config": "Tjänstkonfiguration (YAML)"
},
"data_description": {
"chart_data_config": "YAML-konfiguration för `get_chartdata` tjänst. Stödda parametrar: day (lista med yesterday/today/tomorrow), resolution (interval/hourly), output_format, include_level, include_rating_level, include_average, och många fler. Se tjänstdokumentation för fullständig lista. Lämna tomt för att använda standardinställningar (today, interval resolution, array of objects)."
},
"submit": "Slutför konfiguration"
}
},
"error": {
@ -650,6 +664,9 @@
},
"realtime_consumption_enabled": {
"name": "Realtidsförbrukning aktiverad"
},
"chart_data_export": {
"name": "Diagramdataexport"
}
}
},
@ -734,17 +751,25 @@
"name": "Prisvärderingsfilter",
"description": "Filtrera intervall för att endast inkludera specifika prisvärderingar (LOW, NORMAL, HIGH). Om inte angivet inkluderas alla värderingar."
},
"period_filter": {
"name": "Periodfilter",
"description": "Filtrera intervall för att endast inkludera de inom Bästa pris- eller Topp pris-perioder. Alternativ: 'best_price' (endast intervall i Bästa pris-perioder), 'peak_price' (endast intervall i Topp pris-perioder). Om inte angivet inkluderas alla intervall. Använder förberäknad perioddata från binära sensorer."
},
"insert_nulls": {
"name": "Infoga NULL-värden",
"description": "Kontrollera infogning av NULL-värden för filtrerad data. 'none' (standard): Inga NULL-värden, endast matchande intervall. 'segments': Lägg till NULL-punkter vid segmentgränser för rena luckor i diagram (rekommenderas för steglinjediagram). 'all': Infoga NULL för alla tidsstämplar där filtret inte matchar (användbart för kontinuerlig tidsserievisualisering)."
},
"add_trailing_null": {
"name": "Lägg till avslutande null-punkt",
"description": "[BÅDA FORMAT] Lägg till en sista datapunkt med null-värden (förutom tidsstämpel) i slutet. Vissa diagrambibliotek behöver detta för att förhindra extrapolering/interpolering till kanten av vyområdet vid stepline-rendering. Lämna inaktiverad om inte ditt diagram kräver det."
"description": "[BÅDA FORMATEN] Lägg till en sista datapunkt med nullvärden (utom tidsstämpel) i slutet. Vissa diagrambibliotek behöver detta för att förhindra extrapolering/interpolering till visningsportens kant vid användning av trappstegsrendering. Lämna inaktiverad om inte ditt diagram kräver det."
},
"timestamp_field": {
"name": "Tidsstämpel-fältnamn (endast Array av objekt)",
"description": "[ENDAST FÖR Array av objekt FORMAT] Anpassat namn för tidsstämpel-fältet i utdata. Standard är 'start_time' om inte angivet."
"start_time_field": {
"name": "Starttid-fältnamn (endast Array of Objects)",
"description": "[ENDAST FÖR Array of Objects FORMAT] Anpassat namn för starttid-fältet i utdata. Standard är 'start_time' om inte angivet."
},
"end_time_field": {
"name": "Sluttid-fältnamn (endast Array of Objects)",
"description": "[ENDAST FÖR Array of Objects FORMAT] Anpassat namn för sluttid-fältet i utdata. Standard är 'end_time' om inte angivet. Används endast med period_filter."
},
"price_field": {
"name": "Prisfältsnamn (endast Array av objekt)",
@ -828,6 +853,12 @@
"all": "Alla"
}
},
"period_filter": {
"options": {
"best_price": "Bästa prisperioder",
"peak_price": "Topp prisperioder"
}
},
"volatility": {
"options": {
"low": "Låg",

View file

@ -31,4 +31,73 @@ Coming soon...
## Diagnostic Sensors
Coming soon...
### Chart Data Export
**Entity ID:** `binary_sensor.tibber_home_NAME_chart_data_export`
**Default State:** Disabled (must be manually enabled)
> **⚠️ Legacy Feature**: This sensor is maintained for backward compatibility. For new integrations, use the **`tibber_prices.get_chartdata`** service instead, which offers more flexibility and better performance.
This diagnostic sensor provides cached chart-friendly price data that can be consumed by chart cards (ApexCharts, custom cards, etc.).
**Key Features:**
- **Configurable via Options Flow**: Service parameters can be configured through the integration's options menu (Step 7 of 7)
- **Automatic Updates**: Data refreshes on coordinator updates (every 15 minutes)
- **Attribute-Based Output**: Chart data is stored in sensor attributes for easy access
**Important Notes:**
- ⚠️ Disabled by default - must be manually enabled in entity settings
- ⚠️ Consider using the service instead for better control and flexibility
- ⚠️ Configuration updates require HA restart
**Attributes:**
The sensor exposes a single attribute containing the chart data in your configured format:
- **`data`** (or custom name): Array of price data points in the configured format
**Configuration:**
To configure the sensor's output format:
1. Go to **Settings → Devices & Services → Tibber Prices**
2. Click **Configure** on your Tibber home
3. Navigate through the options wizard to **Step 7: Chart Data Export Settings**
4. Configure output format, filters, field names, and other options
5. Save and restart Home Assistant
**Available Settings:**
See the `tibber_prices.get_chartdata` service documentation below for a complete list of available parameters. All service parameters can be configured through the options flow.
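As a starting point, a configuration pasted into Step 7 might look like the following (the parameters mirror those of the `get_chartdata` service; treat this as a sketch, not an exhaustive configuration):

```yaml
day:
  - today
  - tomorrow
include_level: true
include_rating_level: true
```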
**Example Usage:**
```yaml
# ApexCharts card consuming the sensor
type: custom:apexcharts-card
series:
- entity: binary_sensor.tibber_home_chart_data_export
data_generator: |
return entity.attributes.data;
```
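Because the sensor only refreshes automatically on coordinator updates, you can also trigger a manual refresh via `homeassistant.update_entity`. A minimal sketch (the entity ID is an example and depends on your home's name):

```yaml
service: homeassistant.update_entity
target:
  entity_id: binary_sensor.tibber_home_chart_data_export
```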
**Migration Path:**
If you're currently using this sensor, consider migrating to the service:
```yaml
# Old approach (sensor)
- service: apexcharts_card.update
data:
entity: binary_sensor.tibber_home_chart_data_export
# New approach (service)
- service: tibber_prices.get_chartdata
data:
entry_id: YOUR_ENTRY_ID
day: ["today", "tomorrow"]
output_format: array_of_objects
response_variable: chart_data
```

View file

@ -1,11 +1,143 @@
# Services
This integration provides several services for advanced price data access and manipulation.
## Available Services
### tibber_prices.get_chartdata
**Purpose:** Returns electricity price data in chart-friendly formats for visualization and analysis.
**Key Features:**
- **Flexible Output Formats**: Array of objects or array of arrays
- **Time Range Selection**: Filter by day (yesterday, today, tomorrow)
- **Price Filtering**: Filter by price level or rating
- **Period Support**: Return best/peak price period summaries instead of intervals
- **Resolution Control**: Interval (15-minute) or hourly aggregation
- **Customizable Field Names**: Rename output fields to match your chart library
- **Currency Control**: Major (EUR/NOK) or minor (ct/øre) units
**Basic Example:**
```yaml
service: tibber_prices.get_chartdata
data:
entry_id: YOUR_ENTRY_ID
day: ["today", "tomorrow"]
output_format: array_of_objects
response_variable: chart_data
```
**Response Format:**
```json
{
"data": [
{
"start_time": "2025-11-17T00:00:00+01:00",
"price_per_kwh": 0.2534
},
{
"start_time": "2025-11-17T00:15:00+01:00",
"price_per_kwh": 0.2498
}
]
}
```
**Common Parameters:**
| Parameter | Description | Default |
| ---------------- | ------------------------------------------- | ----------------------- |
| `entry_id` | Integration entry ID (required) | - |
| `day` | Days to include: yesterday, today, tomorrow | `["today", "tomorrow"]` |
| `output_format` | `array_of_objects` or `array_of_arrays` | `array_of_objects` |
| `resolution` | `interval` (15-min) or `hourly` | `interval` |
| `minor_currency` | Return prices in ct/øre instead of EUR/NOK | `false` |
| `round_decimals` | Decimal places (0-10) | 4 (major) or 2 (minor) |
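Combining the parameters above, a call returning hourly prices in minor currency units might look like this (a sketch assembled from the table; adjust to your setup):

```yaml
service: tibber_prices.get_chartdata
data:
  entry_id: YOUR_ENTRY_ID
  day: ["today"]
  resolution: hourly
  minor_currency: true
  round_decimals: 2
response_variable: chart_data
```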
**Period Filter Example:**
Get best price periods as summaries instead of intervals:
```yaml
service: tibber_prices.get_chartdata
data:
entry_id: YOUR_ENTRY_ID
period_filter: best_price # or peak_price
day: ["today", "tomorrow"]
include_level: true
include_rating_level: true
response_variable: periods
```
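With `period_filter`, each element describes a whole period rather than a single interval, so the output carries both a start and an end timestamp (field names follow the `start_time_field`/`end_time_field` defaults). An illustrative response shape, with placeholder values:

```json
{
  "data": [
    {
      "start_time": "2025-11-17T13:00:00+01:00",
      "end_time": "2025-11-17T15:00:00+01:00"
    }
  ]
}
```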
**Advanced Filtering:**
```yaml
service: tibber_prices.get_chartdata
data:
entry_id: YOUR_ENTRY_ID
level_filter: ["VERY_CHEAP", "CHEAP"] # Only cheap periods
rating_level_filter: ["LOW"] # Only low-rated prices
insert_nulls: segments # Add nulls at segment boundaries
```
**Complete Documentation:**
For detailed parameter descriptions, see the service definition in **Developer Tools → Services → tibber_prices.get_chartdata** or check the inline documentation in the integration's `services.yaml` file.
---
### tibber_prices.get_apexcharts_yaml
**Purpose:** Generates complete ApexCharts card YAML configuration for visualizing electricity prices.
**Quick Example:**
```yaml
service: tibber_prices.get_apexcharts_yaml
data:
entry_id: YOUR_ENTRY_ID
response_variable: apexcharts_config
```
Use the response in Lovelace dashboards by copying the generated YAML.
**Documentation:** See Developer Tools → Services for parameter details.
---
### tibber_prices.refresh_user_data
**Purpose:** Forces an immediate refresh of user data (homes, subscriptions) from the Tibber API.
**Example:**
```yaml
service: tibber_prices.refresh_user_data
data:
entry_id: YOUR_ENTRY_ID
```
**Note:** User data is cached for 24 hours. Use this service only when you need immediate updates (e.g., after changing Tibber subscriptions).
---
## Migration from Chart Data Export Sensor
If you're currently using the `binary_sensor.tibber_home_chart_data_export` sensor, consider migrating to `tibber_prices.get_chartdata`:
**Benefits:**
- No HA restart required for configuration changes
- More flexible filtering and formatting options
- Better performance (on-demand instead of polling)
- Future-proof (active development)
**Migration Steps:**
1. Note your current sensor configuration (Step 7 in Options Flow)
2. Create automation/script using `tibber_prices.get_chartdata` with same parameters
3. Test the new approach
4. Disable the old sensor when satisfied
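Step 2 above can be sketched as a script that fetches the same data the sensor exposed (names and parameter values are illustrative; reuse the parameters from your Step 7 configuration):

```yaml
script:
  fetch_chart_data:
    sequence:
      - service: tibber_prices.get_chartdata
        data:
          entry_id: YOUR_ENTRY_ID
          day: ["today", "tomorrow"]
          output_format: array_of_objects
        response_variable: chart_data
```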