feat(apexcharts): add server-side metadata with dynamic yaxis and gradient

Implemented comprehensive metadata calculation for the chart data export
service, with automatic Y-axis scaling and gradient positioning based on
actual price statistics.

Changes:
- Added 'metadata' parameter to get_chartdata service (include/only/none)
- Implemented _calculate_metadata() with per-day price statistics
  * min/max/avg/median prices
  * avg_position and median_position (0-1 scale for gradient stops)
  * yaxis_suggested bounds (floor(min)-1, ceil(max)+1)
  * time_range with day boundaries
  * currency info with symbol and unit
- Integrated metadata into rolling_window modes via config-template-card
  * Pre-calculated yaxis bounds (no async issues in templates)
  * Dynamic gradient stops based on avg_position
  * Server-side calculation ensures consistency
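The position and axis-bound math described above can be sketched in isolation (the helper name and fallback behavior here are illustrative, not the integration's exact API):

```python
import math


def price_positions(prices: list[float]) -> dict[str, float]:
    """Sketch of the 0-1 position math used for gradient stops."""
    lo, hi = min(prices), max(prices)
    avg = sum(prices) / len(prices)
    span = hi - lo
    # Position of the average within [min, max]; 0.5 when the range is flat
    avg_position = (avg - lo) / span if span > 0 else 0.5
    return {
        "min": lo,
        "max": hi,
        "avg": avg,
        "avg_position": avg_position,
        # yaxis_suggested: pad by one whole unit on each side
        "yaxis_min": math.floor(lo) - 1,
        "yaxis_max": math.ceil(hi) + 1,
    }


stats = price_positions([20.0, 25.0, 35.0])
```

An `avg_position` below 0.5 means prices cluster toward the low end, so the gradient stop lands in the lower half of the chart.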

Visual refinements:
- Best price overlay opacity reduced to 0.05 (ultra-subtle green hint)
- Stroke width increased to 1.5 for better visibility
- Gradient opacity adjusted to 0.45 with "light" shade
- Marker configuration: size 0, hover size 2, strokeWidth 1
- Header display: Only show LOW/HIGH rating_levels (min/max prices)
  * Conditional logic excludes NORMAL rating levels and all level-type series
  * Entity state shows meaningful extrema values
- NOW marker label removed for rolling_window_autozoom mode
  * Static position at 120min lookback makes label misleading

Code cleanup:
- Removed redundant all_series_config (server-side data formatting)
- Currency names capitalized (Cents, Øre, Öre, Pence)

Translation updates:
- Added metadata selector translations (de, en, nb, nl, sv)
- Added metadata field description in services
- Synchronized all language files

Impact: Users get dynamic Y-axis scaling based on actual price data,
eliminating manual configuration. Rolling window charts automatically
adjust axis bounds and gradient positioning. Header shows only
meaningful extreme values (daily min/max). All data transformation
happens server-side for optimal performance and consistency.
This commit is contained in:
Julian Pawlowski 2025-12-05 18:14:18 +00:00
parent 2f1929fbdc
commit c8e9f7ec2a
9 changed files with 486 additions and 54 deletions


@@ -162,12 +162,12 @@ HOME_TYPES = {
# Currency mapping: ISO code -> (major_symbol, minor_symbol, minor_name)
# For currencies with Home Assistant constants, use those; otherwise define custom ones
CURRENCY_INFO = {
"EUR": (CURRENCY_EURO, "ct", "cents"),
"NOK": ("kr", "øre", "øre"),
"SEK": ("kr", "öre", "öre"),
"DKK": ("kr", "øre", "øre"),
"USD": (CURRENCY_DOLLAR, "¢", "cents"),
"GBP": ("£", "p", "pence"),
"EUR": (CURRENCY_EURO, "ct", "Cents"),
"NOK": ("kr", "øre", "Øre"),
"SEK": ("kr", "öre", "Öre"),
"DKK": ("kr", "øre", "Øre"),
"USD": (CURRENCY_DOLLAR, "¢", "Cents"),
"GBP": ("£", "p", "Pence"),
}
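The mapping can be exercised standalone; this mirror is illustrative (the real module resolves `CURRENCY_EURO`/`CURRENCY_DOLLAR` via Home Assistant constants, and the EUR fallback here is an assumption):

```python
# Illustrative mirror of CURRENCY_INFO; symbols for EUR/USD are inlined
# instead of using Home Assistant's CURRENCY_EURO / CURRENCY_DOLLAR constants.
CURRENCY_INFO = {
    "EUR": ("€", "ct", "Cents"),
    "NOK": ("kr", "øre", "Øre"),
    "SEK": ("kr", "öre", "Öre"),
    "DKK": ("kr", "øre", "Øre"),
    "USD": ("$", "¢", "Cents"),
    "GBP": ("£", "p", "Pence"),
}


def get_currency_info(code: str) -> tuple[str, str, str]:
    """Return (major_symbol, minor_symbol, minor_name); EUR fallback is assumed."""
    return CURRENCY_INFO.get(code, CURRENCY_INFO["EUR"])
```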


@@ -173,6 +173,16 @@ get_chartdata:
example: prices
selector:
text:
metadata:
required: false
default: include
selector:
select:
options:
- include
- only
- none
translation_key: metadata
arrays_of_objects:
collapsed: true
fields:
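With the selector in place, a metadata-only call might look like this (illustrative config fragment; the entry_id is a placeholder):

```yaml
# Illustrative service call; entry_id is a placeholder
service: tibber_prices.get_chartdata
data:
  entry_id: YOUR_ENTRY_ID
  minor_currency: true
  metadata: only  # return price statistics without chart data
```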


@@ -18,7 +18,7 @@ Response: YAML configuration dict for ApexCharts card
from __future__ import annotations
from typing import TYPE_CHECKING, Any, Final
from typing import Any, Final
import voluptuous as vol
@@ -35,6 +35,7 @@ from custom_components.tibber_prices.const import (
format_price_unit_minor,
get_translation,
)
from homeassistant.core import ServiceCall
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_registry import (
@@ -45,11 +46,9 @@ from homeassistant.helpers.entity_registry import (
)
from .formatters import get_level_translation
from .get_chartdata import handle_chartdata
from .helpers import get_entry_and_data
if TYPE_CHECKING:
from homeassistant.core import ServiceCall
# Service constants
APEXCHARTS_YAML_SERVICE_NAME: Final = "get_apexcharts_yaml"
ATTR_DAY: Final = "day"
@@ -261,21 +260,41 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
# For rolling_window and rolling_window_autozoom, omit day parameter (dynamic selection)
day_param = "" if day in ("rolling_window", "rolling_window_autozoom", None) else f"day: ['{day}'], "
data_generator = (
f"const response = await hass.callWS({{ "
f"type: 'call_service', "
f"domain: 'tibber_prices', "
f"service: 'get_chartdata', "
f"return_response: true, "
f"service_data: {{ entry_id: '{entry_id}', {day_param}{filter_param}, "
f"output_format: 'array_of_arrays', insert_nulls: 'segments', minor_currency: true, "
f"connect_segments: true }} }}); "
f"return response.response.data;"
)
# All series use same configuration (no extremas on data_generator series)
# Hide all levels in header since data_generator series don't show meaningful state values
# (the entity state is the min/max/avg price, not the current price for this level)
show_config = {"legend_value": False, "in_header": False}
# For rolling window modes, we'll capture metadata for dynamic config
# For static day modes, just return data array
if day in ("rolling_window", "rolling_window_autozoom", None):
data_generator = (
f"const response = await hass.callWS({{ "
f"type: 'call_service', "
f"domain: 'tibber_prices', "
f"service: 'get_chartdata', "
f"return_response: true, "
f"service_data: {{ entry_id: '{entry_id}', {day_param}{filter_param}, "
f"output_format: 'array_of_arrays', insert_nulls: 'segments', minor_currency: true, "
f"connect_segments: true }} }}); "
f"return response.response.data;"
)
else:
# Static day modes: just return data (no metadata needed)
data_generator = (
f"const response = await hass.callWS({{ "
f"type: 'call_service', "
f"domain: 'tibber_prices', "
f"service: 'get_chartdata', "
f"return_response: true, "
f"service_data: {{ entry_id: '{entry_id}', {day_param}{filter_param}, "
f"output_format: 'array_of_arrays', insert_nulls: 'segments', minor_currency: true, "
f"connect_segments: true }} }}); "
f"return response.response.data;"
)
# Configure show options based on level_type and level_key
# rating_level LOW/HIGH: Show raw state in header (entity state = min/max price of day)
# rating_level NORMAL: Hide from header (not meaningful as extrema)
# level (VERY_CHEAP/CHEAP/etc): Hide from header (entity state is aggregated value)
if level_type == "rating_level" and level_key in (PRICE_RATING_LOW, PRICE_RATING_HIGH):
show_config = {"legend_value": False, "in_header": "raw"}
else:
show_config = {"legend_value": False, "in_header": False}
series.append(
{
@@ -286,13 +305,12 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"yaxis_id": "price",
"show": show_config,
"data_generator": data_generator,
"stroke_width": 1,
"stroke_width": 1.5,
}
)
# Note: Extrema markers don't work with data_generator approach
# ApexCharts requires entity time-series data for extremas feature
# Min/Max sensors are single values, not time-series
# ApexCharts card requires direct entity data for extremas feature, not dynamically generated data
# Get translated name for best price periods (needed for tooltip formatter)
best_price_name = (
@@ -334,12 +352,11 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"entity": best_price_entity,
"name": best_price_name,
"type": "area",
"color": "rgba(46, 204, 113, 0.2)", # Semi-transparent green
"color": "rgba(46, 204, 113, 0.05)", # Ultra-subtle green overlay (barely visible)
"yaxis_id": "highlight", # Use separate Y-axis (0-1) for full-height overlay
"show": {"legend_value": False, "in_header": False, "in_legend": False},
"data_generator": best_price_generator,
"stroke_width": 0,
"curve": "stepline",
}
)
@@ -391,7 +408,7 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"header": {
"show": True,
"title": title,
"show_states": True,
"show_states": False,
},
"apex_config": {
"chart": {
@@ -399,16 +416,16 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"toolbar": {"show": True, "tools": {"zoom": True, "pan": True}},
"zoom": {"enabled": True},
},
"stroke": {"curve": "stepline", "width": 2},
"stroke": {"curve": "stepline"},
"fill": {
"type": "gradient",
"opacity": 0.4,
"opacity": 0.45,
"gradient": {
"shade": "dark",
"shade": "light",
"type": "vertical",
"shadeIntensity": 0.5,
"shadeIntensity": 0.2,
"opacityFrom": 0.7,
"opacityTo": 0.2,
"opacityTo": 0.25,
},
},
"dataLabels": {"enabled": False},
@@ -418,18 +435,21 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
},
"legend": {
"show": False,
"position": "top",
"horizontalAlign": "left",
"markers": {"radius": 2},
"position": "bottom",
"horizontalAlign": "center",
},
"grid": {
"show": True,
"borderColor": "#f5f5f5",
"strokeDashArray": 0,
"xaxis": {"lines": {"show": True}},
"xaxis": {"lines": {"show": False}},
"yaxis": {"lines": {"show": True}},
},
"markers": {"size": 0},
"markers": {
"size": 0, # No markers on data points
"hover": {"size": 2}, # Show marker only on hover
"strokeWidth": 1,
},
},
"yaxis": [
{
@@ -446,11 +466,11 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"opposite": True,
},
],
"now": {"show": True, "color": "#8e24aa", "label": "🕒 LIVE"},
"all_series_config": {
"stroke_width": 1,
"group_by": {"func": "raw", "duration": "15min"},
},
"now": (
{"show": True, "color": "#8e24aa"}
if day == "rolling_window_autozoom"
else {"show": True, "color": "#8e24aa", "label": "🕒 LIVE"}
),
"series": series,
}
@@ -529,6 +549,30 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
if current_price_sensor:
trigger_entities.append(current_price_sensor)
# Pre-calculate metadata server-side for dynamic yaxis and gradient
# This avoids async issues with config-template-card variables
# Create service call to get metadata
metadata_call = ServiceCall(
hass=hass,
domain="tibber_prices",
service="get_chartdata",
data={
"entry_id": entry_id,
"minor_currency": True,
"metadata": "only",
},
context=call.context,
return_response=True,
)
metadata_response = await handle_chartdata(metadata_call)
metadata = metadata_response.get("metadata", {})
# Extract values with fallbacks
yaxis_min = metadata.get("yaxis_suggested", {}).get("min", 0)
yaxis_max = metadata.get("yaxis_suggested", {}).get("max", 100)
avg_position = metadata.get("price_stats", {}).get("combined", {}).get("avg_position", 0.5)
gradient_stop = round(avg_position * 100)
return {
"type": "custom:config-template-card",
"variables": {
@@ -539,6 +583,42 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
**result,
"span": {"start": "minute", "offset": "-120min"},
"graph_span": "${v_graph_span}",
"yaxis": [
{
"id": "price",
"decimals": 2,
"min": yaxis_min,
"max": yaxis_max,
"apex_config": {
"title": {"text": price_unit},
"decimalsInFloat": 0,
"forceNiceScale": False,
},
},
{
"id": "highlight",
"min": 0,
"max": 1,
"show": False,
"opposite": True,
},
],
"apex_config": {
**result["apex_config"],
"fill": {
"type": "gradient",
"opacity": 0.45,
"gradient": {
"shade": "light",
"type": "vertical",
"shadeIntensity": 0.2,
"opacityFrom": 0.7,
"opacityTo": 0.25,
"gradientToColors": ["transparent"],
"stops": [gradient_stop, 100],
},
},
},
},
}
# Rolling window modes (day is None or rolling_window): Dynamic offset
@@ -549,6 +629,31 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
# If 'on' (tomorrow data available) → offset +1d (show today+tomorrow)
# If 'off' (no tomorrow data) → offset +0d (show yesterday+today)
template_value = f"states['{tomorrow_data_sensor}'].state === 'on' ? '+1d' : '+0d'"
# Pre-calculate metadata server-side for dynamic yaxis and gradient
# This avoids async issues with config-template-card variables
# Create service call to get metadata
metadata_call = ServiceCall(
hass=hass,
domain="tibber_prices",
service="get_chartdata",
data={
"entry_id": entry_id,
"minor_currency": True,
"metadata": "only",
},
context=call.context,
return_response=True,
)
metadata_response = await handle_chartdata(metadata_call)
metadata = metadata_response.get("metadata", {})
# Extract values with fallbacks
yaxis_min = metadata.get("yaxis_suggested", {}).get("min", 0)
yaxis_max = metadata.get("yaxis_suggested", {}).get("max", 100)
avg_position = metadata.get("price_stats", {}).get("combined", {}).get("avg_position", 0.5)
gradient_stop = round(avg_position * 100)
return {
"type": "custom:config-template-card",
"variables": {
@@ -561,6 +666,42 @@ async def handle_apexcharts_yaml(call: ServiceCall) -> dict[str, Any]: # noqa:
"end": "day",
"offset": "${v_offset}",
},
"yaxis": [
{
"id": "price",
"decimals": 2,
"min": yaxis_min,
"max": yaxis_max,
"apex_config": {
"title": {"text": price_unit},
"decimalsInFloat": 0,
"forceNiceScale": False,
},
},
{
"id": "highlight",
"min": 0,
"max": 1,
"show": False,
"opposite": True,
},
],
"apex_config": {
**result["apex_config"],
"fill": {
"type": "gradient",
"opacity": 0.45,
"gradient": {
"shade": "light",
"type": "vertical",
"shadeIntensity": 0.2,
"opacityFrom": 0.7,
"opacityTo": 0.25,
"gradientToColors": ["transparent"],
"stops": [gradient_stop, 100],
},
},
},
},
}
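The metadata extraction with fallbacks, which is repeated verbatim in both rolling-window branches above, boils down to this (the helper name is illustrative; the fallback values match those in the handler):

```python
def extract_chart_bounds(metadata: dict) -> tuple[float, float, int]:
    """Pull yaxis bounds and gradient stop out of a get_chartdata metadata
    response, using the same fallbacks as the service handler."""
    yaxis = metadata.get("yaxis_suggested", {})
    yaxis_min = yaxis.get("min", 0)
    yaxis_max = yaxis.get("max", 100)
    avg_position = (
        metadata.get("price_stats", {}).get("combined", {}).get("avg_position", 0.5)
    )
    # ApexCharts gradient stops are percentages in the 0-100 range
    return yaxis_min, yaxis_max, round(avg_position * 100)
```

With an empty response it degrades to the safe defaults (0, 100, 50) rather than failing.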


@@ -21,8 +21,9 @@ Response: JSON with chart-ready data
from __future__ import annotations
import math
import re
from datetime import timedelta
from datetime import datetime, timedelta
from typing import TYPE_CHECKING, Any, Final
import voluptuous as vol
@@ -41,6 +42,9 @@ from custom_components.tibber_prices.const import (
PRICE_RATING_HIGH,
PRICE_RATING_LOW,
PRICE_RATING_NORMAL,
format_price_unit_major,
format_price_unit_minor,
get_currency_info,
)
from custom_components.tibber_prices.coordinator.helpers import (
get_intervals_for_day_offsets,
@@ -53,6 +57,161 @@ from .helpers import get_entry_and_data, has_tomorrow_data
if TYPE_CHECKING:
from homeassistant.core import ServiceCall
def _calculate_metadata( # noqa: PLR0912, PLR0913, PLR0915
chart_data: list[dict[str, Any]],
price_field: str,
start_time_field: str,
currency: str,
*,
resolution: str,
minor_currency: bool = False,
) -> dict[str, Any]:
"""
Calculate metadata for chart visualization.
Args:
chart_data: The chart data array
price_field: Name of the price field in chart_data
start_time_field: Name of the start time field
currency: Currency code (e.g., "EUR", "NOK")
resolution: Resolution type ("interval" or "hourly")
minor_currency: Whether prices are in minor currency units
Returns:
Metadata dictionary with price statistics, yaxis suggestions, and time info
"""
# Get currency info (returns tuple: major_symbol, minor_symbol, minor_name)
major_symbol, minor_symbol, minor_name = get_currency_info(currency)
# Build currency object with only the active unit
if minor_currency:
currency_obj = {
"code": currency,
"symbol": minor_symbol,
"name": minor_name, # Already capitalized in CURRENCY_INFO
"unit": format_price_unit_minor(currency),
}
else:
currency_obj = {
"code": currency,
"symbol": major_symbol,
"unit": format_price_unit_major(currency),
}
# Extract all prices (excluding None values)
prices = [item[price_field] for item in chart_data if item.get(price_field) is not None]
if not prices:
return {}
# Parse timestamps to determine day boundaries
# Group by date (midnight-to-midnight)
dates_seen = set()
for item in chart_data:
timestamp_str = item.get(start_time_field)
if timestamp_str and item.get(price_field) is not None:
# Parse ISO timestamp
dt = datetime.fromisoformat(timestamp_str) if isinstance(timestamp_str, str) else timestamp_str
date = dt.date()
dates_seen.add(date)
# Sort dates to ensure consistent day numbering
sorted_dates = sorted(dates_seen)
# Split data by day - dynamically handle any number of days
days_data: dict[str, list[float]] = {}
for i, _date in enumerate(sorted_dates, start=1):
day_key = f"day{i}"
days_data[day_key] = []
# Assign prices to their respective days
for item in chart_data:
timestamp_str = item.get(start_time_field)
price = item.get(price_field)
if timestamp_str and price is not None:
dt = datetime.fromisoformat(timestamp_str) if isinstance(timestamp_str, str) else timestamp_str
date = dt.date()
# Find which day this date corresponds to
day_index = sorted_dates.index(date) + 1
day_key = f"day{day_index}"
days_data[day_key].append(price)
def calc_stats(data: list[float]) -> dict[str, float]:
"""Calculate comprehensive statistics for a dataset."""
if not data:
return {}
min_val = min(data)
max_val = max(data)
avg_val = sum(data) / len(data)
median_val = sorted(data)[len(data) // 2]  # upper median for even-length data
# Calculate avg_position and median_position (0-1 scale)
price_range = max_val - min_val
avg_position = (avg_val - min_val) / price_range if price_range > 0 else 0.5
median_position = (median_val - min_val) / price_range if price_range > 0 else 0.5
# Position precision: 2 decimals for minor currency, 4 for major currency
position_decimals = 2 if minor_currency else 4
return {
"min": round(min_val, 2),
"max": round(max_val, 2),
"avg": round(avg_val, 2),
"avg_position": round(avg_position, position_decimals),
"median": round(median_val, 2),
"median_position": round(median_position, position_decimals),
}
# Calculate stats for combined and per-day data
combined_stats = calc_stats(prices)
# Calculate stats for each day dynamically
per_day_stats: dict[str, dict[str, float]] = {}
for day_key, day_data in days_data.items():
if day_data:
per_day_stats[day_key] = calc_stats(day_data)
# Get time range from chart data
timestamps = [item[start_time_field] for item in chart_data if item.get(start_time_field)]
time_range = {}
if timestamps:
time_range = {
"start": timestamps[0],
"end": timestamps[-1],
"days_included": list(days_data.keys()),
}
# Determine interval duration in minutes based on resolution
interval_duration_minutes = 15 if resolution == "interval" else 60
# Calculate suggested yaxis bounds
# For minor currency (ct, øre): integer values (floor/ceil)
# For major currency (€, kr): 2 decimal places precision
if minor_currency:
yaxis_min = math.floor(combined_stats["min"]) - 1 if combined_stats else 0
yaxis_max = math.ceil(combined_stats["max"]) + 1 if combined_stats else 100
else:
# Major currency: round to 2 decimal places with padding
yaxis_min = round(math.floor(combined_stats["min"] * 100) / 100 - 0.01, 2) if combined_stats else 0
yaxis_max = round(math.ceil(combined_stats["max"] * 100) / 100 + 0.01, 2) if combined_stats else 1.0
return {
"currency": currency_obj,
"resolution": interval_duration_minutes,
"data_count": len(chart_data),
"price_stats": {"combined": combined_stats, **per_day_stats},
"yaxis_suggested": {"min": yaxis_min, "max": yaxis_max},
"time_range": time_range,
}
# Service constants
CHARTDATA_SERVICE_NAME: Final = "get_chartdata"
ATTR_DAY: Final = "day"
@@ -102,6 +261,7 @@ CHARTDATA_SERVICE_SCHEMA: Final = vol.Schema(
vol.Optional("rating_level_field", default="rating_level"): str,
vol.Optional("average_field", default="average"): str,
vol.Optional("data_key", default="data"): str,
vol.Optional("metadata", default="include"): vol.In(["include", "only", "none"]),
}
)
@@ -162,6 +322,7 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR091
resolution = call.data.get("resolution", "interval")
output_format = call.data.get("output_format", "array_of_objects")
minor_currency = call.data.get("minor_currency", False)
metadata = call.data.get("metadata", "include")
round_decimals = call.data.get("round_decimals")
include_level = call.data.get("include_level", False)
include_rating_level = call.data.get("include_rating_level", False)
@@ -174,6 +335,44 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR091
level_filter = call.data.get("level_filter")
rating_level_filter = call.data.get("rating_level_filter")
# === METADATA-ONLY MODE ===
# Early return: calculate and return only metadata, skip all data processing
if metadata == "only":
# Get minimal data to calculate metadata (just timestamps and prices)
# Use helper to get intervals for requested days
day_offset_map = {"yesterday": -1, "today": 0, "tomorrow": 1}
offsets = [day_offset_map[day] for day in days]
all_intervals = get_intervals_for_day_offsets(coordinator.data, offsets)
# Build minimal chart_data for metadata calculation
chart_data_for_meta = []
for interval in all_intervals:
start_time = interval.get("startsAt")
price = interval.get("total")
if start_time is not None and price is not None:
# Convert price to requested currency
converted_price = round(price * 100, 2) if minor_currency else round(price, 4)
chart_data_for_meta.append(
{
start_time_field: start_time.isoformat() if hasattr(start_time, "isoformat") else start_time,
price_field: converted_price,
}
)
# Calculate metadata
metadata = _calculate_metadata(
chart_data=chart_data_for_meta,
price_field=price_field,
start_time_field=start_time_field,
currency=coordinator.data.get("currency", "EUR"),
resolution=resolution,
minor_currency=minor_currency,
)
return {"metadata": metadata}
# Filter values are already normalized to uppercase by schema validators
# If array_fields is specified, implicitly enable fields that are used
array_fields_template = call.data.get("array_fields")
if array_fields_template and output_format == "array_of_arrays":
@@ -620,7 +819,34 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR091
null_row = [points[-1][0]] + [None] * (len(field_names) - 1)
points.append(null_row)
return {data_key: points}
# Calculate metadata (before adding trailing null to chart_data)
result = {data_key: points}
if metadata in ("include", "only"):
metadata_obj = _calculate_metadata(
chart_data=chart_data,
price_field=price_field,
start_time_field=start_time_field,
currency=coordinator.data.get("currency", "EUR"),
resolution=resolution,
minor_currency=minor_currency,
)
if metadata_obj:
result["metadata"] = metadata_obj # type: ignore[index]
return result
# Calculate metadata (before adding trailing null)
result = {data_key: chart_data}
if metadata in ("include", "only"):
metadata_obj = _calculate_metadata(
chart_data=chart_data,
price_field=price_field,
start_time_field=start_time_field,
currency=coordinator.data.get("currency", "EUR"),
resolution=resolution,
minor_currency=minor_currency,
)
if metadata_obj:
result["metadata"] = metadata_obj # type: ignore[index]
# Add trailing null point for array_of_objects format if requested
if add_trailing_null and chart_data:
@@ -633,4 +859,4 @@ async def handle_chartdata(call: ServiceCall) -> dict[str, Any]: # noqa: PLR091
null_point[field] = None
chart_data.append(null_point)
return {data_key: chart_data}
return result
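The minor/major axis padding used by `_calculate_metadata()` can be isolated as follows (standalone sketch; the function name is illustrative):

```python
import math


def yaxis_bounds(
    min_price: float, max_price: float, *, minor_currency: bool
) -> tuple[float, float]:
    """Suggested axis bounds: whole-unit padding for ct/øre prices,
    0.01 padding with 2-decimal precision for €/kr prices."""
    if minor_currency:
        return math.floor(min_price) - 1, math.ceil(max_price) + 1
    return (
        round(math.floor(min_price * 100) / 100 - 0.01, 2),
        round(math.ceil(max_price * 100) / 100 + 0.01, 2),
    )
```

The minor-currency branch matches the commit-message description (floor(min)-1, ceil(max)+1); the major-currency branch keeps the padding proportional to the smaller price magnitudes.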


@@ -995,9 +995,13 @@
"name": "Durchschnitts-Feldname",
"description": "Benutzerdefinierter Name für das Durchschnitts-Feld in der Ausgabe. Standard ist 'average', falls nicht angegeben. Wird nur verwendet, wenn include_average aktiviert ist."
},
"metadata": {
"name": "Metadaten",
"description": "Steuerung der Metadaten-Einbindung in der Antwort. 'include' (Standard): Gibt Chart-Daten und Metadaten mit Preisstatistiken, Währungsinformationen, Y-Achsen-Vorschlägen und Zeitbereich zurück. 'only': Gibt nur Metadaten zurück ohne Chart-Daten zu verarbeiten (schnell, nützlich für dynamische Y-Achsen-Konfiguration). 'none': Gibt nur Chart-Daten ohne Metadaten zurück."
},
"data_key": {
"name": "Daten-Schlüssel",
"description": "Benutzerdefinierter Name für den obersten Datenschlüssel in der Antwort. Standard ist 'data', falls nicht angegeben. Für ApexCharts-Kompatibilität mit Array von Arrays verwende 'points'."
"description": "Benutzerdefinierter Name für den obersten Datenschlüssel in der Antwort. Standard ist 'data', falls nicht angegeben."
}
}
},
@@ -1074,6 +1078,13 @@
"peak_price": "Spitzenpreis-Zeiträume"
}
},
"metadata": {
"options": {
"include": "Einbeziehen (Daten + Metadaten)",
"only": "Nur Metadaten",
"none": "Keine (nur Daten)"
}
},
"volatility": {
"options": {
"low": "Niedrig",


@@ -933,7 +933,7 @@
},
"data_key": {
"name": "Data Key",
"description": "Custom name for the top-level data key in the response. Defaults to 'data' if not specified. For ApexCharts compatibility with Array of Arrays, use 'points'."
"description": "Custom name for the top-level data key in the response. Defaults to 'data' if not specified."
},
"include_level": {
"name": "Include Level",
@@ -994,6 +994,10 @@
"average_field": {
"name": "Average Field Name",
"description": "Custom name for the average field in the output. Defaults to 'average' if not specified. Only used when include_average is enabled."
},
"metadata": {
"name": "Metadata",
"description": "Control metadata inclusion in the response. 'include' (default): Returns both chart data and metadata with price statistics, currency info, Y-axis suggestions, and time range. 'only': Returns only metadata without processing chart data (fast, useful for dynamic Y-axis configuration). 'none': Returns only chart data without metadata."
}
}
},
@@ -1070,6 +1074,13 @@
"peak_price": "Peak Price Periods"
}
},
"metadata": {
"options": {
"include": "Include (data + metadata)",
"only": "Only metadata",
"none": "None (data only)"
}
},
"volatility": {
"options": {
"low": "Low",


@@ -991,9 +991,13 @@
"name": "Gjennomsnittsfelt-navn",
"description": "Tilpasset navn for gjennomsnittsfeltet i utdata. Standard er 'average'. Brukes bare når include_average er aktivert."
},
"metadata": {
"name": "Metadata",
"description": "Kontroller metadata-inkludering i svaret. 'include' (standard): Returnerer både diagramdata og metadata med prisstatistikk, valutainformasjon, Y-akse forslag og tidsperiode. 'only': Returnerer bare metadata uten å behandle diagramdata (raskt, nyttig for dynamisk Y-akse konfigurasjon). 'none': Returnerer bare diagramdata uten metadata."
},
"data_key": {
"name": "Datanøkkel",
"description": "Tilpasset navn for datanøkkelen på toppnivå i svaret. Standard er 'data' hvis ikke angitt. For ApexCharts-kompatibilitet med Array av arrays, bruk 'points'."
"description": "Tilpasset navn for datanøkkelen på toppnivå i svaret. Standard er 'data' hvis ikke angitt."
}
}
},
@@ -1070,6 +1074,13 @@
"peak_price": "Topp prisperioder"
}
},
"metadata": {
"options": {
"include": "Inkluder (data + metadata)",
"only": "Kun metadata",
"none": "Ingen (kun data)"
}
},
"volatility": {
"options": {
"low": "Lav",


@@ -991,9 +991,13 @@
"name": "Gemiddelde veld-naam",
"description": "Aangepaste naam voor het gemiddelde veld in de uitvoer. Standaard is 'average'. Alleen gebruikt wanneer include_average is ingeschakeld."
},
"metadata": {
"name": "Metadata",
"description": "Beheer metadata-opname in het antwoord. 'include' (standaard): Retourneert zowel grafiekdata als metadata met prijsstatistieken, valuta-info, Y-as suggesties en tijdsbereik. 'only': Retourneert alleen metadata zonder grafiekdata te verwerken (snel, handig voor dynamische Y-as configuratie). 'none': Retourneert alleen grafiekdata zonder metadata."
},
"data_key": {
"name": "Gegevenssleutel",
"description": "Aangepaste naam voor de gegevenssleutel op het hoogste niveau in het antwoord. Standaard is 'data' als niet opgegeven. Voor ApexCharts-compatibiliteit met Array van arrays, gebruik 'points'."
"description": "Aangepaste naam voor de gegevenssleutel op het hoogste niveau in het antwoord. Standaard is 'data' als niet opgegeven."
}
}
},
@@ -1070,6 +1074,13 @@
"peak_price": "Piek prijsperioden"
}
},
"metadata": {
"options": {
"include": "Opnemen (data + metadata)",
"only": "Alleen metadata",
"none": "Geen (alleen data)"
}
},
"volatility": {
"options": {
"low": "Laag",


@@ -991,9 +991,13 @@
"name": "Genomsnittsfältsnamn",
"description": "Anpassat namn för genomsnittsfältet i utdata. Standard är 'average'. Används endast när include_average är aktiverad."
},
"metadata": {
"name": "Metadata",
"description": "Styr metadata-inkludering i svaret. 'include' (standard): Returnerar både diagramdata och metadata med prisstatistik, valutainformation, Y-axelförslag och tidsperiod. 'only': Returnerar endast metadata utan att bearbeta diagramdata (snabbt, användbart för dynamisk Y-axel konfiguration). 'none': Returnerar endast diagramdata utan metadata."
},
"data_key": {
"name": "Datanyckel",
"description": "Anpassat namn för datanyckeln på toppnivå i svaret. Standard är 'data' om inte angivet. För ApexCharts-kompatibilitet med Array av arrayer, använd 'points'."
"description": "Anpassat namn för datanyckeln på toppnivå i svaret. Standard är 'data' om inte angivet."
}
}
},
@@ -1070,6 +1074,13 @@
"peak_price": "Topp prisperioder"
}
},
"metadata": {
"options": {
"include": "Inkludera (data + metadata)",
"only": "Endast metadata",
"none": "Ingen (endast data)"
}
},
"volatility": {
"options": {
"low": "Låg",