refactor(periods): merge adjacent periods and remove is_extension logic

BREAKING CHANGE: Period overlap resolution now merges adjacent/overlapping periods
instead of marking them as extensions. This simplifies automation logic and provides
clearer period boundaries for users.

Previous Behavior:
- Adjacent periods created by relaxation were marked with is_extension=true
- Multiple short periods instead of one continuous period
- Complex logic needed to determine actual period length in automations

New Behavior:
- Adjacent/overlapping periods are merged into single continuous periods
- Newer period's relaxation attributes override older period's
- Simpler automation: one period = one continuous time window

Changes:
- Period Overlap Resolution (new file: period_overlap.py):
  * Added merge_adjacent_periods() to combine periods and preserve attributes
  * Rewrote resolve_period_overlaps() with simplified merge logic
  * Removed split_period_by_overlaps() (no longer needed)
  * Removed is_extension marking logic
  * Removed unused parameters: min_period_length, baseline_periods

- Relaxation Strategy (relaxation.py):
  * Removed all is_extension filtering from period counting
  * Simplified standalone counting to just len(periods)
  * Changed from period_merging import to period_overlap import
  * Added MAX_FLEX_HARD_LIMIT constant (0.50)
  * Improved debug logging for merged periods

- Code Quality:
  * Fixed all remaining linter errors (N806, PLR2004, PLR0912)
  * Extracted magic values to module-level constants:
    - FLEX_SCALING_THRESHOLD = 0.20
    - SCALE_FACTOR_WARNING_THRESHOLD = 0.8
    - MAX_FLEX_HARD_LIMIT = 0.50
  * Added appropriate noqa comments for unavoidable patterns

- Configuration (from previous work in this session):
  * Removed CONF_RELAXATION_STEP_BEST, CONF_RELAXATION_STEP_PEAK
  * Hard-coded 3% relaxation increment for reliability
  * Optimized defaults: RELAXATION_ATTEMPTS 8→11, ENABLE_MIN_PERIODS False→True,
    MIN_PERIODS undefined→2
  * Removed relaxation_step UI fields from config flow
  * Updated all 5 translation files

- Documentation:
  * Updated period_handlers/__init__.py: period_merging → period_overlap
  * No user-facing docs changes needed (already described continuous periods)

Rationale - Period Merging:
User experience was complicated by fragmented periods:
- Automations had to check multiple adjacent periods
- Binary sensors showed ON/OFF transitions within same cheap time
- No clear way to determine actual continuous period length

With merging:
- One continuous cheap time = one period
- Binary sensor clearly ON during entire period
- Attributes show merge history via merged_from dict
- Relaxation info preserved from newest/highest flex period
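A minimal sketch of the merge behavior, assuming each period carries `start`/`end` datetimes and an attribute dict (the function name `merge_adjacent_periods` matches the commit; the data shape and the list form of `merged_from` are illustrative, not the actual implementation):

```python
from datetime import datetime


def merge_adjacent_periods(periods: list[dict]) -> list[dict]:
    """Merge overlapping/adjacent periods; the newer period's attributes win."""
    merged: list[dict] = []
    for period in sorted(periods, key=lambda p: p["start"]):
        if merged and period["start"] <= merged[-1]["end"]:
            prev = merged[-1]
            prev["end"] = max(prev["end"], period["end"])
            # Newer period's relaxation attributes override the older period's
            prev["attributes"] = {**prev["attributes"], **period["attributes"]}
            prev.setdefault("merged_from", []).append(period["start"].isoformat())
        else:
            merged.append(dict(period))
    return merged
```

Two periods that touch at a boundary (e.g., one ending 01:00, the next starting 01:00) collapse into one continuous window, so a binary sensor stays ON throughout.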

Rationale - Hard-Coded Relaxation Increment:
The configurable relaxation_step parameter proved problematic:
- High base flex + high step → rapid explosion (e.g., 40% base with a 25% relative
  step adds 10% per step → 100% after 6 steps)
- Users don't understand the multiplicative nature of the relative step
- A fixed 3% increment provides the best balance: 11 attempts to reach the 50% hard cap
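The difference between the old relative step and the new fixed increment is plain arithmetic (hypothetical helper names, for illustration only):

```python
def escalate_relative(base: float, steps: int, step_pct: float = 0.25) -> float:
    """Old behavior: each step adds step_pct of the original base flex (25% default)."""
    return base + steps * base * step_pct


def escalate_fixed(base: float, steps: int, increment: float = 0.03, cap: float = 0.50) -> float:
    """New behavior: fixed 3% absolute increment, hard-capped at 50%."""
    return min(base + steps * increment, cap)
```

A 40% base with the old 25% step reaches 100% after only 6 steps; the fixed increment stops at the 50% cap regardless of base.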

Impact:
- Existing installations: Periods may appear longer (merged instead of split)
- Automations benefit from simpler logic (no is_extension checks needed)
- Custom relaxation_step values are ignored; the fixed 3% increment now applies
- Users may need to adjust relaxation_attempts if they relied on high step sizes
Commit: 457fa7c03f (parent: 625bc222ca)
Author: Julian Pawlowski
Date: 2025-11-19 20:16:58 +00:00
21 changed files with 1280 additions and 733 deletions

AGENTS.md

@@ -523,6 +523,114 @@ custom_components/tibber_prices/
 └── services.yaml # Service definitions
 ```
## Period Calculation System (Best/Peak Price Periods)

**CRITICAL:** Period calculation uses multi-criteria filtering that can create **mathematical conflicts** at high flexibility values. Understanding these interactions is essential for reliable period detection.

**Core Challenge:**

The period calculation applies **three independent filters** that ALL must pass:

1. **Flex filter**: `price ≤ daily_min × (1 + flex)`
2. **Min_Distance filter**: `price ≤ daily_avg × (1 - min_distance/100)`
3. **Level filter**: `rating_level IN [allowed_levels]`
**Mathematical Conflict Condition:**

When `daily_min × (1 + flex) > daily_avg × (1 - min_distance/100)`, the flex filter permits intervals that the min_distance filter blocks, so raising flexibility further yields zero additional periods.

Example: daily_min=10 ct, daily_avg=20 ct, min_distance=5%

- At flex=50%: flex allows ≤15 ct, distance allows ≤19 ct — flex is still the binding constraint
- At flex=90%: the flex bound (10 × 1.9 = 19 ct) meets the distance bound; beyond that, min_distance silently becomes the dominant constraint
- Past that point, intervals must still be ≤19 ct AND match the level filter, so increasing flex has no visible effect
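The conflict condition can be checked numerically; a minimal sketch (the function name is illustrative, not part of the codebase):

```python
def binding_constraint(daily_min: float, daily_avg: float, flex: float, min_distance: float) -> str:
    """Return which best-price filter is the tighter (binding) one.

    flex is a fraction (0.50 = 50%); min_distance is a percentage (5 = 5%).
    """
    flex_bound = daily_min * (1 + flex)  # flex filter: price <= flex_bound
    distance_bound = daily_avg * (1 - min_distance / 100)  # min_distance filter
    return "min_distance" if flex_bound > distance_bound else "flex"
```

With the example numbers, `binding_constraint(10, 20, 0.50, 5)` reports `"flex"`; only above roughly 90% flex does `"min_distance"` take over.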
**Solutions Implemented (Nov 2025):**

1. **Hard Caps on Flex** (`coordinator/period_handlers/core.py`):
   - `MAX_SAFE_FLEX = 0.50` (50% overall maximum)
   - `MAX_OUTLIER_FLEX = 0.25` (25% for price spike detection)
   - Warns users when base flex exceeds thresholds (INFO at 25%, WARNING at 30%)

2. **Relaxation Increment Cap** (`coordinator/period_handlers/relaxation.py`):
   - Maximum 3% increment per relaxation step (prevents explosion from high base flex)
   - Example: base flex 40% → increments as 43%, 46%, 49% (capped at 50%)
   - Without the cap, a 25% relative step on a 40% base adds 10% per step and reaches 100% in 6 steps

3. **Dynamic Min_Distance Scaling** (`coordinator/period_handlers/level_filtering.py`):
   - Reduces min_distance proportionally as flex increases above 20%
   - Formula: `scale_factor = max(0.25, 1.0 - ((flex - 0.20) × 2.5))`
   - Example: flex=30% → scale=0.75 → min_distance reduced by 25%
   - Minimum scaling: 25% of original (prevents complete removal)

4. **Enhanced Debug Logging** (`coordinator/period_handlers/period_building.py`):
   - Tracks exact counts of intervals filtered by flex, min_distance, and level
   - Shows which filter blocked the most candidates
   - Enables diagnosis of configuration issues
**Configuration Guidance:**

**Recommended Flex Ranges:**

- **With relaxation enabled**: 10-20% base flex (relaxation will escalate as needed)
- **Without relaxation**: 20-35% direct flex (no automatic escalation)
- **Anti-pattern**: base flex >30% with relaxation enabled → causes rapid escalation and filter conflicts

**Key Constants** (defined in `coordinator/period_handlers/core.py`):

```python
MAX_SAFE_FLEX = 0.50  # 50% absolute maximum
MAX_OUTLIER_FLEX = 0.25  # 25% for stable outlier detection
FLEX_WARNING_THRESHOLD_RELAXATION = 0.25  # INFO warning at 25% base flex
FLEX_HIGH_THRESHOLD_RELAXATION = 0.30  # WARNING at 30% base flex
```
**Relaxation Strategy** (`coordinator/period_handlers/relaxation.py`):

- Per-day independent loops (each day escalates separately based on its needs)
- Hard cap: 3% absolute maximum increment per step (prevents explosion from high base flex)
- Default configuration: 11 flex levels (15% base → 18% → 21% → ... → 48% max)
- Filter combinations: original level → level="any" (tries both price and volatility levels)
- Each flex level tries all filter combinations before increasing flex further

**Period Boundary Behavior** (`coordinator/period_handlers/period_building.py`):

- Periods can **cross midnight** (day boundaries) naturally
- Reference price is locked to the **period start day** for consistency across the entire period
- Pattern: "Uses reference price from start day of the period for consistency" (same as period statistics)
- Example: a period starting 23:45 on Day 1 continues into Day 2 using Day 1's daily_min as reference
- This prevents artificial splits at midnight when prices remain favorable across the boundary
**Default Configuration Values** (`const.py`):

```python
DEFAULT_BEST_PRICE_FLEX = 15  # 15% base - optimal for relaxation mode
DEFAULT_PEAK_PRICE_FLEX = -20  # 20% base (negative for peak detection)
DEFAULT_RELAXATION_ATTEMPTS_BEST = 11  # 11 steps: 15% → 48% (3% increment per step)
DEFAULT_RELAXATION_ATTEMPTS_PEAK = 11  # 11 steps: 20% → 50% (3% increment per step)
```

The relaxation increment is **hard-coded at 3% per step** in `relaxation.py` for reliability and predictability. This prevents configuration issues with high base flex values while still allowing sufficient escalation to the 50% hard maximum.
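The resulting escalation schedule can be sketched in a few lines (the helper name and list shape are illustrative; the real loop lives in `relaxation.py`):

```python
MAX_FLEX_HARD_LIMIT = 0.50   # 50% absolute cap (matches the constant added in relaxation.py)
RELAXATION_INCREMENT = 0.03  # fixed 3% per step


def flex_schedule(base_flex: float, attempts: int) -> list[float]:
    """Flex levels tried during relaxation: base, then +3% per attempt, capped at 50%."""
    return [min(base_flex + step * RELAXATION_INCREMENT, MAX_FLEX_HARD_LIMIT)
            for step in range(attempts + 1)]
```

`flex_schedule(0.15, 11)` yields 15% → 18% → ... → 48%, matching the defaults above.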
**Dynamic Scaling Table** (min_distance adjustment):

```
Flex    Scale    Example (min_distance=5%)
-------------------------------------------
≤20%    100%     5.00%  (no reduction)
25%     87.5%    4.38%
30%     75%      3.75%
35%     62.5%    3.13%
40%     50%      2.50%
45%     37.5%    1.88%
≥50%    25%      1.25%  (minimum)
```
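The table values follow directly from the scaling formula; a minimal sketch (constant names match the commit, the function name is illustrative):

```python
FLEX_SCALING_THRESHOLD = 0.20  # start adjusting min_distance above 20% flex
MIN_SCALE_FACTOR = 0.25        # never reduce min_distance below 25% of original


def scaled_min_distance(flex: float, min_distance: float) -> float:
    """Apply the dynamic min_distance scaling used in level_filtering.py."""
    flex_abs = abs(flex)
    if flex_abs <= FLEX_SCALING_THRESHOLD:
        return min_distance  # no reduction at <= 20% flex
    scale_factor = max(MIN_SCALE_FACTOR, 1.0 - (flex_abs - FLEX_SCALING_THRESHOLD) * 2.5)
    return min_distance * scale_factor
```

`scaled_min_distance(0.30, 5.0)` gives 3.75, and `scaled_min_distance(0.45, 5.0)` gives 1.875 (the table rounds to 1.88).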
**Testing Scenarios:**

When debugging period calculation issues:

1. Check flex value: is base flex >30%? Reduce to 15-20% if using relaxation
2. Check logs for "scaled min_distance": is it reducing too much? May need lower base flex
3. Check filter statistics: which filter blocks most intervals? (flex, distance, or level)
4. Check relaxation warnings: INFO at 25%, WARNING at 30% indicate a suboptimal config

**See:**

- **Theory documentation**: `docs/development/period-calculation-theory.md` (comprehensive mathematical analysis, conflict conditions, configuration pitfalls)
- **Implementation**: `coordinator/period_handlers/` package (core.py, relaxation.py, level_filtering.py, period_building.py)
- **User guide**: `docs/user/period-calculation.md` (simplified user-facing explanations)

## Development Environment Setup

**Python Virtual Environment:**

config_flow.py

@@ -33,8 +33,6 @@ from custom_components.tibber_prices.const import (
     CONF_PRICE_TREND_THRESHOLD_RISING,
     CONF_RELAXATION_ATTEMPTS_BEST,
     CONF_RELAXATION_ATTEMPTS_PEAK,
-    CONF_RELAXATION_STEP_BEST,
-    CONF_RELAXATION_STEP_PEAK,
     CONF_VOLATILITY_THRESHOLD_HIGH,
     CONF_VOLATILITY_THRESHOLD_MODERATE,
     CONF_VOLATILITY_THRESHOLD_VERY_HIGH,
@@ -59,8 +57,6 @@ from custom_components.tibber_prices.const import (
     DEFAULT_PRICE_TREND_THRESHOLD_RISING,
     DEFAULT_RELAXATION_ATTEMPTS_BEST,
     DEFAULT_RELAXATION_ATTEMPTS_PEAK,
-    DEFAULT_RELAXATION_STEP_BEST,
-    DEFAULT_RELAXATION_STEP_PEAK,
     DEFAULT_VOLATILITY_THRESHOLD_HIGH,
     DEFAULT_VOLATILITY_THRESHOLD_MODERATE,
     DEFAULT_VOLATILITY_THRESHOLD_VERY_HIGH,
@@ -355,23 +351,6 @@ def get_best_price_schema(options: Mapping[str, Any]) -> vol.Schema:
                 mode=NumberSelectorMode.SLIDER,
             ),
         ),
-        vol.Optional(
-            CONF_RELAXATION_STEP_BEST,
-            default=int(
-                options.get(
-                    CONF_RELAXATION_STEP_BEST,
-                    DEFAULT_RELAXATION_STEP_BEST,
-                )
-            ),
-        ): NumberSelector(
-            NumberSelectorConfig(
-                min=5,
-                max=50,
-                step=5,
-                unit_of_measurement="%",
-                mode=NumberSelectorMode.SLIDER,
-            ),
-        ),
         vol.Optional(
             CONF_RELAXATION_ATTEMPTS_BEST,
             default=int(
@@ -499,23 +478,6 @@ def get_peak_price_schema(options: Mapping[str, Any]) -> vol.Schema:
                 mode=NumberSelectorMode.SLIDER,
             ),
         ),
-        vol.Optional(
-            CONF_RELAXATION_STEP_PEAK,
-            default=int(
-                options.get(
-                    CONF_RELAXATION_STEP_PEAK,
-                    DEFAULT_RELAXATION_STEP_PEAK,
-                )
-            ),
-        ): NumberSelector(
-            NumberSelectorConfig(
-                min=5,
-                max=50,
-                step=5,
-                unit_of_measurement="%",
-                mode=NumberSelectorMode.SLIDER,
-            ),
-        ),
         vol.Optional(
             CONF_RELAXATION_ATTEMPTS_PEAK,
             default=int(

const.py

@@ -39,11 +39,9 @@ CONF_BEST_PRICE_MAX_LEVEL_GAP_COUNT = "best_price_max_level_gap_count"
 CONF_PEAK_PRICE_MAX_LEVEL_GAP_COUNT = "peak_price_max_level_gap_count"
 CONF_ENABLE_MIN_PERIODS_BEST = "enable_min_periods_best"
 CONF_MIN_PERIODS_BEST = "min_periods_best"
-CONF_RELAXATION_STEP_BEST = "relaxation_step_best"
 CONF_RELAXATION_ATTEMPTS_BEST = "relaxation_attempts_best"
 CONF_ENABLE_MIN_PERIODS_PEAK = "enable_min_periods_peak"
 CONF_MIN_PERIODS_PEAK = "min_periods_peak"
-CONF_RELAXATION_STEP_PEAK = "relaxation_step_peak"
 CONF_RELAXATION_ATTEMPTS_PEAK = "relaxation_attempts_peak"
 CONF_CHART_DATA_CONFIG = "chart_data_config"  # YAML config for chart data export
@@ -52,14 +50,14 @@ ATTRIBUTION = "Data provided by Tibber"
 # Integration name should match manifest.json
 DEFAULT_NAME = "Tibber Price Information & Ratings"
 DEFAULT_EXTENDED_DESCRIPTIONS = False
-DEFAULT_BEST_PRICE_FLEX = 15  # 15% flexibility for best price (user-facing, percent)
+DEFAULT_BEST_PRICE_FLEX = 15  # 15% base flexibility - optimal for relaxation mode (default enabled)
-# Peak price flexibility is set to -20 (20%) to allow for more adaptive detection of expensive periods.
+# Peak price flexibility is set to -20% (20% base flexibility - optimal for relaxation mode).
 # This is intentionally more flexible than best price (15%) because peak price periods can be more variable,
 # and users may benefit from earlier warnings about expensive periods, even if they are less sharply defined.
 # The negative sign indicates that the threshold is set below the MAX price
 # (e.g., -20% means MAX * 0.8), not above the average price.
 # A higher percentage allows for more conservative detection, reducing false negatives for peak price warnings.
-DEFAULT_PEAK_PRICE_FLEX = -20  # 20% flexibility for peak price (user-facing, percent)
+DEFAULT_PEAK_PRICE_FLEX = -20  # 20% base flexibility (user-facing, percent)
 DEFAULT_BEST_PRICE_MIN_DISTANCE_FROM_AVG = 5  # 5% minimum distance from daily average (ensures significance)
 DEFAULT_PEAK_PRICE_MIN_DISTANCE_FROM_AVG = 5  # 5% minimum distance from daily average (ensures significance)
 DEFAULT_BEST_PRICE_MIN_PERIOD_LENGTH = 60  # 60 minutes minimum period length for best price (user-facing, minutes)
@@ -85,12 +83,10 @@ DEFAULT_PEAK_PRICE_MAX_LEVEL_GAP_COUNT = 1  # Default: allow 1 level gap for pea
 MIN_INTERVALS_FOR_GAP_TOLERANCE = 6  # Minimum period length (in 15-min intervals = 1.5h) required for gap tolerance
 DEFAULT_ENABLE_MIN_PERIODS_BEST = True  # Default: minimum periods feature enabled for best price
 DEFAULT_MIN_PERIODS_BEST = 2  # Default: require at least 2 best price periods (when enabled)
-DEFAULT_RELAXATION_STEP_BEST = 25  # Default: 25% of original threshold per relaxation step for best price
-DEFAULT_RELAXATION_ATTEMPTS_BEST = 8  # Default: try 8 flex levels during relaxation (best price)
+DEFAULT_RELAXATION_ATTEMPTS_BEST = 11  # Default: 11 steps allows escalation from 15% to 48% (3% increment per step)
 DEFAULT_ENABLE_MIN_PERIODS_PEAK = True  # Default: minimum periods feature enabled for peak price
 DEFAULT_MIN_PERIODS_PEAK = 2  # Default: require at least 2 peak price periods (when enabled)
-DEFAULT_RELAXATION_STEP_PEAK = 25  # Default: 25% of original threshold per relaxation step for peak price
-DEFAULT_RELAXATION_ATTEMPTS_PEAK = 8  # Default: try 8 flex levels during relaxation (peak price)
+DEFAULT_RELAXATION_ATTEMPTS_PEAK = 11  # Default: 11 steps allows escalation from 20% to 50% (3% increment per step)
 # Home types
 HOME_TYPE_APARTMENT = "APARTMENT"


@@ -616,7 +616,6 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
                 "max_level_gap_count": self.config_entry.options.get(_const.CONF_BEST_PRICE_MAX_LEVEL_GAP_COUNT, 0),
                 "enable_min_periods": self.config_entry.options.get(_const.CONF_ENABLE_MIN_PERIODS_BEST, False),
                 "min_periods": self.config_entry.options.get(_const.CONF_MIN_PERIODS_BEST, 2),
-                "relaxation_step": self.config_entry.options.get(_const.CONF_RELAXATION_STEP_BEST, 5.0),
                 "relaxation_attempts": self.config_entry.options.get(_const.CONF_RELAXATION_ATTEMPTS_BEST, 4),
             },
             "peak_price_config": {
"peak_price_config": { "peak_price_config": {
@@ -629,7 +628,6 @@ class TibberPricesDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
                 "max_level_gap_count": self.config_entry.options.get(_const.CONF_PEAK_PRICE_MAX_LEVEL_GAP_COUNT, 0),
                 "enable_min_periods": self.config_entry.options.get(_const.CONF_ENABLE_MIN_PERIODS_PEAK, False),
                 "min_periods": self.config_entry.options.get(_const.CONF_MIN_PERIODS_PEAK, 2),
-                "relaxation_step": self.config_entry.options.get(_const.CONF_RELAXATION_STEP_PEAK, 5.0),
                 "relaxation_attempts": self.config_entry.options.get(_const.CONF_RELAXATION_ATTEMPTS_PEAK, 4),
             },
         }


@@ -89,7 +89,6 @@ class DataTransformer:
             "max_level_gap_count": self.config_entry.options.get(_const.CONF_BEST_PRICE_MAX_LEVEL_GAP_COUNT, 0),
             "enable_min_periods": self.config_entry.options.get(_const.CONF_ENABLE_MIN_PERIODS_BEST, False),
             "min_periods": self.config_entry.options.get(_const.CONF_MIN_PERIODS_BEST, 2),
-            "relaxation_step": self.config_entry.options.get(_const.CONF_RELAXATION_STEP_BEST, 5.0),
             "relaxation_attempts": self.config_entry.options.get(_const.CONF_RELAXATION_ATTEMPTS_BEST, 4),
         },
         "peak_price_config": {
"peak_price_config": { "peak_price_config": {
@@ -102,7 +101,6 @@ class DataTransformer:
             "max_level_gap_count": self.config_entry.options.get(_const.CONF_PEAK_PRICE_MAX_LEVEL_GAP_COUNT, 0),
             "enable_min_periods": self.config_entry.options.get(_const.CONF_ENABLE_MIN_PERIODS_PEAK, False),
             "min_periods": self.config_entry.options.get(_const.CONF_MIN_PERIODS_PEAK, 2),
-            "relaxation_step": self.config_entry.options.get(_const.CONF_RELAXATION_STEP_PEAK, 5.0),
             "relaxation_attempts": self.config_entry.options.get(_const.CONF_RELAXATION_ATTEMPTS_PEAK, 4),
         },
     }

coordinator/period_handlers/__init__.py

@@ -6,7 +6,7 @@ This package splits period calculation logic into focused modules:
 - level_filtering: Interval-level filtering logic
 - period_building: Period construction from intervals
 - period_statistics: Statistics calculation
-- period_merging: Overlap resolution and merging
+- period_overlap: Overlap resolution logic
 - relaxation: Per-day relaxation strategy
 - core: Main API orchestration
 - outlier_filtering: Price spike detection and smoothing

coordinator/period_handlers/core.py

@@ -20,14 +20,15 @@ from .period_building import (
     filter_periods_by_min_length,
     split_intervals_by_day,
 )
-from .period_merging import (
-    merge_adjacent_periods_at_midnight,
-)
 from .period_statistics import (
     extract_period_summaries,
 )
 from .types import ThresholdConfig

+# Flex limits to prevent degenerate behavior (see docs/development/period-calculation-theory.md)
+MAX_SAFE_FLEX = 0.50  # 50% - hard cap: above this, period detection becomes unreliable
+MAX_OUTLIER_FLEX = 0.25  # 25% - cap for outlier filtering: above this, spike detection too permissive
+

 def calculate_periods(
     all_prices: list[dict],
@@ -46,8 +47,9 @@ def calculate_periods(
     2. Calculate reference prices (min/max per day)
     3. Build periods based on criteria
     4. Filter by minimum length
-    5. Merge adjacent periods at midnight
-    6. Extract period summaries (start/end times, not full price data)
+    5. Add interval ends
+    6. Filter periods by end date
+    7. Extract period summaries (start/end times, not full price data)

     Args:
         all_prices: All price data points from yesterday/today/tomorrow
@@ -62,14 +64,33 @@ def calculate_periods(
         - reference_data: Daily min/max/avg for on-demand annotation
     """
+    # Import logger at the start of function
+    import logging  # noqa: PLC0415
+
+    from .types import INDENT_L0  # noqa: PLC0415
+
+    _LOGGER = logging.getLogger(__name__)  # noqa: N806
+
     # Extract config values
     reverse_sort = config.reverse_sort
-    flex = config.flex
+    flex_raw = config.flex
     min_distance_from_avg = config.min_distance_from_avg
     min_period_length = config.min_period_length
     threshold_low = config.threshold_low
     threshold_high = config.threshold_high
+
+    # CRITICAL: Hard cap flex at 50% to prevent degenerate behavior
+    # Above 50%, period detection becomes unreliable (too many intervals qualify)
+    flex = flex_raw
+    if abs(flex_raw) > MAX_SAFE_FLEX:
+        flex = MAX_SAFE_FLEX if flex_raw > 0 else -MAX_SAFE_FLEX
+        _LOGGER.warning(
+            "Flex %.1f%% exceeds maximum safe value! Capping at %.0f%%. "
+            "Recommendation: Use 15-20%% with relaxation enabled, or 25-35%% without relaxation.",
+            abs(flex_raw) * 100,
+            MAX_SAFE_FLEX * 100,
+        )

     if not all_prices:
         return {
             "periods": [],
@@ -100,9 +121,23 @@ def calculate_periods(
     # Step 2.5: Filter price outliers (smoothing for period formation only)
     # This runs BEFORE period formation to prevent isolated price spikes
     # from breaking up otherwise continuous periods
+    # CRITICAL: Cap flexibility for outlier filtering at 25%
+    # High flex (>25%) makes outlier detection too permissive, accepting
+    # unstable price contexts as "normal". This breaks period formation.
+    # User's flex setting still applies to period criteria (in_flex check).
+    outlier_flex = min(abs(flex) * 100, MAX_OUTLIER_FLEX * 100)
+    if abs(flex) * 100 > MAX_OUTLIER_FLEX * 100:
+        _LOGGER.debug(
+            "%sOutlier filtering: Using capped flex %.1f%% (user setting: %.1f%%)",
+            INDENT_L0,
+            outlier_flex,
+            abs(flex) * 100,
+        )
     all_prices_smoothed = filter_price_outliers(
         all_prices_sorted,
-        abs(flex) * 100,  # Convert to percentage (e.g., 0.15 → 15.0)
+        outlier_flex,  # Use capped flex for outlier detection
         min_period_length,
     )
@@ -122,16 +157,27 @@ def calculate_periods(
                 time=time,
             )

+    _LOGGER.debug(
+        "%sAfter build_periods: %d raw periods found (flex=%.1f%%, level_filter=%s)",
+        INDENT_L0,
+        len(raw_periods),
+        abs(flex) * 100,
+        config.level_filter or "None",
+    )
+
     # Step 4: Filter by minimum length
     raw_periods = filter_periods_by_min_length(raw_periods, min_period_length, time=time)

+    _LOGGER.debug(
+        "%sAfter filter_by_min_length (>= %d min): %d periods remain",
+        INDENT_L0,
+        min_period_length,
+        len(raw_periods),
+    )
+
-    # Step 5: Merge adjacent periods at midnight
-    raw_periods = merge_adjacent_periods_at_midnight(raw_periods, time=time)
-
-    # Step 6: Add interval ends
+    # Step 5: Add interval ends
     add_interval_ends(raw_periods, time=time)

-    # Step 7: Filter periods by end date (keep periods ending today or later)
+    # Step 6: Filter periods by end date (keep periods ending today or later)
     raw_periods = filter_periods_by_end_date(raw_periods, time=time)

     # Step 8: Extract lightweight period summaries (no full price data)

coordinator/period_handlers/level_filtering.py

@@ -1,4 +1,13 @@
-"""Interval-level filtering logic for period calculation."""
+"""
+Interval-level filtering logic for period calculation.
+
+Key Concepts:
+- Flex Filter: Limits price distance from daily min/max
+- Min Distance Filter: Ensures prices are significantly different from average
+- Dynamic Scaling: Min_Distance reduces at high Flex to prevent conflicts
+
+See docs/development/period-calculation-theory.md for detailed explanation.
+"""

 from __future__ import annotations

@@ -9,6 +18,13 @@ if TYPE_CHECKING:

 from custom_components.tibber_prices.const import PRICE_LEVEL_MAPPING

+# Module-local log indentation (each module starts at level 0)
+INDENT_L0 = ""  # Entry point / main function
+
+# Flex threshold for min_distance scaling
+FLEX_SCALING_THRESHOLD = 0.20  # 20% - start adjusting min_distance
+SCALE_FACTOR_WARNING_THRESHOLD = 0.8  # Log when reduction > 20%
+

 def check_level_with_gap_tolerance(
     interval_level: int,
@@ -107,14 +123,47 @@ def check_interval_criteria(
     # Check if interval qualifies for the period
     in_flex = percent_diff >= criteria.flex * 100 if criteria.reverse_sort else percent_diff <= criteria.flex * 100

-    # Minimum distance from average
+    # CRITICAL: Adjust min_distance dynamically based on flex to prevent conflicts
+    # Problem: High flex (e.g., 50%) can conflict with fixed min_distance (e.g., 5%)
+    # Solution: When flex is high, reduce min_distance requirement proportionally
+    #
+    # At low flex (≤20%), use full min_distance (e.g., 5%)
+    # At high flex (≥40%), reduce min_distance to avoid over-filtering
+    # Linear interpolation between 20-40% flex range
+    adjusted_min_distance = criteria.min_distance_from_avg
+    flex_abs = abs(criteria.flex)
+    if flex_abs > FLEX_SCALING_THRESHOLD:
+        # Scale down min_distance as flex increases
+        # At 20% flex: multiplier = 1.0 (full min_distance)
+        # At 40% flex: multiplier = 0.5 (half min_distance)
+        # At 50% flex: multiplier = 0.25 (quarter min_distance)
+        flex_excess = flex_abs - 0.20  # How much above 20%
+        scale_factor = max(0.25, 1.0 - (flex_excess * 2.5))  # Linear reduction, min 25%
+        adjusted_min_distance = criteria.min_distance_from_avg * scale_factor
+
+        # Log adjustment at DEBUG level (only when significant reduction)
+        if scale_factor < SCALE_FACTOR_WARNING_THRESHOLD:
+            import logging  # noqa: PLC0415
+
+            _LOGGER = logging.getLogger(__name__)  # noqa: N806
+            _LOGGER.debug(
+                "High flex %.1f%% detected: Reducing min_distance %.1f%% → %.1f%% (scale %.2f)",
+                flex_abs * 100,
+                criteria.min_distance_from_avg,
+                adjusted_min_distance,
+                scale_factor,
+            )
+
+    # Minimum distance from average (using adjusted value)
     if criteria.reverse_sort:
-        # Peak price: must be at least min_distance_from_avg% above average
-        min_distance_threshold = criteria.avg_price * (1 + criteria.min_distance_from_avg / 100)
+        # Peak price: must be at least adjusted_min_distance% above average
+        min_distance_threshold = criteria.avg_price * (1 + adjusted_min_distance / 100)
         meets_min_distance = price >= min_distance_threshold
     else:
-        # Best price: must be at least min_distance_from_avg% below average
-        min_distance_threshold = criteria.avg_price * (1 - criteria.min_distance_from_avg / 100)
+        # Best price: must be at least adjusted_min_distance% below average
+        min_distance_threshold = criteria.avg_price * (1 - adjusted_min_distance / 100)
         meets_min_distance = price <= min_distance_threshold

     return in_flex, meets_min_distance

coordinator/period_handlers/period_building.py

@@ -53,7 +53,7 @@ def calculate_reference_prices(intervals_by_day: dict[date, list[dict]], *, reve
     return ref_prices

-def build_periods(  # noqa: PLR0913, PLR0915 - Complex period building logic requires many arguments and statements
+def build_periods(  # noqa: PLR0913, PLR0915, PLR0912 - Complex period building logic requires many arguments, statements, and branches
     all_prices: list[dict],
     price_context: dict[str, Any],
     *,

@@ -65,9 +65,9 @@ def build_periods(
     """
     Build periods, allowing periods to cross midnight (day boundary).

-    Periods are built day-by-day, comparing each interval to its own day's reference.
-    When a day boundary is crossed, the current period is ended.
-    Adjacent periods at midnight are merged in a later step.
+    Periods can span multiple days. Each period uses the reference price (min/max) from
+    the day when the period started, ensuring consistent filtering criteria throughout
+    the period even when crossing midnight.

     Args:
         all_prices: All price data points
@@ -105,10 +105,12 @@ def build_periods(
     periods: list[list[dict]] = []
     current_period: list[dict] = []
-    last_ref_date: date | None = None
+    period_start_date: date | None = None  # Track start day of current period
     consecutive_gaps = 0  # Track consecutive intervals that deviate by 1 level step
     intervals_checked = 0
     intervals_filtered_by_level = 0
+    intervals_filtered_by_flex = 0
+    intervals_filtered_by_min_distance = 0

     for price_data in all_prices:
         starts_at = time.get_interval_time(price_data)
@@ -123,16 +125,26 @@ def build_periods(  # noqa: PLR0913, PLR0915 - Complex period building logic req
         intervals_checked += 1
-        # Check flex and minimum distance criteria (using smoothed price)
+        # Use reference price from period start day (for consistency across midnight)
+        # If no period active, use current interval's day
+        ref_date = period_start_date if period_start_date is not None else date_key
+        # Check flex and minimum distance criteria (using smoothed price and period start date reference)
         criteria = IntervalCriteria(
-            ref_price=ref_prices[date_key],
-            avg_price=avg_prices[date_key],
+            ref_price=ref_prices[ref_date],
+            avg_price=avg_prices[ref_date],
             flex=flex,
             min_distance_from_avg=min_distance_from_avg,
             reverse_sort=reverse_sort,
         )
         in_flex, meets_min_distance = check_interval_criteria(price_for_criteria, criteria)
+        # Track why intervals are filtered
+        if not in_flex:
+            intervals_filtered_by_flex += 1
+        if not meets_min_distance:
+            intervals_filtered_by_min_distance += 1
         # If this interval was smoothed, check if smoothing actually made a difference
         smoothing_was_impactful = False
         if price_data.get("_smoothed", False):
@@ -150,16 +162,12 @@ def build_periods(  # noqa: PLR0913, PLR0915 - Complex period building logic req
             if not meets_level:
                 intervals_filtered_by_level += 1
-        # Split period if day changes
-        if last_ref_date is not None and date_key != last_ref_date and current_period:
-            periods.append(current_period)
-            current_period = []
-            consecutive_gaps = 0  # Reset gap counter on day boundary
-        last_ref_date = date_key
         # Add to period if all criteria are met
         if in_flex and meets_min_distance and meets_level:
+            # Start new period if none active
+            if not current_period:
+                period_start_date = date_key  # Lock reference to start day
             current_period.append(
                 {
                     "interval_hour": starts_at.hour,
@@ -176,21 +184,46 @@ def build_periods(  # noqa: PLR0913, PLR0915 - Complex period building logic req
             # Criteria no longer met, end current period
             periods.append(current_period)
             current_period = []
+            period_start_date = None  # Reset period start date
             consecutive_gaps = 0  # Reset gap counter
     # Add final period if exists
     if current_period:
         periods.append(current_period)
-    # Log summary
-    if level_filter_active and intervals_checked > 0:
-        filtered_pct = (intervals_filtered_by_level / intervals_checked) * 100
-        _LOGGER.debug(
-            "%sLevel filter summary: %d/%d intervals filtered (%.1f%%)",
-            INDENT_L0,
-            intervals_filtered_by_level,
-            intervals_checked,
-            filtered_pct,
-        )
+    # Log detailed filter statistics
+    if intervals_checked > 0:
+        _LOGGER.debug(
+            "%sFilter statistics: %d intervals checked",
+            INDENT_L0,
+            intervals_checked,
+        )
+        if intervals_filtered_by_flex > 0:
+            flex_pct = (intervals_filtered_by_flex / intervals_checked) * 100
+            _LOGGER.debug(
+                "%s Filtered by FLEX (price too far from ref): %d/%d (%.1f%%)",
+                INDENT_L0,
+                intervals_filtered_by_flex,
+                intervals_checked,
+                flex_pct,
+            )
+        if intervals_filtered_by_min_distance > 0:
+            distance_pct = (intervals_filtered_by_min_distance / intervals_checked) * 100
+            _LOGGER.debug(
+                "%s Filtered by MIN_DISTANCE (price too close to avg): %d/%d (%.1f%%)",
+                INDENT_L0,
+                intervals_filtered_by_min_distance,
+                intervals_checked,
+                distance_pct,
+            )
+        if level_filter_active and intervals_filtered_by_level > 0:
+            level_pct = (intervals_filtered_by_level / intervals_checked) * 100
+            _LOGGER.debug(
+                "%s Filtered by LEVEL (wrong price level): %d/%d (%.1f%%)",
+                INDENT_L0,
+                intervals_filtered_by_level,
+                intervals_checked,
+                level_pct,
+            )
     return periods


@@ -1,385 +0,0 @@
"""Period merging and overlap resolution logic."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from datetime import datetime
from custom_components.tibber_prices.coordinator.time_service import TimeService
_LOGGER = logging.getLogger(__name__)
# Module-local log indentation (each module starts at level 0)
INDENT_L0 = "" # Entry point / main function
INDENT_L1 = " " # Nested logic / loop iterations
INDENT_L2 = " " # Deeper nesting
def merge_adjacent_periods_at_midnight(periods: list[list[dict]], *, time: TimeService) -> list[list[dict]]:
"""
Merge adjacent periods that meet at midnight.
When two periods are detected separately for consecutive days but are directly
adjacent at midnight (15 minutes apart), merge them into a single period.
"""
if not periods:
return periods
merged = []
i = 0
while i < len(periods):
current_period = periods[i]
# Check if there's a next period and if they meet at midnight
if i + 1 < len(periods):
next_period = periods[i + 1]
last_start = current_period[-1].get("interval_start")
next_start = next_period[0].get("interval_start")
if last_start and next_start:
time_diff = next_start - last_start
last_date = last_start.date()
next_date = next_start.date()
# If they are one interval apart and on different days (crossing midnight)
if time_diff == time.get_interval_duration() and next_date > last_date:
# Merge the two periods
merged_period = current_period + next_period
merged.append(merged_period)
i += 2 # Skip both periods as we've merged them
continue
# If no merge happened, just add the current period
merged.append(current_period)
i += 1
return merged
def recalculate_period_metadata(periods: list[dict], *, time: TimeService) -> None:
"""
Recalculate period metadata after merging periods.
Updates period_position, periods_total, and periods_remaining for all periods
based on chronological order.
This must be called after resolve_period_overlaps() to ensure metadata
reflects the final merged period list.
Args:
periods: List of period summary dicts (mutated in-place)
time: TimeService instance (required)
"""
if not periods:
return
# Sort periods chronologically by start time
periods.sort(key=lambda p: p.get("start") or time.now())
# Update metadata for all periods
total_periods = len(periods)
for position, period in enumerate(periods, 1):
period["period_position"] = position
period["periods_total"] = total_periods
period["periods_remaining"] = total_periods - position
def split_period_by_overlaps(
period_start: datetime,
period_end: datetime,
overlaps: list[tuple[datetime, datetime]],
) -> list[tuple[datetime, datetime]]:
"""
Split a time period into segments that don't overlap with given ranges.
Args:
period_start: Start of period to split
period_end: End of period to split
overlaps: List of (start, end) tuples representing overlapping ranges
Returns:
List of (start, end) tuples for non-overlapping segments
Example:
period: 09:00-15:00
overlaps: [(10:00-12:00), (14:00-16:00)]
result: [(09:00-10:00), (12:00-14:00)]
"""
# Sort overlaps by start time
sorted_overlaps = sorted(overlaps, key=lambda x: x[0])
segments = []
current_pos = period_start
for overlap_start, overlap_end in sorted_overlaps:
# Add segment before this overlap (if any)
if current_pos < overlap_start:
segments.append((current_pos, overlap_start))
# Move position past this overlap
current_pos = max(current_pos, overlap_end)
# Add final segment after all overlaps (if any)
if current_pos < period_end:
segments.append((current_pos, period_end))
return segments
def resolve_period_overlaps( # noqa: PLR0912, PLR0915, C901 - Complex overlap resolution with replacement and extension logic
existing_periods: list[dict],
new_relaxed_periods: list[dict],
min_period_length: int,
baseline_periods: list[dict] | None = None,
) -> tuple[list[dict], int]:
"""
Resolve overlaps between existing periods and newly found relaxed periods.
Existing periods (baseline + previous relaxation phases) have priority and remain unchanged.
Newly relaxed periods are adjusted to not overlap with existing periods.
After splitting relaxed periods to avoid overlaps, each segment is validated against
min_period_length. Segments shorter than this threshold are discarded.
This function is called incrementally after each relaxation phase:
- Phase 1: existing = accumulated, baseline = baseline
- Phase 2: existing = accumulated, baseline = baseline
- Phase 3: existing = accumulated, baseline = baseline
Args:
existing_periods: All previously found periods (baseline + earlier relaxation phases)
new_relaxed_periods: Periods found in current relaxation phase (will be adjusted)
min_period_length: Minimum period length in minutes (segments shorter than this are discarded)
baseline_periods: Original baseline periods (for extension detection). Extensions only count
against baseline, not against other relaxation periods.
Returns:
Tuple of (merged_periods, count_standalone_relaxed):
- merged_periods: All periods (existing + adjusted new), sorted by start time
- count_standalone_relaxed: Number of new relaxed periods that count toward min_periods
(excludes extensions of baseline periods only)
"""
if baseline_periods is None:
baseline_periods = existing_periods # Fallback to existing if not provided
_LOGGER.debug(
"%sresolve_period_overlaps called: existing=%d, new=%d, baseline=%d",
INDENT_L0,
len(existing_periods),
len(new_relaxed_periods),
len(baseline_periods),
)
if not new_relaxed_periods:
return existing_periods.copy(), 0
if not existing_periods:
# No overlaps possible - all relaxed periods are standalone
return new_relaxed_periods.copy(), len(new_relaxed_periods)
merged = existing_periods.copy()
count_standalone = 0
for relaxed in new_relaxed_periods:
# Skip if this exact period is already in existing_periods (duplicate from previous relaxation attempt)
# Compare current start/end (before any splitting), not original_start/end
# Note: original_start/end are set AFTER splitting and indicate split segments from same source
relaxed_start = relaxed["start"]
relaxed_end = relaxed["end"]
is_duplicate = False
for existing in existing_periods:
# Only compare with existing periods that haven't been adjusted (unsplit originals)
# If existing has original_start/end, it's already a split segment - skip comparison
if "original_start" in existing:
continue
existing_start = existing["start"]
existing_end = existing["end"]
# Duplicate if same boundaries (within 1 minute tolerance)
tolerance_seconds = 60 # 1 minute tolerance for duplicate detection
if (
abs((relaxed_start - existing_start).total_seconds()) < tolerance_seconds
and abs((relaxed_end - existing_end).total_seconds()) < tolerance_seconds
):
is_duplicate = True
_LOGGER.debug(
"%sSkipping duplicate period %s-%s (already exists from previous relaxation)",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
break
if is_duplicate:
continue
# Find all overlapping existing periods
overlaps = []
for existing in existing_periods:
existing_start = existing["start"]
existing_end = existing["end"]
# Check for overlap
if relaxed_start < existing_end and relaxed_end > existing_start:
overlaps.append((existing_start, existing_end))
if not overlaps:
# No overlap - check if adjacent to baseline period (= extension)
# Only baseline extensions don't count toward min_periods
is_extension = False
for baseline in baseline_periods:
if relaxed_end == baseline["start"] or relaxed_start == baseline["end"]:
is_extension = True
break
if is_extension:
relaxed["is_extension"] = True
_LOGGER.debug(
"%sMarking period %s-%s as extension (no overlap, adjacent to baseline)",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
else:
count_standalone += 1
merged.append(relaxed)
else:
# Has overlaps - check if this new period extends BASELINE periods
# Extension = new period encompasses/extends baseline period(s)
# Note: If new period encompasses OTHER RELAXED periods, that's a replacement, not extension!
is_extension = False
periods_to_replace = []
for existing in existing_periods:
existing_start = existing["start"]
existing_end = existing["end"]
# Check if new period completely encompasses existing period
if relaxed_start <= existing_start and relaxed_end >= existing_end:
# Is this existing period a BASELINE period?
is_baseline = any(
bp["start"] == existing_start and bp["end"] == existing_end for bp in baseline_periods
)
if is_baseline:
# Extension of baseline → counts as extension
is_extension = True
_LOGGER.debug(
"%sNew period %s-%s extends BASELINE period %s-%s",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
existing_start.strftime("%H:%M"),
existing_end.strftime("%H:%M"),
)
else:
# Encompasses another relaxed period → REPLACEMENT, not extension
periods_to_replace.append(existing)
_LOGGER.debug(
"%sNew period %s-%s replaces relaxed period %s-%s (larger is better)",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
existing_start.strftime("%H:%M"),
existing_end.strftime("%H:%M"),
)
# Remove periods that are being replaced by this larger period
if periods_to_replace:
for period_to_remove in periods_to_replace:
if period_to_remove in merged:
merged.remove(period_to_remove)
_LOGGER.debug(
"%sReplaced period %s-%s with larger period %s-%s",
INDENT_L2,
period_to_remove["start"].strftime("%H:%M"),
period_to_remove["end"].strftime("%H:%M"),
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
# Split the relaxed period into non-overlapping segments
segments = split_period_by_overlaps(relaxed_start, relaxed_end, overlaps)
# If no segments (completely overlapped), but we replaced periods, add the full period
if not segments and periods_to_replace:
_LOGGER.debug(
"%sAdding full replacement period %s-%s (no non-overlapping segments)",
INDENT_L2,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
# Mark as extension if it extends baseline, otherwise standalone
if is_extension:
relaxed["is_extension"] = True
merged.append(relaxed)
continue
for seg_start, seg_end in segments:
# Calculate segment duration in minutes
segment_duration_minutes = int((seg_end - seg_start).total_seconds() / 60)
# Skip segment if it's too short
if segment_duration_minutes < min_period_length:
continue
# Create adjusted period segment
adjusted_period = relaxed.copy()
adjusted_period["start"] = seg_start
adjusted_period["end"] = seg_end
adjusted_period["duration_minutes"] = segment_duration_minutes
# Mark as adjusted and potentially as extension
adjusted_period["adjusted_for_overlap"] = True
adjusted_period["original_start"] = relaxed_start
adjusted_period["original_end"] = relaxed_end
# If the original period was an extension, all its segments are extensions too
# OR if segment is adjacent to baseline
segment_is_extension = is_extension
if not segment_is_extension:
# Check if segment is directly adjacent to BASELINE period
for baseline in baseline_periods:
if seg_end == baseline["start"] or seg_start == baseline["end"]:
segment_is_extension = True
break
if segment_is_extension:
adjusted_period["is_extension"] = True
_LOGGER.debug(
"%sMarking segment %s-%s as extension (original was extension or adjacent to baseline)",
INDENT_L2,
seg_start.strftime("%H:%M"),
seg_end.strftime("%H:%M"),
)
else:
# Standalone segment counts toward min_periods
count_standalone += 1
merged.append(adjusted_period)
# Sort all periods by start time
merged.sort(key=lambda p: p["start"])
# Count ACTUAL standalone periods in final merged list (not just newly added ones)
# This accounts for replacements where old standalone was replaced by new standalone
final_standalone_count = len([p for p in merged if not p.get("is_extension")])
# Subtract baseline standalone count to get NEW standalone from this relaxation
baseline_standalone_count = len([p for p in baseline_periods if not p.get("is_extension")])
new_standalone_count = final_standalone_count - baseline_standalone_count
return merged, new_standalone_count
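Although `split_period_by_overlaps()` is removed in this change, its docstring example is easy to sanity-check. A condensed, self-contained re-creation of its core loop (the helper name and fixed day here are only for this sketch):

```python
from datetime import datetime


def split_by_overlaps(period_start, period_end, overlaps):
    """Core loop of the removed split_period_by_overlaps(): keep the gaps between overlaps."""
    segments = []
    current_pos = period_start
    for overlap_start, overlap_end in sorted(overlaps, key=lambda x: x[0]):
        if current_pos < overlap_start:
            segments.append((current_pos, overlap_start))
        current_pos = max(current_pos, overlap_end)
    if current_pos < period_end:
        segments.append((current_pos, period_end))
    return segments


def t(hour):  # helper: a datetime on one fixed day
    return datetime(2025, 1, 1, hour)


# Docstring example: 09:00-15:00 minus (10:00-12:00) and (14:00-16:00)
segs = split_by_overlaps(t(9), t(15), [(t(10), t(12)), (t(14), t(16))])
print([(s.strftime("%H:%M"), e.strftime("%H:%M")) for s, e in segs])
# -> [('09:00', '10:00'), ('12:00', '14:00')]
```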


@@ -0,0 +1,241 @@
"""Period overlap resolution logic."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from custom_components.tibber_prices.coordinator.time_service import TimeService
_LOGGER = logging.getLogger(__name__)
# Module-local log indentation (each module starts at level 0)
INDENT_L0 = "" # Entry point / main function
INDENT_L1 = " " # Nested logic / loop iterations
INDENT_L2 = " " # Deeper nesting
def recalculate_period_metadata(periods: list[dict], *, time: TimeService) -> None:
"""
Recalculate period metadata after merging periods.
Updates period_position, periods_total, and periods_remaining for all periods
based on chronological order.
This must be called after resolve_period_overlaps() to ensure metadata
reflects the final merged period list.
Args:
periods: List of period summary dicts (mutated in-place)
time: TimeService instance (required)
"""
if not periods:
return
# Sort periods chronologically by start time
periods.sort(key=lambda p: p.get("start") or time.now())
# Update metadata for all periods
total_periods = len(periods)
for position, period in enumerate(periods, 1):
period["period_position"] = position
period["periods_total"] = total_periods
period["periods_remaining"] = total_periods - position
def merge_adjacent_periods(period1: dict, period2: dict) -> dict:
"""
Merge two adjacent or overlapping periods into one.
The newer period's relaxation attributes override the older period's.
Takes the earliest start time and latest end time.
Relaxation attributes from the newer period (period2) override those from period1:
- relaxation_active
- relaxation_level
- relaxation_threshold_original_%
- relaxation_threshold_applied_%
- period_interval_level_gap_count
- period_interval_smoothed_count
Args:
period1: First period (older baseline or relaxed period)
period2: Second period (newer relaxed period with higher flex)
Returns:
Merged period dict with combined time span and newer period's attributes
"""
# Take earliest start and latest end
merged_start = min(period1["start"], period2["start"])
merged_end = max(period1["end"], period2["end"])
merged_duration = int((merged_end - merged_start).total_seconds() / 60)
# Start with period1 as base
merged = period1.copy()
# Update time boundaries
merged["start"] = merged_start
merged["end"] = merged_end
merged["duration_minutes"] = merged_duration
# Override with period2's relaxation attributes (newer/higher flex wins)
relaxation_attrs = [
"relaxation_active",
"relaxation_level",
"relaxation_threshold_original_%",
"relaxation_threshold_applied_%",
"period_interval_level_gap_count",
"period_interval_smoothed_count",
]
for attr in relaxation_attrs:
if attr in period2:
merged[attr] = period2[attr]
# Mark as merged (for debugging)
merged["merged_from"] = {
"period1_start": period1["start"].isoformat(),
"period1_end": period1["end"].isoformat(),
"period2_start": period2["start"].isoformat(),
"period2_end": period2["end"].isoformat(),
}
_LOGGER.debug(
"%sMerged periods: %s-%s + %s-%s%s-%s (duration: %d min)",
INDENT_L2,
period1["start"].strftime("%H:%M"),
period1["end"].strftime("%H:%M"),
period2["start"].strftime("%H:%M"),
period2["end"].strftime("%H:%M"),
merged_start.strftime("%H:%M"),
merged_end.strftime("%H:%M"),
merged_duration,
)
return merged
def resolve_period_overlaps(
existing_periods: list[dict],
new_relaxed_periods: list[dict],
) -> tuple[list[dict], int]:
"""
Resolve overlaps between existing periods and newly found relaxed periods.
Adjacent or overlapping periods are merged into single continuous periods.
The newer period's relaxation attributes override the older period's.
This function is called incrementally after each relaxation phase:
- Phase 1: existing = baseline, new = first relaxation
- Phase 2: existing = baseline + phase 1, new = second relaxation
- Phase 3: existing = baseline + phase 1 + phase 2, new = third relaxation
Args:
existing_periods: All previously found periods (baseline + earlier relaxation phases)
new_relaxed_periods: Periods found in current relaxation phase (will be merged if adjacent)
Returns:
Tuple of (merged_periods, new_periods_count):
- merged_periods: All periods after merging, sorted by start time
- new_periods_count: Number of new periods added (some may have been merged)
"""
_LOGGER.debug(
"%sresolve_period_overlaps called: existing=%d, new=%d",
INDENT_L0,
len(existing_periods),
len(new_relaxed_periods),
)
if not new_relaxed_periods:
return existing_periods.copy(), 0
if not existing_periods:
# No existing periods - return all new periods
return new_relaxed_periods.copy(), len(new_relaxed_periods)
merged = existing_periods.copy()
periods_added = 0
for relaxed in new_relaxed_periods:
relaxed_start = relaxed["start"]
relaxed_end = relaxed["end"]
# Check if this period is duplicate (exact match within tolerance)
tolerance_seconds = 60 # 1 minute tolerance
is_duplicate = False
for existing in merged:
if (
abs((relaxed_start - existing["start"]).total_seconds()) < tolerance_seconds
and abs((relaxed_end - existing["end"]).total_seconds()) < tolerance_seconds
):
is_duplicate = True
_LOGGER.debug(
"%sSkipping duplicate period %s-%s (already exists)",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
break
if is_duplicate:
continue
# Find periods that are adjacent or overlapping (should be merged)
periods_to_merge = []
for idx, existing in enumerate(merged):
existing_start = existing["start"]
existing_end = existing["end"]
# Check if adjacent (no gap) or overlapping
is_adjacent = relaxed_end == existing_start or relaxed_start == existing_end
is_overlapping = relaxed_start < existing_end and relaxed_end > existing_start
if is_adjacent or is_overlapping:
periods_to_merge.append((idx, existing))
_LOGGER.debug(
"%sPeriod %s-%s %s with existing period %s-%s",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
"overlaps" if is_overlapping else "is adjacent to",
existing_start.strftime("%H:%M"),
existing_end.strftime("%H:%M"),
)
if not periods_to_merge:
# No merge needed - add as new period
merged.append(relaxed)
periods_added += 1
_LOGGER.debug(
"%sAdded new period %s-%s (no overlap/adjacency)",
INDENT_L1,
relaxed_start.strftime("%H:%M"),
relaxed_end.strftime("%H:%M"),
)
else:
# Merge with all adjacent/overlapping periods
# Start with the new relaxed period
merged_period = relaxed.copy()
# Remove old periods (in reverse order to maintain indices)
for idx, existing in reversed(periods_to_merge):
merged_period = merge_adjacent_periods(existing, merged_period)
merged.pop(idx)
# Add the merged result
merged.append(merged_period)
# Count as added if we merged exactly one existing period
# (means we extended/merged, not replaced multiple)
if len(periods_to_merge) == 1:
periods_added += 1
# Sort all periods by start time
merged.sort(key=lambda p: p["start"])
return merged, periods_added
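The merge semantics introduced above (earliest start, latest end, newer period's relaxation attributes win) can be sketched standalone. This is a simplification: here any `relaxation_*` key from the newer period overrides, whereas the real `merge_adjacent_periods()` enumerates specific attribute names:

```python
from datetime import datetime


def merge(p1: dict, p2: dict) -> dict:
    """Condensed merge: widest time span wins, newer period's relaxation_* attrs override."""
    start = min(p1["start"], p2["start"])
    end = max(p1["end"], p2["end"])
    merged = {**p1, **{k: v for k, v in p2.items() if k.startswith("relaxation_")}}
    merged["start"] = start
    merged["end"] = end
    merged["duration_minutes"] = int((end - start).total_seconds() / 60)
    return merged


base = {"start": datetime(2025, 1, 1, 22, 0), "end": datetime(2025, 1, 1, 23, 45),
        "relaxation_active": False}
relaxed = {"start": datetime(2025, 1, 1, 23, 45), "end": datetime(2025, 1, 2, 1, 0),
           "relaxation_active": True, "relaxation_level": 2}
m = merge(base, relaxed)
print(m["duration_minutes"])  # 180 -> one continuous 22:00-01:00 period across midnight
```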


@@ -13,7 +13,7 @@ if TYPE_CHECKING:
 from .types import PeriodConfig
-from .period_merging import (
+from .period_overlap import (
     recalculate_period_metadata,
     resolve_period_overlaps,
 )
@@ -25,6 +25,12 @@ from .types import (
 _LOGGER = logging.getLogger(__name__)
+# Flex thresholds for warnings (see docs/development/period-calculation-theory.md)
+# With relaxation active, high base flex is counterproductive (reduces relaxation effectiveness)
+FLEX_WARNING_THRESHOLD_RELAXATION = 0.25  # 25% - INFO: suggest lowering to 15-20%
+MAX_FLEX_HARD_LIMIT = 0.50  # 50% - hard maximum flex value
+FLEX_HIGH_THRESHOLD_RELAXATION = 0.30  # 30% - WARNING: base flex too high for relaxation mode
 def group_periods_by_day(periods: list[dict]) -> dict[date, list[dict]]:
     """
@@ -120,16 +126,14 @@ def check_min_periods_per_day(
     periods_by_day = group_periods_by_day(periods)
     # Check each day with price data: ALL must have at least min_periods
-    # Only count standalone periods (exclude extensions)
     for day in available_days:
         day_periods = periods_by_day.get(day, [])
-        # Count only standalone periods (not extensions)
-        standalone_count = sum(1 for p in day_periods if not p.get("is_extension"))
-        if standalone_count < min_periods:
+        period_count = len(day_periods)
+        if period_count < min_periods:
             _LOGGER.debug(
-                "Day %s has only %d standalone periods (need %d) - continuing relaxation",
+                "Day %s has only %d periods (need %d) - continuing relaxation",
                 day,
-                standalone_count,
+                period_count,
                 min_periods,
             )
             return False  # This day doesn't have enough, continue relaxation
@@ -170,7 +174,6 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
     config: PeriodConfig,
     enable_relaxation: bool,
     min_periods: int,
-    relaxation_step_pct: int,
     max_relaxation_attempts: int,
     should_show_callback: Callable[[str | None], bool],
     time: TimeService,
@@ -184,7 +187,7 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
     If min_periods is not reached with normal filters, this function gradually
     relaxes filters in multiple phases FOR EACH DAY SEPARATELY:
-    Phase 1: Increase flex threshold step-by-step (up to 4 attempts)
+    Phase 1: Increase flex threshold step-by-step (up to max_relaxation_attempts)
     Phase 2: Disable level filter (set to "any")
     Args:
@@ -192,10 +195,9 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
         config: Base period configuration
         enable_relaxation: Whether relaxation is enabled
         min_periods: Minimum number of periods required PER DAY
-        relaxation_step_pct: Percentage of the original flex threshold to add per relaxation
-            step (controls how aggressively flex widens with each attempt)
         max_relaxation_attempts: Maximum number of flex levels (attempts) to try per day
-            before giving up (each attempt runs the full filter matrix)
+            before giving up (each attempt runs the full filter matrix). With 3% increment
+            per step, 11 attempts allows escalation from 15% to 48% flex.
         should_show_callback: Callback function(level_override) -> bool
             Returns True if periods should be shown with given filter overrides. Pass None
             to use original configured filter values.
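The escalation figures in the updated docstring are plain arithmetic; a quick sketch, assuming the 15% base flex the docstring uses as its example:

```python
BASE_FLEX_PCT = 15    # assumed base flex from the docstring example
STEP_PCT = 3          # hard-coded relaxation increment per attempt
MAX_ATTEMPTS = 11     # new RELAXATION_ATTEMPTS default
HARD_LIMIT_PCT = 50   # MAX_FLEX_HARD_LIMIT expressed in percent

# Flex level tried at each successive relaxation attempt
levels = [BASE_FLEX_PCT + STEP_PCT * n for n in range(1, MAX_ATTEMPTS + 1)]
print(levels[-1])  # 48 -> highest level tried, still under the 50% hard limit
```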
@@ -256,9 +258,8 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
         min_periods,
     )
     _LOGGER.debug(
-        "%sRelaxation strategy: %.1f%% flex increment per step (%d flex levels x 4 filter combinations)",
+        "%sRelaxation strategy: 3%% fixed flex increment per step (%d flex levels x 2 filter combinations)",
         INDENT_L0,
-        relaxation_step_pct,
         max_relaxation_attempts,
     )
     _LOGGER.debug(
@@ -270,10 +271,11 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
         INDENT_L0,
     )
-    # Group prices by day (for both relaxation enabled/disabled)
-    prices_by_day = group_prices_by_day(all_prices, time=time)
-    if not prices_by_day:
+    # Validate we have price data for today/future
+    today = time.now().date()
+    future_prices = [p for p in all_prices if time.get_interval_time(p).date() >= today]
+    if not future_prices:
         # No price data for today/future
         _LOGGER.warning(
             "No price data available for today/future - cannot calculate periods",
@@ -285,103 +287,104 @@ def calculate_periods_with_relaxation(  # noqa: PLR0913, PLR0915 - Per-day relax
             "periods_found": 0,
         }
+    # Count available days for logging
+    prices_by_day = group_prices_by_day(all_prices, time=time)
     total_days = len(prices_by_day)
     _LOGGER.info(
         "Calculating baseline periods for %d days...",
         total_days,
     )
-    # === BASELINE CALCULATION (same for both modes) ===
-    all_periods: list[dict] = []
-    all_phases_used: list[str] = []
-    relaxation_was_needed = False
-    days_meeting_requirement = 0
-    for day, day_prices in sorted(prices_by_day.items()):
-        _LOGGER.debug(
-            "%sProcessing day %s with %d price intervals",
-            INDENT_L1,
-            day,
-            len(day_prices),
-        )
-        # Calculate baseline periods for this day
-        day_result = calculate_periods(day_prices, config=config, time=time)
-        day_periods = day_result["periods"]
-        standalone_count = len([p for p in day_periods if not p.get("is_extension")])
-        _LOGGER.debug(
-            "%sDay %s baseline: Found %d standalone periods%s",
-            INDENT_L1,
-            day,
-            standalone_count,
-            f" (need {min_periods})" if enable_relaxation else "",
-        )
-        # Check if relaxation is needed for this day
-        if not enable_relaxation or standalone_count >= min_periods:
-            # No relaxation needed/possible - use baseline
-            if enable_relaxation:
-                _LOGGER.debug(
-                    "%sDay %s: Target reached with baseline - no relaxation needed",
-                    INDENT_L1,
-                    day,
-                )
-            all_periods.extend(day_periods)
-            days_meeting_requirement += 1
-            continue
-        # === RELAXATION PATH (only when enabled AND needed) ===
-        _LOGGER.debug(
-            "%sDay %s: Baseline insufficient - starting relaxation",
-            INDENT_L1,
-            day,
-        )
-        relaxation_was_needed = True
-        # Run full relaxation for this specific day
-        day_relaxed_result, day_metadata = relax_single_day(
-            day_prices=day_prices,
-            config=config,
-            min_periods=min_periods,
-            relaxation_step_pct=relaxation_step_pct,
-            max_relaxation_attempts=max_relaxation_attempts,
-            should_show_callback=should_show_callback,
-            baseline_periods=day_periods,
-            day_label=str(day),
-            time=time,
-        )
-        all_periods.extend(day_relaxed_result["periods"])
-        if day_metadata.get("phases_used"):
-            all_phases_used.extend(day_metadata["phases_used"])
-        # Check if this day met the requirement after relaxation
-        day_standalone = len([p for p in day_relaxed_result["periods"] if not p.get("is_extension")])
-        if day_standalone >= min_periods:
-            days_meeting_requirement += 1
+    _LOGGER.debug(
+        "%sProcessing ALL %d price intervals together (allows midnight crossing)",
+        INDENT_L1,
+        len(future_prices),
+    )
+    # === BASELINE CALCULATION (process ALL prices together) ===
+    baseline_result = calculate_periods(future_prices, config=config, time=time)
+    all_periods = baseline_result["periods"]
+    # Count periods per day for min_periods check
+    periods_by_day = group_periods_by_day(all_periods)
+    days_meeting_requirement = 0
+    for day in sorted(prices_by_day.keys()):
+        day_periods = periods_by_day.get(day, [])
+        period_count = len(day_periods)
+        _LOGGER.debug(
+            "%sDay %s baseline: Found %d periods%s",
+            INDENT_L1,
+            day,
+            period_count,
+            f" (need {min_periods})" if enable_relaxation else "",
+        )
+        if period_count >= min_periods:
+            days_meeting_requirement += 1
+    # Check if relaxation is needed
+    relaxation_was_needed = False
+    all_phases_used: list[str] = []
+    if enable_relaxation and days_meeting_requirement < total_days:
+        # At least one day doesn't have enough periods
+        _LOGGER.debug(
+            "%sBaseline insufficient (%d/%d days met target) - starting relaxation",
+            INDENT_L1,
+            days_meeting_requirement,
+            total_days,
+        )
+        relaxation_was_needed = True
+        # Run relaxation on ALL prices together
+        relaxed_result, relax_metadata = relax_all_prices(
+            all_prices=future_prices,
+            config=config,
+            min_periods=min_periods,
+            max_relaxation_attempts=max_relaxation_attempts,
+            should_show_callback=should_show_callback,
+            baseline_periods=all_periods,
+            time=time,
+        )
+        all_periods = relaxed_result["periods"]
+        if relax_metadata.get("phases_used"):
+            all_phases_used = relax_metadata["phases_used"]
+        # Recount after relaxation
+        periods_by_day = group_periods_by_day(all_periods)
+        days_meeting_requirement = 0
+        for day in sorted(prices_by_day.keys()):
+            day_periods = periods_by_day.get(day, [])
+            period_count = len(day_periods)
+            if period_count >= min_periods:
+                days_meeting_requirement += 1
+    elif enable_relaxation:
+        _LOGGER.debug(
+            "%sAll %d days met target with baseline - no relaxation needed",
+            INDENT_L1,
+            total_days,
+        )
-    # Sort all periods by start time
+    # Sort periods by start time
     all_periods.sort(key=lambda p: p["start"])
     # Recalculate metadata for combined periods
     recalculate_period_metadata(all_periods, time=time)
# Build combined result # Build final result
if all_periods: final_result = baseline_result.copy()
# Use the last day's result as template
final_result = day_result.copy()
final_result["periods"] = all_periods final_result["periods"] = all_periods
else:
final_result = {"periods": [], "metadata": {}, "reference_data": {}}
total_standalone = len([p for p in all_periods if not p.get("is_extension")]) total_periods = len(all_periods)
return final_result, { return final_result, {
"relaxation_active": relaxation_was_needed, "relaxation_active": relaxation_was_needed,
"relaxation_attempted": relaxation_was_needed, "relaxation_attempted": relaxation_was_needed,
"min_periods_requested": min_periods, "min_periods_requested": min_periods,
"periods_found": total_standalone, "periods_found": total_periods,
"phases_used": list(set(all_phases_used)), # Unique phases used across all days "phases_used": list(set(all_phases_used)), # Unique phases used across all days
"days_processed": total_days, "days_processed": total_days,
"days_meeting_requirement": days_meeting_requirement, "days_meeting_requirement": days_meeting_requirement,
@ -389,45 +392,34 @@ def calculate_periods_with_relaxation( # noqa: PLR0913, PLR0915 - Per-day relax
} }
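The commit head introduces a `merge_adjacent_periods()` helper in the new `period_overlap.py`, but the helper itself is outside this hunk. A minimal sketch of the documented behavior (adjacent or overlapping periods collapse into one continuous period, and the newer period's relaxation attributes override the older one's); the `start`/`end` keys match the period dicts used above, while the `gap_tolerance` parameter is an assumption, not the actual signature:

```python
from datetime import datetime, timedelta


def merge_adjacent_periods(periods: list[dict], gap_tolerance: timedelta = timedelta(0)) -> list[dict]:
    """Merge overlapping/adjacent periods into continuous periods.

    Sketch only: newer periods (later in sorted order) override the
    attributes of the periods they merge into, per the commit head.
    """
    if not periods:
        return []
    ordered = sorted(periods, key=lambda p: p["start"])
    merged = [dict(ordered[0])]
    for period in ordered[1:]:
        last = merged[-1]
        if period["start"] <= last["end"] + gap_tolerance:
            # Adjacent or overlapping: extend the window...
            last["end"] = max(last["end"], period["end"])
            # ...and let the newer period's non-boundary attributes win.
            for key, value in period.items():
                if key not in ("start", "end"):
                    last[key] = value
        else:
            merged.append(dict(period))
    return merged
```

With this shape, two back-to-back one-hour periods become a single two-hour period carrying the second period's relaxation attributes, which is the "one period = one continuous time window" contract the automations rely on.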
def relax_all_prices(  # noqa: PLR0913 - Comprehensive filter relaxation requires many parameters and statements
    all_prices: list[dict],
    config: PeriodConfig,
    min_periods: int,
    max_relaxation_attempts: int,
    should_show_callback: Callable[[str | None], bool],
    baseline_periods: list[dict],
    *,
    time: TimeService,
) -> tuple[dict[str, Any], dict[str, Any]]:
    """
    Relax filters for all prices until min_periods per day is reached.

    Strategy: Try increasing flex by 3% increments, then relax level filter.
    Processes all prices together, allowing periods to cross midnight boundaries.
    Returns when ALL days have min_periods (or max attempts exhausted).

    Args:
        all_prices: All price intervals (today + future)
        config: Base period configuration
        min_periods: Target number of periods PER DAY
        max_relaxation_attempts: Maximum flex levels to try
        should_show_callback: Callback to check if a flex level should be shown
        baseline_periods: Baseline periods (before relaxation)
        time: TimeService instance

    Returns:
        Tuple of (result_dict, metadata_dict)

    """
    # Import here to avoid circular dependency
@@ -435,128 +427,124 @@ def relax_single_day( # noqa: PLR0913 - Comprehensive filter relaxation per day
        calculate_periods,
    )

    flex_increment = 0.03  # 3% per step (hard-coded for reliability)
    base_flex = abs(config.flex)
    original_level_filter = config.level_filter
    existing_periods = list(baseline_periods)  # Start with baseline
    phases_used = []

    # Get available days from prices for checking
    prices_by_day = group_prices_by_day(all_prices, time=time)
    total_days = len(prices_by_day)

    # Try flex levels (3% increments)
    attempts = max(1, int(max_relaxation_attempts))
    for attempt in range(1, attempts + 1):
        current_flex = base_flex + (attempt * flex_increment)

        # Stop if we exceed hard maximum
        if current_flex > MAX_FLEX_HARD_LIMIT:
            _LOGGER.debug(
                "%s Reached 50%% flex hard limit",
                INDENT_L2,
            )
            break

        phase_label = f"flex={current_flex * 100:.1f}%"

        # Skip this flex level if callback says not to show it
        if not should_show_callback(phase_label):
            continue

        # Try current flex with level="any" (in relaxation mode)
        if original_level_filter != "any":
            _LOGGER.debug(
                "%s Flex=%.1f%%: OVERRIDING level_filter: %s → ANY",
                INDENT_L2,
                current_flex * 100,
                original_level_filter,
            )
        relaxed_config = config._replace(
            flex=current_flex if config.flex >= 0 else -current_flex,
            level_filter="any",
        )
        phase_label_full = f"flex={current_flex * 100:.1f}% +level_any"
        _LOGGER.debug(
            "%s Trying %s: config has %d intervals (all days together), level_filter=%s",
            INDENT_L2,
            phase_label_full,
            len(all_prices),
            relaxed_config.level_filter,
        )

        # Process ALL prices together (allows midnight crossing)
        result = calculate_periods(all_prices, config=relaxed_config, time=time)
        new_periods = result["periods"]
        _LOGGER.debug(
            "%s %s: calculate_periods returned %d periods",
            INDENT_L2,
            phase_label_full,
            len(new_periods),
        )

        # Resolve overlaps between existing and new periods
        combined, standalone_count = resolve_period_overlaps(
            existing_periods=existing_periods,
            new_relaxed_periods=new_periods,
        )

        # Count periods per day to check if requirement met
        periods_by_day = group_periods_by_day(combined)
        days_meeting_requirement = 0
        for day in sorted(prices_by_day.keys()):
            day_periods = periods_by_day.get(day, [])
            period_count = len(day_periods)
            if period_count >= min_periods:
                days_meeting_requirement += 1
            _LOGGER.debug(
                "%s Day %s: %d periods%s",
                INDENT_L2,
                day,
                period_count,
                "" if period_count >= min_periods else f" (need {min_periods})",
            )

        total_periods = len(combined)
        _LOGGER.debug(
            "%s %s: found %d periods total, %d/%d days meet requirement",
            INDENT_L2,
            phase_label_full,
            total_periods,
            days_meeting_requirement,
            total_days,
        )

        existing_periods = combined
        phases_used.append(phase_label_full)

        # Check if ALL days reached target
        if days_meeting_requirement >= total_days:
            _LOGGER.info(
                "Success with %s - all %d days have %d+ periods (%d total)",
                phase_label_full,
                total_days,
                min_periods,
                total_periods,
            )
            break

    # Build final result
    final_result = (
        result.copy() if "result" in locals() else {"periods": baseline_periods, "metadata": {}, "reference_data": {}}
    )
    final_result["periods"] = existing_periods
    return final_result, {
        "phases_used": phases_used,
        "periods_found": len(existing_periods),
    }
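The attempt loop above can be traced concretely. A small sketch of the flex schedule it produces, with the constants copied from the code (the `flex_schedule` helper is illustrative, not part of the integration):

```python
MAX_FLEX_HARD_LIMIT = 0.50  # hard cap, as in relaxation.py
FLEX_INCREMENT = 0.03       # hard-coded 3% per step


def flex_schedule(base_flex: float, attempts: int = 11) -> list[float]:
    """Return the flex levels tried, in order, until the hard limit stops the loop."""
    levels = []
    for attempt in range(1, max(1, attempts) + 1):
        current = base_flex + attempt * FLEX_INCREMENT
        if current > MAX_FLEX_HARD_LIMIT:
            break  # mirrors the early break at the 50% cap above
        levels.append(round(current, 4))
    return levels
```

With the new default of 11 attempts, a 5% base flex walks 8%, 11%, ... up to 38%; a 45% base flex tries only 48% before hitting the 50% cap.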


@@ -518,7 +518,7 @@ class PeriodCalculator:
            reverse_sort=reverse_sort,
        )

    def calculate_periods_for_price_info(
        self,
        price_info: dict[str, Any],
    ) -> dict[str, Any]:
@@ -585,10 +585,6 @@ class PeriodCalculator:
            _const.CONF_MIN_PERIODS_BEST,
            _const.DEFAULT_MIN_PERIODS_BEST,
        )
        relaxation_attempts_best = self.config_entry.options.get(
            _const.CONF_RELAXATION_ATTEMPTS_BEST,
            _const.DEFAULT_RELAXATION_ATTEMPTS_BEST,
@@ -624,7 +620,6 @@ class PeriodCalculator:
            config=best_period_config,
            enable_relaxation=enable_relaxation_best,
            min_periods=min_periods_best,
            max_relaxation_attempts=relaxation_attempts_best,
            should_show_callback=lambda lvl: self.should_show_periods(
                price_info,
@@ -658,10 +653,6 @@ class PeriodCalculator:
            _const.CONF_MIN_PERIODS_PEAK,
            _const.DEFAULT_MIN_PERIODS_PEAK,
        )
        relaxation_attempts_peak = self.config_entry.options.get(
            _const.CONF_RELAXATION_ATTEMPTS_PEAK,
            _const.DEFAULT_RELAXATION_ATTEMPTS_PEAK,
@@ -697,7 +688,6 @@ class PeriodCalculator:
            config=peak_period_config,
            enable_relaxation=enable_relaxation_peak,
            min_periods=min_periods_peak,
            max_relaxation_attempts=relaxation_attempts_peak,
            should_show_callback=lambda lvl: self.should_show_periods(
                price_info,


@@ -111,7 +111,6 @@
          "best_price_max_level_gap_count": "Lückentoleranz für Niveaufilter",
          "enable_min_periods_best": "Mindestanzahl Zeiträume anstreben",
          "min_periods_best": "Mindestanzahl Zeiträume",
          "relaxation_attempts_best": "Lockerungsversuche (Flex-Stufen)"
        },
        "data_description": {
@@ -119,7 +118,6 @@
          "best_price_max_level_gap_count": "Maximale Anzahl aufeinanderfolgender Intervalle, die exakt um eine Niveaustufe vom geforderten Level abweichen dürfen. Beispiel: Bei Filter 'Günstig' und Lückentoleranz 1 wird die Sequenz 'GÜNSTIG, GÜNSTIG, NORMAL, GÜNSTIG' akzeptiert (NORMAL ist eine Stufe über GÜNSTIG). Dies verhindert, dass Zeiträume durch gelegentliche Niveau-Abweichungen aufgespalten werden. Standard: 1.",
          "enable_min_periods_best": "Wenn aktiviert, werden Filter schrittweise gelockert, falls nicht genug Zeiträume gefunden wurden. Dies versucht die gewünschte Mindestanzahl zu erreichen, was dazu führen kann, dass auch weniger optimale Zeiträume als Bestpreis-Zeiträume markiert werden.",
          "min_periods_best": "Mindestanzahl an Bestpreis-Zeiträumen, die pro Tag angestrebt werden. Filter werden schrittweise gelockert, um diese Anzahl zu erreichen. Nur aktiv, wenn 'Mindestanzahl Zeiträume anstreben' aktiviert ist. Standard: 1",
          "relaxation_attempts_best": "Wie viele Flex-Stufen (Versuche) nacheinander ausprobiert werden, bevor aufgegeben wird. Jeder Versuch testet alle Filterkombinationen auf der neuen Flex-Stufe. Mehr Versuche erhöhen die Chance auf zusätzliche Zeiträume, benötigen aber etwas mehr Rechenzeit."
        },
        "submit": "Weiter zu Schritt 5"
@@ -135,7 +133,6 @@
          "peak_price_max_level_gap_count": "Lückentoleranz für Niveaufilter",
          "enable_min_periods_peak": "Mindestanzahl Zeiträume anstreben",
          "min_periods_peak": "Mindestanzahl Zeiträume",
          "relaxation_attempts_peak": "Lockerungsversuche (Flex-Stufen)"
        },
        "data_description": {
@@ -143,7 +140,6 @@
          "peak_price_max_level_gap_count": "Maximale Anzahl aufeinanderfolgender Intervalle, die exakt um eine Niveaustufe vom geforderten Level abweichen dürfen. Beispiel: Bei Filter 'Teuer' und Lückentoleranz 2 wird die Sequenz 'TEUER, NORMAL, NORMAL, TEUER' akzeptiert (NORMAL ist eine Stufe unter TEUER). Dies verhindert, dass Zeiträume durch gelegentliche Niveau-Abweichungen aufgespalten werden. Standard: 0.",
          "enable_min_periods_peak": "Wenn aktiviert, werden Filter schrittweise gelockert, falls nicht genug Zeiträume gefunden wurden. Dies versucht die gewünschte Mindestanzahl zu erreichen, um sicherzustellen, dass du auch an Tagen mit ungewöhnlichen Preismustern vor teuren Zeiträumen gewarnt wirst.",
          "min_periods_peak": "Mindestanzahl an Spitzenpreis-Zeiträumen, die pro Tag angestrebt werden. Filter werden schrittweise gelockert, um diese Anzahl zu erreichen. Nur aktiv, wenn 'Mindestanzahl Zeiträume anstreben' aktiviert ist. Standard: 1",
          "relaxation_attempts_peak": "Wie viele Flex-Stufen (Versuche) nacheinander ausprobiert werden, bevor aufgegeben wird. Jeder Versuch testet alle Filterkombinationen auf der neuen Flex-Stufe. Mehr Versuche erhöhen die Chance auf zusätzliche Spitzenpreis-Zeiträume, benötigen aber etwas mehr Rechenzeit."
        },
        "submit": "Weiter zu Schritt 6"


@@ -111,7 +111,6 @@
          "best_price_max_level_gap_count": "Level Filter Gap Tolerance",
          "enable_min_periods_best": "Try to Achieve Minimum Period Count",
          "min_periods_best": "Minimum Periods Required",
          "relaxation_attempts_best": "Relaxation Attempts"
        },
        "data_description": {
@@ -119,7 +118,6 @@
          "best_price_max_level_gap_count": "Maximum number of consecutive intervals allowed that deviate by exactly one level step from the required level. For example: with 'Cheap' filter and gap count 1, a sequence 'CHEAP, CHEAP, NORMAL, CHEAP' is accepted (NORMAL is one step above CHEAP). This prevents periods from being split by occasional level deviations. Default: 0 (strict filtering, no tolerance).",
          "enable_min_periods_best": "When enabled, filters will be gradually relaxed if not enough periods are found. This attempts to reach the desired minimum number of periods, which may include less optimal time windows as best-price periods.",
          "min_periods_best": "Minimum number of best price periods to aim for per day. Filters will be relaxed step-by-step to try achieving this count. Only active when 'Try to Achieve Minimum Period Count' is enabled. Default: 1",
          "relaxation_attempts_best": "How many flex levels (attempts) to try before giving up. Each attempt runs all filter combinations at the new flex level. More attempts increase the chance of finding additional periods at the cost of longer processing time."
        },
        "submit": "Next to Step 5"
@@ -135,7 +133,6 @@
          "peak_price_max_level_gap_count": "Level Filter Gap Tolerance",
          "enable_min_periods_peak": "Try to Achieve Minimum Period Count",
          "min_periods_peak": "Minimum Periods Required",
          "relaxation_attempts_peak": "Relaxation Attempts"
        },
        "data_description": {
@@ -143,7 +140,6 @@
          "peak_price_max_level_gap_count": "Maximum number of consecutive intervals allowed that deviate by exactly one level step from the required level. For example: with 'Expensive' filter and gap count 2, a sequence 'EXPENSIVE, NORMAL, NORMAL, EXPENSIVE' is accepted (NORMAL is one step below EXPENSIVE). This prevents periods from being split by occasional level deviations. Default: 0 (strict filtering, no tolerance).",
          "enable_min_periods_peak": "When enabled, filters will be gradually relaxed if not enough periods are found. This attempts to reach the desired minimum number of periods to ensure you're warned about expensive periods even on days with unusual price patterns.",
          "min_periods_peak": "Minimum number of peak price periods to aim for per day. Filters will be relaxed step-by-step to try achieving this count. Only active when 'Try to Achieve Minimum Period Count' is enabled. Default: 1",
          "relaxation_attempts_peak": "How many flex levels (attempts) to try before giving up. Each attempt runs all filter combinations at the new flex level. More attempts increase the chance of finding additional peak periods at the cost of longer processing time."
        },
        "submit": "Next to Step 6"


@@ -111,14 +111,12 @@
          "best_price_max_level_gap_count": "Gaptoleranse for nivåfilter",
          "enable_min_periods_best": "Prøv å oppnå minimum antall perioder",
          "min_periods_best": "Minimum antall perioder",
          "relaxation_attempts_best": "Antall forsøk (fleksnivåer)"
        },
        "data_description": {
          "best_price_max_level": "Vis kun beste prisperioder hvis de inneholder intervaller med prisnivåer ≤ valgt verdi. For eksempel: å velge 'Billig' betyr at perioden må ha minst ett 'VELDIG_BILLIG' eller 'BILLIG' intervall. Dette sikrer at 'beste pris'-perioder ikke bare er relativt billige for dagen, men faktisk billige i absolutte tall. Velg 'Alle' for å vise beste priser uavhengig av deres absolutte prisnivå.",
          "enable_min_periods_best": "Når aktivert vil filtrene gradvis bli lempeligere hvis det ikke blir funnet nok perioder. Dette forsøker å nå ønsket minimum antall perioder, noe som kan føre til at mindre optimale tidsrom blir markert som beste-pris-perioder.",
          "min_periods_best": "Minimum antall beste-pris-perioder å sikte mot per dag. Filtre vil bli lempet trinn for trinn for å prøve å oppnå dette antallet. Kun aktiv når 'Prøv å oppnå minimum antall perioder' er aktivert. Standard: 1",
          "relaxation_attempts_best": "Hvor mange fleksnivåer (forsøk) som testes før vi gir opp. Hvert forsøk kjører alle filterkombinasjoner på det nye fleksnivået. Flere forsøk øker sjansen for ekstra perioder, men tar litt lengre tid.",
          "best_price_max_level_gap_count": "Maksimalt antall påfølgende intervaller som kan avvike med nøyaktig ett nivåtrinn fra det nødvendige nivået. For eksempel: med 'Billig' filter og gapantall 1, aksepteres sekvensen 'BILLIG, BILLIG, NORMAL, BILLIG' (NORMAL er ett trinn over BILLIG). Dette forhindrer at perioder blir delt opp av tilfeldige nivåavvik. Standard: 0 (streng filtrering, ingen toleranse)."
        },
@@ -135,14 +133,12 @@
"peak_price_max_level_gap_count": "Gaptoleranse for nivåfilter", "peak_price_max_level_gap_count": "Gaptoleranse for nivåfilter",
"enable_min_periods_peak": "Prøv å oppnå minimum antall perioder", "enable_min_periods_peak": "Prøv å oppnå minimum antall perioder",
"min_periods_peak": "Minimum antall perioder", "min_periods_peak": "Minimum antall perioder",
"relaxation_step_peak": "Avslappingstrinn",
"relaxation_attempts_peak": "Antall forsøk (fleksnivåer)" "relaxation_attempts_peak": "Antall forsøk (fleksnivåer)"
}, },
"data_description": { "data_description": {
"peak_price_min_level": "Vis kun topprisperioder hvis de inneholder intervaller med prisnivåer ≥ valgt verdi. For eksempel: å velge 'Dyr' betyr at perioden må ha minst ett 'DYR' eller 'VELDIG_DYR' intervall. Dette sikrer at 'topppris'-perioder ikke bare er relativt dyre for dagen, men faktisk dyre i absolutte tall. Velg 'Alle' for å vise topppriser uavhengig av deres absolutte prisnivå.", "peak_price_min_level": "Vis kun topprisperioder hvis de inneholder intervaller med prisnivåer ≥ valgt verdi. For eksempel: å velge 'Dyr' betyr at perioden må ha minst ett 'DYR' eller 'VELDIG_DYR' intervall. Dette sikrer at 'topppris'-perioder ikke bare er relativt dyre for dagen, men faktisk dyre i absolutte tall. Velg 'Alle' for å vise topppriser uavhengig av deres absolutte prisnivå.",
"enable_min_periods_peak": "Når aktivert vil filtrene gradvis bli lempeligere hvis det ikke blir funnet nok perioder. Dette forsøker å nå ønsket minimum antall perioder for å sikre at du blir advart om dyre perioder selv på dager med uvanlige prismønstre.", "enable_min_periods_peak": "Når aktivert vil filtrene gradvis bli lempeligere hvis det ikke blir funnet nok perioder. Dette forsøker å nå ønsket minimum antall perioder for å sikre at du blir advart om dyre perioder selv på dager med uvanlige prismønstre.",
"min_periods_peak": "Minimum antall topp-pris-perioder å sikte mot per dag. Filtre vil bli lempet trinn for trinn for å prøve å oppnå dette antallet. Kun aktiv når 'Prøv å oppnå minimum antall perioder' er aktivert. Standard: 1", "min_periods_peak": "Minimum antall topp-pris-perioder å sikte mot per dag. Filtre vil bli lempet trinn for trinn for å prøve å oppnå dette antallet. Kun aktiv når 'Prøv å oppnå minimum antall perioder' er aktivert. Standard: 1",
"relaxation_step_peak": "Prosentandel av den opprinnelige fleksibilitetsterskealen som legges til per avslappingstrinn. For eksempel: med -15% fleksibilitet og 25% trinnstørrelse vil filtrene prøve -15%, -18,75%, -22,5%, osv. Høyere verdier betyr raskere avslapping men mindre presisjon.",
"relaxation_attempts_peak": "Hvor mange fleksnivåer (forsøk) som testes før vi gir opp. Hvert forsøk kjører alle filterkombinasjoner på det nye fleksnivået. Flere forsøk øker sjansen for ekstra toppprisperioder, men tar litt lengre tid.", "relaxation_attempts_peak": "Hvor mange fleksnivåer (forsøk) som testes før vi gir opp. Hvert forsøk kjører alle filterkombinasjoner på det nye fleksnivået. Flere forsøk øker sjansen for ekstra toppprisperioder, men tar litt lengre tid.",
"peak_price_max_level_gap_count": "Maksimalt antall påfølgende intervaller som kan avvike med nøyaktig ett nivåtrinn fra det nødvendige nivået. For eksempel: med 'Dyr' filter og gapantall 2, aksepteres sekvensen 'DYR, NORMAL, NORMAL, DYR' (NORMAL er ett trinn under DYR). Dette forhindrer at perioder blir delt opp av tilfeldige nivåavvik. Standard: 0 (streng filtrering, ingen toleranse)." "peak_price_max_level_gap_count": "Maksimalt antall påfølgende intervaller som kan avvike med nøyaktig ett nivåtrinn fra det nødvendige nivået. For eksempel: med 'Dyr' filter og gapantall 2, aksepteres sekvensen 'DYR, NORMAL, NORMAL, DYR' (NORMAL er ett trinn under DYR). Dette forhindrer at perioder blir delt opp av tilfeldige nivåavvik. Standard: 0 (streng filtrering, ingen toleranse)."
}, },


@@ -111,14 +111,12 @@
"best_price_max_level_gap_count": "Gaptolerantie voor niveaufilter", "best_price_max_level_gap_count": "Gaptolerantie voor niveaufilter",
"enable_min_periods_best": "Probeer minimum aantal periodes te bereiken", "enable_min_periods_best": "Probeer minimum aantal periodes te bereiken",
"min_periods_best": "Minimum aantal periodes", "min_periods_best": "Minimum aantal periodes",
"relaxation_step_best": "Ontspanningsstap",
"relaxation_attempts_best": "Aantal ontspanningspogingen" "relaxation_attempts_best": "Aantal ontspanningspogingen"
}, },
"data_description": { "data_description": {
"best_price_max_level": "Toon alleen beste prijsperiodes als ze intervallen bevatten met prijsniveaus ≤ geselecteerde waarde. Bijvoorbeeld: selecteren van 'Goedkoop' betekent dat de periode minstens één 'ZEER_GOEDKOOP' of 'GOEDKOOP' interval moet hebben. Dit zorgt ervoor dat 'beste prijs'-periodes niet alleen relatief goedkoop zijn voor de dag, maar daadwerkelijk goedkoop in absolute termen. Selecteer 'Alle' om beste prijzen te tonen ongeacht hun absolute prijsniveau.", "best_price_max_level": "Toon alleen beste prijsperiodes als ze intervallen bevatten met prijsniveaus ≤ geselecteerde waarde. Bijvoorbeeld: selecteren van 'Goedkoop' betekent dat de periode minstens één 'ZEER_GOEDKOOP' of 'GOEDKOOP' interval moet hebben. Dit zorgt ervoor dat 'beste prijs'-periodes niet alleen relatief goedkoop zijn voor de dag, maar daadwerkelijk goedkoop in absolute termen. Selecteer 'Alle' om beste prijzen te tonen ongeacht hun absolute prijsniveau.",
"enable_min_periods_best": "Wanneer ingeschakeld worden filters geleidelijk versoepeld als er niet genoeg periodes worden gevonden. Dit probeert het gewenste minimum aantal periodes te bereiken om ervoor te zorgen dat je kansen hebt om van lage prijzen te profiteren, zelfs op dagen met ongebruikelijke prijspatronen.", "enable_min_periods_best": "Wanneer ingeschakeld worden filters geleidelijk versoepeld als er niet genoeg periodes worden gevonden. Dit probeert het gewenste minimum aantal periodes te bereiken om ervoor te zorgen dat je kansen hebt om van lage prijzen te profiteren, zelfs op dagen met ongebruikelijke prijspatronen.",
"min_periods_best": "Minimum aantal beste prijsperiodes om naar te streven per dag. Filters worden stap voor stap versoepeld om dit aantal te proberen bereiken. Alleen actief wanneer 'Probeer minimum aantal periodes te bereiken' is ingeschakeld. Standaard: 1", "min_periods_best": "Minimum aantal beste prijsperiodes om naar te streven per dag. Filters worden stap voor stap versoepeld om dit aantal te proberen bereiken. Alleen actief wanneer 'Probeer minimum aantal periodes te bereiken' is ingeschakeld. Standaard: 1",
"relaxation_step_best": "Percentage van de oorspronkelijke flexibiliteitsdrempel om toe te voegen per ontspanningsstap. Bijvoorbeeld: met 15% flexibiliteit en 25% stapgrootte zullen de filters 15%, 18,75%, 22,5%, enz. proberen. Hogere waarden betekenen snellere ontspanning maar minder precisie.",
"relaxation_attempts_best": "Hoeveel keer de ontspanningslogica filters opnieuw mag proberen. Gebruik hogere waarden om meer variaties te testen als dagen extreem grillig zijn. Hogere aantallen vergen meer rekentijd maar vergroten de kans dat het gewenste minimum aantal periodes wordt gehaald.", "relaxation_attempts_best": "Hoeveel keer de ontspanningslogica filters opnieuw mag proberen. Gebruik hogere waarden om meer variaties te testen als dagen extreem grillig zijn. Hogere aantallen vergen meer rekentijd maar vergroten de kans dat het gewenste minimum aantal periodes wordt gehaald.",
"best_price_max_level_gap_count": "Maximum aantal opeenvolgende intervallen dat precies één niveaustap mag afwijken van het vereiste niveau. Bijvoorbeeld: met 'Goedkoop' filter en gaptelling 1 wordt de reeks 'GOEDKOOP, GOEDKOOP, NORMAAL, GOEDKOOP' geaccepteerd (NORMAAL is één stap boven GOEDKOOP). Dit voorkomt dat periodes worden opgesplitst door incidentele niveauafwijkingen. Standaard: 0 (strikte filtering, geen tolerantie)." "best_price_max_level_gap_count": "Maximum aantal opeenvolgende intervallen dat precies één niveaustap mag afwijken van het vereiste niveau. Bijvoorbeeld: met 'Goedkoop' filter en gaptelling 1 wordt de reeks 'GOEDKOOP, GOEDKOOP, NORMAAL, GOEDKOOP' geaccepteerd (NORMAAL is één stap boven GOEDKOOP). Dit voorkomt dat periodes worden opgesplitst door incidentele niveauafwijkingen. Standaard: 0 (strikte filtering, geen tolerantie)."
}, },
@@ -135,14 +133,12 @@
"peak_price_max_level_gap_count": "Gaptolerantie voor niveaufilter", "peak_price_max_level_gap_count": "Gaptolerantie voor niveaufilter",
"enable_min_periods_peak": "Probeer minimum aantal periodes te bereiken", "enable_min_periods_peak": "Probeer minimum aantal periodes te bereiken",
"min_periods_peak": "Minimum aantal periodes", "min_periods_peak": "Minimum aantal periodes",
"relaxation_step_peak": "Ontspanningsstap",
"relaxation_attempts_peak": "Aantal ontspanningspogingen" "relaxation_attempts_peak": "Aantal ontspanningspogingen"
}, },
"data_description": { "data_description": {
"peak_price_min_level": "Toon alleen piekprijsperiodes als ze intervallen bevatten met prijsniveaus ≥ geselecteerde waarde. Bijvoorbeeld: selecteren van 'Duur' betekent dat de periode minstens één 'DUUR' of 'ZEER_DUUR' interval moet hebben. Dit zorgt ervoor dat 'piekprijs'-periodes niet alleen relatief duur zijn voor de dag, maar daadwerkelijk duur in absolute termen. Selecteer 'Alle' om piekprijzen te tonen ongeacht hun absolute prijsniveau.", "peak_price_min_level": "Toon alleen piekprijsperiodes als ze intervallen bevatten met prijsniveaus ≥ geselecteerde waarde. Bijvoorbeeld: selecteren van 'Duur' betekent dat de periode minstens één 'DUUR' of 'ZEER_DUUR' interval moet hebben. Dit zorgt ervoor dat 'piekprijs'-periodes niet alleen relatief duur zijn voor de dag, maar daadwerkelijk duur in absolute termen. Selecteer 'Alle' om piekprijzen te tonen ongeacht hun absolute prijsniveau.",
"enable_min_periods_peak": "Wanneer ingeschakeld worden filters geleidelijk versoepeld als er niet genoeg periodes worden gevonden. Dit probeert het gewenste minimum aantal periodes te bereiken om ervoor te zorgen dat je wordt gewaarschuwd voor dure periodes, zelfs op dagen met ongebruikelijke prijspatronen.", "enable_min_periods_peak": "Wanneer ingeschakeld worden filters geleidelijk versoepeld als er niet genoeg periodes worden gevonden. Dit probeert het gewenste minimum aantal periodes te bereiken om ervoor te zorgen dat je wordt gewaarschuwd voor dure periodes, zelfs op dagen met ongebruikelijke prijspatronen.",
"min_periods_peak": "Minimum aantal piekprijsperiodes om naar te streven per dag. Filters worden stap voor stap versoepeld om dit aantal te proberen bereiken. Alleen actief wanneer 'Probeer minimum aantal periodes te bereiken' is ingeschakeld. Standaard: 1", "min_periods_peak": "Minimum aantal piekprijsperiodes om naar te streven per dag. Filters worden stap voor stap versoepeld om dit aantal te proberen bereiken. Alleen actief wanneer 'Probeer minimum aantal periodes te bereiken' is ingeschakeld. Standaard: 1",
"relaxation_step_peak": "Percentage van de oorspronkelijke flexibiliteitsdrempel om toe te voegen per ontspanningsstap. Bijvoorbeeld: met -15% flexibiliteit en 25% stapgrootte zullen de filters -15%, -18,75%, -22,5%, enz. proberen. Hogere waarden betekenen snellere ontspanning maar minder precisie.",
"relaxation_attempts_peak": "Hoeveel keer de ontspanningslogica filters opnieuw mag proberen. Gebruik meer pogingen wanneer de piekperiodes moeilijk te vinden zijn door vlakke of zeer grillige dagen. Elke extra poging kost wat extra verwerkingstijd maar vergroot de kans dat periodes worden gevonden.", "relaxation_attempts_peak": "Hoeveel keer de ontspanningslogica filters opnieuw mag proberen. Gebruik meer pogingen wanneer de piekperiodes moeilijk te vinden zijn door vlakke of zeer grillige dagen. Elke extra poging kost wat extra verwerkingstijd maar vergroot de kans dat periodes worden gevonden.",
"peak_price_max_level_gap_count": "Maximum aantal opeenvolgende intervallen dat precies één niveaustap mag afwijken van het vereiste niveau. Bijvoorbeeld: met 'Duur' filter en gaptelling 2 wordt de reeks 'DUUR, NORMAAL, NORMAAL, DUUR' geaccepteerd (NORMAAL is één stap onder DUUR). Dit voorkomt dat periodes worden opgesplitst door incidentele niveauafwijkingen. Standaard: 0 (strikte filtering, geen tolerantie)." "peak_price_max_level_gap_count": "Maximum aantal opeenvolgende intervallen dat precies één niveaustap mag afwijken van het vereiste niveau. Bijvoorbeeld: met 'Duur' filter en gaptelling 2 wordt de reeks 'DUUR, NORMAAL, NORMAAL, DUUR' geaccepteerd (NORMAAL is één stap onder DUUR). Dit voorkomt dat periodes worden opgesplitst door incidentele niveauafwijkingen. Standaard: 0 (strikte filtering, geen tolerantie)."
}, },


@@ -111,14 +111,12 @@
"best_price_max_level_gap_count": "Gaptolerens för nivåfilter", "best_price_max_level_gap_count": "Gaptolerens för nivåfilter",
"enable_min_periods_best": "Försök uppnå minsta antal perioder", "enable_min_periods_best": "Försök uppnå minsta antal perioder",
"min_periods_best": "Minsta antal perioder", "min_periods_best": "Minsta antal perioder",
"relaxation_step_best": "Avslappningssteg",
"relaxation_attempts_best": "Antal avslappningsförsök" "relaxation_attempts_best": "Antal avslappningsförsök"
}, },
"data_description": { "data_description": {
"best_price_max_level": "Visa endast bästa prisperioder om de innehåller intervall med prisnivåer ≤ valt värde. Till exempel: att välja 'Billigt' betyder att perioden måste ha minst ett 'MYCKET_BILLIGT' eller 'BILLIGT' intervall. Detta säkerställer att 'bästa pris'-perioder inte bara är relativt billiga för dagen, utan faktiskt billiga i absoluta tal. Välj 'Alla' för att visa bästa priser oavsett deras absoluta prisnivå.", "best_price_max_level": "Visa endast bästa prisperioder om de innehåller intervall med prisnivåer ≤ valt värde. Till exempel: att välja 'Billigt' betyder att perioden måste ha minst ett 'MYCKET_BILLIGT' eller 'BILLIGT' intervall. Detta säkerställer att 'bästa pris'-perioder inte bara är relativt billiga för dagen, utan faktiskt billiga i absoluta tal. Välj 'Alla' för att visa bästa priser oavsett deras absoluta prisnivå.",
"enable_min_periods_best": "När aktiverad kommer filtren att gradvis luckras upp om inte tillräckligt många perioder hittas. Detta försöker uppnå det önskade minsta antalet perioder för att säkerställa att du har möjligheter att dra nytta av låga priser även på dagar med ovanliga prismönster.", "enable_min_periods_best": "När aktiverad kommer filtren att gradvis luckras upp om inte tillräckligt många perioder hittas. Detta försöker uppnå det önskade minsta antalet perioder för att säkerställa att du har möjligheter att dra nytta av låga priser även på dagar med ovanliga prismönster.",
"min_periods_best": "Minsta antal bästa prisperioder att sträva efter per dag. Filtren kommer att luckras upp steg för steg för att försöka uppnå detta antal. Endast aktiv när 'Försök uppnå minsta antal perioder' är aktiverad. Standard: 1", "min_periods_best": "Minsta antal bästa prisperioder att sträva efter per dag. Filtren kommer att luckras upp steg för steg för att försöka uppnå detta antal. Endast aktiv när 'Försök uppnå minsta antal perioder' är aktiverad. Standard: 1",
"relaxation_step_best": "Procentandel av den ursprungliga flexibilitetströskeln att lägga till per avslappningssteg. Till exempel: med 15% flexibilitet och 25% stegstorlek kommer filtren att prova 15%, 18,75%, 22,5%, osv. Högre värden innebär snabbare avslappning men mindre precision.",
"relaxation_attempts_best": "Hur många gånger avslappningslogiken får försöka hitta nya kombinationer av flex och filter. Öka detta om dagarna är extrema och du behöver fler försök för att nå minimikravet. Varje extra försök tar lite mer tid men ökar chansen att hitta perioder.", "relaxation_attempts_best": "Hur många gånger avslappningslogiken får försöka hitta nya kombinationer av flex och filter. Öka detta om dagarna är extrema och du behöver fler försök för att nå minimikravet. Varje extra försök tar lite mer tid men ökar chansen att hitta perioder.",
"best_price_max_level_gap_count": "Maximalt antal på varandra följande intervaller som får avvika med exakt ett nivåsteg från det erforderliga nivået. Till exempel: med 'Billigt' filter och gapantal 1 accepteras sekvensen 'BILLIGT, BILLIGT, NORMALT, BILLIGT' (NORMALT är ett steg över BILLIGT). Detta förhindrar att perioder delas upp av tillfälliga nivåavvikelser. Standard: 0 (strikt filtrering, ingen tolerans)." "best_price_max_level_gap_count": "Maximalt antal på varandra följande intervaller som får avvika med exakt ett nivåsteg från det erforderliga nivået. Till exempel: med 'Billigt' filter och gapantal 1 accepteras sekvensen 'BILLIGT, BILLIGT, NORMALT, BILLIGT' (NORMALT är ett steg över BILLIGT). Detta förhindrar att perioder delas upp av tillfälliga nivåavvikelser. Standard: 0 (strikt filtrering, ingen tolerans)."
}, },
@@ -135,14 +133,12 @@
"peak_price_max_level_gap_count": "Gaptolerens för nivåfilter", "peak_price_max_level_gap_count": "Gaptolerens för nivåfilter",
"enable_min_periods_peak": "Försök uppnå minsta antal perioder", "enable_min_periods_peak": "Försök uppnå minsta antal perioder",
"min_periods_peak": "Minsta antal perioder", "min_periods_peak": "Minsta antal perioder",
"relaxation_step_peak": "Avslappningssteg",
"relaxation_attempts_peak": "Antal avslappningsförsök" "relaxation_attempts_peak": "Antal avslappningsförsök"
}, },
"data_description": { "data_description": {
"peak_price_min_level": "Visa endast topprisperioder om de innehåller intervall med prisnivåer ≥ valt värde. Till exempel måste perioden om du väljer 'Dyr' ha minst ett 'DYR' eller 'MYCKET_DYR' intervall. Detta säkerställer att 'toppris'-perioder inte bara är relativt dyra för dagen, utan faktiskt dyra i absoluta termer (inte bara 'lite dyrare än genomsnittet på en billig dag').", "peak_price_min_level": "Visa endast topprisperioder om de innehåller intervall med prisnivåer ≥ valt värde. Till exempel måste perioden om du väljer 'Dyr' ha minst ett 'DYR' eller 'MYCKET_DYR' intervall. Detta säkerställer att 'toppris'-perioder inte bara är relativt dyra för dagen, utan faktiskt dyra i absoluta termer (inte bara 'lite dyrare än genomsnittet på en billig dag').",
"enable_min_periods_peak": "När aktiverad kommer filtren att gradvis luckras upp om inte tillräckligt många perioder hittas. Detta försöker uppnå det önskade minsta antalet perioder för att säkerställa att du blir varnad för dyra perioder även på dagar med ovanliga prismönster.", "enable_min_periods_peak": "När aktiverad kommer filtren att gradvis luckras upp om inte tillräckligt många perioder hittas. Detta försöker uppnå det önskade minsta antalet perioder för att säkerställa att du blir varnad för dyra perioder även på dagar med ovanliga prismönster.",
"min_periods_peak": "Minsta antal topprisperioder att sträva efter per dag. Filtren kommer att luckras upp steg för steg för att försöka uppnå detta antal. Endast aktiv när 'Försök uppnå minsta antal perioder' är aktiverad. Standard: 1", "min_periods_peak": "Minsta antal topprisperioder att sträva efter per dag. Filtren kommer att luckras upp steg för steg för att försöka uppnå detta antal. Endast aktiv när 'Försök uppnå minsta antal perioder' är aktiverad. Standard: 1",
"relaxation_step_peak": "Procentandel av den ursprungliga flexibilitetströskeln att lägga till per avslappningssteg. Till exempel: med -15% flexibilitet och 25% stegstorlek kommer filtren att prova -15%, -18,75%, -22,5%, osv. Högre värden innebär snabbare avslappning men mindre precision.",
"relaxation_attempts_peak": "Hur många gånger avslappningslogiken får försöka hitta nya kombinationer av flex och filter. Öka detta när topperioderna är svåra att hitta på grund av platta eller mycket volatila dagar. Fler försök ger större chans att hitta perioder men kräver lite mer beräkningstid.", "relaxation_attempts_peak": "Hur många gånger avslappningslogiken får försöka hitta nya kombinationer av flex och filter. Öka detta när topperioderna är svåra att hitta på grund av platta eller mycket volatila dagar. Fler försök ger större chans att hitta perioder men kräver lite mer beräkningstid.",
"peak_price_max_level_gap_count": "Maximalt antal på varandra följande intervaller som får avvika med exakt ett nivåsteg från det erforderliga nivået. Till exempel: med 'Dyrt' filter och gapantal 2 accepteras sekvensen 'DYRT, NORMALT, NORMALT, DYRT' (NORMALT är ett steg under DYRT). Detta förhindrar att perioder delas upp av tillfälliga nivåavvikelser. Standard: 0 (strikt filtrering, ingen tolerans)." "peak_price_max_level_gap_count": "Maximalt antal på varandra följande intervaller som får avvika med exakt ett nivåsteg från det erforderliga nivået. Till exempel: med 'Dyrt' filter och gapantal 2 accepteras sekvensen 'DYRT, NORMALT, NORMALT, DYRT' (NORMALT är ett steg under DYRT). Detta förhindrar att perioder delas upp av tillfälliga nivåavvikelser. Standard: 0 (strikt filtrering, ingen tolerans)."
}, },


@@ -6,6 +6,7 @@ This section contains documentation for contributors and maintainers of the Tibb
- **[Setup](setup.md)** - DevContainer, environment setup, and dependencies
- **[Architecture](architecture.md)** - Code structure, patterns, and conventions
- **[Period Calculation Theory](period-calculation-theory.md)** - Mathematical foundations, Flex/Distance interaction, Relaxation strategy
- **[Timer Architecture](timer-architecture.md)** - Timer system, scheduling, coordination (3 independent timers)
- **[Caching Strategy](caching-strategy.md)** - Cache layers, invalidation, debugging
- **[Testing](testing.md)** - How to run tests and write new test cases

@@ -0,0 +1,441 @@
# Period Calculation Theory
## Overview
This document explains the mathematical foundations and design decisions behind the period calculation algorithm, particularly focusing on the interaction between **Flexibility (Flex)**, **Minimum Distance from Average**, and **Relaxation Strategy**.
**Target Audience:** Developers maintaining or extending the period calculation logic.
**Related Files:**
- `coordinator/period_handlers/core.py` - Main calculation entry point
- `coordinator/period_handlers/level_filtering.py` - Flex and distance filtering
- `coordinator/period_handlers/relaxation.py` - Multi-phase relaxation strategy
- `coordinator/periods.py` - Period calculator orchestration
---
## Core Filtering Criteria
Period detection uses **three independent filters** (all must pass):
### 1. Flex Filter (Price Distance from Reference)
**Purpose:** Limit how far prices can deviate from the daily min/max.
**Logic:**
```python
# Best Price: Price must be within flex% ABOVE daily minimum
in_flex = price <= (daily_min + daily_min × flex)
# Peak Price: Price must be within flex% BELOW daily maximum
in_flex = price >= (daily_max - daily_max × flex)
```
**Example (Best Price):**
- Daily Min: 10 ct/kWh
- Flex: 15%
- Acceptance Range: 0 - 11.5 ct/kWh (10 + 10×0.15)
### 2. Min Distance Filter (Distance from Daily Average)
**Purpose:** Ensure periods are **significantly** cheaper/more expensive than average, not just marginally better.
**Logic:**
```python
# Best Price: Price must be at least min_distance% BELOW daily average
meets_distance = price <= (daily_avg × (1 - min_distance/100))
# Peak Price: Price must be at least min_distance% ABOVE daily average
meets_distance = price >= (daily_avg × (1 + min_distance/100))
```
**Example (Best Price):**
- Daily Avg: 15 ct/kWh
- Min Distance: 5%
- Acceptance Range: 0 - 14.25 ct/kWh (15 × 0.95)
### 3. Level Filter (Price Level Classification)
**Purpose:** Restrict periods to specific price classifications (VERY_CHEAP, CHEAP, NORMAL, EXPENSIVE, VERY_EXPENSIVE).
**Logic:** See `level_filtering.py` for gap tolerance details.
---
## The Flex × Min_Distance Conflict
### Problem Statement
**These two filters can conflict when Flex is high!**
#### Scenario: Best Price with Flex=50%, Min_Distance=5%
**Given:**
- Daily Min: 10 ct/kWh
- Daily Avg: 15 ct/kWh
- Daily Max: 20 ct/kWh
**Flex Filter (50%):**
```
Max accepted = 10 + (10 × 0.50) = 15 ct/kWh
```
**Min Distance Filter (5%):**
```
Max accepted = 15 × (1 - 0.05) = 14.25 ct/kWh
```
**Conflict:**
- Interval at 14.8 ct/kWh:
- ✅ Flex: 14.8 ≤ 15 (PASS)
- ❌ Distance: 14.8 > 14.25 (FAIL)
- **Result:** Rejected by Min_Distance even though Flex allows it!
**The Issue:** At high Flex values, Min_Distance becomes the dominant filter and blocks intervals that Flex would permit. This defeats the purpose of having high Flex.
### Mathematical Analysis
**Conflict condition for Best Price:**
```
daily_min × (1 + flex) > daily_avg × (1 - min_distance/100)
```
**Typical values:**
- Min = 10, Avg = 15, Min_Distance = 5%
- Conflict occurs when: `10 × (1 + flex) > 14.25`
- Simplify: `flex > 0.425` (42.5%)
**Below 42.5% Flex:** Both filters contribute meaningfully.
**Above 42.5% Flex:** Min_Distance dominates and blocks intervals.
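The 42.5% break-even point follows directly from solving the conflict condition for flex:

```python
# Solve daily_min * (1 + flex) = daily_avg * (1 - min_distance / 100) for flex
daily_min, daily_avg, min_distance = 10.0, 15.0, 5.0
flex_breakeven = daily_avg * (1 - min_distance / 100) / daily_min - 1
print(round(flex_breakeven, 3))  # 0.425
```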
### Solution: Dynamic Min_Distance Scaling
**Approach:** Reduce Min_Distance proportionally as Flex increases.
**Formula:**
```python
if flex > 0.20: # 20% threshold
flex_excess = flex - 0.20
scale_factor = max(0.25, 1.0 - (flex_excess × 2.5))
adjusted_min_distance = original_min_distance × scale_factor
```
**Scaling Table (Original Min_Distance = 5%):**
| Flex | Scale Factor | Adjusted Min_Distance | Rationale |
|-------|--------------|----------------------|-----------|
| ≤20% | 1.00 | 5.0% | Standard - both filters relevant |
| 25% | 0.88 | 4.4% | Slight reduction |
| 30% | 0.75 | 3.75% | Moderate reduction |
| 40% | 0.50 | 2.5% | Strong reduction - Flex dominates |
| 50% | 0.25 | 1.25% | Minimal distance - Flex decides |
**Why stop at 25% of original?**
- Min_Distance ensures periods are **significantly** different from average
- Even at 1.25%, prevents "flat days" (little price variation) from accepting every interval
- Maintains semantic meaning: "this is a meaningful best/peak price period"
**Implementation:** See `check_interval_criteria()` in `level_filtering.py`
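A runnable sketch of the scaling rule above (constant names are illustrative; the 0.25 floor and 2.5 slope are taken directly from the formula):

```python
FLEX_SCALING_THRESHOLD = 0.20  # below this, min_distance is left unchanged
MIN_SCALE_FACTOR = 0.25        # floor: never below 25% of the original

def scaled_min_distance(flex: float, original_min_distance: float) -> float:
    """Reduce min_distance proportionally once flex exceeds the threshold."""
    if flex <= FLEX_SCALING_THRESHOLD:
        return original_min_distance
    scale_factor = max(MIN_SCALE_FACTOR,
                       1.0 - (flex - FLEX_SCALING_THRESHOLD) * 2.5)
    return original_min_distance * scale_factor

# Reproduces the scaling table (original min_distance = 5%)
for flex in (0.20, 0.25, 0.30, 0.40, 0.50):
    print(f"flex {flex:.0%}: adjusted min_distance {scaled_min_distance(flex, 5.0):.2f}%")
```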
---
## Flex Limits and Safety Caps
### Hard Limits (Enforced in Code)
#### 1. Absolute Maximum: 50%
**Enforcement:** `core.py` caps `abs(flex)` at 0.50 (50%)
**Rationale:**
- Above 50%, period detection becomes unreliable
- Best Price: Almost entire day qualifies (Min + 50% typically covers 60-80% of intervals)
- Peak Price: Similar issue with Max - 50%
- **Result:** Either massive periods (entire day) or no periods (min_length not met)
**Warning Message:**
```
Flex XX% exceeds maximum safe value! Capping at 50%.
Recommendation: Use 15-20% with relaxation enabled, or 25-35% without relaxation.
```
#### 2. Outlier Filtering Maximum: 25%
**Enforcement:** `core.py` caps outlier filtering flex at 0.25 (25%)
**Rationale:**
- Outlier filtering uses Flex to determine "stable context" threshold
- At > 25% Flex, almost any price swing is considered "stable"
- **Result:** Legitimate price shifts aren't smoothed, breaking period formation
**Note:** User's Flex still applies to period criteria (`in_flex` check), only outlier filtering is capped.
### Recommended Ranges (User Guidance)
#### With Relaxation Enabled (Recommended)
**Optimal:** 10-20%
- Relaxation increases Flex incrementally: 15% → 18% → 21% → ...
- Low baseline ensures relaxation has room to work
**Warning Threshold:** > 25%
- INFO log: "Base flex is on the high side"
**High Warning:** > 30%
- WARNING log: "Base flex is very high for relaxation mode!"
- Recommendation: Lower to 15-20%
#### Without Relaxation
**Optimal:** 20-35%
- No automatic adjustment, must be sufficient from start
- Higher baseline acceptable since no relaxation fallback
**Maximum Useful:** ~50%
- Above this, period detection degrades (see Hard Limits)
---
## Relaxation Strategy
### Purpose
Ensure **minimum periods per day** are found even when baseline filters are too strict.
**Use Case:** User configures strict filters (low Flex, restrictive Level) but wants guarantee of N periods/day for automation reliability.
### Multi-Phase Approach
**Each day processed independently:**
1. Calculate baseline periods with user's config
2. If insufficient periods found, enter relaxation loop
3. Try progressively relaxed filter combinations
4. Stop when target reached or all attempts exhausted
### Relaxation Increments
**Problem (Before Fix):**
```python
# OLD: Increment scales with base Flex
increment = base_flex × (step_pct / 100)
# Example: base_flex=40%, step_pct=25%
increment = 0.40 × 0.25 = 0.10 (10% per step!)
# After 6 steps: 40% → 50% → 60% → 70% → 80% → 90% → 100% (explosion!)
```
**Solution (Current):**
```python
# NEW: Cap increment at 3% per step
raw_increment = base_flex × (step_pct / 100)
capped_increment = min(raw_increment, 0.03) # 3% maximum
# Example: base_flex=40%, step_pct=25%
increment = min(0.10, 0.03) = 0.03 (3% per step)
# After 8 steps: 40% → 43% → 46% → 49% → 52% → 55% → 58% → 61% → 64% (controlled!)
```
**Rationale:**
- High base Flex (30%+) already very permissive
- Large increments push toward 100% too quickly
- 100% Flex = accept ALL prices (meaningless periods)
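The capped schedule can be sketched as a small helper (names are illustrative; `step_pct` corresponds to the former relaxation-step setting):

```python
MAX_STEP_INCREMENT = 0.03  # hard cap: at most 3 percentage points flex per step

def flex_schedule(base_flex: float, step_pct: float, attempts: int) -> list[float]:
    """Flex levels tried during relaxation, with the per-step increment capped."""
    increment = min(base_flex * (step_pct / 100), MAX_STEP_INCREMENT)
    return [round(base_flex + i * increment, 4) for i in range(attempts + 1)]

print(flex_schedule(0.40, 25, 7))
# [0.4, 0.43, 0.46, 0.49, 0.52, 0.55, 0.58, 0.61]
```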
**Warning Threshold:**
- If base Flex > 30% with relaxation enabled: Warn user to lower base Flex
### Filter Combination Strategy
**Per Flex level, try in order:**
1. Original Level filter
2. Level filter = "any" (disabled)
**Early Exit:** Stop immediately when target reached (don't try unnecessary combinations)
**Example Flow (target=2 periods/day):**
```
Day 2025-11-19:
1. Baseline flex=15%: Found 1 period (need 2)
2. Flex=18% + level=cheap: Found 1 period
3. Flex=18% + level=any: Found 2 periods → SUCCESS (stop)
```
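The flow above can be sketched as a simplified loop. This is not the actual `relax_single_day()` implementation; `find_periods` is a hypothetical stand-in for the real filter pipeline:

```python
def relax_day(find_periods, base_flex, increment, attempts, target, level_filter):
    """Per-day relaxation with early exit (simplified sketch)."""
    # 1. Baseline pass with the user's configuration
    periods = find_periods(base_flex, level_filter)
    if len(periods) >= target:
        return periods
    best = periods
    # 2. Progressively relaxed flex levels
    for attempt in range(1, attempts + 1):
        flex = base_flex + attempt * increment
        # 3. Per flex level: original level filter first, then disabled
        for level in (level_filter, "any"):
            periods = find_periods(flex, level)
            if len(periods) >= target:
                return periods  # 4. Early exit on success
            if len(periods) > len(best):
                best = periods
    return best  # best effort after exhausting all attempts

# Stand-in: one period until flex reaches 18% with the level filter
# disabled, then two (mirrors the example flow above).
def fake_find(flex, level):
    return ["p1", "p2"] if level == "any" and flex >= 0.18 else ["p1"]

print(relax_day(fake_find, base_flex=0.15, increment=0.03, attempts=11,
                target=2, level_filter="cheap"))  # ['p1', 'p2']
```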
---
## Implementation Notes
### Key Files and Functions
**Period Calculation Entry Point:**
```python
# coordinator/period_handlers/core.py
def calculate_periods(
all_prices: list[dict],
config: PeriodConfig,
time: TimeService,
) -> dict[str, Any]
```
**Flex + Distance Filtering:**
```python
# coordinator/period_handlers/level_filtering.py
def check_interval_criteria(
price: float,
criteria: IntervalCriteria,
) -> tuple[bool, bool] # (in_flex, meets_min_distance)
```
**Relaxation Orchestration:**
```python
# coordinator/period_handlers/relaxation.py
def calculate_periods_with_relaxation(...) -> tuple[dict, dict]
def relax_single_day(...) -> tuple[dict, dict]
```
### Debugging Tips
**Enable DEBUG logging:**
```yaml
# configuration.yaml
logger:
default: info
logs:
custom_components.tibber_prices.coordinator.period_handlers: debug
```
**Key log messages to watch:**
1. `"Filter statistics: X intervals checked"` - Shows how many intervals filtered by each criterion
2. `"After build_periods: X raw periods found"` - Periods before min_length filtering
3. `"Day X: Success with flex=Y%"` - Relaxation succeeded
4. `"High flex X% detected: Reducing min_distance Y% → Z%"` - Distance scaling active
---
## Common Configuration Pitfalls
### ❌ Anti-Pattern 1: High Flex with Relaxation
**Configuration:**
```yaml
best_price_flex: 40
enable_relaxation_best: true
```
**Problem:**
- Base Flex 40% already very permissive
- Relaxation increments further (43%, 46%, 49%, ...)
- Quickly approaches 50% cap with diminishing returns
**Solution:**
```yaml
best_price_flex: 15 # Let relaxation increase it
enable_relaxation_best: true
```
### ❌ Anti-Pattern 2: Zero Min_Distance
**Configuration:**
```yaml
best_price_min_distance_from_avg: 0
```
**Problem:**
- "Flat days" (little price variation) accept all intervals
- Periods lose semantic meaning ("significantly cheap")
- May create periods during barely-below-average times
**Solution:**
```yaml
best_price_min_distance_from_avg: 5 # Keep at least 5%
```
### ❌ Anti-Pattern 3: Conflicting Flex + Distance
**Configuration:**
```yaml
best_price_flex: 45
best_price_min_distance_from_avg: 10
```
**Problem:**
- Distance filter dominates, making Flex irrelevant
- Dynamic scaling helps but still suboptimal
**Solution:**
```yaml
best_price_flex: 20
best_price_min_distance_from_avg: 5
```
---
## Testing Scenarios
### Scenario 1: Normal Day (Good Variation)
**Price Range:** 10 - 20 ct/kWh (100% variation)
**Average:** 15 ct/kWh
**Expected Behavior:**
- Flex 15%: Should find 2-4 clear best price periods
- Flex 30%: Should find 4-8 periods (more lenient)
- Min_Distance 5%: Effective throughout range
### Scenario 2: Flat Day (Poor Variation)
**Price Range:** 14 - 16 ct/kWh (14% variation)
**Average:** 15 ct/kWh
**Expected Behavior:**
- Flex 15%: May find 1-2 small periods (or zero if no clear winners)
- Min_Distance 5%: Critical here - ensures only truly cheaper intervals qualify
- Without Min_Distance: Would accept almost entire day as "best price"
### Scenario 3: Extreme Day (High Volatility)
**Price Range:** 5 - 40 ct/kWh (700% variation)
**Average:** 18 ct/kWh
**Expected Behavior:**
- Flex 15%: Finds multiple very cheap periods (5-6 ct)
- Outlier filtering: May smooth isolated spikes (30-40 ct)
- Distance filter: Less impactful (clear separation between cheap/expensive)
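The variation percentages in these scenarios are simply the day's price spread relative to the daily minimum. A quick helper to reproduce them (the "normal"/"flat"/"extreme" labels are illustrative, not thresholds used by the integration):

```python
def daily_variation(prices: list[float]) -> float:
    """Spread of the day's prices as a percentage of the daily minimum."""
    return (max(prices) - min(prices)) / min(prices) * 100


print(daily_variation([10, 15, 20]))  # 100.0 -> "normal" day
print(daily_variation([14, 15, 16]))  # ~14.3 -> "flat" day
print(daily_variation([5, 18, 40]))   # 700.0 -> "extreme" day
```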
---
## Future Enhancements
### Potential Improvements
1. **Adaptive Flex Calculation:**
- Auto-adjust Flex based on daily price variation
- High variation days: Lower Flex needed
- Low variation days: Higher Flex needed
2. **Machine Learning Approach:**
- Learn optimal Flex/Distance from user feedback
- Classify days by pattern (normal/flat/volatile/bimodal)
- Apply pattern-specific defaults
3. **Multi-Objective Optimization:**
- Balance period count vs. quality
- Consider period duration vs. price level
- Optimize for user's stated use case (EV charging vs. heat pump)
### Known Limitations
1. **Fixed increment step:** The hard-coded 3% step may be too aggressive for very low base Flex values
2. **Linear distance scaling:** Could benefit from non-linear curve
3. **No consideration of temporal distribution:** May find all periods in one part of day
---
## References
- [User Documentation: Period Calculation](../user/period-calculation.md)
- [Architecture Overview](./architecture.md)
- [Caching Strategy](./caching-strategy.md)
- [AGENTS.md](../../AGENTS.md) - AI assistant memory (implementation patterns)
## Changelog
- **2025-11-19**: Initial documentation of Flex/Distance interaction and Relaxation strategy fixes

---

- **Increase (20-25%)** → Find more/longer periods
- **Decrease (5-10%)** → Find only the very best/worst times

**⚠️ Important:** Flexibility works together with "Distance from Average" (see below). Very high flexibility (>30%) can conflict with the distance filter and become counterproductive. **Recommendation:** Start with 15-20% and enable relaxation instead of manually increasing flexibility.

#### Minimum Period Length

**What:** How long a period must be to show it
- **Increase (5-10%)** → Only show clearly better times
- **Decrease (0-1%)** → Show any time below/above average

**Note:** This filter works **independently** from flexibility. Both conditions must be met:

- Price must be within flex range (close to MIN/MAX)
- **AND** price must be sufficiently below/above average

**Example conflict:** If daily MIN is 10 ct, daily AVG is 20 ct, flex is 50%, and min_distance is 5%:

- Flex allows prices up to 15 ct
- Distance requires prices ≤ 19 ct (20 - 5%)
- **Both must pass** → effective limit is 15 ct (the stricter one)

This is why very high flexibility (>30%) can be counterproductive: the distance filter may become the dominant constraint.
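The stricter-of-the-two rule can be sketched as a toy calculation (not the integration's actual code):

```python
def effective_best_price_limit(
    daily_min: float, daily_avg: float, flex: float, min_distance: float
) -> float:
    """Both filters must pass, so the effective cutoff is the stricter one."""
    flex_limit = daily_min * (1 + flex)              # e.g. 10 ct * 1.5  = 15 ct
    distance_limit = daily_avg * (1 - min_distance)  # e.g. 20 ct * 0.95 = 19 ct
    return min(flex_limit, distance_limit)


print(effective_best_price_limit(10, 20, 0.50, 0.05))  # 15.0 - flex is still stricter
print(effective_best_price_limit(10, 20, 1.00, 0.05))  # 19.0 - distance now dominates
```

Raising flexibility past the point where the distance limit takes over buys nothing, which is why relaxation is preferred over extreme manual flex values.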
### Optional Filters

#### Level Filter (Absolute Quality)
**Use case:** "Don't split periods just because one interval isn't perfectly CHEAP"
### Tweaking Strategy: What to Adjust First?
When you're not happy with the default behavior, adjust settings in this order:
#### 1. **Start with Relaxation (Easiest)**
If you're not finding enough periods:
```yaml
enable_min_periods_best: true # Already default!
min_periods_best: 2 # Already default!
relaxation_attempts_best: 11 # Already default!
```
**Why start here?** Relaxation automatically finds the right balance for each day. Much easier than manual tuning.
#### 2. **Adjust Period Length (Simple)**
If periods are too short/long for your use case:
```yaml
best_price_min_period_length: 90 # Increase from 60 for longer periods
# OR
best_price_min_period_length: 45 # Decrease from 60 for shorter periods
```
**Safe to change:** This only affects duration, not price selection logic.
#### 3. **Fine-tune Flexibility (Moderate)**
If you consistently want more/fewer periods:
```yaml
best_price_flex: 20 # Increase from 15% for more periods
# OR
best_price_flex: 10 # Decrease from 15% for stricter selection
```
**⚠️ Watch out:** Values >25% may conflict with distance filter. Use relaxation instead.
#### 4. **Adjust Distance from Average (Advanced)**
Only if periods seem "mediocre" (not really cheap/expensive):
```yaml
best_price_min_distance_from_avg: 5 # Increase from 2% for stricter quality
```
**⚠️ Careful:** High values (>10%) can make it impossible to find periods on flat price days.
#### 5. **Enable Level Filter (Expert)**
Only if you want absolute quality requirements:
```yaml
best_price_max_level: cheap # Only show objectively CHEAP periods
```
**⚠️ Very strict:** Many days may have zero qualifying periods. **Always enable relaxation when using this!**
### Common Mistakes to Avoid

- ❌ **Don't increase flexibility to >30% manually** → Use relaxation instead
- ❌ **Don't combine high distance (>10%) with strict level filter** → Too restrictive
- ❌ **Don't disable relaxation with strict filters** → You'll get zero periods on some days
- ❌ **Don't change all settings at once** → Adjust one at a time and observe results
- ✅ **Do use defaults + relaxation** → Works for 90% of cases
- ✅ **Do adjust one setting at a time** → Easier to understand impact
- ✅ **Do check sensor attributes** → Shows why periods were/weren't found
---
## Understanding Relaxation ## Understanding Relaxation
```yaml
enable_min_periods_best: true
min_periods_best: 2 # Try to find at least 2 periods per day
relaxation_attempts_best: 11 # Flex levels to test (default: 11 steps = 22 filter combinations)
```

**Good news:** Relaxation is **enabled by default** with sensible settings. Most users don't need to change anything here!

Set the matching `relaxation_attempts_peak` value when tuning Peak Price periods. Both sliders accept 1-12 attempts, and the default of 11 flex levels translates to 22 filter-combination tries (11 flex levels × 2 filter combos) for each of Best and Peak calculations. Lower it for quick feedback, or raise it when either sensor struggles to hit the minimum-period target on volatile days.
### Why Relaxation Is Better Than Manual Tweaking
**Problem with manual settings:**
- You set flex to 25% → Works great on Monday (volatile prices)
- Same 25% flex on Tuesday (flat prices) → Finds "best price" periods that aren't really cheap
- You're stuck with one setting for all days
**Solution with relaxation:**
- Monday (volatile): Uses flex 15% (original) → Finds 2 perfect periods ✓
- Tuesday (flat): Escalates to flex 21% → Finds 2 decent periods ✓
- Wednesday (mixed): Uses flex 18% → Finds 2 good periods ✓
**Each day gets exactly the flexibility it needs!**
### How It Works (Adaptive Matrix)

Relaxation uses a **matrix approach** - trying _N_ flexibility levels (your configured **relaxation attempts**) with 2 filter combinations per level. With the default of 11 attempts, that means 11 flex levels × 2 filter combinations = **22 total filter-combination tries per day**; fewer attempts mean fewer flex increases, while more attempts extend the search further before giving up.
**Important:** The flexibility increment is **fixed at 3% per step** (hard-coded for reliability). This means:
- Base flex 15% → 18% → 21% → 24% → ... → 45% (with 11 attempts)
- Base flex 20% → 23% → 26% → 29% → ... → 50% (with 11 attempts)
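The resulting schedule can be sketched as follows; the `flex_schedule` helper is illustrative, but the 3% step and the 0.50 cap mirror the hard-coded increment and the `MAX_FLEX_HARD_LIMIT` constant in `relaxation.py`:

```python
FLEX_STEP = 0.03            # hard-coded 3% increment
MAX_FLEX_HARD_LIMIT = 0.50  # matches the constant in relaxation.py


def flex_schedule(base_flex: float, attempts: int) -> list[float]:
    """Attempt 1 uses the original flex; each further attempt adds 3%,
    capped at the 50% hard limit."""
    return [
        round(min(base_flex + step * FLEX_STEP, MAX_FLEX_HARD_LIMIT), 2)
        for step in range(attempts)
    ]


print(flex_schedule(0.20, 11))  # final entries hit the 0.50 cap
```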
#### Phase Matrix #### Phase Matrix
For each day, the system tries:

**Flexibility Levels (Attempts):**

1. Attempt 1 = Original flex (e.g., 15%)
2. Attempt 2 = +3% step (18%)
3. Attempt 3 = +3% step (21%)
4. Attempt 4 = +3% step (24%)
5. … Attempts 5-11 (default) continue adding +3% each time
6. … Additional attempts keep extending the same pattern up to the 12-attempt maximum (capped at the 50% hard limit)
**2 Filter Combinations (per flexibility level):**

1. Original filters (your configured level filter)
2. Remove level filter (level=any)

**Example progression:**

```
Flex 15% + Original filters → Not enough periods
Flex 15% + Level=any → Not enough periods
Flex 18% + Original filters → Not enough periods
Flex 18% + Level=any → SUCCESS! Found 2 periods ✓
(stops here - no need to try more)
```
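The search loop behind this progression looks roughly like the sketch below. It is a simplified stand-in: the real `relax_single_day()` in `relaxation.py` has a different signature (returning `tuple[dict, dict]`), and `find_periods` here is a hypothetical callback for the actual period search.

```python
FLEX_STEP = 0.03            # hard-coded 3% increment
MAX_FLEX_HARD_LIMIT = 0.50  # never relax beyond 50%


def relax_single_day(base_flex, attempts, min_periods, find_periods):
    """Try flex levels x filter combinations until min_periods are found.

    `find_periods(flex, use_level_filter)` stands in for the real search.
    """
    for step in range(attempts):
        flex = min(base_flex + step * FLEX_STEP, MAX_FLEX_HARD_LIMIT)
        for use_level_filter in (True, False):  # original filters, then level=any
            periods = find_periods(flex, use_level_filter)
            if len(periods) >= min_periods:
                # stop at the first combination that satisfies the target
                return periods, {"flex": flex, "level_filter": use_level_filter}
    return [], {"flex": None, "level_filter": None}  # nothing found
```

The early return is why relaxation stops as soon as a day hits its minimum-period target.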
### Choosing the Number of Attempts

- **Default (11 attempts)** balances speed and completeness for most grids (22 combinations per day for both Best and Peak)
- **Lower (4-8 attempts)** if you only want mild relaxation and keep processing time minimal (reaches ~24-36% flex from a 15% base)
- **Higher (12 attempts)** for extremely volatile days when you must reach near the 50% maximum (24 combinations)
- Remember: each additional attempt adds two more filter combinations because every new flex level still runs both filter overrides (original + level=any)
#### Per-Day Independence
```
Day 1: Finds 2 periods with flex 15% (original) → No relaxation needed
Day 2: Needs flex 21% + level=any → Uses relaxed settings
Day 3: Finds 2 periods with flex 15% (original) → No relaxation needed
```
```yaml
duration_minutes: 180
rating_level: "LOW" # All intervals are LOW price
price_avg: 18.5 # Average price in this period
relaxation_active: true # This day used relaxation
relaxation_level: "price_diff_18.0%+level_any" # Found at flex 18%, level filter removed
period_interval_smoothed_count: 2 # 2 outliers were smoothed (only if >0)
period_interval_level_gap_count: 1 # 1 interval kept via gap tolerance (only if >0)
```
| Parameter | Default | Range | Purpose |
| --- | --- | --- | --- |
| `best_price_min_distance_from_avg` | 2% | 0-20% | Quality threshold |
| `best_price_max_level` | any | any/cheap/vcheap | Absolute quality |
| `best_price_max_level_gap_count` | 0 | 0-10 | Gap tolerance |
| `enable_min_periods_best` | true | true/false | Enable relaxation |
| `min_periods_best` | 2 | 1-10 | Target periods per day |
| `relaxation_attempts_best` | 11 | 1-12 | Flex levels (attempts) per day |

**Peak Price:** Same parameters with `peak_price_*` prefix (defaults: flex=-15%, same otherwise)
---

**Last updated:** November 19, 2025
**Integration version:** 2.0+