docs(period-calculation): update default thresholds for Best Price and Peak Price periods

This commit is contained in:
Julian Pawlowski 2025-11-20 11:52:15 +00:00
parent c2b9908e69
commit 46fcdb8ba3
2 changed files with 658 additions and 160 deletions

**Implementation:** See `check_interval_criteria()` in `level_filtering.py`
**Code Extract:**
```python
# coordinator/period_handlers/level_filtering.py
FLEX_SCALING_THRESHOLD = 0.20  # 20% - start adjusting min_distance
SCALE_FACTOR_WARNING_THRESHOLD = 0.8  # Log when reduction > 20%

def check_interval_criteria(price, criteria):
    # ... flex check ...
    # Dynamic min_distance scaling
    adjusted_min_distance = criteria.min_distance_from_avg
    flex_abs = abs(criteria.flex)
    if flex_abs > FLEX_SCALING_THRESHOLD:
        flex_excess = flex_abs - FLEX_SCALING_THRESHOLD  # How much above 20%
        scale_factor = max(0.25, 1.0 - (flex_excess * 2.5))
        adjusted_min_distance = criteria.min_distance_from_avg * scale_factor
        if scale_factor < SCALE_FACTOR_WARNING_THRESHOLD:
            _LOGGER.debug(
                "High flex %.1f%% detected: Reducing min_distance %.1f%% → %.1f%%",
                flex_abs * 100,
                criteria.min_distance_from_avg,
                adjusted_min_distance,
            )
    # Apply adjusted min_distance in distance check
    meets_min_distance = (
        price <= avg_price * (1 - adjusted_min_distance / 100)  # Best Price
        # OR
        price >= avg_price * (1 + adjusted_min_distance / 100)  # Peak Price
    )
```
**Why Linear Scaling?**
- Simple and predictable
- No abrupt behavior changes
- Easy to reason about for users and developers
- Alternative considered: Exponential scaling (rejected as too aggressive)
**Why 25% Minimum?**
- Below this, min_distance loses semantic meaning
- Even on flat days, some quality filter needed
- Prevents "every interval is a period" scenario
- Maintains user expectation: "best/peak price means notably different"
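For concreteness, the scaling described above can be sketched as a standalone function (a minimal sketch; the real logic lives inside `check_interval_criteria()`). Note that the 25% floor is reached exactly at 50% flex (0.20 + 0.75 / 2.5 = 0.50), matching the hard flex cap:

```python
FLEX_SCALING_THRESHOLD = 0.20  # start adjusting min_distance above 20% flex

def scale_min_distance(flex_abs: float, min_distance: float) -> float:
    """Linearly reduce min_distance once flex exceeds 20%, with a 25% floor."""
    if flex_abs <= FLEX_SCALING_THRESHOLD:
        return min_distance
    flex_excess = flex_abs - FLEX_SCALING_THRESHOLD
    scale_factor = max(0.25, 1.0 - flex_excess * 2.5)
    return min_distance * scale_factor

# 20% flex: unchanged; 30% flex: about 3.75%; 50% flex: hits the 25% floor (1.25%)
examples = [scale_min_distance(f, 5.0) for f in (0.20, 0.30, 0.50)]
```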
---
## Flex Limits and Safety Caps
### Implementation Constants
**Defined in `coordinator/period_handlers/core.py`:**
```python
MAX_SAFE_FLEX = 0.50 # 50% - hard cap: above this, period detection becomes unreliable
MAX_OUTLIER_FLEX = 0.25 # 25% - cap for outlier filtering: above this, spike detection too permissive
```
**Defined in `const.py`:**
```python
DEFAULT_BEST_PRICE_FLEX = 15 # 15% base - optimal for relaxation mode (default enabled)
DEFAULT_PEAK_PRICE_FLEX = -20 # 20% base (negative for peak detection)
DEFAULT_RELAXATION_ATTEMPTS_BEST = 11 # 11 steps: 15% → 48% (3% increment per step)
DEFAULT_RELAXATION_ATTEMPTS_PEAK = 11 # 11 steps: 20% → 50% (3% increment per step)
DEFAULT_BEST_PRICE_MIN_PERIOD_LENGTH = 60 # 60 minutes
DEFAULT_PEAK_PRICE_MIN_PERIOD_LENGTH = 30 # 30 minutes
DEFAULT_BEST_PRICE_MIN_DISTANCE_FROM_AVG = 5 # 5% minimum distance
DEFAULT_PEAK_PRICE_MIN_DISTANCE_FROM_AVG = 5 # 5% minimum distance
```
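Assuming each relaxation attempt adds the fixed 3% increment on top of the base flex and respects the 50% hard cap (matching the "15% → 48%" comments above; the exact loop indexing in the integration may differ), the resulting schedules can be sketched as:

```python
MAX_SAFE_FLEX = 0.50  # hard cap from core.py

def relaxation_schedule(base_flex: float, attempts: int, increment: float = 0.03) -> list[float]:
    """Flex level tried at each relaxation attempt, capped at the hard limit."""
    return [min(base_flex + n * increment, MAX_SAFE_FLEX) for n in range(1, attempts + 1)]

best = relaxation_schedule(0.15, 11)  # 0.18, 0.21, ..., 0.48
peak = relaxation_schedule(0.20, 11)  # 0.23, 0.26, ..., 0.50 (last steps capped)
```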
### Rationale for Asymmetric Defaults
**Why Best Price ≠ Peak Price?**
The different defaults reflect fundamentally different use cases:
#### Best Price: Optimization Focus
**Goal:** Find practical time windows for running appliances
**Constraints:**
- Appliances need time to complete cycles (dishwasher: 2-3h, EV charging: 4-8h)
- Short periods are impractical (not worth automation overhead)
- User wants genuinely cheap times, not just "slightly below average"
**Defaults:**
- **60 min minimum** - Ensures period is long enough for meaningful use
- **15% flex** - Stricter selection, focuses on truly cheap times
- **Reasoning:** Better to find fewer, higher-quality periods than many mediocre ones
**User behavior:**
- Automations trigger actions (turn on devices)
- Wrong automation = wasted energy/money
- Preference: Conservative (miss some savings) over aggressive (false positives)
#### Peak Price: Warning Focus
**Goal:** Alert users to expensive periods for consumption reduction
**Constraints:**
- Brief price spikes still matter (even 15-30 min is worth avoiding)
- Early warning more valuable than perfect accuracy
- User can manually decide whether to react
**Defaults:**
- **30 min minimum** - Catches shorter expensive spikes
- **20% flex** - More permissive, earlier detection
- **Reasoning:** Better to warn early (even if not peak) than miss expensive periods
**User behavior:**
- Notifications/alerts (informational)
- Wrong alert = minor inconvenience, not cost
- Preference: Sensitive (catch more) over specific (catch only extremes)
#### Mathematical Justification
**Peak Price Volatility:**
Price curves tend to have:
- **Sharp spikes** during peak hours (morning/evening)
- **Shorter duration** at maximum (1-2 hours typical)
- **Higher variance** in peak times than cheap times
**Example day:**
```
Cheap period: 02:00-07:00 (5 hours at 10-12 ct) ← Gradual, stable
Expensive period: 17:00-18:30 (1.5 hours at 35-40 ct) ← Sharp, brief
```
**Implication:**
- Stricter flex on peak (15%) might miss real expensive periods (too brief)
- Longer min_length (60 min) might exclude legitimate spikes
- Solution: More flexible thresholds for peak detection
#### Design Alternatives Considered
**Option 1: Symmetric defaults (rejected)**
- Both 60 min, both 15% flex
- Problem: Misses short but expensive spikes
- User feedback: "Why didn't I get warned about the 30-min price spike?"
**Option 2: Same defaults, let users figure it out (rejected)**
- No guidance on best practices
- Users would need to experiment to find good values
- Most users stick with defaults, so defaults matter
**Option 3: Current approach (adopted)**
- **All values user-configurable** via config flow options
- **Different installation defaults** for Best Price vs. Peak Price
- Defaults reflect recommended practices for each use case
- Users who need different behavior can adjust
- Most users benefit from sensible defaults without configuration
---
## Flex Limits and Safety Caps
#### 1. Absolute Maximum: 50% (MAX_SAFE_FLEX)
**Enforcement:** `core.py` caps `abs(flex)` at 0.50 (50%)
### Relaxation Increments
**Current Implementation (November 2025):**
**File:** `coordinator/period_handlers/relaxation.py`
```python
# Hard-coded 3% increment per step (reliability over configurability)
flex_increment = 0.03  # 3% per step
base_flex = abs(config.flex)
# Generate flex levels
for attempt in range(max_relaxation_attempts):
    flex_level = base_flex + (attempt * flex_increment)
    # Try flex_level with both filter combinations
```
**Constants:**
```python
FLEX_WARNING_THRESHOLD_RELAXATION = 0.25  # 25% - INFO: suggest lowering to 15-20%
FLEX_HIGH_THRESHOLD_RELAXATION = 0.30  # 30% - WARNING: very high for relaxation mode
MAX_FLEX_HARD_LIMIT = 0.50  # 50% - absolute maximum (enforced in core.py)
```
**Design Decisions:**
1. **Why 3% fixed increment?**
- Predictable escalation path (15% → 18% → 21% → ...)
- Independent of base flex (works consistently)
- 11 attempts covers full useful range (15% → 48%)
- Balance: Not too slow (2%), not too fast (5%)
2. **Why hard-coded, not configurable?**
- Prevents user misconfiguration
- Simplifies mental model (fewer knobs to turn)
- Reliable behavior across all configurations
- If needed, user adjusts `max_relaxation_attempts` (fewer/more steps)
3. **Why warn at 25% base flex?**
- At 25% base, first relaxation step reaches 28%
- Above 30%, entering diminishing returns territory
- User likely doesn't need relaxation with such high base flex
- Should either: (a) lower base flex, or (b) disable relaxation
**Historical Context (Pre-November 2025):**
The algorithm previously used percentage-based increments that scaled with base flex:
```python
increment = base_flex * (step_pct / 100)  # REMOVED
```
This caused exponential escalation with high base flex values (e.g., 40% → 50% → 60% → 70% in just 6 steps), making behavior unpredictable. The fixed 3% increment solves this by providing consistent, controlled escalation regardless of starting point.
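To make the difference concrete, here is a side-by-side sketch of the removed scheme against the fixed increment (the 25% `step_pct` is taken from the historical example; function names are illustrative):

```python
def old_levels(base_flex: float, steps: int, step_pct: float = 25.0) -> list[float]:
    """Removed scheme: increment scaled with base flex, escalating quickly."""
    increment = base_flex * (step_pct / 100)
    return [base_flex + n * increment for n in range(1, steps + 1)]

def new_levels(base_flex: float, steps: int, increment: float = 0.03) -> list[float]:
    """Current scheme: fixed 3% increment per step."""
    return [base_flex + n * increment for n in range(1, steps + 1)]

# With a 40% base flex and six steps:
old = [round(x, 2) for x in old_levels(0.40, 6)]  # [0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
new = [round(x, 2) for x in new_levels(0.40, 6)]  # [0.43, 0.46, 0.49, 0.52, 0.55, 0.58]
```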
**Warning Messages:**
```python
if base_flex >= FLEX_HIGH_THRESHOLD_RELAXATION:  # 30%
    _LOGGER.warning(
        "Base flex %.1f%% is very high for relaxation mode! "
        "Consider lowering to 15-20%% or disabling relaxation.",
        base_flex * 100,
    )
elif base_flex >= FLEX_WARNING_THRESHOLD_RELAXATION:  # 25%
    _LOGGER.info(
        "Base flex %.1f%% is on the high side. "
        "Consider 15-20%% for optimal relaxation effectiveness.",
        base_flex * 100,
    )
```
### Filter Combination Strategy
```python
def calculate_periods_with_relaxation(...) -> tuple[dict, dict]
def relax_single_day(...) -> tuple[dict, dict]
```
#### Outlier Filtering Implementation
**File:** `coordinator/period_handlers/outlier_filtering.py`
**Purpose:** Detect and smooth isolated price spikes before period identification to prevent artificial fragmentation.
**Algorithm Details:**
1. **Linear Regression Prediction:**
- Uses surrounding intervals to predict expected price
- Window size: 3+ intervals (MIN_CONTEXT_SIZE)
- Calculates trend slope and standard deviation
- Formula: `predicted = mean + slope × (position - center)`
2. **Confidence Intervals:**
- 95% confidence level (2 standard deviations)
- Tolerance = 2.0 × std_dev (CONFIDENCE_LEVEL constant)
- Outlier if: `|actual - predicted| > tolerance`
- Accounts for natural price volatility in context window
3. **Symmetry Check:**
- Rejects asymmetric outliers (threshold: 1.5 std dev)
- Preserves legitimate price shifts (morning/evening peaks)
- Algorithm:
```python
residual = abs(actual - predicted)
symmetry_threshold = 1.5 * std_dev
if residual > tolerance:
    # Check if spike is symmetric in context
    context_residuals = [abs(p - pred) for p, pred in context]
    avg_context_residual = mean(context_residuals)
    if residual > symmetry_threshold * avg_context_residual:
        ...  # Asymmetric spike → smooth it
    else:
        ...  # Symmetric (part of trend) → keep it
```
4. **Enhanced Zigzag Detection:**
- Detects spike clusters via relative volatility
- Threshold: 2.0× local volatility (RELATIVE_VOLATILITY_THRESHOLD)
- Single-pass algorithm (no iteration needed)
- Catches patterns like: 18, 35, 19, 34, 18 (alternating spikes)
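A minimal sketch of such a zigzag check (the integration's exact definition of "local volatility" is not shown here; this version uses the standard deviation of the window, which is an assumption):

```python
from statistics import pstdev

RELATIVE_VOLATILITY_THRESHOLD = 2.0  # spike moves must dwarf local volatility

def is_zigzag_cluster(prices: list[float]) -> bool:
    """Flag alternating up/down patterns whose average move exceeds 2x local volatility."""
    if len(prices) < 3:
        return False
    diffs = [b - a for a, b in zip(prices, prices[1:])]
    # Direction must flip between every pair of consecutive moves
    alternating = all(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:]))
    local_volatility = pstdev(prices)
    if local_volatility == 0:
        return False
    avg_move = sum(abs(d) for d in diffs) / len(diffs)
    return alternating and avg_move > RELATIVE_VOLATILITY_THRESHOLD * local_volatility
```

With the documented pattern 18, 35, 19, 34, 18 this flags a cluster, while a single spike such as 18, 19, 35, 20, 19 does not alternate and is left to the regression-based check.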
**Constants:**
```python
# coordinator/period_handlers/outlier_filtering.py
CONFIDENCE_LEVEL = 2.0 # 95% confidence (2 std deviations)
SYMMETRY_THRESHOLD = 1.5 # Asymmetry detection threshold
RELATIVE_VOLATILITY_THRESHOLD = 2.0 # Zigzag spike detection
MIN_CONTEXT_SIZE = 3 # Minimum intervals for regression
```
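The prediction step described above can be sketched end to end (a simplified version without the smoothing, symmetry, and zigzag logic; the helper name is illustrative):

```python
from statistics import mean, pstdev

CONFIDENCE_LEVEL = 2.0  # 95% confidence (2 std deviations)
MIN_CONTEXT_SIZE = 3    # minimum intervals for regression

def is_outlier(prices: list[float], i: int, window: int = 2) -> bool:
    """Compare interval i against a linear-regression prediction from its neighbours."""
    xs = [j for j in range(len(prices)) if j != i and abs(j - i) <= window]
    ys = [prices[j] for j in xs]
    if len(ys) < MIN_CONTEXT_SIZE:
        return False
    x_mean, y_mean = mean(xs), mean(ys)
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / denom
    predicted = y_mean + slope * (i - x_mean)  # trend-based prediction
    tolerance = CONFIDENCE_LEVEL * pstdev(ys)  # 2 std deviations of the context
    return abs(prices[i] - predicted) > tolerance

prices = [18.0, 19.0, 35.0, 20.0, 19.0]  # 35 ct is an isolated spike
```

Here the spike at index 2 is flagged as an outlier while its neighbours are not.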
**Data Integrity:**
- Original prices stored in `_original_price` field
- All statistics (daily min/max/avg) use original prices
- Smoothing only affects period formation logic
- Smart counting: Only counts smoothing that changed period outcome
**Performance:**
- Single pass through price data
- O(n) complexity with small context window
- No iterative refinement needed
- Typical processing time: <1ms for 96 intervals
**Example Debug Output:**
```
DEBUG: [2025-11-11T14:30:00+01:00] Outlier detected: 35.2 ct
DEBUG: Context: 18.5, 19.1, 19.3, 19.8, 20.2 ct
DEBUG: Residual: 14.5 ct > tolerance: 4.8 ct (2×2.4 std dev)
DEBUG: Trend slope: 0.3 ct/interval (gradual increase)
DEBUG: Predicted: 20.7 ct (linear regression)
DEBUG: Smoothed to: 20.7 ct
DEBUG: Asymmetry ratio: 3.2 (>1.5 threshold) → confirmed outlier
```
**Why This Approach?**
1. **Linear regression over moving average:**
- Accounts for price trends (morning ramp-up, evening decline)
- Moving average can't predict direction, only level
- Better accuracy on non-stationary price curves
2. **Symmetry check over fixed threshold:**
- Prevents false positives on legitimate price shifts
- Adapts to local volatility patterns
- Preserves user expectation: "expensive during peak hours"
3. **Single-pass over iterative:**
- Predictable behavior (no convergence issues)
- Fast and deterministic
- Easier to debug and reason about
**Alternative Approaches Considered:**
1. **Median filtering** - Rejected: Too aggressive, removes legitimate peaks
2. **Moving average** - Rejected: Can't handle trends
3. **IQR (Interquartile Range)** - Rejected: Assumes normal distribution
4. **RANSAC** - Rejected: Overkill for 1D data, slow
---
## Debugging Tips
**Enable DEBUG logging:**

---

```yaml
best_price_min_distance_from_avg: 0
```
**Solution:**
```yaml
best_price_min_distance_from_avg: 5 # Use default 5%
```
### ❌ Anti-Pattern 3: Conflicting Flex + Distance
- Flex 30%: Should find 4-8 periods (more lenient)
- Min_Distance 5%: Effective throughout range
**Debug Checks:**
```
DEBUG: Filter statistics: 96 intervals checked
DEBUG: Filtered by FLEX: 12/96 (12.5%) ← Low percentage = good variation
DEBUG: Filtered by MIN_DISTANCE: 8/96 (8.3%) ← Both filters active
DEBUG: After build_periods: 3 raw periods found
```
### Scenario 2: Flat Day (Poor Variation)
**Price Range:** 14 - 16 ct/kWh (14% variation)
- Min_Distance 5%: Critical here - ensures only truly cheaper intervals qualify - Min_Distance 5%: Critical here - ensures only truly cheaper intervals qualify
- Without Min_Distance: Would accept almost entire day as "best price" - Without Min_Distance: Would accept almost entire day as "best price"
**Debug Checks:**
```
DEBUG: Filter statistics: 96 intervals checked
DEBUG: Filtered by FLEX: 45/96 (46.9%) ← High percentage = poor variation
DEBUG: Filtered by MIN_DISTANCE: 52/96 (54.2%) ← Distance filter dominant
DEBUG: After build_periods: 1 raw period found
DEBUG: Day 2025-11-11: Baseline insufficient (1 < 2), starting relaxation
```
### Scenario 3: Extreme Day (High Volatility)
**Price Range:** 5 - 40 ct/kWh (700% variation)
- Outlier filtering: May smooth isolated spikes (30-40 ct)
- Distance filter: Less impactful (clear separation between cheap/expensive)
**Debug Checks:**
```
DEBUG: Outlier detected: 38.5 ct (threshold: 4.2 ct)
DEBUG: Smoothed to: 20.1 ct (trend prediction)
DEBUG: Filter statistics: 96 intervals checked
DEBUG: Filtered by FLEX: 8/96 (8.3%) ← Very selective
DEBUG: Filtered by MIN_DISTANCE: 4/96 (4.2%) ← Flex dominates
DEBUG: After build_periods: 4 raw periods found
```
### Scenario 4: Relaxation Success
**Initial State:** Baseline finds 1 period, target is 2
**Expected Flow:**
```
INFO: Calculating BEST PRICE periods: relaxation=ON, target=2/day, flex=15.0%
DEBUG: Day 2025-11-11: Baseline found 1 period (need 2)
DEBUG: Phase 1: flex 18.0% + original filters
DEBUG: Found 1 period (insufficient)
DEBUG: Phase 2: flex 18.0% + level=any
DEBUG: Found 2 periods → SUCCESS
INFO: Day 2025-11-11: Success after 1 relaxation phase (2 periods)
```
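The phase ordering visible in this log (each flex level tried first with the original filters, then with `level=any`) can be sketched as a generator; the filter-mode labels are inferred from the log above, not taken from the code:

```python
def relaxation_phases(base_flex: float, attempts: int, increment: float = 0.03):
    """Yield (flex_level, filter_mode) pairs in the order the log shows them."""
    for attempt in range(1, attempts + 1):
        flex_level = round(base_flex + attempt * increment, 2)
        yield flex_level, "original filters"
        yield flex_level, "level=any"

phases = list(relaxation_phases(0.15, 2))
# [(0.18, 'original filters'), (0.18, 'level=any'), (0.21, 'original filters'), (0.21, 'level=any')]
```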
### Scenario 5: Relaxation Exhausted
**Initial State:** Strict filters, very flat day
**Expected Flow:**
```
INFO: Calculating BEST PRICE periods: relaxation=ON, target=2/day, flex=15.0%
DEBUG: Day 2025-11-11: Baseline found 0 periods (need 2)
DEBUG: Phase 1-11: flex 15%→48%, all filter combinations tried
WARNING: Day 2025-11-11: All relaxation phases exhausted, still only 1 period found
INFO: Period calculation completed: 1/2 days reached target
```
### Debugging Checklist
When debugging period calculation issues:
1. **Check Filter Statistics**
- Which filter blocks most intervals? (flex, distance, or level)
- High flex filtering (>30%) = Need more flexibility or relaxation
- High distance filtering (>50%) = Min_distance too strict or flat day
- High level filtering = Level filter too restrictive
2. **Check Relaxation Behavior**
- Did relaxation activate? Check for "Baseline insufficient" message
- Which phase succeeded? Early success (phase 1-3) = good config
- Late success (phase 8-11) = Consider adjusting base config
- Exhausted all phases = Unrealistic target for this day's price curve
3. **Check Flex Warnings**
- INFO at 25% base flex = On the high side
- WARNING at 30% base flex = Too high for relaxation
- If seeing these: Lower base flex to 15-20%
4. **Check Min_Distance Scaling**
- Debug messages show "High flex X% detected: Reducing min_distance Y% → Z%"
- If scale factor <0.8 (20% reduction): High flex is active
- If periods still not found: Filters conflict even with scaling
5. **Check Outlier Filtering**
- Look for "Outlier detected" messages
- Check `period_interval_smoothed_count` attribute
- If no smoothing but periods fragmented: Not isolated spikes, but legitimate price levels
---
## Future Enhancements
### Potential Improvements
#### 1. Adaptive Flex Calculation (Not Yet Implemented)
**Concept:** Auto-adjust Flex based on daily price variation
**Algorithm:**
```python
# Pseudo-code for adaptive flex
variation = (daily_max - daily_min) / daily_avg
if variation < 0.15:    # Flat day (< 15% variation)
    adaptive_flex = 0.30  # Need higher flex
elif variation > 0.50:  # High volatility (> 50% variation)
    adaptive_flex = 0.10  # Lower flex sufficient
else:                   # Normal day
    adaptive_flex = 0.15  # Standard flex
```
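As a runnable sketch of the pseudo-code above (the function name and signature are illustrative):

```python
def adaptive_flex(daily_min: float, daily_max: float, daily_avg: float) -> float:
    """Pick a flex value based on how much the day's prices vary."""
    variation = (daily_max - daily_min) / daily_avg
    if variation < 0.15:   # flat day
        return 0.30
    if variation > 0.50:   # high volatility
        return 0.10
    return 0.15            # normal day

# Flat day (14-16 ct) needs higher flex; extreme day (5-40 ct) gets by with less
flat = adaptive_flex(14.0, 16.0, 15.0)
volatile = adaptive_flex(5.0, 40.0, 20.0)
```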
**Benefits:**
- Eliminates need for relaxation on most days
- Self-adjusting to market conditions
- Better user experience (less configuration needed)
**Challenges:**
- Harder to predict behavior (less transparent)
- May conflict with user's mental model
- Needs extensive testing across different markets
**Status:** Considered but not implemented (prefer explicit relaxation)
#### 2. Machine Learning Approach (Future Work)
**Concept:** Learn optimal Flex/Distance from user feedback
**Approach:**
- Track which periods user actually uses (automation triggers)
- Classify days by pattern (normal/flat/volatile/bimodal)
- Apply pattern-specific defaults
- Learn per-user preferences over time
**Benefits:**
- Personalized to user's actual behavior
- Adapts to local market patterns
- Could discover non-obvious patterns
**Challenges:**
- Requires user feedback mechanism (not implemented)
- Privacy concerns (storing usage patterns)
- Complexity for users to understand "why this period?"
- Cold start problem (new users have no history)
**Status:** Theoretical only (no implementation planned)
#### 3. Multi-Objective Optimization (Research Idea)
**Concept:** Balance multiple goals simultaneously
**Goals:**
- Period count vs. quality (cheap vs. very cheap)
- Period duration vs. price level (long mediocre vs. short excellent)
- Temporal distribution (spread throughout day vs. clustered)
- User's stated use case (EV charging vs. heat pump vs. dishwasher)
**Algorithm:**
- Pareto optimization (find trade-off frontier)
- User chooses point on frontier via preferences
- Genetic algorithm or simulated annealing
**Benefits:**
- More sophisticated period selection
- Better match to user's actual needs
- Could handle complex appliance requirements
**Challenges:**
- Much more complex to implement
- Harder to explain to users
- Computational cost (may need caching)
- Configuration explosion (too many knobs)
**Status:** Research idea only (not planned)
### Known Limitations
#### 1. Fixed Increment Step
**Current:** 3% cap may be too aggressive for very low base Flex
**Example:**
- Base flex 5% + 3% increment = 8% (60% increase!)
- Base flex 15% + 3% increment = 18% (20% increase)
**Possible Solution:**
- Percentage-based increment: `increment = min(base_flex * 0.20, 0.03)`
- This gives: 5% → 6% (20%), 15% → 18% (20%), 40% → 43% (7.5%)
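Note that the increment has to be the smaller of the two terms to reproduce the listed values; a sketch:

```python
def proposed_increment(base_flex: float) -> float:
    """Smaller of 20% of the base flex and the 3% cap."""
    return min(base_flex * 0.20, 0.03)

# 5% -> 6%, 15% -> 18%, 40% -> 43% (matching the values above)
steps = [round(b + proposed_increment(b), 2) for b in (0.05, 0.15, 0.40)]
```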
**Why Not Implemented:**
- Very low base flex (<10%) unusual
- Users with strict requirements likely disable relaxation
- Simplicity preferred over edge case optimization
#### 2. Linear Distance Scaling
**Current:** Linear scaling may be too aggressive/conservative
**Alternative:** Non-linear curve
```python
# Example: Exponential scaling
scale_factor = 0.25 + 0.75 * exp(-5 * (flex - 0.20))
# Or: Sigmoid scaling
scale_factor = 0.25 + 0.75 / (1 + exp(10 * (flex - 0.35)))
```
**Why Not Implemented:**
- Linear is easier to reason about
- No evidence that non-linear is better
- Would need extensive testing
#### 3. No Temporal Distribution Consideration
**Issue:** May find all periods in one part of day
**Example:**
- All 3 "best price" periods between 02:00-08:00
- No periods in evening (when user might want to run appliances)
**Possible Solution:**
- Add "spread" parameter (prefer distributed periods)
- Weight periods by time-of-day preferences
- Consider user's typical usage patterns
**Why Not Implemented:**
- Adds complexity
- Users can work around with multiple automations
- Different users have different needs (no one-size-fits-all)
#### 4. Period Boundary Handling
**Current Behavior:** Periods can cross midnight naturally
**Issue:** Period starting 23:45 continues into next day
- Uses Day 1's daily_min as reference
- May be confusing when Day 2's prices very different
**Alternative Approaches Considered:**
1. **Split at midnight** - Always keep periods within calendar day
- Problem: Artificially fragments natural periods
- Rejected: Worse user experience
2. **Use next day's reference** - Switch reference at midnight
- Problem: Period criteria inconsistent across its duration
- Rejected: Confusing and unpredictable
3. **Current approach** - Lock to start day's reference
- Benefit: Consistent criteria throughout period
- Drawback: Period may "spill" into different price context
**Status:** Current approach is intentional design choice
---
## References
- [User Documentation: Period Calculation](../user/period-calculation.md)

---
Out of the box, the integration:
1. **Best Price**: Finds cheapest 1-hour+ windows that are at least 5% below the daily average
2. **Peak Price**: Finds most expensive 30-minute+ windows that are at least 5% above the daily average
3. **Relaxation**: Automatically loosens filters if not enough periods are found
**Most users don't need to change anything!** The defaults work well for typical use cases.
<details>
<summary> Why do Best Price and Peak Price have different defaults?</summary>
The integration sets different **initial defaults** because the features serve different purposes:
**Best Price (60 min, 15% flex):**
- Longer duration ensures appliances can complete their cycles
- Stricter flex (15%) focuses on genuinely cheap times
- Use case: Running dishwasher, EV charging, water heating
**Peak Price (30 min, 20% flex):**
- Shorter duration acceptable for early warnings
- More flexible (20%) catches price spikes earlier
- Use case: Alerting to expensive periods, even brief ones
**You can adjust all these values** in the configuration if the defaults don't fit your use case. The asymmetric defaults simply provide good starting points for typical scenarios.
</details>
### Example Timeline

Periods must be meaningfully different from the daily average:

```
Daily AVG: 30 ct/kWh
Minimum distance: 5% (default)
Best Price: Must be ≤ 28.5 ct/kWh (30 - 5%)
Peak Price: Must be ≥ 31.5 ct/kWh (30 + 5%)
```
**Why?** This prevents marking mediocre times as "best" just because they're slightly below average.
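The arithmetic behind these limits, as a quick check:

```python
daily_avg = 30.0
min_distance = 5  # percent (default)

best_price_limit = daily_avg * (1 - min_distance / 100)  # qualify at or below this
peak_price_limit = daily_avg * (1 + min_distance / 100)  # qualify at or above this

# best_price_limit is 28.5 ct/kWh, peak_price_limit is 31.5 ct/kWh
```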
You can optionally require:
- **Absolute quality** (level filter) - "Only show if prices are CHEAP/EXPENSIVE (not just below/above average)"
#### 5. Automatic Price Spike Smoothing
Isolated price spikes are automatically detected and smoothed to prevent unnecessary period fragmentation:
```
Original prices: 18, 19, 35, 20, 19 ct ← 35 ct is an isolated outlier
Smoothed: 18, 19, 19, 20, 19 ct ← Spike replaced with trend prediction
Result: Continuous period 00:00-01:15 instead of split periods
```
**Important:**
- Original prices are always preserved (min/max/avg show real values)
- Smoothing only affects which intervals are combined into periods
- The attribute `period_interval_smoothed_count` shows if smoothing was active
### Visual Example

```yaml
peak_price_flex: -15 # Can be up to 15% less expensive than daily MAX
```
- **Increase (20-25%)** → Find more/longer periods
- **Decrease (5-10%)** → Find only the very best/worst times
**💡 Tip:** Very high flexibility (>30%) is rarely useful. **Recommendation:** Start with 15-20% and enable relaxation; it adapts automatically to each day's price pattern.
#### Minimum Period Length
**What:** How long a period must be to show it
**Default:** 60 minutes (Best Price), 30 minutes (Peak Price)
**Range:** 15-240 minutes
```yaml
best_price_min_period_length: 60
peak_price_min_period_length: 30
```
**When to adjust:**
#### Distance from Average
**What:** How much better than average a period must be
**Default:** 5%
**Range:** 0-20%
```yaml
best_price_min_distance_from_avg: 5
peak_price_min_distance_from_avg: 5
```
**When to adjust:**
- **Increase (5-10%)** → Only show clearly better times
- **Decrease (0-1%)** → Show any time below/above average
**Note:** Both flexibility and distance filters must be satisfied. When using high flexibility values (>30%), the distance filter may become the limiting factor. For best results, use moderate flexibility (15-20%) with relaxation enabled.
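A simplified sketch of the combined Best Price check (assuming flex is measured relative to the daily minimum, as described above; the function name is illustrative):

```python
def passes_best_price(price: float, daily_min: float, daily_avg: float,
                      flex: float = 0.15, min_distance: float = 5.0) -> bool:
    """Both filters must pass: flex (relative to MIN) and distance (relative to AVG)."""
    within_flex = price <= daily_min * (1 + flex)
    far_from_avg = price <= daily_avg * (1 - min_distance / 100)
    return within_flex and far_from_avg

# Daily MIN 10 ct, AVG 20 ct: flex allows up to 11.5 ct, distance up to 19 ct;
# the stricter of the two (here flex) decides.
```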
### Optional Filters

```yaml
best_price_flex: 10 # Decrease from 15% for stricter selection
```
Only if periods seem "mediocre" (not really cheap/expensive):
```yaml
best_price_min_distance_from_avg: 10 # Increase from 5% for stricter quality
```
**⚠️ Careful:** High values (>10%) can make it impossible to find periods on flat price days.
```yaml
# Use defaults - no configuration needed!
best_price_flex: 15 # (default)
best_price_min_period_length: 60 # (default)
best_price_min_distance_from_avg: 5 # (default)
```
**What you get:**
- 1-3 periods per day with prices ≤ MIN + 15%
- Each period at least 1 hour long
- All periods at least 5% cheaper than daily average
**Automation example:**
**Symptom:** `binary_sensor.tibber_home_best_price_period` never turns "on"

**Common Solutions:**

1. **Check if relaxation is enabled**
```yaml
enable_min_periods_best: true # Should be true (default)
min_periods_best: 2 # Try to find at least 2 periods
```
2. **If still no periods, check filters**
   - Look at sensor attributes: `relaxation_active` and `relaxation_level`
   - If relaxation exhausted all attempts: Filters too strict or flat price day
3. **Try increasing flexibility slightly**
```yaml
best_price_flex: 20 # Increase from default 15%
```
4. **Or reduce period length requirement**
   ```yaml
   best_price_min_period_length: 45  # Reduce from default 60 minutes
   ```
### Periods Split Into Small Pieces

**Symptom:** Many short periods instead of one long period

**Common Solutions:**

1. **If using level filter, add gap tolerance**
```yaml
best_price_max_level: cheap
best_price_max_level_gap_count: 2 # Allow 2 NORMAL intervals
```
2. **Slightly increase flexibility**
   ```yaml
   best_price_flex: 20  # From 15% → captures wider price range
   ```
3. **Check for price spikes**
   - Automatic smoothing should handle this
   - Check attribute: `period_interval_smoothed_count`
   - If 0: Not isolated spikes, but real price levels
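The gap-tolerance idea from solution 1 can be sketched roughly as follows (an illustration under the assumption that `gap_count` bounds the cumulative number of tolerated intervals per period; the real logic lives in `level_filtering.py`):

```python
def merge_with_gap_tolerance(levels: list[str], max_level: str,
                             gap_count: int) -> list[tuple[int, int]]:
    """Group consecutive passing intervals, tolerating up to gap_count
    non-passing intervals inside one period (cumulative count)."""
    order = ["VERY_CHEAP", "CHEAP", "NORMAL", "EXPENSIVE", "VERY_EXPENSIVE"]
    passes = [order.index(lv) <= order.index(max_level) for lv in levels]
    periods, start, last_ok, gaps = [], None, None, 0
    for i, ok in enumerate(passes):
        if ok:
            if start is None:
                start, gaps = i, 0
            last_ok = i
        elif start is not None:
            gaps += 1
            if gaps > gap_count:  # too many gaps: close period at last pass
                periods.append((start, last_ok))
                start = None
    if start is not None:  # close a period running to the end of the day
        periods.append((start, last_ok))
    return periods

levels = ["CHEAP", "CHEAP", "NORMAL", "CHEAP", "CHEAP"]
print(merge_with_gap_tolerance(levels, "CHEAP", 0))  # → [(0, 1), (3, 4)]
print(merge_with_gap_tolerance(levels, "CHEAP", 1))  # → [(0, 4)]
```

With `gap_count: 0` the single NORMAL interval splits the run in two; with `gap_count: 1` it is tolerated and one continuous period survives.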
### Understanding Sensor Attributes

**Key attributes to check:**
```yaml
# Entity: binary_sensor.tibber_home_best_price_period
# When "on" (period active):
start: "2025-11-11T02:00:00+01:00"  # Period start time
end: "2025-11-11T05:00:00+01:00"    # Period end time
duration_minutes: 180               # Duration in minutes
price_avg: 18.5                     # Average price in the period
rating_level: "LOW"                 # All intervals have LOW rating

# Relaxation info (shows if filter loosening was needed):
relaxation_active: true  # This day needed relaxation
relaxation_level: "price_diff_18.0%+level_any"  # Found at 18% flex, level filter removed

# Optional (only shown when relevant):
period_interval_smoothed_count: 2   # Number of price spikes smoothed
period_interval_level_gap_count: 1  # Number of "mediocre" intervals tolerated
```
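For use in automations or scripts, the `relaxation_level` string can be split into its parts. This sketch assumes the `price_diff_<pct>%+level_<name>` pattern inferred from the example above, which is not a documented contract:

```python
import re

def parse_relaxation_level(value: str) -> dict:
    """Split a relaxation_level string like 'price_diff_18.0%+level_any'.
    The pattern is inferred from the example attribute, not guaranteed."""
    m = re.fullmatch(r"price_diff_(?P<flex>[\d.]+)%\+level_(?P<level>\w+)", value)
    if m is None:
        return {}
    return {"flex_percent": float(m["flex"]), "level_filter": m["level"]}

print(parse_relaxation_level("price_diff_18.0%+level_any"))
# → {'flex_percent': 18.0, 'level_filter': 'any'}
```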
---
| Parameter                          | Default | Range            | Description                    |
| ---------------------------------- | ------- | ---------------- | ------------------------------ |
| `best_price_flex`                  | 15%     | 0-100%           | Search range from daily MIN    |
| `best_price_min_period_length`     | 60 min  | 15-240           | Minimum duration               |
| `best_price_min_distance_from_avg` | 5%      | 0-20%            | Quality threshold              |
| `best_price_max_level`             | any     | any/cheap/vcheap | Absolute quality               |
| `best_price_max_level_gap_count`   | 0       | 0-10             | Gap tolerance                  |
| `enable_min_periods_best`          | true    | true/false       | Enable relaxation              |
The Tibber API provides price levels for each 15-minute interval:

- `VERY_CHEAP` - Significantly below average
- `CHEAP` - Below average
- `NORMAL` - Around average
- `EXPENSIVE` - Above average
- `VERY_EXPENSIVE` - Significantly above average
---
**Last updated:** November 20, 2025
**Integration version:** 2.0+