Piecewise Scheduler

About

The d9d.lr_scheduler.piecewise module provides a flexible, builder-based system for constructing piecewise learning rate schedules.

Instead of writing a custom LRScheduler subclass or a hand-rolled LambdaLR function for every variation of a piecewise schedule (e.g. "Warmup + Hold + Decay"), you can construct any such schedule declaratively by chaining phases together.

Usage Examples

Python API

Here is how to create a standard "Linear Warmup + Hold + Cosine Decay" schedule:

import torch
from d9d.lr_scheduler.piecewise import CurveCosine, CurveLinear, piecewise_schedule

optimizer: torch.optim.Optimizer = ...
total_steps: int = 1000

# Define the schedule:
# 1. Start at 0.0
# 2. Linear warmup to 1.0 * LR over 100 steps
# 3. Hold at 1.0 * LR until 50% of total training steps
# 4. Cosine decay to 0.1 * LR (10% of LR) over the rest of training
scheduler = (
    piecewise_schedule(initial_multiplier=0.0, total_steps=total_steps)
    .for_steps(100, target_multiplier=1.0, curve=CurveLinear())
    .until_percentage(0.5, target_multiplier=1.0, curve=CurveLinear())
    .fill_rest(target_multiplier=0.1, curve=CurveCosine())
    .build(optimizer)
)
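For intuition, the multiplier curve this builder produces can be sketched in plain Python. This is an illustrative approximation of the semantics described in the comments above, not the library's actual implementation; exact behavior at phase boundaries may differ:

```python
import math


def multiplier(step: int, total_steps: int = 1000) -> float:
    """Approximate LR multiplier for the schedule built above."""
    warmup_end = 100
    hold_end = int(total_steps * 0.5)  # until_percentage(0.5)
    if step < warmup_end:
        # Phase 1: linear warmup from 0.0 to 1.0
        return 0.0 + (1.0 - 0.0) * (step / warmup_end)
    if step < hold_end:
        # Phase 2: hold at 1.0
        return 1.0
    # Phase 3: cosine decay from 1.0 to 0.1 over the remaining steps
    p = (step - hold_end) / (total_steps - hold_end)
    return 0.1 + (1.0 - 0.1) * (1 + math.cos(math.pi * p)) / 2
```

The optimizer's effective learning rate at each step is then `base_lr * multiplier(step)`.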

Pydantic API

import json
import torch
from d9d.lr_scheduler.piecewise import PiecewiseSchedulerConfig, piecewise_scheduler_from_config

optimizer: torch.optim.Optimizer = ...
total_steps: int = 1000

raw_config_json = """
{
    "initial_multiplier": 0.0,
    "phases": [
        {
            "mode": "steps",
            "steps": 100,
            "target_multiplier": 1.0,
            "curve": { "type": "linear" }
        },
        {
            "mode": "rest",
            "target_multiplier": 0.1,
            "curve": { "type": "cosine" }
        }
    ]
}
"""

scheduler_config = PiecewiseSchedulerConfig.model_validate_json(raw_config_json)

scheduler = piecewise_scheduler_from_config(
    config=scheduler_config,
    optimizer=optimizer,
    total_steps=total_steps
)

Available Curves

The following curve classes are available to interpolate values between phases:

Curve Class         Curve Config     Description
CurveLinear         "linear"         Standard straight-line interpolation.
CurveCosine         "cosine"         Half-period cosine interpolation (cosine annealing).
CurvePoly(power)    "poly"           Polynomial interpolation; power=1 is linear, power=2 is quadratic, etc.
CurveExponential    "exponential"    Exponential (log-linear) interpolation.
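Under the standard interpolation formulas these names suggest (an assumption — check the module source for the exact definitions), each curve maps a phase-progress fraction `p` in [0, 1] to a value between `start` and `end` roughly as follows:

```python
import math


def linear(start: float, end: float, p: float) -> float:
    # Straight-line interpolation
    return start + (end - start) * p


def cosine(start: float, end: float, p: float) -> float:
    # Half-period cosine: flat near the endpoints, steepest in the middle
    return end + (start - end) * (1 + math.cos(math.pi * p)) / 2


def poly(start: float, end: float, p: float, power: float) -> float:
    # Polynomial: power=1.0 reduces to linear
    return start + (end - start) * p ** power


def exponential(start: float, end: float, p: float) -> float:
    # Log-linear: a straight line in log-space (requires start and end > 0)
    return math.exp(math.log(start) + (math.log(end) - math.log(start)) * p)
```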

API Reference

d9d.lr_scheduler.piecewise

Implements flexible piecewise learning rate schedules via a builder pattern.

CurveBase

Bases: ABC

Abstract base class for interpolation curves used in scheduling.

compute(start, end, step_p) abstractmethod

Calculates the interpolated value.

Parameters:

    start (float): The value at the beginning of the phase. Required.
    end (float): The value at the end of the phase. Required.
    step_p (float): Progress fraction through the phase (0.0 to 1.0). Required.

Returns:

    float: The interpolated value.
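Because compute only maps (start, end, step_p) to a value, a custom curve is just a small class with that one method. The sketch below mirrors the documented interface with a local stub rather than importing d9d (a real custom curve would subclass d9d's CurveBase instead); the smoothstep curve itself is a hypothetical example, not part of the library:

```python
from abc import ABC, abstractmethod


class CurveBase(ABC):
    """Local stand-in mirroring the documented interface."""

    @abstractmethod
    def compute(self, start: float, end: float, step_p: float) -> float:
        """Return the interpolated value at progress step_p in [0.0, 1.0]."""


class CurveSmoothstep(CurveBase):
    """Hypothetical custom curve: eases in and out via 3p^2 - 2p^3."""

    def compute(self, start: float, end: float, step_p: float) -> float:
        eased = 3 * step_p**2 - 2 * step_p**3
        return start + (end - start) * eased


curve = CurveSmoothstep()
print(curve.compute(0.0, 1.0, 0.5))  # midpoint: 0.5
```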

CurveCosine

Bases: CurveBase

Interpolates using a cosine annealing schedule (half-period cosine).

CurveExponential

Bases: CurveBase

Interpolates exponentially between start and end values (log-space linear).

CurveLinear

Bases: CurveBase

Linearly interpolates between start and end values.

CurvePoly

Bases: CurveBase

Interpolates using a polynomial function.

__init__(power)

Constructs a polynomial curve.

Parameters:

    power (float): The exponent of the polynomial. 1.0 is linear, 2.0 is quadratic, etc. Required.

PiecewiseSchedulerConfig

Bases: BaseModel

Declarative configuration for a piecewise learning rate scheduler.

Attributes:

    initial_multiplier (float): The starting learning rate multiplier.
    phases (list[PhaseConfig]): A sequential list of phase configurations.

piecewise_schedule(initial_multiplier, total_steps=None)

Entry point for creating a piecewise learning rate schedule.

Parameters:

    initial_multiplier (float): The initial learning rate multiplier. Required.
    total_steps (int | None): Total training steps; required for percentage-based scheduling. Default: None.

Returns:

    PiecewiseScheduleBuilder: A builder instance to configure phases.

piecewise_scheduler_from_config(config, optimizer, total_steps)

Constructs a PyTorch scheduler from the provided configuration.

Parameters:

    config (PiecewiseSchedulerConfig): The scheduler configuration. Required.
    optimizer (Optimizer): The optimizer to wrap. Required.
    total_steps (int | None): The total number of training steps; required if using percentage-based phases. Required.

Returns:

    LRSchedulerProtocol: A configured learning rate scheduler.