About

The d9d.core.types package gathers common Type Aliases used throughout the framework.

The d9d.core.protocol package defines Protocol interfaces for the standard PyTorch components used in the distributed training loop.

d9d.core.types

Common type definitions used throughout the framework.

PyTree = TLeaf | list['PyTree[TLeaf]'] | dict[str, 'PyTree[TLeaf]'] | tuple['PyTree[TLeaf]', ...] module-attribute

A recursive type definition representing a tree of data.

This type alias covers standard Python containers (dictionaries, lists, tuples) nested arbitrarily deep, terminating in a leaf node of type TLeaf.

This is commonly used for handling nested state dictionaries or arguments passed to functions that support recursive traversal (similar to torch.utils._pytree).

TensorTree = PyTree[torch.Tensor] module-attribute

A recursive tree structure where the leaf nodes are PyTorch Tensors.

d9d.core.protocol

Package providing protocol definitions for standard PyTorch objects.

LRSchedulerProtocol

Bases: Protocol, Stateful

Protocol defining an interface for a Learning Rate Scheduler.

This protocol ensures that the wrapped scheduler supports stepping and state checkpointing via the Stateful interface.

Source code in d9d/core/protocol/training.py
@runtime_checkable
class LRSchedulerProtocol(Protocol, Stateful):
    """
    Protocol defining an interface for a Learning Rate Scheduler.

    This protocol ensures that the wrapped scheduler supports stepping
    and state checkpointing via the Stateful interface.
    """

    def step(self):
        """Performs a single learning rate scheduling step."""

        ...

step()

Performs a single learning rate scheduling step.

Source code in d9d/core/protocol/training.py
def step(self):
    """Performs a single learning rate scheduling step."""

    ...
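Because the protocol is decorated with @runtime_checkable, any object with the right methods satisfies it structurally, with no inheritance from d9d required. A minimal sketch: here the Stateful methods (state_dict/load_state_dict, which in d9d come from PyTorch's Stateful interface) are flattened into one stand-in protocol so the example needs no torch import, and ConstantScheduler is a hypothetical implementation:

```python
from typing import Any, Protocol, runtime_checkable


@runtime_checkable
class LRSchedulerProtocolSketch(Protocol):
    # step() plus the Stateful checkpointing methods, flattened into a
    # single stand-in protocol for illustration.
    def step(self) -> None: ...
    def state_dict(self) -> dict[str, Any]: ...
    def load_state_dict(self, state_dict: dict[str, Any]) -> None: ...


class ConstantScheduler:
    """Hypothetical scheduler; matches the protocol structurally."""

    def __init__(self, lr: float):
        self.lr = lr
        self.steps = 0

    def step(self) -> None:
        self.steps += 1  # a real scheduler would adjust self.lr here

    def state_dict(self) -> dict[str, Any]:
        return {"lr": self.lr, "steps": self.steps}

    def load_state_dict(self, state_dict: dict[str, Any]) -> None:
        self.lr = state_dict["lr"]
        self.steps = state_dict["steps"]


sched = ConstantScheduler(lr=1e-3)
sched.step()
assert isinstance(sched, LRSchedulerProtocolSketch)  # runtime structural check
```

Note that runtime isinstance checks against a runtime-checkable Protocol only verify that the methods exist, not their signatures or return types.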

OptimizerProtocol

Bases: Protocol, Stateful

Protocol defining an interface for a standard PyTorch Optimizer object.

This protocol ensures that the wrapped optimizer supports the standard optimizer API and state checkpointing via the Stateful interface.

Source code in d9d/core/protocol/training.py
@runtime_checkable
class OptimizerProtocol(Protocol, Stateful):
    """
    Protocol defining an interface for a standard PyTorch Optimizer object.

    This protocol ensures that the wrapped optimizer supports the standard
    optimizer API and state checkpointing via the Stateful interface.
    """

    def step(self):
        """Performs a single optimization step."""

        ...

    def zero_grad(self):
        """Sets the gradients of all optimized tensors to zero."""

        ...

step()

Performs a single optimization step.

Source code in d9d/core/protocol/training.py
def step(self):
    """Performs a single optimization step."""

    ...

zero_grad()

Sets the gradients of all optimized tensors to zero.

Source code in d9d/core/protocol/training.py
def zero_grad(self):
    """Sets the gradients of all optimized tensors to zero."""

    ...
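As with the scheduler protocol, any object exposing step, zero_grad, and the Stateful checkpointing methods satisfies OptimizerProtocol structurally. The sketch below flattens those methods into a stand-in protocol and pairs it with ToySGD, a hypothetical gradient-descent optimizer over plain Python floats (standing in for torch.optim.Optimizer) so the example stays torch-free:

```python
from typing import Any, Protocol, runtime_checkable


@runtime_checkable
class OptimizerProtocolSketch(Protocol):
    # step()/zero_grad() plus the Stateful methods, flattened into a
    # single stand-in protocol for illustration.
    def step(self) -> None: ...
    def zero_grad(self) -> None: ...
    def state_dict(self) -> dict[str, Any]: ...
    def load_state_dict(self, state_dict: dict[str, Any]) -> None: ...


class ToySGD:
    """Hypothetical optimizer over plain floats; matches the protocol."""

    def __init__(self, params: list[float], lr: float):
        self.params = params
        self.grads = [0.0] * len(params)
        self.lr = lr

    def step(self) -> None:
        # In-place gradient descent update: p <- p - lr * g
        for i, grad in enumerate(self.grads):
            self.params[i] -= self.lr * grad

    def zero_grad(self) -> None:
        self.grads = [0.0] * len(self.grads)

    def state_dict(self) -> dict[str, Any]:
        return {"params": list(self.params), "lr": self.lr}

    def load_state_dict(self, state_dict: dict[str, Any]) -> None:
        self.params[:] = state_dict["params"]
        self.lr = state_dict["lr"]


opt = ToySGD([1.0], lr=0.5)
opt.grads = [1.0]
opt.step()       # params become [0.5]
opt.zero_grad()  # grads reset to [0.0]
assert isinstance(opt, OptimizerProtocolSketch)
```

Code that accepts an OptimizerProtocol rather than a concrete torch.optim class can therefore be tested with lightweight stand-ins like this one, while still working with real optimizers in training.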