About

The d9d.peft package provides a flexible framework for fine-tuning models using parameter-efficient strategies or targeted full fine-tuning.

Core Concepts

Apply Before State Loading

This package is deeply integrated with the model state mapping ecosystem.

When you apply methods like LoRA, the model structure changes (e.g., a Linear layer becomes a LoRALinear wrapper).

Consequently, the keys in your original checkpoint (e.g., layers.0.linear.weight) no longer match the keys in the efficient model (e.g., layers.0.linear.base.weight).
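
As a plain-dict illustration (the base / lora_a / lora_b key scheme below is hypothetical, echoing the example keys above; the real translation is performed by the generated ModelStateMapper objects):

# Keys as saved by the original, unmodified model:
checkpoint = {
    "layers.0.linear.weight": ...,
    "layers.0.linear.bias": ...,
}

# Keys expected by the model after LoRA injection:
injected_state = {
    "layers.0.linear.base.weight": ...,   # same tensor, relocated under `base`
    "layers.0.linear.base.bias": ...,
    "layers.0.linear.lora_a": ...,        # adapter weights, absent from the checkpoint
    "layers.0.linear.lora_b": ...,
}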

d9d.peft automatically generates the necessary ModelStateMapper objects to load standard checkpoints into modified architectures.

This is useful because a user may want to apply a PEFT method to a model that has not been initialized or horizontally distributed yet. Other PEFT frameworks usually require you to initialize model weights before applying PEFT, which can break your horizontal parallelism setup logic or make it less reusable.

Configuration

All PEFT methods are driven by Pydantic configurations. This allows for custom validation of hyperparameters and easy serialization/deserialization.
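
A sketch of what such a configuration might look like (the class and field names are illustrative, not the package's actual schema):

from pydantic import BaseModel, Field

class ExampleLoRAConfig(BaseModel):
    """Hypothetical config - consult the concrete method's docs for its real schema."""

    rank: int = Field(default=8, gt=0)
    alpha: float = Field(default=16.0, gt=0.0)
    target_modules: list[str] = ["q_proj", "v_proj"]

# Validation and (de)serialization come for free:
config = ExampleLoRAConfig.model_validate({"rank": 4})
print(config.model_dump_json())  # {"rank":4,"alpha":16.0,"target_modules":["q_proj","v_proj"]}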

The Injection Lifecycle (PeftMethod)

The framework operates on an Inject -> Train -> Merge lifecycle (sketched end-to-end after the list):

  1. Inject (inject_peft_and_freeze): The PeftMethod inspects the generic nn.Module. It locates target layers, replaces them with adapter layers (where necessary), and marks the parameters that should be trained with requires_grad=True.
  2. State Mapping: The injection process returns a ModelStateMapper object. This mapper describes how to map the original checkpoint keys to the new, injected model structure.
  3. Train: You train the model as usual; only the parameters marked trainable receive gradient updates.
  4. Merge (merge_peft): Once training is complete, this method collapses the adapters back into the base weights, restoring the original architecture.
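
A minimal end-to-end sketch of this lifecycle. The import path is an assumption (this page documents these names in d9d/peft/base.py and d9d/peft/applicator.py), and the bias-only method is a deliberately trivial stand-in, since no concrete methods are listed here:

from torch import nn

# Assumed public import path for the names documented on this page.
from d9d.peft import (
    PeftInjectionResult,
    PeftMethod,
    inject_peft_and_freeze,
    merge_peft,
)

class BiasOnlyMethod(PeftMethod):
    """Toy strategy: train only bias vectors. It makes no structural
    changes, so no state mappers are needed and merge is a no-op."""

    def inject(self, module: nn.Module) -> PeftInjectionResult:
        biases = [p for name, p in module.named_parameters() if name.endswith("bias")]
        return PeftInjectionResult(parameters_to_train=biases, load_state_mappers=[])

    def merge(self, module: nn.Module):
        pass  # biases were trained in place; nothing to collapse

    @classmethod
    def from_config(cls, config) -> "BiasOnlyMethod":
        return cls()

method = BiasOnlyMethod()
model = nn.Sequential(nn.Linear(16, 16), nn.Linear(16, 4))

mapper = inject_peft_and_freeze(method, model)  # 1. inject + 2. state mapping
# ... load the original checkpoint through `mapper`, then train (3.) ...
merge_peft(method, model)                       # 4. merge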

d9d.peft

Provides core logic for PEFT (Parameter-Efficient Fine-Tuning) application and base definitions.

PeftInjectionResult dataclass

Encapsulates the result of injecting a PEFT method into a model.

Attributes:

- parameters_to_train (list[Parameter]): A list of parameters that should remain trainable.
- load_state_mappers (list[ModelStateMapper]): A list of mappers required to load pre-trained weights into the modified structure.

Source code in d9d/peft/base.py
@dataclasses.dataclass(slots=True)
class PeftInjectionResult:
    """
    Encapsulates the result of injecting a PEFT method into a model.

    Attributes:
        parameters_to_train: A list of parameters that should remain trainable.
        load_state_mappers: A list of mappers required to load pre-trained weights into the modified structure.
    """

    parameters_to_train: list[nn.Parameter]
    load_state_mappers: list[ModelStateMapper]

PeftMethod

Bases: ABC, Generic[TConfig]

Abstract base class for all Parameter-Efficient Fine-Tuning methods.

Source code in d9d/peft/base.py
class PeftMethod(abc.ABC, Generic[TConfig]):
    """
    Abstract base class for all Parameter-Efficient Fine-Tuning methods.
    """

    @abc.abstractmethod
    def inject(self, module: nn.Module) -> PeftInjectionResult:
        """
        Modifies the module in-place to apply the PEFT strategy.

        Args:
            module: The PyTorch module to modify.

        Returns:
            Result object containing trainable parameters and structure mappers.
        """
        ...

    @abc.abstractmethod
    def merge(self, module: nn.Module):
        """
        Merges the trained adapters back into the base model parameters.

        Args:
            module: The PyTorch module to update.
        """

        ...

    @classmethod
    @abc.abstractmethod
    def from_config(cls, config: TConfig) -> Self:
        """
        Creates an instance of the method from a configuration object.

        Args:
            config: The configuration object.

        Returns:
            An instance of the PeftMethod.
        """

        ...

from_config(config) abstractmethod classmethod

Creates an instance of the method from a configuration object.

Parameters:

- config (TConfig, required): The configuration object.

Returns:

- Self: An instance of the PeftMethod.

Source code in d9d/peft/base.py
@classmethod
@abc.abstractmethod
def from_config(cls, config: TConfig) -> Self:
    """
    Creates an instance of the method from a configuration object.

    Args:
        config: The configuration object.

    Returns:
        An instance of the PeftMethod.
    """

    ...

inject(module) abstractmethod

Modifies the module in-place to apply the PEFT strategy.

Parameters:

- module (Module, required): The PyTorch module to modify.

Returns:

- PeftInjectionResult: Result object containing trainable parameters and structure mappers.

Source code in d9d/peft/base.py
@abc.abstractmethod
def inject(self, module: nn.Module) -> PeftInjectionResult:
    """
    Modifies the module in-place to apply the PEFT strategy.

    Args:
        module: The PyTorch module to modify.

    Returns:
        Result object containing trainable parameters and structure mappers.
    """
    ...

merge(module) abstractmethod

Merges the trained adapters back into the base model parameters.

Parameters:

- module (Module, required): The PyTorch module to update.

Source code in d9d/peft/base.py
@abc.abstractmethod
def merge(self, module: nn.Module):
    """
    Merges the trained adapters back into the base model parameters.

    Args:
        module: The PyTorch module to update.
    """

    ...

inject_peft_and_freeze(method, module)

Applies a PEFT method to a module, freezes non-trained parameters, and prepares state mapping.

This function performs three main steps:

  1. Sets requires_grad=False for all parameters in the module.
  2. Calls the method's inject to modify the model structure.
  3. Sets requires_grad=True for the parameters returned by the injection result.

Parameters:

- method (PeftMethod, required): The PEFT method strategy to apply.
- module (Module, required): The PyTorch module to modify.

Returns:

- ModelStateMapper: A ModelStateMapper capable of loading checkpoint weights into the modified structure.

Source code in d9d/peft/applicator.py
def inject_peft_and_freeze(method: PeftMethod, module: nn.Module) -> ModelStateMapper:
    """
    Applies a PEFT method to a module, freezes non-trained parameters, and prepares state mapping.

    This function performs three main steps:

    1. Sets `requires_grad=False` for all parameters in the module.
    2. Calls the method's `inject` to modify the model structure.
    3. Sets `requires_grad=True` for the parameters returned by the injection result.

    Args:
        method: The PEFT method strategy to apply.
        module: The PyTorch module to modify.

    Returns:
        A ModelStateMapper capable of loading checkpoint weights into the modified structure.
    """

    for param in module.parameters():
        param.requires_grad = False

    result = method.inject(module)

    for param in result.parameters_to_train:
        param.requires_grad = True

    return ModelStateMapperParallel(result.load_state_mappers)
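
The freeze-then-unfreeze contract is easy to re-enact in miniature; here a bias parameter stands in for the set a real method would return:

import torch.nn as nn

model = nn.Linear(8, 8)

# Step 1: freeze everything.
for param in model.parameters():
    param.requires_grad = False

# Step 2 would call method.inject(model) here, collecting parameters_to_train.

# Step 3: re-enable the chosen subset (the bias stands in for that result).
model.bias.requires_grad = True

print([name for name, p in model.named_parameters() if p.requires_grad])  # ['bias']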

merge_peft(method, module)

Merges PEFT adaptations back into the base model weights.

Parameters:

- method (PeftMethod, required): The PEFT method strategy originally applied.
- module (Module, required): The PyTorch module to merge.

Source code in d9d/peft/applicator.py
def merge_peft(method: PeftMethod, module: nn.Module):
    """
    Merges PEFT adaptations back into the base model weights.

    Args:
        method: The PEFT method strategy originally applied.
        module: The PyTorch module to merge.
    """

    method.merge(module)