About

The d9d.peft.full_tune package integrates standard full fine-tuning into the PEFT workflow. It does not alter the model architecture; instead, it uses regex patterns to identify specific modules (e.g., norm layers or specific heads) and unfreezes their parameters.

This is particularly useful when combined with other PEFT methods via Stacking, allowing for hybrid training strategies (e.g., LoRA on Attention + Full Tune on LayerNorm).
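The selection mechanism can be sketched with plain `re` (the module names below are illustrative, not taken from any particular model):

```python
import re

# Hypothetical dotted module names, as torch's named_modules() would yield them.
module_names = [
    "layers.0.self_attn.q_proj",
    "layers.0.input_layernorm",
    "layers.1.input_layernorm",
    "lm_head",
]

# A pattern of the kind FullTuneConfig expects. Matching is done with
# fullmatch, so each alternative must cover the entire module name.
pattern = re.compile(r".*layernorm|lm_head")

selected = [name for name in module_names if pattern.fullmatch(name)]
print(selected)
# → ['layers.0.input_layernorm', 'layers.1.input_layernorm', 'lm_head']
```

All parameters of the selected modules would then be marked as trainable, while everything else (here, the attention projection) stays frozen.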

d9d.peft.full_tune

Package for Full Fine-Tuning functionality within the PEFT framework.

FullTune

Bases: PeftMethod[FullTuneConfig]

Implements Full Fine-Tuning as a 'PEFT' method.

Instead of injecting adapters, this method simply identifies existing parameters that match the configuration pattern and marks them for training.

Source code in d9d/peft/full_tune/method.py
class FullTune(PeftMethod[FullTuneConfig]):
    """
    Implements Full Fine-Tuning as a 'PEFT' method.

    Instead of injecting adapters, this method simply identifies existing parameters
    that match the configuration pattern and marks them for training.
    """

    def __init__(self, config: FullTuneConfig):
        """
        Constructs a FullTune object.

        Args:
            config: Configuration defining the module name patterns to fine-tune.
        """

        self._config = config

    def inject(self, module: nn.Module) -> PeftInjectionResult:
        params_to_train = []

        for mod_name, mod in module.named_modules():
            is_applicable = self._config.module_name_pattern.fullmatch(mod_name)

            if is_applicable:
                params_to_train.extend(mod.parameters())

        return PeftInjectionResult(
            parameters_to_train=params_to_train,
            load_state_mappers=[]
        )

    def merge(self, module: nn.Module):
        pass  # do nothing here

    @classmethod
    def from_config(cls, config: FullTuneConfig) -> Self:
        return cls(config)

__init__(config)

Constructs a FullTune object.

Parameters:

    config (FullTuneConfig, required): Configuration defining the module name patterns to fine-tune.

Source code in d9d/peft/full_tune/method.py
def __init__(self, config: FullTuneConfig):
    """
    Constructs a FullTune object.

    Args:
        config: Configuration defining the module name patterns to fine-tune.
    """

    self._config = config

FullTuneConfig

Bases: BaseModel

Configuration for Full Fine-Tuning.

Allows specifying which modules should be fully fine-tuned using regex patterns.

Attributes:

    kind (Literal['full_tune']): Discriminator field, always "full_tune".
    module_name_pattern (Pattern): Regular expression matching module names to unfreeze.


Source code in d9d/peft/full_tune/config.py
class FullTuneConfig(BaseModel):
    """
    Configuration for Full Fine-Tuning.

    Allows specifying which modules should be fully fine-tuned using regex patterns.

    Attributes:
        kind: Discriminator field, always "full_tune".
        module_name_pattern: Regular expression matching module names to unfreeze.
    """

    kind: Literal["full_tune"] = "full_tune"

    module_name_pattern: Pattern
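
Note that the pattern is applied with `fullmatch`, so a bare substring will not select a nested module; the regex must span the entire dotted module name. A quick check (the name below is illustrative):

```python
import re

name = "model.layers.0.input_layernorm"

# Under fullmatch, a bare substring pattern does NOT match a nested name...
assert re.compile(r"input_layernorm").fullmatch(name) is None

# ...the pattern must cover the whole dotted path.
assert re.compile(r".*input_layernorm").fullmatch(name) is not None
```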
