# Method Stacking

## About
Complex fine-tuning setups often combine several PEFT techniques. The `d9d.peft.all` package supports this by grouping multiple PEFT configurations into a single `PeftStack`.
## Usage Example

Applying LoRA to attention layers while fully fine-tuning normalization layers.
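The sketch below illustrates this pattern without depending on `d9d` itself. `PeftStack` is the name documented on this page, but the method classes, the `apply` signature, and the dict-based "model" are stand-ins for illustration, not the library's real API:

```python
from dataclasses import dataclass

# Minimal stand-in: a "model" is a dict mapping parameter names to a
# training mode ("frozen", "lora", or "full").

@dataclass
class LoRAMethod:
    """Stand-in that marks matching parameters as LoRA-adapted."""
    target_keyword: str = "attn"

    def apply(self, params: dict) -> None:
        for name in params:
            if self.target_keyword in name:
                params[name] = "lora"

@dataclass
class FullFinetuneMethod:
    """Stand-in that marks matching parameters as fully trainable."""
    target_keyword: str = "norm"

    def apply(self, params: dict) -> None:
        for name in params:
            if self.target_keyword in name:
                params[name] = "full"

class PeftStack:
    """Applies each contained method in order, as described above."""

    def __init__(self, methods: list) -> None:
        self.methods = methods

    def apply(self, params: dict) -> None:
        for method in self.methods:
            method.apply(params)

# Everything starts frozen; the stack selectively re-enables training.
params = {
    "attn.q_proj": "frozen",
    "attn.k_proj": "frozen",
    "norm.weight": "frozen",
    "mlp.up_proj": "frozen",
}
stack = PeftStack(methods=[LoRAMethod(), FullFinetuneMethod()])
stack.apply(params)
# attn.* -> "lora", norm.weight -> "full", mlp.up_proj stays "frozen"
```

The key point carried over from the real library is the composition: each method only touches the parameters it targets, and the stack runs them all in one pass.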
## d9d.peft.all

Package for composing multiple PEFT methods into a stack.
### PeftStack

Bases: `PeftMethod[PeftStackConfig]`
A composite PEFT method that applies a list of methods sequentially.
#### `__init__(methods)`

Constructs a `PeftStack`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `methods` | `list[PeftMethod]` | A list of instantiated PEFT methods to apply in order. | *required* |
### PeftStackConfig

Configuration for `PeftStack`, bundling the individual PEFT configurations to apply.
### `peft_method_from_config(config)`

Factory function to instantiate the correct `PeftMethod` based on the configuration type.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `config` | `TConfig` | A specific PEFT configuration object (e.g., `LoRAConfig`). | *required* |
Returns:

| Type | Description |
|---|---|
| `PeftMethod[TConfig]` | The corresponding method instance. |
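One common way to implement such a factory is a registry keyed by configuration type. The sketch below is an assumption about the dispatch mechanism, not the library's actual implementation; `method_from_config`, `_REGISTRY`, and the stub classes are hypothetical names:

```python
from dataclasses import dataclass

@dataclass
class LoRAConfig:
    """Stub config, standing in for the real LoRAConfig."""
    rank: int = 8

class LoRAMethod:
    """Stub method instantiated from a LoRAConfig."""
    def __init__(self, config: LoRAConfig) -> None:
        self.config = config

# Registry mapping each config type to its method class.
_REGISTRY = {LoRAConfig: LoRAMethod}

def method_from_config(config):
    """Hypothetical factory: look up the method class by config type."""
    try:
        method_cls = _REGISTRY[type(config)]
    except KeyError:
        raise TypeError(
            f"no PEFT method registered for {type(config).__name__}"
        )
    return method_cls(config)

method = method_from_config(LoRAConfig(rank=4))
# method is a LoRAMethod carrying its LoRAConfig
```

A type-keyed registry keeps the factory open for extension: adding a new PEFT method means registering one new config-to-class pair rather than editing a chain of `isinstance` checks.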