LoRA Merging

Merge trained LoRA adapter weights into the base model so the result can be deployed as a single standalone checkpoint.

Command Line

python -m thinkrl.cli.merge_lora \
    --base_model meta-llama/Llama-3-8b \
    --lora_path ./checkpoints/final_lora \
    --output_path ./exported_model \
    --save_format safetensors
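
If you need the same result without ThinkRL installed, the merge can also be done directly with Hugging Face PEFT. The sketch below is not ThinkRL's implementation; it reuses the placeholder model name and paths from the command above:

import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the trained LoRA adapter.
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3-8b", torch_dtype=torch.bfloat16
)
model = PeftModel.from_pretrained(base, "./checkpoints/final_lora")

# Fold the low-rank updates into the base weights and drop the adapter layers.
merged = model.merge_and_unload()

# safe_serialization=True writes .safetensors, matching --save_format safetensors.
merged.save_pretrained("./exported_model", safe_serialization=True)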

Python API

from thinkrl.utils import merge_lora_weights

# Merge the LoRA adapter into the base model's weights in memory.
merged_model = merge_lora_weights(
    base_model="meta-llama/Llama-3-8b",
    lora_path="./checkpoints/final_lora",
)

# Write the merged model as a standalone checkpoint.
merged_model.save_pretrained("./exported_model")
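
Once merged, the exported directory is a regular Hugging Face checkpoint and can be loaded without ThinkRL or PEFT. A quick sanity check is to generate a few tokens from it. This assumes a tokenizer has also been saved to ./exported_model (for example via tokenizer.save_pretrained); otherwise load the tokenizer from the base model instead:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged checkpoint like any other pretrained model.
tokenizer = AutoTokenizer.from_pretrained("./exported_model")
model = AutoModelForCausalLM.from_pretrained(
    "./exported_model", torch_dtype=torch.bfloat16, device_map="auto"
)

# Generate a short completion to confirm the merged weights load and run.
inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))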