Post-LoRA Restoration: Utilizing Transferability of Low-Rank Adapter in Quantized Foundation Models

Yuto Kanda, Kenji Hatano
Proceedings: ICLR 2025 Workshop on Sparsity in LLMs (SLLM)
Venue: Singapore
Country: Singapore
Language: English
Publication year: 2025
Publication month: 4
Publication date: 2025-04-27

Abstract

In this study, we consider the transferability of LoRA adapters in quantized foundation models. Specifically, we investigate whether LoRA adapters trained on a low-bit-width foundation model still function effectively when merged into a higher-bit-width foundation model. By leveraging this transferability, models with performance comparable to conventional LoRA can be constructed from QLoRA adapters trained under resource-constrained conditions. Our method can be used not only to improve the performance of trained QLoRA models without additional training, but also to accelerate the construction of LoRA models.
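The transfer step itself requires no further training. Below is a minimal sketch of the idea, assuming the HuggingFace transformers/peft/bitsandbytes stack; the base model name and LoRA hyperparameters are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch (not the authors' code): train a LoRA adapter on a 4-bit
# quantized base model (QLoRA), then merge that same adapter into a
# higher-bit-width (bfloat16) copy of the base model.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import (LoraConfig, PeftModel, get_peft_model,
                  prepare_model_for_kbit_training)

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base model

# 1) Fine-tune a LoRA adapter on the 4-bit quantized model (QLoRA).
bnb_cfg = BitsAndBytesConfig(load_in_4bit=True,
                             bnb_4bit_compute_dtype=torch.bfloat16)
quantized = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_cfg)
quantized = prepare_model_for_kbit_training(quantized)
lora_cfg = LoraConfig(r=16, lora_alpha=32,
                      target_modules=["q_proj", "v_proj"],
                      task_type="CAUSAL_LM")
model = get_peft_model(quantized, lora_cfg)
# ... standard fine-tuning loop over the task data here ...
model.save_pretrained("qlora-adapter")

# 2) Transfer: attach the trained adapter to a higher-bit-width copy
# of the same base model and merge the low-rank update into its weights.
full_precision = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16)
restored = PeftModel.from_pretrained(full_precision, "qlora-adapter")
restored = restored.merge_and_unload()  # merged bf16 model, no retraining
```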

Citation

Yuto Kanda, Kenji Hatano, Post-LoRA Restoration: Utilizing Transferability of Low-Rank Adapter in Quantized Foundation Models, ICLR 2025 Workshop on Sparsity in LLMs (SLLM), 2025-04-27.
