Small question about the LoRA adapters

#1 by aastha6

Is the difference between the LoRA adapters for the language model vs. the vision tower vs. the tower connector just the target modules? Would it be possible to share the LoRA finetuning code? Also, how do you evaluate the quality difference among the three once they are merged with the base model? It would be very helpful - I’m trying to replicate the same setup for Qwen2.5-VL and Qwen3-VL.
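
For context, here is a minimal sketch of how I am currently setting up the three variants for Qwen2.5-VL, where the only thing that changes between them is `target_modules`. The module-name regexes are my own assumptions based on printing `model.named_modules()` for the Qwen2.5-VL implementation in transformers (they are not taken from your code), so please correct me if your adapters target different layers:

```python
# Sketch only; module names are assumptions from inspecting
# Qwen/Qwen2.5-VL-7B-Instruct and may differ across transformers versions.
from transformers import Qwen2_5_VLForConditionalGeneration
from peft import LoraConfig, get_peft_model

model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2.5-VL-7B-Instruct", torch_dtype="auto"
)

# Shared LoRA hyperparameters; the three variants differ only in target_modules.
common = dict(r=16, lora_alpha=32, lora_dropout=0.05, bias="none")

# 1) Language-model adapter: attention + MLP projections in the decoder layers.
lang_cfg = LoraConfig(
    target_modules=r".*layers\.\d+\.(self_attn\.(q_proj|k_proj|v_proj|o_proj)|mlp\.(gate_proj|up_proj|down_proj))",
    **common,
)

# 2) Vision-tower adapter: fused qkv/proj attention and the MLP inside the ViT blocks.
tower_cfg = LoraConfig(
    target_modules=r".*visual\.blocks\.\d+\.(attn\.(qkv|proj)|mlp\.(gate_proj|up_proj|down_proj))",
    **common,
)

# 3) Connector adapter: the two Linear layers of the patch-merger MLP
#    (Linear -> GELU -> Linear) that projects vision features into the LLM.
connector_cfg = LoraConfig(
    target_modules=r".*visual\.merger\.mlp\.(0|2)",
    **common,
)

# Train one variant at a time, e.g. the language-only adapter:
peft_model = get_peft_model(model, lang_cfg)
peft_model.print_trainable_parameters()
```

For the comparison, my current plan is to call `merge_and_unload()` on each trained adapter and evaluate the three merged models against the base on the same held-out set, but if you used a different evaluation recipe I would love to see it.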
