
adapter_model.safetensors appears to contain uninitialized weights

#2
by surendramph - opened

Hi, thank you for sharing PepDoRA on HuggingFace. I was trying to use the adapter for peptide embedding and noticed the following:

After loading the adapter with PeftModel.from_pretrained, the embeddings it produces are identical to those from the base ChemBERTa-77M-MLM model. Inspecting adapter_model.safetensors directly, I found:

- All lora_B matrices are exactly zero (the standard LoRA/DoRA initialization)
- All lora_magnitude_vector values exactly equal W₀.norm(dim=1) (also the DoRA initialization)

Mathematically, DoRA computes the effective weight as W' = m · (W₀ + B A) / ‖W₀ + B A‖_row. At initialization, with B = 0 and m = ‖W₀‖_row, this reduces to W' = W₀, so the adapter is a no-op and the adapted model is exactly equivalent to the base model.
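A small self-contained sketch of this argument (using NumPy with random dummy weights rather than the actual checkpoint tensors, and assuming PEFT's row-wise norm convention for the magnitude vector):

```python
import numpy as np

rng = np.random.default_rng(0)
out_features, in_features, rank = 8, 16, 4

W0 = rng.normal(size=(out_features, in_features))   # stand-in base weight
lora_A = rng.normal(size=(rank, in_features))        # A is random at init
lora_B = np.zeros((out_features, rank))              # LoRA/DoRA init: B = 0
magnitude = np.linalg.norm(W0, axis=1)               # DoRA init: m = ||W0||_row

# DoRA effective weight: m * (W0 + B @ A) / ||W0 + B @ A||_row
directed = W0 + lora_B @ lora_A
row_norm = np.linalg.norm(directed, axis=1, keepdims=True)
W_eff = magnitude[:, None] * directed / row_norm

# With B = 0 and m = ||W0||_row, the effective weight is exactly W0
assert np.allclose(W_eff, W0)
```

This is why weights matching these two initialization conditions imply the adapter cannot change the base model's outputs.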

Could you verify whether the uploaded checkpoint corresponds to the trained model from the paper, or whether an untrained/initial checkpoint was accidentally uploaded?

Thank you!
