AdaAnchor-LRA (LoRA + latent anchors)

This repo contains:

  • LoRA adapter weights (adapter_model.bin)
  • Tokenizer files
  • adaanchor_aux.pt (anchors + projection + layernorm for latent refinement)

  • Base model: meta-llama/Llama-3.2-1B
  • Latent steps (K): 2
  • Anchors (M): 8
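The card only states that adaanchor_aux.pt holds anchors, a projection, and a layernorm used for latent refinement over K steps. As a rough illustration of how such components could fit together (the actual update rule is not documented here, so everything below is an assumption), a refinement step might soft-attend over the M anchor vectors and apply a residual update:

```python
import torch
import torch.nn as nn

class LatentAnchorRefiner(nn.Module):
    """Hypothetical sketch: the repo only lists anchors + projection +
    layernorm; this particular update rule is an assumption, not the
    trained model's actual forward pass."""

    def __init__(self, hidden_size: int, num_anchors: int = 8, num_steps: int = 2):
        super().__init__()
        self.anchors = nn.Parameter(torch.randn(num_anchors, hidden_size))
        self.proj = nn.Linear(hidden_size, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)
        self.num_steps = num_steps

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq, hidden) hidden states from the base model
        for _ in range(self.num_steps):
            # Soft-attention weights over the M anchor vectors
            attn = torch.softmax(h @ self.anchors.T, dim=-1)  # (b, s, M)
            ctx = attn @ self.anchors                          # (b, s, hidden)
            # Project the anchor context and apply a residual + layernorm update
            h = self.norm(h + self.proj(ctx))
        return h

refiner = LatentAnchorRefiner(hidden_size=2048, num_anchors=8, num_steps=2)
h = torch.randn(1, 4, 2048)
out = refiner(h)
print(out.shape)  # torch.Size([1, 4, 2048])
```

The shapes match Llama-3.2-1B's hidden size (2048) and the K=2 / M=8 values above; swap in the tensors from adaanchor_aux.pt rather than the random initialization used here.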

How to load

Load the LoRA adapter with PEFT on top of the base model, then load adaanchor_aux.pt separately for the latent-refinement components; the full snippet is in the accompanying notebook.

Model ID: disha20041005/adaanchor-llama-3.2-1b-lora-math-k2