# Devstral-small-IFC-ACC LoRA

A LoRA adapter fine-tuned from Devstral-Small-2 24B for automated IFC compliance checking in the AEC (Architecture, Engineering, Construction) domain.

Given a natural-language compliance requirement in German, the model selects an appropriate query method and generates executable code to validate IFC building models: SQL, Cypher, or IFCOpenShell Python.
## Tasks
| Task | Description |
|---|---|
| Method Selection | Classify which query method(s) to use: SQL, Cypher, IFCOpenShell |
| SQL Generation | Generate SQL queries against a PostgreSQL schema parsed from IFC |
| Cypher Generation | Generate Cypher queries against a Neo4j knowledge graph of IFC entities |
| IFCOpenShell Code | Generate Python code using IFCOpenShell for geometric/spatial checks |
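To make the SQL-generation task concrete, here is a minimal, self-contained sketch of the kind of check the model is expected to produce. The table and column names (`ifc_door`, `overall_width`) and the example requirement are illustrative assumptions, not the project's actual PostgreSQL schema; SQLite stands in for PostgreSQL so the snippet runs anywhere.

```python
import sqlite3

# Toy stand-in for the PostgreSQL schema parsed from IFC
# (table/column names are illustrative assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ifc_door (global_id TEXT, overall_width REAL)")
conn.executemany(
    "INSERT INTO ifc_door VALUES (?, ?)",
    [("2O2Fr$t4X7Zf8NOew3FLOH", 0.885),   # synthetic IFC GlobalIds
     ("1hOSvn6df7F8_7GcBWlR72", 0.960)],
)

# Hypothetical requirement (German): "Alle Türen müssen mindestens 0,90 m breit sein."
# ("All doors must be at least 0.90 m wide.")
# The kind of SQL the model is trained to generate for it:
violations = conn.execute(
    "SELECT global_id FROM ifc_door WHERE overall_width < 0.90"
).fetchall()
# `violations` now lists the non-compliant doors.
```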
## Training

- Steps: 156 (3 epochs)
- Final loss: 0.23 (down from 1.61)
- Final token accuracy: 92.4%
### LoRA Configuration
| Parameter | Value |
|---|---|
| Rank | 8 |
| Alpha | 16 |
| Dropout | 0.05 |
| Target modules | `q_proj`, `k_proj`, `v_proj`, `o_proj`, `gate_proj`, `up_proj`, `down_proj` |
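A rank-8 adapter over all seven projection modules adds only a small number of trainable parameters. The sketch below estimates the count; the layer count and hidden/intermediate/attention dimensions are assumptions for a Mistral-Small-style 24B model and are not stated on this card.

```python
# Assumed dims for a Mistral-Small-style 24B model (NOT from this card):
hidden, inter, n_layers = 5120, 32768, 40
n_heads, n_kv_heads, head_dim = 32, 8, 128
attn_dim, kv_dim = n_heads * head_dim, n_kv_heads * head_dim
r = 8  # LoRA rank from the table above

def lora_params(d_in, d_out, r):
    # A is (r x d_in), B is (d_out x r) -> r * (d_in + d_out) parameters.
    return r * (d_in + d_out)

per_layer = (
    lora_params(hidden, attn_dim, r)      # q_proj
    + lora_params(hidden, kv_dim, r)      # k_proj
    + lora_params(hidden, kv_dim, r)      # v_proj
    + lora_params(attn_dim, hidden, r)    # o_proj
    + lora_params(hidden, inter, r)       # gate_proj
    + lora_params(hidden, inter, r)       # up_proj
    + lora_params(inter, hidden, r)       # down_proj
)
total = per_layer * n_layers  # roughly 46M trainable parameters under these assumptions
```

Under these assumed dimensions the adapter trains on the order of 0.2% of the base model's 24B parameters, which is the usual motivation for LoRA fine-tuning.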
### Hyperparameters
| Parameter | Value |
|---|---|
| Learning rate | 2e-5 |
| Batch size | 2 per device × 8 gradient-accumulation steps = 16 effective |
| Scheduler | Cosine (warmup 8%) |
| Optimizer | AdamW 8-bit |
| Max sequence length | 2048 |
| Precision | bf16 |
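The cosine schedule with 8% warmup over 156 steps can be sketched as a pure function of the step index. This is a generic reimplementation of the standard warmup-then-cosine-decay shape, not the project's actual training code.

```python
import math

# Values from the tables above; the exact rounding of warmup steps is an assumption.
total_steps, warmup_frac, base_lr = 156, 0.08, 2e-5
warmup_steps = max(1, round(total_steps * warmup_frac))  # ~12 steps

def lr_at(step):
    """Learning rate at a given optimizer step (0-indexed)."""
    if step < warmup_steps:
        # Linear warmup from ~0 up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1 + math.cos(math.pi * progress))
```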
## Usage

### With vLLM (Recommended)
```shell
vllm serve mistralai/Devstral-Small-2-24B-Instruct-2512 \
  --enable-lora \
  --lora-modules devstral-ifc-acc=Balaharikaran/devstral-ifc-acc-lora-v2 \
  --max-model-len 8192 \
  --quantization fp8
```
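Once the server is up, the adapter is addressed through vLLM's OpenAI-compatible API by using the adapter name from `--lora-modules` as the `model` field. The sketch below only builds the request payload with the standard library; the endpoint URL and the example requirement are illustrative assumptions.

```python
import json

# Hypothetical chat-completions payload for the vLLM server started above.
# The "model" field selects the LoRA adapter by the name given to --lora-modules.
payload = {
    "model": "devstral-ifc-acc",
    "messages": [
        {
            "role": "user",
            # Example German requirement (hypothetical):
            # "All doors must be at least 0.90 m wide."
            "content": "Alle Türen müssen mindestens 0,90 m breit sein.",
        }
    ],
    "temperature": 0.0,
}
body = json.dumps(payload).encode("utf-8")
# POST `body` to http://localhost:8000/v1/chat/completions
# with Content-Type: application/json (e.g. via urllib.request or requests).
```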
## Intended Use

- Automated compliance checking of IFC building models against regulatory requirements
- Research on domain-specific LLM fine-tuning for AEC/BIM
- Part of an MSc thesis at TU Munich
## Framework Versions
PEFT 0.18.1 · TRL 0.27.2 · Transformers 5.1.0 · PyTorch 2.4.1+cu124
## Base Model

mistralai/Mistral-Small-3.1-24B-Base-2503