Quantum LLM Training
Quantum Student Learning Pipeline
Train quantum-compressed models using knowledge distillation from pretrained classical teacher models.
1. Teacher Model: load the pretrained classical model.
2. Quantum Student: initialize the quantum circuit.
3. Distillation: transfer knowledge from the teacher to the student (a loss sketch follows this list).
4. Compression: apply quantum compression.
5. Training: run quantum-enhanced training.
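The distillation stage matches the student's output distribution to the classical teacher's. The sketch below is a minimal knowledge-distillation loss in PyTorch, under stated assumptions: QuantumStudent is a hypothetical placeholder (a tiny classical head stands in for the quantum-circuit-backed student so the code runs), and temperature and alpha are illustrative hyperparameters, not values from this pipeline.

```python
import torch.nn as nn
import torch.nn.functional as F


class QuantumStudent(nn.Module):
    """Hypothetical placeholder for the quantum-circuit-backed student.
    The real pipeline would route hidden states through a parameterized
    quantum circuit; a small classical head keeps this sketch runnable."""

    def __init__(self, vocab_size: int, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, input_ids):
        # Returns logits of shape (batch, seq_len, vocab_size).
        return self.head(self.embed(input_ids))


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with standard
    cross-entropy against the ground-truth tokens."""
    vocab = student_logits.size(-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_student.view(-1, vocab), soft_teacher.view(-1, vocab),
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits.view(-1, vocab), labels.view(-1),
                         ignore_index=-100)
    return alpha * kd + (1.0 - alpha) * ce
```

During training the teacher runs under torch.no_grad() with its parameters frozen, so only the student's parameters receive gradients from this loss.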
Quantum Training Configuration
Recommended: the medical pretrained model (checkpoint-5000) available on the Teraq Backend instance. It was trained on medical data and is accessible from the quantum training instance at:
/home/ec2-user/Training_Data/models/tinyllama-1b-medical-phase1-48vcpu/checkpoint-5000
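A minimal sketch for loading this checkpoint as the frozen teacher, assuming it was saved in the standard Hugging Face format; the exact loading call depends on how checkpoint-5000 was exported.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

TEACHER_PATH = ("/home/ec2-user/Training_Data/models/"
                "tinyllama-1b-medical-phase1-48vcpu/checkpoint-5000")

# Load the medical checkpoint as the teacher; assumes a standard
# Hugging Face checkpoint directory (config.json plus model weights).
teacher = AutoModelForCausalLM.from_pretrained(TEACHER_PATH)

# Trainer checkpoints do not always include tokenizer files; if this fails,
# load the tokenizer from the base TinyLlama model instead.
tokenizer = AutoTokenizer.from_pretrained(TEACHER_PATH)

teacher.eval()  # the teacher stays frozen during distillation
for param in teacher.parameters():
    param.requires_grad_(False)
```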
The training corpus consists of medical discharge summaries from the MIMIC-III and MIMIC-IV datasets, used for quantum-enhanced LLM training.
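The sketch below shows one plausible way to pull discharge summaries out of the raw note tables with pandas. The file names (NOTEEVENTS.csv.gz, discharge.csv.gz), column names, and output path are assumptions based on the public MIMIC-III and MIMIC-IV releases, not details confirmed here; adjust them to match the copy available on the training instance.

```python
import pandas as pd

# MIMIC-III: notes live in NOTEEVENTS, filtered by category (assumed layout).
mimic3 = pd.read_csv("NOTEEVENTS.csv.gz", usecols=["CATEGORY", "TEXT"])
mimic3_discharge = mimic3.loc[mimic3["CATEGORY"] == "Discharge summary", "TEXT"]

# MIMIC-IV: the note module ships discharge summaries in their own file (assumed layout).
mimic4_discharge = pd.read_csv("discharge.csv.gz", usecols=["text"])["text"]

# Combine both sources into one raw-text corpus for tokenization.
corpus = pd.concat([mimic3_discharge, mimic4_discharge],
                   ignore_index=True).rename("text")
corpus.to_csv("discharge_summaries.csv", index=False)
```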