Quantum large language model fine-tuning
3 Pith papers cite this work.
Citing papers
- Quantum-enhanced Large Language Models on Quantum Hardware via Cayley Unitary Adapters. Cayley unitary adapters executed on real quantum hardware improve LLM perplexity by 1.4% on Llama 3.1 8B with 6,000 parameters and recover 83% of compression-induced degradation on SmolLM2.
- Quantum Parity Representations: Learnable Basis Discovery, Encoders, and Shadow Deployment. Hybrid quantum training discovers parity bases that improve accuracy by 24-42% on binary tasks and recover performance on text benchmarks, while all inference remains classical.
- MerLin: A Discovery Engine for Photonic and Hybrid Quantum Machine Learning. MerLin is a new open-source discovery engine for photonic and hybrid quantum machine learning that integrates circuit simulations into standard ML frameworks and reproduces 18 prior works as reusable benchmarks.
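The first entry's "Cayley unitary adapters" build on the classical Cayley transform, which maps any skew-Hermitian matrix to a unitary one and so lets a small set of unconstrained parameters define a valid unitary layer. The sketch below illustrates that transform in plain NumPy; the function name and shapes are illustrative assumptions, not the paper's API.

```python
import numpy as np

def cayley_unitary(A):
    """Map a skew-Hermitian matrix A (A^H = -A) to a unitary matrix
    via the Cayley transform U = (I - A)(I + A)^{-1}.
    I + A is always invertible here because A's eigenvalues are
    purely imaginary, so 1 + i*lambda is never zero."""
    n = A.shape[0]
    I = np.eye(n)
    return (I - A) @ np.linalg.inv(I + A)

# Build a random skew-Hermitian matrix: A = M - M^H guarantees A^H = -A.
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = M - M.conj().T

U = cayley_unitary(A)
# U is unitary: U^H U = I up to floating-point error.
print(np.allclose(U.conj().T @ U, np.eye(4)))  # True
```

Because the map is differentiable in the entries of A, an adapter can train A freely with ordinary gradient descent while the resulting layer stays exactly unitary.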