arXiv preprint arXiv:2504.17696
2 Pith papers cite this work.
Citing papers:
- Information Router for Mitigating Modality Dominance in Vision-Language Models
  MoIR mitigates modality dominance in VLMs by explicitly enriching low-information tokens with routed data from stronger modalities prior to LLM processing, yielding more balanced contributions and improved robustness under degradation.
- MER-DG: Modality-Entropy Regularization for Multimodal Domain Generalization
  MER-DG applies modality-entropy regularization to reduce fusion overfitting in multimodal domain generalization, reporting average gains of 5% over standard fusion and 2% over prior methods on EPIC-Kitchens and HAC benchmarks.
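The routing idea in the MoIR summary above can be sketched as follows. This is a minimal illustration, not the paper's method: the information score (token L2 norm), the threshold `tau`, and the function `route_information` are all assumptions standing in for whatever MoIR actually uses.

```python
import numpy as np

def route_information(text_tokens, image_tokens, tau=0.5):
    """Hypothetical sketch of MoIR-style routing; names and threshold are assumed.

    Tokens whose information score (here: L2 norm, a stand-in proxy) falls below
    `tau` receive a similarity-weighted mixture of tokens from the stronger
    modality before the fused sequence reaches the LLM.
    """
    scores = np.linalg.norm(text_tokens, axis=-1)       # per-token "information" proxy
    weak = scores < tau                                  # low-information tokens
    sim = text_tokens[weak] @ image_tokens.T             # similarity to each image token
    attn = np.exp(sim - sim.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)             # softmax routing weights
    enriched = text_tokens.copy()
    enriched[weak] += attn @ image_tokens                # add routed cross-modal content
    return enriched
```

High-information tokens pass through unchanged, so the routing only spends capacity where one modality is under-contributing.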
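The MER-DG summary above can likewise be illustrated with a toy regularizer. This is a sketch under assumptions, not the paper's loss: the function name `modality_entropy_penalty` and the use of per-modality fusion logits are hypothetical.

```python
import numpy as np

def modality_entropy_penalty(logits):
    """Hypothetical modality-entropy regularizer (assumed form, not MER-DG's exact loss).

    `logits` are per-modality fusion scores. The penalty is the negative Shannon
    entropy of their softmax, so adding it to the task loss pushes the fusion
    toward balanced modality usage instead of overfitting to one modality.
    """
    p = np.exp(logits - logits.max())
    p /= p.sum()                                   # softmax over modalities
    entropy = -(p * np.log(p + 1e-12)).sum()       # Shannon entropy of the weights
    return -entropy                                # minimized when usage is balanced
```

A uniform distribution over modalities attains the minimum penalty, while a fusion dominated by one modality is penalized more heavily.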