Tag: Mixture of Experts
All the articles with the tag "Mixture of Experts".
-
MINGLE: Mixtures of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging
MINGLE proposes a test-time continual model merging method that combines a mixture-of-low-rank-experts architecture with adaptive null-space-constrained gating. Using only a small number of unlabeled test samples to dynamically fuse models, it significantly improves generalization in continual learning while reducing catastrophic forgetting.
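
To make the core idea concrete, below is a minimal PyTorch sketch of a mixture of low-rank (LoRA-style) experts whose gate input is projected onto the null space of old-task features, so that gate adaptation interferes less with previously merged behavior. This is an illustrative reconstruction under stated assumptions, not the paper's implementation; the class names (`LowRankExpert`, `NullSpaceGatedMoE`), the SVD-based null-space estimate, and the `energy` threshold are all hypothetical.

```python
import torch
import torch.nn as nn


class LowRankExpert(nn.Module):
    """A LoRA-style low-rank delta: B @ A with rank r << d."""

    def __init__(self, d_in, d_out, rank=8):
        super().__init__()
        self.A = nn.Linear(d_in, rank, bias=False)
        self.B = nn.Linear(rank, d_out, bias=False)
        nn.init.zeros_(self.B.weight)  # experts start as a zero delta

    def forward(self, x):
        return self.B(self.A(x))


class NullSpaceGatedMoE(nn.Module):
    """Mixture of low-rank experts on top of a frozen base layer.

    The gate sees inputs projected onto the (approximate) null space of
    features from earlier tasks, so training the gate on a few unlabeled
    test samples perturbs old-task routing as little as possible.
    """

    def __init__(self, base_layer, num_experts, rank=8):
        super().__init__()
        d_in = base_layer.in_features
        d_out = base_layer.out_features
        self.base = base_layer
        self.experts = nn.ModuleList(
            LowRankExpert(d_in, d_out, rank) for _ in range(num_experts)
        )
        self.gate = nn.Linear(d_in, num_experts, bias=False)
        # Projector onto the old-task null space; identity until estimated.
        self.register_buffer("null_proj", torch.eye(d_in))

    @torch.no_grad()
    def set_null_space(self, old_features, energy=0.99):
        """Estimate the principal subspace of old-task features (N, d_in)
        via SVD and store the projector onto its orthogonal complement."""
        U, S, _ = torch.linalg.svd(old_features.T, full_matrices=False)
        k = int((S.cumsum(0) / S.sum() < energy).sum()) + 1
        Uk = U[:, :k]
        self.null_proj.copy_(torch.eye(U.shape[0]) - Uk @ Uk.T)

    def forward(self, x):
        # Gate on the null-space-projected input, then mix expert deltas.
        w = torch.softmax(self.gate(x @ self.null_proj.T), dim=-1)  # (B, E)
        delta = torch.stack([e(x) for e in self.experts], dim=-1)   # (B, d_out, E)
        return self.base(x) + (delta * w.unsqueeze(-2)).sum(-1)
```

In a test-time continual merging loop, one would presumably freeze the base layer and experts and adapt only `gate` on the few unlabeled test samples (e.g. with an entropy-minimization objective), calling `set_null_space` with cached features before each new task so the constraint tracks what has already been merged.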