Dual-Stage Reweighted MoE for Long-Tailed Egocentric Mistake Detection
Boyu Han, Qianqian Xu, Shilong Bao, Zhiyong Yang, Sicong Li, Qingming Huang
https://arxiv.org/abs/2509.12990
In this report, we address the problem of determining whether a user performs an action incorrectly from egocentric video data. To handle the challenges posed by subtle and infrequent mistakes, we propose a Dual-Stage Reweighted Mixture-of-Experts (DR-MoE) framework. In the first stage, features are extracted using a frozen ViViT model and a LoRA-tuned ViViT model, which are combined through a feature-level expert module. In the second stage, three classifiers are trained with different objectives…
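To make the two-stage structure concrete, below is a minimal PyTorch sketch of a dual-stage reweighted mixture-of-experts head. It assumes 768-dimensional ViViT features, binary correct/mistake classes, and a simple softmax-gated weighted sum at both stages; the module names (`FeatureLevelMoE`, `PredictionLevelMoE`, `DRMoEHead`) and the exact gating form are illustrative assumptions, not the authors' implementation, and the stage-two objectives are omitted since the abstract is truncated.

```python
# Hypothetical sketch of a dual-stage reweighted MoE head (not the authors' code).
import torch
import torch.nn as nn


class FeatureLevelMoE(nn.Module):
    """Stage 1: softly combine the frozen and LoRA-tuned ViViT feature streams."""

    def __init__(self, dim=768):
        super().__init__()
        # Gate predicts two mixing weights from the concatenated features.
        self.gate = nn.Sequential(nn.Linear(2 * dim, 2), nn.Softmax(dim=-1))

    def forward(self, feat_frozen, feat_lora):
        w = self.gate(torch.cat([feat_frozen, feat_lora], dim=-1))  # (B, 2)
        return w[:, :1] * feat_frozen + w[:, 1:] * feat_lora        # (B, dim)


class PredictionLevelMoE(nn.Module):
    """Stage 2: three classifier experts whose logits are reweighted and summed."""

    def __init__(self, dim=768, num_classes=2, num_experts=3):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Linear(dim, num_classes) for _ in range(num_experts)
        )
        self.gate = nn.Sequential(nn.Linear(dim, num_experts), nn.Softmax(dim=-1))

    def forward(self, feat):
        logits = torch.stack([e(feat) for e in self.experts], dim=1)  # (B, E, C)
        w = self.gate(feat).unsqueeze(-1)                             # (B, E, 1)
        return (w * logits).sum(dim=1)                                # (B, C)


class DRMoEHead(nn.Module):
    """Chain the feature-level and prediction-level expert modules."""

    def __init__(self, dim=768, num_classes=2):
        super().__init__()
        self.stage1 = FeatureLevelMoE(dim)
        self.stage2 = PredictionLevelMoE(dim, num_classes)

    def forward(self, feat_frozen, feat_lora):
        fused = self.stage1(feat_frozen, feat_lora)
        return self.stage2(fused)


# Usage with random tensors standing in for the two ViViT feature streams.
head = DRMoEHead()
f_frozen = torch.randn(4, 768)  # features from the frozen ViViT
f_lora = torch.randn(4, 768)    # features from the LoRA-tuned ViViT
print(head(f_frozen, f_lora).shape)  # torch.Size([4, 2])
```

In this sketch, reweighting is realized as learned softmax gates over the two feature streams and the three classifier experts; how the paper's stage-two classifiers are trained and reweighted for the long-tailed setting is not specified in the excerpt above.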