Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
The official source code of an IEA/AIE 2021 paper that implements a gated network for fusing features from different modalities for object detection (see the sketch below).
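The gated-fusion idea behind this entry can be illustrated with a minimal PyTorch sketch. This is not the paper's code: the module name `GatedFusion`, the two-modality setup, and the feature dimensions are assumptions. A small gating branch predicts a per-pixel weight from the concatenated modality features and blends the two feature maps before they reach the detection head.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Minimal gated fusion of two modality feature maps (e.g., RGB + thermal)."""

    def __init__(self, channels: int):
        super().__init__()
        # Gating branch: concatenated features -> one gate value per spatial location.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),  # gate value in [0, 1]
        )

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor) -> torch.Tensor:
        g = self.gate(torch.cat([feat_a, feat_b], dim=1))  # (B, 1, H, W)
        return g * feat_a + (1.0 - g) * feat_b             # convex blend of the two modalities


if __name__ == "__main__":
    rgb = torch.randn(2, 64, 32, 32)      # hypothetical RGB backbone features
    thermal = torch.randn(2, 64, 32, 32)  # hypothetical second-modality features
    print(GatedFusion(channels=64)(rgb, thermal).shape)  # torch.Size([2, 64, 32, 32])
```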
AutoMoE: a PyTorch Mixture‑of‑Experts self‑driving stack for CARLA with trained perception experts, a gating network, and a trajectory policy, plus datasets and training/inference scripts.
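The expert-routing pattern a stack like this relies on can be sketched generically. This is not AutoMoE's implementation; the class name `MixtureOfExperts`, the top-k routing, and all dimensions are assumptions. A gating network scores the experts from a shared feature, keeps the top-k, and returns a softmax-weighted mixture of their outputs (e.g., trajectory parameters).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureOfExperts(nn.Module):
    """Generic top-k gated mixture of experts over a shared input feature."""

    def __init__(self, in_dim: int, out_dim: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, out_dim))
             for _ in range(num_experts)]
        )
        self.gate = nn.Linear(in_dim, num_experts)  # gating network: feature -> expert logits
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.gate(x)                                # (B, E)
        top_vals, top_idx = logits.topk(self.top_k, dim=-1)  # keep the k highest-scoring experts
        weights = F.softmax(top_vals, dim=-1)                # (B, k) mixture weights
        # For clarity every expert is evaluated; a real sparse MoE would skip unselected ones.
        all_out = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, out_dim)
        picked = torch.gather(
            all_out, 1, top_idx.unsqueeze(-1).expand(-1, -1, all_out.size(-1))
        )                                                    # (B, k, out_dim)
        return (weights.unsqueeze(-1) * picked).sum(dim=1)   # (B, out_dim)


if __name__ == "__main__":
    feats = torch.randn(4, 256)   # hypothetical fused perception features
    moe = MixtureOfExperts(in_dim=256, out_dim=12, num_experts=4, top_k=2)
    print(moe(feats).shape)       # torch.Size([4, 12])
```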