Glossary · Foundations
Mixture of experts (MoE)
A model architecture in which a router activates only a small subset of specialized "expert" subnetworks for each token, so compute per token stays low even as total parameter count grows.
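A minimal sketch of the idea, assuming PyTorch; the class name, layer sizes, and top-2 routing below are illustrative, not any particular model's implementation.

```python
import torch
import torch.nn as nn

class MoELayer(nn.Module):
    """Toy mixture-of-experts layer with per-token top-k routing (illustrative)."""

    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        # Each expert is a small feed-forward subnetwork.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(dim, num_experts)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Route each token to its top-k experts only.
        scores = self.router(x)                             # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # per-token expert choice
        weights = weights.softmax(dim=-1)                   # normalize chosen scores
        out = torch.zeros_like(x)
        # Only the selected experts run for each token: this sparse activation
        # is what makes MoE cheaper per token than an equally large dense layer.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer(dim=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```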