Glossary · Foundations

Mixture of experts (MoE)

A model architecture in which a learned router activates only a small subset of expert subnetworks for each token, keeping compute per token low while the total parameter count can grow much larger than in a dense model.

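A minimal sketch of the idea, assuming PyTorch and a top-k routing scheme; the names (`MoELayer`, `n_experts`, `top_k`) are illustrative and not tied to any particular model:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Experts: independent feed-forward subnetworks.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token only runs through its top-k experts.
        scores = self.router(x)                         # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # pick best experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Only top_k of n_experts run per token, so per-token compute stays roughly
# constant even as n_experts (and total parameters) grows.
layer = MoELayer(d_model=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```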