Mixture of experts (MoE) is a machine learning technique in which multiple expert networks (learners) are used to divide a problem space into homogeneous regions.[1] MoE is a form of ensemble learning.[2] Such models were also historically called committee machines.[3] Typically, a trainable gating network selects or weights the experts' outputs for each input.
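A minimal sketch of the idea in Python/NumPy, assuming linear experts and a softmax gating network: the gate produces a weight per expert for each input, and the layer's output is the gate-weighted combination of the experts' outputs. The class name, parameter shapes, and the choice of linear experts are illustrative assumptions, not a specific published model.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Soft-gated MoE sketch: each expert is a linear map; a gating
    network softly weights the experts' outputs per input (assumed
    architecture for illustration only)."""

    def __init__(self, n_experts, d_in, d_out):
        self.W_experts = rng.normal(size=(n_experts, d_in, d_out)) * 0.1
        self.W_gate = rng.normal(size=(d_in, n_experts)) * 0.1

    def __call__(self, x):
        # x: (batch, d_in)
        gate = softmax(x @ self.W_gate)  # (batch, n_experts), sums to 1 per row
        # Every expert processes every input: (batch, n_experts, d_out).
        expert_out = np.einsum('bi,eio->beo', x, self.W_experts)
        # Combine expert outputs, weighted by the gate: (batch, d_out).
        return np.einsum('be,beo->bo', gate, expert_out)

moe = MixtureOfExperts(n_experts=4, d_in=8, d_out=3)
y = moe(rng.normal(size=(5, 8)))
print(y.shape)  # (5, 3)
```

In practice the gate and experts are trained jointly, so each expert specializes on the region of input space where the gate routes it the most weight.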