A multilayer perceptron (MLP) is a modern feedforward artificial neural network consisting of fully connected neurons with nonlinear activation functions, organized in at least three layers. It is notable for being able to distinguish data that is not linearly separable.[1]
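The layer structure described above can be sketched in a few lines. This is a minimal illustration, not a reference implementation: the layer sizes, the tanh hidden activation, and the random initialization are all assumptions chosen for the example.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One forward pass through a minimal three-layer MLP:
    input layer -> fully connected hidden layer (tanh) -> linear output layer."""
    h = np.tanh(W1 @ x + b1)   # hidden layer: affine map followed by a nonlinearity
    return W2 @ h + b2         # output layer: affine map

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)  # 2 inputs -> 4 hidden units (sizes are arbitrary)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # 4 hidden units -> 1 output
y = mlp_forward(np.array([0.5, -1.0]), W1, b1, W2, b2)
```

Without the nonlinearity in the hidden layer, the two affine maps would collapse into a single affine map, and the network could only represent linear decision boundaries.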
Modern feedforward networks are trained using the backpropagation method[2][3][4][5][6] and are colloquially referred to as "vanilla" neural networks.[7]
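Backpropagation training can be demonstrated end to end on XOR, the standard example of a task a single-layer perceptron cannot solve. The sketch below hand-derives the gradients for a small sigmoid network trained by full-batch gradient descent on mean squared error; the hidden-layer width, learning rate, iteration count, and random seed are all illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # 2 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # 4 hidden units -> 1 output

lr, losses = 1.0, []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)                 # hidden activations
    y = sigmoid(h @ W2 + b2)                 # network outputs
    losses.append(float(((y - t) ** 2).mean()))
    # Backward pass: chain rule through the sigmoid at each layer.
    dy = (y - t) * y * (1 - y)               # error signal at the output layer
    dh = (dy @ W2.T) * h * (1 - h)           # error propagated back to the hidden layer
    # Gradient-descent updates.
    W2 -= lr * h.T @ dy; b2 -= lr * dy.sum(axis=0)
    W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

After training, `preds` is thresholded at 0.5; with a suitable initialization the loss falls steadily and the four XOR cases are classified correctly.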
MLPs grew out of an effort to improve single-layer perceptrons, which could only distinguish linearly separable data. A perceptron traditionally used a Heaviside step function as its nonlinear activation function. However, the backpropagation algorithm requires activation functions with usable derivatives, so modern MLPs use continuously differentiable functions such as the logistic sigmoid, or piecewise-differentiable ones such as ReLU (differentiable everywhere except at zero).
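The reason the Heaviside step function is unsuitable for backpropagation can be made concrete by comparing derivatives. This is an illustrative sketch; the function names are the example's own, and the ReLU gradient at zero uses the common subgradient convention.

```python
import numpy as np

def heaviside_grad(z):
    # The step function is flat on both sides of its jump at 0,
    # so its gradient is zero almost everywhere: backpropagation
    # receives no learning signal through it.
    return np.zeros_like(z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # nonzero everywhere, with maximum 0.25 at z = 0

def relu_grad(z):
    # Differentiable except at z = 0; the convention here assigns 0 there.
    return (z > 0).astype(float)

z = np.linspace(-3.0, 3.0, 7)
```

Evaluating these on any grid of inputs shows the contrast: the step function's gradient is identically zero, while the sigmoid and ReLU gradients carry the error signal that gradient descent needs.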