A multilayer perceptron is a prototypical network of modern connectionism. Like the simpler perceptron, it has a set of input units to represent environmental inputs, and a set of output units to represent responses to these inputs. However, it also has one or more layers of hidden units that serve as intermediate processors capable of detecting complex features present in the inputs. It is these hidden units that give the multilayer perceptron its exceptional power: to be an arbitrary pattern classifier (Lippmann, 1989), a universal function approximator (Hornik et al., 1989), or to be equivalent in power to a universal Turing machine (Siegelmann, 1999). The discovery of learning rules capable of training such powerful networks has led to the emergence of the connectionist alternative to classical cognitive science (Rumelhart et al., 1986).
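The architecture described above can be sketched in a few lines of code. The following is a minimal, illustrative forward pass through one hidden layer; the layer sizes, random weights, and sigmoid activation are arbitrary assumptions for demonstration, not details drawn from the entry itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Smooth squashing function commonly used for unit activations.
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, W_hidden, b_hidden, W_out, b_out):
    # Hidden units combine the inputs nonlinearly, acting as
    # intermediate feature detectors.
    h = sigmoid(x @ W_hidden + b_hidden)
    # Output units map the hidden-layer features to responses.
    return sigmoid(h @ W_out + b_out)

# Assumed sizes: 4 input units, 3 hidden units, 2 output units.
W_h = rng.normal(size=(4, 3))
b_h = np.zeros(3)
W_o = rng.normal(size=(3, 2))
b_o = np.zeros(2)

x = np.array([0.0, 1.0, 1.0, 0.0])  # one environmental input pattern
y = mlp_forward(x, W_h, b_h, W_o, b_o)
print(y)  # one activation per output unit, each in (0, 1)
```

In a trained network the weights would be set by a learning rule such as the back-propagation procedure of Rumelhart et al. (1986), rather than drawn at random as here.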
References:
- Hornik, K., Stinchcombe, M., & White, H. (1989). Multilayer feedforward networks are universal approximators. Neural Networks, 2, 359-366.
- Lippmann, R. P. (1989). Pattern classification using neural networks. IEEE Communications Magazine, November, 47-64.
- Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323, 533-536.
- Siegelmann, H. T. (1999). Neural Networks and Analog Computation: Beyond the Turing Limit. Boston, MA: Birkhauser.
(Added October 2009)