Neural Networks
Revision as of 17:38, 7 January 2018
=Internal=
=Overview=
A neural network consists of several layers of activation units ("individual neurons"), where the output of each activation unit in one layer is connected to the inputs of all activation units in the successive layer. The behavior of an individual activation unit is described in the "Individual Unit" section. A neural network's topology, along with the conventions and notation - which are essential to get right in order to follow the linear algebra equations - is discussed in the "Topology" section. A neural network produces a prediction for a specific input sample by forward propagating the input, then the activations, across its layers from left to right, until the output layer computes the hypothesis function. The forward propagation process is described in the "Forward Propagation" section. Forward propagation is computed using a set of parameters (or weights) obtained by training the network. Training the network, or "fitting the parameters", is performed with the backpropagation algorithm, described in the "Backpropagation" section.
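As a concrete illustration of the forward propagation described above, here is a minimal sketch in Python/NumPy. It assumes sigmoid activation units, a bias unit prepended to each layer, and a small fixed topology; the function name <code>forward_propagate</code> and the specific weight values are illustrative, not from the article.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation applied by each individual unit
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, weights):
    """Propagate one input sample left to right across the layers.

    weights[l] is the parameter matrix mapping layer l (plus its bias
    unit) to layer l+1; the final layer's activation is the hypothesis.
    """
    a = x
    for Theta in weights:
        a = np.concatenate(([1.0], a))  # prepend the bias unit
        a = sigmoid(Theta @ a)          # next layer's activations
    return a

# Tiny 2-3-1 network with hand-picked parameters (illustrative values;
# in practice these are obtained by training, i.e. backpropagation)
Theta1 = np.array([[0.1,  0.2, -0.1],
                   [0.0, -0.3,  0.2],
                   [0.4,  0.1,  0.1]])      # 3 units, 2 inputs + bias
Theta2 = np.array([[0.2, -0.5, 0.3, 0.1]])  # 1 output, 3 inputs + bias

h = forward_propagate(np.array([0.5, -0.2]), [Theta1, Theta2])
print(h)  # the hypothesis for this sample: a single value in (0, 1)
```

Because every unit applies the sigmoid, the output is always strictly between 0 and 1, which is why this kind of output layer is commonly read as a probability-like score.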