Backpropagation

Artificial neural networks use the backpropagation method to calculate the error contribution of each neuron after a batch of data (for example, a batch of images in image recognition) is processed. It is a special case of an older and more general technique called automatic differentiation. It is commonly used by the gradient descent optimization algorithm to adjust the weights of neurons by calculating the gradient of the loss function. The technique is also sometimes called backward propagation of errors, because the error is calculated at the output and distributed back through the layers of the neural network.
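
A minimal sketch of the idea, written here in Python on a hypothetical two-layer network trained on the XOR problem; the layer sizes, sigmoid activation, squared-error loss and learning rate are illustrative assumptions, not details taken from the text above:

    import numpy as np

    # Illustrative sketch only: a tiny two-layer network trained with
    # backpropagation and gradient descent on the XOR problem.
    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy batch of data: XOR inputs and targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Randomly initialised weights and biases (2 inputs, 4 hidden units, 1 output).
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
    lr = 0.5  # gradient-descent step size (an assumed value)

    for epoch in range(5000):
        # Forward pass: compute activations layer by layer.
        a1 = sigmoid(X @ W1 + b1)
        a2 = sigmoid(a1 @ W2 + b2)

        # Mean squared error loss over the batch.
        loss = np.mean((a2 - y) ** 2)

        # Backward pass: the error is calculated at the output and
        # distributed back through the layers via the chain rule.
        d_a2 = 2 * (a2 - y) / len(X)           # dLoss/d(output activation)
        d_z2 = d_a2 * a2 * (1 - a2)            # through the sigmoid derivative
        d_W2 = a1.T @ d_z2                     # error contribution of output weights
        d_b2 = d_z2.sum(axis=0, keepdims=True)

        d_a1 = d_z2 @ W2.T                     # error propagated to the hidden layer
        d_z1 = d_a1 * a1 * (1 - a1)
        d_W1 = X.T @ d_z1                      # error contribution of hidden weights
        d_b1 = d_z1.sum(axis=0, keepdims=True)

        # Gradient-descent update: adjust each weight along its gradient.
        W2 -= lr * d_W2; b2 -= lr * d_b2
        W1 -= lr * d_W1; b1 -= lr * d_b1

    print("final loss:", loss)
    print("predictions:", a2.round(3).ravel())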

  • Back-propagation for nonlinear self-tuning adaptive control
  • Control chart pattern recognition using backpropagation
  • Backpropagation Neural Network Implementation for Medical Image Compression
  • Time series forecasting using backpropagation neural networks
  • Backpropagation Neural Network in Tidal-Level Forecasting
