Backpropagation

Artificial neural networks use the backpropagation method to calculate each neuron's contribution to the error after a batch of data (for example, in image recognition) has been processed. It is a special case of an older and more general technique called automatic differentiation. It is commonly used together with the gradient descent optimization algorithm to adjust the weights of the neurons by calculating the gradient of the loss function. The technique is also sometimes called backward propagation of errors, because the error is calculated at the output and distributed back through the layers of the network.
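
As a rough illustration of the idea above, the sketch below trains a tiny one-hidden-layer network on XOR data using NumPy, computing the gradients by hand with the chain rule and updating the weights by gradient descent. The layer sizes, learning rate, and toy data are illustrative assumptions, not something specified on this page.

```python
# Minimal backpropagation sketch for a one-hidden-layer network (NumPy only).
import numpy as np

rng = np.random.default_rng(0)

# Toy batch: the XOR problem (assumed here purely for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for input->hidden and hidden->output layers.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10000):
    # Forward pass: compute activations layer by layer.
    z1 = X @ W1 + b1
    a1 = sigmoid(z1)
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)

    # Mean squared error loss over the batch.
    loss = np.mean((a2 - y) ** 2)

    # Backward pass: the error is computed at the output and
    # distributed back through the layers via the chain rule.
    d_a2 = 2 * (a2 - y) / len(X)            # dL/da2
    d_z2 = d_a2 * a2 * (1 - a2)             # dL/dz2 (sigmoid derivative)
    d_W2 = a1.T @ d_z2                      # gradient for output-layer weights
    d_b2 = d_z2.sum(axis=0, keepdims=True)

    d_a1 = d_z2 @ W2.T                      # error contribution passed to hidden layer
    d_z1 = d_a1 * a1 * (1 - a1)
    d_W1 = X.T @ d_z1                       # gradient for hidden-layer weights
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent step: adjust weights using the computed gradients.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print("final loss:", loss)
print("predictions:", a2.ravel().round(3))
```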

  • Back-propagation for nonlinear self-tuning adaptive control
  • Control chart pattern recognition using backpropagation
  • Backpropagation Neural Network Implementation for Medical Image Compression
  • Time series forecasting using backpropagation neural networks
  • Backpropagation Neural Network in Tidal-Level Forecasting

Related Conferences of Backpropagation

  • May 22-23, 2024: 11th Global Meet on Wireless and Satellite Communications, Amsterdam, Netherlands
  • July 25-26, 2024: 23rd International Conference on Big Data & Data Analytics, Amsterdam, Netherlands
  • September 19-20, 2024: 11th Global Innovators Summit, London, UK
  • November 20-21, 2024: 5th World Congress on Robotics and Automation, Paris, France
