**[Learning representations by back-propagating errors](https://doi.org/10.1038/323533a0)**
*David E. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams (1986)*
Published in *Nature*
- The back-propagation procedure repeatedly adjusts the connection weights by propagating error derivatives backward from the output layer toward the input layer, so as to minimize the difference between the network's actual output and the desired output (see the sketch below).
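  The following is a minimal sketch of this idea, not the paper's original setup: a two-layer network with sigmoid units trained on the XOR task by hand-coded back-propagation. The layer sizes, learning rate, and use of NumPy are illustrative choices.

  ```python
  # Minimal back-propagation sketch: two-layer sigmoid network on XOR.
  import numpy as np

  rng = np.random.default_rng(0)

  # Toy dataset: XOR, a task a single-layer network cannot solve.
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([[0], [1], [1], [0]], dtype=float)

  # Randomly initialized weights: input->hidden and hidden->output.
  W1 = rng.normal(scale=0.5, size=(2, 4))
  W2 = rng.normal(scale=0.5, size=(4, 1))

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  lr = 1.0
  for step in range(5000):
      # Forward pass: hidden and output activations.
      h = sigmoid(X @ W1)
      out = sigmoid(h @ W2)

      # Output-layer error signal: squared-error derivative
      # times the sigmoid derivative.
      delta_out = (out - y) * out * (1 - out)

      # Propagate the error backward to the hidden layer.
      delta_h = (delta_out @ W2.T) * h * (1 - h)

      # Gradient-descent weight updates.
      W2 -= lr * h.T @ delta_out
      W1 -= lr * X.T @ delta_h

  print(np.round(out, 2))  # predictions approach [0, 1, 1, 0]
  ```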
**[ImageNet Classification with Deep Convolutional Neural Networks](https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks)**
*Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton (2012)*
Presented at *NeurIPS*
- This deep convolutional network (AlexNet) roughly halved the top-5 classification error rate on the ImageNet benchmark compared with the previous state of the art.