A Step by Step Backpropagation Example
by Matt Mazur
Background
Backpropagation is a common method for training a neural network. There is no
shortage of papers online that attempt to explain how backpropagation works, but
few that include an example with actual numbers. This post is my attempt to
explain how it works with a concrete example that folks can compare their own
calculations to in order to ensure they understand backpropagation correctly.
If this kind of thing interests you, you should sign up for my newsletter where I post
about AI-related projects that I’m working on.
Backpropagation in Python
You can play around with a Python script that I wrote that implements the
backpropagation algorithm in this Github repo.
Backpropagation Visualization
For an interactive visualization showing a neural network as it learns, check out my
Neural Network visualization.
Additional Resources
If you find this tutorial useful and want to continue learning about neural networks
and their applications, I highly recommend checking out Adrian Rosebrock’s
excellent tutorial on Getting Started with Deep Learning and Python.
Overview
For this tutorial, we're going to use a neural network with two inputs, two hidden
neurons, and two output neurons. Additionally, the hidden and output neurons will
include a bias.
Here's the basic structure, shown as a diagram in the original post.
In order to have some numbers to work with, here are the initial weights, the biases, and the training inputs/outputs; the original post presents these in a labeled diagram of the network.
The goal of backpropagation is to optimize the weights so that the neural network
can learn how to correctly map arbitrary inputs to outputs.
For the rest of this tutorial we’re going to work with a single training set: given
inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.
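For reference while following along, here is the setup as a small Python snippet. The inputs and targets come from the post; the specific starting weights and biases below are assumptions for illustration, since the diagram listing them is not reproduced in this copy.

# Inputs and targets are stated in the post; the starting weights and biases
# below are assumed values for illustration only.
i1, i2 = 0.05, 0.10                        # training inputs
t1, t2 = 0.01, 0.99                        # target outputs
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # assumed input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # assumed hidden -> output weights
b1, b2 = 0.35, 0.60                        # assumed hidden and output biases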
The Forward Pass
To begin, let's see what the neural network currently predicts given the weights and
biases above and inputs of 0.05 and 0.10. To do this we’ll feed those inputs
forward through the network.
We figure out the total net input to each hidden layer neuron, squash the total net
input using an activation function (here we use the logistic function ), then repeat
the process with the output layer neurons.
Total net input is also referred to as just net input by some sources.
Here's how we calculate the total net input for h1 and then squash it using the
logistic function to get its output; carrying out the same process for h2 gives its
output as well.
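Written out, and assuming the weights feeding h1 are labeled w1 and w2 and the hidden-layer bias is b1:

$$net_{h1} = w_1 \cdot i_1 + w_2 \cdot i_2 + b_1 \cdot 1, \qquad out_{h1} = \frac{1}{1 + e^{-net_{h1}}}$$

where i1 and i2 are the inputs (0.05 and 0.10); h2 uses the analogous expressions with w3 and w4.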
We repeat this process for the output layer neurons, using the output from the
hidden layer neurons as inputs.
Here's the output for o1; the same process gives the output for o2.
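Written out, assuming the weights from the hidden layer into o1 are labeled w5 and w6 (with w7 and w8 feeding o2) and the output-layer bias is b2:

$$net_{o1} = w_5 \cdot out_{h1} + w_6 \cdot out_{h2} + b_2 \cdot 1, \qquad out_{o1} = \frac{1}{1 + e^{-net_{o1}}}$$

With the post's starting values this yields out_o1 = 0.75136507, the number used in the error calculation below.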
Calculating the Total Error
We can now calculate the error for each output neuron using the squared error
function and sum them to get the total error:
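Written out, the squared error for each output neuron and the total error are:

$$E_{o1} = \tfrac{1}{2}(target_{o1} - out_{o1})^2, \qquad E_{total} = E_{o1} + E_{o2} = \sum \tfrac{1}{2}(target - output)^2$$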
Some sources refer to the target as the ideal and the output as the actual.
The 1/2 is included so that the exponent is cancelled when we differentiate later
on. The result is eventually multiplied by a learning rate anyway, so it doesn't
matter that we introduce a constant here [1].
For example, the target output for o1 is 0.01 but the neural network output
0.75136507, therefore its error is fairly large. Repeating this process for o2
(remembering that the target is 0.99) gives a much smaller error, and the total
error for the neural network is the sum of these errors:
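Written out with the numbers above:

$$E_{o1} = \tfrac{1}{2}(target_{o1} - out_{o1})^2 = \tfrac{1}{2}(0.01 - 0.75136507)^2 = 0.274811083$$
$$E_{total} = E_{o1} + E_{o2} = 0.298371109$$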
The Backwards Pass
Our goal with backpropagation is to update each of the weights in the network so
that they cause the actual output to be closer to the target output, thereby minimizing
the error for each output neuron and the network as a whole.
Output Layer
Consider w5. We want to know how much a change in w5 affects the total error,
aka ∂E_total/∂w5.
∂E_total/∂w5 is read as "the partial derivative of E_total with respect to w5". You
can also say "the gradient with respect to w5".
By applying the chain rule we know that:
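Written out:

$$\frac{\partial E_{total}}{\partial w_5} = \frac{\partial E_{total}}{\partial out_{o1}} \cdot \frac{\partial out_{o1}}{\partial net_{o1}} \cdot \frac{\partial net_{o1}}{\partial w_5}$$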
Visually, we're tracing the error backward from E_total through out_o1 and net_o1 to reach w5 (the original post highlights this path on the network diagram).
We need to figure out each piece in this equation.
First, how much does the total error change with respect to the output?
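Differentiating E_total with respect to out_o1 (only the E_o1 term survives, as the note below explains):

$$\frac{\partial E_{total}}{\partial out_{o1}} = -(target_{o1} - out_{o1}) = -(0.01 - 0.75136507) = 0.74136507$$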
Note that -(target - out) is sometimes expressed as out - target; the two forms are equal.
When we take the partial derivative of the total error with respect to out_o1,
the E_o2 term becomes zero because out_o1 does not affect it, which means we're taking the derivative of a constant, which is zero.
Next, how much does the output of o1 change with respect to its total net input?
The partial derivative of the logistic function is the output multiplied by 1 minus the
output:
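Written out:

$$\frac{\partial out_{o1}}{\partial net_{o1}} = out_{o1}(1 - out_{o1}) = 0.75136507 \times (1 - 0.75136507) = 0.186815602$$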
Finally, how much does the total net input of o1 change with respect to w5? Since
net_o1 = w5 · out_h1 + w6 · out_h2 + b2 · 1 (with w5 and w6 the weights from h1 and h2 into o1), the only term that involves w5 is w5 · out_h1, so the partial derivative is simply out_h1.
Putting it all together:
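Multiplying the three pieces (the value of out_h1 comes from the forward pass; the final number below assumes the starting weights sketched earlier):

$$\frac{\partial E_{total}}{\partial w_5} = 0.74136507 \times 0.186815602 \times out_{h1} \approx 0.082167041$$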
You'll often see this calculation combined in the form of the delta rule: the first
two factors, ∂E_total/∂out_o1 and ∂out_o1/∂net_o1, can be written together as
∂E_total/∂net_o1, aka δ_o1 (the Greek letter delta), aka the node delta. We can use
this to rewrite the calculation above. Some sources extract the negative sign from
δ, in which case the same expressions appear with the opposite sign convention.
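In that notation:

$$\delta_{o1} = \frac{\partial E_{total}}{\partial net_{o1}} = -(target_{o1} - out_{o1}) \cdot out_{o1}(1 - out_{o1}), \qquad \frac{\partial E_{total}}{\partial w_5} = \delta_{o1} \cdot out_{h1}$$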
To decrease the error, we then subtract this value from the current weight
(optionally multiplied by some learning rate, eta, which we’ll set to 0.5):
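That is:

$$w_5^{+} = w_5 - \eta \cdot \frac{\partial E_{total}}{\partial w_5}$$

Under the assumed starting values this works out to roughly 0.4 - 0.5 × 0.082167041 ≈ 0.35891648.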
Some sources use α (alpha) to represent the learning rate, others use η (eta), and
others even use ε (epsilon).
We can repeat this process to get the new weights w6, w7, and w8.
We perform the actual updates in the neural network after we have the new
weights leading into the hidden layer neurons (ie, we use the original weights, not
the updated weights, when we continue the backpropagation algorithm below).
Hidden Layer
Next, we'll continue the backwards pass by calculating new values for w1, w2, w3,
and w4.
Big picture, here's what we need to figure out (the original post also illustrates this on the network diagram):
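Written out:

$$\frac{\partial E_{total}}{\partial w_1} = \frac{\partial E_{total}}{\partial out_{h1}} \cdot \frac{\partial out_{h1}}{\partial net_{h1}} \cdot \frac{\partial net_{h1}}{\partial w_1}$$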
We're going to use a similar process as we did for the output layer, but slightly
different to account for the fact that the output of each hidden layer neuron
contributes to the output (and therefore error) of multiple output neurons. We know
that out_h1 affects both out_o1 and out_o2, therefore ∂E_total/∂out_h1 needs to take
into consideration its effect on both output neurons:
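That is:

$$\frac{\partial E_{total}}{\partial out_{h1}} = \frac{\partial E_{o1}}{\partial out_{h1}} + \frac{\partial E_{o2}}{\partial out_{h1}}$$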
Starting with ∂E_o1/∂out_h1: by the chain rule this equals ∂E_o1/∂net_o1 · ∂net_o1/∂out_h1,
and we can calculate ∂E_o1/∂net_o1 using values we calculated earlier:
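Using the two factors computed in the output-layer section:

$$\frac{\partial E_{o1}}{\partial net_{o1}} = \frac{\partial E_{o1}}{\partial out_{o1}} \cdot \frac{\partial out_{o1}}{\partial net_{o1}} = 0.74136507 \times 0.186815602 \approx 0.138498562$$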
And ∂net_o1/∂out_h1 is equal to w5, since net_o1 = w5 · out_h1 + w6 · out_h2 + b2 · 1.
Plugging these in gives ∂E_o1/∂out_h1. Following the same process for ∂E_o2/∂out_h1
(using δ_o2 and the weight connecting h1 to o2), we get the second term.
Therefore:
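Written out, assuming w7 labels the weight from h1 into o2:

$$\frac{\partial E_{total}}{\partial out_{h1}} = \delta_{o1} \cdot w_5 + \delta_{o2} \cdot w_7$$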
Now that we have ∂E_total/∂out_h1, we need to figure out ∂out_h1/∂net_h1 and then
∂net_h1/∂w for each weight. As with the output neuron, the logistic derivative gives
∂out_h1/∂net_h1 = out_h1 · (1 - out_h1), and the partial derivative of the total net
input to h1 with respect to w1 is simply the input i1.
Putting it all together (you might also see this written in the combined delta form):
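Written out, with the hidden-layer node delta defined the same way as the output deltas:

$$\frac{\partial E_{total}}{\partial w_1} = \frac{\partial E_{total}}{\partial out_{h1}} \cdot out_{h1}(1 - out_{h1}) \cdot i_1 = \delta_{h1} \cdot i_1, \qquad \delta_{h1} = \Big(\sum_{o} \delta_{o}\, w_{h1 \to o}\Big)\, out_{h1}(1 - out_{h1})$$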
We can now update w1, and we repeat this for w2, w3, and w4.
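The update rule is the same one used for the output-layer weights:

$$w_1^{+} = w_1 - \eta \cdot \frac{\partial E_{total}}{\partial w_1}$$

and likewise for w2, w3, and w4.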
Finally, we’ve updated all of our weights! When we fed forward the 0.05 and 0.1
inputs originally, the error on the network was 0.298371109. After this first round of
backpropagation, the total error is now down to 0.291027924. It might not seem
like much, but after repeating this process 10,000 times, for example, the error
plummets to 0.000035085. At this point, when we feed forward 0.05 and 0.1, the
two output neurons generate 0.015912196 (vs 0.01 target) and 0.984065734 (vs
0.99 target).
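To tie the whole procedure together, here is a minimal, self-contained Python sketch of the network and training loop described above (similar in spirit to the script linked under "Backpropagation in Python", though not the same code). The inputs, targets, 0.5 learning rate, and fixed biases follow the post; the specific starting weights and biases are assumptions, so the printed numbers match the figures quoted above only if those assumptions match the original diagram.

import math

# Setup: inputs/targets from the post, starting weights and biases assumed.
i1, i2 = 0.05, 0.10
t1, t2 = 0.01, 0.99
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # assumed input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # assumed hidden -> output weights
b1, b2 = 0.35, 0.60                        # assumed biases (not updated, as in the post)
eta = 0.5                                  # learning rate

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward():
    out_h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
    out_h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
    out_o1 = sigmoid(w5 * out_h1 + w6 * out_h2 + b2)
    out_o2 = sigmoid(w7 * out_h1 + w8 * out_h2 + b2)
    return out_h1, out_h2, out_o1, out_o2

for step in range(10000):
    out_h1, out_h2, out_o1, out_o2 = forward()
    if step == 0:
        print("initial total error:", 0.5 * (t1 - out_o1) ** 2 + 0.5 * (t2 - out_o2) ** 2)

    # Output-layer node deltas: dE_total/dnet_o = -(target - out) * out * (1 - out)
    d_o1 = -(t1 - out_o1) * out_o1 * (1 - out_o1)
    d_o2 = -(t2 - out_o2) * out_o2 * (1 - out_o2)
    # Hidden-layer node deltas use the *original* output-layer weights from this pass
    d_h1 = (d_o1 * w5 + d_o2 * w7) * out_h1 * (1 - out_h1)
    d_h2 = (d_o1 * w6 + d_o2 * w8) * out_h2 * (1 - out_h2)

    # Each weight's gradient is its node delta times the activation feeding that weight
    w5, w6 = w5 - eta * d_o1 * out_h1, w6 - eta * d_o1 * out_h2
    w7, w8 = w7 - eta * d_o2 * out_h1, w8 - eta * d_o2 * out_h2
    w1, w2 = w1 - eta * d_h1 * i1, w2 - eta * d_h1 * i2
    w3, w4 = w3 - eta * d_h2 * i1, w4 - eta * d_h2 * i2

out_h1, out_h2, out_o1, out_o2 = forward()
print("total error after 10,000 rounds:", 0.5 * (t1 - out_o1) ** 2 + 0.5 * (t2 - out_o2) ** 2)
print("outputs:", out_o1, out_o2)

Running it prints the error before training and after 10,000 rounds; with starting values that match the original figure, those should line up with the 0.298371109 and 0.000035085 figures quoted above.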
If you’ve made it this far and found any errors in any of the above or can think of
any ways to make it clearer for future readers, don’t hesitate to drop me a note .
Thanks!
Posted on March 17, 2015 by Mazur. This entry was posted in Machine Learning and tagged ai,
backpropagation, machine learning, neural networks.
115 thoughts on “A Step by Step Backpropagation Example”
Mostafa Razavi— December 7, 2015 at 1:09 pm
That was heaven, thanks a million.
Sonal Shrivastava— December 8, 2015 at 11:40 am
That was awesome. Thank a ton.
Nayantara— December 9, 2015 at 7:29 am
Hi Matt, Can you also please provide a similar example for a convolutional neural
network which uses at least 1 convolutional layer and 1 pooling layer ? Surprisingly, I
haven’t been able to find ANY similar example for backpropagation, on the internet,
for Conv. Neural Network.
TIA.
Mazur — December 9, 2015 at 8:36 am
I haven’t learnt that yet. If you find a good tutorial please let me know.
payamrastogi— December 11, 2015 at 4:24 am
All hail to “The” Mazur
Louis Hong— December 11, 2015 at 4:41 pm
Thank you so much for your most comprehensive tutorial ever on the internet.
ad— December 17, 2015 at 1:49 am
why is bias not updated ?
Mazur — December 17, 2015 at 9:23 am
Hey, in the tutorials I went through they didn’t update the bias which is why I
didn’t include it here.
justaguy— December 24, 2015 at 8:54 pm
Typically, bias error is equal to the sum of the errors of the neurons
that the bias connects to. For example, in regards to your example,
b1_error = h1_error + h2_error. Updating the bias’ weight would be
adding the product of the summed errors and the learning rate to the
bias, ex. b1_weight = b1_error * learning_rate. Although many
problems can be learned by a neural network without adjusting
biases and there may be better ways to adust bias weights. Also,
updating bias weights may cause problems with learning as opposed
to keeping them static. As usual with neural networks, through
experimentation you may discover more optimal designs.
patriczhao— January 13, 2016 at 1:30 am
nice explanations, thanks.
Ahad Khan— December 20, 2015 at 2:26 am
This is perfect. I am able to visualize back propagation algo better after reading this
article. Thanks once again!
sunlyt— December 21, 2015 at 12:57 am
Brilliant. Thank-you!
garky— December 24, 2015 at 8:25 am
If we have more than one sample in our dataset how we can train it by considering all
samples, not just one sample?
Daniel Zukowski— December 24, 2015 at 2:32 pm
Invaluable resource you’ve produced. Thank you for this clear, comprehensive, visual
explanation. The inner mechanics of backpropagation are no longer a mystery to me.
Long Pham— December 26, 2015 at 10:58 am
precisely, intuitively, very easy to understand, great work, thank you.
Dionisius AN— December 27, 2015 at 1:16 pm
Thank you very much ,it’s help me well, u really give detail direction to allow me
imagine how it works. I really appreciate it. May God repay your kindness thousand
time than u do.
singhrocks91
— December 28, 2015 at 1:35 am
Thank You. I have a better insight now
DGelling
— January 1, 2016 at 6:48 pm
Shouldn’t the derivative of out_o1 wrt net_o1 be net_o1*(1-net_o1)?
NaanTadow— February 24, 2016 at 1:10 am
No the one stated above is correct, see here for the steps on the gradient of
the activation function with respect to its input value (net):
https://theclevermachine.wordpress.com/2014/09/08/derivation-derivatives-for-common-neural-network-activation-functions/
Oh and thanks for this Matt – was able to work through your breakdown of the
partial derivatives for the Andrew Ng ML Course on coursera :D
Aro— January 10, 2016 at 6:23 pm
thanks so much, I haven’t see tutorial before like this.
Derive Me— January 12, 2016 at 1:22 am
Hello. I don’t understand, below the phrase “First, how much does the total error
change with respect to the output?”, why there is a (*-1) in the second equation, that
eventually changes the result to -(target – output) instead of just (target – output). Can
you help me understand?
Thank you!
angie1pecht— January 17, 2016 at 8:52 pm
This helped me a lot. Thank you so much!
LEarning AI again— January 18, 2016 at 4:26 pm
This was awesome. Thanks so much!
Ashish— January 19, 2016 at 7:21 am
Thanks a lot Matt… Appreciated the effort, Kudos
Tariq— January 20, 2016 at 12:03 pm
If the error is “squared” but simply E = sum (target – output) , you can still do the
calculus to work out the error gradient .. and then update the weights. Where did I go
wrong with this logic?
Elliot— January 28, 2016 at 9:03 am
Good afternoon, dear Matt Mazur!
Thank you very much for writing so complete and comprehensive tutorial, everything
is understandable and written in accessible way! If is it posdible may I ask following
question if I need to compute Jacobian Matrix elements in formula for computing Error
Gradient with respect to weight dEtotal/dwi I should just percieve Etotal not as the full
error from all outputs but as an error from some certain single output, could you
please say is this correct? Could you please say are you not planning to make a
simillar tutorial but for computing second order derivatives (backpropagation with
partial derivatives of second order)? I have searching internet for tutorial of calculating
second order derivatives in backpropagation but did not found anything. Maybe you
know some good tutorials for it? I have know that second order partial derivatives
(elements of Hessian Matrix) can be approximated by multiplaying Jacobians but
wanted to find it’s exact non approximated calculation. Thank you in advance for your
reply!
Sincerely
Pulley— February 1, 2016 at 9:52 pm
hello Matt, Can you please tell me that after updating all weights in first iteration I
should update the values of all ‘h’ at-last in first iteration or not.
Behroz Ahmad Ali— February 6, 2016 at 8:01 am
Thank you for such a comprehensive explanation of backpropagation. I have been
trying to understand backpropagation for months but today I finally understood it after
reading your this post.
Tariq— February 8, 2016 at 10:57 am
i am writing a gentle intro to neural networks – aimed at being accessible to
someone at school approx age 15… here is a draft which includes a very very
gentle intro to backprop
https://goo.gl/7uxHlm
i’d appreciate feedback to @myoneuralnet
Rebeka Sultana— February 16, 2016 at 12:59 am
Thank you so much.
Ron— February 21, 2016 at 1:10 pm
Firstly, thank you VERY much for a great walkthrough of all the steps involved with
real values. I managed to create a quick implementation of the methods used, and
was able to train successfully.
I was looking to use this setup (but with 4 inputs / 3 outputs) for the famous iris data
(http://archive.ics.uci.edu/ml/datasets/Iris ). The 3 outputs would be 0.0-1.0 for each
classification, as there would be an output weight towards each type.
Unfortunately it doesn’t seem to be able to resolve to an always low error value, and
fluctuates drastically as it trains. Is this an indication that a second layer is needed for
this type of data?
Werner — February 22, 2016 at 5:44 am
The first explanation I read that actually makes sense to me. Most just seem to start
shovelling maths in your face in the name of “not making it simpler that they should”.
Now let’s hope my AI will finally be able to play a game of draughts.
admin— February 22, 2016 at 9:20 am
It helps me a lot. thanks for the work!!!
Name(required)— February 24, 2016 at 9:04 pm
Great tutorial. By any chance do you know how do backpropagate 2 hidden layers?
Mazur — February 25, 2016 at 8:22 am
I do not, sorry.
Kiran— February 25, 2016 at 12:29 am
Thank you so much! The explanation was so intuitive.
Anon— February 25, 2016 at 11:18 pm
Thank you! The way you explain this is very intuitive.
tariq— February 26, 2016 at 9:38 am
I’d love your feedback on my attempt to explain the maths and ideas underlying
neuralnetworks and backrpop.
Here’s an early draft online. The aim for me is to reach as many people as possible
inck teenagers with school maths.
http://makeyourownneuralnetwork.blogspot.co.uk/2016/02/early-draft-feedback-wanted.html
Garett Ridge AndThenSomeMoreWords— March 1, 2016 at 5:45 pm
I have a presentation tomorrow on neural networks in a grad class that I’m
drowning in. This book is going to save my life
falcatrua— February 29, 2016 at 2:23 pm
It’s a great tutorial but I think I found an error:
at forward pass values should be:
neth1 = 0.15 * 0.05 + 0.25 * 0.1 + 0.35 * 1 = 0.3825
outh1 = 1/(1 + e^-0.3825) = 0.594475931
neth2 = 0.20 * 0.05 + 0.30 * 0.1 + 0.35 * 1 = 0.39
outh2 = 1/(1 + e^-0.39) = 0.596282699
Garett Ridge AndThenSomeMoreWords— March 1, 2016 at 9:37 pm
The labels go the other way in his drawing, where the label that says w_2
goes with the line it’s next to (on the right of it) and the value of w_2 gets
written to the left; look at the previous drawing without the values to see what I
mean
Bill— March 2, 2016 at 3:09 am
Good stuff ! Professors should learn from you. Most professors make complex things
complex. A real good teacher should make complex things simple.
b— March 2, 2016 at 3:11 am
Also , recommend this link if you want to find a even simpler example than this one.
http://www.cs.toronto.edu/~tijmen/csc321/inclass/140123.pdf
Priti— March 2, 2016 at 4:27 am
Can you give an example for backpropagation in optical networks
Moboluwarin— March 2, 2016 at 2:13 pm
Hey there very helpful indeed, in the line for net01 = w5*outh1 + ‘w6’*outh2+b2*1, is it
not meant to be ‘w7’ ??
Cheers
Dara— March 4, 2016 at 9:17 am
Can anyway help me explaining manual calculation for testing outputs with trained
weights and bias? Seems it does not give the correct answer when I directly substitute
my inputs to the equations. Answers are different than I get from MATLAB NN toolbox.