zlacker

1. Housha+(OP)[view] [source] 2016-01-26 02:54:44
NN weights need to start out random to break symmetry: if two weights in the same layer start with exactly the same value, they receive the same gradient updates and stay identical, so the units they feed into can never differentiate. Backpropagation relies on those random starting patterns, some of which roughly match useful features, so that it can fine-tune them.

But the weights are often initialized to be really close to zero.
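
A minimal numpy sketch of the symmetry problem, if it helps (the tiny 2-3-1 toy net, the helper names, and the 0.01 scale here are just my own illustration, not taken from any particular framework):

    import numpy as np

    rng = np.random.default_rng(0)

    def init_symmetric():
        # Every weight starts at the same value.
        return np.full((2, 3), 0.5), np.full((3, 1), 0.5)

    def init_random(scale=0.01):
        # Small random values close to zero, as mentioned above.
        return rng.normal(0.0, scale, (2, 3)), rng.normal(0.0, scale, (3, 1))

    def hidden_grad(W1, W2, x, y):
        # One forward/backward pass of a tiny 2-3-1 net with squared error.
        h = np.tanh(x @ W1)                 # hidden activations, shape (1, 3)
        d_out = (h @ W2) - y                # output error
        d_h = (d_out @ W2.T) * (1 - h**2)   # backprop through tanh
        return x.T @ d_h                    # gradient w.r.t. W1, shape (2, 3)

    x = np.array([[1.0, 2.0]])
    y = np.array([[1.0]])

    # Symmetric start: all three columns of the gradient are identical,
    # so the hidden units update in lockstep and never differentiate.
    print(hidden_grad(*init_symmetric(), x, y))

    # Random start: the columns differ, so the units can specialize.
    print(hidden_grad(*init_random(), x, y))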

replies(1): >>brianp+93
2. brianp+93[view] [source] 2016-01-26 04:05:07
>>Housha+(OP)
Given the era, though, Sussman may have actually been working with a neural net that wasn't the typical hidden-layer variety. "Randomly wired" could be a statement about the topology of the network, not about the weights.
replies(1): >>argona+if
3. argona+if[view] [source] [discussion] 2016-01-26 09:09:32
>>brianp+93
There is no evidence he was actually working with a neural net.

https://web.archive.org/web/20120717041345/http://sch57.msk....

replies(1): >>dTal+7E1
4. dTal+7E1[view] [source] [discussion] 2016-01-27 02:58:06
>>argona+if
I had no idea this existed; it's brilliant!