NN weights need to start random to break symmetry. If two units in the same layer start with exactly the same weights, they compute the same output and receive exactly the same gradient from backpropagation, so every update keeps them identical and they can never differentiate into detectors for different features. Random starting values give backpropagation asymmetric patterns it can actually fine-tune.
But the weights are usually initialized to small random values close to zero, for example drawn from a distribution whose scale shrinks with layer size (as in Xavier/Glorot or He initialization), so that activations and gradients neither explode nor vanish early in training.
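Here is a minimal sketch of the symmetry problem, assuming a tiny hypothetical 2-2-1 network with sigmoid activations and squared-error loss (none of this setup comes from the text above, it's just for illustration). With a constant initialization, the gradient rows for the two hidden units come out identical; with small random initialization, they differ:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0])  # one training input
y = 1.0                    # its target

def hidden_gradients(W1, w2):
    """One backprop pass; returns dL/dW1 for the hidden-layer weights (2x2)."""
    h = sigmoid(W1 @ x)                    # hidden activations
    out = sigmoid(w2 @ h)                  # network output
    d_out = (out - y) * out * (1 - out)    # gradient at the output pre-activation
    d_h = d_out * w2 * h * (1 - h)         # gradient at the hidden pre-activations
    return np.outer(d_h, x)                # gradient w.r.t. W1

# Symmetric init: every weight is the same constant.
W1_sym = np.full((2, 2), 0.3)
w2_sym = np.full(2, 0.3)
print(hidden_gradients(W1_sym, w2_sym))
# Both rows are identical -> both hidden units get the same update forever.

# Small random init, close to zero:
W1_rand = rng.normal(0.0, 0.1, size=(2, 2))
w2_rand = rng.normal(0.0, 0.1, size=2)
print(hidden_gradients(W1_rand, w2_rand))
# Rows differ -> gradient descent can push each unit toward a different feature.
```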