Coming up with objectives, or deciding what counts as an appropriate outcome (i.e. what to backpropagate against), is itself a function of intelligence that current AI design does not account for.
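For contrast, here is a minimal sketch of the standard setup (generic PyTorch, purely illustrative, not our system): the objective, meaning the labels and the loss function, is fixed by the engineer before training begins, and backpropagation only adjusts weights toward that pre-chosen target; nothing in the loop decides what the network should care about.

```python
# Generic PyTorch training step (illustration only, not our system).
# The objective is supplied from outside: a human picks the loss function
# and the labels, and backprop merely minimizes error against them.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()          # the "what to learn", chosen by a human
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

inputs = torch.randn(8, 10)              # a batch of 8 examples, 10 features each
targets = torch.randint(0, 2, (8,))      # labels from a human-curated dataset

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)   # error against the externally given objective
loss.backward()                          # backpropagate against that objective
optimizer.step()                         # the network never questions the objective itself
```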
We've built this in code: a form of animal-level AGI that (naturally) uses an alternative to backpropagation. Get in touch: ali at aolabs.ai.