This kind of error just feels comical to me, and it makes it really hard for me to believe that AGI is anywhere near. LLMs struggle to understand the order of datasets even when explicitly told. This is like showing a coin trick to a child, except perhaps even simpler.
No amount of added context or instructions seems to fix these kinds of issues in a way that doesn't still feel pretty hobbled. The only way to get the full power out of the model is to conform your problem to the expectations that seem to be baked in - i.e. just change your rendering coordinate system to be z-up (a rough sketch of that conversion is below).
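To be concrete about what "conform your problem" looks like on the data side, here's a minimal sketch of a y-up to z-up remap. It assumes a right-handed y-up source and a right-handed z-up target; the matrix name, function, and axis conventions are my own assumptions for illustration, not anything the model or any particular engine prescribes.

```python
import numpy as np

# Assumed conventions: source is y-up, right-handed (x right, y up, z toward
# the viewer); target is z-up, right-handed (x right, y forward, z up).
Y_UP_TO_Z_UP = np.array([
    [1.0, 0.0,  0.0],   # x stays x
    [0.0, 0.0, -1.0],   # old z (toward viewer) becomes -y (forward)
    [0.0, 1.0,  0.0],   # old y (up) becomes z (up)
])

def to_z_up(points: np.ndarray) -> np.ndarray:
    """Convert an (N, 3) array of y-up points to z-up."""
    return points @ Y_UP_TO_Z_UP.T

if __name__ == "__main__":
    # A point one unit above the origin in y-up space...
    p = np.array([[0.0, 1.0, 0.0]])
    # ...lands one unit up the z axis after the remap.
    print(to_z_up(p))  # [[0. 0. 1.]]
```

The point of the sketch is just that the remap is a one-line rotation on the data, which is far less work than trying to prompt the model out of its baked-in assumption.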