More precisely, I think it is intuitive that the class of problems solvable in unbounded time given O(n) space is far larger than the class of problems solvable with unbounded space given O(n) time.
If your program uses n cells of memory, it must run in Ω(n) time (a lower bound on time).
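Concretely, a minimal C sketch of that lower bound (the buffer size is illustrative, not from the thread): to actually use n bytes, some executed instruction has to touch each one, so the program spends Ω(n) steps on the touching alone.

    #include <stdio.h>
    #include <stdlib.h>

    /* Uses Theta(n) memory by writing every byte of an n-byte buffer.
       The loop body runs n times: one store per byte used, hence
       Omega(n) time. */
    int main(void) {
        size_t n = 1u << 20;              /* illustrative size: 1 MiB */
        unsigned char *buf = malloc(n);
        if (!buf) return 1;
        for (size_t i = 0; i < n; i++)
            buf[i] = (unsigned char)i;    /* one store per byte */
        printf("%d\n", buf[n - 1]);       /* keep the stores observable */
        free(buf);
        return 0;
    }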
> If your program runs in O(n) time, it cannot use more than O(n) memory (an upper bound on memory usage).
This is refuted by practically all software running today: programs (especially games) routinely use far more memory than they have instructions.
> If your program uses n cells of memory, it must run in Ω(n) time (a lower bound on time).
Memory bombs use an incredible amount of memory and do it incredibly quickly.
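For what it's worth, the speed of a memory bomb mostly comes from reserving address space rather than touching it. A minimal sketch, assuming a 64-bit system with a Linux-style overcommitting allocator:

    #include <stdio.h>
    #include <stdlib.h>

    /* Reserves a huge region almost instantly. With overcommit, the
       kernel typically maps pages lazily, so no page is actually
       touched here; the per-page cost is only paid on first write. */
    int main(void) {
        size_t n = (size_t)1 << 33;   /* illustrative: 8 GiB */
        char *p = malloc(n);
        if (!p) {
            puts("allocation refused");
            return 1;
        }
        puts("reserved");
        /* p[0] = 1;  <- faulting pages in is where the time would go */
        free(p);
        return 0;
    }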
How can you access a piece of memory without issuing an instruction to the CPU? Also, "clearly" is not an argument.
> Memory bombs use an incredible amount of memory and do it incredibly quickly.
How can you access a piece of memory without issuing an instruction to the CPU? Also, "incredibly quickly" is not an argument. Also also, O(n) is incredibly quick.
As in, your assertion is literally self-evidently false. The burden of proof is on you here, especially since there are instructions that can load more than a single bit of memory at a time.
> How can you access a piece of memory without issuing an instruction to the CPU?
Let me ask you this instead: where do the running instructions live? That's right: in memory. But just because instructions exist in memory doesn't mean they are all accessed, and there is no fixed relationship between the number of instructions in a program and the amount of memory it accesses or uses.
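A small sketch of that static-count reading (sizes illustrative): the loop below compiles to a fixed handful of instructions no matter how big the buffer is, although the number of instructions *executed* still grows with it.

    #include <stdio.h>
    #include <stdlib.h>

    /* A fixed handful of static instructions can use an arbitrarily
       large buffer: static instruction count and memory footprint are
       unrelated. (Executed-instruction count is another story; it
       grows with n.) */
    int main(void) {
        size_t n = (size_t)1 << 28;           /* illustrative: 256 MiB */
        char *buf = malloc(n);
        if (!buf) return 1;
        for (size_t i = 0; i < n; i += 4096)  /* touch one byte per page */
            buf[i] = 1;
        printf("%d\n", buf[n - 4096]);        /* keep the stores observable */
        free(buf);
        return 0;
    }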
That is not what it means. Again, if you are not familiar with the notation, then all you are doing is attaching your personal ideas about computing to some symbols.