Thankfully OpenAI hasn't blocked me yet, so I can still use Codex CLI. I don't think you're ever going to see that level of power locally (I very much hope to be wrong about that). If/when my OpenAI account gets blocked for no reason, I'll move over to a cloud provider running a large gpt-oss model, or whatever the leader is at the time.
The M-series chips in Macs are crazy: if you have the memory available, you can do some cool things with some models. Just don't expect to one-shot a complete web app or anything like that.