The big question today is: do you try to build an AI business on OpenAI's APIs, or do you host everything yourself? There's a case to be made either way.
There's also an argument for Airbnb'ing the land while keeping a castle on wheels: build on someone else's platform, but keep your core portable.
One extreme is fully self-hosted, edge-only deployment, where the real sale is some other piece of hardware. Ex: Nvidia sells the GPUs and gives away Triton Inference Server as free OSS. But most businesses sit in the middle, e.g., an accounting app that now adds LLM features.

Our case of investigations in louie.ai is right at that boundary: OpenAI is happy to support data analysis, but folks who live in Splunk/Databricks/etc. all day expect a lot more from their software here, and that's too at odds with OpenAI's org chart and customer base.
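To make that boundary concrete, here's a minimal sketch of why the middle of the spectrum stays viable: the same client code can point at OpenAI's hosted API or at your own box, assuming the official `openai` Python SDK and a self-hosted server that exposes an OpenAI-compatible endpoint (e.g. vLLM). The URL, key, and model names below are placeholders, not anyone's real config.

```python
# Minimal sketch: the same chat-completion call against OpenAI's hosted API
# or a self-hosted, OpenAI-compatible server (e.g. vLLM). URLs, keys, and
# model names are placeholders.
import os
from openai import OpenAI

USE_SELF_HOSTED = os.getenv("USE_SELF_HOSTED") == "1"

if USE_SELF_HOSTED:
    # Your own GPU box or edge device exposing an OpenAI-compatible route.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
    model = "meta-llama/Llama-3.1-8B-Instruct"  # whatever you deploy
else:
    # OpenAI's hosted API: no infra to run, but you live on their roadmap.
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    model = "gpt-4o-mini"

resp = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Summarize these Splunk alerts."}],
)
print(resp.choices[0].message.content)
```

The point of the sketch is that the switching cost at the API layer is small; the real lock-in (or lack of it) comes from model quality, data gravity, and whose roadmap you depend on.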