The problem we often hit when building apps on top of LLMs is managing LLM context windows (and sometimes swapping between LLM providers), which usually calls for some kind of worker/consumer/queue setup.
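For concreteness, here's a minimal sketch of that pattern; every name in it (`SummarizeJob`, `enqueueDocument`, `callLlm`, the character budget) is hypothetical, and a real setup would use a durable queue with retries rather than an in-memory array:

```ts
// Split a long document into context-window-sized chunks and process
// them as independent jobs, so no single LLM call blows the token budget.

type SummarizeJob = { docId: string; chunk: string };

const MAX_CHARS = 12_000; // crude stand-in for a token limit

function enqueueDocument(queue: SummarizeJob[], docId: string, text: string): void {
  for (let i = 0; i < text.length; i += MAX_CHARS) {
    queue.push({ docId, chunk: text.slice(i, i + MAX_CHARS) });
  }
}

async function worker(
  queue: SummarizeJob[],
  callLlm: (prompt: string) => Promise<string>,
): Promise<void> {
  // Drain jobs one at a time; swapping providers means swapping callLlm.
  for (let job = queue.shift(); job; job = queue.shift()) {
    const summary = await callLlm(`Summarize:\n${job.chunk}`);
    console.log(job.docId, summary.slice(0, 80));
  }
}
```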
TypeScript is amazing for building full-stack web apps quickly. For a decade my go-to was Django, but everything just goes so much faster with endpoints & frontend all in the same place. But finding a good job/queue service is a bit more of a challenge in this world than "just set up Celery". BullMQ is great, but it doesn't work with "distributed" Redis providers like Upstash (Vercel's choice).
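For reference, a standard BullMQ setup looks roughly like this (queue name, job payload, and connection details are all made up). Note that it wants a plain TCP Redis connection; as I understand it, BullMQ relies on Redis features like blocking commands that HTTP-based serverless Redis offerings don't fully support, which is where the mismatch comes from:

```ts
import { Queue, Worker } from "bullmq";

// BullMQ needs a conventional Redis instance, not a REST proxy.
const connection = { host: "localhost", port: 6379 };

const queue = new Queue("llm-jobs", { connection });

const worker = new Worker(
  "llm-jobs",
  async (job) => {
    // ...call the LLM provider here with job.data...
    return { ok: true };
  },
  { connection },
);

worker.on("completed", (job) => console.log(`done: ${job.id}`));

await queue.add("summarize", { docId: "doc-1" });
```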
So, in a roundabout way, an offering like this is in a super-duper position to pick up AI money :)