Isn't all the money going to AI companies these days? (Even the unicorns didn't do well with their IPOs, e.g. HashiCorp.)
That said, I love every single addition to the Go community so thumbs up from me.
The problem we often hit when building apps on top of LLMs is managing LLM context windows (and sometimes swapping between LLM providers), and that tends to require different kinds of worker/consumer/queue setups.
TypeScript is amazing for building full-stack web apps quickly. For a decade my go-to was Django, but everything just goes so much faster with endpoints & frontend all in the same place. But finding a good job/queue service is a little more of a challenge in this world than "just set up Celery". BullMQ is great, but doesn't work with "distributed" Redis providers like Upstash (Vercel's choice).
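To make that concrete, here's a minimal sketch of the kind of setup I mean: an endpoint enqueues an LLM job instead of calling the provider inline, and a BullMQ worker trims the context and dispatches it. The queue name, job shape, and the `trimToContextWindow` / `callLLM` helpers are all made up for illustration; only the `Queue`/`Worker` usage is real BullMQ API, and it assumes a plain Redis connection (which is exactly what the serverless Redis offerings make awkward).

```ts
import { Queue, Worker } from 'bullmq';

// Plain Redis connection — BullMQ expects a regular Redis server.
const connection = { host: 'localhost', port: 6379 };

// Hypothetical job payload: which provider to call and the conversation so far.
type LLMJob = {
  provider: 'openai' | 'anthropic';
  messages: { role: string; content: string }[];
};

const llmQueue = new Queue<LLMJob>('llm-jobs', { connection });

// Producer side (e.g. an API endpoint): enqueue instead of calling the LLM inline.
export async function enqueueCompletion(job: LLMJob) {
  await llmQueue.add('complete', job);
}

// Stand-in helpers — in a real app these would use provider SDKs and token counts.
function trimToContextWindow(messages: LLMJob['messages'], provider: string) {
  const limit = provider === 'anthropic' ? 50 : 30; // made-up per-provider limits
  return messages.slice(-limit);
}

async function callLLM(provider: string, messages: LLMJob['messages']) {
  // Stubbed out for the sketch.
  return { provider, messageCount: messages.length };
}

// Consumer side: the worker trims context to fit the provider, then calls it.
new Worker<LLMJob>(
  'llm-jobs',
  async (job) => {
    const messages = trimToContextWindow(job.data.messages, job.data.provider);
    return callLLM(job.data.provider, messages);
  },
  { connection }
);
```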
So, in a roundabout way, an offering like this is in a super-duper position for AI money :)