Maybe in the future, humans won't have to verify spelling, logic, or ground truth in programs either, because we'll all have given up and just assume the LLM knows everything. /s
Sometimes, when I read these blogs from vibe-coders who have become completely complacent with LLM slop, I have to keep reminding others why regulations exist.
Imagine if LLMs became fully autonomous pilots on commercial planes, or planes were optimized for AI control, and humans just boarded and flew for the vibes. Maybe call it "Vibe Airlines".
Why hasn't anyone thought of that great idea? While we're at it, why not remove the human from the loop entirely?
Good idea, isn't it?
This was the result of an afternoon/evening exploring a problem space, and I thought it was interesting enough to share.
Everything else was a thought experiment to show why putting LLMs on everything, including commercial planes, is a very bad idea that would give regulators a hard time.
The point is: just because you can (build and run anything) doesn't mean you should (put it on commercial planes).