- accountability
- reliability
- validation
- security
- liability
Humans can reliably produce text with all of these features. LLMs can reliably produce text with none of them.
If it doesn't have all of these, it could still be worth paying for if it's novel and entertaining. IMO, LLMs can't really do that either.
LLMs can use linters and type checkers, but getting past them often times leads it down a path of mayhem and destruction, doing pretty dumb things to get them to pass.