* People using it as a tool, aware of its limitations and treating it basically as an intern/boring-task executor (whether it's some code boilerplate, or pooping out/shortening some corporate email), or as a tool to give themselves a summary of a topic they can then bite into more deeply.
* People outsourcing their thinking and entire skill set to it - they usually have very little clue about the topic, are interested only in results, and have no interest in learning more about the topic or honing their skills in it.
The second group is the one that thinks talking to a chatbot will replace a senior developer.
> People using it as a tool, aware of its limitations
You can't know the limitations of these tools. It is literally unknowable. Depending on the context, the model, and the task, it can be brilliant or useless. It might do the task adequately the first time, then fail ten times in a row.
> People outsourcing thinking and entire skillset to it
You can't get something out of it that you can't conceive of. There are real-life consequences to not knowing what the AI produced. What you wrote basically assumes there is a group of people who consistently hit themselves on the head with a hammer without knowing what hurt them.