Like it or not, people are using LLMs a lot. The output isn't universally good; it depends on what you ask for and how you critique what comes back. But the simple reality is that the tools are pretty good these days, and not using them is a bit of a mistake.
You can use LLMs to fix simple grammar and style issues, to fact-check argumentation, and to critique drafts and identify weaknesses. You can also task them with background research, double-checking sources, and more.
I’m not a fan of letting LLMs rewrite my text into something completely different. But when I'm in a hurry or in a business context, I sometimes let LLMs do the heavy lifting for my writing anyway.
Ironically, a good example is this article itself, which makes a few nice points but is also full of grammar and style issues that are easily remedied with LLMs without really affecting the tone or line of argumentation (though IMHO that needs work as well). Clearly, the author is not a native speaker, but that's no excuse these days for publishing poorly written text. It's sloppy and doesn't look good, and we now have tools that can fix it.
And yes, LLMs were used to refine this comment. But I wrote the comment.
When a tool blurs the line between its work and yours, and you take full credit despite being assisted, that is deceitful.
Spell checking helps us all pretend we're better spellers than we are, but we've decided as a society that correct spelling is more important than proving one's knowledge of spelling.
But if you're purportedly a writer, and you're using a tool that writes for you, then I will absolutely discount your writing ability. Maybe one day we will decide that the output is more important than the connection to the person who generated it, but to me, that day has not arrived.