If we cared about preventing LLMs from being used for violence, we would have poured more than a tiny fraction of our resources into safety/alignment research. We did not. Ergo, we don't care; we just want people to think we care.
I don't have any real issue with using LLMs for military purposes. It was always going to happen.