zlacker

[parent] [thread] 2 comments
1. foolof+(OP)[view] [source] 2024-02-14 14:47:34
I think you're missing the point. I don't think we should have prevented the development of this tech. It's just absurd to complain about things that we always knew would happen as though they're some sort of great surprise.

If we cared about preventing LLMs from being used for violence, we would have poured more than a tiny fraction of our resources into safety/alignment research. We did not. Ergo, we don't care; we just want people to think we care.

I don't have any real issue with using LLMs for military purposes. It was always going to happen.

replies(2): >>kelips+H3 >>kj99+7n
2. kelips+H3[view] [source] 2024-02-14 15:05:47
>>foolof+(OP)
Safety/alignment research isn't going to stop it from being used for military purposes. Once the tech is out there, it will be used that way; there's just no getting around it.
3. kj99+7n[view] [source] 2024-02-14 16:29:08
>>foolof+(OP)
You say ‘we’ as if everyone is the same. Some people care, some people don’t. It only takes a few who don’t, or who feel the ends justify the means. Because those people exist, the people who do care are forced into a prisoner’s dilemma and end up developing the technology anyway.