zlacker

1. fnordp+ (OP) 2023-05-16 15:58:00
(Acknowledging you didn’t support regulation in your statement, just riffing)

Then write laws and regulations about the actions of humans using the tools. The tools have no agency. The humans using them toward bad ends do.

By the way, writing things the state considers immoral is an enshrined right.

How do you draw the line between AI writing assistance and the predictive text, autocompletion, and spell check in popular document editors today? I would note that predictive text is completely amoral and will do all sorts of stuff the state considers immoral.

Who decides what’s immoral? The licensing folks in the government? What right do they have to tell me my morality is immoral? I can hold and espouse any morality I desire, so long as I break no law.

I’d note that as a nation we have really loose phrasing in the Bill of Rights for gun rights, but very clear phrasing about freedom of speech. We generally say today that guns are fair game unless used for illegal actions. Yet these proposals say tools that take our thoughts, opinions, and creation of language to another level are more dangerous than devices designed for no other purpose than killing things.

Ben Franklin must be spinning so fast in his grave he’s formed an accretion disc.
