Once these things can move around us, far away from their owner, there is enormous potential for societal harm.
Someone could buy a $10k Figure robot, strap a bomb or nerve agent to it, then have it walk into a public place.
If we just accept these robots as normal everyday things (and it seems like we will), we won't even blink or think twice when a robot walks up to us.
I hate monitoring and tracking and surveillance. I'm a freedom and personal liberty absolutist for most things without negative externalities. But as I put this new AI tech through thought experiments, I don't know how a normal world survives once agency is cheap and no longer tied to mortality.
Society, even one with guns, has relied on people being afraid of the consequences of their actions. If there's no way to trace a drone or robot back to anyone, god only knows what could happen.
Kidnappings, murders, terrorism. It seems like this might become "easy".
How hard is it going to be to kill off political opponents in the future? Putin, for instance, enjoys relative freedom of movement because it's hard to get close to him.
Once you can drop a drone into a field or onto a rooftop and have it "sleep" for months until some "wake" command, after which it operates entirely autonomously - that's cheap, easy to plan, and potentially impossible to track.
Some disgruntled guy buys some fertilizer, a used van, and comma.ai?
We potentially have a very, very different world coming soon.
The British army only has maybe 20,000 actual soldiers. You could manufacture enough robots to kill them all in a week. Then you’d just have a whole country.
It’ll completely change the game. There’s no point selling it to a state for its army when you could just instantly make yourself the owner of the state.
Don't worry, we're safe. It's already been done and it did not win: https://www.reddit.com/r/ChatGPT/comments/14dv530/the_homele...