1. nopins+ (OP) 2023-07-06 05:43:59
Genetic modifications could potentially cause havoc in the long run as well, but it's much more likely we have time to detect and thwart their threats. The major difference is speed.

Even if we knew how to create a new species of superintelligent humans whose goals are misaligned with the rest of humanity, it would take them decades to accumulate knowledge, propagate themselves in sufficient numbers, and take control of enough resources to pose a critical danger to the rest of us.

Such constraints are not applicable to superintelligent AIs with access to the internet.

replies(1): >>mejuto+L8
2. mejuto+L8 2023-07-06 07:03:31
>>nopins+(OP)
Counterexample: covid.

Assumptions:

- That genetic modification, as a danger, would have to take the form of a large number of superintelligent humans (where did that come from?)

- AI is not physically constrained

> it's much more likely we have time to detect and thwart their threats.

Why? Counterexample: covid.

> Even if we knew how to create a new species of superintelligent humans whose goals are misaligned with the rest of humanity, it would take them decades to accumulate knowledge, propagate themselves in sufficient numbers, and take control of enough resources to pose a critical danger to the rest of us.

Why insist on something superintelligent, human, and present in sufficient numbers? A simple virus could be a critical danger.

replies(1): >>nopins+4N3
3. nopins+4N3 2023-07-07 02:34:49
>>mejuto+L8
We do have regulations and laws to control genetic modification of pathogens. The work is done in highly secure labs, and access is not widely available.

If a pathogen more deadly than Covid started to spread, e.g. something like Ebola or smallpox, we would do more to limit its spread. If it were good at evading detection for a while, it could potentially cause a catastrophe, but it most likely would not wipe out humanity, because it is not intelligent and some surviving humans would eventually find a way to thwart it or limit its impact.

A pathogen is also physically constrained by the availability of hosts. Yes, current AI likewise requires processors, but it's extremely hard, if not impossible, to restrict access to CPUs and GPUs in the modern economy.
