I'm not sure what gives the authors the confidence to make such predictions. Wishful thinking? Worst-case paranoia? I agree that such an outcome is possible, but on 2--3 year timelines? That would imply the approach everyone is taking right now is the right approach and that there are no hidden conceptual roadblocks to achieving AGI/superintelligence by DFS-ing down this path.
All of the predictions seem to ignore the possibility of such barriers, or at most acknowledge it and then wave it away by appealing to the army of AI researchers and industry funding being thrown at the problem. IMO the onus is on the proposers of such timelines to argue why there are no such barriers and why we should expect predictable scaling over a 2--3 year horizon.
A lot of this resembles post-war futurism that assumed we would all be flying around in spaceships and personal flying cars within a decade. Unfortunately, the rapid pace of transportation innovation slowed due to physical and cost constraints, and we've made little progress (beyond cost optimization) since.
Let's say intelligence caps out at the level of the smartest person who's ever lived. Well, the first thing we'd attempt to do is build machines up to that limit, a limit 99.99999 percent of us will never get close to. What's more, the thinking part of a human is only around 2 pounds of mush inside our heads. On top of that, you don't have to grow these machines for 18 years before they start outputting something useful. They won't need sleep. You can feed them with solar panels. And they won't be getting distracted by that super sleek server rack across the aisle.
We do know that 'hive' or societal intelligence scales over time, especially with integration with tooling. The amount of knowledge we have and the means by which we can apply it simply dwarf previous generations.
Instead, think of them as saying a crusade is coming in the next few years. When the group predicting the crusade is also spending billions of dollars trying to make it happen, you no longer have the option of saying it's not going to happen. You are now forced to examine the risks of their actions.
(They could be wrong, but this isn't a guess, it's a well-researched forecast.)
https://www.theguardian.com/technology/2017/apr/18/god-in-th...
Maybe we'll see "Church of the Children of Altman" /s
It seems that without a framework of ethics/morality (insert XYZ religion), we humans find one to grasp onto. Be it a cult, a set of not-so-fleshed-out ideas/philosophies, etc.
People who say they aren't religious per se seem to have some set of beliefs that amount to a religion. It just depends who or what you look towards for those beliefs, many of which seem to be haphazard.
The people I disagree with the most often at least realize which ideas/beliefs unify their structure of reality, while others simply aren't aware.
A small minority of people can rely on schools of philosophical thought and 'try on' or play with different ideas, but have the self-reflection to see when they transgress from ABC philosophy or when the philosophy doesn't quite match their identity.
Perhaps the article is wrong about the timescale, but given how much AI has improved in the last 5 years, can you agree that it's likely to reach 'sit back and watch' levels in the next 5--10 years?
"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." https://www.safe.ai/work/statement-on-ai-risk
Laughing it off as equivalent to the Second Coming CANNOT work, unless you think yourself cleverer and more capable of estimating the risk than all of these experts in the field.
Especially since many of them have incentives that should prevent them from penning such a letter.
Would be a shame to have energy consumption by datacenters regulated, am I right?
Perhaps they were trying to avoid any possible misunderstanding or misconstrual (there are misinformed people who don't believe in global warming).
In terms of avoiding all nitpicking, I think everyone who's not criminally insane believes in pandemics and nuclear bombs.