The main reason to worry, though, is not the proprietary monetization of "AI" algorithms. Just as it was not an algorithm (PageRank) but the invention of adtech that spawned surveillance capitalism, here too the main question is what sort of "disruption" this tech can facilitate, i.e. which social contract will be violated in order to "create value".
"Success" in "tech" has for a long time been predicated on the absence of any regulation, pushback or controls when applying software technology in social / economic spheres previously operating under different moral conventions. In the name of "not stiffling innovation".
Ironically, our main protection is that we may actually now live in a "scorched Earth" environment: the easy disruptions are done, and "tech innovation" is bumping up against domains (finance, medical) that are "sensitive".
What needs to be understood is that this sort of technology is not an equalizer, regardless of the PR about having your own personal Einstein/secretary at your beck and call. Look at the state of modern computing sans AI to see that this is true: most desktop users run Microsoft, Apple, or Google operating systems, which grow more and more restrictive over time even as the capabilities of the underlying hardware keep increasing.
True, and it is to be expected that existing interests will seek to integrate any new tricks into the old patterns.
The question is to what extent this can go on without imploding. How big can the mismatch get between what you could do with a mobile, a desktop, or a decentralized cluster of millions of computers and what you actually do, before some random bug in a glorified typewriter short-circuits the entire system?
People are banking on widespread digital transformation as one of the few major economic growth drivers left in an otherwise exhausted opportunity landscape - the literally scorched Earth. I fail to see, though, how this regeneration could possibly be achieved with parasitic business models and behaviors. We should not think of ourselves just as individuals or "consumers", since in that role we are effectively disenfranchised, but of our roles in all sorts of private and public organizations, which collectively carry far more political and economic weight than "big tech".
I fully agree. It's been said to death that AI will radically transform everything about the world. That claim has two problems. First, the implicit assumption is that everything changes except how the economy works at a fundamental level, which doesn't really jibe with all the other things AI is expected to transform.
Second, the AI control problem aside, we have a human control problem that none of our technological advancements have ever solved, and have in fact only exacerbated. Billionaires can hoard wealth in ways and places normal people can't; despite all the billions lying around and plenty of real problems to solve (hunger, sanitation, the death of the biosphere, the toxic "externalities" of the economy), vanity projects, personal philanthropies, and moar tech are always seen as the solution.
I don't trust machines to shape people for the better when the last decade has shown just how readily Big Tech will co-opt our psychology for money. We need to rethink whether progress for progress' sake is worth the carnage it causes, whether eternal unchecked ambition is psychologically pathogenic, and whether anything can build a "better world" when promises of ample leisure have rung hollow since the industrial revolution.
Stuck between a rock and a hard place, I have to root for severe climate disruption to put a hard limit on the insanities of industry before they drive us over a completely different kind of cliff.