> My path in technology started at Facebook where I was the first Director of Monetization. [...] we sought to mine as much attention as humanly possible and turn it into historically unprecedented profits. We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.
> Tobacco companies [...] added sugar and menthol to cigarettes so you could hold the smoke in your lungs for longer periods. At Facebook, we added status updates, photo tagging, and likes, which made status and reputation primary and laid the groundwork for a teenage mental health crisis.
> Allowing misinformation, conspiracy theories, and fake news to flourish was like Big Tobacco’s bronchodilators, which allowed the cigarette smoke to cover more surface area of the lungs.
> Tobacco companies added ammonia to cigarettes to increase the speed with which nicotine traveled to the brain. Extreme, incendiary content—think shocking images, graphic videos, and headlines that incite outrage—sowed tribalism and division. And the result has been unprecedented engagement -- and profits. Facebook’s ability to deliver this incendiary content to the right person, at the right time, in the exact right way... that is their ammonia.
> The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions — it aims to provoke, shock, and enrage. All the while, the technology is getting smarter and better at provoking a response from you. [...] This is not by accident. It’s an algorithmically optimized playbook to maximize user attention -- and profits.
> When it comes to misinformation, these companies hide behind the First Amendment and say they stand for free speech. At the same time, their algorithms continually choose whose voice is actually heard. In truth, it is not free speech they revere. Instead, Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside, raising the voices of division, anger, hate and misinformation to drown out the voices of truth, justice, morality, and peace.
One person might say, "We created all these statuses and features to be addictive," but it seems just as true to say, "We created this stuff because people liked it and we were trying to make something people like."
Honestly, this is a super interesting question. I would say anything designed to succeed by hijacking human brain chemistry instead of providing superior or novel quality is probably worth regulating at some level.
From that standpoint, Breaking Bad would not have an issue - it's superior and novel. Shows that succeed in making a viewer binge through a combination of (effectively) mid-episode endings and autoplay are somewhat hacky. You can't regulate cliffhanger endings, so autoplay is the part that should probably not be legal - Netflix already asks you if you want to continue watching; it should simply ask after every episode. Shows with good content like Breaking Bad would still be easy to binge (just press yes once an hour), while poor-quality shows would have a harder time exploiting brain chemistry, since continuing would require an affirmative act.
Manipulative advertising is an act of malice, particularly with addictive products.
You originally posted:
>"Advertising is an act of malice, particularly with addictive products."
But changed it to:
>"Manipulative advertising is an act of malice, particularly with addictive products."
What do you see as the difference between "manipulative advertising" and regular "advertising", and how is either (or both) malicious? Advertising is basically telling people that you are offering them something and trying to persuade them to buy/use it; I am not sure how that is "characterized by unscrupulous control of a situation or person."