Surely you don't want your fellow citizens to fall for Russian, Chinese, or some other state actor's propaganda?
State surveillance on an unprecedented level? Don't be paranoid! Surely a state actor would never abuse that power to snoop through your private photos[2]!
Electronic waste? Duopoly? Censorship? Ownership? Those are made-up words, comrade!
[1] - https://youtu.be/cwCtM6D4GOc
[2] - https://www.rollingstone.com/tv-movies/tv-movie-news/karen-r...
Centralized app store monopolies are the best friends of authoritarian governments. Both Google and Apple readily remove any app the local government points them to, saying that they are committed to working in accordance with the legislation of the countries where they provide their services.
"....but added that it complies with the local laws of each particular country." [0]
[0]: https://www.techspot.com/news/67701-russia-tells-apple-googl...
First they came for the Communists
And I did not speak out
Because I was not a Communist [0]
https://en.wikipedia.org/wiki/Code_as_speech
(It was later partly rejected by other courts in the DMCA anticircumvention context.)
This argument doesn't imply that companies have to help you publish your software, because they might be entitled to some kind of editorial control over which speech they do or don't distribute. But it does at least imply that the stakes of such control are very high and that free speech norms may be implicated by them!
Nothing you allege was missed, and indeed it was considered at length in the longer series on these topics:
https://www.apple.com/legal/more-resources/docs/2024-App-Sto...
says that over 1,700 apps are removed per year due to "government takedown demands". Since this figure is separate from the about 2 million (!) apps Apple rejected from the App Store and the about 80,000 apps it removed on its own initiative, it stands to reason that Apple disagreed with quite a lot of those requests, but obeyed them anyway.
One could think about this in at least two ways:
(1) If the 2,000,000 apps they rejected or the 80,000 apps they removed on their own initiative were very dangerous or very harmful in some way, one might believe that Apple's huge and arbitrary power over iPhones is ultimately beneficial because it's mostly used to protect people, and only slightly used to uphold state power over citizens.
(2) If you compare this to the baseline of "OS developers shouldn't decide what software you can run", then it's already, well, thousands of programs, probably often quite popular ones, that people are being intentionally prevented from using because their governments disapprove, and probably quite routinely for reasons that large parts of the population would disagree with. It is already a frequent event; in some countries it's plausible that most iPhone users directly experience the results of app censorship (the removals are distributed very unevenly across countries, and the absolute majority of them in 2024 were attributable to the PRC!).
(You could add to this that users would also be divided about some of Apple's decisions on its own initiative, primarily apps that the company banned for sexual or violent content, usually fictional. Some users may agree with Apple using its power this way and other users may disagree. A recent example is that they've banned the SpicyChat AI erotic chat app, and probably many other "AI boyfriend/girlfriend" apps. In the past, they've banned apps created by various porn sites.)
I think this issue is confusing. I've always believed that device owners should have complete control of their computing devices and not be subject to other people's power when using them. You can see people in this thread pointing out that sometimes this power is being used to protect users (including from having their devices hijacked by malicious third parties, which would also tend to significantly undermine their control of their devices... although one can then argue about what responsibility different parties had to actively prevent that outcome). The argument that technological paternalism contributes to maximizing users' practical control is an argument that must be engaged with. And also, sometimes it's simply not being used to protect users at all.
By the way, if you get into the object-level issue, you can get even more confused:
(1) I think the U.S. government probably wanted to ban this particular app merely because it was successful at helping people avoid deportation. But it might turn out that, with this app or with some future app that looks superficially similar, it actually is being used to coordinate violent attacks, even if the developer didn't intend that outcome. At some point, governments will have a case that there is some kind of meaningful physical-world harm associated with the observed usage of some piece of software. (More on that in other points below.)
(2) If Apple literally prevented itself from having the power to approve or reject software for iOS (e.g. by allowing "sideloading", which was the norm for almost all historical computing environments), then you literally could have apps that explicitly describe themselves as meant to coordinate violence (against law enforcement, against minority groups, against specific people, or whatever). This is not a strawman. It's really easy to write such an app. There is no reason to think that people who know how to write apps are all refraining from writing violence-coordination apps. In other contexts, people might be able to agree not to blame toolmakers for downstream uses of their tools, like not blaming radio manufacturers for having their radios be able to receive the broadcast incitements to genocide in Rwanda in the 1990s. So maybe we would eventually similarly be able to agree not to blame Apple for making an OS that could run the "Let's Kill ______" third-party app. But we should understand that on some occasions such an app would probably exist. You know, there are video games whose content is actually pretty gross by almost any given standard. A lot of people have been able to agree that those games can exist, or at least that people other than the developers bear no responsibility for their availability.
(3) You could say that Apple should just make good, ethical, object-level, case-by-case decisions about how to use its power, which is probably what they try to do most of the time, but they sometimes fail, or sometimes there isn't a consensus within the company or within a society about what the right call should be. In that case, we're going to be back here again and again debating the merits of different app bans whenever they manage to get wide enough attention. Remember, again, there were already over 1,700 government-demanded app removals last year, and presumably lots of governments are only just waking up to the possibility of demanding them!
(4) Governments are already using offline harms to justify incredibly intrusive control of computing and communications. Some of those offline harms are real, not speculative. For example, there really were lynchings coordinated via WhatsApp groups and via WhatsApp memes in several developing countries. The remedies and "solutions" that many governments have suggested in response to such things are incredibly scary.