It's baffling how many people in previous threads thought a company that gets most of its money from enterprise/business clients would burn its reputation by surreptitiously using client data to train its AI.
> Zoom has agreed to pay $85 million to settle claims that it lied about offering end-to-end encryption and gave user data to Facebook and Google without the consent of users. The settlement between Zoom and the filers of a class-action lawsuit also covers security problems [0]
> Mac update nukes dangerous webserver installed by Zoom [1]
> The 'S' in Zoom, Stands for Security - uncovering (local) security flaws in Zoom's macOS client [2]
[0] https://arstechnica.com/tech-policy/2021/08/zoom-to-pay-85m-...
[1] https://arstechnica.com/information-technology/2019/07/silen...
(Even if revenue were much higher, revenue doesn't tell you anything about how well a company can take a financial hit.)
If the specific misconduct they got caught for netted them $x and they got fined $5x, who cares what percentage of their global revenue that is? That specific crime was still a net negative for them. I'm not sure why conglomerates should be punished more harshly just because they have more revenue overall.
Personally, I think C-levels should automatically be barred from serving as corporate officers if the corporation is found guilty of criminal conduct, since that puts responsibility on the people with the power to prevent it.
While you thought you were presenting an argument against hefty fines, you actually gave the perfect reason why they should be hefty. If illegal practices are affordable, they're not illegal; they're just the price of doing business. So make them hurt.
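To make the disagreement concrete, here's a minimal back-of-the-envelope sketch. The dollar figure and the detection probability are hypothetical assumptions, not numbers from the thread; it only illustrates the "$x gain, $5x fine" example above and why a fine that looks punishing per incident can still be "the price of doing business" when enforcement isn't certain.

```python
def net_outcome(gain: float, fine: float, p_caught: float = 1.0) -> float:
    """Expected profit from misconduct: the gain minus the expected fine."""
    return gain - p_caught * fine

# Hypothetical figure: the misconduct nets $10M (the "$x" from the comment above).
x = 10_000_000

# If the company is certain to be caught and fined 5x the gain,
# the misconduct is a clear net loss (the grandparent's point).
print(net_outcome(gain=x, fine=5 * x))                 # -40000000.0

# If only 1 in 10 violations is ever caught, the same 5x fine still
# leaves the misconduct profitable in expectation (the "price of doing business").
print(net_outcome(gain=x, fine=5 * x, p_caught=0.1))   # 5000000.0
```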