zlacker

[return to "Google Web Environment Integrity Is the New Microsoft Trusted Computing"]
1. baz00+8t 2023-07-27 07:01:00
>>neelc+(OP)
The problem here is that most people don't give a crap. I was explaining this situation to my girlfriend last night over a drink. She's a high level academic with a strong mathematical and logical background in a different field but she didn't really formulate an opinion on it past "if my stuff keeps working, why is it a problem?". Which is fair, because it's a hypothetical risk, but the side effects are a net negative and the open nature of the web is at risk.

As always people see the happy path down the middle of the forest, not the creatures waiting to leap out and eat them two steps down the line.

2. nologi+sU 2023-07-27 10:45:57
>>baz00+8t
> The problem here is that most people don't give a crap

Most people are not qualified to give a crap.

We don't adopt medicines on the basis of "most people's" opinion, we don't adopt anything technological with potentially harmful impact on the basis of the opinion of large uninformed masses.

That's why we have regulators and other institutions that should be informed and give an informed crap. On an ongoing basis, and not only as a result of popular outrage.

Which brings us to regulatory capture and said institutions actually failing their mandate to serve the interests of the people that fund them.

But now we have something that most people should give a crap about. This is not technical, it goes to the foundation of democracy and governance. Otherwise we might as well stop voting and accept we live in a corporate oligarchy.

3. flagra+A91 2023-07-27 12:44:46
>>nologi+sU
This can be a really dangerous approach though. Appeals to authority can easily turn dark if the wrong authority is in charge, and by then the people have been conditioned to blindly follow them.

People should never be expected to make meaningful decisions in their life only because someone with degrees said it's best for them, or even worse make no decision because the leader already made it for them. People need to be able to think for themselves and make their own decisions, even if the few experts may disagree with the decisions made.

In my opinion, this should have been the most important lesson from three years of pandemic response. We had a small group of experts getting out over their skis and speaking with certainty about the virus and what everyone must do. In reality these experts had far less research-based data to support that level of confidence, and in some cases the data even contradicted them. In the meantime we were all forced or coerced into various decisions and protocols that didn't seem to pan out, for a virus that once got people kicked off social media platforms for comparing it to the cold or flu, even though that's precisely how said experts discuss it today.

Experts should absolutely weigh in and attempt to educate people on what's at stake and why they should make one decision or another. But a system in which a few at the top decide for and control the rest of the population is extremely dangerous and should be reserved only for the most important situations.

4. nologi+Vj1 2023-07-27 13:34:25
>>flagra+A91
Yes, I definitely share the same concerns. But there is a practical need to rank risks and identify which are immediate, first-order ones versus second-order and broader concerns. In the absence of independent and minimally competent bodies we are in dire straits, effectively in snake-oil-salesmen territory regarding a technology that is considered central to our future.

I don't think there is, or will ever be, perfect regulation. Pick any sector (banking is a prime example) and you can identify recurring failure, capture, complacency and other pathologies, on top of the intrinsic difficulty of working out the unknown-unknowns.

Ultimately the only structural mitigation available is to have as many checks-and-balances as possible and transparency about motivations and incentives of all actors involved.

But that is not the immediate problem with "tech". I put the term in quotes because even that is a conceit. The accurate term is probably "random conglomerates that were first movers in adopting digital technologies, with user-data based advertising the overwhelming business model".

The shtick has been that "heavy handed" regulation of said "tech" will stifle innovation and other such drivel. Indeed, if by innovation we mean drifting ever deeper into the black hole. For more than a decade now we have been trapped in an egregiously suboptimal situation.
