Social Cooling (2017)

1. JacobD+Hf 2020-09-29 14:37:23
>>rapnie+(OP)
>In China each adult citizen is getting a government mandated "social credit score". This represents how well behaved they are, and is based on crime records, what they say on social media, what they buy, and even the scores of their friends.

This really isn't all that different from what is happening elsewhere in the world today. Your Uber rider score represents your "social credit" for that service. Your Airbnb guest reviews affect whether you will be allowed to rent a room. Each platform is putting social credit in place via crowd-sourced "trust".

EDIT: I don't mean to minimize China's human rights violations, but to point out that, independently of central control, many companies are implementing their own versions of these systems, which can have _some_ of the same effects in terms of losing access to services. Obviously your Uber score won't put you in jail or a detainment camp, and I did not intend to imply that it would.

2. istori+Sg 2020-09-29 14:43:26
>>JacobD+Hf
It's extremely different. It's so so so so different.

The Chinese surveillance state is vastly larger and more pervasive, the list of infractions covers far more minor actions (and includes political speech that is in any way dissident), and the consequences of a low score are far more dire (being unable to fly, travel, or live in certain places, etc.).

3. captai+OE 2020-09-29 16:33:10
>>istori+Sg
It's a slower slide, but over time this information is centralised into fewer and fewer large data brokers, and what the scores mean becomes standardised across industries, so eventually there will be no escape by switching to a competitor. This enables companies to start charging based on automatically generated risk profiles, some of which will end up being generated from political preference (or proxies for it). Eventually this means that people with bad scores will be unable to afford certain things, in the same way that they have trouble accessing credit. Credit scores are just the beginning.

For instance, imagine you are an airline. You have a problem with deportation critics disrupting flights when people are being forcibly deported on them. This happens fairly infrequently, but it costs you quite a lot of money each time it does. So, 'logically', you decide to work out who is most likely to disrupt a flight, and through discriminatory thinking somebody concludes that passengers with left-wing political leanings are the most likely to do so. You purchase information on each passenger's political leanings and pass the cost on to those who fit the profile, entirely on the basis of their political beliefs.
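
To make that mechanism concrete, here's a minimal sketch of the kind of surcharge calculation this enables, assuming a purchased per-passenger risk score. Every name and number in it (PASSENGER_PROFILES, DISRUPTION_COST, the multiplier) is made up for illustration, not taken from any real airline or data broker:

    # Hypothetical illustration only: all names, scores and costs below are invented.
    # The idea: a purchased per-passenger "risk score" is folded into the quoted fare.

    # Scores a data broker might sell, keyed by passenger ID (higher = flagged as riskier).
    PASSENGER_PROFILES = {
        "p001": {"political_leaning_score": 0.9},
        "p002": {"political_leaning_score": 0.1},
    }

    DISRUPTION_COST = 50_000       # assumed cost to the airline of one disrupted flight
    DISRUPTION_BASE_RATE = 0.001   # assumed baseline probability per passenger

    def quoted_fare(passenger_id, base_fare):
        """Return a fare with a surcharge proportional to the purchased risk score."""
        profile = PASSENGER_PROFILES.get(passenger_id, {})
        risk_multiplier = 1 + 4 * profile.get("political_leaning_score", 0.0)
        expected_loss = DISRUPTION_COST * DISRUPTION_BASE_RATE * risk_multiplier
        return base_fare + expected_loss

    for pid in PASSENGER_PROFILES:
        print(pid, quoted_fare(pid, base_fare=200.0))   # p001 pays 430.0, p002 pays 270.0

Swap "political_leaning_score" for any proxy variable and the mechanism is identical.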

And if they don't do this directly, they will do it by proxy in the same way that the insurance industry has been using proxies for race. https://www.forbes.com/sites/advisor/2020/07/23/insurance-re...

It's a sort of insurance-ification of all pricing and permission, which this kind of technology increasingly enables.
