I wonder what happened to this story after that. (2019) “AI to determine your creditworthiness: China to rate 1.4 billion people” (Asahi Shimbun GLOBE+, Japan)
The “Social Credit System” is a project launched by China’s State Council in June 2014 under the banner of improving social norms, with a target year of 2020. The idea is to develop a “social credit” score for 1.4 billion citizens… The score covers everything from traffic safety and tax payment to online behavior; “social credit” is lowered for those certified as having spread fake news on the Internet. Those with low “social credit” are blacklisted and face sanctions such as being denied boarding of airplanes and trains. The Global Times, an English-language newspaper in China, reports that because of “social credit”, 11.14 million people had already been denied boarding of airplanes and 4.25 million people of high-speed trains by the end of April 2018. … Widely known is Zima Credit, a credit evaluation system provided by Ant Financial, a subsidiary of Alibaba, China’s largest online retailer. The AI evaluates a variety of data, including online purchase history, payment history, service usage history, and even friendships, and assigns a credit score ranging from 350 to 950 points. The data collection also draws on information from the police and other public agencies. Depending on the score, users receive benefits such as being exempted from deposits for hotels and car rentals. … The “1984-like” concern that every behavior will be data-mined and scored persists.
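The actual model behind Zima Credit has not been disclosed, so the following is only a minimal, hypothetical sketch of how a handful of weighted behavioral features might be mapped onto the reported 350-950 score range. Every feature name and weight here is invented for illustration and is not taken from the article.

```python
# Purely hypothetical sketch: Zima Credit's real model is proprietary and undisclosed.
# Assumed here: each behavioral feature is normalized to [0, 1], combined as a
# weighted sum, and linearly rescaled to the publicly reported 350-950 range.

FEATURE_WEIGHTS = {
    "payment_history": 0.35,     # invented weight
    "purchase_history": 0.25,    # invented weight
    "service_usage": 0.20,       # invented weight
    "social_connections": 0.20,  # invented weight
}

SCORE_MIN, SCORE_MAX = 350, 950  # score range reported in the article


def credit_score(features: dict) -> float:
    """Map normalized features (each in [0, 1]) to a score in [SCORE_MIN, SCORE_MAX]."""
    weighted = sum(FEATURE_WEIGHTS[name] * features.get(name, 0.0)
                   for name in FEATURE_WEIGHTS)
    return SCORE_MIN + weighted * (SCORE_MAX - SCORE_MIN)


# Example: strong payment record, but few recorded social ties.
print(credit_score({
    "payment_history": 0.9,
    "purchase_history": 0.7,
    "service_usage": 0.6,
    "social_connections": 0.3,
}))  # 752.0
```

Even in this toy form it is easy to see why the “friendships” input raises the privacy concerns quoted above: a low value in a single social feature drags the whole score down regardless of the person’s own payment behavior.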
MIT Tech Review: “How did the misunderstanding of China’s ‘Social Credit Score’ come about?”
The attempt to have AI algorithms evaluate citizens is often criticized as a dystopian policy promoted by authoritarian states like China. In reality, however, it is in Western countries that the use of this technology is spreading.
by Melissa Heikkilä 2023.01.04
- U.S. and EU Agree to Oppose Social Credit Scores
- EU negotiating to enact AI law
- China’s social credit system is quite different from how it is commonly perceived
- Alibaba’s “Sesame Credit” is a source of misunderstanding
- Similar systems exist in the West.
In Amsterdam, for example, authorities use an algorithm to rank young people in disadvantaged neighborhoods according to their likelihood of becoming criminals. The goal, the authorities claim, is to prevent crime and provide better and more targeted assistance.
- Western Legislators Need to Seek Common Understanding on AI Governance
- Honest and thorough audits of AI use by government authorities and companies are needed
This page is auto-translated from /nishio/社会信用システム using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I’m very happy to spread my thoughts to non-Japanese readers.