Why China’s Social Credit Scoring System Isn’t Quite What You Think It Is

and how a similar system could one day sneak into North America

A simple Google search for China’s social credit system paints a startling dystopian future, seemingly on the verge of becoming reality. Amidst commentary about “big brother” and “that one episode of Black Mirror”, you’ll learn of a world where every citizen is monitored, evaluated (on metrics such as how many hours you play video games each day, or whether you’re a parent), and publicly ranked. In this world, your score directly determines your ability to travel, your ability to own property, and even who will be willing to date you. Although the initiative is still in the pilot phase, the stated goal is that by 2020 all Chinese citizens will be governed not only by the law, but also by their reputation.

Is this the reality of what’s truly going on in China, and will it really be in full force within the next five months? The answers to those questions are “not quite” and “unlikely”. So, let’s get into it, shall we?

The History

The concept of a social credit score isn’t new. It was originally conceived in the 1990s and has been percolating in the minds of Chinese regulators ever since. Historically, there has been an extremely low level of trust between Chinese citizens, and with a population of 1.4 billion people depending on each other for daily living, regulators wanted a better way to ensure the safety and security of all citizens.

In 2014, the Chinese government published its “social credit system” plans, which allowed for multiple pilots of different social credit scoring algorithms and systems, with the goal of introducing a China-wide system by 2020. However, the details of these plans are not quite what is being portrayed in Western media.

What’s the Government Actually Supporting?

In order to fully understand what is happening, there’s a key point I need to highlight — getting blacklisted is different from your social credit score.

In China, most of the serious repercussions we’re hearing about are actually consequences of getting blacklisted. To get blacklisted, a citizen must commit a serious offense (like fraud) that would be cause for legal action in most other parts of the world. Being on the blacklist is publicly available information, and it limits the individual’s ability to buy a plane ticket, buy property, or take out a loan. In 2018, Chinese citizens were blocked from buying airline tickets 17.5 million times. These repercussions work largely because most departments of the Chinese government have a common understanding to discipline citizens on each other’s behalf. In Canada, this would be like the CRA punishing you for a health and safety violation.

Additionally, you’re not stuck on the blacklist forever. In theory, as long as you follow through with the court’s orders (like paying the fine), your name will be removed from the list. In practice, there are many concerns about how this process can be abused, but even so, this isn’t quite as bad as we originally thought, is it?

On the other hand, the current “social credit system” doesn’t actually have explicit repercussions; the pilots instead focus on reward-based incentives. Although the pilots do take into account “blacklist-able” behaviours and related consequences, the system is more focused on compiling and sharing public-record data, such as centralizing licensing information and adverse court decisions into one database. Unless you have a sole proprietorship or have spent some time in court, there’s no guarantee you’ll even be on the list.

The bottom line — government pilots are not evaluating detailed personal information and daily activities to rank citizens.

Then Where Did This Confusion Come From?

It’s not a complete surprise that the Western media picked up on this story. In 2015, when the Chinese government was just beginning work on the program, it did sponsor eight private companies to experiment with social credit scoring. One example is Ant Financial’s Sesame Credit system. This pilot included monitoring what you buy, how many hours you watch TV, who your friends are, and much more. This data and scoring could also be shared with other companies; for example, Sesame Credit partnered with the dating site Baihe so users could opt in to display their score to improve their dating chances. However, in 2017, the government announced that none of these private pilots would become the official system for China. Most of these systems now act more like loyalty programs than social credit scoring systems, and the government could shut them down in the future if it so chooses. Even though these programs started with government support, China has largely backed away from having anything to do with them.

What Will Happen in 2020?

Simply put, no one can be sure what will happen in 2020. The Chinese government has been quite tight-lipped about the system’s development. Many analysts doubt that any system will be ready by the deadline, and many believe that China is years away from having the capability to integrate all government and private data into one scoring mechanism. Instead, some see the 2020 deadline as the end of the planning period. If, however, China got a system in place for a large city like Beijing, it would capture a large proportion of the population. In November 2018, China did announce plans for such a system, but nothing has been said publicly since.

North Americans Would Never Stand for This…

In theory, no, North Americans wouldn’t stand for this. At least not for the system as we currently understand it. However, rating systems have slowly snuck further and further into our culture over the past few years. Want to get a ride somewhere? Don’t vomit in someone’s Uber. Want to qualify for a loan? Keep your finances in good standing. Want to fly somewhere? Don’t share a name with a terrorist. Facebook even has a trustworthiness score to help identify fake news. You’re probably still feeling fairly comfortable, though, knowing that most of these ratings aren’t publicly available and are stored on independent systems. There is an underlying sentiment that consumer demand won’t exist for such a system, and that even if it did, North American governments wouldn’t let anything like this happen. But don’t feel too confident yet… I’ve picked up on three signals hinting that these assumptions are wrong.

The first signal, which challenges the assumption of a lack of consumer demand, is that we are starting to see movement (by the private sector) toward this type of rating structure. A Los Angeles-based company called MyLife is aggregating publicly available data with personal reviews to give every US citizen a reputation score. Every American is on the site, aware or not, and this is legal because the data inputs are already public. The company does make all searchers click a few boxes promising that no discriminatory decisions will be made based on this information, but it also offers a service advising users on how to improve their score. Another organization, #ReputationMatters, is advocating for a single, “globally accepted reputation scoring system and peer-to-peer background verification platform” — aka, your Uber rating, eBay rating, and Etsy rating all become one and the same.

Homepage of MyLife

The second signal, which challenges the assumption that governments would block such a system, is fear. Looking back in history, the expansiveness of the No Fly List was born out of the fear created by 9/11. The government stepped in and drew a clear line between the “haves” and “have nots”, restricting air travel based on behaviours it deemed risky. Now don’t get me wrong, I fully support the No Fly List; however, it is a sign that the government is willing to re-evaluate the definition of privacy and make hard-and-fast decisions in the name of security.

Logically, we don’t want a social credit rating system because of the numerous negative repercussions it could bring, but fear is the enemy of logic. What we’re starting to see today in the United States is an immense level of fear due to an unprecedented level of gun violence. Just recently there were shootings at a Walmart, at a downtown bar scene, and at a food festival. How can Americans go to any public space and trust that they will be safe? And people are definitely feeling fearful. In a survey by Chapman University, the percentage of respondents who answered “How afraid are you of being the victim [of a random/mass shooting]?” with “afraid” or “very afraid” rose from approximately 16% in 2015 to 42% in 2018. That is a 162.5% increase in three years.

If we can use ratings to ensure someone won’t vomit in our car, can we not use ratings to ensure that we will make it home at the end of the day? The difference between the No Fly List and any sort of “mass shooting risk list” is the simplicity of the inputs. For the No Fly List, there are a few behaviours that correlate relatively strongly with terrorist risk, but for mass shooters it may not be as straightforward. Is it mental health? Is it a poor family life? Is it how many video games you play? The only way to truly know is to track it all, and wouldn’t the best way to track it all be to consolidate data sources?
Now we’re starting to sound a bit more like China aren’t we…
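For the skeptics, the survey arithmetic above checks out. A quick sketch (using the approximate 16% and 42% figures quoted from the Chapman University survey) shows how the 162.5% relative increase is derived:

```python
# Relative increase = (new - old) / old, expressed as a percentage.
# Inputs are the approximate survey shares quoted above.
pct_afraid_2015 = 16.0  # ~16% "afraid"/"very afraid" in 2015
pct_afraid_2018 = 42.0  # 42% in 2018

relative_increase = (pct_afraid_2018 - pct_afraid_2015) / pct_afraid_2015 * 100
print(f"{relative_increase:.1f}% increase")  # → 162.5% increase
```

Note this is a relative change: the share of fearful respondents itself grew by 26 percentage points, which is a 162.5% jump against the 2015 baseline.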

The third signal ties directly to the second, and that is the disintegration of trust in America. Fear breeds distrust, and China’s entire value proposition is predicated on needing a solution to a lack of trust. RAND, a leading think tank, has warned that America is suffering from a “truth decay”, and a study from the Pew Research Center found that 64% of U.S. adults believe that Americans’ trust in each other has been shrinking. eBay and Uber ratings were built out of the desire to increase trust in interactions. Couldn’t we trust each other even more if we had more than one data source to judge by?

In the Western world we tend to disparage China and its surveillance actions. Although China’s poor track record on human rights makes any sort of social credit system a greater concern, it is important to recognize that we may not have fully understood the country’s intent. The more we dig into China’s plans, and the more we reflect on our own current state, the more we might start to see threads of similarity. It’s not too late, though. We are in a position to start shaping the future we want — whether that includes a social credit rating system or not. The fundamental question we need to ask ourselves is: do we believe in the ideology that fear and distrust are best controlled by five stars on a phone screen?

Canada-based researcher and foresight practitioner within an innovation team
