With news about Huawei breaking the silence on China’s digital sneakiness, even more information about the world’s fourth-largest country (by area) and its dystopian-esque government tactics has come to light. But the news leaking out isn’t just idle talk. In some cases it’s like something straight out of a Bond film. At least, it probably was for this journalist.
But why the derring-do?
As the world continues to become digitally integrated, and since data (as a whole) is indifferent to race, creed, or color, China’s tight borders stand in stark contrast to the rest of the modern world. The values and ethics lurking behind the choices China makes raise question after question about its ultimate goal. Add to that the reports about microchips leaving back doors in our devices, and the whole thing takes on a clandestine feel.
And that’s the essence of why companies and individuals are set on finding vulnerabilities in China’s defenses: because, in many ways, it stands against progress. How can we continue growing if we live in fear?
We’ve covered this concept before in other posts, but there is this idea that the internet faces one of two likely futures. The first is what we see happening in China as it progresses with technology: a future in which government control becomes oppressive and citizens are rewarded for complying with government demands.
The other future is one in which consumers take charge of their own digital health to the betterment of society. Namely, restraining screen time and extending the use of each device they own (and ownership is their right).
As for the former option, what we see happening in China with regard to something as rudimentary as VPN access is just an indication of the lengths it is willing to go to control the world wide web within its borders.
How can they stand being so limited?
It’s easy to think it strange that free-thinking humans would willingly submit themselves to a platform that dictates what can be viewed, especially when you live in a country with free internet access, right? But the joke is on us. We already live in this kind of dystopian existence. And it’s of our own making.
For the past fifteen or so years, the truth is you haven’t been able to view whatever you want, because filter bubbles, algorithms, and social media bans pre-select what they calculate you should be looking at. We’ve created our own prisons of data with the very data we feed them.
Perhaps our controlling platforms seem less onerous because we can’t see who’s pulling the strings (most of the time). With China, it is easy to vilify the government because it takes credit for the limitations it places on its citizens. With algorithms, on the other hand, it is amoral calculations doing the decision making.
This might seem too simplistic, but think of how Facebook, Google, and Amazon have made their billions: it is with creepily accurate predictive algorithms that they craft their fortunes. Admittedly, we do have some degree of transparency, and we can opt out of these services, options Chinese citizens don’t have.
Since the data these tech giants deliver is in line with our own choices, however, we see it not so much as limiting as convenient. But the output is the same: we are all being pushed toward specific content. What that content hopes to create in us is the ultimate question. Our own choices and preferences, distilled back to us again and again, seem like a recipe for social degradation.
The sinister side of math
Algorithms are, at heart, mathematical recipes designed to group data around something in particular. That can be something innocuous, like the kind of movies you might enjoy, or something more ethically questionable, like targeting criminals.
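To make that grouping idea concrete, here is a minimal sketch of the kind of math a recommendation engine might use. The viewer names and genre scores are entirely made up for illustration; real recommender systems are vastly more complex, but the core move, matching you to people whose data looks like yours, is the same.

```python
# A toy "people like you" recommender, using made-up viewers and
# hypothetical preference scores for (action, drama, documentary).

def similarity(a, b):
    # Cosine similarity: how closely two preference vectors point
    # in the same direction, regardless of how strong each taste is.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

viewers = {
    "alice": [5, 1, 0],   # loves action
    "bob":   [4, 2, 1],   # mostly action, some drama
    "carol": [0, 1, 5],   # documentary fan
}

def most_similar(name):
    # Group this viewer with whoever's tastes most resemble theirs;
    # their watch history then becomes the recommendation pool.
    others = {k: v for k, v in viewers.items() if k != name}
    return max(others, key=lambda k: similarity(viewers[name], others[k]))

print(most_similar("alice"))  # bob: so alice keeps getting action films
```

Note the feedback loop baked in: because alice is grouped with bob, she is shown more of what she already watches, which is the filter bubble in miniature.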
The math churns numbers so fast our heads spin, and then we have a grouped solution that matches our expectations. And, once again, we come back to the individual as regulator. It seems, in the face of light-speed algorithms and filter bubbles, the only thing we can control is our own responses. At least, for those of us in free countries, that is.
Math will give us numbers: ethically neutral numbers. It’s what we do with those numbers that makes us conform to a governmental regime, a social group, or a suggestion based on the data the algorithm was given. And our future will be a direct result of which path we choose.