Brice Bunner

Are Big Data predictions becoming self-fulfilling prophecies?


Image ©: Peak Prosperity

We have all experienced filter-bubble syndrome, whether we realize it's happening or not. That's where what you search for has been so targeted by Google's or Bing's algorithms that you can't get results outside the ring of ideas and concepts you trend toward, even if you want to. This has implications too vast to get into in this post, but there is another factor at work because of this bubble: the world around us is changing to match it.

Nature vs Nurture

At its core, what is happening with A.I. algorithms and social media feedback is the same thing that happens with small children as they develop. There are inherent characteristics built into a child's thinking and behavior, but there are external stimuli that affect the child as well. The age-old debate is whether a person's outcome is more the result of that child's nature or of the nurture (or lack thereof) they experienced growing up.

Do algorithms guide us to the content they want us to desire, or are we guiding the algorithms to produce the results we want? The answer is: yes. It's a mixed bag, the same as parenting. Our results are the intersection of what the external stimuli (algorithmic results) and the internal motivations (what we are urged to search for) produce together.

When Nurture takes over

This parenting debate changes its tune slightly when it comes to technology, however, because the majority of users today are already past childhood when they turn to social media and search engines for their input. Not only that, but the more we use the technology, the more this Big Data dials in on what it thinks about us.

As an example, algorithms are being used to make predictions about human behavior, even with regard to the greater mystery of dating. But as these data sets refine toward an absolute, we are slowly surrendering our free will, all in the name of convenience. After all, there is a trade-off between being given an answer and having to accept responsibility for our own actions. Many people today, and even more in the future, might consider it better to take the easy road.

Risk assessment and living

If you take a moment to think about it, there are a lot of chances you take each day. From what to wear all the way to business decisions, each choice can have an impact on how your day goes. That adds up to a lot of stress when you count all the micro-decisions you make. Wouldn't it be easier to leave deciding where to eat lunch, for instance, to Siri? Or to have Alexa pre-screen emails in order of importance?

These seem like small battles to cede control over, but in the world of algorithms, everything counts. And when digital results sway real-world decisions, things eventually become less diverse. Algorithms, after all, are designed to collate the world around us and reduce it to patterns.

When algorithms lead our decisions rather than our own will, the potential for disappointment is out of our hands. Have a bad date? Blame it on the dating app. Conscience cleared.

But we are designed to handle decisions. In fact, decision making is a fundamental life skill. So what is happening to us when we defer to algorithmic results? Can it be healthy to shrug off control? And where do the ethics of bad decisions come in? After all, we've seen where A.I. left to wander can lead.

As the things we do trend toward matching our consumption preferences, we lose opportunities. Even when new information is introduced, it quickly becomes homogenized into the trending aesthetic, losing the edge that gives diversity its value. Not only are we handing decisions over to technology, but the opportunities to discover new stimuli are systematically eliminated.

This is great when we really want to find that thing we've been looking for, but if it also dictates everything we search for after that result, then we've inadvertently "programmed" the algorithm to decide, like an overprotective parent, what's not good for us.

Broadening the horizon

I have an old laptop that I keep for the kids to do homework on and for whenever I need to run a CD-ROM. The interesting thing is that I hadn't used YouTube on that device in ages, so when I pulled up the video site, the suggested videos were completely different from what I've grown accustomed to on my current laptop. And when I searched Google for something, the results were radically different from what the same search would proffer on the other device (assuming I'm not logged in to Google at the time).

To see what might be out there beyond your own searches, give DuckDuckGo or another search engine a shot. You might be amazed at what you've been missing. And if you want to do some good in the world while searching, why not try Ecosia's search engine? They plant a tree for every few searches people make, all while offering a new experience through their results.

Technology is amazing, but with that magic comes the power to take over our decisions. As tempting as it may be to leave those decisions up to an app or a search result, do the right thing and take hold of your personal freedom! Stop the self-fulfilling prophecy from coming to fruition in your own life and put Big Data in its place.

About Sage

Sage Sustainable Electronics leads the market in sustainable IT asset management and disposition (ITAD) by reusing more and recycling less. Every year, businesses retire millions of used-but-still-useful technology products, creating the fastest growing business and consumer waste stream in the world. We strategically and passionately help companies reuse more and recycle less than anyone else in the industry.

For Businesses: Schedule a pickup for your retired computers, servers, printers and more.

Schedule Now

or call (844) 472-4373

For Individuals: Shop for refurbished tech at amazing prices, backed by The Sage Promise.
