Have you ever called a customer support line and heard the phrase “this call may be recorded for training and Quality Assurance purposes”?
Well, the truth is that your recorded phone call will most likely never be used for training; only an extremely small number of calls actually are. So why are all these companies recording millions of phone calls every single day?
Many of these recorded phone calls are automatically sent to companies that develop algorithms; as soon as you hang up the phone, your conversation is being analyzed by computer code. Such companies have developed machine learning algorithms that analyze billions of phone conversations every single day, sent to them directly by large corporate clients.
Their largest clients send them over 250 million recorded phone calls every day. Their computers examine what you say and how you say it, the words you choose and the tone you use, to determine your personality and place you in a group of people with similar personalities.
You may, for example, respond well to facts and figures, or better to personal sentiment and compliments. You may be short and aggressive in tone, or patient. You may be shy, sarcastic, blunt, or outgoing. Within minutes of your first call, a call center algorithm has attached a personality label to your phone number. Then, the next time you call any other company, you are automatically routed to a customer service agent with a personality similar to yours.
You are routed to a person who can better tap into your psyche, to sell you products more efficiently or to solve your problem more quickly. The result is a shorter, more pleasant phone call for everyone involved:
happier customers and, of course, boosted sales for the company.
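The routing step described above can be sketched in a few lines. Everything here is hypothetical: the personality labels, agent pools, and phone numbers are invented for illustration, and a real system would work at vastly larger scale.

```python
# Hypothetical sketch of personality-based call routing. Labels, agent
# names, and numbers are invented; only the lookup logic is illustrated.
AGENTS_BY_PERSONALITY = {
    "analytical": ["agent_ana_1", "agent_ana_2"],  # responds to facts and figures
    "emotional": ["agent_emo_1"],                  # responds to personal sentiment
    "impatient": ["agent_imp_1"],                  # short, aggressive tone
}

# Labels previously attached to phone numbers by the analysis step.
CALLER_LABELS = {"+15550123": "emotional"}

def route_call(phone_number: str) -> str:
    """Return an agent from the pool matching the caller's personality label."""
    label = CALLER_LABELS.get(phone_number, "analytical")  # unknown callers get a default pool
    return AGENTS_BY_PERSONALITY[label][0]
```

A caller whose number has already been labeled lands with a matching agent; a first-time caller falls back to a default pool until their own label is built up.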
This has happened to you hundreds of times over hundreds of phone calls without you ever realizing it.
So how do you feel about this? Should you feel that your trust has been betrayed? Your privacy invaded?
Well, you could, but in reality these are what's known as black-box algorithms: no real humans are listening to your phone call, and the data these algorithms extract from millions of aggregated phone conversations isn't even visible to the engineers who create them.
It's all just math: your words are first converted into numbers and then transformed millions of times by computer software. The end result is so obfuscated and so vastly complex that all the people behind the scenes actually see is a black box of billions of numbers that takes phone calls as input and spits out a personality label as output.
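A toy version of that words-to-numbers-to-label pipeline makes the idea concrete. The transform and threshold below are invented; a real system learns billions of parameters that no engineer reads directly.

```python
# Toy "black box": words become numbers, pass through opaque arithmetic,
# and a label falls out. The 0.37 multiplier and the 0.5 threshold are
# made up for illustration, not taken from any real system.
def personality_label(transcript: str) -> str:
    numbers = [len(word) for word in transcript.split()]   # words -> numbers
    score = sum(n * 0.37 for n in numbers) % 1.0           # opaque transform
    return "patient" if score < 0.5 else "blunt"
```

Even in this trivial case, nothing about the input survives in readable form: by the time a label comes out, the original words are gone.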
Nobody is listening in to you complain about that pair of pajamas you just purchased with a tear in the crotch, because no one cares. In the end, thanks to these algorithms, you will be put through to someone you get along with well, and you'll probably hang up the phone having had a shorter, happier, and more successfully resolved conversation.
Algorithms are not a new thing; we have been living by them and using them to enhance our lives for literally thousands of years. A simple recipe is an algorithm: you take an input (the ingredients), follow a set of predefined instructions, and, if you're good enough at cooking, you get an output: a tasty meal.
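The recipe analogy maps directly onto code: inputs in, fixed steps, output out. The dish and quantities below are just an illustration.

```python
# A recipe really is an algorithm: take inputs, follow fixed steps,
# produce an output. The quantities here are invented for illustration.
def make_pancakes(flour_g: int, eggs: int, milk_ml: int) -> str:
    batter = f"{flour_g}g flour + {eggs} eggs + {milk_ml}ml milk"  # step 1: mix
    return f"pancakes ({batter}, fried until golden)"              # step 2: cook
```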
The ancient Greeks, Babylonians, and Egyptians all developed their own mathematical algorithms to accomplish a variety of basic tasks and make life simpler, but today algorithms define your life like never before.
Much of your life has pretty much already been decided by an algorithm without you even realizing it. Whether you can get a credit card, a loan, or a mortgage to buy your dream home has been decided by an algorithm. Which school you or your children go to, your exam results, and your university degree are all shaped by algorithms.
When you sit down to watch a film on Netflix or Amazon, statistics show that you're most likely to watch a film that has been recommended to you by an algorithm. But it gets a lot more personal than that.
Since 2010, online dating has been the most popular way for new couples to get together, and today the majority of new couples meet online.
Now if you’re the type of person who believes in fate and one true love, then the math would like to have a word with you.
If you met your partner online, a series of steps had to happen to lead up to that point. Both you and your partner first had to discover and sign up to that dating website or app, which most likely happened because you both saw an ad or search listing for the service.
An algorithm, whether Google's or another company's, decided, based on your search and web browsing history, whether to categorize you as an elite single, a sugar daddy, or a lonely heart, and accordingly which ad or search result for the appropriate dating website you would see.
Once you've signed up to the dating website, you will be asked to fill out your profile and answer a seemingly never-ending series of questions about your personality, including whether you enjoy long walks on the beach, which is a really strange question, because the vast majority of people don't live near a beach.
Based on your answers and the words you put on your profile, an algorithm will then decide which matches to show you. You may then take it upon yourself to flirt with one of those matches and become romantically entangled. Algorithmic math, not fate, has narrowed a few thousand possibilities in your area down to a small handful of people.
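A minimal sketch of that matching step, assuming answers are encoded as numbers on an agree/disagree scale: count how often two people answered identically, then pick the candidate with the highest score. Real services also weight questions by stated importance; this toy version does not.

```python
# Hypothetical questionnaire matching: answers are lists of numbers on
# a 1-5 scale. Names and answer values are invented for illustration.
def match_score(answers_a: list[int], answers_b: list[int]) -> float:
    """Fraction of questions two users answered identically."""
    same = sum(1 for a, b in zip(answers_a, answers_b) if a == b)
    return same / len(answers_a)

def best_match(my_answers: list[int], candidates: dict[str, list[int]]) -> str:
    """Name of the candidate whose answers most closely match mine."""
    return max(candidates, key=lambda name: match_score(my_answers, candidates[name]))
```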
Algorithmic math has decided who you might spend the rest of your life with, and the children you may have. So, if you think about it, today computers are breeding humans.
The algorithmic hype train has also reached the police in most major countries. Police forces are now trying out sci-fi-like algorithms that can do crazy stuff, such as predict when and where a crime is likely to happen before it does. The reason police forces are getting so hyped up about these Minority Report-style algorithms is that they work really well, using thousands of data points such as current weather conditions, traffic statistics, and neighborhood characteristics like average income, social background, and education.
These algorithms, in use by the police today, can dynamically predict with uncanny precision, to the minute, where a crime might happen. Police use these systems to allocate their officers to specific areas of a city more effectively. But this isn't new:
so-called predictive policing has been in use for a few years now, and it's becoming more and more common.
Something even more controversial: new algorithms have recently been developed that can calculate any civilian's threat score. What is a threat score?
Billions of data points are taken into account within seconds. This includes a person's arrest records, but it worryingly gets a lot more personal: the algorithm looks at a person's property records, education history, commercial databases from companies that person has used or been associated with, and, most personal of all, their social media network: who they're friends with, and every single one of their tweets, posts, images, and videos.
Even their web searches are used. All this information is used to algorithmically calculate a threat score for that person, similar to a credit score.
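The credit-score analogy suggests a shape for such a system: a weighted sum over many data points, capped to a fixed range. The field names and weights below are entirely invented; real systems are proprietary and far more complex.

```python
# Hypothetical threat-score sketch: like a credit score, a weighted sum
# over data points, capped at 100. Fields and weights are invented for
# illustration; no real scoring system is being reproduced here.
WEIGHTS = {"arrests": 40, "flagged_posts": 5, "watchlist_contacts": 25}

def threat_score(record: dict) -> int:
    """Sum the weighted data points present in a person's record."""
    raw = sum(WEIGHTS[field] * record.get(field, 0) for field in WEIGHTS)
    return min(100, raw)  # cap at the top of the scale
```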
The higher the threat score, the more of a potential risk that individual could pose to the police or to civilians. In practice, when a call comes in, an officer can tap that person's name into their in-car computer, and the system will instantaneously reveal the suspect's threat score before they even arrive at the scene.
If the suspect has a high threat score, they will be treated far more cautiously, tasers and guns at the ready, as the score indicates they are more likely to be carrying a weapon, and hence more likely to use it against others.
This kind of social rating system opens up a whole host of controversial questions.
And this is just the tip of the mathematical iceberg. When it comes to punishment and law enforcement, it's no secret that certain countries have long been collecting vast amounts of data about their citizens: phone calls, emails, photos, videos, and more.
The predictive power of algorithms combined with mass data means these systems can even predict what we might be planning to do in the future, such as launching a terrorist attack. If you have liked a certain page on Facebook, recently shopped at a certain hardware store, and happen to be friends with someone from a particular country, then the automated algorithms in use at the NSA and other agencies may have put you on a watch list, even if you don't actually have any negative intentions.
Whether this style of mass surveillance is a good or a bad thing is a highly philosophical question. There were four awful terrorist attacks in the UK in 2017, but according to MI5, nine further planned attacks were prevented, and it's highly likely that these algorithm-driven mass surveillance systems were at least partially responsible for discovering and stopping them.
We should all consider that the more we integrate algorithms into our lives, the more our lives are enhanced in many ways. We have cheaper airline tickets because prices are calculated more efficiently by algorithms. We arguably find better relationship matches today thanks to our personalities being evaluated, rather than settling for whoever smiled at us across the bar. We get better shopping recommendations than ever, we visit better and more interesting cafes and restaurants thanks to recommendation algorithms, and we can even go on better holidays.
But the dark side of integrating all these mathematical systems into our lives is that huge decisions that affect us in irreparable ways, such as mortgage approvals, prison sentences, and surveillance, have lost their human touch. We are now only numbers waiting to be sorted.