
Algorithms Feel Like Science, but Are Full of Human Error

Algorithms are in charge of hiring decisions and data collection. You should have the right to know what they’re saying about you.

Algorithms are capable of fascinating things. Just by hearing the sound of your voice, one can allegedly tell whether you’re a trustworthy person. Luis Salazar, CEO of Jobaline, the company behind the algorithm, sees it as an unbiased way to hire. “Math is blind,” he told NPR. Salazar says the algorithm analyzes the architecture of a potential hire’s voice to determine whether it has all the right qualities.

Zeynep Tufekci of the University of North Carolina at Chapel Hill says this assessment is a common misconception. She explained to NPR:

“The fear I have is that every time this is talked about people talk about it as if it’s math or physics; therefore, some natural-neutral world, and they’re programs. They’re complex programs. They’re not, like laws of physics or laws of nature; they’re created by us. We should look into what they do, and not let them do everything. We should make those decisions explicitly.”

Algorithms are made by people and are capable of making mistakes. There is no shortage of cases where programs and products have not worked properly for everyone: the Apple Watch failing for users with darker skin or tattoos, and photo sites auto-tagging darker-skinned people as apes. A slip-up can also happen simply because of imperfect data, causing things like high-paying ads to be shown to men far more often than to women. The programmer may even be at fault, unknowingly injecting biases into the code.

But far more worrying are the profiles being built about you, the individual. Algorithms are the way advertisers and companies do business these days, and somewhere out there is a data profile on you. These profiles are built from what you do around the web: the search queries you type, the online stores you shop at, and so on. The problem is that you don’t know what they say about you. This issue can manifest itself in two ways:

1. When algorithms are used to personalize your experience
2. When an algorithm gets something wrong

The former comes in the form of the filter bubble effect, where a site encapsulates each user in a personalized informational echo chamber of things they already agree with — not the best for knowledge growth. It’s an experiment you can do right now, if you wish: ask two friends to Google something like “Obama” or “Egypt” and see which results pop up first. The results tend to be different. Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding from You, explained the detrimental effects of this process in his 2011 TED Talk.

The latter comes in the form of misunderstandings — some big, some small, but none you want on your permanent profile. As Aeon’s Frank Pasquale found, “[o]ne woman was falsely accused of being a meth dealer by a private data broker, and it took years for her to set the record straight — years during which landlords and banks denied her housing and credit.”

For these reasons, and many others, there should be some kind of algorithmic accountability — a way users could challenge mistakes in the system. The better option, though, would be to stop data collection altogether. If you want to reduce the data collected on you, start by using search engines that don’t track your queries, like DuckDuckGo. If you really want to mess businesses up, go one step further and try browsing anonymously with Tor. Vote with your data or, in this case, by withholding it.

Photo courtesy of JACQUES DEMARTHON / Getty Staff

