4 Times Algorithms F*cked Humanity

“Buzz, buzz, buzz,” my phone vibrates as I receive yet another notification today. 

How long do I wait before I look? 

What could it be? 

The notification could’ve been triggered by any of the plethora of applications I have installed on my phone. 

The tension builds and I begin to feel anxious like a child on Christmas Eve waiting for the clock to strike midnight. Now my smartwatch heart rate monitor triggers and warns me of an elevated heart rate. 

So, I give in. 

I reach for the phone and the enchanting black mirror illuminates to life. I see the message it had waiting for me, which says, “Rain expected in 20 minutes at your location.”

 -_- 

The buzzing and humming of our phones occurs multiple times a day, and it has gradually become the norm. We’ve become so dependent on our electronics and their algorithms, placing a blinding trust in their utility. However, let’s not forget these 4 times algorithms f*cked humanity.


Y2K Bug

“The Commerce Department’s $100 billion estimate covers the cost of testing and repairing computers affected by the Y2K problem from 1995 to 2001. It does not, however, include the money that businesses have spent on new machines to replace older ones that have date glitches, which economists say could provide some long-term economic benefits through productivity gains.

The Commerce estimate also doesn’t take into account firms’ Y2K-related publicity campaigns or the possible cost of litigation stemming from undiscovered glitches. As a result, some economists believe overall Y2K spending probably is somewhat higher, perhaps closer to $150 billion.”

Chandrasekaran, Rajiv. “$100 Billion Price Tag for Y2K FIX / Computer Bug Repair Sets Peacetime Record.” SFGATE, San Francisco Chronicle, 23 July 2020, https://www.sfgate.com/news/article/100-Billion-Price-Tag-for-Y2K-Fix-Computer-bug-2896765.php.

To save a few bits of data, some software engineers stored only the last two digits of the year. This forced companies to retroactively review code running in production and implement fixes for the issues caused by the two missing leading digits of the year.

Overall, the Y2K fix was needed to prevent failures in applications such as scheduling, trend analysis, and time-based calculations. For example, banking software that calculates interest on your account or determines whether a 401k withdrawal incurs a penalty.
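To see why the missing digits matter for time-based calculations like interest, here is a minimal, illustrative sketch (not taken from any actual banking system) of how naive subtraction on two-digit years breaks at the century rollover:

```python
# Hypothetical example: elapsed-time math with two-digit vs. four-digit years.

def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Naive subtraction on two-digit years, as legacy code might have done it."""
    return end_yy - start_yy

def years_elapsed_four_digit(start_year: int, end_year: int) -> int:
    """The same calculation with full four-digit years."""
    return end_year - start_year

# An account opened in 1999 ("99") and checked in 2000 ("00"):
print(years_elapsed_two_digit(99, 0))        # -99 -- nonsense for interest math
print(years_elapsed_four_digit(1999, 2000))  # 1 -- the intended result
```

Any interest formula fed that negative elapsed time would produce garbage, which is exactly the class of bug the Y2K remediation effort was chasing down.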

Click for more details of the Y2K bug that caused mass hysteria.


Killer GPS

“We found that single-vehicle collisions comprised the majority of crashes (51 cases, 32% of overall incidents), with crashes with other vehicles (26 cases, 17%) and crashes with pedestrians and bikes (13 cases, 8%) making up the remainder of crash incidents.”

Lin, Allen, Kate Kuehl, Johannes Schöning, and Brent Hecht. “Understanding ‘Death by GPS’: A Systematic Analysis of Catastrophic Incidents Associated with Personal Navigation Technologies.” 2017, doi:10.1145/3025453.3025737.

People driving to unfamiliar places trust GPS algorithms to get them to their dream destinations, but some end up in nightmare situations. Humanity has been traveling the great unknown for millennia, from primitive methods like following the North Star to modern-day GPS applications. GPS has made traveling so much easier that we sometimes take for granted the complexity involved in finding a path to our destination. But is it the safest? 

One fatal account comes from a Canadian couple who set out in 2011 on a road trip from their home in British Columbia to Las Vegas, depending on GPS to navigate their way to Sin City. Sadly, the couple took a route through the desert that the GPS suggested was the best one, and they got stuck in thick mud on the rough terrain they encountered along the way.

The husband, Albert Chretien (59), left his wife, Rita Chretien (56), in the vehicle while he went to search for help getting it out of the mud. Rita was found seven weeks later in the vehicle, 30 pounds lighter; she was rushed to the nearest hospital and recovered fully. Albert, however, was found dead about a year later by hunters.

For more details of the tragic accident caused by GPS error


Self-driving to Death

“Krafcik (John Krafcik, CEO of Waymo (owned by Google’s parent)) said that you have to be sensitive to the real losses people suffer in accidents such as these. But, he added, achieving the bigger picture — eliminating the roughly 35,000 annual auto fatalities, largely due to driver error — means not being deterred by the “bumps in the road” to accident-free driving. Krafcik was basically putting a fresh coat of paint on an old, rarely spoken platitude: People must get killed en route to a better, safer transportation system.”

Korman, Richard. “Give Us the Risk Reality.” ENR: Engineering News-Record, Aug. 2018, p. 52.

The future certainly is here, and the wealthiest man in the world now owns one of the trailblazing companies introducing “autonomous” machines to our roads. However, with “beta” software deployed in Tesla’s Autopilot, one has to question how transparently the risks are communicated to consumers and the general public: these algorithms control two-ton metal machines armed with li-ion batteries, doing 70 MPH on a road near you. The marketing is genius, but it puts people in danger by leading customers to believe a Tesla is a “self-driving” vehicle. On the six-level classification of autonomous vehicles, where 0 is no automation and 5 is full autonomy, a Tesla is, at best, level 2.

Taken in by the slick marketing, one Tesla customer relied too heavily on these “self-driving” features and died in a collision with a truck.

Click for more details of the first fatal autonomous car accident 


Bye, Bye First Amendment

“According to internal materials reviewed by The Intercept, Dataminr meticulously tracked not only ongoing protests, but kept comprehensive records of upcoming anti-police violence rallies in cities across the country to help its staff organize their monitoring efforts, including events’ expected time and starting location within those cities. A protest schedule seen by The Intercept shows Dataminr was explicitly surveilling dozens of protests big and small, from Detroit and Brooklyn to York, Pennsylvania, and Hampton Roads, Virginia.”

Biddle, Sam. “Police Surveilled George Floyd Protests With Help From Twitter-Affiliated Startup Dataminr.” The Intercept, 9 July 2020, theintercept.com/2020/07/09/twitter-dataminr-police-spy-surveillance-black-lives-matter-protests.

What should we do when we witness injustice? Usually, we think of spreading information, organizing, and protesting. However, since we do this over the World Wide Web, the powers that be also have access to the ideas we are spreading. With social media platforms already tracking and cataloging our data, suppressing activism has become much easier. For instance, a company called Dataminr reportedly tracked BLM activists on social media and helped police know of these activists’ planned activities.

Click for more information on the Dataminr story


These four events have occurred, yet we are still alive and breathing. Still, we should reflect on them so that we learn not to place blind faith in applications or their creators. We trust the applications we use so much that we often neglect to read the terms and agreements. We are quick to click the “I Agree” checkbox and the “Next” button to finally access the latest application everyone is buzzing about, without knowing what we traded away for that access. The old adage goes, “if something seems too good to be true, it probably is,” yet we haven’t applied it to the technology we use.

Life is more complex than any algorithm can currently predict or handle. We, the human element, are the counterbalance that ensures technology and technology companies do not make life-altering decisions without our final say. Using our empathy and compassion, which is what separates us from machines, we will need to provide the checks and balances necessary to prevent questionable use of the algorithms that are so integrated into our lives. I ask that my fellow software engineers, and the tech companies that employ us, be more transparent about the AI being deployed on society, so that the public can make its own risk assessment of decisions made by AI that may be trained on biased or skewed data sets.
