My job had me go through a training session on “unconscious bias in interviewing,” which I found interesting in ways both expected and unexpected. I expected to be reminded of biases involving appearance, gender, age, voice, accent, nationality and so forth, but there were also notions particular to interviewing that apply to other situations as well. For instance, there is a tendency to weigh the last part of the interview more heavily – a candidate stumbling over a question at the end leaves a deeper impression than an earlier stumble from which they later recovered, even though every moment should count equally. One well-answered question can also override a multitude of poorly answered ones – this is called “the halo effect.” And we compare the person we are currently talking to with the people we most recently interviewed – that recency should carry no extra weight, but there it is: the “contrast effect.” It’s important to be aware of every kind of bias, yet there are so many! It’s hard to keep track.
We build our biases into our systems, often just as unconsciously as we apply them in our daily lives or in situations like interviews. I was recently working on a machine learning project to determine, by means of sensors and software, whether a residence is currently occupied. Motion sensors relay data throughout the day and night to a backend service, and a machine learning algorithm applies its initial model – gained through a training set – to the incoming information, producing probabilities of occupancy state. If little or no motion is detected throughout the day, the algorithm concludes that no one is at home, but given the same data throughout the night, the algorithm will decide that the occupants are sleeping. You can see a number of built-in biases here – that humans are diurnal creatures, that they have day jobs, and that those jobs are outside the home. It’s also worth noting that the time zone reported in the data is critical. It’s astounding to me how high the proportion of software bugs in such systems is because of errors involving time zones! How can the program adapt for homes where someone works a graveyard shift, or follows some other non-standard routine? How can every exception possibly be accounted for without either severely diluting the criteria or creating configuration confusion?
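The kind of rule such a model effectively learns, and its sensitivity to time zones, can be sketched in a few lines. This is a hypothetical illustration, not the project’s actual code: the function names, thresholds, and the night-time window are all assumptions made up for the example.

```python
from datetime import datetime, timezone, timedelta

def local_hour(utc_timestamp, utc_offset_hours):
    """Convert a UTC timestamp to the home's local hour.
    Getting the offset wrong can flip 'away' into 'asleep'."""
    tz = timezone(timedelta(hours=utc_offset_hours))
    return utc_timestamp.astimezone(tz).hour

def occupancy_guess(motion_events_per_hour, utc_timestamp, utc_offset_hours):
    """A caricature of the learned rule: low motion by day means 'away',
    low motion by night means 'asleep'. The night window below is the
    built-in bias that humans sleep between 22:00 and 06:00."""
    hour = local_hour(utc_timestamp, utc_offset_hours)
    is_night = hour < 6 or hour >= 22
    if motion_events_per_hour < 2:          # little or no motion detected
        return "asleep" if is_night else "away"
    return "home"

# Identical sensor data, two assumed time zones, opposite conclusions:
ts = datetime(2024, 1, 15, 10, 0, tzinfo=timezone.utc)  # 10:00 UTC
print(occupancy_guess(0, ts, 0))    # 10:00 local -> "away"
print(occupancy_guess(0, ts, -8))   # 02:00 local -> "asleep"
```

The last two lines show why the time zone is critical: the same stream of zero motion events is interpreted as an empty house or a sleeping household depending purely on which local clock the backend believes in – and the graveyard-shift worker is misclassified either way.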
If we can’t help but build some biases into our machine learning systems, then considerations about the future of artificial intelligence have to include such flaws. Sophisticated computer programs are just as liable as we humans are to “leap” to conclusions based on their limited experience, their sample sizes, and the biases built into their training data sets – and we do that every single day. Even a setting as routine and commonplace as a job interview is filled to the brim with pre-loaded implications. What will we think of Artificial Intelligences that are inherently conformist, stuffing people into tidy little cubbyholes based on arbitrary biases? We are already beginning to come across such examples in our everyday lives as more and more “intelligence” is built into our smart-phones and other gadgets. We start a search term and instantly completion-suggestions are brought up – just start typing “why do gi” in a search bar to see what the world thinks you want to know. The algorithm is only spitting out the likeliest choices, which simply come from the multitude of previous searches, so that ultimately we have no one to blame but ourselves, but still, the reinforcement effect is strong. Suddenly you find yourself wondering why everyone seems to think that “girls” are bleeding cheaters who always fall for creeps.
Ultimately machines will learn the way they are taught to learn, which is the way we all learn, which is to filter, sort, and select what we secretly wanted in the first place. We choose that which looks like us, acts like us, feels like us, thinks like us, agrees with us, feels comfortable to us, which is why you’ll find zero Black engineers working at Twitter today. Bias, conscious and otherwise, is the road most travelled, the well-worn groove. As Karl Marx wrote: