Bias, Conscious and Otherwise

My job had me go through a training session about “unconscious bias in interviewing”, which I found interesting in ways both expected and unexpected. I expected to be reminded of biases involving appearance, gender, age, voice, accent, nationality and so forth, but there were some notions particular to interviewing which apply to other situations as well. For instance, there is a tendency to weigh the last part of the interview more heavily – a stumble over a question at the end counts against a candidate more than an earlier stumble from which they recovered, though every moment should count equally. One well-answered question can also override a multitude of poorly answered ones – this is called “the halo effect”. And we compare the person we are currently talking to with the people we most recently interviewed – that recency deserves no extra weight, but there it is: the “contrast effect”. It’s important to be aware of every kind of bias, yet there are so many! It’s hard to keep track.

We build our biases into our systems, often just as unconsciously as we apply them in our daily lives or in situations like interviews. I was recently working on a machine learning project to determine, by means of sensors and software, whether a residence is currently occupied. Motion sensors relay data throughout the day and night to a backend service, and a machine learning algorithm applies its initial model – gained from a training set – to the incoming information, producing probabilities of occupancy. If little or no motion is detected throughout the day, the algorithm concludes that no one is at home, but given the same data throughout the night, it decides that the occupants are sleeping. You can see a number of built-in biases here – that humans are diurnal creatures, that they have day jobs, and that those jobs are outside the home. It’s also worth noting that the time zone reported in the data is critical; it’s astounding to me how high the proportion of software bugs in such systems is because of errors involving time zones! How can the program adapt for homes where someone works a graveyard shift, or some other non-standard routine? How can every exception possibly be accounted for without either severely diluting the criteria or creating configuration confusion?
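A minimal sketch of the kind of rule such a model effectively bakes in – the function name, thresholds, and probabilities here are all invented for illustration, not the actual project’s code:

```python
# Hypothetical sketch: the day/night prior below is exactly the built-in
# bias described above -- it assumes occupants sleep at night and spend
# the day working outside the home.
def occupancy_probability(motion_events_last_hour: int, local_hour: int) -> float:
    """Return P(occupied) for a naive sensor-based model."""
    if motion_events_last_hour > 0:
        return 0.95  # any motion: almost certainly someone is home
    # No motion: the interpretation depends entirely on a time-of-day prior,
    # which is why getting the time zone wrong silently inverts the answer.
    if 22 <= local_hour or local_hour < 6:
        return 0.90  # night + no motion -> "they must be asleep"
    return 0.10      # day + no motion -> "nobody home"

# The same silence means opposite things depending on a baked-in schedule:
print(occupancy_probability(0, 3))   # 0.9 (assumed asleep)
print(occupancy_probability(0, 14))  # 0.1 (assumed at work)
```

A graveyard-shift worker generates exactly the sensor data this sketch reads as “asleep at night, out by day” – the bias lives in the prior, not in the sensors.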

If we can’t help but build some biases into our machine learning systems, then considerations about the future of artificial intelligence have to include such flaws. Sophisticated computer programs are just as liable to “leap” to conclusions based on their limited experience, their sample sizes, and the biases built into their training data sets as we humans are every single day. Even a setting as routine and commonplace as a job interview is filled to the brim with pre-loaded implications. What will we think of Artificial Intelligences that are inherently conformist, stuffing people into tidy little cubbyholes based on arbitrary biases? We are already beginning to come across such examples in our everyday lives as more and more “intelligence” is built into our smartphones and other gadgets. We start typing a search term and completion suggestions instantly appear – just type “why do gi” into a search bar to see what the world thinks you want to know. The algorithm is only serving up the likeliest choices, which simply come from the multitude of previous searches, so ultimately we have no one to blame but ourselves, but still, the reinforcement effect is strong. Suddenly you find yourself wondering why everyone seems to think that “girls” are bleeding cheaters who always fall for creeps.

Ultimately machines will learn the way they are taught to learn, which is the way we all learn, which is to filter, sort, and select what we secretly wanted in the first place. We choose that which looks like us, acts like us, feels like us, thinks like us, agrees with us, feels comfortable to us, which is why you’ll find zero Black engineers working at Twitter today. Bias, conscious and otherwise, is the road most travelled, the well-worn groove. As Karl Marx wrote:

“Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past. The tradition of all dead generations weighs like a nightmare on the brains of the living.”

Machine, Learning

For the past 25 years or so I’ve been a human learning to program computers. It’s been my day job for much of that time. My night jobs have included writing fictions of various stripes. I’m currently working on one about computers learning to program humans.

It’s a work in progress going on in Wattpad under the working title “Machine, Learning”, and so far consists of log file entries. The computer is controlling a star seed spaceship carrying colonists to a distant planet decades away. While the humans lie in stasis in capsules, encased in a minty fresh goo, two programs, a main and a backup operating system, set out to try and understand their cargo. It’s an adventure for them, but also for me as I try to apply my experiences in learning about an alien form of being. It’s an experiment that could easily fall flat, but then that’s true of all attempted art.

How to Be Rated

I recently wrote a couple of stories based on a dream I had, in which my younger brother had invented a system of perpetually rating everyone and everything in the world. The stories were a couple of variations on the theme, and it was my intention to write either more stories or a longer novella based on the idea. Instead, I have gone ahead and programmed the novel.

I have built the System.

I will soon unleash it on the unsuspecting world, where I strongly suspect it will languish in the same obscurity as its author. Correction, it will THRIVE in the same obscurity!

The app is called “Be Rated” and it will allow anyone, anywhere, at any time, to rate anyone or anything in the world. In the dream (and stories) the system rated from zero to one hundred, but in the app, mirroring the real world, the System uses the familiar 5-star ratings, complete with little gold stars to choose from.

The System will overcome some of the biggest peeves I personally have:

  1. You can rate the same things over and over again. The System will display historical data along with the averages (both personal and universal). In the real world now, you can only rate something once, but what if you liked it one day and hated it the next? There is no room for that in the current universe. The System solves this problem.
  2. You can find all the ratings for all the things in one place – The System. No longer do you have to go to Amazon or Yelp or Google or Goodreads or iTunes or SoundCloud etc. to find ratings of various things. All things are rated all the time in the same one place.
  3. It’s so easy, anyone can do it, and best of all, there is no required Sign In, no Accounts, no Passwords, no Personal Data of any kind (none required, that is – you will be able to sign in with third-party accounts if you want to view your own personal rating history).
  4. It should drive the whole world even more fucking crazy than it already is, because for the first time, People will be rated as well as Products, Businesses, and Places. That’s right. And not only People (meaning You – You will be rated!) but anything at all. Concepts can be rated. That’s right. You can give “Free Will” two stars if you want because really, is Free Will all it’s cracked up to be? Personally, I’ve been somewhat disappointed.
  5. If you think people are obsessing over their smart phones now, just wait until they are addicted to checking the ratings of their own personhood.
  6. There used to be no God but God, but soon there will be two: God, and the System.
  7. You’re welcome very much

(The stories are here: The System, Out of My System)
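Peeve #1 can be sketched as a data model: store every rating as an append-only event per (rater, thing) pair, so history and averages fall out for free. The class name and shape here are my own invention for illustration, not the app’s actual internals:

```python
from collections import defaultdict
from statistics import mean

class BeRated:
    """Toy sketch: append-only ratings with personal and universal averages."""

    def __init__(self):
        # (rater, thing) -> list of star ratings, in the order given
        self.history = defaultdict(list)

    def rate(self, rater: str, thing: str, stars: int) -> None:
        if not 1 <= stars <= 5:
            raise ValueError("stars must be 1-5")
        self.history[(rater, thing)].append(stars)  # never overwrite, only append

    def personal_history(self, rater: str, thing: str) -> list[int]:
        return list(self.history[(rater, thing)])

    def personal_average(self, rater: str, thing: str) -> float:
        return mean(self.history[(rater, thing)])

    def universal_average(self, thing: str) -> float:
        all_stars = [s for (r, t), v in self.history.items() if t == thing for s in v]
        return mean(all_stars)

system = BeRated()
system.rate("me", "Free Will", 5)   # liked it one day...
system.rate("me", "Free Will", 2)   # ...hated it the next
print(system.personal_history("me", "Free Will"))  # [5, 2]
print(system.personal_average("me", "Free Will"))  # 3.5
```

Because nothing is ever overwritten, both moods survive – which is the whole point.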

Data Mining Your Brain

Over the past few years there has been an explosion in the amount of data being transferred from users’ brains to permanent storage (a.k.a. “the cloud”), where it can be sifted through and analyzed by anyone with access to it. Typically, the ones with access are the major corporations that are doing the “hosting”. Like any good “host”, they soon know everything about you, down to the most intimate details, and have no compunction at all about selling that information to practically anyone who will pay for it.

Who knows everything there is to know about you? Google, Apple, Facebook, Amazon, Microsoft, Oracle – do any of these names sound familiar?

A decade or so ago I first got a hint of this when I worked on a project called “Interactive TV” at Sun Microsystems. It became clear early on that what they meant by “interactive” was “figuring out which advertisements to show you based on what you watched and responded to”. At the time it was only a computer corporation’s wet dream, but since then it has become commonplace. Anyone with Gmail has seen the page littered with what Google believes are appropriate ads, gleaned from the content of your personal correspondence. Facebook, with its billion users and deeply vertical penetration into their individual worlds, is a boiling cauldron of personal data leading to unlimited advertising potential.

More and more players are getting into the game, including, finally, publishers. Players like Amazon and Barnes and Noble, through their Kindle and Nook devices, are now able to tap even deeper into your brain – they can see which sentences people highlight. Just as there are companies which specialize in “search engine optimization”, there will soon be companies which help writers and publishers fine-tune their products down to the word level – they will know exactly which kinds of phrases are sure to get the lady erotica readers hottest, and which technical terms resound in the brains of teen sci-fi fanatics. There is really no end to the potential of tailorization. That thing that used to be known as “creativity” will finally be tackled and nailed down.

They know who you are and they know what you like. Your pleasure centers will be stimulated precisely and eternally, joy without end. Hallelujah.

problem, solution

software piracy (whether involving music, books, applications or operating systems) is a technical problem which will require a technical solution. treating it as a moral problem will never get you anywhere. you can scold people all you want and you will accomplish nothing. someday some super-engineers will solve the problem somehow, but whether they ever get the international cooperation needed for a truly comprehensive fix is another matter. in the meantime, you might as well stay off your high horse.