“The web is more a social creation than a technical one…I designed it…to help people work together” – Tim Berners-Lee, 1999.
Technology is the root of all our ills. I’m hearing this more and more. A few months ago at The Engine Shed, during a panel discussion about tech, one of the panelists said, “Technology is bad”, or words to that effect.
It’s not hard to see why this view is becoming more prevalent. Over the past few years, an increasing number of books and articles have appeared alerting us to the dangers of technology. Stories revealing that, instead of the utopian dream envisaged by Tim Berners-Lee all those years ago, technology is helping to shore up privilege, increase inequality, and exclude some of our fellow humans.
You know the stories. Apple’s iPhone X ‘racist’ facial recognition software meant a Chinese woman’s colleague could unlock her phone. Google Photos, which automatically applies labels to pictures in people’s photo albums, classified images of black people as gorillas. Nikon’s S630 camera wrongly flagged a photograph of a smiling Asian woman as someone who might be blinking. Google’s ad targeting system showed fewer adverts for highly-paid jobs to women than to men. Risk-assessment software used by American judges and parole officers rated black Americans as more likely to reoffend than white Americans, meaning they received harsher sentences or waited longer for parole.
The New York Times, in an article on prejudices being built into artificial intelligence, declared: “This is fundamentally a data problem”.
It’s a data problem. It’s an AI problem. It’s a technology problem.
I should have put my hand up and explained to that panelist: it isn’t the technology that’s the problem; it’s the people building and creating the technology, the data, the systems, the algorithms. Smart, creative people who are ultimately human, with biases and prejudices, whether conscious or unconscious. Well-intentioned people, for the most part, who are amplifying inequality.
As a social anthropologist, it worries me. It might worry you too because (like me) you care about inequality. It should also worry you because you might be designing products for the few, not the many, products that (inadvertently) alienate some of your customers, or exclude them, or make their lives harder or worse.
I don’t know what the solution is. Diversity training is one suggestion. After an incident that went viral, 175,000 Starbucks employees in the US underwent racial bias training. However, challenging a person’s self-image as someone exempt from racism (or sexism), challenging their privilege, is very different from talking about it in the abstract. It’s a very human thing to refuse to face directly that which we have not experienced ourselves. Overcoming that takes longer than a few days’ training. Better still is to increase diversity in the workplace, and for people to be genuinely open to acknowledging their privilege and prejudices.
What I do know is that something has to change so that technology is built for the many, not the few. And I have hope because, as a social anthropologist, I know that culture changes over time. We might not see change straight away. It happens incrementally. You might not be alive to see it. But your grandchildren or your great-grandchildren just might.