Opinion: Are AI workplace systems that monitor employees reducing people to numbers?

Social anthropologist Dawn Walter argues that AI systems that measure employee productivity are misguided because they reduce people to numbers and ignore context.
23rd April 2019

Companies have long collected data about their employees in order to boost efficiency and increase productivity. We tend to associate these systems with blue-collar workers in production-line settings such as car factories or large warehouses. With the vast amounts of digital data that can now be collected, and particularly with advances in AI (machine learning), it’s now possible for companies such as Amazon and Walmart to use data from wearable technologies – bracelets or badges equipped with sensors – to determine whether employees are performing their jobs efficiently and correctly.

But is workplace monitoring going to become the norm for white-collar workers too?

In early April, I read a news story about some UK companies – including five law firms, a training company, and an estate agency – that were using a system called Isaak, which uses artificial intelligence to collect real-time data on employee behaviour. This data is used to determine when people are overworking and how well teams are collaborating, based on information such as who is emailing whom and who is working outside of “normal office hours”.

While the issue that probably springs to mind for most of us is ‘surveillance’, what also concerns me as an anthropologist is ‘context’. All the data being collected in these different ways – wearable tech and computer usage – is aggregated, analysed, and stripped of its context. For me, these systems are reducing people to numbers.

The creators of these AI systems no doubt believe the data being produced about employees is objective and unbiased. Indeed, the CEO of the company behind Isaak wrote that “the issue with traditional systems of measurement” such as “timesheets, punch-cards, progress reports and employee surveys” is that “they are open to biases”.

However, machines, systems, and data are not neutral, as I’ve stated before.

These workplace systems are reassuring to management because they give the appearance of certainty, converting the messy complexity of human employees into the seeming objectivity of numbers and graphs, and removing the uncertainty from decision-making about human behaviour. They appeal because they promise to eliminate the subjective element of management judgement – a way to make decisions based on rational reasons rather than values or biases.

But these systems, in offering up their seemingly objective data, ignore individual specificity and context in favour of superficial knowledge such as the number of emails sent or hours worked. Sure, such systems may provide useful overviews of employee performance or enable quick comparisons, but they ignore and conceal the specificity of employees’ lives. The dad who is late to work because his child is teething. The mum who’s emailing at 7 pm because she had to leave early to collect her sick child from kindergarten. Context is everything.

And what is omitted when measures are created? A system measures some things and ignores others: it may measure emails sent, but not thinking time spent away from the computer, as has been pointed out.

Although statistics might make comparisons between employees easier, they are, in some senses, meaningless – even misleading – because a common metric is being applied across teams, implying that all employees are alike.

So while these workplace systems may enable companies to monitor employees more easily and provide them with the data needed to boost productivity, my concern is around individual specificity and context. And I can’t help but wonder if all this is leading to a new era in employer-employee relations: reducing people to numbers.