Covid-19 and its after-effects remain very much with us. As the world locked down in early 2020, it moved online – and with this digital surge, as an urgent new report from the Institute for Public Policy Research thinktank shows, came a dramatic increase in electronic workplace surveillance. Complaints from workers subjected to the beady digital eye are multiplying, and the number of large firms monitoring their employees was estimated by management consultants Gartner to have doubled since the start of the pandemic. One survey, cited by the BBC, suggested 80 per cent of employers are using some form of digital surveillance.
Some of this is driven by bosses’ fears, if not outright paranoia, about what their charges get up to when working from home. At the peak of the lockdowns, 49 per cent of people in employment in Britain were working from home for at least part of a week, well up from 12 per cent just before the pandemic. That shift has had long-lasting impacts – official figures today show that around 40 per cent of adults in employment report having worked from home in the last seven days. Despite cajoling and wheedling and occasional threats to get people back to full-time office work, working from home in some form is here to stay for millions of us.
The temptation for at least some employers has been to extend the possibilities for monitoring and surveillance that an office provides into employees’ homes.
Laptops can log keystrokes and monitor software use over time. The digital camera in modern machines isn’t just useful for endless Zoom calls – it can provide intensive monitoring of employee behaviour. The appropriately named Sneek software sits on an employee’s computer and jams their camera on, sending a picture to the software’s “wall of faces” as frequently as every minute. The company selling the package seemingly believes this will digitally reproduce the allegedly friendly open-plan office environment of pre-pandemic capitalism, with everyone able to see everybody else’s happy and diligent faces at their desks. The reality is that an always-on camera allows continual surveillance by managers of their workers. Sneek boasted to Business Insider that sales had grown “ten-fold” in the first months of the pandemic.
Fujitsu has developed an AI model that can monitor facial expressions to detect signs of concentration on a task. The company claims this is “85 per cent accurate”, although it’s not clear if 100 per cent accuracy would be more or less troubling. Fujitsu plans to roll the software out for “online classes, online meetings, and sales activities”. It’s not just that you need to be on camera the whole time – you need to look busy while you’re there. And biometric monitoring, typically including the use of fingerprint data to confirm a worker is at a desk, is becoming increasingly popular in the US.
But as the IPPR describes, surveillance can be more intense away from office work. Filings by Amazon to US regulators, for example, revealed that the handheld scanners its workers use to track packages in its warehouses also track what the company calls “time off tasks” for each worker. Missing more than 30 minutes from “tasks” in a single day provokes a written warning; doing so on three days in any one-year period can mean dismissal.
Like a high-tech Victorian workhouse, Amazon’s spreadsheet registers employees’ time for toilet breaks, and such dreadful infractions as “talking to a colleague”. The company has patented a design for a wristband that can track workers’ locations in a warehouse, and “nudge” them, using vibrations, to head in the right direction. Its plans for Whole Foods, bought by the company, include the use of infra-red cameras to monitor employee locations in-store.
The risk the IPPR highlights is that this capacity to intensely monitor workers provides a huge volume of data to be processed by opaque and poorly understood algorithms. These are then used to “optimise” worker behaviour. But such big data can create forms of “algorithmic bias”. When the government attempted to assign exam grades by algorithm in 2020, after Covid-19 forced exams to be cancelled, the algorithm unintentionally favoured private school results, and protests by students erupted across the country.
The problem of surveillance isn’t only the act itself – it’s also the way the possibility of surveillance starts to affect people’s behaviour. This is a very old idea: in the late 18th century, the philosopher Jeremy Bentham devised what he thought would be the perfect prison, the Panopticon, in which prisoners would be held in cells completely open on one side, facing a guard tower. The prisoners would never know if they were being watched – but always knew they could be, and so, Bentham thought, would behave themselves.
One worker, interviewed by the IPPR, reported a manager would often threaten to “get the cameras on you” – not that they were being monitored now, but that they could be. Bentham would be delighted.
What this shows is that surveillance is ultimately about power. Reinforcing existing rights over data, and creating new ones like a “right to disconnect”, as already in place in France, would help tip the balance of power back towards employees. But it is workplace unions, like those Amazon workers are now trying to form, that are the biggest barrier to abuse on the ground.