A recent report published by the European Trade Union Institute (ETUI) titled Algorithmic management and collective bargaining highlights various risks associated with artificial intelligence (AI) and algorithmic workplace management tools. These have been implicated in contentious automated hiring-and-firing processes, as well as in extensive monitoring of employees’ personal lives.
Forms of workplace surveillance and monitoring, known as “scientific management”, have been in place since the industrial revolution. The ETUI report notes, however, that there has recently been a “qualitative leap” in technology’s ability to control and subordinate employees, which “exceeds the capacity of any past human supervision”.
The industrialisation of modern work
In recent years, AI and algorithmic workplace surveillance have been aggressively deployed, particularly in platform work. The starkest example was Amazon’s introduction of wearable devices for factory workers. These track employees’ productivity and even the duration of their bathroom breaks, with repeated violations sometimes resulting in automatic contract termination. The Verge reported that, between August 2017 and September 2018, one particular Amazon facility automatically fired over ten percent of its workers for sub-par productivity. Uber, meanwhile, has become notorious for “robo-firing”: the automatic dismissal of thousands of drivers on the vague pretext of “fraud”, which can include simply turning down rides.
Platform work served as a testing ground for tech-driven employee monitoring. With the onset of the Covid-19 pandemic, the practice is now entering other sectors on a mass scale. This represents a major turning point for labour rights.
An April 2021 survey by ExpressVPN found that 78 percent of employers were using monitoring tools to track employee performance or their online activity. 51 percent had started using the surveillance software in the previous six months.
The methods by which employees are surveilled continue to expand. They include tracking emails, private chat windows, GPS location monitoring, and even impromptu screenshots to ensure workers are at their desks.
Such monitoring has created an unprecedented quantity of data to be managed, leaving companies reliant on automated decision-making that is vulnerable to inaccuracies. Such errors can persist indefinitely in the digital sphere, with the potential to damage careers. The situation also undermines those human values and needs that are incompatible with an algorithmic approach driven solely by profit and productivity targets. The normalisation of such tools opens the way to surveillance that reaches deep into employees’ private lives, and to judgments that can be used against them at work.
“Everyone now has a form of device that is permanently connected to their workplace, which wasn’t the case 15 years ago,” says Valerio De Stefano, professor of labour law at KU Leuven. This “has the potential to make us constantly connected for 24 hours to some device or software which gives feedback on what we do beyond our professional life.”
One particularly concerning overreach is the tracking of employees’ physical and mental health. For example, voice recognition software can be used to collect emotional data, and AI-powered facial scans can gauge employees’ eagerness to take on tasks.
Some companies even impose wearable devices that can track heart rate, stress levels and sleep patterns as part of “wellness” packages. Constant connection to such devices is leading to what the ETUI report calls a “blurring of work and private life”.
“There are employers who offer rewards when the sleep app signals that the worker regularly sleeps eight hours per night, and other programs that encourage workers to have a regular calorie intake. All these are not the business of the employers,” De Stefano says. “Employers should not know about those things, should not pressure people to lead a life that is not of their choosing.”
Tools that monitor such sensitive personal health data are claimed to improve workers’ wellbeing. But constant surveillance and algorithmic judgment in the workplace risk having the opposite effect, exacerbating an already serious mental health crisis brought on by the response to Covid-19.
The ExpressVPN study found that 59 percent of workers reported “feeling stress and/or anxiety about their employer surveilling their online activity”, with “workers constantly wondering whether they’re being watched” (41%) and “feeling more pressure to be online than doing productive work” (38%) the most common reasons cited.
The lack of transparency surrounding the deployment of workplace algorithmic tools looks set to generate distrust. 81 percent of employees were already using one or more employer-provided devices, yet just 54 percent were aware of being monitored.
“Because of a lack of transparency, workers will have to obey a system which they’re not aware of how it works. So first they have to imagine how the system works and then change their behaviour according to what they think the system will do,” Nicolas Kayser-Bril of AlgorithmWatch told Voxeurop.
“This creates a new layer of consciousness. You have to judge your own actions based on what you think the system does. And this of course creates much more stress for the workers involved,” he added.
A draft EU Regulation on Artificial Intelligence fails to provide clear guarantees that workers’ rights are to be protected against such modern surveillance practices. The implementation of safeguards has been left entirely to employers.
A major problem with this, according to De Stefano, is that “the average employer in most cases is not even aware of the risks that are involved with some of the technologies that they use.”
Additionally, EU-wide legislation would supersede national laws, some of which already provide better protection against the risks of data collection.
“This regulation is not nearly protective enough for workers [as] it doesn’t provide for the involvement of unions or employers’ associations in negotiating what goes on in the workplace,” De Stefano adds.
Without proper safeguards, modern surveillance tools risk creating workplaces where employees are treated more like cogs in a machine than members of a team. The temptation in the digital age to quantify every aspect of life has been exploited as an ostensibly reasonable way to optimise output and profit. But algorithms can never measure the value of every human trait, including those that stand to be made redundant.