
Alexandra Mateescu has a complicated relationship to data

The Data & Society researcher on the uses and misuses of workplace data—and why nothing is inevitable

January 22, 2024

Alexandra Mateescu is a researcher at Data & Society—the independent nonprofit research organization studying the social implications of data, automation, and AI—where she is part of the Labor Futures Initiative. There, her fieldwork, alongside her collaborations with labor advocates, informs her research on the ways technology and digital surveillance affect worker rights and worker power in the United States. She has written for The Guardian and Slate, among many other outlets. Her most recent Data & Society publication, released in November 2023, is Challenging Worker Datafication.

This interview has been edited for length and clarity.

What are you working on right now?

I just recently finished writing Challenging Worker Datafication, an explainer that’s a sequel to two others I had co-authored with my colleague Aiha Nguyen a couple years ago: Algorithmic Management in the Workplace and Workplace Monitoring & Surveillance. Now our team is thinking about next steps. How do we grapple with the emerging responses to worker surveillance? A lot of public discourse treats worker surveillance as an inevitability, or assumes that workers have lost the battle. Within the Labor Futures Initiative at Data & Society, we’ve been grappling with issues around worker surveillance in a lot of different contexts—from retail to care work to workers in Amazon warehouses. They’re often experiencing common issues, even if they seem like they are very different workplaces.

What does work mean to you?

A lot of what we do throughout life is work. I was recently reading this multi-volume series called The History of Private Life, which is about the ever-changing distinctions between public and private life throughout history. Our conception of the “workplace” as a place distinct from home life and other types of mundane human activity is relatively new. I’ve been working from home since 2020, so I feel like the distinction between work life and private life is much more blurry, and I’ve struggled with those boundaries.

Do you have a framework for your career, or for the kind of work you do? Have those goals changed over time?

My background is in anthropology, and my orientation as a social scientist is that I prefer to talk to people directly. My research sits at the intersection of ethnographic work, policy, and public narratives around work and worker rights. A lot of the discussions about the “future of work” treat workers as an abstraction, or only include them in token ways. Yet workers and worker-led organizations have written insightful analyses and reflections, and journalists and researchers have done deep-dive investigations with workers, all of which surface more nuanced and complex narratives.

What initially brought you to the subject of workplace surveillance?

Workplace surveillance is ubiquitous, so it’s hard not to encounter it when studying people’s experiences in the contemporary workplace. It’s also often not recognized as surveillance in the way we recognize and criticize state or consumer surveillance. Workplace surveillance has long existed, but the scale is different now: digital surveillance and data collection are embedded and normalized within every aspect of work.

A lot of the discussions about the “future of work” treat workers as an abstraction, or only include them in token ways.

I remember being struck, when Data & Society hosted a convening [on this topic], by how folks from a lot of different [professional] backgrounds—agricultural laborers, athletes, domestic workers—were all experiencing the same minute and continuous forms of surveillance. The lack of transparency, the data-driven productivity pressures, and the sense of losing power and agency over one’s work were all widely expressed issues. I’ve also experienced surveillance as a worker myself, having worked in call centers, retail, and a variety of service jobs. The most hyper-surveilled workplace I worked in was a call center: someone’s listening to your call, there’s software monitoring your every move, the atmosphere is very punitive. I’m thankful my workplace has been very mindful of these issues, including paying attention to the privacy terms of tools we use, like Zoom and Slack.

What’s the difference between data collection and surveillance?

They are connected but not entirely synonymous. Fundamentally, data is just a way to gain and organize knowledge about a workplace. The question is, who has power over that data and who decides what the data means, where it goes, and how it is used? Employers collect data through workplace tools and HR systems, health insurers collect data, government agencies do the same in order to regulate workplaces, and worker organizations and labor unions have always collected data about their membership to support and inform their advocacy. There are a lot of different kinds of data that circulate in and through workplaces, but what defines surveillance is the power dynamic. These tools often further entrench older dynamics of exploitation and control, erode hard-won labor rights, and drive down workplace standards.

What did you uncover in Challenging Worker Datafication that you found new or surprising?

I found it a lot more challenging to answer the questions I sought to ask [than I expected]. What do we even mean when we talk about worker data? The more I dug, the more complicated the question seemed. What do we want from worker data? What do workers want from worker data? Who are the scholars working on these issues?   

You mentioned how surveillance is a shared experience across many different types of industries and many different types of roles. What does, say, a member of SAG-AFTRA have in common with a truck driver or Amazon warehouse worker?

I do think it’s worth mentioning first the ways that different workers might experience surveillance very differently. A white-collar office worker might feel like a productivity tool that quantifies everything she does is great, and helps her stay on track. But her colleague who was hired through a temp agency might feel terrified she’ll get fired if she doesn’t work fast enough. Surveillance is going to have much bigger stakes for marginalized workers and those who are precariously employed, whether that’s because they’re independent contractors or immigrant workers or workers of color, or because they don’t have any representation from a union, and so on.

What defines surveillance is the power dynamic.

What they do all have in common is that workers in the United States generally have very few privacy rights when it comes to the workplace. So while there’s piecemeal legislation at the state level that limits the scope of surveillance, employers often don’t have to notify workers or obtain their consent before conducting surveillance. They often don’t have to explain what it’s for. They don’t even have to explain how the data informs decision-making about workers. In recent years, there have been efforts to build protections around informed consent and transparency into regulation, like giving workers the ability to opt out of certain types of data collection or uses of data, but it’s not always easy to opt out of data collection as a worker.

The second big thing that many occupations share in common is that surveillance tends to magnify the already existing labor issues within any given industry. With the SAG-AFTRA strikes, for example, there have always been tensions in negotiating the connection between creative ownership and compensation. Now generative AI tools have raised the stakes. Can studios own an actor’s face, or their digital replica, or their voice? In the case of Amazon warehouse workers, workplace safety and productivity pressures have always been an issue in the warehouse industry. The ubiquitous surveillance—which ends up imposing these impossible-to-meet quotas on employees, [which] leads to higher injury rates—exacerbates issues that have always been latent.

How can workplace data collection help make workplaces safer or better for workers?

There’s a distinction between data that’s collected to meticulously quantify employees’ productivity in order to push them to work harder and faster, and data that’s always been collected about workers’ experiences and workplace conditions.

There have also been efforts to make better use of data that already exists. One recent project from the immigration news nonprofit Documented, called the Wage Theft Monitor, pieces together disparate data sets about wage theft from across New York State.

In Challenging Worker Datafication, I talked about worker-led data collection during COVID-19, where employees were crowdsourcing COVID infection rates at their workplaces to get a sense of how safe they were there, although these efforts were a response to broader institutional failures to protect workers. Workers have also been building tools to collect data, particularly in areas like the gig economy where the platforms often aren’t very transparent: crowdsourcing data on things like tips and tip theft.

But with all of these examples, having the data is only half the battle, because it’s also about enacting larger structural changes that require power, resources, and influence. And there’s always a risk of employers or regulators treating data collection and metrics as a proxy for action while maintaining the status quo. [They can say] “Oh, we’ll just collect more data” without actually fixing the problem.

But there’s no reason why data collected about workers can’t be turned around to scrutinize and challenge employers’ practices. Data about workers also tells us about management practices and workplace conditions as set by employers themselves.

Since data itself is neither inherently good nor bad, how do we collectively decide—on a company level, governmental level, movement level—what to curb (like biometric data) and what to unlock (like salary transparency)? Is there a framework you use when you are thinking about different kinds of data?

The answer to this might be book-length! Overall: What does the data collection do in context? What is its end goal?

Sometimes the harm comes from how the data informs decision making, sometimes it comes from flawed assumptions about what data can tell you, sometimes it’s the act of data collection itself that can be very harmful. Data doesn’t just produce itself: you need to change conditions in your workplace in order to make bodies and people and actions amenable to datafication. And sometimes the problem isn’t really about data or technology—it’s about capitalism and the existing extractive relationships that end up disempowering workers.

The legal scholar Veena Dubal talks about the notion of data abolition [in response to] data harvesting practices [that are] fundamentally unjust in ways that can’t be fixed. That’s where bans on data collection may be most appropriate. For example, advocates have argued that facial recognition simply doesn’t have uses where the benefits outweigh the harm. 

Public discourse acts as though we’re all passive subjects merely experiencing inevitable technological change.

Another path is to bring in more accountability [from a more mixed group of people]. That might mean subjecting workplace technologies like hiring algorithms to independent audits, or ensuring workers and unions have access to the data being collected or take the lead in shaping [the] technologies [that collect and use it], as well as strengthening worker rights and protections through legislation and enforcement.

More broadly, I think workers have a complicated relationship to data and to technology as a whole. They might want ownership over the data, they might want control, they might want outright refusal—or something else. These [are] questions that can’t be answered without asking workers themselves what they want. 

What about your work—or your field—is most surprising to you right now?

Maybe “surprising” isn’t the word; maybe it’s more “disappointment.” I’m definitely frustrated with the way public discourse about the future of work, including journalism, acts as though we’re all passive subjects merely experiencing inevitable technological change. As if it’s something that just happens to us, rather than something that we can actively shape in the present. I’m always surprised that the same overblown doomsday claims get repeated over and over again as if they’re new: that workers are all hyper-surveilled and that’s just the way it is, or that generative AI is going to eliminate all jobs and that’s just the way it is. Or that workers have resigned themselves to these changes.

It’s funny thinking about that following a year of extraordinary labor victories.

Right.

What is the toughest challenge you—or your field—currently face? 

I don’t know if this is the toughest challenge, but it’s one that comes to mind. Like a lot of social scientists who study technology and investigative journalists [who report on it], I struggle with the immense lack of transparency from companies. A lot of my earlier research looked at the gig economy—workers who find work on apps like Uber and Doordash. These apps really depend on information asymmetries to exert control over their workforces, which means that workers often don’t have a lot of insight into how they’re being algorithmically managed or how decisions are made about them. So workers are playing this detective game of taking screenshots, crowdsourcing information on forums, and speculating about what they think happened when they got deactivated. And then you come in as a researcher, and you also play the detective game to figure out how these apps [actually work]. As an outside researcher, the opacity of these systems, whether it’s a gig platform app, scheduling software, a hiring algorithm, or a productivity tool, is very challenging.

