
Surya Mattu: Explaining the Unseen
As artificial intelligence continues to dominate headlines and creep into daily life, its inner workings grow increasingly abstract. “Explainability” describes an area of research that seeks to make the decision-making processes of these systems understandable to humans and open to evaluation for bias. For instance, a hiring algorithm trained primarily on résumés from men might discriminate against women because keywords like “Barnard,” a women’s college, rarely appeared in its training data. As AI grows more pervasive and complex, understanding how these systems arrive at decisions feels increasingly out of reach.
Surya Mattu is one of the first artists I met who made the word “tech” in art feel like an opportunity rather than a liability. His work in art, engineering, and journalism demystifies a wide range of complex systems: from the biases in predictive policing algorithms, to the invasiveness of Facebook ad targeting, to how your devices are tracking you in your home.
Mattu is the founder of the Digital Witness Lab at Princeton University and a senior data journalism engineer at Bloomberg News. We’ve collaborated on many projects over the years, and I’ve always admired his ability to find joy—even when things seem bleak. We recently had a conversation about creating work that critiques technology in today’s landscape and how to stay inspired in the process.

Angie Waller
The political landscape has changed a lot since we met in 2015. At the time you were a finalist for the Pulitzer Prize for your reporting on machine bias. I remember feeling like there was hope for correcting course on discrimination in algorithms and the erosion of personal privacy perpetrated by Big Tech. Now CEOs like Jeff Bezos, Mark Zuckerberg, and Elon Musk have more openly shifted to the right and consolidated power. Keywords like “bias,” “fairness,” or “equity” in a project description can now result in grant funding being pulled.
I am curious about your perspective on continuing to make work that critiques technical systems and topics like bias and privacy in this new and treacherous landscape.
Surya Mattu
I’ve been thinking about this differently lately. I’ve been listening to the Mindscape podcast, which has exposed me to scientists I didn’t have access to before, like evolutionary biologists and others, who discuss how information is a key component of life.
Complexity arises from simple rules because of information. Information isn’t just a human construct—it’s a fundamental property of the universe. It seems obvious, but DNA is information. Scientists describe it as an emergent property. So information is not confined to tweets from Elon Musk or something like that.
Understanding the world and processing information is something all living beings do. It’s a survival mechanism, just like eating and breathing. Regardless of who’s in power or the state of the world, our brains process information to help us exist.
I’ve been listening to Anil Seth, a neuroscientist studying consciousness. He describes how consciousness evolved as a way for the brain to synthesize information for survival. Our perception of the world is shaped by how our species evolved to process it. So when people talk about AGI (Artificial General Intelligence) and sentient machines, they overlook the purpose of consciousness. It evolved for survival, not for arbitrary technological goals.
Making work that critiques these ideas introduces a different set of values into the information ecosystem, counterbalancing the dominant “tech bro” narratives. Whether those values prevail isn’t in an artist’s or researcher’s control, but it’s important that they exist.
It’s like eating your vegetables: understanding the world is part of being a full human. Our brains evolved to make sense of our environment, but as we’ve built more tools and complex systems, we need new ways to perceive and process information. Making work that pokes fun or shows new sides of these systems makes the world more livable by making information more accessible.
AW
A few years ago, when we worked together on reporting on Meta’s privacy violations, like their collection of financial and health data, we had those moments where we thought, “We got them!” But other than some firm warning letters from Congress and a few class-action suits, Meta was unfazed. Now all our data is being scooped up and joined into a master dataset by Musk’s DOGE team, so worrying about health data going to Meta feels quaint at this point. Do you approach projects or outcomes from investigations differently now?
SM
I focus more on the process and the joy of making rather than just outcomes. Early in my career, I felt like I had to prove something or build a reputation. Now, I value the act of creating in and of itself. I am happy to sit, think, and build things.
When I started working in journalism, I was outcome driven. But I’ve come to accept that even if the forest is burning, planting seeds still matters. One person can only do so much, and we’re all riding the wave of history. The work we do makes the world a little less terrible, even if we can’t control the outcome.
AW
It reminds me of picking up litter: even if more trash comes, for that moment, the street is clean.
SM
Exactly. I’ve been deep into neuroscience and biology, and it’s humbling to think about human limitations. No matter how advanced technology becomes, we still perceive the world through our brains, which have evolved in specific ways. Grand ideas about technology are always constrained by human perception.
AW
Your background is in electrical engineering, which is a field I don’t associate with art or journalism. Can you talk about a project that was a real “aha” moment for you, where you made the shift to more creative and storytelling fields?
SM
My thesis in grad school at ITP (Interactive Telecommunications Program at New York University) was that moment. I studied how our devices leak information through Wi-Fi. At the time, privacy was a big topic. This was just after the [Edward] Snowden leaks, and work around corporate surveillance was taking off.
I read a paper on how phones broadcast every network they’ve ever connected to. So I built a tool to sniff those packets and collect that data, basically to reveal all the networks that were being broadcast. At ITP, I could match network names to people I knew—creating “portraits” of their movements. I could tell when someone was dating someone else or had attended a conference just from their Wi-Fi history.
The project’s impact wasn’t in a physical artifact; it was in the conversations it sparked. I explained technical concepts, like the 802.11 Wi-Fi protocol, in a way that was personal and tangible. When I showed people their own data, or the networks their device was broadcasting, they had a visceral reaction. They didn’t care about the protocol, but when they saw their personal history exposed, they understood the implications. That’s when I realized the power of making the invisible visible.
That project led me to my career in journalism.
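To make the mechanism concrete: devices send Wi-Fi probe requests that can include the names (SSIDs) of networks they have previously joined, and a passive listener can group those by device. Below is a minimal sketch of how sniffed probes might be assembled into the kind of per-device “portrait” described above. The MAC addresses and network names are invented, and the capture step itself (which would need something like a wireless card in monitor mode) is omitted; this only shows the aggregation logic.

```python
from collections import defaultdict

def build_portraits(observations):
    """Group sniffed (device_mac, ssid) pairs into per-device network histories."""
    portraits = defaultdict(set)
    for mac, ssid in observations:
        if ssid:  # skip wildcard probes, which carry an empty SSID
            portraits[mac].add(ssid)
    return {mac: sorted(ssids) for mac, ssids in portraits.items()}

# Hypothetical captured probe requests from two nearby devices
observations = [
    ("aa:bb:cc:dd:ee:01", "ITP-Floor4"),
    ("aa:bb:cc:dd:ee:01", "ConferenceWiFi"),
    ("aa:bb:cc:dd:ee:02", "HomeNetwork"),
    ("aa:bb:cc:dd:ee:01", ""),  # wildcard probe, no network name
]
print(build_portraits(observations))
```

Each device’s sorted list of remembered networks is exactly the “portrait” that made the project resonate: a conference network plus a particular home network is often enough to identify a person.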

AW
Your work is not always what people associate with journalism. You don’t just write articles; you also build interactive tools.
SM
Right, but it’s still storytelling. I think of it as giving people an “aha” moment. If you teach someone well, they’ll forget you taught them. They’ll just remember learning something cool. My goal is to make my tools invisible, so people take ownership of their discoveries.
If people feel like they’re discovering something themselves, they’re more receptive. It’s like parenting—you can impose behaviors on a child, or you can encourage agency. The latter sticks longer.
Like with the Wi-Fi project, people know what Wi-Fi is but don’t fully understand it. When you show them something deeply personal, like their own network history, they get it. It was a magic trick, but an informative one.
AW
When did you transition from art projects to focusing on journalism?
SM
I didn’t have any notion that what I was doing would be useful for journalists. I really fell into it.
The Wi-Fi project started my career. I was sitting at ITP and saw someone’s Wi-Fi history that looked intriguing. There were a few signals—a recent tech conference’s Wi-Fi, the author Cory Doctorow’s home network, etc.—that led me to think I was seeing my advisor Gilad Lotan’s computer, since those were places I knew he had been. When I emailed him to say I noticed he was in the building and to see if he wanted to meet, he realized I was seeing the Wi-Fi networks of his partner, danah boyd, and forwarded my message to her. She was just starting the Data & Society Research Institute, and the connection led her to invite me to apply for their fellowship program.
I had no plans to be a journalist at the time, but through the fellowship danah connected me with Julia Angwin at ProPublica, and I learned how my work could contribute to investigative reporting.
AW
The project you created that was most widely recognized, with millions of users, was Blacklight. I remember, soon after it launched, even my neighbors were talking about it at a kid’s birthday party. In my view, that is pretty mainstream.
SM
It wasn’t until Blacklight that I fully embraced my role as a journalist. It was the first project where I synthesized everything: technical skills, storytelling, and public impact. Other projects had glimpses of this, but Blacklight was where I felt fully in control of my creative voice.
AW
And it let people learn about their own data and make discoveries themselves.
SM
Exactly. That’s the magic of good explainability work. When done right, it empowers people to understand the world on their own terms.
Prior to Blacklight, I worked on projects like the Facebook browser extension at ProPublica (collecting people’s targeted advertising keywords). At Gizmodo, there were a few smaller projects. The House That Spied on Me with Kashmir Hill was another important step.

With my earlier projects, I often felt constrained, like I couldn’t fully express myself. Blacklight was different. It was the first time I saw how journalism could fit with my creative approach. When it succeeded, I realized this is something meaningful.
Of course, Blacklight still required a lot of effort to shape. But what stood out was how it resonated with people. Like the Wi-Fi project, Blacklight made the complex and concerning issue of invasive web tracking feel tangible and even exciting. Even though it revealed something troubling, people found it compelling.
AW
The tool was great because it let people make discoveries on their own. Some read the reporting, but others used it to check websites they use regularly. I remember looking up things that weren’t newsworthy but were fascinating to me, like all the trackers on this website my kid’s school uses.
SM
Exactly. I wasn’t interested in the top-million websites. I wanted people to explore spaces they actually interact with.
AW
You had been looking at trackers before this. I remember you presenting the early stages of a network-sniffing tool prior to Blacklight.
SM
Right. Herbivore was another important project in my research. The name was a play on Carnivore, an FBI packet-sniffing tool from the late ’90s. There was also an art project by Alex Galloway called Carnivore, which showed how you could intercept network traffic.
I wanted to create something more accessible, hence Herbivore—softer and friendlier, like eating your vegetables. Carnivore sounded aggressive, and I wanted to change that perception.
Herbivore was a simple packet-sniffing tool with a friendly dinosaur icon. It let users view all the networks and devices on their local network and intercept their traffic. The fascinating part was seeing the data that apps were transmitting.
I often demonstrated this with a simple trick: I’d show domain names and ask people to guess which app I had opened. For example, if the intercepted traffic showed facebook.com, the answer was obvious. Same for instagram.com. But then I’d show an app with tons of different requests, over 60. Could they guess it?
AW
Google?
SM
No, it was a subway-map app.
AW
From the MTA?
SM
No, it was some third-party app with a bunch of ads. All I really needed was a static image of the map, yet the app was loaded with trackers. Looking at it through this lens, you realize how invasive it was. I was seeing 300 trackers for something that really could have just been a jpeg.
Herbivore sharpened my networking skills, which later informed Blacklight.
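The guessing game Mattu describes works because unencrypted HTTP requests carry a plaintext Host header, so a sniffer on the same network can tally the domains an app contacts. Here is a minimal sketch under that assumption; the payloads and domain names below are hypothetical, not taken from Herbivore, and real captures would come off the wire rather than from a list of strings.

```python
def extract_hosts(http_payloads):
    """Pull the Host header out of each plaintext HTTP request payload."""
    hosts = []
    for payload in http_payloads:
        for line in payload.split("\r\n"):
            if line.lower().startswith("host:"):
                hosts.append(line.split(":", 1)[1].strip())
    return hosts

# Hypothetical sniffed requests from a single app session
payloads = [
    "GET /map.jpg HTTP/1.1\r\nHost: example-subway-app.com\r\n\r\n",
    "GET /ad.js HTTP/1.1\r\nHost: tracker.example.net\r\n\r\n",
]
print(extract_hosts(payloads))
```

A long tail of third-party domains in output like this, for an app that only needs one image, is the pattern that made the demonstration land.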
AW
And Herbivore was like an early privacy-check tool, like the App Privacy Report on iPhones?
SM
Exactly, but this was before iPhones had those features. Back then, Instagram images were still loading over HTTP, meaning Herbivore could actually retrieve images people were looking at on their phones if they were on the same network.
AW
Wow.
SM
Yeah. It was like sitting on the subway, glancing at someone’s phone over their shoulder, except on a network level.
AW
Your work could be associated with hacking, but having worked with you and knowing the ways you consider safety and privacy, I see your approach as more ethical. Did you always think about ethics when making your work, or was that something you picked up working with journalists? I ask because my early art-making was very gratuitous in breaking rules in ways I would not do now.
SM
I was always interested in making work that is accessible to people. I grew up in a culture where hackers were revered. Hackers were the coolest people in the world.
But I understood tech well enough to know how much of hacking is B.S. The aesthetic of hacking was so far removed from the reality of what that stuff is. So I was always on a mission to get away from the hoodie-hacker aesthetic, even though everyone really liked to put that on me, as in, “Oh, yeah, he’s a hacker.” So part of where the ethics came from was fighting that culture—to say, “No, I’m a goofy dude.”
The ethics just became embedded in the package: a project feels authentically like a Surya project if I’m doing it in a way that’s accessible to my mom, and being a hacker is not accessible to my mom.
My goal was always to demystify the complexity of technology in a way that got to the human parts of it and let people have agency over the tech. Part of that was I had to make it in ethical ways because otherwise it is inaccessible. I’m always thinking from that perspective.▪︎

Angie Waller’s publications, Unknown Unknowns and others, explore the influence of automated technologies on language and culture. She reverse engineers everyday systems using programming, text analysis, and diagrams to reveal bizarre and disconcerting phenomena. She is guest editor of the Walker Reader series Code in Codex.
Surya Mattu is a Brooklyn-based artist, engineer, and data journalist whose work explores how algorithmic systems encode bias and shape everyday life. Blending investigative methods with creative practice, he builds tools that expose hidden infrastructures of power in digital platforms. Mattu’s earlier work with ProPublica on the Machine Bias series was a Pulitzer Prize finalist and helped spark national conversations about algorithmic discrimination. His projects have been exhibited at the Whitney Museum, Somerset House, Haus der Kulturen der Welt, the V&A, and Bitforms Gallery.