Editorials
February 28, 2022

Toward data dignity: We need commonsense privacy regulation to curb Big Tech

Originally published in Fortune on February 28, 2022

By Tom Chavez, Maritza Johnson, and Jesper Andersen

Big Tech has unlocked our doors and walked into our homes uninvited. They’re reading our mail and rifling through our underwear drawers. In the first piece of our Toward Data Dignity series, we detailed how Big Tech got inside. And once they’re in, they never leave. Rather than acting with the urgency we would if we were robbed offline, too often we act as if the ongoing invasion of privacy isn’t happening.

It’s almost as if we as consumers have adopted a learned helplessness in the face of Big Tech’s data privacy abuses. It doesn’t have to be that way.

When we get hacked, we respond with swift indignation. We see an unknown charge on our credit card, and we jump into action. We practically have our credit card fraud centers on speed dial.

We are also appalled when our data is exposed at scale, like when 700 million LinkedIn users had their personal data put up for sale online last year.

Even less direct abuse and profiteering hit us viscerally when they touch important aspects of our offline lives. We’ve cared deeply and objected loudly when violations of our Data Dignity crossed over into politics. The Cambridge Analytica scandal unleashed a wave of righteous fury: Polls showed a 66% decline in user trust in Facebook just one week after the revelations.

Yet for some reason, outside of headline-grabbing incidents, we all seem to sigh with indifference over the daily, routine, less overtly outrageous invasions of our digital privacy. Why is that? Companies across the economy, with Big Tech firms such as Facebook and Google as serial offenders, are following, listening to, and watching us constantly for their exclusive benefit, yet we sit complacently. It’s a classic tragedy of the commons: The practical impact on each of us may be limited, but the damage to the collective is profound.

The Ethical Tech Project aims to do something about it. We see a future where technology is in service of human flourishing—not a remora permanently attached to our personal data supply. What if we had a way to protect our privacy rights? To build one, we are convening a multidisciplinary community of technologists and experts to design standards for privacy protection across technology stacks. At the core of our framework is respect for a concept called Data Dignity.

The erosion of Data Dignity

Big Tech would tell us that they aren’t stealing our data; they are bartering for it in return for “free services” like social media, email, and photo sharing that make our lives better and more convenient. But commerce can only be fair when the value of the good or service on both sides is understood, and Big Tech has intentionally and systematically made it impossible for users to identify the real terms of the trade. A thing of value—our privacy—is being monetized not by the owners of the data (you and me) but by those exploiting it. As citizens, we are left without Data Dignity and without control over the monetary value we ourselves are creating.

Markets are supposed to be empowering and democratizing. With Big Tech in total, unchecked control, data markets operate as a functional autocracy. No other market in the world is left to function with so little oversight and with such asymmetrical terms in favor of an oligopoly.

Even when it may seem that Big Tech is putting user data privacy first, it is usually about them lining their own pockets—again. Following Apple’s move to deprecate third-party cookies, Google unveiled its own now-delayed plans to do the same, framed as a privacy measure to prevent the tracking and targeting of users. But deprecating third-party cookies is a strategy that could further entrench Google’s market dominance. Too many of us are in a first-party relationship with Google to get much protection from such deprecation, and Google will make more money by shutting off data-skimming possibilities for others.

The effects of Big Tech’s disrespect for our Data Dignity aren’t just distorting capitalism and markets; they are also distorting our psychology. Privacy is a human right and a necessity, and attacking our privacy has a corrosive effect on the human condition. Our curated online experience, fueled by our personal data, serves to further divide and isolate us by limiting the amount of shared, comparable information we can see and disagree with. It also leaves us more vulnerable to political polarization.

Privacy by design

We need to restore sanity to the marketplace and put control back in the hands of those to whom data belongs in the first place—all of us. It’s easy to throw up our hands and think these abuses can’t be stopped, but let’s remember that the theft of our personal data—indeed the very existence of digital data—has occurred in the blink of an eye. We have every opportunity to change course, and we’re seeing more and more companies building “ethical tech” solutions every day, such as Neeva, which gives consumers the power to decide where and how their data is used, and Ketch, led by coauthor Tom Chavez, which helps companies respect and automate data privacy compliance.

Data Dignity is possible: We just have to build it into the architecture of the internet, before the next wave of innovation further erodes our privacy. This is not an antibusiness stance: Commonsense regulation around a citizen’s right to privacy can be balanced with businesses’ need for growth. Rules and frameworks can create the certainty that the private sector needs to thrive.

We are on the cusp of another period of tremendous technology-fueled growth, this one even greater and more far-reaching than the World Wide Web of the 1990s or Web 2.0 of the 2000s and 2010s. Programmatic, technological solutions for privacy are possible if businesses build them, citizens demand them, and government creates the framework that makes them necessary. In an upcoming piece on moving toward Data Dignity, we’ll detail how an entire industry and movement focused on “ethical tech” is on the rise and can slam shut the doors that Big Tech flung open.

Tom Chavez is cofounder and general partner of the startup studio super{set} and CEO and cofounder of the software company Ketch.


Maritza Johnson, Ph.D., formerly with Facebook and Google, is founding executive director of the Center for Digital Civil Society at University of San Diego and partner at Good Research.


Jesper Andersen is CEO and president of Infoblox, a provider of cloud-first DDI and DNS security services.


They are founding members of the Ethical Tech Project.