Illustration by Joe Kim

We, The Technocrats

Jer Thorp
Nov 12, 2018


In Brussels he spoke of courage, of defending principles, of cultivating a ‘healthy suspicion of authority.’ He lavished praise on the GDPR and called for a new comprehensive privacy law in the United States. He condemned the practice of collecting personal data and demolished big tech’s favorite talking point: that users’ privacy needs to be weighed against profit. He quoted Thoreau. It wasn’t exactly the speech you might have expected from the CEO of the world’s most successful tech company.

Cook’s performance last month as tech’s valorous reformer might have been more believable had he not relied so heavily on Silicon Valley’s go-to linguistic dodge: the collective we. Cook used the word fifty-three times in twenty minutes. Sometimes the we he spoke on behalf of was Apple: “When we designed this device,” he said, referring to himself, Steve Jobs, and the original iPhone team, “we knew it could put more personal data in your pocket than most of us keep in our homes.” At other times, Cook’s we widened to include other people and companies who make technology. Talking about the dangers of AI, he warned: “If we get this wrong, the dangers are profound.” In his widest cast of the we, Apple’s CEO caught pretty much everyone: “Now, more than ever — as leaders of governments, as decision-makers in business, and as citizens — we must ask ourselves a fundamental question: What kind of world do we want to live in?” He repeated the question at the end of the talk: “What kind of a world do we want to live in?”

Big tech’s collective we is its ‘all lives matter’, a way to soft-pedal concerns about privacy while refusing to speak directly to dangerous inequalities. If I want to talk about the human experience of data, I need to talk about risk, and risk is something that does not affect people equally, or at the same time. Living in data may seem like a shared reality, but it is an experience that differs critically from person to person and from group to group. Cook’s we bundles his experience with that of others, and in turn denies them their unique concerns. One two-letter word cannot possibly hold all of the varied experiences of data, particularly those of the people who are at the most immediate risk: visible minorities, LGBTQ+ people, indigenous communities, the elderly, the disabled, displaced migrants, the incarcerated.

Face recognition technologies operating in public space have only recently raised alarm in this country; meanwhile, these technologies have been used for years against people living in conflict zones and migrant camps. Even within our borders, where more than half of adults have their images stored in one or more face recognition databases, the risks up to now have been felt more urgently by specific groups. At least twenty-six states allow the FBI to perform facial recognition searches against their databases of images from driver’s licenses and state IDs, despite the fact that the FBI’s own reports have indicated that facial recognition is less accurate for black people. Black people, already at a higher risk of arrest and incarceration than other Americans, feel these data systems in a much different way than I do.

The composition of data’s risk groups, and the particular dangers they face, are continually changing. Just last week, the Department of Justice submitted a brief to the Supreme Court arguing that sex discrimination protections do not extend to transgender people. If the Court were to accept this argument, it would immediately put trans women and men at more risk than others from the surveillance technologies that are becoming more and more common in the workplace. Trans people will be put in distinct danger — a reality that is lost when they are folded neatly into a communal we.

At the core of Cook’s Brussels speech was a list of four data rights for individuals: the right to have data minimized, the right to know what data is being collected and how it’s being used, the right to access, and the right to security. What’s missing from this list is the right to self-determination: the right of communities of people faced with particular risks to have an active voice. These groups need to be able to arrive at their own solutions regarding data and privacy, tailored to their own particular ways of knowing and their own unique lived experiences.

I looked at the list of speakers for the conference in Brussels to get an idea of the particular we of Cook’s audience, which included Mark Zuckerberg, Google’s CEO Sundar Pichai, and the King of Spain. Of the presenters, 57% were men and 83% were white. Only 4 of the 132 people on stage were black.

I learned about Apple’s own we from its most recent diversity report: 32% of the company’s employees are women and 9% are black, compared to 50.8% and 12.3% country-wide. These numbers get worse when restricted to tech jobs (23% female, 7% black) or to jobs at the leadership level (29% female, 3% black).

There is, of course, another we that Tim Cook necessarily speaks on behalf of: privileged men in tech. This we includes Mark and Sundar; it includes 60% of Silicon Valley and 91% of its equity. It is this we that has reaped the most benefit from Big Data and carried the least risk, all while occupying the most time on stage. To this we, one that I am very much a part of, I offer some advice:

We need to stop centering ourselves; to stop asking what we want, what fixes or solutions or futures might work for us. We need to shift our thinking from the future tense to the present. Here’s a more urgent question for us, one that doesn’t ask what we want but instead what they need:

How can this new data world be made safer for the people who are facing real risks, right now?

Apple might start by granting its users in China the same data protections it offers to others around the world. The tech community could work to get more people from the groups most keenly affected by data systems into its boardrooms and its dev teams, and onto its stages. All of us can remember that the dangers of big data aren’t felt equally. We can acknowledge those who are most immediately at risk by saying their names out loud, and by listening carefully to their voices.

“The act of listening has greater ethical potential than speaking” — Julietta Singh


Jer Thorp

Jer Thorp is an artist, writer & teacher. He is Innovator-in-Residence at the Library of Congress. His book Living in Data is out now from MCDxFSG.