May 22, 2019

Geoffrey Nunberg to I School Graduates: “We’re Rather Eclectic”

Geoffrey Nunberg gave the following address to the School of Information graduating class on May 19, 2019:

I’m very honored to have been asked to do this. I’ve been privileged to be associated with the I School for over 15 years, from around the time that Anno took over as dean. As you no doubt know already, you’re going to be called on a lot to explain to people what a school of information is. So I want to share with you my take on what makes this place exceptional, and how that makes you exceptional—not just for your benefit, but for the families and friends who have supported you. Of course, they already know you’re exceptional, but I want them to know what this place had to do with that.

If you’re looking for a concise picture of what the School of Information is about and how it got that way, you might check out the video of a talk that Anno gave a couple of weeks ago to recount her 15-year tenure as dean. When Anno took on the deanship, this was a smallish school that was just beginning to spread its wings. And she had a huge role not just in expanding its programs but in endowing the school with the two features that, to my mind, make it exceptional: first, by broadening the faculty to include people doing ethnography, sociology, public policy, history, economics, HCI, linguistics and so forth, and second, by creating a culture of community, which is what alumni are always pointing to as the most memorable feature of their experience here.

Now I tend to see the world through the lens of language, and I’ll be making a few points here about words. For starters, I’m never sure how our program should be described—are we interdisciplinary, cross-disciplinary, transdisciplinary, multidisciplinary, or what? I had the idea once of describing the I School as “diversiplinary”—not my proudest creation. Or there’s this new management buzzword “cross-silo.” The next time someone asks you what the I School is about you could say “Oh, we do a lot of cross-silo thinking.” (But please don’t let on whom you heard that from.) For my part, when people ask me that question, I just say, “Well, we’re rather eclectic.”

Call it eclecticism or interdisciplinarity or whatever; the important thing to understand is that it’s intrinsic to what we do. A geologist can team up with people from other disciplines but it’s not necessary—there’s nothing in the nature of a rock that requires you to look at it from multiple points of view. But information isn’t something you can study through a single window. It lives two lives. It has a material life—it accumulates on servers and networks, in software and databases, on cell phones and surveillance cameras and MRIs. And it survives in printed documents that still have cultural weight, like books and ballots and parking tickets—and I almost forgot, diplomas. You’d be disappointed if you came up here and Anno handed each of you a USB drive with a pdf of your degree. 

But information also has a social life, to quote the title of a classic book by John Seely Brown and Paul Duguid. Governments and corporations sweep it up, kids text it furtively in the classroom, sports fans swill in it, democracies die in its absence. So any school of information is going to be anchored at two disciplinary poles. There are the formal disciplines that study the way information is stored, processed, and transmitted—CS, statistics, data science and so on. And there are the disciplines that touch the contexts where information takes on meaning: the culture, the market, public life. People usually come to the I School with a background in one or the other of those poles, but the ideal is to make everybody kind of bidialectal. You want students to lose their native disciplinary accents, so to speak, so you can’t tell by listening to somebody what their undergraduate major was. 

Geoff Nunberg

But while the relation between those two poles is binary, it isn’t quite symmetrical. The transition is easier in one direction than the other. Let me tell you a story that I heard from Brian Smith, a computer scientist and philosopher who’s now at the University of Toronto. Back in the 1960s, Brian was an undergraduate at Oberlin majoring in physics. He was also a gifted amateur musician, and he got very interested in computer-generated music, which was then in its formative stages. But Oberlin also has a famous music conservatory, and after a while the composition students there sort of took over the computer music project. There were two reasons for that, Brian says. On the one hand, the software became more versatile and more accessible. And then too, he adds, it turns out to be a lot harder to turn a geek into a musician than to turn a musician into a geek.

Brian told me that story back in the 1980s when we were colleagues at the Xerox Palo Alto Research Center—or “the legendary Xerox PARC,” as it’s always called. John Seely Brown was running PARC back then, and he was bringing in social scientists and artists and philosophers and people like Paul Duguid and me, the idea being that if you understood the context and aesthetics of technology use, that understanding would somehow infuse the technology itself. I think of what Henri Matisse once said, “A colorist makes his presence known even in a charcoal sketch.”

That idea did get through to a lot of technologists at PARC, and it had some influence at Apple and Google, where some PARC researchers and research found a home. But it wasn’t an easy sell. One of the hardest things in the world is to recognize rigor in disciplines that are more discursive or subjective than your own, wherever you’re starting from. Back in the day when there were still people who wrote assembly language—“coding close to the metal,” they called it—they disdained those who wrote higher-level code as sloppy. Mathematical logicians think of philosophers of logic as wooly-headed. The psychopharmacologists roll their eyes at the psychoanalysts. And it takes a huge leap of faith for a hardnosed coder to imagine that ethnography or philosophical ethics could be anything more than just airy-fairy gobbledygook.

So most tech people didn’t really get why PARC was doing that—and in fact even fifteen years later, when The Social Life of Information was published, a lot of people took the title to be making an absurd claim. 

But that title doesn’t sound so preposterous anymore. People have come to a realization that was put very crisply by Doug Cutting, another former Xerox PARC colleague, who went on to create the Hadoop framework: “The biggest problems of technology are not technical.” That’s what gives rise to today’s pervasive algorithm anxiety, as I call it. Twenty-five years ago, “algorithm” was a word you only ran into in the technical literature, where it just referred to a recipe for getting at an optimal result, like alphabetizing a list of names or managing memory. But now the media are full of stories about algorithms, usually to point out how biased and fallible they are. I have a whole file of headlines like “How Recommendation Algorithms Run the World,” “The Violence of the Algorithm,” and one I saw in the New York Times a couple of months ago, “The Algorithmification of the Human Experience.” (“Algorithmification”—there’s a term that deserves to die a horrible death, ideally in a head-on collision with “cross-silo thinking.”)

It’s easy to see how this impression of fallibility arises. Why should I trust Facebook to get hate speech right when they have other algorithms that are telling me that my interests include beauty pageants, the band Journey and the professional wrestling hall of fame? So people in tech are saying, we really have to do something about all these problems of bias and abuse and privacy and misinformation and toxic content and so on. Engineers being engineers, their first instinct is to see these as essentially technical problems and keep throwing more cycles at them. And it’s clear that these are hard AI problems and that the algorithms need a lot of tuning. 

But tech people tend to struggle more with the conceptual issues—how do you define the problem and what would an optimal solution look like? They rarely give a sign of having learned the lesson that Brian learned at Oberlin in the sixties or that PARC was evangelizing for in the eighties or that the I School has been teaching for a couple of decades now. Engineers being engineers, once again, their instinct here is to attack these questions as purely deductive exercises. “Hey, we’re smart, we can just reason this out, how hard could it be?” 

Geoffrey Nunberg on stage with the I School faculty

Take the issues of misinformation and propaganda. As it happens, I just finished an article on the history of those notions for the forthcoming Companion to the History of Information for Princeton University Press. So I was particularly interested in a slick video that Facebook put out last year about its efforts to fight fake news. At one point the manager for news feed integrity is defining these notions. He draws a Venn diagram on a whiteboard that partitions the whole space of digital content according to two features: everything anybody ever says is either true or false, and either deceptive or sincere. So here on the upper left is the true-but-deceptive quadrant, where propaganda lives; down here in the false-and-deceptive quadrant is where fake news lives; and over here on the other side is the false-but-sincere stuff, which is just being wrong; and so on.

And I’m thinking, “You know, the city you’re trying to police has a lot more precincts than that.” He’s clearly well intentioned, but he seems to have no idea how many terabytes of philosophy and history and sociology have been dedicated to mapping the landscape of public discourse. This is basically what I think of as dorm-room epistemology, the sort of thing that college students come up with in Red Bull-fueled late-night bull sessions. The problem is that you don’t see that you’re putting yourself at the mercy of ordinary language—I mean the everyday vocabulary that we use to talk about these things and the everyday assumptions that pass for common sense, which are both saturated with hidden ideology and cultural preconceptions. As the philosopher Paul de Man said, “There’s nothing more theoretical than the language on the street.”

Those dorm-room theories are pervasive in the world of tech. Jenna Burrell and Elisa Oreglia of the I School have written about the myths behind what you could call dorm-room theories of development, like the idea that you can just sprinkle these rural, low-income populations with mobile phones and an efficient market will sprout up out of the soil.

Or think of the people here at the I School who are concerned about the fairness of hiring algorithms—they have to contend with the sorts of dorm-room theories about the tech gender gap that were exemplified in the screed that that Google engineer posted a couple of years ago that had everybody up in arms. Or if you’ve been watching the NBA playoffs, you may have seen the one-minute TV ad that Apple has been running to tout the privacy features of the new iPhone. It’s a stream of 25 or so brief images that are meant to connote privacy in one way or another: a chihuahua barking behind a gate with a “Beware of Dog” sign on it, a teenager slamming the door to her room with “Keep Out” written on it, a guy in a public bathroom walking down to the last urinal so he isn’t near anybody else, someone shredding a document, and so on. Then the text comes on the screen, “If privacy matters in your life, it should matter to the phone your life is on.” 

Now to my mind, the privacy features I’m looking for in a cell phone are not quite the same as the ones I want to find in a public bathroom. So I sent a link to that ad to Deirdre Mulligan and asked her just how many different conceptions of privacy it was invoking. She came up with half-a-dozen or so: privacy as property, privacy as boundary negotiation, privacy as freedom from intrusion, and so on. Now of course this is just Apple’s ad agency trying to whip up a soufflé of feelings to sell a phone. But it gives you a sense of what a tangled set of notions the word “privacy” evokes in our conversations about technology and of the challenges that people like Deirdre and Chris Hoofnagle face when they try to operationalize or codify them. 

Now this kind of thinking is depressingly endemic in the world of technology. But my sense is that most students come out of the I School more-or-less inoculated against it. When you look at the work that our students are doing, at whatever level, you usually pick up a much more nuanced sense of both the social context and the ethical implications of the project. 

In fact Brian Smith was wrong about geeks and musicians. Coders and designers and data analysts don’t have to become full-fledged ethnographers or ethicists or whatever—they just have to know what it would be like to be one of those things. They have to realize that these problems are difficult, and not intuitive. They might call for specialized expertise. Or they may just require the capacity to step back and extract yourself from the everyday language and preconceptions that cloud your perceptions. And they almost always call for a touch of empathy.

Whatever work you wind up doing, your capacity for that kind of insight is going to confer huge benefits—for your organization, for society, and by-the-by, for you. It’s a competitive advantage in your career that’s more important than your technical skills or just raw intelligence, which is something that young people in particular often put too much weight on. Google job interviewers used to be famous for posing these idiotic brain-teasers like “Estimate how many tennis balls can fit in an airplane.” (I have to say I wouldn’t want to work with anyone who thought that asking a question like that one was a useful way to pick colleagues—or with anyone who took pride in being able to answer it.) 

Now if I were interviewing job candidates for Google or Facebook, I might ask them something like, “How would you define privacy?” (or “fairness” or “merit” or “trust” or whatever). My guess is that the first nine people I asked about any of those would come up with more-or-less the same answer, and it would almost always be wrong. The tenth would say something like, “Oh, boy, that’s really complicated—can I get back to you on that one?” That’s the person I’d want to talk further with, and I like to think that person would be you.

So, congratulations. If the past is any guide, most of you will go on to successful and fulfilling careers. We’d like to hear about your progress, because the success of our alumni redounds to the reputation of their alma mater—you make us famous and you make us proud. And in that connection, it’s worth remembering that “alma mater” is Latin for “nurturing mother.” So let me close with the four words that every mother says to her offspring as they go out to make their way in the world: “Don’t forget to write.”


Geoffrey Nunberg is a linguist, researcher, and adjunct professor at the UC Berkeley School of Information.

Geoff Nunberg speaking at graduation
Geoffrey Nunberg is a linguist and adjunct full professor at UC Berkeley’s School of Information, a well-known author, and a regular contributor to NPR’s Fresh Air.
UC Berkeley School of Information, Class of 2019

Last updated: May 29, 2019