By Joseph Turow and Chris Jay Hoofnagle
The internet’s business model is tailored advertising. Internet firms hoover up information about what you do and say on the web, in apps and even in physical locations. They then sell advertisers the ability to figure out whether you’re valuable enough for them to want to reach you, and if you are, how they should try to persuade you.
In a recent Wall Street Journal commentary, Mark Zuckerberg claimed that Facebook users want to see ads tailored to their interests. But the data show the opposite is true. With the help of major polling firms, we conducted two large national telephone surveys of Americans in 2009 and 2012. When we asked people whether they wanted websites they visit to show them commercial ads, news or political ads “tailored to your interests,” a substantial majority said no. Around half did say they wanted discounts tailored to their interests. But that too changed after we told them how companies gathered the information that enables tailoring, such as following you on a website. Bottom line: If Facebook’s users in the United States are similar to most Americans (and studies suggest they are), large majorities don’t want personalized ads — and when they learn how companies find out information about them, even greater percentages don’t want them.
To Mr. Zuckerberg, protecting ad personalization from privacy rules is key. His essay argues that regulatory intervention would take away a “free” goody from the public. Facebook makes virtually all its revenues from advertising, and it has created enormous amounts of data about the people who use Facebook and the larger internet. In his essay, Mr. Zuckerberg defends Facebook from a chorus of critics who rail against a business model that they argue uses and abuses people’s information under the guise of transparency, choice and control. Mr. Zuckerberg therefore has an interest in arguing that he and his colleagues well understand what his audience wants. “People consistently tell us that if they’re going to see ads, they want them to be relevant,” he writes. “That means we need to understand their interests.”
But consider how deeply the specifics of what we found contradict Mr. Zuckerberg’s case. In one of our surveys, we asked 1,503 Americans four different questions: whether or not they wanted “the websites you visit” to show them (1) tailored ads for products and services, (2) tailored discounts, (3) tailored news and (4) tailored political ads. If a respondent answered yes to any of the above questions, we went deeper, asking whether the tailoring to their interests would be acceptable if based on the user’s behavior on the website the user was visiting, on the user’s browsing on other websites and on offline activities, such as store shopping or magazine subscriptions.
Sixty-one percent of respondents said no, they did not want tailored ads for products and services, 56 percent said no to tailored news, 86 percent said no to tailored political ads, and 46 percent said no to tailored discounts. But when we added in the results of the second set of questions about tracking people on that firm’s website, other websites and offline, the percentage that in the end decided they didn’t want tailoring ranged from 89 percent to 93 percent with political ads, 68 percent to 84 percent for commercial ads, 53 percent to 77 percent for discounts, and 64 percent to 83 percent for news.
This resounding consumer rejection of surveillance-based ads and content actually makes sense in view of the surveys that we and others have carried out on the digital marketing environment. We find consistently that people are wary of marketers tracking them, don’t understand the complexities of data mining, and don’t like to be discriminated against based on information that companies have about them and others. They may therefore see personalization as a double-edged sword. Personalization can provide them with material they like, but it could just as well be used to shape their behavior or beliefs, or even cause them to lose out on discounts to more desirable consumers. Given that people have lives outside the internet and don’t have the time or ability to figure out its complexities, they may go with the flow of Facebook’s understanding of their views. But our studies suggest that their attitudes toward relevance are far more complex than Mr. Zuckerberg asserts, and very likely in the opposite direction.
Our work yielded another finding that ought to be taken seriously. When asked to choose what, if anything, should be a company’s single punishment beyond fines if it “uses a person’s information illegally,” 38 percent of Americans answered that the company should “fund efforts to help people protect privacy.” But over half of American adults were far tougher: 18 percent responded that the company should “be put out of business” and 35 percent said “executives who are responsible should face jail time.” (Three percent said the company shouldn’t be punished and 6 percent said it depends or didn’t know.)
Use of personal information is a serious issue to the American public. People consent because they have no choice. And delusional statements like Mr. Zuckerberg’s that they want to go with his plan should not go unchallenged.
This article was originally published in The New York Times on January 29, 2019.
Chris Jay Hoofnagle is an adjunct professor in the School of Information and the School of Law at the University of California, Berkeley.