In the physical world, icons are always telling us what to do. No smoking on the airplane. Beware of the road crew ahead. Crap here if you’re a man, and there if you’re a woman. There’s even an icon that says, essentially, “Yes, when I die in a car wreck, you may take my organs and put them in another human body.” And these icons, for the most part, tell us these things without so much as a word. They’re feats of human efficiency: Why force someone to read tedious text when the image of a crossed-out cigarette will do?
The digital world, however, is a different story. On the Internet, warnings and directions are often far more complex. How could an icon convey, for instance, whether Facebook may share your user history, or whether your iPhone may know where you are at all times? You'd understand a website's terms of service better if you sat down and read them, but you probably don't have the time or patience for that, so you sign away vast swaths of your personal information without a second thought.
The White House thinks this is a problem. Last February, it promulgated a Privacy Bill of Rights decreeing that consumers should get a clearer idea of what, exactly, a mobile app does with their data (California, meanwhile, has already made its own rules). The app industry, desperate to avoid clunky regulations from Congress, promised to work with privacy advocates to come up with its own standard practice for privacy notifications. Get all the players in a room, the thinking went, and they should be able to hash out an approach that works for everybody.
The formal process, now in its sixth month of meetings moderated by the National Telecommunications and Information Administration (part of the Department of Commerce), hasn't gone that smoothly. Rival app developer associations have created competing proposals for conveying an app's terms of service. The main flashpoint in this skirmish: whether to use icons or rely entirely on text. One camp thinks users can absorb information immediately through pictures; the other thinks users will only fully understand if they read something start to finish.
This isn't just a technical debate about how to convey information to consumers. It's also a power struggle over who'll get to run these policy fights in Washington, and a fundamental disagreement about how worried people should be about the data they're releasing. Shouldn't people be prompted to think a little harder before granting a company access to the most intimate details of their life? How do you even make a set of road signs for cyberspace, when society hasn't yet decided how much invasion of privacy is too much?
"We know that we want you to be able to safely drive your car. And we know when it's dark, you should turn your lights on. So designers and industrial psychologists can figure out how to get the desired behavior," says Jules Polonetsky, director of the Future of Privacy Forum, which has been trying to broker an agreement. "With apps, we don't know what the desired behavior is."
Lots of people have tried making icons work on the Internet. The Creative Commons system for denoting copyright permissions is probably the most successful. Mozilla’s convoluted set of suggested privacy icons, less so. The problem is, the entities with the greatest power to come up with standardized systems—online retailers and advertising companies—have a strong financial interest in people allowing their behavior to be tracked.
"They pretty much don't test things, or they test things in a very cursory way, or they test things and they ignore the results," says Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University.
Cranor is thinking in particular about the digital advertising industry's earlier foray into icon-based notifications, which privacy advocates initially supported: A little “i” in a triangle that indicates an ad is tracking your online behavior. While testing out different designs and taglines, the industry chose the version that survey respondents understood less well (“AdChoices” rather than phrases like “Why did I get this ad?”) and Cranor found that only 27 percent of people who saw the final product actually knew what it was supposed to mean.
That's why, when the time came to decide on a universal standard, a group of privacy advocates opted to bypass icons altogether. Instead, they struck a deal with the one-year-old Application Developers Alliance to support a set of screens that explain an entire data-usage policy through text.
In the other corner: Another developers group, the Association for Competitive Technology, which formed 15 years ago to defend Microsoft against antitrust charges. It favors an icon-based dashboard that offers fuller explanations when users click on the pictures, figuring that most people will absorb at least a little information, and that those who are really interested can read up (as it happens, the approach also gels well with Microsoft's icon-heavy Windows 8 operating system, but ACT says most of its members now develop for Apple's iOS anyway).
"What we decided is, icons have been tried, with a couple different iterations over the years," says Consumer Action's Michelle DeMooy, to whom the ADA directed my questions about their proposal. Pro-icon people, she says, "tend to overlook some of the research that shows people do not understand what those things mean. They're also in this world where icons and 'glanceability' is what matters. In my world, it's a little different, because I'm talking to real consumers and listening and hearing what they're saying and that kind of stuff isn't necessary."
Here’s the problem with that approach, though: Consumers lie. As a user-experience designer for the privacy management company TRUSTe, Travis Pinnick comes up with privacy notices for a living, and even he doesn't think people are as concerned about the issue as they claim. For that reason, he tries to show only the most important information—what data is pulled from your phone, say, and how to get in touch with the company—rather than make people read through everything there is to know.
"Trying to support any type of decisionmaking based on what consumers say they want is misguided," Pinnick says. "No matter what I show them, 100 percent of the time, they say, 'Yes, that's what I want.' They just think, 'Well, privacy's important, and if there's a better way to read privacy policies, that would be a good thing.'"
At this point, the two sides have gotten downright exasperated with each other.
"I find it ironic that we're fighting against something new and different," says ACT's executive director Morgan Reed. "We also no longer live in caves and scrawl on the walls in charcoal. I'm convinced that long strings of text in tiny font size is not a good way of communicating information on a small screen. Call me crazy, but I'm pretty comfortable with that."
To Reed, giving up your information is a pretty small price to pay for apps that are usually free, and consumers don't need to be completely aware of every detail in order to use them.
"When the privacy community looks at transparency, they look at transparency as something that should create a behavior change," Reed says. "That you, upon learning information about what's happening, will not in fact move on to the application. It's more of the 'eat your broccoli' belief than it is transparency. It's, 'I want you to do something different.'"
Meanwhile, the people who actually pay for advertising—retailers and marketers—say they've already got good enough standards, thank you very much, and they don't need to be bossed around by developers or advocates.
"The people who really make choices about collection and use of data are the people who have the apps themselves," says the Direct Marketing Association's lawyer, Stuart Ingis. "Developers are the people we hire to do software coding. That's like saying the painter of a retail store makes decisions about the paint."
Ingis says he tried to bring the finicky sides together around a common set of principles, and nobody was interested. "When you provide text, they want icons, when you provide icons, people want text," he says. "It's silly, it's just silly. We thought there was going to be a substantive discussion. Other than us, people haven't wanted to do that. I feel I’m the only person speaking the truth."
The likely end result? Talks fall apart, and no consensus is reached. App developers continue to use a hodgepodge of icons and text-based privacy policies, none of which gains broad acceptance. In the meantime, developers don’t know whether they’ll be prosecuted for not having a privacy policy, which makes building apps that handle sensitive information—medical data, for instance—not worth the risk. In other words, apps that would be of use to millions of Americans won’t get made because a few organizations can’t decide whether to communicate with words or images.
And then there’s the absolute worst-case scenario: Congress tires of the endless bickering and imposes on all of us some clunky solution that this whole process was designed to prevent.