Bulletin of the American Society for Information Science and Technology, Vol. 27, No. 6, August/September 2001


Exoinformation & Interface Design

by Benjamin Brunk

Privacy is only about 100 years old as either a legal or political issue. Throughout its entire existence, it has been a deeply contested concept, as the representative readings in Schoeman's 1984 collection Philosophical Dimensions of Privacy clearly show. Recently, Internet privacy has become an immensely popular subject of public discussion. It seems as though everyone, from policy makers to expert consultants and the news media, has been working overtime to raise the alarm over new surveillance and information-gathering technologies that have been identified as threats to privacy.

Most of these reports follow one of two themes. The first identifies a particular threat, discusses its implications and ends with proposals for some kind of legislation to address the problem. The second follows a similar course except that, instead of suggested legislation, there follows a pitch for some new product or technology.

While all of this work is interesting and thoughtful, it often fails to accomplish anything useful because it is too reliant on bureaucracy, is overly broad and inflexible or is simply too much trouble to use. What the current debate on privacy overlooks is that privacy is a deeply personal issue and that any given individual's privacy preferences are constantly changing. People often make split-second cost/benefit analyses in which they weigh their privacy against other desirable goods. Neither law nor design can truly adapt or evolve fast enough to keep up with the rapidly changing environment. At best, current privacy laws and software systems restrict freedom of choice and hinder commercial activity.

Rather than approaching privacy from the security perspective of "threats" and "intrusions," this article takes an individualized approach. Instead of talking about privacy in terms of outsiders trying to get in, it makes sense to look at privacy from the inside out. The central question in this approach is: What information is leaving me as I am in the process of seeking information? We expect that we reveal a great deal about ourselves in everyday interactions. "Merely by walking outdoors, we put ourselves in the public domain," as Sanchez points out in The Privacy Cage (www.liberzine.com/juliansanchez/010205privacy.htm). In other words, there is a natural tendency for us to constantly shed information about ourselves, whether in the physical world or in cyberspace. That the information we "broadcast" is important was discussed by Singleton in a Cato Institute Policy Analysis in 1998 (www.cato.org//pubs/pas/pa-295es.html). While it is debatable whether we have ownership of this information, we definitely have a stake in trying to control it. Information leaving us, either consciously or unconsciously, is referred to here as exoinformation.

Exoinformation

It has become necessary to coin this new word, exoinformation, because language has not yet caught up with the challenge of discussing privacy. Exoinformation is the informational byproduct of an individual's information-seeking activities. This byproduct, or "data exhaust" as Olsen calls it (http://news.cnet.com/news/0-1007-200-3230107.html), has become more and more important to people building profiles of consumers. An entire industry devoted to collecting and making sense of exoinformation already thrives. The leading example is DoubleClick, Inc., often portrayed as the villain in the news media's privacy coverage.

Specifically, exoinformation consists of the tidbits of information that are unconsciously or unwittingly disseminated by people's everyday actions. All life processes produce exoinformation. Observing that someone is breathing will reveal that he or she is alive. We already have a pretty good understanding of these subtleties in the physical world, but the cyber realm offers new challenges for individuals to understand and manage information leakage. Examples of exoinformation include a preference or a behavior captured and recorded as the result of posing a search query, selecting a song to listen to, checking on a stock quote or just clicking through a website.

It is also possible that deliberately revealed information, for example your name and address, could become exoinformation in circumstances where it is used for some purpose other than that which it was originally intended. For example, Kevin Mitnick, in some of his hacking exploits, used exoinformation to learn about the operations of the telephone company. His intent was to learn enough inside information so that he could pass himself off as a telephone company employee in order to further his hacking exploits. His "social engineering" convinced other telephone company employees to reveal all kinds of confidential information. They trusted him because he knew things they assumed no outsider could find out. He got that information by eavesdropping, digging through trash and exploiting the hidden streams of data that are the byproducts of a company's day-to-day operations. Here, exoinformation was a wedge used to access more direct streams of information. Ironically, Mitnick was caught because his own exoinformation leakage led authorities to his doorstep. This, despite his tireless devotion to trying to constrain his own exoinformation flows, of which he was extraordinarily cognizant. 

Exoinformation has several important characteristics that could be components of a measure of its criticality. The five we will consider here are granularity; the temporal attributes of persistence, frequency and sensitivity; and valuation.

    1. Granularity refers to how much data is being shed and potentially collected. A keystroke is quite different from a whole document.

    2. Persistence describes how long exoinformation remains observable. From a potential observer's point of view, this might also describe how long collected exoinformation is stored or remains useful.

    3. Frequency, another temporal attribute, is the rate of leakage of exoinformation. Two related concepts from a potential observer's point of view are the sampling rate over this frequency (how often exoinformation is recorded) and the level of detail for each sample (a form of quantization). If you only want to know someone's search query, it takes one look. On the other hand, if you want to see which pages they visited on a website (in order to construct a click trail), you must make multiple observations.

    4. The last temporal dimension of exoinformation is sensitivity, the sampling duration necessary to observe a particular exoinformation event. Depending on what observers are trying to find out, they may require a short quantum (milliseconds, seconds, minutes, etc.) or a much longer sampling sensitivity (hours, days, weeks, months, etc.) for each observation. One's ability to detect exoinformation leakage has a direct relationship to these temporal dimensions.

    5. Valuation is how important or confidential the information is to you. For example, the exposure of one's name and e-mail address is probably considered to be less upsetting than something more personal, such as medical or financial information. 

Criticality, then, combines all five qualities of exoinformation into a single overall assessment. If we can develop an adequate measure for criticality, we will have a framework to map exoinformation leakage to user interface features (or entire systems) designed to reveal and/or alleviate the potential for unintended exposure of information. At the macro level, we could begin to compare the many privacy tools and techniques currently available in a more consistent and deterministic fashion. In addition, we can zoom in on individual features (e.g., digital certificates or encryption) implemented in different systems and compare their relative effectiveness using usability measures and the above measures related to exoinformation. We can then extrapolate back out to say something about the overall effectiveness of each system in even greater detail. My on-going research is addressing these issues. 
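To make the proposed measure concrete, the five attributes and their combination into a single criticality score can be sketched in code. This is only an illustration of the framework's shape: the [0, 1] scales, the weights and the example values are all assumptions introduced here, not part of the author's research, and a real measure would need empirical calibration.

```python
from dataclasses import dataclass

@dataclass
class ExoEvent:
    """One exoinformation leak, scored on the article's five attributes.

    Each attribute is assumed normalized to [0, 1]; the scales are
    illustrative, not taken from the article.
    """
    granularity: float   # how much data is shed (keystroke vs. whole document)
    persistence: float   # how long the leak remains observable or stored
    frequency: float     # rate of leakage (one-off query vs. full click trail)
    sensitivity: float   # sampling duration an observer needs per observation
    valuation: float     # how important or confidential the information is

# Hypothetical weights (they sum to 1.0); valuation and persistence are
# emphasized here purely for illustration.
WEIGHTS = {"granularity": 0.15, "persistence": 0.25,
           "frequency": 0.15, "sensitivity": 0.15, "valuation": 0.30}

def criticality(event: ExoEvent) -> float:
    """Combine the five attributes into one overall score in [0, 1]."""
    return sum(w * getattr(event, name) for name, w in WEIGHTS.items())

# Example: a single search query -- small and transient, but revealing.
query = ExoEvent(granularity=0.2, persistence=0.4,
                 frequency=0.3, sensitivity=0.2, valuation=0.7)
print(round(criticality(query), 3))  # 0.415 under these assumed weights
```

A weighted sum is the simplest possible aggregation; the framework itself is agnostic about the combining function, and comparing privacy tools would also require mapping each tool's features onto these attributes.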

Privacy and User Interface Design

User interface designers should develop strategies that reduce the cognitive load required to manage personal information flows, especially with regard to the information leaving us. The challenge is to increase privacy awareness and self-protective behaviors, without imposing overbearing and judgmental ideology or inducing too much cognitive load on users. Many Internet users are at least a little pragmatic about protecting their privacy but are confused by the wide variety of tools and techniques available to them. That population should be a primary concern in privacy interface design. I believe that most, if not all, of the problems associated with exoinformation leakage will be managed eventually through a combination of behavioral and technical means. Right now, however, awareness is low, especially among privacy neophytes. Even those who are mindful of their privacy often feel they lack the technical know-how to download privacy-enhancing software and make it work. The technology is spread throughout a confusing jumble of software tools and information sources that all have various shortcomings and overlapping capabilities. Many new companies are working to improve online privacy, but most are working independently of one another, using inefficient hit-or-miss design strategies, with not enough emphasis on usability or even user needs.

User interface design practices emphasize removing cognitive load burdens from users and shifting them to the interface. A byproduct of this approach, and in software customization/personalization in general, is that it can often increase the amount of exoinformation available for broadcast. Despite this drawback, it is reasonable to assume that personalized interfaces should also be able to help people manage their privacy preferences. In other words, the same technology used to invade privacy can also be used to protect it.

Concluding Remarks

Many believe that privacy is a lost cause and do not trust new technology. But as Ann Cavoukian, the information and privacy commissioner of Ontario, noted in Lester's recent article on "The Reinvention of Privacy" (The Atlantic Monthly, 287, 27-39, www.theatlantic.com/issues/2001/03/lester-p1.htm), "It's still the early days and we can't give up just because people say 'You have no privacy, get over it.'" Technology has done just as much or more to free us as it has to enslave us. Evidence abounds:

    One of the earliest technologies, writing, enabled a new and enduring form of private communication. The printing press popularized reading, an intensely private affair. The wristwatch privatized time. Cheap and widely available mirrors allowed, literally, a new level of private self-reflection. The gummed envelope boosted expectations of privacy in the mail. The technological advances of the Industrial Revolution led to the creation of a prosperous middle class that could afford to build houses with separate rooms for family members. The single-party telephone line allowed for direct, immediate and private communication at a distance. Modern roads and mass-produced automobiles made private travel possible. Television and radio brought news and entertainment into private homes. (Lester, p. 39, www.theatlantic.com/issues/2001/03/lester-p4.htm )

As technology advances, it will become harder to distinguish between the physical world and the online world. For example, it is now possible to directly collect digital records of biometric and genetic exoinformation on a large scale. We have already seen expanded use of face recognition and fingerprint recognition technologies. The implications for privacy are well documented (though often ignored or discounted in favor of some other desirable efficiency).

We cannot count on others to look after our privacy when there are incentives for government and private industry to take advantage of technologies that threaten privacy in the name of efficiency. But technology can also let people become more capable of taking responsibility for protecting their own privacy. In a society that values the unfettered exchange of ideas and information, it is impossible to guarantee absolute privacy protection. And why should we want to? Our most valuable societal asset is our ability to harness the free flows of information and use them to improve the human condition. Privacy is an evolving and deeply personal matter with no one-size-fits-all solutions. Our goal should be to furnish people with the knowledge to recognize their own optimum privacy posture and the tools that give them the ability to confidently, dynamically and unambiguously ensure that their expectations are manifested.

Acknowledgments

The author would like to acknowledge Gary Marchionini for his continued support and involvement in formulating the definition of exoinformation and the author's research framework. Many thanks also to James S. Wilson for taking the time to discuss and edit this paper and for his enthusiasm for privacy and freedom, which have given the author so much to think about.

Benjamin Brunk can be reached by e-mail at brunkb@ils.unc.edu


Copyright 2001, American Society for Information Science and Technology