Seeing the Value in the Social Impact of Design

We’re all familiar by now with the amazing capabilities of information technologies and automated systems. But few people stop to think about how these same systems impart values and embed them in our lives.

That was the subject of an intriguing workshop on “value-sensitive design” held last week at the University of Washington in Seattle. Hosted by professors Batya Friedman and Alan Borning and sponsored by the National Science Foundation, the workshop brought together computer scientists, philosophers, social scientists, software designers and humanists from academia and business. The participants discussed how to integrate value choices into the design of technical systems, choices that can be intertwined with everything from privacy to corporate power structures.

Some large companies now employ anthropologists, social scientists, psychologists and even artists to help them think through the implications of design choices. A few companies, including Microsoft, have recently added a “chief privacy officer” to their organizational charts.

At the Seattle conference, for example, Jonathan Grudin of Microsoft Research described his investigation of how companies use enterprise-wide calendaring systems.

Microsoft uses a calendaring system that lets employees see the daily calendars of other staff members, but the software reports only whether someone is available or not; it doesn’t display details of the person’s daily activities.

Sun Microsystems, on the other hand, uses its own proprietary calendaring software that lets employees see all the details of another employee’s schedule, such as where someone is, who they’re meeting with and the subject of the meeting. Grudin said the employees of each company tend to be strongly attached to the way their company displays and uses work calendars.

At Microsoft, the calendaring system was crafted chiefly by programmers and researchers, who were averse to revealing the details of their schedules, Grudin said. At Sun Microsystems, the deliberate inclusion of clerical and administrative staff in the design process led to the display of such details because these employees believed that they could be more efficient if they knew what other people were doing and where they were.
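
To see how small the technical difference is, consider a minimal sketch, in Python, of a calendar display routine governed by a single visibility policy. This is not either company’s actual code; the class, field and policy names are all hypothetical.

from dataclasses import dataclass

@dataclass
class Appointment:
    start: str
    end: str
    location: str
    attendees: list
    subject: str

def render_for_coworker(appt, policy):
    # "free_busy" mirrors the Microsoft-style choice: the time slot is
    # shown as taken, nothing more. "full_detail" mirrors the Sun-style
    # choice: where, with whom and about what.
    if policy == "free_busy":
        return f"{appt.start}-{appt.end}: busy"
    if policy == "full_detail":
        return (f"{appt.start}-{appt.end}: {appt.subject} at "
                f"{appt.location} with {', '.join(appt.attendees)}")
    raise ValueError(f"unknown policy: {policy}")

meeting = Appointment("10:00", "11:00", "Building 9",
                      ["Lee", "Ortiz"], "budget review")
print(render_for_coworker(meeting, "free_busy"))     # 10:00-11:00: busy
print(render_for_coworker(meeting, "full_detail"))

The privacy decision lives in a single branch of the code; whoever chooses the policy chooses the value.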

Who held power over the design made the difference, and as a result the value choices came out quite differently.

There are many other examples of this phenomenon in the computing field. In the United States, we have an entire class of software applications known as “expert systems.” These are typically sophisticated databases of specialized knowledge with a user interface that imitates the way one asks questions of an expert.

In Scandinavia, however, programmers and designers have developed what they call “systems for experts.” These are information systems designed to augment and support the judgment of an expert, not replace it. A simple twist of terminology reveals a stark contrast in values: Is expertise something to be replaced by machines, or something to be valued and enhanced?
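
The contrast can be sketched in a few lines of Python. This is an illustration only, with invented rules and field names, not code from any real product:

def expert_system_diagnose(symptoms):
    # An "expert system": the machine issues the verdict itself.
    if "fever" in symptoms and "cough" in symptoms:
        return "diagnosis: influenza"
    return "diagnosis: unknown"

def system_for_experts(symptoms, case_library):
    # A "system for experts": the machine gathers comparable cases
    # and leaves the judgment to the human expert.
    similar = [case for case in case_library
               if set(symptoms) & set(case["symptoms"])]
    return {"matching_cases": similar,
            "note": "review and decide; the system issues no diagnosis"}

cases = [{"id": 1, "symptoms": ["fever", "fatigue"]}]
print(expert_system_diagnose(["fever", "cough"]))  # the machine decides
print(system_for_experts(["fever"], cases))        # the human decides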

There are also portentous controversies over design and values. In the legal case over the music-sharing program Napster, for example, federal Judge Marilyn Hall Patel said Napster was designed to violate copyrights; its illegality is inherent in its technical design, she suggested. But a consortium of large companies and industry associations, including Napster’s critics, objected to this sweeping characterization, because Napster’s peer-to-peer file-sharing design may be useful for legal activities.

We’re also struggling over the politics of designing privacy into Web sites and other online services. Should Web pages that gather information from users offer an “opt-in” choice, in which users are added to a database that might be sold to third parties only if they actively say yes, or an “opt-out” choice, in which they are added unless they actively say no? Either way, the automatic default (what happens when the user does nothing) embodies a value choice, and that choice is likely to determine the economic value of the information collected.
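
In code, the entire controversy can come down to one default value. Here is a hypothetical sign-up handler in Python; the constant and field names are invented for illustration:

# If the user never touches the checkbox, this default applies.
SHARE_WITH_THIRD_PARTIES_DEFAULT = False   # opt-in: silence means no
# SHARE_WITH_THIRD_PARTIES_DEFAULT = True  # opt-out: silence means yes

def handle_signup(form_fields):
    share = form_fields.get("share_data", SHARE_WITH_THIRD_PARTIES_DEFAULT)
    return {"share_data": share}

# A user who does nothing:
print(handle_signup({}))  # {'share_data': False} under the opt-in default

Flip that one constant and the database fills by default.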

At the Seattle workshop, the participants discussed whether there are ways to formalize these evaluations of value choices in the design of technical systems. There are a variety of techniques that need more work, such as the “participatory design” approach, in which a system’s users are involved in its design process.

The biggest obstacle to a complete consideration of value choices in technical design is speed to market. Most companies just don’t have, or won’t take, the time to ponder the social impact of their designs. There is also an unspoken but strong tradition in engineering that attention to values is “soft and fuzzy,” not a subject fit for engineers.

“Why is value-sensitive design important now?” the Seattle workshop audience was asked. Because we’re setting technical standards that could last a generation, said one participant. Another answered that, unlike in the past, today a single programmer or a small group of developers can influence the behavior of millions of people. It’s therefore imperative that we press for accountability and ethics.

And yet another speaker said that, given the importance and omnipresence of technology today, technical design decisions are increasingly substituted for what were once issues of public debate and politics.

That’s why we have an emerging and increasingly urgent “politics of design,” politics with no candidates, campaigns or slogans, but politics with serious consequences.

*

Gary Chapman is director of the 21st Century Project at the University of Texas at Austin. He can be reached at gary.chapman@mail.utexas.edu. Recent Digital Nation columns are available at https://www.latimes.com/dnation.
