Sara Bly
User Experience Consultant

Sara Bly is a user experience consultant who specializes in the design and evaluation of distributed group technologies and practices. In addition to holding a Ph.D. in computer science, Sara is one of the pioneers in developing rich, qualitative observational techniques that inform technology design through the analysis of group interactions and activities. Prior to becoming a consultant, Sara managed the Collaborative Systems Group at Xerox Palo Alto Research Center (PARC). While at PARC, Sara was part of the 1980s Media Space project, a forerunner of today’s video conferencing and social networking services in which video, audio, and computing technologies were uniquely combined to create a trans-geographical laboratory. She was also part of ground-breaking work on shared drawing, awareness systems, and systems that used non-speech audio to represent information.
JP: Sara, tell us about your work and what especially interests you.
SB: I’m interested in the ways that qualitative studies, particularly based on ethnographic methods, can inform design and development of technologies. It’s so important to look not only at the user interface but at the entire context of the activity and how technology fits in. My work spans the full gamut of user-centered design, from early conceptual design through iterative prototypes to final product deployment, and a wide range of projects from complex collaborative systems to straightforward desktop applications. I very much enjoy uncovering how people do what they do and, at the same time, keeping up with what’s happening with new technologies. Fun projects have included studying how and where people read (including in bed), what families do with home media centers, and whether physical devices might help remote co-workers maintain presence.
JP: Why do you think qualitative methods are so important for data gathering?
SB: I strongly believe that technical systems are closely bound with the social setting in which they are used. An important part of design and evaluation is to look ‘beyond the task.’ Too often we think of computer systems in isolation from the rest of the activities in which the people are involved. It’s important to be able to see the interface in the context of ongoing practice. Usually the complexities and ‘messiness’ of everyday life do not lend themselves to constraining the data gathering to very specific and narrow questions. Qualitative methods are particularly helpful for exploring complex systems that involve several tasks, embedded in other activities that include multiple users.
JP: Can you give me an example?
SB: I was part of a team exploring how people encounter and save published material in the form of paper and electronic clippings. We conducted twenty artifact interviews in homes and offices. We weren’t surprised to find that everyone has clippings of some form and they often share them. However, we were somewhat surprised to find that these shared clippings did more than provide a simple exchange of information. In fact, the content itself did not always have immediate value to the recipient. The data that particularly intrigued me was that the clippings could be a form of social bonding. Several recipients described some of their clippings as an indication that the giver was ‘thinking of’ them. This came from the open-ended interviews we had with people who were describing a range of materials they read and clippings they receive.
(Further information about this work is in Marshall and Bly, 2004.)
JP: Collaborative applications seem particularly difficult to understand out of context.
SB: Yes, you have to look at collaborative systems integrated within an organizational culture in which working relationships are taken into account. We know that work practice impacts system design and that the introduction of a new system impacts work practice. Consequently, the system and the practice have to evolve together. Understanding the task or the interface is impossible without understanding the environment in which the system is or will be used. Although early projects focused on “work” activities, it’s been very exciting to see the growth of more personal social networking sites in the last several years. The technologies and practices are definitely co-evolving; who could have guessed the impact of 140-character exchanges?
JP: Much of what you’ve described involves various forms of observation. How do you collect and analyze this data?
SB: It’s important that qualitative methods are not seen as just watching. Any method we use has at least three critical phases. First, there is the initial assessment of the domain and/or technology and the determination of the focal points to address. Second is the data collection, analysis, and representation, and third, the communication of the findings to the research or development team. I try to start with a clear understanding of what I need to focus on in the field. However, I also try hard not to start with assumptions about what will be true. So, I start with a well-defined focus but not a hypothesis. In the field (or even in the lab), I primarily use interviews and observations with some self-reporting that often takes the form of diaries, etc. The data typically consist of my notes, the audio and/or videotapes from interviews and observation time, still pictures, and as many artifacts as I can appropriately gather, e.g. a work document covered with Post-its, a page from an old calendar. I also prefer to work with at least one other colleague so that there is a minimum of two perspectives on the events and data.
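To make the shape of such a field record concrete, here is a minimal sketch in Python of how one might organize the materials Bly describes from a single session. The class and field names are illustrative assumptions, not part of any tool she mentions:

```python
from dataclasses import dataclass, field

@dataclass
class FieldSession:
    """One interview/observation session and the raw materials it produced.

    Names and structure are hypothetical, for illustration only.
    """
    session_id: str
    participant: str                                 # who was interviewed or observed
    observers: list                                  # at least two, so multiple perspectives exist
    focal_points: list                               # the well-defined focus set before fieldwork
    notes: list = field(default_factory=list)        # researchers' written notes
    recordings: list = field(default_factory=list)   # paths to audio/video files
    photos: list = field(default_factory=list)       # still pictures
    artifacts: list = field(default_factory=list)    # e.g. an annotated work document

# Example: one home visit in a hypothetical clipping study
session = FieldSession(
    session_id="S01",
    participant="P01",
    observers=["researcher_a", "researcher_b"],
    focal_points=["how clippings are encountered", "how clippings are shared"],
    notes=["Keeps a clipping from her sister pinned to the refrigerator."],
    artifacts=["refrigerator_clipping_photo.jpg"],
)
```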
JP: It sounds like keeping track of all this data could be a problem. How do you organize and analyze it?
SB: Obviously it’s critical not to end with the data collection. Whenever possible, I do immediate debriefs after each session in the field with my colleagues, noting individually and collectively whatever jumped out at us. Subsequently, I use the interview notes (from everyone involved) and the tapes and artifacts to construct as much of a picture of what happened as possible, without putting any judgment on it. For example, in a recent study six of us were involved in interviews and observations. We worked in pairs and tried to vary the pairings as often as possible. Thus, we had lots of conversations about the data and the situations before we ever came together. First, we wrote up the notes from each session (something I try to do as soon as possible). Next we got together and began looking across the data. That is, we created representations of important events (tables, maps, charts) together. Because we collectively had observed all the events and because we could draw upon our notes, we could feed the data from each observation into each finding. Oftentimes, we create collections, looking for common behaviors or events across multiple sessions. A collection will highlight activities that are crucial in addressing the original focal points of the study. Whatever techniques we use, we always come back to the data as a reality and validity check.
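As a rough illustration of this ‘collections’ step, the sketch below groups coded observations from multiple sessions by a shared behavior tag, so recurring events surface across the whole data set. The observations and the tagging scheme are invented for illustration (the tags echo the clipping-study findings discussed above), not taken from an actual study:

```python
from collections import defaultdict

# Hypothetical coded field notes: (session_id, note, behavior tags).
observations = [
    ("S01", "Clipping kept on the refrigerator as a reminder of the sender",
     ["social_bonding"]),
    ("S02", "Article mailed to a friend 'because it made me think of you'",
     ["social_bonding", "sharing"]),
    ("S03", "Recipe clipped and filed away for later use",
     ["information_value"]),
]

def build_collections(coded_observations):
    """Group observations by behavior tag across all sessions."""
    collections = defaultdict(list)
    for session_id, note, tags in coded_observations:
        for tag in tags:
            collections[tag].append((session_id, note))
    return collections

# Tags that recur across several sessions point at the crucial activities.
for tag, items in build_collections(observations).items():
    sessions = sorted({session_id for session_id, _ in items})
    print(f"{tag}: {len(items)} observation(s) in sessions {sessions}")
```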
JP: Is it difficult to get development teams and managers to listen to you? How do you feed your findings back?
SB: As often as possible, research and development teams are involved in the process along the way. They participate in setting the initial focal points for gathering data, occasionally in observation sessions, and as recipients of a final report. My goal with any project is to ensure that the final report is not a handoff but rather an interactive session that offers a chance to work together on what we’ve found.
JP: What are the main challenges you face?
SB: It’s always difficult to conduct a field study with as much time and participation as would be ideal. Most development cycles are short and collecting the field data is just one of many necessary steps. So it’s always a challenge to do a qualitative study that is timely, useful, and yet based on solid methodology. The real gnawing question for me is how to get valuable data in the context of the customer’s own environment and experience when either the activities are not easily observable and/or the system is not fully developed and ready to deploy. For example, a client recently had a prototype interface for a system that was intended to provide a new approach to person-to-person calls. It was not possible to give it to people to use outside the lab, but using the interface only made sense in the context of actual real-world interactions. So, while we certainly could do a standard usability study of the interface, this approach wouldn’t get at the questions of how well the product would fit into an actual life situation. What kinds of data can we hope to get in that situation that will inform us reliably about real-world activity? Of course there are always ‘day-to-day’ challenges; that’s what makes the work so much fun to do! For instance, in the clipping study mentioned earlier, we expected that people would be likely to forget many of their clippings. How do we uncover the forgotten? We pushed to look at different rooms and different places (file drawers, piles by the sofa, the refrigerator), often discovering a clipping ourselves that we could then explore in conversation. We didn’t want to rely on asking participants to predetermine what clippings they had. In a more recent study of reading, one of our participants regularly reads in bed. How do we gather realistic data in that situation while not compromising our participant? In this case, we set up a video camera that our participant turned on himself. He just let a tape run out each day so that he could fall asleep as normally as possible.
JP: Finally, what about the future? Any comments?
SB: I think the explosion of digital technologies is both exciting and overwhelming. We now have so much new information constantly available and so many new devices to master that it’s hard to keep up. The digital home is now even more complex than the digital office. This makes design more challenging: there are more complex activities, more diverse users, and more conflicting requirements to pull together. To observe and understand this growth is a challenge that will require all the techniques at our disposal. I think an increasingly important aspect of new interfaces and interaction procedures will be not only how well they support performance, satisfaction, and experience, but how well a user is able to grasp a conceptual model that allows them to transition from current practices to new ones. ■