
Sunday
Jun 26, 2011

A Dangerous Idea

This past Friday I attended Bolt | Peters' User Research Friday, and a couple of the presentations stood out. One was a talk by frogdesign's Associate Strategy Director Ben McAllister, based on his recent Atlantic Monthly article, "The 'Science' of Good Design: A Dangerous Idea." Evidently McAllister got some flak from folks who read him as anti-research or anti-science when, in fact, he is a researcher; what he opposes is pseudo-science: designers or design researchers trying to give their arguments oomph with scientific-sounding language, as if Science were Certain. He refers to the mathematician William Byers' "science of certainty" and the economist F. A. Hayek's term "scientism," both dealing with the idea that science is not certain: it is an ever-evolving body of work moving toward certainty, with some ideas more certain than others, but nothing immutable.

The Dangerous Idea? That research can be used to provide easy answers. The talk covered a topic I often raise with my students - the temptation to persuade with scientific-sounding language rather than with a courageous, well-argued rationale: instinct informed by solid research, iterative testing, and analysis.

McAllister starts with a cogent dissection of the word "strategy," which comes from the Greek "στρατός" (stratos), or army (that which is spread out) and "ἀγός" (agos), or leader. Strategy is, in other words, "leading that which is spread out." Leadership in the face of uncertainty, ambiguity.

Humans don't like uncertainty and ambiguity, so the impulse to find easy answers is a strong one. Without the uncertainty, however, there is no need for real leadership. What we're left with, McAllister says, is merely following directions. To provide true leadership, rather than easy answers, is to face the ambiguity. This is the act of courage required of the designer.

I made a LiveScribe "pencast" of the talk, which you can view here. Click on the handwritten notes to hear the presentation, and you can click anywhere (provided you can read my handwriting!) to skip to different parts of the talk:

Update: Nate Bolt has put this pencast along with Ben's slides up on the Bolt | Peters site, so you can listen to the pencast as you flip through the slides. Ben's slides are delightfully quirky, however, so I defy you to figure out exactly which slides, in some cases, go with which part of the talk. But you'll get the idea.
Saturday
Sep 12, 2009

Analytical Toolsets

Here is the set of tools for analysis of research data that John Payne presented at EPIC 2009. He ran a workshop in which we discussed and refined this process. I was especially interested, as I had come to the same conclusion as John—that there are few who have assembled an organized and comprehensive way to analyze research results. I had begun to assemble a kit of tools of my own:

In my previous post I showed the "Tool Picker" for helping design students decide which research methods to use. The right-hand edge of that diagram, containing the list of methods, is shown above. The question: after you use the prescribed set of methods in the field, how do you make sense of what you've found?

I have been putting together a set of tools gathered from my own experience and the experience of others (such as the good folks at the Institute of Design at IIT, Dori Tunstall, Lloyd Walker, Andy Ogden, among others). This is the "Insights : Opportunities" deck we've been using in my Design Investigations course. The intent is that, with the use of a variety of "lenses" through which to look at the data, the conclusions will be more robust. I've been very pleased with the results. Where before, students finished their research presentations with a single slide containing three or four bullet-point conclusions, they are now concluding with ten or twelve slides, each pointing out a viable design opportunity that derives from an insight from the research.

When I saw John's Analysis / Synthesis Palette at EPIC, I was fascinated. He is coming at the same problem from a completely different direction. I am using the metaphor of a group of individuals looking at the research data, each with a different point of view. John is looking at the process itself, and creating, in a wonderfully methodical way, different ways to arrange, sift, compile, deconstruct, and recombine the data, winding up with prescribed directions.

I will be looking over my notes for some time, to decide how I will change what I'm doing based on his approach.

 

Saturday
Sep 5, 2009

The Insight : Opportunity Deck

Research is worthless unless it fuels the design process. Once the fieldwork is done, we need additional tools to help us make sense of what we've got. I have been using a variant of the KJ Method (developed by Jiro Kawakita in the 60s, similar to Affinity Diagrams) for years in my course, but recently I've begun to beef up the process by which we analyze the results. I've begun to assemble a deck of analytical aids to help guide students' thinking into areas they might not automatically consider. I've found many methods in use for fieldwork and am developing an aid to reduce the complexity of navigating that decision (discussed in the previous post), but to date I haven't found many aids for the analysis process itself.

In practice, designers always work in a multidisciplinary team and research findings are interpreted by a number of different specialists: designers, human factors engineers, anthropologists—the list varies according to the needs of the project. In student work and also in small design firms, those multiple viewpoints may not exist. The deck consists of lists of questions that we can "ask" the data—questions that an anthropologist might ask, or a cognitive scientist, or an engineer, or a management consultant.


Students stand in front of the wall of data and work their way through the deck, each card acting as a lens through which they view the data. The deck is in two parts: an insight deck and an opportunity deck. The first part helps reveal important insights that might fuel design opportunities. We work slowly and methodically through the deck, making an effort to find—even force—connections between the questions and the data.


The insights are listed, mapped, or arranged in diagrams, as needed. The second deck is used to create and validate the design opportunities represented by each insight.

This process takes two or three weeks, at least. At the end, we link the insights to opportunities for design intervention, seeking quantity, quality, depth, and range: products, experiences, and business models from near term to blue sky, mild to wild. Our aim is to present our clients with a robust set of insight : opportunity pairs, hooking each opportunity to the insight that inspired it.
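
If it helps to see the shape of what we're after, here is a minimal sketch, in Python, of the record each pairing amounts to. The field names and the example pair are hypothetical, invented purely for illustration; they are not taken from the deck itself.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an "insight : opportunity" pair.
# Field names and example content are invented for illustration.

@dataclass
class Insight:
    statement: str                                # what the research revealed
    evidence: list = field(default_factory=list)  # supporting field notes, quotes, photos

@dataclass
class Opportunity:
    description: str      # the proposed design intervention
    kind: str             # e.g. "product", "experience", "business model"
    horizon: str          # e.g. "near term" through "blue sky"
    inspired_by: Insight  # the insight this opportunity hooks back to

# One example pair (purely invented):
insight = Insight(statement="Participants improvise reminders out of objects left by the door")
opportunity = Opportunity(
    description="A doorway 'launch pad' that surfaces reminders on the way out",
    kind="product",
    horizon="near term",
    inspired_by=insight,
)
print(opportunity.description, "<-", opportunity.inspired_by.statement)
```

The point of the structure is simply that no opportunity stands alone: each one carries a pointer back to the insight that inspired it.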

This is a work in progress. Last week at EPIC 2009 I took part in an amazing workshop with John Payne from MomentDesign, who showed us an analysis framework he's been developing, and based on that excellent session (which I hope to cover in an upcoming post) I know I will be developing this further.
I'll be presenting this work at the IDSA National Conference in Miami in a few weeks. If any of you are attending, I'd love to have the opportunity to show you more and get your feedback. See you there!

 

Friday
Sep 4, 2009

Designing Design Research

By way of an explanation for why I haven't posted lately, this last term was consumed by two projects: finishing the plan for what I've come to call the "tool picker" (above) to help designers new to qualitative research expand their palette of methods, plus a set of analytical tools to use on the research data.
This, on top of a term of research for a multi-term project for the American Red Cross, kept me busier than a dot painter in a paisley tie factory. I'll post more on all of this in the upcoming weeks.

The so-called "tool picker," above, is an attempt to help designers explore beyond a set research methodology. As currently taught (and sometimes practiced), design research is often treated as a fixed set of tools and, as a result, students tend to think of it as a standard process. The field of design research has evolved into a complex landscape of approaches, however, and good design practice stays abreast of these developments.

In order to help my students break out of a narrow approach and yet negotiate the complexity of the myriad methods in practice today, I am attempting to acquaint them with a comprehensive and yet manageable set of methods. Also, I need to equip them with an understanding of why, and in which situations, a particular approach would be effective.

Currently, the research approach is chosen by those with expertise. There is a "guru" who brings years of experience to bear on the decision. Is there a way to enable beginners to more quickly gain the experience necessary to know which approach might be best for a given problem?

I distilled the complex set of approaches in use today into a set of eighteen (you see them down the right-hand side of the diagram, above). I will be creating a decision-making tool to guide the students through the decision process by asking a series of questions about what type of knowledge they seek for a given topic.

Starting at the left-hand side with a careful choice of topic, students are asked to generate a research objective statement. We discuss issues of ethics, scope, appropriateness, and so forth, and gain explicit knowledge of the researcher's bias.

Moving on to the decision process (while at the same time specifying which sorts of participants will be recruited and beginning the recruitment itself), students consider the type of knowledge they seek. We distinguish three general areas of knowledge about the user: what they do, what they feel, and who they are. Moving rightward through the diagram, you can see how we move into finer levels of discrimination, arriving at a recommended set of methods.
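
For those who like to see the logic spelled out, here is a minimal sketch, in Python, of the kind of branching the tool picker encodes. The questions, answers, and method names below are placeholders I've invented for illustration; the actual diagram distills eighteen methods and makes much finer distinctions.

```python
# Hypothetical sketch of the tool picker's branching logic.
# Questions, answers, and method names are placeholders, not the
# actual contents of the diagram (which distills eighteen methods).

DECISION_TREE = {
    "question": "What kind of knowledge do you seek about the user?",
    "branches": {
        "what they do": {
            "question": "Do you need to observe behavior in its real context?",
            "branches": {
                "yes": ["shadowing", "contextual inquiry"],
                "no": ["diary study", "activity logging"],
            },
        },
        "what they feel": ["in-depth interviews", "projective exercises"],
        "who they are": ["cultural probes", "participant self-documentation"],
    },
}

def recommend(node, answers):
    """Walk the tree with a sequence of answers; return the recommended methods."""
    for answer in answers:
        if not isinstance(node, dict):
            break
        node = node["branches"][answer]
    return node

if __name__ == "__main__":
    print(recommend(DECISION_TREE, ["what they do", "yes"]))
    # -> ['shadowing', 'contextual inquiry']
```

Walking the structure with a set of answers lands you on a recommended set of methods, the same left-to-right movement the diagram asks of the student.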

This is a first rough design for the tool. When I completed this version, I was disheartened at first to see that, working backwards through the chart, a skilled researcher could use any of the tools to uncover any of the types of knowledge desired. But I reminded myself that this is a decision tree that helps beginners and widens their view beyond a limited, single-thread process. The tool is designed to lead them to the most appropriate choice, by no means the only choice possible. Once they've used the tool for a few projects, they will begin to gain knowledge of the wider set of approaches and to see how the different methods work in different cases. Once they see that the tools can actually be tailored to many purposes, they are right where I want them: imbued with a robust working knowledge of the multivariate research process.

 

Wednesday
Dec 31, 2008

What I'm Up To

Research wall from Camp Boomer, a three-term research project on Baby Boomers entering retirement, by Laura Dye and Heather Emerson, back when they were my students.
I'm two-thirds of the way through my MSID in design research at Art Center, and I feel the need to take stock of where I am. I've been teaching design research to product design students at Art Center since 1991, but since starting down the path toward this additional degree I have been covering some interesting ground. Here's an update.

My goal is to be able to teach product design students how to do credible and effective qualitative design research. Most product designers are at first focused on the methods, like we would be on any set of tools. Give me the tools, and I'll use 'em. I think this comes from how we learn the design process. It is a standard sequence—investigation, problem definition, ideation, concept generation, concept refinement, final design specification. We learn it by doing it, over and over. We expect that any problem can be solved by the application of this process, and for the most part this is true.


The investigation stage, however, has its own set of tools (methods), borrowed from science, psychology, anthropology, etc., and there is no standard set that applies to all situations. It is important to know not only the methods that are out there, but also the rationale behind their application. And nobody has a complete list. For example, Brenda Laurel’s Design Research cites 36; the Design and Emotion Society’s Methods and Tools web site describes 57 (not all research—some of those are analysis); and IDEO outlines 36 research and 15 analysis tools in their Method Cards. After reviewing these and other sources and allowing for duplication, I have found 52 distinct techniques for research and 18 for analysis (and I've only begun to compile a list of those).
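
For what it's worth, the bookkeeping behind "allowing for duplication" is nothing more than a set union over the lists after normalizing names. The sketch below is illustrative only; the source and method names are placeholders, not the actual entries from those publications.

```python
# Illustrative only: the source and method names below are placeholders,
# not the actual entries from Laurel, the Design and Emotion Society, or IDEO.

sources = {
    "book_a": ["ethnographic interview", "fly-on-the-wall observation", "focus group"],
    "website_b": ["Fly-on-the-Wall Observation", "product diary"],
    "card_deck_c": ["Focus Group", "camera journal"],
}

def normalize(name):
    """Collapse naming differences so near-duplicates count once."""
    return name.strip().lower()

distinct = {normalize(m) for methods in sources.values() for m in methods}
print(len(distinct), "distinct techniques")  # prints: 5 distinct techniques
```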


Many design firms' initial experience with research comes from hiring a specialist. They observe the process that person uses for a particular investigation and assume that it is "the process" (as if, like design itself, design research had a universal process applicable to all situations). Some offices then polish up that process, give it a catchy name and a graphic veneer, and add it to the list of their firm's capabilities as a branded form of research, much as they began to offer engineering capability in the 80s. It's a way of making their firms more marketable. In the competitive environment of today's consulting offices, this is understandable and necessary.


The problem is that the research approach should differ depending on the issues under investigation. Good research takes into consideration the entire palette of methods available and chooses the right set to uncover the necessary knowledge in each situation. It's vitally important, then, to understand the rationale behind each choice.


And above all it is important that designers understand that qualitative research is not merely a kit of tools; it is an approach. At its heart is an immutable demand: to understand and have empathy with the point of view of all customers and stakeholders in a situation. To gain this understanding one must make smart decisions about which methodologies to employ. [I use the term methodology to mean the tool, or method, plus the rationale behind using it.]


So my goal is twofold: first, to acquaint my students with at least a basic set of methods, and second, to enable them to understand why, and in which situations, a particular one would be effective.


I continue to teach my course the way I've done it since 1991: using the time-honored project-based learning we're accustomed to—learning by doing. The students engage in fourteen weeks of field research and analysis (in some cases, more than one term's worth, as in Laura Dye and Heather Emerson's Camp Boomer project, above), culminating in a research presentation. They choose the topic and I advise them on approaches that would be effective. The problem with this is that the students, like the consulting firms I described earlier, often come away from the experience thinking that there is one way to do research.


To remedy this I have added a theoretical component that teaches the wider range of methods and their accompanying rationales. A survey of the methods is followed by learning the principles behind their application via the case study method. The cases are written specifically to teach design research, and each case centers on important axioms. Much like the case study method pioneered by the Harvard Business School, the cases provide opportunities for students to engage in discussions centered on the decision process involved. Instead of discussions about management theory, the cases I am writing focus on the decisions necessary for planning research activities. A range of cases allows students to act out the planning process—and choose approaches—for research that would apply to a variety of design problems.


So far, I've got that long list of methods and am working on descriptions of each of them (broken down into: a brief description, an example, the objective, the procedure, the rationale, advantages and limitations, and citations of references where one could go for more examples, papers by those who have used the approach, etc).
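
A hypothetical sketch of that per-method template, with field names paraphrasing the breakdown above (nothing here is the finished format), might look like this:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the per-method write-up; field names paraphrase
# the breakdown listed above.

@dataclass
class MethodDescription:
    name: str
    brief_description: str
    example: str
    objective: str
    procedure: str
    rationale: str
    advantages: list = field(default_factory=list)
    limitations: list = field(default_factory=list)
    references: list = field(default_factory=list)  # papers, prior uses, further examples
```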


I've got a few simple cases that I have used to teach basic axioms, and am working on some larger ones with research specialists from a couple of well-known firms. Both are excited about my doing this work, and although it's a tall order to flesh these out, it will be worth it.


While I started out like many product designers, focusing on finding "the right kit of tools," I have come to realize that the so-called tools are only a means to an end. What really matters is how smart you are at analyzing what you get from using them, and figuring out what it means.

 
