By the late twentieth century, as personal computers—and then the internet—changed the way people interacted with technology, scientists had begun to pay particular attention to something they called human-computer interaction (HCI). First written about in the early 1980s, HCI employs a combination of psychology, computer science, and design to evaluate the two-way relationship between people and technology—not just how people use technology, but how technology might change because of how people use it.
The 1968 film 2001: A Space Odyssey depicted human-computer interaction between the HAL 9000 onboard computer and astronaut Dr. David Bowman (Keir Dullea). While the year 2001 has come and gone, some of the HCI technology imagined in the film is still ahead of what we have today. Image credit: Wikimedia Commons
HCI is one of the first academic specialties that fed into the modern user experience job market. Now that people can focus their studies directly on UX or interaction design, HCI can seem, to many, like a legacy field. But as our understanding of what makes a computer a computer continues to evolve, it may be more important than ever.
To learn more about HCI’s place in modern UX, we talked with Justine Aylmer, a UX Designer at Portland, Oregon’s Emerge Interactive, who earned a master’s degree in HCI at Carnegie Mellon University in 2009.
This conversation has been edited for clarity and length.
Jillian: What made you want to pursue your master’s degree in HCI?
Justine: After college I worked for a couple of years as a tech consultant, doing professional services for a software company. In that role, you’re tailoring software for what a customer needs, but you have no influence over capabilities. It’s frustrating. You see deficiencies but there’s nothing you can do.
I decided to go back to grad school based on what I saw every day: I was working on software applications that weren’t great, and I wanted to make them better. My undergrad studies were split between computer science and art, and UX seemed like a natural combination of those subjects: it involves some design, and you’re creating usable and useful applications that make people’s lives better. Nowadays there are lots of academic programs specifically in UX and/or interaction design, but when I went to grad school a decade ago there were far fewer options.
Jillian: How do you explain HCI to people who don’t get it?
Justine: Something I read once that made sense is that HCI is an academic field of study that feeds into UX careers. Someone compared it to how you study computer science but become a software engineer. You apply HCI techniques to a job doing UX. UX is an overall field, not really a discipline.
People who pursue HCI usually have a background in psychology, design, or computer science. If you’re a designer, the program isn’t turning you into an engineer or psychologist, but you should understand how they apply to your user experience work.
Jillian: Computers are created by and for humans, so why is it important to think of humans and computers as having a direct relationship?
Justine: Because we can’t focus only on the humans who produce computers; we also have to look at the users consuming the technology. It’s important to keep in mind that the whole point of UX work is knowing you don’t know the best way to do something and doing the research to come up with the right solution. That especially applies to software engineers, who are super smart people and often think they already know the best way.
Our mantra in grad school was “I am not the user.” It’s so important. We’re natural problem solvers, but we’re never going to get it right if we’re doing our work in a vacuum. So much of our success hinges on the research. What are users trying to accomplish, what are their environments? What’s going on in the background while they work? Factors like these are important and often overlooked.
Jillian: So what’s the research process that gets you there?
Justine: It totally depends, first and foremost, on budget and timelines. How much time and money do you have to do it? Do you do in-person contextual inquiries? Do you do phone interviews? Remote sessions?
Ideally, for each project you’re working on, you interview at least five people. More is obviously great. Fewer than that, and you’re not getting comprehensive data.
Then, it’s a matter of creating interview questions, making sure you capture the right information. There are lots of methodologies for extracting information.
Jillian: How do you analyze your research?
Justine: The most common method I use is affinity diagramming. I take all the disparate information I collected during the research process, put it on Post-its, and put those on a wall. Then I move the Post-its around and see where groupings and themes emerge from the sound bites. It’s pretty iterative: I spend a lot of time putting thoughts together, then deciding they go somewhere else.
I usually come out with overall themes, and key nuggets within each theme by looking at where user feedback overlaps and where it doesn’t. That makes it easy for clients to consume in a research report.
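The affinity-diagramming workflow Justine describes is manual and iterative, but the core move of sorting raw notes into emergent themes can be sketched in code. This is purely illustrative: the theme keywords and research notes below are hypothetical, and a real session relies on human judgment rather than keyword matching.

```python
# Illustrative sketch of affinity-diagram grouping: assign research notes
# ("Post-its") to themes based on shared keywords. Themes and notes are
# hypothetical examples, not from any real study.
from collections import defaultdict

THEME_KEYWORDS = {
    "navigation": {"menu", "search", "find"},
    "performance": {"slow", "loading", "wait"},
}

def group_notes(notes):
    """Assign each note to the first theme whose keywords it mentions."""
    themes = defaultdict(list)
    for note in notes:
        words = set(note.lower().split())
        for theme, keywords in THEME_KEYWORDS.items():
            if words & keywords:
                themes[theme].append(note)
                break
        else:
            themes["unsorted"].append(note)  # revisit on the next pass
    return dict(themes)

notes = [
    "Couldn't find the export menu",
    "Pages are slow to load on mobile",
    "Search results felt irrelevant",
]
print(group_notes(notes))
```

In practice the "unsorted" pile is the interesting one: it forces another iteration where themes get split, merged, or renamed, much like physically moving Post-its around on the wall.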
Jillian: HCI first came on the scene in the 1980s, when many people didn’t own home computers and smartphones hadn’t even been dreamed of. Knowing this, how has HCI changed as our definition of “computer” has expanded and evolved, for example with the introduction of mobile tech and AI?
Justine: HCI is definitely old, but I like the legacy aspect of it. It grounds it. A lot of the UX world is buzzword-y, but HCI legitimizes the field because it’s something based in history and academia.
One of the people I follow is Jakob Nielsen, one of the leaders of old-school usability consulting. He established heuristics for good usability design: ground rules you should cover if you’re trying to design well. He defined these in the 1990s, but I use the principles all the time. They capture very fundamental aspects of what makes something usable. There’s something to be said for not throwing out everything that’s old.
It’s an interesting transition we’re going through. Obviously things are moving to mobile, and we have multiple devices. We’ve been presented with a new challenge: how do we effectively design for all these experiences, including the ones we aren’t even using yet? We’re learning how to accommodate different types of design, environments, and screen sizes: watch, phone, tablet. They’re still limited, but the more they expand, the clearer it becomes that design can’t be based on screen size. We have to be ready to accommodate whatever comes in the future.
Jillian: A recent research study observing human-robot interaction revealed that interaction was more positive when the robots made occasional mistakes. Although one of the values of using computers in a business setting is that they remove the risk of human error, could the success of human-computer interaction in the long term hinge on making machines more relatable instead of more perfect?
Justine: There’s a fear people have of an inevitable robot takeover, and I think that’s legitimate. There’s a healthy anxiety: this is really cool, but what are the consequences of adopting this technology? Incorporating the ability to make mistakes into a machine that wouldn’t otherwise make them is a pretty genius way to make people feel more comfortable, and down the road you can take it away. Measures that reduce anxiety are important up and down the line. Think about the error message on a website that doesn’t load: the more you can personalize it, the more these computer processes are humanized and feel friendly. Research shows that people have more positive interactions when it’s not just jargon.
Jillian: How do you use your background in HCI to help think not just about user experience in current technology, but in what comes next?
Justine: It’s always a challenge. There’s a famous quote attributed to Henry Ford: “If I had asked people what they wanted, they would have said faster horses.” If you’re just asking what people need now, you’re not getting to their actual needs. It’s more about observing and understanding pain points people don’t know they have. That’s the holy grail: you observe somebody going through a process that’s tedious or hard, and they don’t realize there’s a way to take that pain point away.
The only way you can do this is to observe people’s behavior instead of just seeing how they use specifically what you’re designing or redesigning. Beyond that it’s tricky. You can kind of guess.
I’ve always been interested in FUI, the fake user interface or fantasy user interface. It’s what you see in movies where someone invents something fun and you can say, “wouldn’t it be cool if…” It’s nice to see where someone’s imagination can take tech without being grounded in whether it’s useful or even possible. The advertising technology depicted in Minority Report can influence where people think they should be in 10 to 20 years, and that influences research and design.
Representation is also important when we think about creating the future. When you don’t think about how technology impacts all users, you get huge snafus, like the facial-recognition technology that couldn’t recognize Black faces. That’s why it’s really important to create user personas, design for them, and stay true to them.
When I was designing for oil and gas, the reality was that most of the users were men. Colorblindness is more common in men, so we had to take that into account: we made sure no visual alarm indication relied solely on color. More representation is always better.
The Future of HCI and UX
Computers are everywhere. They’re not just on our desks or in our pockets: they operate our microwaves and adjust our thermostats. Cars, Aylmer said, are “just giant computers at this point.” And with the widespread adoption of smart home technology and recent leaps in virtual reality, computers will be woven into human existence more deeply than ever, so it will be all the more important for designers and engineers to think about how people interact with them. HCI may have roots in academia, and it may predate modern technology, but it’s an important force behind UX professionals who are focused on the future.