
Awkward Silences

Laura Klein on Building Products That Don't Cause Emotional Trauma

Sometimes, big tech does things that actually end up emotionally harming their users. How do we do better?

Carrie Boyd
/
January 30, 2019

It all started with a tweet.


JH and Laura were talking about this article, written by Gillian Brockell, about being served maternity ads after a stillbirth. It’s a heartbreaking story, and one that many internet users can no doubt empathize with. Algorithms don’t always handle “edge cases” well, and the people who design them can have too broad a definition of edge case—stillbirths, miscarriages, and other life tragedies are actually relatively common.

So how do the people responsible for building things, often many, many people for any given human experience, design for the possibility of unexpected outcomes? That is to say, for real life? JH and Erin thought this was a pretty interesting and important topic, so we took the conversation from Twitter to Awkward Silences.

[1:10] How we got here.

[3:50] We ask the big question. Why do bad things happen?

[6:10] Why don’t pregnancy apps account for miscarriage?

[6:56] The Hippocratic oath for researchers.

[9:21] “Edge cases” happen.

[11:15] The consequences of short-term thinking.

[12:41] Everyone makes mistakes, and we can learn from everyone else’s mistakes too.

[17:44] The internal mutiny to do the right thing, like the Google Dragonfly petition.

[18:56] Is the designer morally responsible for the actions of the product?

[21:41] Apps that do better by their users (hopefully) will do better in the market.

[24:43] How do you make a difference as an individual?

[25:28] The history of weekends.

[27:00] The power of telling your user’s story.

[29:16] How JH helped make users happy at Vistaprint.

[33:25] The difficulty of separating yourself from some of the big companies.

[36:32] We’re all gonna die, how does tech deal with that? (hint, not well)

[40:21] We wrap up. Laura’s not mad, she’s just disappointed.


Of Course “Edge Cases” Happen

What percentage of Facebook users will die one day? 100%. Yet, Facebook is still struggling to sort out exactly what happens when users pass away, and how their online presence will interact with those they’ve left behind. The mishandling of this unavoidable use case is traumatizing for users who are still actively on Facebook, yet is something—OK one of many things—they are still working on getting right.

While death is an inevitability, not every “edge case” is. Enter the gray areas where most design decisions are made. There is a lot of potential for human impact in the space between always and never, and many edge cases deserve more attention.

Part of the problem is we don’t know what we don’t know. If we can’t imagine possibilities outside the “normal,” the majority, or perhaps what’s most pleasant to think about, how can we design for them? Creating more diverse product and research teams, who bring different perspectives to the work, can help alleviate some of this. Doing research with more diverse sets of users is also a good way to bring more “edge cases” to the forefront of the research cycle. More voices and more contact with the people who use your products will improve your connection with all kinds of users, creating more opportunities to think outside whatever personal or organizational boxes we have and make truly great products.

Move Fast, Break Hearts

In the world of UX, product, and design, we’re used to moving fast. Design sprints and constant iteration have led to some great results for many teams. But the problem with constant short-term thinking is potentially overlooking a scenario that damages your users’ perception of your product for good.

The upside? Approaching research as an open-ended question, rather than a means to an end, allows you to investigate the way your users interact with your product or service more consistently. Sprints and constant iteration can be super helpful here: constantly talking to users builds a more robust, diverse understanding of possible user outcomes over time.

It Starts With Stories

So what do we do about all this emotionally damaging UX? For people who aren’t the final decision makers, sharing stories can go a long way. Most people in this field are capable of empathy and want to do right by the user. The problem is, when it’s just a number on a page, it’s pretty easy to ignore or brush off. “Oh? It’s only 5% of our users? We should focus on something that affects more of the population.”

But if it’s a big enough problem, or is causing significant pain to that 5% of users, it’s still worth talking about. Involving more of your stakeholders in the research process is the easiest way to make people aware of the problems users are facing. In the pod, Laura talks about a session in which an engineer who was sitting in fixed a long-standing bug on the spot. Watching the user struggle, knowing he could fix the problem easily and quickly, was just too much for him.

However, bringing stakeholders to the meeting isn’t always possible. The one thing that is always possible? Bringing that story to the table. And tweeting, of course.

About Our Guest

Laura Klein is the Principal at Users Know and the author of UX for Lean Startups and Build Better Products. She hosts her own podcast with Kate Rutter, What is Wrong with UX?, which goes best with cocktails.

While researching this article, we found this awesome episode of Reply All about other ways in which the internet can suck. If you found this episode of Awkward Silences interesting, you may like it too.

Transcript

Erin: This is Erin May.

JH: I'm John Henry Forster and this is Awkward Silences.

Erin: Silences. Hello everybody and welcome back to Awkward Silences. It's 2019. It's going to be a fantastic year. We're here, very lucky to be with Laura Klein, the author of Build Better Products and UX for Lean Startups. She's also the principal at Users Know and they have an awesome podcast, continuing our trend of getting all the best podcast hosts in the business to join our little podcast. Thank you Laura so much for joining.

Laura: Thank you for having me.

JH: Welcome.

Erin: So speaking of JH, welcome, welcome. This whole episode started on Twitter, as all great things do. It started with a back and forth about, maybe you've seen this viral tweet. A woman wrote an open letter to the big tech companies. She'd had a terrible experience getting served some ads that were not only not relevant, but were really terrible. JH, you want to tell a little bit more about the story of how we got here?

JH: Sure. I think we almost fell into the trap of social media somehow turning allies into enemies, but we've avoided that. Basically, I guess what we started to dig into is the underlying cause of why that happens. Like, why do these bad experiences pop up in these big platforms? As we peeled that back, I think there's a handful of layers to it, and it felt like something that was worthy of a bigger discussion. I guess it's 280 characters now, but either way, the brief back and forth is a little tougher to dig into a topic of that nature with. Laura and I were chatting and decided it'd probably be better to actually explore this with some richness and some context.

Laura: I think you're underselling it a little bit. I think that the true story here is that I posted something a little bit aggressive. You responded with something thoughtful and I attacked you for it. Again, apologies for that. I think you're right. I think that we are actually mostly on the same side of this and I think it's an important conversation to have.

JH: There's something tough about social media in the sense of, I think there's a lot of things that everyone will agree on, but the amount that you care about it varies from person to person, and sometimes you're not totally on the same level of intensity on a given issue. People can snap at each other a little, and I think that's across any issue. Like when you look at all the social justice stuff, or any, not to go crazy hate on all these things, but it happens across the board. I think a lot of these people believe the same things, like, are fighting for the same causes, but for whatever reason that format just makes it hard to see eye to eye sometimes.

Laura: It does. Part of it is because you'll get similar comments from folks who don't necessarily agree with you, because it's so short. It's such a short thing. I think it's very easy to have an argument with somebody else, which is what often happens for me: I'm not having the argument with you, I'm having the argument I think I'm having, but it's really with somebody who responded separately.

JH: There's a lot of straw men out there. To try to frame it as like a big fuzzy question, the point that Gillian was getting to about her son's stillbirth is: social media, and especially some of the bigger platforms, are about connecting people and sharing life moments, and the sad reality is that there are unfortunate parts of life and bad things do happen, and for whatever reason these platforms don't seem to handle this well. We wanted to explore, like, is that a ... they're not doing enough research, they're not talking to users to know about these cases. Is it a short term business decision where they're prioritizing money over the wellbeing of users, is it something more complicated? It seemed like a topic that the three of us could explore.

Laura: I'll jump right into it. I think it's all of those things. I think it's lack of diversity in the room. I think they're making tradeoffs explicitly for money without really understanding the impact that they're having on people. I think it's lack of user research. There are probably a dozen other reasons for it to happen. My goal, since I talk to a lot of product managers and UX designers, my goal is to reduce that as much as possible. I would like people to think more about the humans that they are potentially hurting with their products.

Erin: When you say reduce that, like what is that? We're talking about bad user experiences, we're talking about emotionally traumatizing, we're talking about inconvenient. What is it for the purposes of this conversation that we want to talk about trying to make better.

Laura: I think it's the really severe harm, and I'd count severe psychological harm as being pretty bad, especially when somebody can't avoid it, and trying to reduce the number of times that happens to people as much as possible. I'd like to get rid of them entirely. I would like it so that I'm not unintentionally served ads or, quote, features that cause me psychological trauma. I don't know that it can be avoided entirely. I think that there are some cases, like the one where they're serving a bunch of baby-related ads to somebody who just lost their baby. Some of those could be avoided, more of those could be avoided if we even thought about that as an issue. There's another story that's been going around a lot.

Laura: People have been looking at things like period trackers and pregnancy trackers and things like that, and talking about how they just don't account for the case of miscarriage, which happens all the time. It just isn't in them, just not an option, and people are struggling with that and it causes them problems and it makes them very upset. I think it goes a little bit beyond what I think of as a bad user experience. A bad user experience is, I tried to log into something and it's confusing, or I tried to use a product and it doesn't work the way that I think it does. I don't think of it as, I'm going to be reminded of one of the worst things that ever happened to me.

Erin: It's almost like a Hippocratic oath for designers and PMs, do no harm, do no psychological harm.

Laura: I mean, a lot of it is this over-reliance on what I'm going to call sort of stupid AI or stupid machine learning, or just stupid computers trying to pretend like they know what you want based on your past behavior. You're going to end up with this if you rely entirely on machines to make these decisions for people. I don't think it's a good experience.

Erin: How do you train the machines if the humans aren't even thinking about it?

Laura: Exactly. That's a whole other set of podcasts. If you've got people who wouldn't even think through this, the machines certainly aren't going to.

JH: I see it as two big hurdles. One, I think people, designers and product people and researchers, I think they probably know about many of these things, but I think they probably underrate how many people it affects and the severity of that experience. I think they miss the mark maybe on two angles. I think the other piece is, if you're able to get over that first hurdle and do the research and kind of quantify how many people are impacted and how serious it is, to the point where you want to take action, then I think it's hard to make the case that it's going to pay off in the long term for whatever business or metrics you need to optimize in the shorter term.

JH: I think it's like a two-pronged thing. There are two hurdles to get over. There's the first in just actually understanding and appreciating the scope of the problem, and then the second is actually being able to mobilize people internally to want to address it. I think that's part of the reason maybe it's so difficult. I'm not sure if I'm being too reductive and it's a simple way of saying it, but those feel like the biggest two things in my mind as I've thought about it.

Laura: Yes, I agree. I think that those are definitely two huge things. There may be even more of a cascade of suck there. I think there is the, they don't even think about it. I think there's the, we don't have the right people in the room and we're not doing the research, so we don't even think about these things existing. I've certainly been in design reviews where I have brought up what I thought were completely obvious, and we can call them edge cases. Edge cases happen quite a bit. It's not a complete black swan. It happens quite a bit. I've brought up what I thought were obvious edge cases, and everybody would go, could that possibly happen? Yes, of course it could happen. It happens all the time. How do you not know this? Because you haven't left the room.

Laura: You've got they don't even know its happening. You've got the they know it's happening, but they sort of undersell how bad it is, like, well, what will they do. They'll stop using the product, which honestly is maybe a perfectly reasonable answer if you're a startup. It's probably not a reasonable answer if you're Google or Facebook. Well, although a lot of people are now stopping using Facebook for similar reasons. Then you've got the we know it's a problem, we know how bad the problem is, but we don't really know how to fix it. Then there's the we know it's a problem, we know about the problem is, we actually know how to fix it, but we can't convince the person who's in charge of making the decisions that it's important because it makes us money. I feel like almost that last one and the funny thing, I mean it's not funny, but the sad thing is that that last one I would say is true of all sorts of things in UX.

Laura: That you've got UX designers and product managers who do care very deeply about a problem, any problem in the product, and they can't convince people to address it, that it's a big enough problem to fix. They just don't know how, or they don't try because they want to stay in their lane, or all of the people that they're dealing with are just sociopaths.

Erin: The old sociopath.

Laura: The old sociopath argument. I think most people aren't. It's just sometimes people are making tradeoffs, making what I consider to be bad tradeoffs.

JH: I think we just work in an industry that really values short term thinking. I think if you extend the time horizon on when you're thinking about the implications of some of these decisions, it's easier to make the case. If somebody has such a bad, like, marker moment, a term Erin and I were using before, such a bad marker moment that that's the thing that they're always going to think of when they think of your product or service, it's not hard to paint the picture of, well, eventually they stop using it. When anybody asks them about it, that's going to be the first thing they tell, and it's going to sound horrible, and you can see it cascade and start to unravel.

JH: I think there's just such a short term focus in a lot of the day to day, because in a lot of ways we praise that in technology. We like the design sprint and the faster feedback loops and being more iterative. Like, our muscles are all very short-term focused. When we try to zoom out and be like, well, what happens two years from now for some of these users? There's not as much of that day to day in terms of some of the conversation.

Laura: We could move fast to break someone's heart. Fantastic.

Erin: Right, now I think we could mend their heart fast too.

Laura: That's harder it turns out.

Erin: Sure. All I'm saying is, if these companies are actually taking these rapid feedback cycles seriously, and they can see the damage they're doing, and they care about it, and that cascading funnel of things that have to be right to do something is in place, you could in theory do a sprint around solving some of these issues.

Laura: I would take that. I would absolutely accept that, and I am the first person to say, I personally am not infallible. I know that's hard for me to admit, but I screw this stuff up all the time. I miss stuff. I feel like I've been doing this for long enough and talked to enough actual humans that I miss less of it than I used to, but I know that I've missed stuff in the past. I know that I miss stuff now, and I think you're 100 percent right. I think that the important thing there is to respond to it, and we don't all have to make the same damn mistakes. We can learn from other people's mistakes too. Like, now that this one is out there in the world, we can go, we're not going to make that particular mistake again.

JH: For sure. To get to the business piece, I know you've talked a couple times about, it's going to cost us money or whatever. I'm not sure I entirely believe that. I didn't mention this upfront, but part of the reason I was engaged in this conversation is my wife and I lost a pregnancy over the summer, and it was obviously a difficult experience, probably the most difficult experience of our lives. Leading up to that, since we were thinking so much about having a baby in our lives, we were being really frugal and saving for childcare and doing all this other stuff. What happened afterwards was, we spent a lot of money on stuff in those following weeks in retail therapy. We had a babymoon planned that turned out to not be a babymoon, and we went out to some very nice dinners and we bought ourselves whatever we wanted.

JH: I'm not saying that the retail therapy is the solution, but if these platforms were able to quickly adapt the AI and the machine learning that's trying to serve up these ads and stuff, and somehow gracefully navigate that, there actually may be a business case to be made that not only could you avoid this very traumatic, horrible experience for the user, you might actually be able to, it sounds gross, but profit off of it in that moment. If you actually get to know these use cases well enough, I think you can make a business case, and it doesn't have to be purely framed as, let's lose money to do the right thing, because I think that's a tough sell.

Laura: It is. First of all, let me just say, I am the last person to judge anybody for retail therapy for literally anything. I heartily endorse it. I think it's a fine model if you can afford it. I'm all for it. I want to separate out two different things here because I think those are really important points. I think the point you're making is extremely important: that there's the potential that caring about this stuff actually doesn't lose you any money, in that it could prevent loss of users. If you're actually sensitive to people's needs and you actually make them happy instead of incredibly sad, you will end up with longer term users. I think that that is often true, and so I think that's great.

Laura: I agree. I don't think of it as profiting from this horrible thing that happened to somebody; think about it as profiting from being sensitive to your users' actual needs and feelings and doing the right thing for them, which I'm okay with. There's another thing I want to bring up, though, which is, I think as companies, and this is entirely an opinion thing, as companies and as humans, we may need to say it's okay to lose a little money in order to not hurt people in a particular way. To not hurt people that much. As I was saying before, one of the things I want people to do is I want people to have to actively state, I know this is going to harm people and I am going to do it for money. You are welcome to do that, but what you are not welcome to do, in my opinion, is deny that that is what you are doing.

Laura: I think that we sometimes make it a little too easy to abstract that away. Like it's five percent of our user base. Okay, what are you willing to do to that five percent of your user base for money. I may draw the line at a different place than you do. That's fine. It's not great, but it's a thing that I think that we should have to actually ask ourselves as people.

Erin: I think we always want to go to the, you can have your cake and eat it too. You can do the right user thing and make money. What about the hard truth?

Laura: That's great.

Erin: That's great. Of course, if you can have it all, if you can win-win, everyone should try to do that all the time. But if you have to make the choice, are the big companies going to do that, and if not, whose responsibility is it? Is it the individual contributor, the excited designer with a conscience? Is it the squad lead who, if not the companies themselves, makes that choice? Is it an internal mutiny to do the right thing, one experience at a time?

Laura: One internal mutiny to do the right thing.

Erin: Like how does that actually happen? What does that look like?

Laura: Well, I think we're seeing some of it, honestly. I believe the Google Dragonfly project got shut down because people internally said no. I think we're seeing more of that. I always hate dumping all of my generation's failures onto the next generation, because I feel like that's a total cop out and I'm still out here working on it and trying to get better. That said, I am extremely optimistic about the new generation of workers who are coming in and actually seem to care about this stuff and think about it in ways that we didn't when we came up with the, we'll just put everything up for free and then benefit from advertising. I'm optimistic about that, and I'm optimistic that individuals are doing it.

Laura: The company is not going to do it, because the company doesn't have a conscience. Individuals within the company are going to do it. Honestly, I've always hated this idea that the designer's in charge of being the advocate for the user, because it lets literally everyone else in the organization off the hook and lets them just ignore the designer. That said, we're often the ones who are the closest, designers, PMs, we're the ones who are the closest. We're the ones who know about these things, we're the ones who have the ability to tell stories, to make these things real to the people who do get to make the final decision. I get it. You're a junior UX designer at whatever, you don't necessarily have the ability to go, no, we're not doing that. But you can tell the story, you can bring it up, you can make it a thing. You can build a reputation for caring about this shit. It's not always going to work, but you'll feel better about yourself, I think.

JH: I think what's difficult about this, and what's difficult about tech, is that it feels like tech companies were supposed to be different. There was this do no evil, like, a lot of empathy stuff that comes with it, compared to the stereotype of the big faceless corporation of the past dumping chemicals into the river or something.

Laura: Asbestos in baby powder.

JH: Right. It felt like these products were purely for good. I think the last few years have been a little bit of a reckoning of, well, they can be, but they can also be misused, and they can be abused, or there can be “edge cases” like we've been talking about. I think there's a little bit of a general awareness amongst people who work in these companies that you do need to be mindful that these outcomes can be negative. I think you're right that there seems to be a growing awareness of people who are working for various companies and trying to be the vocal change on some of these issues, and maybe that is where it starts.

Laura: I hope so. I really do, because I don't want to blame all of social media for it. We're having this conversation because of social media. I've met marvelous people through social media. I've built my business partially because of Twitter, honestly. I think all of these things can be used for good. I really want all of us to stop thinking of ourselves as immune from possibly causing harm just because we're building these tools, and to actually deal with the fact that we're building things that lots of people use, and some people use badly, and that we sometimes do things that are terrible for certain groups of users, and we need to fix that when it happens, because we're responsible.

JH: Sorry, I was going to say, just because some of the tech companies grew so quickly, I do think there's a little bit more pressure and accountability. You gave the example of the pregnancy tracking app that doesn't account for miscarriages. I'd imagine a pregnancy app that does handle that case is going to win in the market, in terms of the win-win stuff. I don't want to be the naive person who keeps coming back to that, but I think there are cases, like Facebook built itself a huge following really quickly, and I'd imagine there's the risk of losing that following with the same speed if things don't improve. I think there is some notion of win-win if you do think of a longer time horizon where the business drivers and the user drivers can be aligned. Maybe not always, but I guess I'm going to be the optimist and think that some of those things will win out over time.

Laura: I think that can be true. I think it's more true in products where the end user is also the payer, which is not true of most apps and social media. The end user is not the person who is paying the bills in most of those cases. Some of that is our fault as consumers, that, hey, we just expect to get everything for free. Some of that is just the nature of how you build giant networks of people; you can't do that by charging them initially. Part of that is the stupid lack of business model of the early web that has sort of carried over into now, and it's now causing all sorts of problems. I think that's possible if you're selling a product, but most of these folks aren't selling a product. They're selling access to user data.

Erin: People have been talking about the millennial flight from Facebook for a while, and yet the stock price keeps going up and to the right and looking good, until recently. I'm personally not optimistic that that's going to mean Facebook is going to start doing the capital-R Right thing. We'll see. You're right, I do think people are making these tradeoffs of convenience and where their networks are and what they themselves think is right as consumers all the time, and just as a business weighs the cost to rip and replace an existing SaaS technology, it's the same with consumers. I don't think we're going to, en masse, flee Google, Amazon, Facebook and Apple because of something like this. To return to the question of the individual designer, the individual PM, or the teams within these massive platforms and the impact they can have, and the example that you'd shared on Twitter, the open letter from Gillian.

Erin: She's mentioning at least Amazon, Google and Facebook as part of this cross-platform, data, personalization-of-ads situation that she's part of, and most of us are. When you're one person in one company that's part of that, how do you even, like, again, let's say I care, I can imagine, I recognize that this could happen. Now, we're assuming you're a person that could do anything, and you talked about telling stories. What do you do when it's so big?

Laura: I think there are a lot of options. Remember, at the beginning of the last century, we didn't have weekends. Like, weekends were not necessarily a thing, people.

Erin: The weekends.

Laura: We got them because people worked together, unions specifically worked together, and said, we need limits to the amount of time that we're working, and we need to not let eight year olds put their hands into unprotected machinery, and all these great things that we take for granted now. Like, well, of course, that's ridiculous. It wasn't always just, of course we're not going to do that terrible thing to our workers. Those were things that people banded together and fought for. We can do that to protect our users. We don't just have to do it to protect ourselves. We can do it to protect our users. I think that's the big hard thing. That's if you want to be a union organizer. I think not everybody does, and I get that and I respect it.

Laura: That said, I think that we can bring these things up constantly, to the point of being irritating. We can make the people who make the decisions have to confront what they are actually doing. One of the things that I found extremely useful on a much smaller scale for showing actual user problems was video of people experiencing those problems, showing it to, in my case it was the engineers who were deciding what to build, showing it to them and saying, this is what you're making people do, this is what you're making people go through.

Laura: I remember bringing them into user research sessions, having them sit behind me. Because we worked in a continuous deployment environment, I had one of the engineers just fix a bug during the test, during the usability session, because he just couldn't stand seeing it happen. It made it real for him. That bug had been around, by the way, for like four years. Don't underestimate the power of just making the people who make these decisions aware of the problems. It's very easy not to be. What have you guys seen work? You folks, what have you seen work?

Erin: I think, obviously, we're huge believers in the power of qualitative research, and stories as part of that. I think, going up or down the hierarchy, video is the best case. If it can be in there and part of the session, whether remote or in person, even better. Or audio quotes, but bringing the stories to life beyond text on paper, I think you're absolutely right. I think, ignoring the sociopath class, most people are-

Laura: Very small percentage.

Erin: It's a small percentage.

Laura: Like 10 percent of CEOs, that's my understanding. I don't know.

Erin: Most people are capable of empathy, but you have to make it easy for them to see what's actually happening based on a system that they're part of, if not solely responsible for. I think absolutely that works, and telling those stories over and over again until there's a tipping point or something.

Laura: It's not easy, and you do have to take some personal risks in your job, and of course I'm never going to encourage anybody to do that who absolutely can't afford it. I sympathize. And yet, it's a good job market for UX designers.

JH: I think what an individual team or an individual designer can do, right, is, I think you guys mentioned it, but bringing in the user stories, whether that's video or actually having a real user come in and talk to folks, is a pretty good lever. I think you can also find smaller experiments where maybe you try to fix this on a very small slice of the population, whether that's a percentage or a market or whatever, and prove that the impact is neutral, or show what happens when you start to address some of these things.

JH: I worked for a long time at a company called Vistaprint, which grew really fast on a strategy of basically free, so free business cards. It became an incredible marketing machine. You get people in the door with these ads for free business cards. You charge them if they want a nicer paper stock, you charge them if they want to print something on the back. You charge them a lot for shipping. So they get to the cart after spending all this time designing the product and reluctantly order it because of the sunk cost: I spent two hours designing my card and now it's more than I thought, but well, I'll buy it.

JH: At some point the growth started to slow, and the business was smart enough to look into it and realize that what happened was we just had a huge leaky bathtub, where people were not pleased with that experience, and so they'd make the one order but they would never come back. Despite the marketing machine's effectiveness, we were running out of people we could continue to acquire on that strategy. I was part of a small team that was asked to use the Canadian market to try a different go-to-market method, with normal ecommerce practices, lower discounting, and free shipping.

JH: We were able to prove, over the course of six months, it took a long time, that we could make it work. At first it was revenue- and profit-neutral, but over time it became compounding. It snowballed into something that was actually a winning strategy, and it rolled out across the organization. I think that's a little bit of an exceptional case, but I do think you can make small inroads, bring in more stories or run more experiments, and start to plant the seeds and chip away at it.

JH: It's probably not always the most satisfying work, and it probably doesn't always feel like you're making progress, but I think there are ways to do it, and I do think it starts with bringing the users to the forefront and making people aware of it. To your point earlier, Laura, of making people at least say, we're not going to do this, we are going to ignore this person, and be explicit about it, instead of it happening behind the scenes or without being said.

Laura: I think that's a great story. I think that's a fantastic, just overall good UX-design way of going about things. We should all do more of that with everything. It's a good example of what you said earlier, which was, if we actually make the user happy instead of trying to trick them, we will in the long run make more money. It is both a good user experience decision and also a good business decision. I agree, we should 100 percent be looking for those things. We should be looking for those wins all the time. What's good for the user, what's good for the company, because the company still pays the bills. It's not all about sacrificing the company's money to make people happy. We can't do that all the time.

Laura: I think that's a really good way of approaching things. I do think that when you work at some of the larger companies that are used, really, as a utility by people, that comes with an extra amount of responsibility. If you don't like Vistaprint, you can go to a half dozen other places and get your cards there. It's not unreasonable if Vistaprint did make a decision to say, well, we're just not going to serve this particular set of users, that's fine, we're willing to make that sacrifice. If you're Google or Facebook and you're making that same decision, it's a different decision. It really is. This is why monopolies are dangerous, because it's very hard to get away from those companies.

JH: It's a good point. They are approaching utility status, which is a different beast in and of itself.

Laura: I don't want to get into the whole government-regulation question. That's, again, a whole other set of podcasts, and I don't know enough about it to talk about it.

JH: Our running joke is that at some point this devolves into a politics podcast.

Laura: Yes, it does and an economics podcast.

JH: Yes. To what we were saying, I do think the tech economy right now is very favorable to employees. I think users have more ways of being outspoken and amplifying their experience. The fact that this one woman shared her story and it has since been consumed by probably hundreds of thousands of other people is pretty incredible. Ironically, on the tools that also created the experience. I think there's inside-out pressure: companies are going to have difficulty retaining employees if those employees don't feel like they're working on a mission they can believe in, or supporting users in the way that they want, and they're going to have other opportunities. I think it is maybe hard for some consumers to untangle themselves from some of these services, but they can certainly use them less if they can't stop using them altogether. When you have pressure on both fronts, I do think there is hopefully something there that people are paying attention to and recognizing is in their interest to address. I guess we'll see.

Laura: I know I've shared this story. I got off Facebook entirely, I don't know, sometime last year, maybe this year. I just canceled the account, or deactivated it. I had actually stopped really using it several years earlier, when it kept reminding me that I should be connecting more with a couple of friends who were dead. I had literally no way of making that stop. I just couldn't. They didn't even give me the ability to say, please stop reminding me about this person. I would love to still be friends with them, that would be great, but it's not an option. I think you're right. I think people will eventually stop using this stuff. You can decouple from some of these systems, is the thing, and more and more people will. I wouldn't be happy with myself if I were building a system that did that to people, and I don't have to, so I don't. I feel like that's true of a lot of us right now.

JH: I think the employee-retention piece is maybe an underrated part of it. To your point, not everyone can afford, or is in a situation where they can leave, a lucrative and probably for the most part pretty enjoyable job, but some people can. And not everyone can leave the service from a user standpoint, but some people can. I think in aggregate those things do start to add up. Hopefully people will figure out that it's worth fixing. I think death is the craziest one that social media gets wrong, because it's just so common.

Laura: It literally happens to everyone. It doesn't happen as much to fairly wealthy 20-somethings. It does happen some. I definitely lost friends when I was in my twenties, but it doesn't happen as much, and it's not as normal. You can absolutely be in a room with folks who have never experienced anything like this and who don't think about it, not just the fact that somebody died and that's awful, but what it would feel like to be constantly reminded about that in a place where you're supposed to be connecting with friends and feeling safe and being social. That's a thing.

Erin: I mean, literally everyone does die. You're absolutely right that not everyone has had the experience of a close friend dying, but we will all die. I think part of the issue is that these companies are so young that they haven't gone through this full cycle yet, where they've lost a whole generation of users.

Laura: I promise you, once you die, you're doing much less work on Facebook.

JH: I was going to try to make an active users joke, which is probably a sign we should wrap up.

Laura: Suddenly the podcast takes a very dark turn.

Erin: That's politics, economics all in there. Laura, you mentioned giving up Facebook. Have you given up any of the other big four or big tech?

Laura: I've cut way back on Amazon. I used to buy everything from them. I now actively search out other options. Sadly, I'm going to be perfectly honest, I did not do it out of any high moral reasoning. I did it because I actually think they made their UX much worse. The addition of all of the marketplace stuff, the fact that they are actively tricking users with dark patterns, showing things as recommended that aren't really recommendations, and the fact that I could be ordering something from somebody that's a counterfeit and I would have no way of knowing. Those all contributed to making the shopping experience for most things on there garbage, I think. I will still buy books on my Kindle because I can't give it up, it's my precious. I do use them for that, but generally if I'm buying something that, you know, is a physical thing, I will actually go someplace else and pay more, just to know I'm getting the actual thing.

JH: The marketplace thing is weird. We ordered a thermometer two or three weeks ago, and I was going back through my orders for something, and I guess some other marketplace vendor had hacked that seller's account and changed the product image to the Guy Fawkes V for Vendetta mask. As I'm scrolling through my orders it's there next to the thermometer, and it was terrifying. I did a double take. They had taken over the page and the listing and were accusing the seller of something, but it was like, this is not a great experience. I'm not sure I want to use that thermometer anymore. You think it works?

Laura: It's the Mr. Robot thermometer. Maybe not the best choice.

Erin: Awesome. Laura, what did we not ask you? I think we've covered a lot of ground. I think this has been a really interesting conversation.

Laura: This was great, and I so much appreciate you guys having the conversation. It's great to be able to talk about this stuff. Like I said, I don't want to come across as somebody who's blaming PMs and designers for all of this stuff. I get it. It's super hard. I just don't want any of this to happen by accident anymore. I want decisions to be made intentionally, and I think we'll get better and more humane decisions if we all know about it and work together and try to make it better. I don't expect perfection, I expect better. I'm not mad, I'm just disappointed. This is the most middle-aged conversation I've ever had in my life. I expect better of you.

Erin: That's important, right? Because the issue is so big and complicated, to say, you know, I showed up, I cared, I tried, gets us pretty far. Maybe if enough of us are doing that and saying that and sharing stories.

Laura: Bring your conscience to work day.

JH: It does feel like there was a period where apathy was cool, and it feels like lately being engaged and outspoken and trying to create positive change is becoming a little bit cooler. Hopefully we're heading in the right direction.

Laura: I entirely blame my generation, Generation X, for that.

Erin: Thanks for listening to Awkward Silences brought to you by User Interviews.

JH: Theme music by Fragile Gang.

Erin: Editing and sound production by Carrie Boyd.

Carrie Boyd

Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.
