Episode number: 73

Data-driven Design

with Matthew Oliphant

Summary

UX expert Matthew Oliphant stops by the show to help define what data-driven design even means! We discuss testing assumptions before a project begins, as well as how UX research affects process and the project lifecycle. From critical business decisions to design and development, we chat about different ways to test a website or app, real-world scenarios of testing, and how to make sense of it all to fulfill client and user goals.

Sponsored by

  • Visual Chefs

Episode Transcript

CTRL+CLICK CAST is proud to provide transcripts for our audience members who prefer text-based content. However, our episodes are designed for an audio experience, which includes emotion and emphasis that don't always translate to our transcripts. Additionally, our transcripts are generated by human transcribers and may contain errors. If you require clarification, please listen to the audio.

[Music]

Lea Alcantara: From Bright Umbrella, this is CTRL+CLICK CAST! We inspect the web for you! Today our friend Matthew Oliphant joins the show to talk about data-driven design. I’m your host, Lea Alcantara, and I’m joined by my fab co-host:

Emily Lewis: Emily Lewis!

Lea Alcantara: Today’s episode is sponsored by Visual Chefs, a versatile web development agency with expertise in content management systems and custom web application development. Through partnerships with designers, agencies and organizations, Visual Chefs propels the web forward. Visit visualchefs.com to learn more.

[Music ends]

Emily Lewis: Today we’re excited to have Matthew Oliphant on the show. Starting in UX as a technical writer, Matthew has worked as a freelancer as well as in corporate cube farms, agencies and startups. His experience runs the gamut from redesigning large-scale corporate design and dev processes to leading research efforts to understand the needs of organ transplant recipients. Welcome to the show, Matthew.

Matthew Oliphant: Hello.

Lea Alcantara: Hello. Can you tell our listeners a bit more about yourself?

Matthew Oliphant: Well, sure. I’m coming to you live from or recorded, I guess, from Anchorage, Alaska right now. I’m up here for a little visit and I’m very happy to be here.

Emily Lewis: And you took a road trip, is that right?

Matthew Oliphant: Yeah. We drove. My daughter and I drove up and spent about six days getting here.

Emily Lewis: Oh wow.

Matthew Oliphant: And she is flying back tonight and then I’ll be spending anywhere from six to ten days going back.

Emily Lewis: When I moved from the East Coast to the Southwest, my drive across the country, like there were parts that weren’t great, but for the most part, it was amazing and I wish I had had more time instead of trying to get one place as fast as I could because I didn’t want my stuff to get there before me. [Laughs]

Matthew Oliphant: [Agrees]

Lea Alcantara: [Laughs]

Emily Lewis: Was it beautiful?

Matthew Oliphant: It was really gorgeous, and thankfully, I think just a few miles after we crossed into Alaska from the Yukon Territory, there was this torrential downpour and it got all the bug guts off the windshield.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs] Yeah, convenient.

Matthew Oliphant: So that was like, “Hey, this is Alaska.”

Emily Lewis: [Laughs]

Lea Alcantara: Nice welcome.

Matthew Oliphant: Yeah.

Emily Lewis: All right, so let’s get into today’s topic. Let’s start with the basics, what is data-driven design, and is it any different from data-informed design?

Matthew Oliphant: So I’m going to be a total UX jerk, I guess.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: And say, “What does data-driven design mean to you?”

Emily Lewis: [Laughs]

Matthew Oliphant: Because, I mean, I looked over some of the questions that you sent, and I wanted to make sure that I was giving you an answer that sort of resonated.

Emily Lewis: [Agrees]

Matthew Oliphant: So what do you think of data-driven design?

Emily Lewis: I mean, I’ll let Lea speak for herself, but what comes to me, not really being a designer, but it’s that you make something and then you see how people use it and then you adjust the design based on how people use it.

Matthew Oliphant: [Agrees]

Emily Lewis: I don’t know, Lea, what is your perception?

Lea Alcantara: That’s more or less it. Essentially, it’s basically once you review how people interact with something, then you align it. While you’re reviewing, you’re trying to see how that fulfills certain goals or doesn’t fulfill certain goals, and then you make adjustments and tweaks to the particular design. But I don’t know, like I’m curious to see if that’s actually a misconception about it, because can data inform design before it even begins?

Matthew Oliphant: [Agrees]

Emily Lewis: [Agrees]

Lea Alcantara: Because all our discussions right now, it’s kind of saying like, “Well, we make some professional judgment calls and then we send it off to be tested.” But can you test before you even make those professional judgment calls?

Matthew Oliphant: Well, basically, so I wanted to make sure in answering your question that there wasn’t like an invisible ™ after data-driven design.

Lea Alcantara: Right.

Matthew Oliphant: Because one of the things that I’ve seen over the past – I don’t know – maybe ten years or so is someone says, “Here’s this new method. I’ve given it a new name,” and everyone goes, “Oh, that’s so wonderful. What a great idea,” and I think, “But isn’t it just UX design?”

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Matthew Oliphant: Or, “Isn’t that just this?”

Lea Alcantara: Right.

Matthew Oliphant: So I wanted to make sure that it wasn’t capital D, Data-Driven Design, and that I was missing something in my answer. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: Sure.

Matthew Oliphant: But basically, yeah, I mean, what you said is it’s the idea that you’re going to go out and find out what people are capable of, what people are motivated to do, and I guess basically those two initially, and that’s going to inform your design decisions and what you build.

Lea Alcantara: Right.

Matthew Oliphant: And you can absolutely do it, even before you know what you want to do, and you can do it while you’re designing and you can do it while you’re building, and you can do it while the product is live. So basically, these activities that we’re going to talk about today are things you can do throughout the entire life cycle of the product you’re working on, whether it’s a physical product or a digital product.

Emily Lewis: [Agrees]

Lea Alcantara: All right, so let’s talk a little bit more about that, like so how would you use data to inform your decisions and process from… let’s start from beginning, from the start of the entire process.

Matthew Oliphant: We can start at the very beginning, which is like coming up with an idea of what you want to do, and that could be whether the idea is being built for something that exists already, where you need to figure out if a product needs a new feature, or if you are sitting around and you’re like, “Hey, I want to make an app. What should I make?”

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: You can use a lot of these tools and processes to derive answers for that. With UX design or UX research, which I just kind of clump together as activities related to user experience, you can go out and talk to people at any point and say, “You know, I’m having an idea.” I mean, you may not phrase it this way exactly, but just to sort of set up the context, “I’m having an idea that I want to make an app that does X.”

And you can go out and do market research, what is currently out there that does this? How do people accomplish this today if there isn’t a product? If people aren’t actually doing this today, what would motivate them to start doing it? And you can go out and talk to people and ask them those questions, maybe not specifically that way, but to get to those answers.

Lea Alcantara: Oh, that’s really interesting because Emily and I kind of started off with our own definition that the idea was already done and that we’re just testing assumptions, but here, you’re saying that forget even testing assumptions with the design work, let’s test whether this idea of this business or this app is even viable.

Matthew Oliphant: Absolutely.

Emily Lewis: I wonder if that’s because of where Lea and I fall in the cycle because clients hire us to solve a problem they’ve already decided needs to be solved, but I question how often a client has done what you just described to decide whether there actually exists a need for what they want built.

Matthew Oliphant: Well, and I don’t mean this in the sense of picking on clients at all because people come to us, at certain stages, in whatever they’re trying to work on and they need help.

Lea Alcantara: Right.

Matthew Oliphant: But I think we’ve all had times when someone comes and says, just as an example, “Hey, I need a website.” And you start talking to them and you’re like, “I don’t think you need a website. I think you need this instead, but you’ve hired us to build a website so that’s what we’re going to build for you.”

Emily Lewis: [Agrees]

Matthew Oliphant: And I guess what I would say is that’s an opportunity to talk to the client or talk to the prospect about it. You may not be asking the right questions or you are…

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: Because what’s very typical for all of us is we tend to talk solutions rather than talking problems and opportunity statements.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: And I think getting people to think in those terms gets them to think about what it is they’re actually trying to solve for, and it gets away from that sort of solution talk, which may not be…

Emily Lewis: [Agrees]

Lea Alcantara: Interesting.

Matthew Oliphant: It may not be appropriate. I mean, most companies probably do need a website at this point.

Lea Alcantara: Right.

Matthew Oliphant: But…

Lea Alcantara: Whatever that entails is, you know…

Matthew Oliphant: Right.

Lea Alcantara: There are a variety of things you could go down that path.

Matthew Oliphant: [Agrees]

Lea Alcantara: So in regards to that, this kind of makes me think about discovery-first or discovery-only engagements with small businesses. Is this something that you would do as a UX researcher as part of an agency’s discovery process, where you are the one asking the questions like, “Well, have you used this competitor’s app? What do you think about this, that or the other?” before any “active” engagement of building the solution, as you mentioned, begins? Or is this more typically dealt with at a larger corporate scale?

Matthew Oliphant: It can happen at the corporate level, the agency level, and it can happen in a startup. I think the real challenge regardless of where the activity is taking place is getting people to believe that it’s worth taking the time to do it.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Timestamp: 00:09:54

Matthew Oliphant: I often equate it to, is it less expensive to not build bugs into the system or is it less expensive to be fixing bugs after the product is launched?

Lea Alcantara: Right.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: Because whatever your premise is, whatever your starting point is, that is going to define where you end up.

Lea Alcantara: Right.

Matthew Oliphant: And if you don’t get it right to start, it’s going to be expensive to fix it.

Lea Alcantara: [Agrees]

Matthew Oliphant: So my pitch is for doing standalone discovery projects, like we were doing at an agency when I worked at nGen Works. We did it at State Farm when I worked in the cube farms.

Emily Lewis: [Agrees]

Matthew Oliphant: And I recently did some of it at a startup that I worked at, and it was always very illuminating for everybody.

Lea Alcantara: [Agrees]

Matthew Oliphant: At nGen, we had the opportunity to shut down two people’s businesses because of it.

Emily Lewis: Ha?

Lea Alcantara: Ha?

Matthew Oliphant: That’s rare that it happens, I’m sure.

Lea Alcantara: Sure.

Matthew Oliphant: But basically going through this standalone discovery process allowed the client to say, “I can’t do this.”

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: As opposed to just coming to us and saying, “I need this app built,” and us going, “Okay.” And then they…

Lea Alcantara: [Agrees]

Matthew Oliphant: I mean, realistically they’re wasting a lot of money to end up with something that ultimately they’re not going to use.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: So they gave us $20,000 and didn’t spend a quarter million.

Lea Alcantara: Right.

Emily Lewis: Right.

Lea Alcantara: So I’m curious then, what kind of data do you gather for this particular engagement? Like what questions are you asking to figure out whether something is even viable?

Matthew Oliphant: I think the first stage is always getting an understanding of what the client or what the business thinks they want, and again, trying to walk them back from solution to what’s the problem here or what are the things you’re least confident about.

Lea Alcantara: [Agrees]

Matthew Oliphant: Because the bulk of this activity is to build a level of confidence such that whatever you’re designing, whatever you’re building, you know is meeting the needs of your customers or your users, or your internal employees if it’s an app for internal people. That’s the first place, and then typically we try and get an understanding of who the users are, who the potential users are, and who the users used to be.

Lea Alcantara: Right.

Matthew Oliphant: So like if you’ve lost customers in the past because of something, trying to engage with those people and have conversations with them about why they left, and get a sense from the users what they think the problems are or what they think the opportunities are, and then basically have sort of a conversation with the business saying, “All right, you told me this, the users told me this, here are the gaps, let’s talk about that.”

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: And then that’s an opportunity for more research or that’s the stuff that’s going to start informing whatever the design solution is.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Emily Lewis: And that, I guess, becomes essentially the benchmark that the project is constantly being measured against?

Matthew Oliphant: Yes, I mean, hopefully anyway, but yeah, it’s always the thing that’s going to inform all the decisions you’re going to make. So if someone says, “Hey, I think we should add this feature that allows people to generate a PDF so they can print it out,” that may be a great idea someday, but is that an activity you want a developer working on when that need was never …

Emily Lewis: [Agrees]

Lea Alcantara: Articulated, yeah.

Matthew Oliphant: … articulated by the users or even by the business.

Lea Alcantara: Right.

Matthew Oliphant: I’ve sat with, and I’m not picking on developers, but it’s just germane to this part of the conversation…

Lea Alcantara: [Laughs]

Matthew Oliphant: I’ve sat with developers who have said, “Hey, I made this thing because I thought it would be great.”

Emily Lewis: [Agrees]

Matthew Oliphant: I’m like, “Well, that’s really cool. How about these six things I’ve already asked you to do?”

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Matthew Oliphant: So to me, it’s always everybody who works on a project, their responsibility is towards the user experience because for the most part, unless you’re building an app that your employees have to use to do their job because I feel like that’s a different relationship than someone who’s buying your application or coming to your website, it’s those users that are the ones who give your business money that allow you to pay your employees, and without them, the whole system falls apart.

Emily Lewis: [Agrees]

Lea Alcantara: Right, right.

Matthew Oliphant: So all the decisions of what makes it into the product need to be tied back to somebody who’s a user or a potential user expressing an interest in that.

Emily Lewis: I’m curious, have you seen something that’s been relatively successful for ensuring that, like having a single person on the team, one of whose primary responsibilities is to make sure that the developers and the designers and the project managers and, who knows, maybe the salespeople are all still focused on that initial goal that was established?

Matthew Oliphant: I think, I mean, to me, it’s typically either the project manager or the UX person.

Emily Lewis: So UX person could have project management over a developer, and I hate to use the word “over,” but kind of guiding a developer in that way?

Matthew Oliphant: Yeah, I think honestly the only way that this works consistently is either with a team that trusts each other and has worked together for a long time.

Emily Lewis: [Agrees]

Lea Alcantara: Yeah.

Matthew Oliphant: Or if the team is put together for the purpose of the project, have someone who is the absolute authority figure who all decisions have to be run by.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: Especially if you’re talking about a project team of 12 to 30 people, it gets hard to keep track of what everybody is doing.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: But it can work, but it’s just hard.

Emily Lewis: It requires the commitment.

Matthew Oliphant: Absolutely.

Lea Alcantara: So Matthew, I just want to dive a little bit more about the kind of data that you’re gathering throughout the project. So you kind of explained at the beginning about gathering data as to the viability of the project and the business goals and the user goals and all that fun stuff. Now, let’s assume that everyone is on the same page regarding that, let’s start building. So how does the testing fit in the UX process while you’re building this solution?

Matthew Oliphant: So there are a lot of ways that you can do this testing. It can happen during the entire life cycle of the project or the life cycle of the product. Starting at the very front of things, what I’ve done in the past, for prospects when I was with a B2B company, is testing out the prospect’s application and doing usability testing on that.

Lea Alcantara: [Agrees]

Matthew Oliphant: And then showing them the way things aren’t working and the way our company can help them fix their stuff.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: That’s always very illustrative.

Emily Lewis: Like a sales tool almost. Like a sales tool?

Matthew Oliphant: That’s exactly a sales tool.

Emily Lewis: Oh.

Matthew Oliphant: Because what I would do is put together a report that had, “Here are the problems. Here are the potential solutions. Here are mockups of how it might look with our product supporting your product,” and that all gets packaged up in the pitch.

Emily Lewis: Smart.

Matthew Oliphant: And that’s before any activity even starts, before a contract is even signed. That is a way usability testing can be useful to help craft a story for a sales pitch.

Lea Alcantara: And this is a paid usability test though?

Matthew Oliphant: Well, in this case, it was more that the hope was that it would get the business and therefore cover the cost of doing the usability test.

Emily Lewis: [Agrees]

Lea Alcantara: Right, right.

Matthew Oliphant: So that was from a B2B perspective. Thinking from an agency’s perspective or a company perspective, where you might go out and do research prior to even deciding to build something, you can absolutely do testing to get a sense of how people use competitor products, how people are currently solving for the problem that you want to solve for, whether there is a product out there or not.

Lea Alcantara: [Agrees]

Matthew Oliphant: The fewer real-world artifacts that exist solving the problem, whether that’s a digital product or a physical product, the more likely you are to be doing one-on-one conversations with people.

Emily Lewis: [Agrees]

Matthew Oliphant: If you have a competitor product or if you have a prototype that’s digital, you may be able to do unmoderated usability testing, and with that you could get 50 people in a week, kind of thing.

Emily Lewis: Is that a kind of thing where you just put a URL out there and have people try something or…

Matthew Oliphant: [Agrees]

Emily Lewis: Okay.

Matthew Oliphant: Yeah, there are products in the market that allow you to do this. They handle the recruiting, and it records what the testers are doing on the screen, it records their voice, and people understand that they’re supposed to speak aloud while they’re going through the application or the website to try and find something, and then you see what the problems are.

Lea Alcantara: Okay, so then how do you identify what to test? Like you can say, “Okay, we need to test out this product or this app or whatever,” and then you’re like, “Yeah, we need to have X amount of people to do it.” But then what do you tell them to do?

Timestamp: 00:20:08

Matthew Oliphant: Right.

Lea Alcantara: Like what exactly is the test?

Matthew Oliphant: So roughly speaking, there are basically two kinds of tests that you can do. You can do exploratory testing where you have less of a premise.

Emily Lewis: [Agrees]

Matthew Oliphant: That’s basically, “Walk me through how you accomplish X,” like buying plane tickets, and I don’t care what site you use. I don’t care if you pick up the phone and tell your assistant, “Buy me a plane ticket.”

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Matthew Oliphant: There is a process that you follow, and you’re not so concerned about coming up with solutions. You’re exploring a space to understand where the pain points are, where the opportunities lie to create something, and you’re really not asking much in the way of questions. You’re just saying, “Show me how you do this.”

Emily Lewis: [Agrees]

Matthew Oliphant: And whatever you get back from that is what you get back. Going through the data from that would then define potentially a project brief that would outline what you’re going to try to accomplish.

Lea Alcantara: [Agrees]

Matthew Oliphant: The other general type of testing you do is validation, where you’re validating your design solutions against how people react in the real world.

Lea Alcantara: [Agrees]

Matthew Oliphant: And that’s usually what people think of when they think of usability testing.

Lea Alcantara: Right.

Matthew Oliphant: They think of sitting down in front of a computer on a prototype or a working application on a phone and having someone say, “All right, if you need to sign up for service, please talk aloud and tell us what you’re thinking while you’re signing up,” and you’re capturing all the problems that there are with the signup process because you designed it in a certain way and you want to make sure you’re not losing people during that process. So you can use exploratory testing, you can use validation testing pretty much at any point, whether you’re generating ideas for what to do or you’ve just released a major update to a product and you’re trying to figure out what the impact is for that or what to do next. So it can happen at any time.

Emily Lewis: And so it sounds to me like the validation testing, in terms of what you’re trying to validate, essentially should have been defined during the exploratory phase, so what you test is what you defined at the start of the project?

Matthew Oliphant: That’s absolutely right.

Emily Lewis: And you were describing a tool that you can get a bunch of people to test something. Are there some examples that you used before that you’ve liked?

Matthew Oliphant: There are definitely some examples that I’ve used.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: There is no perfect tool obviously.

Emily Lewis: [Agrees]

Matthew Oliphant: And a lot of these tools have a lot of drawbacks. I would say from a tool and a customer service perspective, I’ve had a fair bit of luck with usertesting.com.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: Not to promote them too much, but it’s basically something where you can set up a URL or upload an APK, basically a prototype iOS or Android application, and say, “I want people from this country who make this amount of money per year and use these products to test this application and here are the tasks,” and then they go out and they recruit people and it records what’s on their screen, it records what they say about it, and it provides a little video that you can then go through and make notes on. I mean, it’s just basic usability testing at that point.

Emily Lewis: [Agrees]

Lea Alcantara: So how do you identify the breadth and scope of these particular tests, even once you understand what assumptions you’re trying to test?

Matthew Oliphant: [Agrees]

Lea Alcantara: For example, like do you need to test five people? Do you need to test ten people? Do you need to test a hundred for a month versus like one hour?

Matthew Oliphant: Right. I was really trying to get to the end of this conversation without saying it depends.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: So I’m not going to, but it does depend a little bit on two primary things. The first one is, my sentiment is that you test until you have a high level of confidence that your design solution is the right one, and if that means that you are testing a hundred people, then that’s what you’re doing.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: And hopefully you can figure out what the design solution should be before that.

Lea Alcantara: [Agrees]

Matthew Oliphant: But the other thing to keep in mind is there’s always a question of resources, how much money can you spend on this?

Lea Alcantara: Right.

Matthew Oliphant: Do you have an active user base that you can draw from? Is this the type of thing where you can get feedback from 50 people in a day, or is it the type of thing where you really do need to sit down one on one with people and have a relatively open-ended conversation with them as they go through your product? If you’ve taken the path of having one-on-one conversations, going through a hundred people is going to take a long time.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: So typically what I do for that is I go in sets of three instead of five. We’ll do a round of three people and we’ll stop. The first round of three will be part testing the test, to make sure that you’re asking the right questions.

Lea Alcantara: Oh okay.

Matthew Oliphant: To make sure that your assumptions in putting together, let’s say, the prototype are relatively accurate for the feedback you’re getting. So the first three is often part hard data collection and part testing the test. The second round of three, depending on how many changes you make between the first and second tests, is either a brand new test or it’s validating against the first round of testing.

Lea Alcantara: Right.

Matthew Oliphant: And basically I just kind of go in sets of three until we hit a level of confidence that we’re happy with.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: And sometimes all it takes is three people because you’re testing something that is, I’ll say, simple, in the sense of simple versus complex.

Lea Alcantara: Right.

Matthew Oliphant: But if you’re testing an entire application flow for an accounting application, where people need to do their taxes or something, that’s going to be a really complex flow. You’re either going to want to break that down per task, like enter your income.

Emily Lewis: [Agrees]

Matthew Oliphant: Okay, that’s one test and we’re done with that.

Emily Lewis: [Agrees]

Matthew Oliphant: Or you’re going to test the entire flow, the entire workflow, and if you’re going to do that, for longer workflows, I tend to do more rounds of testing.

Lea Alcantara: So for all the participants of these tests, I’m assuming that the demographic is the end user, or do you choose people on the staff? Like, who’s part of the three?

Matthew Oliphant: Well, I’m having a really hard time not saying, and I’m not saying, it depends…

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: But I’ll say this, context will drive decisions.

Emily Lewis: [Laughs]

Lea Alcantara: Right, right.

Matthew Oliphant: If you really don’t have a lot of time, I tend to fall back to what I call warm body testing, you know, “Excuse me, do you have a pulse? Please test this for me.”

Lea Alcantara: [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Emily Lewis: I think I’ve gotten requests from you on Skype of that variety.

Matthew Oliphant: Yeah, yes, you have.

Emily Lewis: [Laughs]

Matthew Oliphant: Yes, you have.

Lea Alcantara: Yes, me too. [Laughs]

Matthew Oliphant: And honestly, when you’re doing that, you’re looking for the things that are just total, “Oh, of course, that’s wrong.”

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: Like anybody, even if they aren’t savvy about filling out medical billing information, would say, “Hey, this didn’t make any sense.” You’re just looking for the really big usability problems that you likely can’t see because you’ve been so close to it in designing.

Lea Alcantara: Right.

Matthew Oliphant: And to me, that’s a totally valid way to go.

Emily Lewis: [Agrees]

Matthew Oliphant: I don’t think it should end there, but that can also be a great way to test the test.

Lea Alcantara: Right, test the test.

Matthew Oliphant: Yeah.

Emily Lewis: You know what occurs to me as you’re describing it, from I guess the least resource- and time-intensive to the most, is whether something is better than nothing, like if you’re dealing with a limited budget or some type of constraint that prevents you from making a very big investment in user testing, user research.

Matthew Oliphant: [Agrees]

Emily Lewis: Is it okay to just do a little bit and put all your eggs in the basket that that little bit shows you or is that a dangerous approach?

Matthew Oliphant: I think that you should always be doing some.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: And I think where the decision point lies is on each aspect of your testing. So if you have a signup flow that is basically the same as everybody else’s signup flow, and I feel that we’re strapped for resources and time, I wouldn’t test that.

Timestamp: 00:30:05

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: I would say, “We’ll just hope that works.” Where I would put the time is into the parts that we’re really unsure about, or that require for whatever reason the user to have sort of multiple inputs if they’re trying to get something done, like doing taxes requires lots of outside-of-the-system thinking and activity, like signing your W2 and all that kind of stuff.

Emily Lewis: [Agrees]

Matthew Oliphant: It’s things that you can’t control within your system, and also if there are things you know about your users or potential users where they may not be able to finish a task because of something, whether it’s real-world distractions or a time delay for some particular reason within the application where they have to come back and do something later, those are the points where I would want to focus my time, the things that have complex interactions.

Emily Lewis: So Matthew, you described basically the exploratory and the validation user testing, and you mentioned the validation stuff as something like having users use something. I’m curious, have you ever relied on things like basic A/B testing with some tool like Optimizely, or even like eye tracking software or mouse tracking? Do any of those sorts of tests come into play?

Matthew Oliphant: I’ll say that I don’t really like eye tracking or mouse tracking.

Emily Lewis: [Agrees]

Matthew Oliphant: Especially for mouse tracking, because in some of my tests I’ve watched people spin the mouse around the screen while they’re thinking.

Emily Lewis: [Agrees]

Lea Alcantara: Right, right.

Matthew Oliphant: Like, “Well, okay, that really doesn’t help me decide anything.” So I don’t really care for that kind of testing. I know that there are people who build businesses around it and it helps with their being successful, but I don’t really care for it. I don’t mind A/B testing in general, but it’s got to be something where you’re getting statistically significant results. So if you can generate 10,000 to 100,000 results over the course of a day or a week, A/B testing will tell you maybe what is happening.
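
[Editor’s note: the “statistically significant results” Matthew mentions can be checked with a standard two-proportion z-test. The sketch below is illustrative only; the function name and the conversion counts are invented for the example, not taken from the episode.]

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the difference in conversion rate
    between variants A and B statistically significant?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 520/10,000 vs. 480/10,000 conversions.
z, p = two_proportion_z(520, 10_000, 480, 10_000)
# p comes out around 0.19 here, so even 20,000 visitors can leave a
# 0.4-point difference statistically inconclusive, which is Matthew's point.
```

This is why high traffic volumes matter for A/B tests: small rate differences need very large samples before the p-value drops below a conventional threshold.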

Emily Lewis: [Agrees]

Matthew Oliphant: But the way I use A/B testing is to inform me what questions to go ask.

Emily Lewis: Oh.

Matthew Oliphant: Like it informs what questions I go ask when I go out and talk to people. I don’t like to use it to inform design decisions directly because it doesn’t tell you what the motivation is for people.

Lea Alcantara: Aha, right.

Matthew Oliphant: It just tells you what they’re doing.

Lea Alcantara: Right. So that actually leads to my next question, which is, when you’re evaluating all this data, like you said, okay, you can have this data from eye tracking or mouse tracking, but you personally believe that there’s not enough context, so how do you evaluate when the data you’re gathering is any good or even useful?

Matthew Oliphant: That’s hard. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: That’s really hard. There are some conceits to usability testing, and the primary one is that a usability test participant will try harder, spend more time and be more gracious than they would normally be if they were just a “regular” user.

Lea Alcantara: Right.

Matthew Oliphant: So that is something you always have to take into account when you’re doing usability testing or when you’re talking to people about what they would want to do or what they do, because people are really good at overestimating their own capabilities.

Emily Lewis: [Laughs]

Lea Alcantara: Right, right.

Matthew Oliphant: And I guess putting more emphasis on what they think their motivations are, which may or may not be correct, but you just kind of have to go with it sometimes because otherwise you’re just making stuff up.

Lea Alcantara: Because I was just thinking, when you were earlier talking about testing the test in the first place, would part of that be kind of like vetting the test and the assumptions that you’re even gathering?

Matthew Oliphant: [Agrees]

Lea Alcantara: And then I was just thinking about how they could be the subject matter themselves simply because they were chosen for that test. Like you said, they could be more gracious because they’re there and chosen and they want to be a participant. I wanted to also say, with testing the test and figuring out the questions, there’s a possibility that the question itself could be leading, like a wrongly phrased question could lead them to doing an action that negates the test results, because you literally told them what to do in the question itself when they’re supposed to kind of figure it out themselves.

Matthew Oliphant: [Agrees]

Lea Alcantara: Or I don’t know how or if I’m phrasing this right.

Matthew Oliphant: No.

Lea Alcantara: But you get what I’m trying to say?

Matthew Oliphant: I do.

Lea Alcantara: Yeah.

Matthew Oliphant: And I think that testing the test will help with that. Ideally, whoever is putting together the test in the first place has an understanding of what open-ended and close-ended questions are. Some people that I’ve worked with in the past will never ask a close-ended question, and I find that really annoying.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: Because I feel like at some point during a moderated usability test session, you just have to say, “Do you think that this is the thing to do?”

Emily Lewis: [Agrees]

Matthew Oliphant: Particularly, I leave that to the end, or to a point where the participant is getting very frustrated.

Lea Alcantara: Aha.

Matthew Oliphant: Because, yes, you can note that this is frustrating, but we don’t have to let them stew with it, and you can just say, “Well, what do you think you should do?” That’s where you should start, which is where we started this conversation today: “What do you think data-driven design is?”

Lea Alcantara: [Laughs]

Matthew Oliphant: See how professional I am?

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: But yeah, I mean, it’s a pattern that I’m not sure there is a prescription for, other than to say you start with open-ended questions of “Show me how you would accomplish X,” or open-ended questions like, “Well, if you feel you’re stuck at this point, what do you think would be the best way to get unstuck?” And then you leave the close-ended questions for when they are actually stuck, or during your debriefing time where you’re saying, “Remember when you were on this? Did you think that this was happening?”

Because sometimes you’re just going to have questions that you need answered that you need to be pretty specific about, but I tend to leave those until the very end and start with very open-ended stuff, and allow the participant to sort of – even though the task can be very defined, even though the prototype can be very limited, I allow the participant to go where they go. In the sense that they’re representing an actual user, I won’t be able to sit next to everybody who uses this application and be there when they get stuck, so I want to let them just go where they think they need to go and ask them to show me where the problem areas are going to be.

Lea Alcantara: Okay. So once you’ve gotten all that information and evaluated whether it’s relevant and they answered it to your satisfaction, what do you do with this information? Do you just make a report and say, “Stakeholder, here is the info,” or do you build a plan for that, saying, “Okay, they really are confused with this signup process, we need to do this,” or do you just tell them they’re confused about the signup process?

Matthew Oliphant: I always, and there are certainly other people who do differently, but I always do a written report. Well, we always do a written report depending on how interactive the team is.

Lea Alcantara: [Agrees]

Matthew Oliphant: But do a written report, outline what the themes were or the problem areas were, and say, “Here is a potential solution for this.”

Lea Alcantara: [Agrees]

Matthew Oliphant: Now, if the problem is obvious and the solution is very straightforward and not complex, I tend to just leave it at that.

Lea Alcantara: [Agrees]

Matthew Oliphant: But if a solution for the problem is not obvious, or there is more than one way to – and there’s almost always more than one way to solve a problem.

Lea Alcantara: Sure.

Matthew Oliphant: But if there’s obviously more than one way to solve for it, then it becomes a conversation point: “Should we do more testing, or do we not have time and we should just pick a solution that we think will work and go for it?” So my hope is, like, I never like to be in a situation where it’s, “Here is your report, see you later.” You know?

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: It’s, “Here’s the report, which you can look at, but really we’re going to walk through it and we’re going to have a conversation about what we’ve been seeing.” And hopefully most of the people you’re talking to have been informed the entire time, so a lot of it isn’t a surprise to them. I’m not a fan at all, particularly in the agency world, of the whole big reveal. I think that’s a waste of time because often you say, “Tada!” and they go, “Oh, that’s not what I was expecting at all.”

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: So I try to involve people from the get-go, so that when they’re receiving the report, they typically already know everything that’s in it, but it does sort of follow that structure of categorizing what the themes of the problems were and what potential solutions might be.

Emily Lewis: You know, it occurs to me that with most everything you’ve talked about today, the end goal is to make something better: make a product better, make a workflow better, make the user experience better. But are there times, are there situations when data actually leads design down a bad path or leads a product in a bad direction? Are there kinds of problems that can come from using data to drive design?

Timestamp: 00:40:05

Matthew Oliphant: I think the only times where there are problems with data driving design are when your initial premise or question was ill-informed in the first place.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: Which to me goes back to why you do test the test, that you don’t just assume that – well, I guess the other thing too is asking a lot of leading questions, and I see that a lot, where sometimes if it’s a non-UX person or if the business comes to you and says, “Oh, we’ve already done the research,” and I say, “Oh, well, can I look at what you came up with?” And they’ll say, “Yeah.” And here are all these questions of, “Would you like a feature that helped you live your life better and made you richer?”

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: Ah…

Lea Alcantara: The answer is yes.

Matthew Oliphant: The answer is yes.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: And they’re like, “Well, so we’re going to build that because they said yes,” and I’m like…

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: It sounds like marketing…

Matthew Oliphant: Hang on… [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: Yeah, garbage in, garbage out.

Emily Lewis: [Laughs]

Lea Alcantara: Well, I have a specific example. When we kind of framed and formed this question, I was thinking specifically about how there’s data out there that says, “Popup modals convert very well if your goal is to sign people up for a service or to a mailing list.”

Emily Lewis: [Agrees]

Lea Alcantara: But it’s been drilled down to us as designers and developers that it’s really intrusive, nobody really wants it.

Matthew Oliphant: [Agrees]

Lea Alcantara: There’s a Tumblr that’s full of screenshots of these, in my opinion, aesthetically offensive [laughs] boxes.

Matthew Oliphant: [Agrees]

Lea Alcantara: And then on another level, the popup box with shaming confirmations where instead of cancel or closing, it’s like, “No, I don’t want a good deal” is the actual text for the button.

Matthew Oliphant: Right.

Lea Alcantara: And I feel like clearly those types of things started popping up post-testing. So I’m trying to reconcile my own personal feelings with this and the actual “results” that it’s getting.

Matthew Oliphant: [Agrees]

Lea Alcantara: Can you tell us a little bit how that can be reconciled, and how as a UX professional you can figure out whether to put aside your personal feelings? Because just because I don’t like it doesn’t mean it isn’t useful for the client or even useful for the user.

Matthew Oliphant: [Agrees]

Lea Alcantara: How do you deal with something like that?

Matthew Oliphant: So I’m right there with you.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: Most of the sites, like if someone shares a link on Twitter or Facebook and I click it and if I get a popup, I close the tab.

Emily Lewis: [Agrees]

Matthew Oliphant: I’m like, “Well, I just don’t care enough to stay on this site and deal with it.” With that said, I think my answer is you should set aside your personal taste, because what it comes down to is there is always this friction between what the business is trying to accomplish and what the users are willing to put up with.

Lea Alcantara: [Agrees]

Matthew Oliphant: And that’s what you’re designing for.

Emily Lewis: [Agrees]

Matthew Oliphant: And it really pains me to bring up this story because on one hand, it’s really interesting. On the other hand, it validates something I hate, that you’ll go to a site sometimes and then they’ll say, “Hey, do you want to open this on our app?”

Emily Lewis: Oh, yeah, yeah, that.

Lea Alcantara: Yeah, yeah.

Matthew Oliphant: “Do you want to download our app?”

Lea Alcantara: [Agrees]

Matthew Oliphant: Well, I think Google did something a few months ago, or maybe it was late last year, where basically the research said, “This is a terrible way to do things.” So at the startup that I just left, we did our own study, and it was statistically valid because we got something like 20,000 data points within two days or something like that.

Lea Alcantara: Wow!

Matthew Oliphant: This is where A/B testing can be interesting sometimes. Fifty percent of the visitors got a popup, and fifty percent didn’t. Of the people who got the popup, like 99.9% of them just dismissed the popup, but stayed on the site longer.

Lea Alcantara: Aha.

Matthew Oliphant: Almost twice as long.

Lea Alcantara: Wow!

Matthew Oliphant: And we left it there. We didn’t do any more investigating, but the hypothesis was that even though it was a very short action of tapping or clicking on the X to close the popup…

Emily Lewis: They engaged.

Matthew Oliphant: They engaged and they invested time.

Emily Lewis: Wow!

Lea Alcantara: [Agrees]

Matthew Oliphant: And so they stayed longer.
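
[Editor’s note: a dwell-time comparison like the one Matthew describes is typically a difference-of-means test. The sketch below uses a normal approximation, which is reasonable at the sample sizes he mentions; the function name and the seconds-on-site values are invented for illustration and are not data from the study.]

```python
from math import erf, sqrt
from statistics import mean, variance

def dwell_time_z(times_a, times_b):
    """Welch-style z-test on mean time-on-site for two groups
    (normal approximation to the difference of means)."""
    se = sqrt(variance(times_a) / len(times_a)
              + variance(times_b) / len(times_b))
    z = (mean(times_a) - mean(times_b)) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical seconds-on-site for a popup group vs. a control group.
popup   = [120, 130, 140, 150, 160]
control = [60, 70, 80, 90, 100]
z, p = dwell_time_z(popup, control)  # z works out to 6.0 for these numbers
```

Note that, as Matthew says about A/B testing generally, a significant difference like this tells you *that* the popup group stayed longer, not *why*; the hypothesis about engagement and invested time still has to be tested by talking to people.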

Emily Lewis: [Laughs]

Matthew Oliphant: And I hated that that’s the result.

Emily Lewis: [Laughs]

Lea Alcantara: Wow! Wow!

Matthew Oliphant: But…

Emily Lewis: Man, I hate popups.

Matthew Oliphant: Yeah, I know.

Emily Lewis: Like now, like it’s harder to be like…

Lea Alcantara: Now, I’m like, “Should we have a popup?” [Laughs]

Emily Lewis: Oh no. We’re not doing that. [Laughs]

Lea Alcantara: No, I’m just joking.

Matthew Oliphant: Yeah.

Lea Alcantara: But those are the kinds of hard conversations that we have.

Emily Lewis: [Agrees]

Lea Alcantara: If you’re faced with that situation and your goal is further engagement and if it takes them one second to close that popup and then they spend minutes more on your site, then how do you argue with that?

Matthew Oliphant: [Agrees]

Lea Alcantara: And part of that too is like there are qualitative things that you can think about.

Matthew Oliphant: [Agrees]

Lea Alcantara: Like is there a cumulative ill will built up because they keep seeing that, and they’re like, “Oh, I’m never going to go to this site again because it keeps annoying me.”

Matthew Oliphant: [Agrees]

Lea Alcantara: Or are “normal” people, because we’re industry people so we’ve got our own perspective, but are normal people just like, “Eh, popups are a reality and I’m just auto-clicking X anyway without even reading.”

Emily Lewis: [Agrees]

Lea Alcantara: But just the act of doing that has a psychological trigger for them to like, “Okay, I’ve already decided I did want to read this article, so I’m going to read the article.”

Matthew Oliphant: Yeah. I think that in the short term, it doesn’t bother “normal” people as much as it bothers us.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: I’m pretty sure you’re right about that.

Matthew Oliphant: Yeah.

Lea Alcantara: Yeah.

Matthew Oliphant: Our premise is, basically, well, instead of these popups, maybe you should just provide better content.

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Matthew Oliphant: Or content that people are actually interested in or if you are popping something up, it’s because you’re telling people, “I’ll give you more content than what you can get on this site,” for some reason, you know?

Lea Alcantara: Right.

Matthew Oliphant: “There’s something more we can give you.”

Lea Alcantara: Right.

Matthew Oliphant: There’s something. It isn’t just “we want your data.” It’s “we want to trade your data for something better.”

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: But yeah, in the long term, I’m pretty convinced that all these strange interactions, strange design choices that we’re putting, or that people are putting, on their sites or in their apps are slowly driving people insane over the course of years.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: But yeah, I don’t think those popups are a good design solution, but at the same time, they seem to work.

Emily Lewis: Yeah. That’s the tension I feel. We had a client recently who asked us to add them to their site, and yeah, I didn’t like doing it, but they’re happy.

Matthew Oliphant: [Agrees]

Lea Alcantara: But we spoke to them about the pros and cons and there’s also like some compromise, like we have clients where there’s a time-based delay for the popups.

Matthew Oliphant: [Agrees]

Lea Alcantara: So it’s kind of like, “Well, if this person is already reading the site and they’re sticking around, then let’s give them something,” as opposed to just bombarding them with a screen to close right away.

Matthew Oliphant: [Agrees]

Emily Lewis: And also making sure it doesn’t keep popping up once someone has said no, with a cookie-based sort of solution.

Matthew Oliphant: [Agrees]

Emily Lewis: But let’s stop talking about the negative. [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: And let’s talk a little bit about the positive. I think it would be interesting to take this idea of using data to inform design and apply it to a “real world” situation. Our listener, John F. Croston III, wrote in and asked, “Would having more data on what kinds of ice cream sell best at Salt & Straw allow them to rearrange the list of flavors on their website?”

So basically, it sounds like he wants to know if they had more data on what their most popular ice creams were. Would that make sense to then apply it to the website and list the popular flavors differently?

Matthew Oliphant: So just for everybody who doesn’t know what Salt & Straw is, it’s an ice cream shop that does really whacky flavors like mint and sea urchin meringue.

Emily Lewis: Oh. [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: See, you say that, but as a little aside, whenever we go there, my wife always tries the thing that she thinks sounds the worst, and often it’s amazing.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Matthew Oliphant: And the mint and sea urchin meringue was amazing.

Emily Lewis: Wow.

Lea Alcantara: Interesting.

Matthew Oliphant: So that’s Salt & Straw. I would say this is where data is really important, because I think John’s question is sort of a solution-based question, which I’m not picking on John for.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: But it is assuming a lot of stuff, and I actually kind of like the idea that the flavor list rearranges itself based off of whatever criteria. But this is where, Salt & Straw being a brick and mortar, you may be designing a website, but as a designer in this case you would want to go and observe Salt & Straw to see how things work. And I would say, not having done any research on this, my guess is that the way Salt & Straw works, the website doesn’t really play a lot into people’s experience.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: People may just see what the new flavors of the month are because they have their standards and then every month they have new flavors, but the in-store experience is such that you basically have an ice cream concierge.

Timestamp: 00:50:04

Lea Alcantara: [Laughs]

Matthew Oliphant: And once you have their attention, they’re with you your entire time there, and you can try every single flavor and it’s totally okay. So the in-store experience doesn’t map to what the website does at all.

Emily Lewis: [Agrees]

Matthew Oliphant: So if Salt & Straw said, “Hey, we’re thinking of doing a little project to rearrange all the flavors on the site based on what’s selling best or what we want to highlight,” I’ll be like, “Well, you could do that, but is that really the best use of your time?”

Emily Lewis: [Agrees]

Lea Alcantara: John does have a follow-up question. You’re talking about how the website doesn’t translate well to the in-person experience, but I can see that having a mobile device, especially if you’re visiting Portland for the first time and people tell you, “Go visit Salt & Straw,” and you’re checking out either their site or an app, could be a different experience on mobile that could translate there. What kind of data would be useful to gather in regards to that, to see if there’s information you can find that would help draw people to Salt & Straw even more, or what have you?

Matthew Oliphant: I don’t think they’re having problems with drawing people. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: No. [Laughs]

Matthew Oliphant: Given that the line snakes around the block. But I’m thinking about this, and actually, in some respects, for the locals who want to just rush in to get a couple of pints, it would be useful to be able to know beforehand that they have pints available for the flavor that they’re looking for.

Emily Lewis: [Agrees]

Lea Alcantara: Right, like an alert.

Matthew Oliphant: And if the Alberta Street store doesn’t have them, but Division does, that they can just go there instead of wasting their time going to Alberta.

Lea Alcantara: [Agrees]

Matthew Oliphant: That’s where inventory driving what shows up on the site could be helpful. And then I think, as far as people coming from out of town, I’d almost want to see sort of a story-driven thing for Salt & Straw where it’s, “Hey, have you ever been to Salt & Straw before? Well, if there’s a line, it’s worth it.” Because I think Portland and some other cities have this thing where people stand in line for a long time for food.

Lea Alcantara: [Laughs] Yeah.

Matthew Oliphant: And most people don’t have that experience. Like they’ll go to a restaurant and they’ll be like, “Oh, it’s a 15-minute wait,” and it’s like, “Well, that’s too long.” Whereas you go to Salt & Straw and you may be standing in that line for 45 minutes, but it becomes a social event. And being able to know what the descriptions, and particularly the ingredients, are for all of the things that you’re about to taste could be pretty helpful.

Lea Alcantara: [Agrees]

Emily Lewis: Right.

Matthew Oliphant: Like saying, “This is the most popular flavor,” I think that tends to be the type of thing where if you don’t give it a time limit, the list is never really going to shift.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Matthew Oliphant: Because it’s sort of like the most viewed videos on YouTube almost always are going to be the same videos on the Top 10 unless something really amazing happens.

Lea Alcantara: Like Adele.

Matthew Oliphant: Right, yeah.

Emily Lewis: Yeah, it sounds like his question was just a good example of kind of what you shouldn’t do, that you should be really evaluating what you want to measure first.

Matthew Oliphant: Yeah.

Emily Lewis: And then coming in with a question.

Matthew Oliphant: I think, in and of themselves, they’re really good ideas, but if Salt & Straw came to me and said, “Hey, we want to redo our website or we want to build a mobile app,” I’d start with going and sitting in all the different Salt & Straws and seeing what people talk about while they’re in line.

Lea Alcantara: [Agrees]

Matthew Oliphant: What they expect to find when they get in and they’ve never been there before, and then that will inform the kinds of things that go into the app, you know?

Emily Lewis: [Agrees]

Lea Alcantara: Very cool, very cool. Well, we’re nearing the end of the show, but we’d like to know if you have some top resources for our listeners to learn about good user experience principles, what are good resources for those who are well versed in UX, but want to take their work to the next level?

Matthew Oliphant: I guess for people getting into it, or people wanting a bit more support because they’re earlier in their career, there’s essentially a fair bit of resources at usability.gov.

Emily Lewis: [Agrees]

Lea Alcantara: Oh, okay.

Matthew Oliphant: It’s all free. There are consent forms that you can use when you need to have people get consent for being recorded and things like that. It also has a lot of different methods that have been tested and are based on research.

Lea Alcantara: [Agrees]

Matthew Oliphant: So you can see like eye tracking on there, usability testing, different types of usability testing, and how much research there is out there that supports that this is a valid way of doing things, so that’s really good and it’s free. And then there are some old school books that I like, but they’re more textbooks. One is called Human Performance Engineering by Bailey, and one is User and Task Analysis for Interface Design by Hackos and Redish. But the newer stuff that Rosenfeld Media and A Book Apart tend to put out can be good introductory-level kind of stuff as well, but nothing really beats working with a mentor or working with somebody to guide you through this stuff.

Emily Lewis: [Agrees]

Matthew Oliphant: As for the people who are already well versed in UX, that’s a really good question, and I don’t know. It’s something that I’ve been thinking about a lot lately because I have gone to a couple of UX conferences in the past couple of years.

Lea Alcantara: Sure.

Matthew Oliphant: And there’s nothing for me there other than seeing friends.

Lea Alcantara: [Agrees]

Matthew Oliphant: Content-wise, it all tends to be for people who are just getting into the field or who have maybe one to two years’ experience, and I’ve been doing this for about 17 years now, and I’m trying to figure out, how do I get better at what I do?

Emily Lewis: [Agrees]

Matthew Oliphant: And one of the ideas I’m toying with, there’s this thing in cooking called staging. It’s a French term where one chef will go to work for another chef to learn another technique.

Emily Lewis: [Agrees]

Matthew Oliphant: And I’m trying to think of a way to get that to work from a UX perspective. I’ve got a friend who works at Salesforce, and he’s been doing this stuff longer than I have, and wouldn’t it be cool to go there for a couple of weeks and work on a project with him?

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Matthew Oliphant: And learn how he does stuff, and then that informs, like, maybe I need to be better at some aspect, or maybe I need to reconsider how I do something, and I can get that from working with him.

Emily Lewis: [Agrees]

Lea Alcantara: Mentorship essentially.

Matthew Oliphant: Yeah, it is a mentorship, but it’s like specifically around a project, so it’s mentorship between two experts, I guess.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Emily Lewis: I think that would be great for anyone working in our field. It’s probably a little harder for what you do because so much of it you’d want to like be in person with.

Matthew Oliphant: Yeah.

Emily Lewis: But I know when Lea and I have been brought in to another agency or collaborated with another, let’s say, front-end developer on something, getting to see how they work and even just getting a hold of their code is exciting for me because I’m always going to spot some new things that I’m not trying or not using, even though I’ve been doing this for 20 years.

Matthew Oliphant: Right, right.

Emily Lewis: Yeah.

Lea Alcantara: So you talked about like general UX resources. Is there anything that’s specific to like using data, specifically data for UX…

Emily Lewis: Testing.

Lea Alcantara: And testing for design?

Matthew Oliphant: Well, yes and no. There really isn’t anything out there, like from a tool perspective, that I’ve found that works really well across the board. I mentioned user testing; that tends to be helpful if you need to get a fair amount of people to react to something in a short amount of time, but it doesn’t really allow for in-depth conversations. It kind of depends on what you’re trying to accomplish, but for the most part, there really isn’t anything that exists that makes that easy. You tend to use multiple tools that all do different things.

Emily Lewis: [Agrees]

Matthew Oliphant: I mean, honestly, I tend to end up in spreadsheets more often than not, whether it’s going through survey results, or categorizing usability problems that are coming out of usability testing, or keeping track of research participants and things like that. I would say some of the old school tools are still the best tools.

Lea Alcantara: Right. Well, thanks, Matthew.

Matthew Oliphant: [Agrees]

Lea Alcantara: But before we finish up, we have our Rapid Fire Ten Questions, so our listeners can get to know you a bit better.

Matthew Oliphant: [Laughs] Okay.

Emily Lewis: [Laughs]

Lea Alcantara: Are you ready?

Matthew Oliphant: I’m always ready.

Lea Alcantara: All right, first question…

Matthew Oliphant: I’m not ready.

Emily Lewis: [Laughs]

Matthew Oliphant: Okay, now I’m ready. Now, I’m ready.

Lea Alcantara: Morning person or a night owl?

Matthew Oliphant: Neither. [Laughs] My optimal time is between 10 a.m. and 2 p.m.

Emily Lewis: [Laughs]

Matthew Oliphant: That is where I get most of my work done.

Emily Lewis: All right, what’s one of your guilty pleasures?

Matthew Oliphant: Korean dramas.

Emily Lewis: [Agrees]

Lea Alcantara: Oh, I love Korean dramas.

Matthew Oliphant: They’re so wonderful.

Emily Lewis: [Laughs]

Lea Alcantara: Boys Over Flowers.

Matthew Oliphant: [Laughs]

Lea Alcantara: What software could you not live without?

Matthew Oliphant: Any software, nothing specific, but any software that allows me to connect with my friends who live all over the place.

Emily Lewis: What profession other than your own would you like to try?

Matthew Oliphant: Cooking.

Lea Alcantara: What profession would you not like to try?

Matthew Oliphant: Anything that involves getting dirty.

Timestamp: 01:00:02

Emily Lewis: [Laughs] If you could take us to one restaurant in your town, where would we go?

Matthew Oliphant: I think we would go to Expatriate because they make good cocktails and have good food.

Emily Lewis: Is that like a bistro, like American bistro? Is that it?

Matthew Oliphant: No, it really is more of a cocktail bar. They just happen to have really good food as well, and they have more of an izakaya kind of thing where they have food that goes with the drinks.

Lea Alcantara: Oh nice.

Emily Lewis: [Agrees]

Matthew Oliphant: But it’s not Japanese food at all, but like you can say, “I want something that doesn’t have a lot of alcohol, that has sort of a citrusy flavor and is really refreshing.”

Emily Lewis: [Agrees]

Matthew Oliphant: And then they’ll put a drink in front of you and it will be exactly what you wanted.

Lea Alcantara: Nice.

Emily Lewis: Cool.

Matthew Oliphant: So it’s a nice little pub.

Lea Alcantara: If you could meet someone famous, living or dead, who would it be?

Matthew Oliphant: I don’t know. I think I would probably want to meet, I don’t know, I don’t really care about famous people.

Emily Lewis: [Laughs]

Lea Alcantara: Fair enough.

Emily Lewis: If you could have a super power, what would it be?

Matthew Oliphant: I think probably flying.

Lea Alcantara: What is your favorite band or musician?

Matthew Oliphant: I don’t know. I know it’s a terrible thing, but too many things are coming to mind at once. It really depends on the mood I’m in.

Emily Lewis: All right, last question, pancakes or waffles?

Matthew Oliphant: Well, waffles are just pancakes with little squares on them, so I’m going to go with pancakes.

Emily Lewis: [Laughs]

Lea Alcantara: Nice, nice.

Matthew Oliphant: That was a quote from a movie. [Laughs]

[Music starts]

Lea Alcantara: Oh.

Matthew Oliphant: Sorry.

Lea Alcantara: Which movie was that from?

Matthew Oliphant: Tapeheads.

Lea Alcantara: Oh, fun. So that’s all the time we have for today. Thanks for joining the show, Matthew.

Matthew Oliphant: Thank you.

Emily Lewis: In case listeners want to follow up with you, where can they find you online?

Matthew Oliphant: I am @matto on Twitter and that leads to all the other connections that I have.

Emily Lewis: Okay, excellent. Well, thanks for joining us today.

Matthew Oliphant: Thank you.

Lea Alcantara: CTRL+CLICK is produced by Bright Umbrella, a web services agency obsessed with happy clients. Today’s podcast would not be possible without the support of this episode’s sponsor! Thank you, Visual Chefs!

Emily Lewis: We’d also like to thank our partners: Arcustech and Devot:ee.

Lea Alcantara: And thanks to our listeners for tuning in! If you want to know more about CTRL+CLICK, make sure you follow us on Twitter @ctrlclickcast or visit our website, ctrlclickcast.com. And if you liked this episode, please give us a review on iTunes, Stitcher or both! And if you really liked this episode, consider donating to the show. Links are in our show notes and on our site.

Emily Lewis: Don’t forget to tune in to our next episode when Lewis Francis joins the show to talk about the QA process. Be sure to check out our schedule on ctrlclickcast.com/schedule for more upcoming topics.

Lea Alcantara: This is Lea Alcantara …

Emily Lewis: And Emily Lewis …

Lea Alcantara: Signing off for CTRL+CLICK CAST. See you next time!

Emily Lewis: Cheers!

[Music stops]

Timestamp: 01:02:47
