Duration: 48:00

Episode number: 98

Ethics & Technology

with Kristin Valentine

Summary

Doctors and lawyers have a code of ethics — maybe it’s time web professionals do as well! Kristin Valentine explains how more of us should consider ethical practices when designing and developing for the web. We talk about the importance of privacy and security, how other industries address ethical concerns, and guidelines to help navigate the complicated topic of applied ethics in web work.

Sponsored by

  • Craft CMS: Dot All Conference
  • Your ad here (dimensions: 520 pixels wide and 60 pixels tall)

Episode Transcript

CTRL+CLICK CAST is proud to provide transcripts for our audience members who prefer text-based content. However, our episodes are designed for an audio experience, which includes emotion and emphasis that don't always translate to our transcripts. Additionally, our transcripts are generated by human transcribers and may contain errors. If you require clarification, please listen to the audio.

Preview: Educating the client is a huge part of our job. Making sure they understand that this is vital, that it could have legal ramifications, and it’s completely unethical, in my opinion, to not inform the client of these issues.

[Music]

Lea Alcantara: From Bright Umbrella, this is CTRL+CLICK CAST! We inspect the web for you! Today we have Kristin Valentine on the show to discuss ethics and technology. I’m your host, Lea Alcantara, and I’m joined by my fab co-host:

Emily Lewis: Emily Lewis!

Lea Alcantara: Craft CMS is excited to announce the inaugural Dot All Conference for Craft developers. Dot All 2017 is taking place on October 22nd and 23rd in Portland, Oregon and will feature a free Craft workshop led by Ryan Irelan of Mijingo, a plugin development lab and a full day of sessions relevant to Craft and web development. For more information and to buy your ticket, visit dotall.com. Hope to see you there.

[Music ends]

Emily Lewis: Kristin had every intention of becoming a famous oboist, but instead she went into technology both as an interactive designer and web developer. She has championed accessibility, thoughtful systems implementation and user-centric interfaces, and today she’s here to talk with us about technology and ethics. Welcome to the show, Kristin.

Kristin Valentine: Thanks for having me.

Lea Alcantara: So Kristin, can you tell our listeners a bit more about yourself?

Kristin Valentine: Sure, sure. Yeah, I’ve been a web developer for about 12 years professionally. I was messing around in Geocities and Angelfire long before that, of course. [Laughs]

Emily Lewis: [Laughs]

Kristin Valentine: But yeah, when I’m not doing that, I usually try to spend as much time outside as possible. So I do a lot of mountain biking, road biking, hiking and all that good stuff.

Emily Lewis: And you’re in the Pacific Northwest, so it’s kind of like perfect area to do that kind of stuff, right?

Kristin Valentine: Yeah, yeah. I’m in Portland so it’s really easy to go outside, that’s for sure. Pretty much everyone who moves here ends up becoming an outdoor person. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: Yeah, it sounds a lot like Seattle too, yeah.

Kristin Valentine: Yeah. Yeah, I grew up in Spokane.

Lea Alcantara: Oh.

Kristin Valentine: So it’s not exactly far from me. [Laughs]

Emily Lewis: So Kristin, before we dive into how ethics and technology intersect, can you define ethics? What are they?

Kristin Valentine: Yeah, it’s kind of a hard thing to define in some ways, but generally speaking, it’s generally agreed-upon principles of conduct for people, and there are lots of different ideas around what that might mean. It’s kind of like the standing ideas of right and wrong, like right to life, right to privacy, do unto others as you would have them do unto you, and there’s also the idea of theoretical versus applied ethics. So theoretical ethics is more about understanding the nature of ethics and that sort of philosophical reasoning.

Emily Lewis: [Agrees]

Kristin Valentine: Whereas applied ethics is like, “In situation X, what would I do?”

Emily Lewis: [Agrees]

Kristin Valentine: So professional ethics falls under applied ethics in that way of thinking. The obvious examples I tend to think of are lawyers, who have very specific professional ethics they have to abide by or they’ll lose their license.

Lea Alcantara: Right.

Emily Lewis: It’s the same with doctors, too.

Kristin Valentine: Right.

Emily Lewis: People who have these critical professional roles, but it’s been a super, super long time since I was in college. [Laughs]

Kristin Valentine: Yeah. [Laughs]

Emily Lewis: But I had to take an Introduction to Ethics class for my degree, and it was more theoretical than practical, sort of how ethics have evolved in different lines of thinking. The most interesting lesson that I walked away with from that class – I’m not sure it was the best lesson for 19-year-old me – was from a group project. When we presented the project in front of the class, the professor then asked each of us if we wanted to be graded as an individual or as a group. I had worked with a group that was just not doing a whole lot of work, but I went with what I felt was the socially-acceptable choice and said, “Well, I want to be judged as a group,” and so he gave us a B, and then informed me, in front of the whole class, that if I had chosen individual, I would have gotten an A.

Kristin Valentine: [Laughs]

Emily Lewis: So like that was his lesson in theoretical ethics, “Look out for number one.”

Lea Alcantara: Well, it’s just interesting that you bring that up because before this show, you sent Kristin and me this link about trust.

Emily Lewis: [Agrees]

Lea Alcantara: And it was actually a game simulation about trust exercises and what would you do in situation A, B, or C and a lot of it was look after number one or look after the overall group or a variation of those.

Emily Lewis: [Agrees]

Lea Alcantara: And so it’s interesting that your Ethics professor basically did that in a live real world simulation.

Emily Lewis: Well, I had shared that game with you, it’s called The Evolution of Trust, and it’s an online game that works great on mobile, or at least it did on my Android, and it works nicely on the desktop. You know what, I played it the first time with the assumption of choosing to look out for number one each time, and that really works in certain situations in terms of everyone doing well, which frankly I think is the premise the game is trying to promote: everyone can do well, and here’s how you achieve that. And as for the lesson my professor was trying to teach, at the age I was, with the experience in life that I had at the time, I was pretty upset because I felt like I had missed out on something, but looking back, I learned a lot working in that group. So maybe I got a B, but I got some other things that maybe couldn’t have been measured, and I can sort of see that perspective as well.

Lea Alcantara: Well, I find that an interesting conclusion because I thought that, for me, the perspective of the game was that overall, depending on the length of time and the number of interactions, choosing the group is actually beneficial. As in, “Well, really, hmm, I don’t know. Was the real lesson actually finding a win-win for everyone?” Right?

Emily Lewis: That’s what I walked away with.

Lea Alcantara: That’s it, win-win.

Emily Lewis: Yeah, for everybody.

Lea Alcantara: Yeah, the win-win for everyone. But then if that’s the perspective, isn’t that a group perspective instead of individual?

Emily Lewis: Yeah, I mean, I guess it kind of comes down to what Kristin was defining earlier, like what people’s ideas of the right and wrong way of doing things are.

Kristin Valentine: Yeah.

Emily Lewis: And I think it’s kind of easy to point to lawyers and doctors because that stuff is codified. Like you said, they could lose their license if they don’t follow these rules, but then it becomes nebulous when you’re talking about industries that don’t have this defined, like ours or like some areas of tech.

Kristin Valentine: And it’s harder still when so many of us didn’t go to school for this, you know?

Emily Lewis: Yes.

Lea Alcantara: Right.

Kristin Valentine: When I went to school, there was no such thing as a web design class. There was no such thing as a web development class. [Laughs] And even a CS degree isn’t necessarily the same thing, and so, so many of us are self-taught, and even the people that go into six-week to six-month code boot camps, I seriously doubt they’re taking some kind of ethics class.

Emily Lewis: Yeah.

Lea Alcantara: So in regards to that, we’ve mentioned other industries, and we’ve already pointed out that our sector technically doesn’t have that structured learning for ethics. How do you believe ethics plays into our technology sector overall?

Kristin Valentine: Yeah, it’s a really great question. I mean, I think in the past it hasn’t really. [Laughs] I think that it’s a tricky thing because I think in some ways technology separates humanity from animals, you know?

Emily Lewis: [Agrees]

Kristin Valentine: Like from the beginning, we’ve created technology in the forms of whether it’s a rock to hit with or a wheel and obviously so much more than that, but it does set us apart, and so I think that we get stuck in this mindset that progress means everything.

Emily Lewis: [Agrees]

Kristin Valentine: It’s like we have to keep pushing ahead, and anyone who disagrees with this notion of technological progress is a crackpot or a Luddite. I mean, it goes back even to the Industrial Revolution when people started to say, “Hey, this is changing our lives in a way that is unhealthy.”

Emily Lewis: [Agrees]

Kristin Valentine: And those people were just swiftly ignored, and you start looking at things like… I think really when we invented the atomic bomb was when people really started to say, “Hey, is this progress? Is this what we want?”

Emily Lewis: [Agrees]

Kristin Valentine: And those questions have been around for a long time, and we’re coming back to them again with gene editing, artificial intelligence, self-driving cars. I think it’s really starting to come to a point where people are questioning, “Do we just blindly continue on with this ‘progress,’ or is this something that we, as a society, need to take a step back and look at?” Which is very difficult to do because we’re not really set up to do that, I think, in a successful way.

Emily Lewis: Yeah, I think because, as you were describing, there’s such a push for innovation in technology, there lacks the pull of reflection and analysis of, I guess, the human component. I mean, maybe we analyze the data, but maybe we don’t really analyze the people, and even the effects over time.

Kristin Valentine: Yeah. I think what we tend to look at is we get really excited. Every new thing is going to change the way we live, you know?

Lea Alcantara: [Agrees]

Kristin Valentine: Like the computer revolution was supposed to like free us from the binds of… [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Kristin Valentine: You know, the internet was supposed to level the playing field. That’s what was promised. It was supposed to bring up the poor and give everyone a chance. You could argue in some ways it has, but if anything, we’re becoming more and more polarized.

Timestamp: 00:10:04

Lea Alcantara: I find it interesting how you brought it all the way back. We’re talking about current technology, but I think the computer revolution, the information revolution, has a great parallel in the Industrial Revolution, in that, “Oh, we can have factories and we can have energy and we can just throw as many people into this problem and get things done.”

But at the time when it first started blowing up, there were no regulations around pollution and working conditions and all those things, where people were getting overworked and sick, and then on top of that, too, increasing disparity in economic opportunity, like the factory workers versus the factory owner kind of situation. It feels to me, when I’m thinking about it, that that’s very similar to what’s happening with the technological revolution, where we’re like, “Oh, innovate, innovate, innovate,” and objectively, if you’re a robot and you look at it, you’re like, “Yehey, progress! Stuff is getting done. Things are being made.” But at what cost?

Kristin Valentine: Exactly.

Emily Lewis: Yeah, I feel like, as we’re talking, just so many examples pop into my head of where we can see ethics at play in technology right now. I forget which guest it was, Lea. I feel like it was either Noah [Bernsohn] or Ben on that episode we did a while back on shipping and fulfillment.

Lea Alcantara: Okay, yeah.

Emily Lewis: But we were talking about how everyone has gotten so used to how inexpensive it is to ship things through something like Amazon and get them quickly that we’ve, in essence, perhaps perverted the sense of the value in what it takes to actually ship, and this has repercussions on other shipping institutions, not institutions, but companies and employees and workers. Or like Uber, and the reality of the people running the company versus the drivers, versus the impact on the local community and cab drivers. These are all real technological innovations that have very equally real ethical implications.

Kristin Valentine: I feel like one issue with that is sort of that startup culture that we have where it’s just innovate, innovate, innovate.

Emily Lewis: [Agrees]

Kristin Valentine: The other problem with that, too, is that our government, they want to create jobs.

Emily Lewis: [Agrees]

Kristin Valentine: They want to create economic opportunity and they see that tech is such a huge and growing industry so they want to attract that kind of innovation. They want to attract those kinds of software companies, but the problem is there’s very little reflection on how those innovations might impact the communities that they’re a part of.

Lea Alcantara: Yeah, that reminds me of our latest episode on mobile tech for global change and our guest, Rowena Luk, basically mentioned that technology is an amplifier of human intention.

Kristin Valentine: Definitely.

Lea Alcantara: And what’s interesting, and we discussed this in the episode several times, is how one of the issues when you’re deploying or creating apps for underserved communities, global communities, places in Africa, et cetera, is that people are making tech for tech’s sake and not actually thinking about the audience involved, and so it’s like, “Yes, let’s push to innovate, but what are you actually innovating on?”

Kristin Valentine: Or for?

Emily Lewis: [Agrees]

Kristin Valentine: Right.

Lea Alcantara: For, yeah.

Emily Lewis: So we’ve been talking about tech, let’s get a little narrower and talk about our field, web design and development. Where does ethics come into that sort of role as a designer or as a developer? Are there challenges that you see in today’s news or wherever that you think are demonstrations of the challenges we face as web workers around ethics?

Kristin Valentine: Oh, yeah, definitely. I think one thing, when I was younger, when I tended to think about ethics in what I was doing, I would tend to think, “Well, I wouldn’t ever work on a cigarette ad, or I would never build a porn site,” and that was really as far as my thinking got on the matter.

That’s fine and well for your personal standards, but that’s not really a general professional standard, like, “How could I bring ethics into everything that I do?” And I think it’s a little tough because it’s easy to feel like you don’t have a choice or you can’t make decisions, but specifically as a designer, clients hire you to make decisions, to consult on things, and to help them understand why doing this or that has ethical implications.

Emily Lewis: [Agrees]

Kristin Valentine: So I do think there’s a lot there, and actually, Mike Monteiro wrote a great article on Dear Design Student about this, that we all have choices and we have to ask ourselves day to day, how can we make the world better? It’s not just about working on nonprofit websites or with nonprofit organizations, but about how we can create ethical work day to day, and I think part of the problem is that even the organizations and associations that we currently have, like the AIGA, their ethical standards are on things that are more related to the client relationship.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Kristin Valentine: So keeping confidentiality and that kind of stuff, which is great. I mean, that’s certainly a part of it, but I guess I’m kind of wondering, “Where is the information about making sure that we’re not deceiving people, that we’re not tricking them, you know?”

Emily Lewis: Right.

Lea Alcantara: [Agrees]

Kristin Valentine: Like there are so many things in there. As far as development goes, there are some software engineering best practices and organizations that have ethical guidelines, which I think are a little bit less related to web development specifically.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Kristin Valentine: But they do a pretty good job of talking about privacy being a huge one, and generally, trying to create software that improves the world, but given the lack of formal education most of us have in this industry, I doubt many people have read them or know much about them, and I think unlike with lawyers and doctors, there’s no bite there. There’s no stick. [Laughs]

Emily Lewis: Right, right.

Kristin Valentine: So if you deliver software that’s not bug free, you’re not going to get your license taken away. So they’re guidelines, but they’re not very sticky.

Lea Alcantara: Although it is getting a little stickier when we’re talking about accessibility.

Kristin Valentine: Yeah.

Lea Alcantara: Because I believe in 2018, the federal government is actually going to start…

Emily Lewis: US.

Kristin Valentine: [Agrees]

Lea Alcantara: Yeah, the US government is going to start being a lot more strict about requiring federal websites or federally-funded websites to be more accessible.

Kristin Valentine: Yeah, yeah. I mean, that’s such a huge issue for me, and I think that that is one of the big ethical standards that I think everyone should have. I mean, there is a huge stick there. I mean, many huge companies have been sued for not having their sites fully accessible.

Emily Lewis: Yeah, and just to add a little bit to what Lea was saying, it’s not just websites, but also internal applications.

Lea Alcantara: [Agrees]

Emily Lewis: My boyfriend Jason is actually working on a project right now involving that sort of huge scope of all of the legacy applications and the use of Excel files and PDFs and all that other stuff, to bring these government agencies and the tools they use up to these accessibility standards, and I couldn’t agree with you more, Kristin. I mean, I can only speak for myself as a front-end developer, but I feel like that’s fundamental to what I do.

Kristin Valentine: Yes.

Emily Lewis: And I feel like it’s an expensive, incredibly time-consuming and resource-intensive effort to retrofit or remediate the stuff that wasn’t built right from the start.

Kristin Valentine: Yeah, and there’s an added problem that, for a lot of people, web development is a black box, yeah?

Emily Lewis: [Agrees]

Kristin Valentine: The developers go away. They come back when something has been made, and the client doesn’t really know the details about accessibility, or that it’s something they even have to think about. Educating the client is a huge part of our job.

Emily Lewis: Totally, yeah.

Kristin Valentine: Making sure they understand that this is vital, that it could have legal ramifications, and it’s completely unethical, in my opinion, to not inform the client of these issues.

Emily Lewis: [Agrees]

Lea Alcantara: Well, what’s interesting, too, is that the internet or internet access is considered now a basic human right.

Kristin Valentine: [Agrees]

Lea Alcantara: And like other basic human rights, there should be standards involved in actually applying that and making sure that this human right is not abused.

Emily Lewis: Or denied.

Lea Alcantara: Yes.

Kristin Valentine: Yeah. I mean, in terms of accessibility, there are standards that we can all apply to our websites and apps.

Emily Lewis: And that’s why the stick of legal implications is a useful one when you’re trying to talk to your clients. They may not appreciate the ethical obligation of making their site or tools or whatever accessible to everyone, but they probably will appreciate the legal implications.

Kristin Valentine: Yeah.

Lea Alcantara: Well, there are also the business implications there. Don’t you want to address 20% more of your audience? Don’t you want a 25% increase in customers and users?

Emily Lewis: Yeah, it’s true, but then you have the challenge of – let’s just say, Marketing has decided they want one thing because it converts.

Lea Alcantara: [Agrees]

Emily Lewis: You can point out all the examples of how it’s not accessible and could be excluding a portion of their audience, but that is their opinion, and do you know what I mean?

Lea Alcantara: [Agrees]

Emily Lewis: So it sort of becomes a big challenge.

Lea Alcantara: Yeah, it’s always tricky.

Timestamp: 00:19:50

Emily Lewis: Yeah, it really is, but Kristin is right, it’s our obligation to have that conversation. Let’s talk a little bit about people being unethical. You have written an article about ethics and technology, and you included an example, a practical situation of design. It was a bridge on Long Island that was intentionally built to keep people out, so it was intentionally unethical. Have you come across other similar intentional examples in tech or the web?

Kristin Valentine: Yeah, I think some of these things, intentional versus unintentional, are like kind of a hard thing to define sometimes.

Emily Lewis: [Agrees]

Kristin Valentine: Because a lot of it is lack of education, although I don’t think that’s a pass, but yes, I mean, I think anytime design is used to deceive people, which it often is when you think about those like popups that come up to try to get you to sign up for newsletters and stuff.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Kristin Valentine: When they have that huge button like, “Sign up,” and then underneath it, in really tiny text you can barely see, it says something like, “No, I don’t want your newsletter, I’m a horrible person.” [Laughs]

Emily Lewis: [Agrees] [Laughs]

Lea Alcantara: Yeah, yes, yeah. [Laughs]

Kristin Valentine: “I hate animals. No, I don’t want your newsletter.” [Laughs]

Emily Lewis: [Agrees] [Laughs]

Lea Alcantara: Yeah, yeah, yeah. [Laughs]

Kristin Valentine: I mean, for me, I have to hunt for that tiny little link that says, “No, I don’t want the thing.”

Emily Lewis: [Agrees]

Kristin Valentine: Imagine someone with poor vision or someone with a cognitive issue who thinks, “But I like animals.”

Emily Lewis: Right.

Kristin Valentine: For you and me, it’s probably easy to roll our eyes and close it, but that’s not true for everybody.

Emily Lewis: [Agrees]

Kristin Valentine: And I think that is incredibly unethical and extremely intentional.

Emily Lewis: I agree. Gosh, some of this stuff makes me upset.

Kristin Valentine: I know.

Emily Lewis: Because you see things improving, but at the same time they’re not. So as things improve, there are still new issues that are popping up or that aren’t being addressed.

Kristin Valentine: Yeah, and I think, too, I guess the big news story recently is how biased many algorithms can be about showing people this or that.

Lea Alcantara: Right.

Kristin Valentine: That’s pretty intentional. They create those algorithms with a very specific intention. They may claim that that’s not really what they’re trying to do, but what they’re trying to do is make money, regardless of how it affects people.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Kristin Valentine: So I would say, “Yeah, that’s pretty intentional.”

Emily Lewis: [Agrees]

Kristin Valentine: And along those lines, selling users’ information.

Emily Lewis: Yeah.

Kristin Valentine: Hiding those privacy settings so that people don’t realize that they’re being sold to advertisers. That’s starting to get better because people are kind of waking up to these issues, but it’s definitely still happening.

Emily Lewis: I mean, even something that, of course, is like a big topic right now, but the dissemination of false information and packaging it in a way that is really intended to deceive people about the legitimacy of the news source.

Lea Alcantara: Yeah.

Kristin Valentine: Yeah.

Emily Lewis: So I think some of those are really clearly intentional. What are unintentional things you see that designers and devs who are tuning in and want to improve how they approach their profession could be on the lookout for and try to start avoiding?

Kristin Valentine: Yeah, I think one thing is making sure you’re focusing on the people and not on the tech, as I mentioned earlier, which is kind of a broad thing to say, but I do think that gives you a sense of how what you’re creating might affect people, puts you in that mindset. But like we were talking about earlier, I think accessibility is one of those things where if a team is not educated on accessibility, they just might not even be thinking about it.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Kristin Valentine: I don’t think that’s a pass, but I think that can be unintentional.

Emily Lewis: [Agrees]

Kristin Valentine: I think also one thing we haven’t really touched on is security, making sure that the things that we create are secure, that they won’t leak credit card information or other sensitive data.

Emily Lewis: [Agrees]

Kristin Valentine: And so an unintentional thing can be people not setting their sites up to be secure, which is how so many WordPress sites get hacked.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Kristin Valentine: And I think that’s an unintentional thing, that WordPress is so easy to set up and anyone can do the 5-minute install, but people don’t realize they have to keep their software up to date and follow basic security standards to make sure those things won’t happen. Unfortunately, they do. I don’t think that’s really intentional, but the tools we use are becoming easier for everyone to use, which is great, except with things like security, people are just not thinking about that. They don’t have education and knowledge in those situations.

Emily Lewis: Could you talk a little bit more about keeping the user, the humans, the person, in mind? We had Beth Dean on the show. Lea, was it last year or earlier this year to sort of talk about emotional intelligence in design?

Lea Alcantara: Yeah. It was last year.

Emily Lewis: I feel like that is kind of like the sort of empathy in design theme that we are seeing a lot more in our industry. Is that kind of what you’re talking about?

Kristin Valentine: Yeah, for sure. Shout-out to Beth.

Emily Lewis: Whohoo…

Kristin Valentine: Yeah, she’s done a lot of… [Laughs]

Lea Alcantara: [Laughs]

Kristin Valentine: She’s done a lot of work in this area. She’s awesome. I mean, I think when you’re really focused on how this affects people, one example I can think of is what Eric Meyer brought to the forefront a little while ago, when Facebook had those Year in Review things that showed him his daughter, who had recently died, which was horribly traumatic for him, and you have to imagine when the designers created that, they were only thinking about positive things. They didn’t think through the many millions and billions of people who were seeing this. Not everyone had a great year. [Laughs]

Emily Lewis: [Agrees]

Kristin Valentine: Like how can we? Really, that’s kind of the point of UX research, right? Design research is talking to people, understanding all the different situations, which, obviously, for Facebook, being used by so many people, they must have to do so much research, but taking a step back and really asking, “How is this going to be used by people? How does this solve their problem? What are the things that we’re not thinking about?” I haven’t even had a recent death in my family, so it’s just not something that’s top of mind.

Emily Lewis: [Agrees]

Kristin Valentine: But you really have to step back and think about those things, and I really think that that’s where UX can come into play a lot because I think that’s kind of their job, that’s kind of…

Emily Lewis: They’re already asking those questions.

Kristin Valentine: Yeah, exactly.

Emily Lewis: Because of the nature of the work.

Kristin Valentine: Yeah, and I have to think that that can kind of create that, I guess, foil to the tech side. I was visiting my brother last weekend, and he’s a biomedical researcher, and we had kind of gotten into this topic a little bit. It’s interesting to hear him talk about how ethics are applied in medical research because to them, privacy is number one, like the privacy of the human subjects that are going through these randomized controlled trials, and so anyone who comes into contact in any way with anything has to take an ethics class.

Emily Lewis: [Agrees]

Kristin Valentine: And then they have these review boards. He’s at a university, and anytime there’s a question, they go in front of a review board that has somebody from the community on it, someone who’s completely unrelated to the field whatsoever, just a community member who has an interest in the subject.

Emily Lewis: [Agrees]

Kristin Valentine: Which is an interesting thing because they don’t really have a stake in the game other than making sure that the community is thought of, and then obviously, if one person does something that’s unethical, they’ll get fired, there are ramifications, and if the university as a whole is doing something unethical, they’ll lose their funding.

Emily Lewis: [Agrees]

Kristin Valentine: So there are all these checks and balances that start to come into play, but they have these foils. They have these people that are making sure that what they’re doing is ethical. In terms of animal research, they have veterinarians that take care of the animals, and they would like nothing more than to tell on the researchers and say, “This researcher isn’t doing something that’s ethical.”

Emily Lewis: Right.

Kristin Valentine: Like that’s their job, to be empowered to point out the unethical behavior of the researcher, and so when I think about technology, I’m like, “Who’s the foil?”

Emily Lewis: Yeah.

Lea Alcantara: [Agrees]

Kristin Valentine: Like who is empowered?

Lea Alcantara: Who are the checks and balances? Yeah.

Kristin Valentine: Yeah. I mean, it’s not even just that they’re the designer and they should have empathy, but what they really need is to be empowered to turn around and say, “Hey guys, I know what we’re doing is really cool and innovative, but it has these side effects that we’re just not really considering,” and I think that it’s hard because they’re still in the company, right?

Emily Lewis: [Agrees]

Kristin Valentine: And you know, it would be easy for other people in that company to turn around and be like, “Oh, well, yeah, but…” So it’s like how do we empower those people to really speak up, you know?

Emily Lewis: [Agrees]

Kristin Valentine: I don’t know, and that’s a huge challenge.

Lea Alcantara: I find it so fascinating when you mentioned how everyone is so concerned and serious about privacy in medicine, and for us, as we’re discussing this, and even to laypeople, it’s like, “Yeah, of course, medical information is really private.” However, all these web apps that we’re using probably have close to a similar amount of information on us. [Laughs]

Kristin Valentine: Yeah.

Lea Alcantara: And including medical information, to me, like I have my own level of privacy comfort or whatever, how much I’m willing to share, but there are people who share their medical information as well as, of course, all their addresses and phone numbers and everything to all these particular web apps. You know I use Mint, for example, and my banks are attached to that information and now you’re making me wonder, “Holy crap, all these people who are actually working on my info probably should take an ethics class.” [Laughs]

Kristin Valentine: [Laughs]

Emily Lewis: [Laughs]

Kristin Valentine: Yeah, it’s kind of scary to think about.

Timestamp: 00:30:00

Emily Lewis: Yeah.

Kristin Valentine: I mean, in terms of security and privacy, there are some things companies typically avoid, like, “I’m not going to deal with credit cards. I’m going to go find someone like Stripe where it’s all going to live on their servers and I know that I don’t have to deal with it.”

Emily Lewis: [Agrees]

Kristin Valentine: But like in terms of someone’s address, like there are no standards about the security of the database that those addresses live in.

Emily Lewis: Yeah. I think it’s really frustrating, but I think it’s an important conversation to have. I was also just thinking how challenging it is for just the three of us to sort of discuss this in a really hopeful, optimistic “Oh, this is what we do” way, [laughs] because it’s not clear right now.

But I think this is when we start getting into an even bigger challenge, when it goes outside of tech professionals speaking amongst themselves to talking to the broader public about technical issues that affect them, their privacy, getting them to care that — what is it — the ISPs can now track our data and sell it based on our browsing behavior. The concepts are so difficult even for us to talk about, so how do you inform the broader public so that they also vote and advocate for their interests where tech is involved?

Kristin Valentine: Yeah, in many ways I think that the public has to be the foil to the companies, like, “We have to discuss these issues.” I mean, I think people are starting to understand, but what’s interesting about that is that the people who control that flow of information are the tech companies.

Emily Lewis: [Agrees]

Kristin Valentine: So that’s kind of disturbing [laughs] when you think about it, but I think, unfortunately, it seems at this point that we, as the consumers of these products, have to say, “This isn’t up to my standards of comfort level of privacy, I’m not going to use it.”

Emily Lewis: [Agrees]

Kristin Valentine: But how do you evaluate that? I don’t have any answer for you.

Emily Lewis: Yeah.

Kristin Valentine: It’s really, really hard.

Emily Lewis: As we mentioned, there are really no formal guidelines for ethics on the web for people who build the web. If you could create a standard, what would be the basics? What would you have to include?

Kristin Valentine: I think some of the things that we talked about, like making sure what we create is accessible. I think that’s pretty basic. I think privacy concerns. I mean, the EU is definitely weighing in on this, like creating legal standards around privacy, around cookies, and gathering information, that whole right to be forgotten idea. So that is something that’s happening, not in the US, of course, and then there’s security, which is paramount. In some ways, the companies are starting this, like Google is starting to prioritize sites that use HTTPS, but I think it also comes down to confidentiality, honesty, and a big one is really just being transparent with your users about how you’re using their data. If there could be some way that we could rate or let people know like, “How does my data live here? How can I get it out?”

Emily Lewis: [Agrees]

Kristin Valentine: Like what happens when the company shuts down? Does all my data get sold to somebody? [Laughs] Like I feel like a lot of it comes down to transparency and understanding and also just understanding why it’s important, but yeah, I don’t know that I have like specific bullet points of exactly what I would include, but those are sort of the topics I think that stand out to me.

Emily Lewis: I mean, it feels like a not impossible, but just incredibly overwhelming challenge if you just think about the web standards movement and how — I mean, I don’t even hear people talking about “web standards” anymore because people are calling it different things: accessibility, progressive enhancement or whatever, but I think one of the things I liked about that notion is that, in a way, especially as a new developer who needs constraints, it was kind of my rule book.

Kristin Valentine: [Agrees]

Emily Lewis: These are my rules, these are my standards, therefore, these are the things I’m going to hold myself accountable for, and I think that’s useful and I wonder how much that kind of approach is being introduced, like I said, especially to new developers who do need something to measure themselves against.

Kristin Valentine: Yeah, I think having the accessibility standards is helpful, but there are so many other things to think about. It would be nice to have something more tangible.

Lea Alcantara: Well, what I liked was this article that Emily shared with me called Design Ethics in Practice, and the author had very specific questions that should be asked throughout the project that I thought were very straightforward and very clear. We’re going to link to this article in the show notes, but she breaks them up into three particular phases: questions you ask before your research and when you’re starting to prepare design, questions while you’re defining the problem during your research, and of course, questions after the research or after you’ve already started to design. Here are a couple of my favorites.

Of course, we’ve been kind of emphasizing this, but I like the specificity of some of these questions like: what are your assumptions about who will use this product? And then when you close your eyes and think of the personas you’re making, are you thinking of stock photos of smiling white people in large offices and houses? You know, like who are you not?

Kristin Valentine: [Agrees]

Lea Alcantara: Who are you not thinking about? And even the actual test itself, like the question, how accessible is the building where you will be talking to users? So when you’re actually doing the test, does the environment affect how the outcome is going to be, even if it’s for a technological solution? So I just found that really, really interesting. I think one way we could help with like formal guidelines is to have like these generalized questions like codified somehow, almost like the accessibility checklist or web standards checklist when you’re going through these design processes.

Emily Lewis: A designing for humans checklist. [Laughs]

Kristin Valentine: Yeah.

Lea Alcantara: Exactly, exactly, beyond questions like, is the contrast good enough, which is good for accessibility in terms of just visual design. Perhaps the real basis of all of that is context: what is the point of all of this, and who is going to be using it? Like you mentioned earlier, Kristin, getting people back into the focus of the app or the website in the first place.

Kristin Valentine: Yeah, and I think one thing that continues to be tricky there is what are the unintended side effects of the thing that we create, you know?

Emily Lewis: [Agrees]

Kristin Valentine: Like for your average site, like that maybe isn’t a big deal, but when we’re talking about something like inventing the iPhone, you know, like… [Laughs]

Emily Lewis: [Agrees]

Lea Alcantara: Right, right, right.

Kristin Valentine: When you start thinking through that, how huge the societal changes that have come from that are, you know?

Lea Alcantara: Right.

Kristin Valentine: Like, I mean, how could you possibly know? But when you start thinking about how much notifications have changed things, where you get like phantom buzzing in your pocket.

Emily Lewis: Yeah, yeah.

Lea Alcantara: Right, right.

Kristin Valentine: It’s like psychological changes in people. I mean, it’s overwhelming to even begin to think about, but I think we should start thinking about these things.

Emily Lewis: Even if nothing changes ultimately, I agree, we do have to think about these things because — I don’t know — not only do people want to innovate, but don’t we also want to create things that are long lasting?

Lea Alcantara: Right.

Emily Lewis: And the long lasting things are the things that help people, that really support a person’s quality of life.

Kristin Valentine: Yeah.

Emily Lewis: So Lea mentioned the Design Ethics in Practice article that has some of those specific questions you could ask yourself. Kristin, earlier you mentioned Mike Monteiro’s Designer Code of Ethics. Do you have any other resources that you think could help someone or even an organization or team establish ethical guidelines at their work?

Kristin Valentine: I think probably the best might be looking at other industries and see how they set up their ethical guidelines.

Emily Lewis: [Agrees]

Kristin Valentine: Because our industry is relatively new, and that’s why we don’t have a code of ethics. [Laughs]

Emily Lewis: [Laughs]

Kristin Valentine: I mean, software engineering generally has been around a lot longer than just web work, but take a look and see like, “What does it mean for a lawyer to be ethical? What does it mean for…” And perhaps a better example for us might be, “What does it mean for an architect to be ethical? Or what does it mean for like a general contractor, do they have a code of ethics?”

Lea Alcantara: Right.

Emily Lewis: A mechanic.

Lea Alcantara: [Agrees]

Kristin Valentine: Yeah, I think those might be a little bit closer to the stuff that we work on, but certainly, an architect has accessibility standards that they have to abide by. How does space affect the people inside of it? That’s something I think would be helpful for us as people who also create spaces where people gather.

Emily Lewis: [Agrees]

Lea Alcantara: Yeah. I think that’s a really great statement, even with the architecture, especially because of mobile devices. We’re creating websites and apps, but they’re interacting with it in a physical space and physical areas, and whatever we’re creating might not even end up in that mobile device. It could be in different types of spaces as well.

Emily Lewis: [Agrees]

Lea Alcantara: So that could be a really good standard to start looking at.

Emily Lewis: You know, Lea, that just gets me thinking about the technology that’s still just so, so innovative right now that it really hasn’t hit mainstream or even just the geeks, but the technology that we don’t even know yet where it’s even more human than before, you know, wearable technology, technology in your physical space where it’s not tied to any kind of device.

Lea Alcantara: Right, right.

Emily Lewis: Or, you know, people getting chipped. I was reading some article about that last week, people are getting microchips. What are the ethical implications of that? [Laughs] So I think it’s important that we start having these discussions on a more regular, serious basis because this is going to balloon as a challenge.

Timestamp: 00:40:03

Kristin Valentine: Yeah. And I think there are many other people, Anil Dash being one of them, that are having these conversations publicly, and I think another great resource for people generally interested in this topic is a great podcast called Note to Self. They have like a privacy game, I want to say, but they do a lot of great work just sort of exploring how tech influences us as people and as a society. It’s a pretty great one.

Emily Lewis: Before we wrap up, Kristin, I’m just curious. I think this has been a really fascinating talk, but I have to say I don’t feel super hopeful. So I’m just wondering, since this is something that you think about a lot and have written about on your company’s blog, what do you think about to focus on the opportunity, the hopefulness of how this could get better?

Kristin Valentine: I think just the fact that people are becoming more tech savvy and having these conversations is making me hopeful. Facebook has changed a lot based on people like starting to realize how much it’s like controlling their lives. It’s controlling the media that we consume. I think people are starting to finally wake up to that, and what I really want is for our government also to wake up to that.

I think, unfortunately, it seems like a lot of people in government tend to bury their heads when it comes to technology and they often say like, “Oh, I don’t even use the internet. I don’t even know.” It’s like this has huge implications for us as a democracy, as a society, like we all have to wake up here. I don’t know where laws and regulations need to come in here, but they certainly do and especially when we start talking about net neutrality, which I think is obviously a huge issue for everybody on the web.

Emily Lewis: [Agrees]

Kristin Valentine: But I think that the fact that as people become more and more tech savvy, that they’re going to start waking up to some of these issues, but at the same time, I think as people get used to things, they become blinded by them.

Lea Alcantara: [Agrees]

Kristin Valentine: There’s a really great quote from Langdon Winner’s “Do Artifacts Have Politics?” which is a great essay. He wrote, “In our times, people are often willing to make drastic changes in the way they live to accord with technological innovation. At the same time, they would resist similar kinds of changes justified on political grounds.”

Emily Lewis: Yeah. I’d even say social grounds, too.

Lea Alcantara: Yeah.

Kristin Valentine: Yeah. I think that’s the scary part, you know. I think when technology is kind of released into the wild, people get used to it and they’re unwilling to let it go.

Emily Lewis: [Agrees]

Kristin Valentine: They’re unwilling to see how it’s changing their lives because you kind of become used to it.

Emily Lewis: [Agrees]

Kristin Valentine: You become blind to it, and that’s a huge challenge, and I’m sorry, you asked me something optimistic. [Laughs]

Emily Lewis: [Laughs]

Kristin Valentine: And I’m telling you something super pessimistic. [Laughs]

Emily Lewis: [Laughs]

Kristin Valentine: But I do think people are starting to kind of see how these devices, how websites, how everything is affecting them and affecting their lives, and I hope that that starts to bring forth bigger conversations, and I think it is.

Lea Alcantara: Well, and we’re having this conversation.

Kristin Valentine: Yeah.

Emily Lewis: [Agrees]

Lea Alcantara: And we’re sharing the resources that we found, so that to me is hopeful in that these things are being carefully considered, and it’s just kind of the arc of time eventually.

Emily Lewis: [Agrees]

Lea Alcantara: Like we — again, you mentioned earlier, Kristin — the web is young. This industry is young. Sometimes it takes a certain amount of time for people to codify these particular standards, because even though we mentioned architects and doctors and lawyers, it took time for them to decide what makes sense.

Kristin Valentine: [Agrees]

Emily Lewis: [Agrees]

Lea Alcantara: Or even like let’s even be more broad, there’s the Constitution and how many amendments are there, right?

Emily Lewis: [Agrees]

Lea Alcantara: So at one point, somebody wrote a list of rules and then as society changes and people change and lives change, then you make changes to that as well, right?

Kristin Valentine: [Agrees]

Emily Lewis: [Agrees]

Lea Alcantara: And society has advanced because of that, and I think we’re just at the beginning stages of that for the web, and since other things have changed and improved, I don’t see why the web would be any different, right?

Kristin Valentine: And things change, too. Like even doctors are having to change as we start getting the technology around gene editing or all these other things that are going to start to become huge issues in the medical community.

Emily Lewis: [Agrees]

Lea Alcantara: Right.

Kristin Valentine: So it’s going to be an ever changing conversation for sure.

Emily Lewis: Excellent, excellent conversation, in my opinion. [Laughs]

Kristin Valentine: [Laughs]

Lea Alcantara: Yeah. I mean, this is one of those things where I really do feel like more and more of our peers should be having this conversation amongst themselves and with the public and everyone else because it’s a complicated problem to tackle, but without discussing it, nobody is tackling it, right?

Kristin Valentine: [Agrees]

Emily Lewis: Well, and it’s worth tackling.

Kristin Valentine: Yeah.

Emily Lewis: I think, to your point, Lea, about us being young and that it takes time, it is going to take time for everyone to see the win-win.

Lea Alcantara: Right.

Emily Lewis: For everyone to say, “Okay, I’ll build this site accessibly and I can only envision that benefiting this group of people,” but someday that person will see, “Oh, it benefits everyone and me.”

Lea Alcantara: Right.

Emily Lewis: So I think it just takes time to start seeing where the win-win is.

Kristin Valentine: [Agrees]

Lea Alcantara: Well, that’s all the time we have for today.

Emily Lewis: [Laughs]

Lea Alcantara: But before we finish up, we’ve got our Rapid Fire Ten Questions so our listeners can get to know you a bit better.

Kristin Valentine: Okay.

Lea Alcantara: Are you ready?

Kristin Valentine: Yeah. [Laughs]

Lea Alcantara: Okay. First question, introvert or extrovert?

Kristin Valentine: Extroverted introvert.

Emily Lewis: [Laughs] The power is going to be out for the next week, what food from the fridge do you eat first?

Kristin Valentine: Currently, there’s nothing in the fridge. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Kristin Valentine: So condiments. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: Catsup. [Laughs]

Kristin Valentine: Yeah. [Laughs]

Lea Alcantara: What’s your favorite website for fun?

Kristin Valentine: Oh, gosh. My feed reader. [Laughs]

Emily Lewis: What’s the last thing you read?

Kristin Valentine: Well, actually, I was reading Langdon Winner’s The Whale and the Reactor, so I’ve been reading lots of 1980s articles on tech. [Laughs]

Lea Alcantara: Very cool. What’s the best piece of professional advice you’ve received?

Kristin Valentine: I think that would have to be you have to ask for what you want.

Lea Alcantara: [Agrees]

Emily Lewis: How about the worst piece of professional advice you’ve received?

Kristin Valentine: Go get a graduate degree. [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Kristin Valentine: Which I did not do. [Laughs]

Lea Alcantara: What’s your favorite color?

Kristin Valentine: Red.

Emily Lewis: If you could take us to one restaurant in your town, where would we go?

Kristin Valentine: We would go to Yoko’s Sushi, which is awesome.

Lea Alcantara: What’s your favorite board game?

Kristin Valentine: I think it’s called Small World. I haven’t played it in a little while, but…

Emily Lewis: All right, last question, Hulu or Netflix?

Kristin Valentine: Netflix.

Lea Alcantara: So that’s all the time we have for today. Thanks for joining the show.

Kristin Valentine: Thanks for having me on. It was fun.

Emily Lewis: In case our listeners want to follow up with you, where can they find you online?

Kristin Valentine: I am on Twitter and I’m @kristinjval, and then I started a Medium publication very recently kind of around these topics, around ethics and how tech affects society with the same username, @kristinjval. Right now, there are just two of my own articles, but I’d love to have other contributors, so if you’re interested, please get in touch.

[Music starts]

Emily Lewis: Thanks again, Kristin. I really think this was a good conversation and definitely, hopefully, puts some food for thought in our listeners’ minds.

Kristin Valentine: Awesome, thanks.

Lea Alcantara: CTRL+CLICK is produced by Bright Umbrella, a web services agency obsessed with happy clients. Today’s podcast would not be possible without the support of this episode’s sponsor! Many thanks to you, Craft CMS!

Emily Lewis: We’d also like to thank our partner: Arcustech.

Lea Alcantara: And thanks to our listeners for tuning in! If you want to know more about CTRL+CLICK, make sure you follow us on Twitter @ctrlclickcast or visit our website, ctrlclickcast.com. And if you liked this episode, please give us a review on Stitcher or Apple Podcasts or both! Links are in our show notes and on our site.

Emily Lewis: Don’t forget to tune in to our next episode when Laurie Voss joins the show to discuss the current state of web development trends. Be sure to check out ctrlclickcast.com/schedule for more upcoming topics.

Lea Alcantara: This is Lea Alcantara …

Emily Lewis: And Emily Lewis …

Lea Alcantara: Signing off for CTRL+CLICK CAST. See you next time!

Emily Lewis: Cheers!

[Music stops]

Timestamp: 00:48:14
