Episode 10 - The Ethical UX Exploration: How Digital Products Dictate Lifestyles

In the season 2 premiere of Thoughtcast by Onething, Divanshu sits down with Manik and Venky from the team to discuss a crucial topic in the world of design: ethical UX and the use of dark patterns. We run through various examples, including Netflix, and talk about how the laws of UX are sometimes exploited to achieve business objectives and KPIs. We then discuss how to find a balance between the business side and the ethical side of things, with Venky bringing to the table some real-world examples of where this decision had to be made, including Lokal!
All told, the discussion of ethical design is an important one. UX designers hold a real responsibility to deliver honest products to users, and the solutions discussed in this podcast may just help you on your own UI/UX design journey. Tune in to this episode of Thoughtcast by Onething now!

Episode Transcript

[00:00:00] Hey guys, welcome to the all-new season of Onething's podcast, which is now a Thoughtcast. I have with me Manik Arora and Venky Hariharan from the team. Manik is my co-founder, and Venky is one of the founding members at Onething and also a UX lead. Today we are going to discuss something in UX that is slightly controversial, slightly on the edge, but very important and often missed. We are going to talk about ethical UX. I believe that as designers it's our responsibility to take ethics into consideration and deliver experiences in an ethical manner, whether we are researching, designing, or delivering a product. What we have realized is that as designers, knowingly or unknowingly, we end up conducting practices that can result in manipulative, overly persuasive UX that might not be in the user's interest. In this discussion, we want to bring out those touch points with users and show how, as designers, we can design responsibly and take care of such things in the UX process. Manik and Venky, we've designed hundreds of products together. What do you guys think? When it comes to balancing UX objectives against business or compliance objectives, where do you see that line diverging?
I think the first thing is, as you said, it's now very critical for designers to think about ethics. We design what people do with their time; we design, for them, what people should do. We may not always realize it, but that's a huge responsibility.
[00:02:00] Not to forget that there's psychology behind it: mapping user behavior and keeping people on the product through the maps you've created of that behavior. So it's very easy to be manipulative in your practices and in the way the product transforms the user. That's something very critical at this point.
[00:02:34] I think that's the point: it shouldn't be dark at all. As a designer, everyone talks about how to design ethically and sustainably, in a way that benefits both the business and the user. Digital media consumption is growing, especially post-pandemic. More and more people are coming onto digital platforms, accessing information, doing transactions; everything that previously happened offline is now getting replicated digitally. The newest of these users are people like my mother, my parents. I think we have a social responsibility as designers to guide them in a way that does not make them do things they would not want to do. So how do you take care of that? How do you bring that equation into the design process? There is a very, very thin line between persuasion that is done for something good and manipulation that is done only for business objectives.
[00:04:08] Take WhatsApp, the example you gave. Many of these users believe everything they read on it, anything and everything.

[00:04:24] So even though WhatsApp does add a small 'Forwarded' label, it's too small and goes unnoticed. I would say that label needs to mean more than it does.
[00:04:45] Yeah, and bringing it back to the design side of things, we are the ones who put those practices in place. It starts with letting people know what they're getting into.
[00:05:00] And it's just as much a matter of telling them how to get out of it. I think that's where a lot of businesses fail.
[00:05:07] Consciously or unconsciously. We were discussing subscription services: it's very easy to get into a subscription model, but very difficult to get out of it. If businesses made sure that exit loop was just as easy, I don't think we would be here discussing how to be ethical, because that's one of the key concerns.
[00:05:32] You have to understand the business needs and capabilities and design for them, but start the process by thinking of the implications ten years down the line. What if the product goes huge?
[00:05:52] A hundred thousand users becomes [00:06:00] six figures, then a million users. That's the size of a small country, and the product goes everywhere with them. So either you grow it as an ethical, sustainable model and product, or you let it turn into a virus. And there are many products
[00:06:26] that have followed that second model. When you've been on the other side of that equation, you've seen lives completely disrupted, in the worst possible way.
[00:06:37] When people talk about these different types of manipulation, anything that is not done on the face of things goes under the table. Take GoDaddy, for example. It may not feel mainstream, but it has a huge audience, and it's a very clear example: they [00:07:00] add things to your cart without letting you know. You don't choose what is in your own cart!
[00:07:05] Absolutely. I mean, why would you do that?
[00:07:12] As you said, business objectives dictate a lot of design decisions, and that's where this discussion is headed: how much does the design get influenced by the business? I'd rather design for the human experience than for a business objective, because ultimately what matters is the longevity of a brand, not short-term goals.
[00:07:36] And I think that's where businesses also need to understand that designers don't just bring in screens and visual delight; they bring in brand loyalty. That's what gets missed when design is treated as something that's simply bought. One of the key lessons from our past, which we know now, comes from a project I love to talk about: Lokal. It starts in the research phase, and I think that's exactly where ethics needs to begin. Lokal is a product built for local-language news and classifieds

users, and as soon as we thought about the first three cities, we went in on assumptions about how tech-savvy those users would be, how much they would know.
[00:08:55] And we jumped in with that notion, right? But in the process, thank God for an [00:09:00] ethical process of user research, which showed us that these people knew more than we initially thought, and were actively using their devices for pretty much everything we do, maybe even more sometimes.
[00:09:16] So that is where it comes into play: your user personas also play a major part in being ethical, and Lokal really broke that notion for us. Right. And with Lokal especially, that came with an even bigger responsibility, because the people we were building for are more susceptible to dark patterns.
[00:09:40] We can still identify those patterns and steer away from them; otherwise these users may fall right through them. It ties back to what you were saying about affecting change within a hundred thousand people. These are the next billion users, exactly. To the point where it's a big, big game.
[00:10:01] The Bill and Melinda Gates Foundation, Google, Facebook, all of these organizations are coming together to help this audience get into tech in a sustainable fashion. And that's part of what we are discussing: they are doing their part in trying to bring ethics to this new set of users, because it's difficult to break a bad chain once it has formed. Yeah, let's talk about a couple more examples.
[00:10:29] These are very interesting; I think that'll be fun. So let's start with Netflix, since we were talking about it. Binging Netflix is the bane of my weekends, and there's a reason you spend so much time on it. As we were saying about making things sustainable, I think Netflix has gone in the other direction in that regard.
[00:10:56] Right? First, let's talk about the way they [00:11:00] label their content: movies, TV shows. You would expect a proper guide, but none of that shows up on Netflix, at least not in the first few screens. Instead you get categories like 'bingeable dramas'. You don't even notice these things. They're designed so that you just keep scrolling; they keep you there the entire time, scrolling and increasing your time on the product.
[00:11:30] There is a reason they say their biggest competitor is sleep. But what I've realized, after using Netflix for so many years, and especially last year, is that I spend more time on Netflix finding something to watch than actually watching it.
[00:11:49] Exactly. I'm handed such an immense amount of decision-making that I end up watching a show I've actually watched before, and I don't even know why. Exactly, and [00:12:00] there's a reason for that. Firstly, they play on your decision fatigue. They want to be part of the decision-making process.

[00:12:06] Then they show you these big trailers that, if you're watching on TV, cover half your screen. So you're just watching these moving images. That's exactly what's being done to us, and we let it happen.
[00:12:25] And then we were discussing how they don't give you the time to choose between episodes, right? How long is that window, really?
[00:12:47] And when they don't give you enough time to make a decision, that directly ties into what we were talking about in psychology: the Doherty threshold, [00:13:00] which says that when the system responds to the user within about 400 milliseconds, the interaction feels seamless and keeps you engaged.
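The autoplay behaviour being described can be sketched as a tiny policy check. This is purely a hypothetical illustration, not Netflix's actual logic; the class name, the opt-in flag, and the 10-second minimum countdown are assumptions chosen for the sketch:

```python
# Hypothetical sketch: an autoplay policy shaped by the Doherty
# threshold discussion above. Instead of jumping to the next episode
# almost instantly, it gives the viewer a deliberate window to cancel,
# and only autoplays at all if the viewer has explicitly opted in.
from dataclasses import dataclass


@dataclass
class AutoplayPolicy:
    opted_in: bool          # viewer explicitly enabled autoplay in settings
    countdown_seconds: int  # time given to cancel before the next episode

    def should_autoplay(self) -> bool:
        # Autoplay only when the viewer asked for it AND the countdown
        # is long enough to make a real decision (not a sub-second jump).
        return self.opted_in and self.countdown_seconds >= 10


# Dark-pattern configuration: no consent, near-instant jump.
dark = AutoplayPolicy(opted_in=False, countdown_seconds=1)
# Ethical configuration: explicit opt-in plus a generous countdown.
ethical = AutoplayPolicy(opted_in=True, countdown_seconds=15)

print(dark.should_autoplay())     # False
print(ethical.should_autoplay())  # True
```

The design choice is the one the hosts describe: the fast response that the Doherty threshold rewards is used to serve the viewer's stated preference, not to remove the decision from them.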
[00:13:06] And Netflix exploits exactly that. Before you can even decide, it's already loading the next episode, and suddenly you're caught in a loop of never having to make a decision. If you actually look at the big digital products, the ones valued at billions of dollars today, they all have those patterns.
[00:13:27] Right. There is hardly a product that reaches that scale without such patterns. Lately, dark patterns have been categorized into a few buckets. One is nagging, where you keep pushing somebody to take a decision and never leave the person alone.
[00:13:52] We talked about that concept earlier; almost any product does it. Look at the prompts on your phone or your smart TV pushing you to subscribe. Then there is sneaking: come on, Facebook goes under the carpet and pulls out everything it wants without even letting the user know, 'Hey, I'm reading your information.' And then there's interference, where the user is made to do something he is not willing to do, or is nudged into decisions he never wanted to make.
[00:14:38] Okay, Netflix. If you combine these, they are being sneaky too. And I think one of the key things is that what looks like growth for the business actually carries a cost, because what you've essentially done is taken [00:15:00] someone's life and sold it back to them in a fancier package. That's what you're basically doing:
[00:15:06] building a product that is nothing but data mining. And the group this impacts most is the vulnerable set of new users. Then there are forced actions: you force somebody to take an action that favors your business objectives or organizational objectives, right?
[00:15:28] So here's the question: can we only design ethical products at the cost of business objectives? I don't think so, because businesses can function without giving up ethics. Let's say we accept slower growth: instead of a hundred thousand users, you decide to acquire a thousand users in the first month.
[00:15:53] But if you've done it right, you can be damn sure that it is going to spread, with
[00:16:02] [00:16:00] more people coming in because you've built it right. Ethical building doesn't mean you don't grow; you're building for slower growth, a slower approach, while still using the same processes.
[00:16:19] you’re just informing the user ethically. It doesn’t have to be so far out as you go
and check that everyone is on the right track and all of that. It just has to be something as
small as, Hey, check this box to let us know whether you want to receive emails from us.
Check this box to let us know if we can track you all your data for better, you know, look at
the way Airbnb does it. It tells you.
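The explicit opt-in being described here can be sketched in a few lines. This is a hypothetical illustration; the function and field names are made up for the example, and the only point is that every consent flag defaults to off, so silence means "no":

```python
# Hypothetical sketch of the opt-in pattern discussed above: consent
# defaults to "off", and the product only acts on a flag the user has
# actively turned on. Names are illustrative, not any real product's API.

def build_signup_consent(email_opt_in: bool = False,
                         tracking_opt_in: bool = False) -> dict:
    """Return the consent record for a new account.

    Both flags default to False: the business only emails or tracks
    users who explicitly checked the corresponding box.
    """
    return {
        "marketing_emails": email_opt_in,
        "behavioural_tracking": tracking_opt_in,
    }


# A user who ignored both checkboxes: nothing is enabled for them.
print(build_signup_consent())
# A user who ticked only the email box.
print(build_signup_consent(email_opt_in=True))
```

The dark-pattern version of the same form would simply flip those defaults to True, which is exactly the kind of pre-checked box the hosts are arguing against.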
[00:16:46] Airbnb surfaces so many of these disclosures that aggregation platforms don't bother with. I love Airbnb as a product for the way it [00:17:00] informs my decision-making. I'm sure there are things I have not noticed that could be manipulative, but on the whole it's good.
[00:17:04] Apple is another one. For all their faults, they have made sure privacy is protected, even against government pressure.
[00:17:23] And saying that out loud, that alone is enough. WhatsApp too: 'your messages are end-to-end encrypted,' and then going to war with governments over giving them a backdoor to those messages. Of course there will still be plenty of dark patterns in the house, but with features like Screen Time coming up,
[00:17:51] And that is exactly what ethical UX is about. It's not just 'always do the right thing'; it's about helping your users [00:18:00] make the right decisions. Just let them make an informed decision, and that's it; that's all we need to enable them to do. But let's talk about how we do that.
[00:18:10] As designers, when do we put our foot down? How do we convince business people, people driven by business objectives? How do we convince them not to grow fast but to grow slow? I think it comes back to the fact that the same goal can usually be achieved in several other ways.
[00:18:34] Ethics only breaks one growth pattern; there can be more than one way to grow. So there's something on us: an ethical mediation that we should bring to the table whenever we're building a product, where we get compliance, the business team, and the design side together.
[00:18:53] It's all about laying it out: explaining why it is important to design in a good, ethical way. It's not about overriding anybody; it's about being collaborative, because the business side often doesn't realize the importance of being ethical.
[00:19:12] Have you informed them about dark patterns? If we do UX in an ethical way, there will of course be an effect on business objectives; the results will still come, but they will come more slowly.
[00:19:42] Convincing people to do that, to go back and wait, is going to be difficult. Of course, I'm not saying we shouldn't go there, but it's going to be difficult. So how do we do that? We can understand the [00:20:00] impact of our own solutions, but ultimately business people are going to decide what is good
[00:20:06] and what's not, and often the user isn't the target at all; they are fine with not keeping the user in mind. What do they go for instead? More users, more revenue. Exactly. It's become the norm; that's the problem. So as designers, it's up to us to actually point it out. If you don't call it out, it will always be normal.
[00:20:32] That's the problem. You have to be able to put your foot down and say: here you are pushing someone, in confusion, to give out their data; in this section, when they're signing up and you're onboarding them, don't do that; use this line instead. We, as designers, have to put it out there: when something is wrong, you call it out and you let them know what's wrong.
[00:20:59] And if the business team still persists, you at least want it on record that, you know what, I tried, because this is not something I would want my loved ones to be using. That's why I'm trying to do right by myself as a designer and by users as human beings. So where does ethics fit into our design process?
[00:21:22] It fits into the whole of the design process, through and through. During the research phase, if your client comes in with a biased opinion about the personas that use their product, it's up to you to do the research, ensure those biases are removed, and let them know that this is not how people actually use the product. And then there's research on the impact of our solutions.
[00:21:45] Long-term impact, ideally, yeah. But the fact of the matter is, it again comes back to a business decision whether that's feasible for us as service designers, because often clients just don't continue with us after [00:22:00] delivery; they have their solution and they've gone away.
[00:22:04] Right. So how do we ensure that, either during our discussions or somewhere in the handover, we let them know what the impact is, embed it in the work, and just hope to God they use it? But then, as a design company, at least what we can do is stick to our values.
[00:22:25] At least in how we perform business, and in whether we are at ease with ourselves. A couple of years down the line we didn't know that ethical UX was the way to go; today we do know, so it's embedded in our values as well. Even if a client walks away over it, we should be fine with that.

[00:22:47] Of course. If something is bad and we shouldn't be the ones designing that impact, and the client still wants to do it, I think we should just say [00:23:00] 'thank you so much' and walk away. That's our part of the business decision. Thankfully, our business model, along with our design ethos, is ethical.
[00:23:09] If you align those two together, you'll always find a very clear line; there's only so much greed you can bring in. And honestly, is there anyone else in this whole product design or development life cycle who could raise this topic and make people aware before decisions are made? It's on us.
[00:23:30] Absolutely, because we are the ones doing that initial research. We are the ones talking to the actual users at the end of the day. As designers, you are looking at the full picture. And like I said, our whole industry is built on psychological and human behavioral facts that existed long before computers or mobile phones.
[00:23:52] So obviously these are some hard truths of our humanity that we know, and it's up to us to convert them into a product [00:24:00] that serves the people rather than just the business, one that balances both. Yeah. For example, in terms of balancing, let's discuss games.
[00:24:13] Back in the day, there was no concept of a loot box. In what game did you have to go through a loot box to get items? None; you bought the game and you got everything. Not anymore, especially in mobile games.
[00:24:43] Exactly. That is where we have to think about these things. You would have heard a lot of news about children spending
[00:24:57] money without their parents knowing. It's a major, major [00:25:00] issue these audiences are dealing with, and yet the products don't keep checks in place, something that asks, 'Hey, is the cardholder actually the player?' There is two-factor authentication for your bloody Netflix login, but none for buying an in-game item.
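The missing check being described could look something like the sketch below. It is a hypothetical illustration, not any game's real purchase flow; the re-authentication flag and the spending limit are assumptions chosen for the example:

```python
# Hypothetical sketch: real-money purchases (e.g. loot boxes) require the
# account holder to re-authenticate, the same way streaming logins are
# protected, instead of going through on a single tap a child could make.

def authorize_purchase(amount: float,
                       reauthenticated: bool,
                       spending_limit: float = 10.0) -> bool:
    """Allow a purchase only with fresh authentication and within a limit."""
    if not reauthenticated:
        return False  # one-tap purchases are refused outright
    return amount <= spending_limit


print(authorize_purchase(5.0, reauthenticated=True))    # True
print(authorize_purchase(5.0, reauthenticated=False))   # False
print(authorize_purchase(50.0, reauthenticated=True))   # False
```

The two gates mirror the two complaints in the conversation: nobody checks whether the player is the cardholder, and nothing caps what can be spent before someone notices.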
[00:25:20] And that's how you end up with kids, as you said, spending an hour and a half on something useless and spending real money on it too. So it comes down to that: that's when you can be accountable for your product, and that's when you can be ethical about it. The way tech is advancing, if we're missing out on basics like a simple alert that says, 'Hey, there's suspicious activity on your card or your account,'
[00:25:55] then we've [00:26:00] certainly failed at something we should have built in. The idea is just to highlight the point that as designers, it's our responsibility to be more aware of how our work, our design, could impact society in general, and especially sets of users
[00:26:18] we probably haven't thought of. Right. So with that, we should wrap up this discussion and ask our audience: if you have any viewpoint on ethics, if you've faced instances where you had to weigh your design decisions against business or compliance objectives, and however you dealt with it, please leave that in the comments. We'd love to hear your feedback.
[00:26:44] This is the new format of Onething's podcast, and we'll keep bringing you more exciting stuff and interesting conversations. And yeah, maybe the one takeaway from this podcast: [00:27:00] just keep these things in mind. Make your app usable, make privacy more than a checkbox, make your process sustainable and ethical overall, and above all keep your user in mind, looking out for all the problems they might be facing.
[00:27:17] In all of this, look out for your user before you look out for your business needs. I think those would be the key things. If you get this one thing, ethics, more integrated into your design process, that's the Onething to start with, I would say. Cool. Thank you, guys. It's been fun, everyone. Do leave your comments, as I said, and we'll keep bringing you more content, more people, and great conversations.
[00:27:50] And I'd love to know, as Manik said, if you have any more topics for us to discuss, or people you'd like us to bring in to discuss them. [00:28:00]
