Design and Ethics: How designers fulfill their responsibilities
Published: August 02, 2019
Digital design has an immediate, enormously scalable impact that is often difficult to reverse. It’s even more worrying, then, when UX, front-end and service designers and developers are unaware of this influence and the responsibilities that come with it. Lead Experience Designer Henning Fritzenwalder and Product Strategist Alexander Steinhart talk about the negative developments that can accompany the tech hype and how designers can respond to them.
This interview was conducted by Nina Kirst and first appeared in German in PAGE. It was translated by Thoughtworks and is published here with the publisher’s endorsement. PAGE 07.19 is devoted to the topic of ethics and offers tips on how to act ethically as a designer.
Nina Kirst: Why is ethics such an important issue for designers right now?
Alexander: Thanks to the ever-increasing use of smartphones and the mobile internet, designers can reach many more people and have a much bigger impact today than they had in the past. Digital products increasingly influence users’ decisions and behaviors, which increases designers’ responsibility.
Henning: Speaking about ethics is complicated because ethics is not clearly codified like a law; it is always a matter of judgment. When it comes to technology, ethics must also be approached realistically and proactively: the law doesn’t regulate many things that are technically possible today. Legislation always responds to existing issues, but often quite late. Being a responsible designer or developer means forming a position on specific practices before legislation addresses them, and preferably before you apply and implement them. For example, what do you consider appropriate and fair in terms of copyright on the internet? If you haven’t yet formed an opinion on this, you may find yourself in situations where you act unethically.
Can you share some examples of these situations?
Alexander: You can always ask yourself whether capturing attention at all costs is appropriate and ethical. Just because the business models of Google and Facebook are based on it doesn’t mean that all digital offerings need to follow. We should instead focus on what we can and want to do for the user, and especially for the person behind the user, and design a product that supports exactly that goal. Most users don’t want to spend more time on the internet; they want fast and efficient applications. Designers need to strike a balance between the benefit of using an app and what the application costs the user, such as time and attention. The more technology enters our everyday lives, the more we have to think about what a healthy relationship with this technology looks like. Addiction and dependence are signs of a toxic relationship.
Henning: This is where dark patterns come into play: design tricks, bordering on fraud, that manipulate users. They might briefly increase click-through rates or sales figures, but when users feel betrayed by them, trust in the brand and the company suffers in the long run.
What if the customer insists on dark patterns: can designers freely say no?
Alexander: Designers have plenty of room for decision-making, and they can initiate important discussions. It often helps to look at a company’s goals and values. Is it about short-term sales or long-term customer satisfaction? Why was the company founded? What are the brand values today, and are they reflected in the application? Are dark patterns really effective in this context? Such questions can prompt the client to rethink their intentions, and many are grateful for that.
What questions should designers ask themselves repeatedly to help make ethical decisions?
Henning: A good approach is to transfer a practice to another context and ask whether you would still accept it. How would I feel if I found hidden shoelaces in a pair of shoes that I had unknowingly paid for? It’s no different if I add hidden insurance costs for users when they book travel. If I don’t like it, I shouldn’t do it; the answer is that simple.
Alexander: Maybe even send a pair of shoelaces, unsolicited, once a year! People tend to let technology do things they themselves wouldn’t do. That’s easy, and cowardly. Instead of secretly influencing users’ behavior, as with nudging, we should rely on education. This may take longer, but it’s more sustainable and more satisfying for people, because it enables them to make informed decisions.
Photo by Mark Fletcher-Brown on Unsplash
What other aberrations do you currently observe?
Alexander: When it comes to social interactions, we should make ourselves independent of tools again. This is enormously important for social cohesion. Applications should be designed to promote direct communication rather than outsourcing it to technology and automating it; otherwise we lose not only trust in social structures but also our individual capacity for interpersonal exchange.
Henning: The human factor often plays a worryingly minor role in general. With services like Deliveroo and Uber, we just see the app and forget that there are people behind it doing precarious, low-paid work. The main thing, for us, is that the service is as convenient as possible, no matter the long-term consequences.
Instead of dealing with long-term consequences, the motto in recent years has been "move fast and break things". But some things, once broken, can’t be repaired. Are permanent beta and rapid prototyping a mistake?
Henning: I wouldn’t say that. This principle unleashes great momentum by encouraging people to try things out. However, you have to think carefully about what you really want to break. If it’s outdated hierarchies, for example, that is a worthwhile goal in itself. Breaking things for the sole sake of breaking something is not the point. In addition, you can put safeguards in place so that you can always roll back if needed.
Compared to the US, ethical issues in digital design are still hardly discussed in Germany.
Henning: In Germany, we often take the path of standardization: we try to formulate a set of norms that is as correct as possible, but finding and agreeing on it together takes time. The US has a different software culture, one that includes the open-source movement, intellectual dissent such as Sherry Turkle’s, and initiatives like the Center for Humane Technology. The latter was founded by former employees and investors of Google and Facebook in order to keep those companies in check.
Google has now established a Digital Wellbeing Initiative, and Facebook is supporting the "Time Well Spent" initiative that resulted in the Center for Humane Technology.
Alexander: Such announcements must be viewed with caution. Companies may jump on the bandwagon of these discourses for ethical “greenwashing”. Their business models don’t change; they accommodate users only just enough that the users don’t run away. However, I have the feeling that something is happening in the rank and file, among the people who work for the tech companies as well as the veterans of the internet. They increasingly demand that technology once again be used for good, and they protest when their employers have other things in mind. New companies are being founded. New approaches and UX patterns are emerging that make it easier to evaluate products and the impact of our digital work. I’m sure there will be more of this in the future.
In-depth reading, frameworks, theoretical and practical tips, as well as links to initiatives and organizations:
- The book "Mindful Design" by Scott Riley: How and Why to Make Design Decisions for the Good of Those Using Your Product.
- The book "Ruined by Design" by Mike Monteiro: How Designers destroyed the world, and what we can do to fix it.
- Manifest "Designing Mindfulness" by Agency Mindfulness Everywhere: Guidelines and application examples for more ethical design.
- Center for Humane Technology: Our mission is to reverse human downgrading by inspiring a new race to the top and realigning technology with humanity.
- Humane by Design: A resource that provides guidance for designing ethically humane digital products through patterns focused on user well-being.
- Design Ethically: Your go-to resource for designing ethically. This framework lays down the theoretical and philosophical foundation and also provides a toolkit for you to implement the ideas presented here.
- Ethical Alternatives: From browsers and search engines to collaboration tools, wikis and social media.
- The book "Future Ethics" by Cennydd Bowles: Technology was never neutral; its social, political, and moral impacts have become painfully clear.
- The book "White Hat UX" by Trine Falbe: The next generation in user experience.
- Simply Secure: We’re building a community of practitioners who put people at the center of privacy, security and transparency.
- Podcast "Should this exist?": It’s the question of our times: How is technology impacting our humanity? We invite creators of radical technologies to set aside their business plan and think through the human side.
Last but not least, Alexander Steinhart’s curated "Humane-Tech Reading List" is a good starting point for a deep dive into the topic.
Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.