Brief summary
Artificial intelligence has been presented as a technology with the potential to transform many different fields and professions. One of the most notable is design — but if we want to design in a way that is truly human-centric and inclusive, to what extent can artificial intelligence really help us do better work?
In this episode of the Technology Podcast, hosts Rebecca Parsons and Lilly Ryan speak to Thoughtworks design leaders Kate Linton and Esther Tham to get their perspective on how AI might be able to support designers. They discuss which AI tools could help the design process, how these tools could fit neatly into current practices and what the emergence of this technology could mean for design practices more broadly.
Episode transcript
Rebecca Parsons: Hello, everyone. My name is Rebecca Parsons. Welcome to the Thoughtworks Technology Podcast. I'm one of your recurring co-hosts, and I'm joined by one of our new recurring co-hosts, Lilly Ryan. Hello, Lilly.
Lilly Ryan: Hello. Welcome, everyone.
Rebecca: What we want to talk about today are design tools that take advantage of some of the innovations in artificial intelligence that we've seen. We are joined by two people from our product and design organization. Kate Linton, would you introduce yourself, please?
Kate Linton: Good morning, everyone, or whatever time of day it is for you right now. Thank you for having me here. I lead our design community of practice at Thoughtworks.
Rebecca: Esther.
Esther Tham: Hi, everyone. Good morning. I'm Esther. I lead the design community of practice for the Singapore office, and it's very nice to be here.
Rebecca: Excellent. What I'd like to start with is how do you all view design as part of the software development lifecycle? People think about developer tools and all of that, but how do you see design as a part of the software development lifecycle and interacting with other roles on a software delivery team?
Kate: Rebecca, I see it as a really integral part of software development. It's an essential core role. At Thoughtworks, we try to very much work in parallel with delivery when we're designing and testing software with our users. The process that we follow is a very tried and tested approach that starts usually with discovery and going out and understanding the people who are going to be using our product or service, and the market, the competitor space. Then we work really closely with developers to understand the technical feasibility of what we're building.
We test with prototypes, we test in the code, and we refine as we go. It's really very much a collaborative effort with our product delivery teams.
Esther: I tend not to box myself in by saying, oh, I'm a designer. I do interface design. I do user testing. Generally, I think everyone on the team at Thoughtworks is a problem solver. Even as a designer, it doesn't mean that I can't take my time to understand a bit of what a tech stack does and how it actually affects how we design certain user experiences for people. To me, I don't necessarily just think of myself as, "Oh, I'm just going to design screens." I want to understand, "Hey, where's the data coming from?" that I'm then putting on my screens, and stuff like that.
I try to find problems to solve, even if it's just not on a user experience side. Maybe there are some technical issues on the back that might have an insight that can bring other insights into the discussions as well.
Rebecca: Yes, I know. When I first started in technology back before fire was invented, we didn't care about things like user experience. You basically took what you got, and if you ever watched, say, airline employees dealing with the old airline reservation systems, the finger contortions that they had to go through to do certain activities were just horrendous. It hasn't always been a valued part of software delivery, but I think it's safe to say now that you cannot have a compelling product if you don't take into account the design aspects and the user experiences of the customers. I agree completely. It's a very integral part of how we deliver software now.
Kate: I've also been working in the industry for decades now. I remember those days when it was a debate whether we should start with discovery and talking to the people who were going to be using the product, and having to defend the need for user testing with real people. Today, really, customer centricity is a very orthodox approach to UI design, but it was hard won. There were years and years of debates at the exec level to create budgets to be able to do research and testing with customers. One of the risks I see is that we go backwards if we start to take shortcuts when we use GenAI to understand our users.
Rebecca: Let's talk a little bit about what are the kinds of tools you all use as you go through the design process. For those who don't know me very well, you would know, upfront, you would never ever want me to be designing your user interface. [chuckles] I'm definitely a back-end person. Help me understand, what are the kinds of tools? We're talking about AI tools for design, but before AI, just what kinds of tools do you use in your work? Then we can talk about how that's been impacted by AI.
Kate: Yes, good question. Tools-wise, it starts with pencil and paper. You don't really need a lot of tools to generate ideas. Designers typically use tools to take those ideas and turn them into prototypes that we can test with. That's where we've used products like Adobe XD for prototyping for years, and now Figma is really the tool of choice for a lot of designers when it comes to prototyping. We use tools like Dovetail for collecting customer research. Esther, what tools do you use?
Esther: I think pretty much the same: starting out with pencil and paper, doing sketches. Sometimes I deep dive quite quickly into Figma, just going in and fiddling around, trying out certain wireframing and stuff like that, just to get quick prototypes across. Nothing high fidelity or fancy, just gray boxes and a few lines to denote images and copy, to get the discussion going with my product owners and see, "Hey, is this something that would fit in the vision of where you want to take your product?" and then deep dive right into Figma itself. Figma itself is really quite powerful, but I'm actually excited for the AI suite that hopefully will come out soon.
Lilly: You attended the Figma conference in Singapore yesterday, didn't you, Esther?
Esther: Yes. I attended online, yes.
Lilly: Figma has been a really interesting one in terms of some of the tooling that they have brought into that space when it comes to how they are incorporating GenAI. Maybe that leads to the question of what is going on in this space at the moment with pencil and paper. How are we making that smarter? Are we making that smarter?
Esther: It depends on the companies that you are looking at. I can see AI being taken in a couple of directions when you consider which companies are doing it. The mainstays of what designers actually use today, tools like Dovetail and Figma, have a stance on AI that is about helping to augment our workflows. Dovetail and Figma are introducing AI tools to actually help us with our work rather than trying to see, "Oh, how can we actually design like a designer?"
There are other tools out there that democratize design in a way where you don't need to be a designer; you can spin up a working prototype, let's say, with minimal knowledge about what it means to actually design something. The AI tools that Figma is trying to introduce are something that would actually benefit designers a lot. I'm very much looking forward to auto-naming my layers because my Figma file is just a mess of gibberish layers, so I'm looking forward to that tool.
It's something that helps us not saying that, "Oh, I'm just going to click a button and here's a prototype that my product owner can just take and put in production." No, it's about helping us to search for assets, just doing the things that I don't necessarily take time out to do with my Figma files to make it cleaner so other designers can understand my work as well.
Lilly: You spoke a little bit about democratizing design and also one of the things you mentioned there was renaming layers as well. It sounds like that type of feature is something that we know GenAI can do really well, but that itself is not core to a design process. That's sort of a product help or a product feature. It augments your existing workflow.
Do you see this kind of tooling as something that can help people who are not in the design space to design? Do you see it as something that augments the existing professional skills of designers or is there something about it that goes both ways?
Esther: For me, it would definitely augment me as a designer because, let's say, I'm trying to find a particular layer and I know what's on the layer. Let's say it's a button, but it's named X, Y, Z something, something, and I cannot find it easily. I have to dig down into a certain nest of layers and stuff, especially when you turn on a lot of configurations in Figma as well and you hide things in frames and frames and frames, so it would help me.
I'm actually not going to say I know for sure how it would help people who are just starting out using Figma. Maybe it would because it would help them to understand how to, I guess, be better about keeping their files cleaner and not messy, so it helps them to maybe be a bit more conscious about how they're designing things on the surface. I'm not sure. Kate, what are your thoughts?
Kate: I think it's really helpful in collaboration because Figma is a collaboration tool as much as it is a prototyping tool. It enables people to work in parallel and to provide feedback. Ensuring that things like layers are clearly marked is actually really helpful for people who are collaborating on the one file. That's creating clarity and efficiency for everyone who's involved in the design process.
I'd also say that when it comes to GenAI creating efficiencies and democratizing design, what we're seeing at the moment and over the last year is just lots and lots of new startup products and tools coming out that are promising to do just that. There are certainly too many to keep abreast of them all. We've been assessing some of these tools and experimenting, and also comparing them with the default tools that we use as part of our practice, to compare the output and really understand where they can create efficiencies and still deliver a good outcome.
It is a really potentially confusing space for early career designers starting out to know what are the right tools that I should be using. Because the well-known tools that we potentially learn when we're studying at design school, they're still, in my mind, the tools that we should be using, but for a lot of non-designers, there's so many other cheaper options now that are promising to do exactly the same thing and to do it more quickly.
It's a really confusing design market out there at the moment when it comes to tools. It's changing from week to week. It's impossible, really, to keep up with the innovation in this space.
Esther: I wonder, though, if it's not so much different to when web design first started and you had spin up companies that could help you to just create, I guess, websites without really the help of a designer or knowing how to code, but web designers still exist as a job that is out there. I probably don't think it's as much as like, "Hey, here's this AI tool. It will help you become a designer." Maybe it's an AI tool that can help you to visualize your ideas and then maybe take it to someone who can actually then help to refine your product better.
It could help in, let's say, discoveries or maybe you're pitching a product to someone, but it's not fully realized yet, but you just need the extra help to visualize something. Maybe it helps in terms of storyboarding, for example, we just want to get an idea across to sell it, for instance.
Rebecca: I was going to ask about that too because I know in other parts of the software development lifecycle, generative AI is used to generate ideas. Give me some alternatives to this. I can see that being quite useful in that discovery process or where you're trying to say, "Okay, what are the different ways I can morph this base idea and maybe come up with something better?" Are you seeing any tools like that or is that just something that you might do with one of the large language models, one of the chat interfaces?
Kate: Yes, that's a good question because ChatGPT can actually do all of what you just described in terms of facilitating ideation, coming up with multiple ideas and scenarios and pros and cons for different scenarios in different industries. We're already doing a lot of experiments in that space, but Thoughtworks has also created GenAI tools like Boba and, more recently, Haven. Boba was created with that goal of helping teams with ideation and strategic options for new products and services and evolving existing products. We've been playing around with that tool and sharing it with our clients in facilitated workshops. I think of it as having a really great sparring partner when it comes to coming up with new ideas.
We all know that the creative journey is littered with bad ideas; most ideas are bad ideas. A lot of the ideas created by GenAI are no different. They're not going to work, but it's great to at least kickstart that process and get past the blank canvas. It does create a lot of efficiency. We've certainly been using those tools, and there's still a lot of hard work in terms of the discovery process, but the tools certainly help in kickstarting the process without reducing the need to then go and validate those ideas.
The hard work of doing customer research still needs to be done with real people, so it doesn't really eliminate that work. It just creates some efficiencies when it comes to exploring new ideas, looking at the pros and cons, and even thinking about what competitors are doing; those tools could be useful for that as well.
Lilly: You touched on using this as a sparring partner. I have found that to be one of the best ways to engage with this type of tooling as a kind of interlocutor, to have a back-and-forth and use it to evolve your own ideas as a bit of a sounding board. In most cases, that can be beneficial because you, as a professional with your knowledge, can evaluate what comes back to you and shape the output from there, especially with that chatbot paradigm. I think it's been a pretty helpful thing regardless of your discipline.
In terms of what you're talking about regarding the work of interviewing users, there have also been a lot of offerings out there that are aiming to automate some part of this, or say, "Well, couldn't we just generate users and get input from them in that sense, because we know the breadth of things that users might or might not want?" We could synthesize that. What are your thoughts on how well that fits within your practice, and whether that is a complement to what you're doing or whether it actually distracts from the value of the work that you're bringing to the table?
Kate: Synthetic users and products that simulate users and create personas is probably one of the most divisive areas of GenAI tools in the world of research. You'll get a lot of mixed opinions for and against the use of these tools.
We've run a few experiments internally. I think those tools can be great in formulating what we call a proto persona. It's like a straw man persona that requires further validation. Typically, and historically, we would create proto-personas using subject matter experts or proxies for the user who have a lot of knowledge and experience with target users of a product or service.
It doesn't eliminate the need to then go out and meet those people who are using your product, test those hypotheses with them, and do that further work. I don't see it yet as being a substitute for talking to real people, but it can certainly be a great kickstart into the process. Some of these tools also allow you to upload your own customer data, which is also a great starting point. Synthetic Users has that ability. Apart from the four large language models that it's built on, it also has RAG. That means that organizations can upload their own customer data and ensure that there's accuracy in the data, that it is representative of their customer segments.
I think there's certainly a role for it, but my fear is that some product companies will see this as a substitute and a shortcut, and it really isn't. At this stage, our customers are the most valuable part of our products and you need to be having a continual dialogue with them, testing out new ideas, and continually improving based upon that real-time insight. Certainly, I'd say that it's potentially going to be a part of our research, but it won't replace it.
Esther: I think it's probably a bit dangerous to say I can completely replace my real-people research, because we've got to understand the product that we're designing. The products that I work on in the Singapore office are sometimes quite internal systems, and it's not something that I'm going to openly talk about in public. That's what these kinds of synthetic users do: they crawl a lot of information that's available publicly on the Internet to improve or train their datasets.
If it's a product that is not going to be talked about on the website a lot or it's not going to be reviewed publicly a lot, where are you going to get the insights from? It's going to be still locked within all your users and their knowledge of certain tools and stuff and products that they use.
I'm pretty sure that for very niche user bases, you're probably not going to find these kinds of synthetic simulations very useful, unless we go down the route of actually building our own synthetic user for our own use. That's going to take time, and we're going to have to do the research anyway to build out this proto-persona.
I think it depends on fine-tuning the way we use these kinds of personas, because if it's something like, "Hey, I want to understand customer feedback around the world for a global product that I'm working on," sure, I can use it to crawl insights from all over the world on the internet and bring them to me, instead of having to do desk research for hours on end to find that same information.
If it's information that's not going to be available publicly and I have to actually go talk to a person to understand how they actually use a very particular product, then that's a very different story we're talking about.
Kate: One of the core concerns that we have at Thoughtworks around designing accessible products is ensuring that we take a universal design approach to accommodate all users of the product and incorporate the diversity that exists within any segment. That requires designing for the edge cases, so designing for those niche users that may have a disability or even the 50% of your segment that probably wear glasses or have other requirements that won't be sufficiently represented in a large language model, which is generalizing and not always delivering some of those niche requirements that are represented by those edge cases.
Esther: I also want to add that the hallmark of a truly great user researcher is the ability to tease out certain insights that are hidden nuggets. We're having a conversation, and then suddenly the person you're interviewing says something, maybe in passing, but if you're a good researcher, you can identify that, hey, actually that mention can lead to certain insights that haven't been discovered yet. We're able to observe and understand, "Hey, maybe there are some other insights that could be unlocked just because the user mentioned something that they may not think is important," but it could reveal something that is totally relevant to where we are trying to take a product or a piece of research.
I'm not sure if LLMs can actually do that. They are very linear in their discussions. "Hey, here's a discussion guide. This is a list of answers I want answered." They're just going to go one, two, three, four, five when talking to a user or simulating how a user will answer questions. But human user researchers will actually say, "Hey, wait a minute. You said something here around question two. Can we deep dive into that and go further?" It might be a tangent that leads nowhere, but it could also sometimes lead to something amazing.
Rebecca: As designers and given what you know about the powers of AI, I will use that phrase even though it drives me nuts, what's your dream tool? What is a place where you would love to see AI used to support your work as a designer?
Kate: I guess my dream is just having a tool that gets rid of all of the repetitive annoying drudgery involved in customer research and testing and analyzing results and finding the patterns in the data. That seems to be what GenAI is perfect for. I don't want it to replace the fun stuff, though. The really exciting work of ideation and the flow state that I get into when I'm designing. I could easily lose an afternoon designing something that I really enjoy working on. That's the reason why I'm a designer. I don't want a tool to take care of the really enjoyable part of my job.
Esther: Yes, I agree with Kate there. Sometimes, just going analog can yield some interesting results as well. We were running a workshop with some of our clients a while back, and one of the activities we had was ideation, but we tried to get them to contrast the results that you would get with pen and paper, like Crazy 8s, versus, let's say, using Google AI to come up with ideas. Sometimes we come up with similar ideas to the AI, but sometimes we have left-field ideas that are super wild and creative. I don't know what it would get us, but it's something that only the human imagination can dream up, because it's something that maybe never existed in the thought bubble of very general, average internet data.
Sometimes interesting things can come up if we just sit down and let our minds wander. To Kate's point, there's a lot of repetitive work that just takes time out of a day. Let's say I have to go clean up my design system in Figma, for example. I might have to hunt down where all my different components might be used. Right now, today in Figma, I can't just go to the design system and say, "Hey, there's this component. I've made some changes. I want to make sure that the changes get propagated everywhere," because sometimes there's a glitch. It might not happen completely, or if I have changed or overridden certain settings for a particular component, it might not get the change propagated.
With AI, I think Figma is promising that we will actually be able to search for where all these components might be used. That would save me so much time when I'm trying to make sure that my production file is kept clean and up to date, and not missing out on some very special component that I might have made special changes to.
Rebecca: We talked earlier about junior designers and the potential impact of these tools, but if you were talking to someone just out of design school or who is interested in getting involved in design and wanted to explore the use of AI, where would you have them start?
Kate: Probably advise them to start learning about prompt engineering and start with ChatGPT and really understand the importance of being able to define the outcome that you want from that tool. There's a bit of work there and I'm still learning how to write good prompts, but it's a really good starting point for designers to really begin with understanding what's the outcome that you want?
Who's the audience? What's the goal here? What are the constraints that we're operating in? What's the context? Now take all of that and put it into a prompt, and start there, and then refine and refine until you get the outcome that you want. I would start with prompt engineering. I know that a lot of designers disagree with me because a lot of the tools out there have almost eliminated the need to be good at prompt engineering.
They provide interfaces that kind of help you to think about that, but I think it's really useful to understand the way these AI powered tools work. If you can improve your prompt engineering, your prompting skills, then that will go a long way to understanding how to get a good outcome out of GenAI tools. ChatGPT is a handy way to start. It's a tool that a lot of us have in our pocket or on our laptop, so I would start there.
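The checklist Kate describes (desired outcome, audience, goal, constraints, context) can be sketched as a simple template. This is purely an illustration of the idea, not something from the episode or any particular tool; the function name and example values are our own.

```python
def build_prompt(outcome, audience, goal, constraints, context, task):
    """Assemble a structured prompt from the elements Kate lists:
    desired outcome, audience, goal, constraints and context,
    followed by the task itself."""
    return "\n".join([
        f"Desired outcome: {outcome}",
        f"Audience: {audience}",
        f"Goal: {goal}",
        f"Constraints: {constraints}",
        f"Context: {context}",
        f"Task: {task}",
    ])

# Hypothetical example for a design-ideation prompt:
prompt = build_prompt(
    outcome="five distinct onboarding-flow concepts, each in two sentences",
    audience="first-time users of a personal-finance app",
    goal="reduce drop-off during account setup",
    constraints="mobile-only, accessible, no dark patterns",
    context="competitors rely on long multi-step forms",
    task="propose alternatives we can sketch and test with real users",
)
print(prompt)
```

The point of the structure is the refinement loop Kate mentions: each element can be tightened independently between iterations until the output is usable.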
Esther: I want to challenge Kate a little bit, though. You talked about understanding how to prompt better so you can achieve the outcomes that you want, but whether or not someone understands what a good outcome is, I'm not sure AI can necessarily help us with that. Knowing what good is takes years of training and experience, to understand what's good UI design and what's bad UX. It's trial and error.
We learn as we go along, and that's why we do user research and user testing. We bring things in front of people to understand that, "Hey, I designed this particular user flow, but people are having issues or difficulties understanding how to use it or complete it." That's where I say, "Oh, okay. Then I need to make some changes to this."
If you just get AI to fast track and just speed up and say, "Hey, I just want this app to do this, this, this," it spits out a prototype and you say, "Okay, that's done," but how do you understand where all the issues lie? Do you understand what decisions the AI actually made for you throughout this, let's say, prototype?
I'm not sure if there is a magic bullet to just get to the point where, "Oh, okay, this AI tools, I magically know what a good outcome is, or I trust AI so much that I just assume that it's giving me the best outcome that I can achieve with my prompt." I'm not sure if there is a quick way to help people understand good outcomes.
I think there's still value in training up in true, proper design principles and practices. Don't discard the grunt work. Maybe you use AI to do some prototyping, but maybe you also do it manually on your own to learn and understand: why did I place certain elements in this order? Is it important that something comes first or something comes on the next page? How does everything fit together in your design? I think that's important as well.
Kate: Oh, yes. Prompt engineering is never going to replace design school. [laughter] It's interesting, actually, because it's a really good point, and some of the most beautiful outcomes that I've seen from GenAI have come from super experienced designers who have been working in the industry for years and can really recognize good design when they see it. They're able to use Midjourney to create extraordinary images, but it's only because they've been creating images in their own practice for years, and they're using their own style and their own skills to allow a GenAI tool to augment what they do.
Without that ability to recognize good design, there's a huge risk that non-designers using these tools are just going to produce a lot of very ordinary stuff. We've all seen that: you open Instagram and have a look at a lot of the GenAI content that's created there, and it's very ordinary. You can tell that it's boring.
Lilly: The word augment that you use there, I think, is really key to what we've been talking about this entire time, that it can elevate a level of skill that you may already possess. At the same time, also, you do need to understand where your core principles are in order to know how to guide and to shape whatever you're getting to an ideal result.
I wondered also with regard to development teams who don't have a designer on staff, design is not always the first role to get filled. With access to tooling like this, and potentially also with a small budget, particularly if you're just starting, it may be that people on development teams are asked to perform the role of a designer as they frequently are without the requisite training and skills. What kind of advice would you give to development teams without a designer on the staff who have access to the kinds of tooling that we've been discussing into how to use it effectively?
Kate: I would always start with the design process and design thinking, and trust the process, which doesn't change just because there are GenAI tools to help. The process really begins with understanding your users, understanding the market, and synthesizing the results of the interviews that you do. You don't have to be a designer to go out and do discovery and test your hypotheses, and you don't have to be a designer to validate ideas and improve them as you go using that design thinking approach. GenAI doesn't change that; it just creates efficiencies and augments. We see opportunities through the whole software development lifecycle to include GenAI, but I would trust the process of design thinking, even if you're not a designer.
Esther: It's not a very special skill set where you have to be a designer to do it. Design thinking is about understanding the problems to be solved and what might be a good solution to solve them. Maybe designers have this instinct to get those insights quicker, or just know how to identify where the opportunity areas are and hone in on what's good to solve and what's maybe not worth the value or effort to solve.
I guess we also augment the teams in that way of the way that we think and just because we are just more attuned to that design process because that's what we do day in and day out. We are practiced, and it's not that they can't pick it up, it just might be something that might just be a bit more difficult for them to do at the very early stages, but through time they would also learn and understand that it's not something that is very unique, like anybody can practice design.
Rebecca: I would say in summary, you and other designers are not in any danger of being put out of business anytime soon, given where the AI tools sit today and who knows what's going to happen in the future. I do think, in particular, a lot of what you've been talking about with the interaction with the customers and the user research, those are the kinds of things we humans are better at than machines, and we can be empathetic, and we can make leaps that perhaps are more difficult for an AI system to make, at least at the moment. It sounds like you guys don't have to worry about being put out of a job by automation anytime soon.
Kate: I'm not worried, but what I would say, and what we hear, it's true for a lot of different roles, is that we could potentially get displaced by designers who are effectively using GenAI tools.
Rebecca: Yes, I think that is quite true across a large number of roles. Well, thank you, Esther, thank you, Kate, for joining me and Lilly in this discussion of AI tools in the design space. I hope you all have a wonderful day.
Kate: Thank you.
Esther: Thank you for having us.