Hello and welcome to the Future of UX. My name is Patricia Reiners, and today in this podcast episode we are welcoming back a very special guest, Thorsten Jonas. If you've been following the podcast, you might remember our last conversation, which definitely sparked some amazing discussions around sustainable UX.
In fact, that episode was one of our most successful ones and I received tons of great feedback from listeners like you, so thank you so much for that. So I am thrilled to have Thorsten back for another deep dive. This time we are exploring the intersection of sustainability, AI and UX.
So how do we bridge the gap between cutting-edge technology on the one side and then responsible and ethical design on the other side? How can we ensure that AI-driven innovation doesn't come at the cost of our environment, humanity or society?
So before we dive in, Thorsten will give a quick intro for those who are new to his work. And if you haven't listened to our first episode together, I will link it in the show notes. Definitely worth a listen. So I would say let's get started.
Hi, Thorsten. I'm so happy that you're back. This is your second time on the Future of UX podcast. So welcome again. Hey, thank you. Thank you so much for having me. It's a pleasure. It's a big honor. And yes, super happy to be back here on your wonderful podcast.
Thank you for the nice words. I really appreciate that. And I need to say that the last podcast episode was actually really, really successful. So we had a lot of listeners who really loved this podcast episode. I got amazing feedback on it. I'm also going to link it in the show notes for everyone who hasn't listened to it yet.
But before we dive into the topic of sustainability and AI and UX, and how we can basically bridge everything together, I would love you to do a super quick intro so listeners know who you are and what you're doing. Yeah, sure. So hello, everyone. I'm Thorsten, Thorsten Jonas. I'm the founder of the SUX Network, the Sustainable UX Network, and as part of that, also of the SUX Academy. And well, I have
dedicated my work life to digital sustainability, to sustainable UX and sustainable design. And as part of this, I'm working a lot with sustainability and AI, with or on responsible AI and what that means. I do courses and masterclasses on this, I do a lot of workshops and talks at conferences, but also at company events, and
I do a little consulting on the topic as well, always depending on how much time is left in my schedule. Yeah. And I mentioned the SUX Network: you're all invited. If you are not part of it, you don't have to pay or buy anything, you can just join, it's for free. It's a huge global community, we have a huge Slack space, we have lots of resources about sustainable design and sustainable UX that we share for free. We also have a podcast,
where we interview people, speaking not only about sustainable UX, but about digital sustainability in general. We also speak about AI, and there are, I would say, really wonderful people joining me to talk about these things. So yeah, feel yourself invited to the SUX Network and hopefully join the movement for a more sustainable design and tech world in the future.
I love that, amazing resources. So make sure to check everything out. You mentioned a more sustainable tech world, and before we dive into AI, I would like to understand: what does a sustainable tech world or sustainable UX mean for you? Maybe I start with sustainable UX. One of the core principles of UX is that we try to build
great experiences for our users. And to do so, we put this user into the center of our work, into the center of our thinking. We try to understand the needs and the desires of the users to fulfill them with our designs. And the problem is every digital product or experience that we build is never...
standing on its own, alone in this world. It's part of a bigger world, part of a bigger ecosystem, and it's influencing this ecosystem. And that means if I build a great user experience for a user, that might cause harm somewhere else. So let's say I make the Amazon experience super nice, right? It's super convenient for me to order something at Amazon and to send it back, etc.
But someone else is paying the price: the delivery drivers who have really tight schedules and do not even have time to take a lunch break, the people working at the fulfillment centers, etc., the small shops that die because they cannot match the prices of these big companies. So very often, a great user experience
needs to be paid for by someone else, somewhere else. And that could be another person, but it could also be the environment. Because, and with that I want to widen this up a little bit to the sustainable tech world,
everything that we build relies on digital tech, right? On servers that serve the data, that serve the websites, and we have our devices that show the apps or the websites. And well, they need a lot of power, and this power needs to be produced and causes a lot of carbon. And that is a huge problem, because the whole internet
overall produces more carbon than the whole airline industry, for example. And, well, not enough people talk about it. And that's again a price that's also paid, right, for the next great user experience, the next great digital product that we build. And I'm not saying we should switch off all digital stuff, that's not the point, but
The thing is, or the question that we need to ask ourselves is how can we balance the things that we build with the surrounding ecosystem? So instead of seeing only the user or trying to bring the user needs together with the business needs, because that's something we as UX designers very often try to handle, right? There are business needs and we try to bring it together with the user needs. We need to see
our user and the product that we work on as part of a bigger system, and think about how we can balance it, how we can cause less harm with what we are building, and make that a default not only in our design but in our product-building processes. That's something that not only me, but many other really great people all over the world are working on and advocating for:
that we change our mindset in that direction and, yeah, create more sustainable digital products and experiences by default. Because someone has to start doing that, and unfortunately, the big tech world seems not to be willing to. We need to push them, and that's again the connection to big tech, actually.
It's still a journey we have to take, but there is a lot of hope. And that's the good message here. Yeah, I totally agree. It's super important to focus on that. And this is the future: thinking less about the business and the single user, and more about humanity, sustainability and our environment.
But the big problem is that sustainability is not a KPI. It's not something that brings money. It's not something that attracts users. So I understand why many big tech companies don't put a lot of effort into it. I'm wondering, from your perspective, what is needed so that there is this demand for sustainability in digital products? I think it's several things.
The first thing, and that's also where I start in the courses that I do, the first thing is always creating transparency, creating visibility of the negative impacts. Because there are a lot of people who do not know what the negative impact of the digital world really is, what it actually means: how much power is needed, how much water is needed for cooling the data centers, water that is then lost in local areas, for example,
how bad the mining of the rare raw materials is that are needed for the chips and the servers and the phones. People just do not know, and it's not because they are bad people, it's because they have many other things on their minds. It's totally human that they do not know. And creating this transparency is the first step, actually, and it already changes something; that's at least what I have experienced over the past years. It's so important to talk about these things, and not to point the finger at someone, but to say, hey,
that's the problem, and now let's talk about how we can maybe start making things better. So creating transparency, and also creating transparency about the business aspect of sustainability, because it's much more of a business aspect than the common narrative tells us, I would say. If we look, for example, at the EU, if we look at the regulation,
we look at the new Green Deal of the EU: digital sustainability, or let's say digital emissions, become a factor there. Companies will need to start to report and to know what their digital emissions actually are. So that is, well, not a business factor in terms of 'I make money', but a business factor in terms of 'I have to fulfill regulations or otherwise I have a problem with my business'.
But there is also research — I quite often quote the research by Capgemini from, I think, 2022. In their research, they were working on sustainable product design, and their finding is that it's not only good for saving the world, actually, but that it is also good for business, in terms of potential for more revenue.
It's good for a better relationship with your customers, because customers want greener products. I mean, if we go to the supermarket, we see, oh, everything is green, and a lot of that is greenwashing. But people are not dumb. In two years, it won't be enough anymore to paint everything green, right?
So there is already a demand, and there is already a business factor. A good example is Patagonia, the company that makes outdoor clothing, right? They are super sustainable, they even have the planet as a stakeholder in their vision, and they are a super successful company. And they put these things into the center of their doing. So that's the second thing: the
changing narrative, from 'sustainability is only a cost factor' to 'it's also an opportunity', already today, and if not today, then in two years for sure. So that's the second thing. And speaking about big tech and responsibility, I think one problem with all these things is that we never see the real price of them, right? If I use
ChatGPT, the Pro version, and it costs $20 per month, that's not the real price. That's not the environmental price that needs to be paid somewhere else, and it's not the price paid by the mining people who mine the materials for the servers that are needed, etc. So that's also something I think we should work towards: making the real price of all these things transparent and visible.
And in the end, also make these big tech companies pay the real price for the impact, because they don't, right? So making it a fairer thing in a global and societal context. That's something I hope we can achieve. I don't know, but it's something we should work towards. And it's important, because this real price is not paid by the people who make the money out of these things.
And it's not even visible, right? Most people do not know what price needs to be paid around the world so that they can use these things. Of course not. And that's something we need to change. I mean, I totally agree. And I'm wondering, like, what is needed for that? I would assume regulations from the government, something, you know, that's above all of that. I'm
wondering why it's not there yet, but I think it's great that you're really educating in that direction. Also, the more people are aware, the bigger the demand gets. And at some point, there might come some regulations, because the environment is super important. This is where we live, this is where future generations will live. So it's super, super important. But do you maybe have some quick tips for designers who say, okay, I'm designing a new product now, we just started, I got the brief,
I want to make it more sustainable, how do I get started? My first question always is: what's the carbon impact of your website? A website is the easiest example, I guess, right? And that's actually something that, if you have a public website, you can measure quite easily. There are tools out there, websites where you just fill in your URL
and get an idea about your actual carbon impact. So that's the easiest thing: find out what your carbon impact is. And then you can start thinking about, okay, where does it come from? How heavy is my design? The easiest example: if we used more efficient image compression formats
that are not worse than JPEG or PNG in terms of quality, but much more efficient in terms of compression, that would already make a huge difference. So for everybody who has their own small website, you can try that: if you just change the image format from JPEG to WebP, that already makes a huge impact. It's super easy to change and it lowers the data. So this is a super easy one.
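If you want to try that on your own site, here is a minimal sketch of the JPEG-to-WebP swap using the Pillow library in Python; the folder name and quality setting are illustrative assumptions, not part of Thorsten's example:

```python
# Minimal sketch: convert JPEG images in a folder to WebP with Pillow.
# The folder name and quality value are illustrative assumptions.
from pathlib import Path
from PIL import Image

def convert_to_webp(folder: str, quality: int = 80) -> None:
    for src in Path(folder).glob("*.jpg"):
        dst = src.with_suffix(".webp")
        Image.open(src).save(dst, "WEBP", quality=quality)
        saved_kb = (src.stat().st_size - dst.stat().st_size) / 1024
        print(f"{src.name}: saved roughly {saved_kb:.0f} KB")

convert_to_webp("assets/images")
```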
Another one is, I always tell people, hey, look for three to five quick wins that you could easily change. Look at, I mean, we're using so much video content on our sites, right? And videos are data-heavy, for sure. And very often, are they really needed, or is it just a background thing? On a project I worked on, we did exactly this and looked for quick wins,
and we found a very, very important page in their web environment, with many users, and it had a background video. The background video made the page weight of this page, I think, 20 megabytes; the video alone was 15 megabytes. If you calculate the carbon impact, that's about four grams of carbon per page view,
just because of this video that is not transporting any information. So it's super easy to fix that, and that's always a good starting point: looking for these quick wins. Because that's also a good story that you can tell other stakeholders in the company, right? If you say, hey, we just need to get rid of this video and then we can save hundreds of kilograms of CO2 per year, that's a good story that helps convince stakeholders.
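To make the arithmetic behind that kind of estimate tangible, here is a hedged back-of-the-envelope sketch; the energy-per-gigabyte and grid-intensity constants are assumptions roughly in the spirit of the Sustainable Web Design model, not figures from the project Thorsten describes, and they land in the same order of magnitude as the roughly four grams he mentions:

```python
# Back-of-the-envelope estimate of per-view carbon for a heavy page.
# The constants are illustrative assumptions, not measurements from the
# project discussed above.
KWH_PER_GB = 0.81          # assumed energy used per gigabyte transferred
GRID_G_CO2_PER_KWH = 442   # assumed average grid carbon intensity

def grams_co2_per_view(page_megabytes: float) -> float:
    gigabytes = page_megabytes / 1024
    return gigabytes * KWH_PER_GB * GRID_G_CO2_PER_KWH

print(grams_co2_per_view(20))  # whole 20 MB page: ~7 g CO2 per view
print(grams_co2_per_view(15))  # the 15 MB video alone: ~5 g CO2 per view
```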
And then there are more things. There is a lot about data, actually, all the data that we store. There are people saying up to 90% of all data that is stored on the internet is not used at all. So it's useless data, but it's stored on actively running servers that need energy, that produce carbon, that need water for cooling.
So think about that: do I really need this? Or think about a lifetime for content, for example, because most content is outdated after three months. And we store everything forever, that's our attitude. Even on our iPhones. Storage is cheap. Yeah, storage is cheap, let's store everything.
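As a small illustration of that content-lifetime idea, here is a sketch that flags pages untouched for longer than a review window; the folder, the three-month cutoff, and the file-modification heuristic are all assumptions for the example:

```python
# Sketch: flag content that has not been touched within a review window,
# so someone can decide whether to archive or delete it.
# Folder, cutoff, and the modification-time heuristic are assumptions.
from datetime import datetime, timedelta
from pathlib import Path

REVIEW_WINDOW = timedelta(days=90)  # "most content is outdated after three months"

def stale_pages(content_dir: str):
    cutoff = datetime.now() - REVIEW_WINDOW
    for page in Path(content_dir).rglob("*.html"):
        modified = datetime.fromtimestamp(page.stat().st_mtime)
        if modified < cutoff:
            yield page, modified

for page, modified in stale_pages("site/content"):
    print(f"review or archive: {page} (last modified {modified:%Y-%m-%d})")
```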
That's also a thing. But also: how can we help the user act more sustainably? If the user has a choice on my website or in my app, what would be the most sustainable choice, and why not make that the default option? We are not taking anything away, we are just nudging the user slightly and saying, hey, the default option is, I don't know, I'm ordering something and the default option is delivery to the nearest hub,
which is just around the corner for me, for example, and not to my home door. That's also an interesting one... And there are many more tactics. What I especially say very often to designers, to UX designers: I think the user journey, something that we use so often in our work, is a pretty good tool to identify
drivers of carbon impact, but also of other negative impacts, and to work on ideas, because you can easily add additional layers to the journey and say, okay, what's the
impact of this step? What's the carbon impact of this step? Where does it come from? Oh, it's maybe because I have many images on this page. From my personal experience, that's a pretty helpful tool that you can use for your own work as a designer, but also easily with other stakeholders, right? Because it's so simple that you can make a great workshop out of it with other stakeholders.
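One hedged way to picture that extra journey layer in data form; the journey steps, page weights, and constants below are made-up example values, not numbers from the conversation:

```python
# Sketch: add a carbon layer to a user journey by annotating each step
# with its page weight and an estimated per-view carbon figure.
# Steps, weights, and constants are illustrative assumptions.
KWH_PER_GB = 0.81          # assumed energy per gigabyte transferred
GRID_G_CO2_PER_KWH = 442   # assumed average grid carbon intensity

journey = [
    {"step": "landing page",   "page_mb": 4.2},
    {"step": "product list",   "page_mb": 6.8},
    {"step": "product detail", "page_mb": 9.5},  # autoplaying hero video
    {"step": "checkout",       "page_mb": 2.1},
]

for item in journey:
    grams = (item["page_mb"] / 1024) * KWH_PER_GB * GRID_G_CO2_PER_KWH
    print(f"{item['step']:<15} {item['page_mb']:>4.1f} MB  ~{grams:.1f} g CO2 per view")
```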
So before I talk forever now, those are a few things that you could do. Awesome, thanks for sharing those tips. I think those are great starting points for designers who really would like to make a first difference in a project and get the conversation started within their company or design team. I'm working on a lot of futuristic projects: metaverse, spatial design, AI, and
There, things get crazy. You have 3D content, you have videos everywhere, you have AI integrations, APIs, large language models. This is next level compared to websites, where I totally get that you don't need to have a video all the time. But what do you do with these futuristic experiences like the metaverse or AI integrations? How do we make those things sustainable?
Let me start with a maybe slightly philosophical point of view. Why is it futuristic? We call it futuristic, but is having huge metaverse things, having AI on everything, the only future that we see, or are there other futures? Because I just recently had this discussion, actually: what are the future narratives? What is the vision for the future? And these are
super tech-driven, I think, because the only future visions that we have are always tech-driven. It's more tech, actually, that is our future. But is it better for us as humans? Maybe not, right? If we all, in the future, and there are great stories telling this in science fiction, right, if we all hang out in the metaverse
and never go outside anymore, is that good for us or is it not? But that's a very philosophical point of view. I'm not saying we should get rid of all these things, because I also see the power of, for example, during COVID, having a meetup not via Zoom, but in VR, in virtual reality.
So much better, even though we had very basic avatars with no facial expressions, etc. Still, it was better than having a Zoom meeting. So I see the power of technology. And that's maybe the difficult thing here: there is no easy answer in terms of use it or don't use it. Rather, we always need to ask ourselves what makes sense, what is really good and what is not.
And I think we are not asking these questions enough nowadays, especially when I see the rise of Gen AI at the moment, right? Every new feature is super hyped and super nice, and when I look at my LinkedIn feed, every day I see the next great feature that is there. And then I feel like, yeah, but do I really need it? Is there a real use case?
And again, what is the price, or who is paying the price, right? Let's say in Gen AI, I mean, it feels like magic, right? Just with a prompt, I can create a beautiful image; let's stick with that simple example. And that is great, I can understand that it feels like magic. In my talks, I very often say, hey, what's the icon for Gen AI? And it's very often these magical sparkles, right? Everywhere.
And that's also a narrative, because it's telling us, hey, it's magic, and it's making me a magician, a sorcerer or a witch, right? I can do magic with this tool. But is it really magic? I don't know. Especially as creative people, we should ask ourselves: what is creativity? How can I be creative with AI, and what is not creative?
Whose work had to be part of the data set that was used for training the AI, so that the AI is capable of producing this image? And what did those people get for that, right? That's a really complex and difficult question here, right? What prices are paid somewhere else? And is it fair or is it not fair?
And those are the questions we need to discuss, actually, to make sure that we use AI in a way that is fair for everybody and that produces a lot less harm than it does now. I mean, the ecological impact of AI at the moment is just huge. The data centers need so much energy
that we do not even know where to take the energy from. The idea of big tech at the moment is: we need nuclear power to run our data centers. Microsoft wants to reactivate Three Mile Island, the nuclear power plant that saw the biggest nuclear accident in US history, to get the power for their data centers.
And it's producing so much carbon at the moment. Google, for example, said in their environmental report last year that AI has the potential to mitigate the rise of greenhouse gas emissions by, I don't know, 10% by 2030. But in 2023 alone, they saw a rise of, I think, up to 13% in their own greenhouse gas emissions,
in one year, mainly because of the data centers they are building. And I know that we will have efficiency gains. I mean, now there is DeepSeek, right? Well, it comes with other problems, but it is much more efficient. But still, at the moment the energy hunger is so high, and the amount of water needed to cool all these data centers is so high. So this is a pretty high environmental price for
using AI to, I don't know, create the invitation for my birthday party, to be a little bit sarcastic here. So again, it's not about switching it off; it's rather the question of when it's worth the price that needs to be paid. And there are use cases where it is worth it, but there are many where it's not, I think. Yeah.
I mean, I totally agree with everything you said. I think we're all watching what's happening, and we're seeing this big rise of AI and the use and integration of AI everywhere. But for me, I mean, I'm very positive, I need to say that, but I think that's very obvious, although I also see the disadvantages. So I totally hear you. I think, to be realistic,
it will be so difficult to convince people not to do the birthday invitation with AI.
Because the problem is that it's already there. It's very convenient, it's very easy to use, and the experience around it is amazing. And I can't see the big tech companies slowing down with AI, especially not right now with DeepSeek. I feel like this actually even fueled the whole race for more computing power and more machines. And everything gets faster and faster, and it's so difficult to hold back.
But I think, yeah. It's too cheap, that's the problem. It's too cheap to use it. We are not paying the real price. Yeah, yeah, yeah, that's the problem. Last year, I had Hannah Smith from the Green Web Foundation as a guest on my podcast, and we were talking about that topic as well. And she said a pretty nice sentence: okay, if people had to pay the real price for using whatever Gen AI tool, they
would maybe go more often to, for example, a real designer to have their work done. So we are not paying the real price, and that is, I think, a huge problem. The price is either paid somewhere else in the world right now, or it's paid by our future, by our kids, or by ourselves in the next 10 or 20 years. And this is not part of the $20 I pay to OpenAI. And that is a huge problem.
That is a huge problem, but also an angle where we could maybe tackle it: making this price more transparent and also making sure that people pay this real price. And by people, I mean not only us, but also the big tech companies, who are also not paying the real price. And I think, of course, they are not paying it, but even if they had to pay, they wouldn't let us, the users, pay, because right now we are training the systems.
They need us. They are not giving us Gen AI at such a cheap price just for fun. They need us, every single one of us, to train those systems. So even if they had to pay, they wouldn't want us to pay, so they would cover it up. But I think one thing is super important that really stuck with me, because I have the same feeling:
that a lot of AI features that companies develop, especially smaller companies, not the big tech companies... I mean, ChatGPT is great and so on, but the one millionth chatbot that gets designed, I think that is sometimes very, very unnecessary. And there I see a lot of potential in thinking first about whether AI is the right solution in the first place, before you spend so much money and resources on just building something. And then you realize
no one really wants those features. And I feel, you know, there's this gold rush right now, so people think they need to do it, and then they're wasting all the money, all the resources, and not on the right things. They don't really solve the right problems. And this is also a typical UX problem. I think this is something where teams need to think first before they build something. Yeah, exactly. You're so right. It's like,
we again are in a time where people just use the technology because they want to use it, because it's there. They do not think about whether it makes sense to use it for their use case. And that's something that we need to talk about before we use these things. I mean, I also do workshops on this topic, and the positive thing is, doing these workshops, I can see that it works, right? The people start to think. So again, I'm a pretty hopeful person,
and there is hope. But yeah, we need that much more. We need much more critical thinking: does it make sense? But also critical thinking about who controls the model, who controls the data that was used. Is my own data used? I mean, look: you can upload your pictures and stuff to ChatGPT. OpenAI is storing that. Nobody knows what they're doing with it. You don't know.
Well, and I'm pretty convinced they will use it for training their stuff, right? So would I upload an image of, I don't know, my kid there? Maybe not. And these are the questions that we need to discuss, because especially the big tech companies do not act responsibly. That is the big problem that we see, right? We need
this discussion to be held somewhere else. And it's also, I mean, we had this discussion with DeepSeek, right? Everyone was like, oh yay, there is something new and it needs so much less energy, that's so great. And then there was this news where OpenAI said, well, they used
our tool, actually, to train their tool, and that's why they are so efficient. Which is pretty funny, because, I mean, OpenAI is scraping the internet and using all the data of other people to train their models. So it's a little bit funny that they are blaming them now. But yeah, in the end, there are these big companies that have so much control, right? Should such a tool be
in the control of only one company? I mean, seeing what's happening in the US right now on the political side, and seeing Mark Zuckerberg, I don't want to speak about Elon Musk, so let's say Mark Zuckerberg changing, right, and working so much towards Donald Trump. And I don't know, I should ask myself much more: if I use Meta's
LLM, is that good or not good? What influence does he, or do they, take? What data is used for training these models? What are the answers I do not get? Right? If I ask DeepSeek about the massacre in Beijing at the end of the '80s, you won't get an answer there, it's just blocked. So what are the answers I don't get from
ChatGPT, what are the answers I don't get from Meta, etc.? These are the questions that we need to ask ourselves. I have a lot of hope in the EU that we can create our own stuff here, actually, and create stuff that is in the hands of, or much more controlled by, the public, and not only by a company. Because in the end, yeah, we don't know what they do with our data,
we don't know what data is used for training it, and we do not know what they are blocking out, actually, and what the machine is not telling us. And yeah, these are the discussions we need much more of. And I'm a little afraid, actually, as I said, exactly because of what's happening in the US, because that's
much more similar to China than we think at the moment. I'm afraid that's so true. But unfortunately, I'm seeing that the EU is a little bit behind with everything. Totally. Even getting to the phase where ChatGPT or OpenAI is right now will take years.
And where will ChatGPT or the other big tech companies be by then? So I feel like they're so far behind, not sure why. I mean, there are different reasons, but this is what worries me a little bit. I don't have a lot of hope that they will catch up. How should they even? I would start at another end. At first, creating transparency. We need to teach people first
how Gen AI really works, what to expect from it and what not to expect from it. I mean, when I see this 'okay, I'm using ChatGPT instead of Google at the moment', I always feel like, yeah, but it works differently, and you should first understand how to interpret the results that you get there. And people just do not know. And I mean, you could say, okay, we
are not old, but at least middle-aged, whatever, but nobody is teaching that to the kids, to the next generations. They are just using the stuff, and nobody is telling them these things. And that's what makes me really afraid. So I think creating this transparency... I mean, what could be a good analogy? Let's say sending mail, physical mail, right? If I would
send mail with a company where I know they open my letter, make a copy of it, archive that in their archive, and then send the letter to whomever I want to send it, I probably wouldn't use that, right? Unthinkable. I'm a kid of the '80s, I was born in '78, so I still know the analog world. And
nobody would have used such a service, right? And I think we need to find these stories to explain to people what it means if they use this technology without thinking about it. And it's still their decision. But creating this base level of transparency, that's the challenge I see at the moment. And yes, you are right, the EU is slow. I mean, at least they do something now.
But so for me, it's two ends, right? One end is, okay, how can we in Europe, let's say, build our own models that are competitive? But the other end is also creating transparency about what it means and what the price is for using the other ones. Because that hopefully leads to people using it a little more informed, and probably a little bit less once they know what it means,
and that it maybe is not a good idea to upload an image of my kid to ChatGPT. So yeah, I have hope, but it is a battle. And sometimes, to be honest, it makes me furious to see all these unreflected postings about AI on LinkedIn.
And again, not because I want to switch it off. Not at all. I'm using it as well. And it makes sense to use it in some cases. But as we just said, and as you said as well, not for everything. Right. So it's rather the question when to use it and when not to use it and making that a default discussion before we start using it. That's something I hope for or work towards. 100%. And I feel what really helps there is education.
Because it's not like, as you mentioned, I mean, if you want, then create a birthday card with Gen AI, right? But I think the most important thing is also to have a look at your own workflows and then really understand what AI is really good at, where it can bring an immense amount of value to your projects and your workflows, so that you really excel in the things that you do,
and where things are, well, just fun and nice but the results are not so good. And I think this is the big problem, because a lot of designers don't really understand how to use it strategically so that it makes sense for their work, and where the things are that they actually need to do on their own, where maybe they shouldn't integrate AI. I mean, I can throw everything at AI and see what comes out.
But I feel like education is definitely something that's missing, not only for the younger generation but also for our generations, for designers especially, because I'm seeing non-designers using crazy AI tools. I just had a crazy experience with a client who used an AI heatmap tool, I don't want to name the tool,
to give some feedback on the design. So they basically wanted to test two versions against each other and were saying, "Oh, this one is performing so much better." And I was like, "Yeah, but we can't do that. We need to do testing." I mean, wonderful that you tried it out, but still. And in the end, we tested both versions, and the version that won with the AI heatmap tool lost. People didn't even see the button.
I was like, what the fuck? So I think it's a really fine line between staying up to date, using the tools that really push you forward in your career, and doing nonsense things that don't bring you anything and only harm the user experience. And I feel like, to bridge the gap, we need education, which is very difficult because there are also a lot of shitty posts on LinkedIn from people who have absolutely no idea and say they're experts,
but they are not, and they also don't care. And I think one problem with Gen AI is, I mean, the promise of Gen AI is that everything is just one prompt away. Yeah, and that's not true. I mean, and that's what I hear from many conversations with creative people who work with AI: it's never one prompt, it's always a process. So people who do AI art, for example,
they all say, hey, it's never one prompt, it's a process, it's a ping-pong game that I play. But in the beginning there is always my idea that I throw in, and then I work with that. And I think what many people think is, I don't even need this idea, it will create it for me. And I personally think that is so wrong, because the danger here is that
if we let AI create our ideas, then everything will become more and more similar, because that's how LLMs work, right? The strongest signals in the data set will always be the ones that the AI throws back at us. So things will become more and more similar if we only take the results from AI. AI is not able to recreate
the nuances, and you could say the nuances are what make us human, right? The things that are not typical. And therefore it's so important to always see AI maybe as a companion, as something that I can use for playing ping-pong back and forth. But if I think that it can do all my work,
that's, in my opinion, a totally wrong way. And it's dangerous, because AI by itself is never really new, it's not creative. Because in the end, Gen AI or LLMs are always a window to the past, right? They are trained on data from the past. And I had a nice conversation on the podcast with Tom Greenwood, a great person, one of the
trailblazers of web sustainability, I would say. And we were speaking about AI and about creativity. And he said so well that there are different forms of creativity, right? Sometimes I'm just more or less copying something that is there and maybe adding a little bit. And that's something you can do pretty well with AI. But then there are sometimes these so-called eureka moments, right? Where someone does something totally different from how it has been done before.
And that's something AI, in the form it exists today, cannot do. I sometimes use the example of back in the time when we had smartphones with keys and not with touchscreens. And then at some point someone put a touchscreen on the phone. Would AI have been able to think of a touchscreen phone if, back then, I had not explicitly told it to create a touchscreen phone?
No, it would maybe have made more keys and a bigger screen on the phone, but it wouldn't be able, on its own, to make the switch of, ah, let's get rid of the keys and put a touchscreen on it. So these eureka moments are something Gen AI cannot create, and if we rely on AI for all of our creativity, then we lose this. And especially for us creatives, people who work in the creative world, that is
super important to understand, right? Last year, when I saw that in Figma you can now create wireframes and designs with one prompt, my first thought was, okay, now everything will look even more similar than it already does. Is that a good thing? I don't know. Well, and then in the end it turned out they might use your designs to train their data set. And then maybe I, with a prompt, get a design that you created some weeks ago.
That's a totally different story then. So yeah, these are the dangers that I see. And what I try to say, right, is that it's not about not using it. It's about what the right way to use it is, and what the right way is to decide when to use it. And that is super important. And again, I'm hopeful, because what I see is, when I talk to people about these things, when I do
these workshops or give talks about these things, people understand, right? They understand the problem. So it's rather about creating this transparency, making it visible, and explaining it to people. As I said: education. And that is super important, I think. And we need much more of that. 100%. I mean,
Totally agree. To summarize, maybe could you share some tips for design teams? They're working on a project, the idea is to integrate AI somehow, maybe even train their own models or integrate it into their workflows. How can they still make this sustainable without losing
features that might help the product stand out from the competitors because also their competitors are using AI somehow. They don't know how, but they definitely do.
The question is, when will it be a competitive advantage that I do not use AI? Again, that's a philosophical question, but yeah, I don't know. Yeah, maybe. No, I don't know either, it just came to my mind when you said it. So, I mean, the first thing, I think, for every company or team, and it doesn't matter if it's a design team or a product team:
the first step for me is always to think about, okay, if we use it, what are the potential positive and negative impacts? Because it's this transparency that we are already lacking. In my workshops I use a pretty simple tool for that, for example, where you look at the use case and think about, yeah, what are the positive and negative outcomes. Because that's
the basis, actually, at least for a discussion: should I use it or should I not use it? That's what I wish for at first. And once you have done that, and you have the feeling that it still might be a good idea, then think about the use case and think about what it really makes better for your certain use case, right? Classic UX work: define the use case first.
And then you can think about whether it makes sense to use it or not. So there is, and that's maybe the sad answer here, a lot of work that does not include any AI or any prompting before you should even consider really starting to use it. And that would be my advice, actually. I mean, in my workshops, sometimes
I do a four-hour workshop, or I recently also did a masterclass, and there was no prompting in the whole masterclass. And I know that's not what people want to hear sometimes, I understand that, but there were so many other things we were discussing, right? So maybe as a little reminder: working with AI never starts with a prompt. Maybe that's a
good thought: do some other thinking first before you start prompting. I think that's a really good tip and super important: never start with a prompt, think first. Because actually, even when you start with a prompt, you need to think about what would be a good prompt, you know, what is something that I can actually ask. But use your own brain first
and last as well, right? You're totally right. Yeah, in the end, always also think about what came out of the machine: is it reasonable, is it correct, is it good? Having this extra thought is also super important. Yes, I agree.
I think it's fascinating. We're living in such an exciting time right now where we're still trying to make the UX better and trying to design great digital products.
But then all those things come in like GenAI and 3D metaverse content and everything. And it's such a big challenge for designers nowadays to really find a good way to approach things and to learn about those things. So thank you so much for being so open about all the learning that you're having when it comes to sustainability. I think it's absolutely necessary. It's very important. And also to be critical when it comes to AI.
I mean, I always say: be critical, but also curious. Curiosity will lead you very far, but critical thinking is something that you will need in the future, whatever happens, you know, a good future or the dystopian future that you mentioned earlier, where we are all living in the 3D world, the metaverse. Critical thinking is necessary. Yeah, you're totally right. It's not about criticizing everything, but it's about creating
the base layer of what the real impacts are, and then also thinking about the potentials, because in the end it's always about balancing things, right? There is never only right or wrong; there's always the question of what's the good compromise that we actually need. And yes, that's something where we need to get much better, especially when looking at the pace at which
AI, or Gen AI, but also technology in general evolves. We need to prepare ourselves, actually, to make better, or let's say more informed, decisions here. So yeah, that's what I hope for, that's what I work towards. And well, hopefully, I mean, that's the plan, hopefully soon there will be some sort of masterclass about this as well: how to use it responsibly.
Let's see, maybe when you release the podcast it's already there, but I can't promise. If it is, we will see. You will find it in the show notes. I'm definitely going to link it there as well, with all the other resources that you mentioned, so you can dive a little bit deeper into the topic, learn about it and about sustainability, and think less about the single user and more about the bigger impact: humanity, the environment.
It's super important, and definitely not on a lot of people's minds, I think, because they think it's so abstract and difficult. But even if you just have some mini ideas to improve the product in your organization,
I think people are very, very grateful for it. Because if you save some, you know, megabytes on the website and a bit of emissions, there's also money in that somehow, right? You need to store all that data, and in the end,
I think it's a really good thing that you're doing, and it could also become part of the branding. I really love the Patagonia example that you mentioned. I think it's wonderful; I have a lot of things from Patagonia, I love the brand, and when I think about Patagonia, that's exactly what comes into my head: this sustainability. I'm buying a jacket, they take care of it, I have this for a lifetime, and that gives this feeling of value. And
It's a bit more difficult with digital products, I totally get that. But why not give people more of a feeling that they are part of a good movement? So I think there's so much in the future that needs to be responsible and sustainable.
So long story short, thank you so much for being here, Thorsten. It was wonderful having you. Thanks for sharing all those learnings and insights. I also learned a lot, so I love that. And yes, thank you so much for being here. Thank you. Thank you so much for having me. As I said in the beginning, it's a great pleasure and also an honor. And maybe one last thing, just
one sentence, actually. Sure. Because you mentioned that as well, right? It sometimes seems so difficult to do all these things, and it seems like this huge mountain that I can never conquer. And for you out there: don't look at this whole mountain. Look at the one thing you can do better by tomorrow, by next month, by the end of the year. And that's the mindset that we need, right? I will do these things until the end of my life. We will never reach the point where we can say, okay, now we are done, the world is saved. That
will not happen. So that's the mindset that we need: what's the one little thing I can do better by tomorrow? And that always makes an impact and leads to something. Love that. I think those are perfect last words, something that we can all take away and carry through the week, the month, maybe the year. So again, thank you so much for being here.
I'm going to link all resources in the show notes so you can check it out and then talk to you soon. Bye-bye. Thank you for having me. Bye.