Hello and welcome to the Future of UX, the podcast where we explore the trends, challenges, and innovations shaping the future of design. I'm Patricia Reiners, and in each episode we dive into the intersection of UX, technology, and human behavior so you can stay ahead of the curve. Today's topic is one that every UX designer thinks they understand, but do we really?
Are we designing for how people actually think or just how we assume they think? We rely on
psychology-based UX laws like Hick's law: too many choices slow people down, right? Compare a very traditional TV remote with its many buttons, so many decisions, to the one Apple designed, where you just have five or six options. Much less, much easier.
Or Fitts's law: bigger buttons are easier to click. Makes sense, right? Or Jakob's law: users expect interfaces to be similar to what they already know. Also makes sense, right? If you look at our digital interfaces, how buttons look is basically how physical interfaces look too, where you press a button, for example.
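For listeners who like the math behind these laws: both Hick's law and Fitts's law are simple logarithmic formulas. Here is a minimal Python sketch; the constants `a` and `b` are empirical and vary per device and user, so the values below are purely illustrative, not measured data.

```python
import math

def hicks_law(n_choices, a=0.2, b=0.15):
    """Hick's law: decision time grows logarithmically with the
    number of equally likely choices. a, b are empirical constants."""
    return a + b * math.log2(n_choices + 1)

def fitts_law(distance, width, a=0.1, b=0.1):
    """Fitts's law (Shannon formulation): time to hit a target grows
    with distance to it and shrinks as the target gets wider."""
    return a + b * math.log2(distance / width + 1)

# A remote with 40 buttons vs. Apple's six-option remote:
print(hicks_law(40), hicks_law(6))   # fewer choices -> faster decision

# A small, far-away button vs. a big, nearby one:
print(fitts_law(distance=400, width=20), fitts_law(distance=100, width=50))
```

The key point both formulas share: the relationship is logarithmic, so doubling the number of choices or the distance does not double the time, which is exactly why these laws are so easy to over-apply to real-world situations.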
But what if these rules oversimplify human psychology? What if they don't apply in real-world situations? In this episode we will explore: when psychology-based UX rules fail in real life,
why people don't always act logically and what that means for design. We'll also talk about the dark side of persuasive UX, when psychology or behavioral design is used to manipulate. And we will talk about the future of UX psychology, from behavioral AI to brain-computer interfaces. I would say let's get started with a very interesting question.
So a couple of weeks ago I wanted to book a hotel room for a trip to Berlin. Currently I'm living in Zurich in Switzerland. Sometimes I'm going to Berlin, sometimes for work, sometimes just to meet friends. So I wanted to book a hotel room on a very famous hotel website. And I saw a hotel that I liked and this hotel had a bright red message: "Only two rooms left at this price."
I was already quite in a hurry because I was a bit late with booking the hotel room, so the prices were already pretty high. When I saw this message, only two rooms left, I instantly felt the urgency to book immediately. I didn't want anyone else to book the room so that I would have to pay much more. And after I booked this room, I realized that the sign was still there.
So there was still the sign "Only two rooms left at this price". Also the price didn't change and those two rooms weren't really the last ones available.
And this is a classic UX psychology trick, or even a dark pattern: pushing people into doing things they actually wouldn't do by using the scarcity effect. People are very, very afraid to lose something, so what works really well is to scare them. The problem is that it works super well.
But it's not very ethical. I recognized that this was a little trick from the website, definitely not ethical and possibly not even legal. It destroyed the trust that I already had in the company. Trying to trick people into things they wouldn't do can backfire, and it happens when we trust rules instead of observing real human behavior. Because users don't always act logically.
Let's talk a little bit about how people don't do what they say they want. People say they really value privacy, but they also click "accept all cookies" every time, because that's much more convenient than going through all the different toggles and accepting every single one. So they prefer less privacy, as long as it's easy and quick. People say
they want more control, but they also trust AI recommendations over their own decisions. And people say they want fast experiences, but spend hours browsing Netflix instead of watching something. Everyone who has done user research at some point knows that there is a huge gap between what people say they want and what they actually do. This is just how people work. And why does that matter for UX?
First of all, usability testing often fails because users behave differently in the real world than in test environments. Surveys and interviews can be misleading; users don't always know why they make decisions. And choice overload is real, but removing too many choices can also frustrate users instead of helping them.
So why do people usually ignore search filters? So most e-commerce websites have advanced filtering options on Zalando, on Amazon, on basically any e-commerce website you can think of. You can filter the price, you can filter the color, the size, the brand, but users still prefer to scroll endlessly instead of using them, right?
Because they trust their intuition over logic, and users browse emotionally, not systematically. They might not even notice the filters exist, and it's a lot of work to really go through them. Okay, now let's come to this episode's sponsor, Wix Studio. Web designers, let's talk about the C word: creative burnout.
Your client site has real portfolio potential, but between resourcing, feedback, tight budgets, and ever-tighter deadlines, it just doesn't make the cut. Wix Studio helps close that gap. Built for agencies and enterprises, you can bring your own vision to life and keep it alive with no-code animations, tons of AI tools, reusable design assets, and advanced layout tools. For your next project, check out wixstudio.com.
That's wixstudio.com. So what's the key takeaway for us? We should focus less on what users say and more on what they actually do. This is also why it's so important to include user research in every single product that we build. Whether it's an AI product or a traditional digital product, we need user research, because we need to watch what people actually do.
And although I am a huge fan of integrating AI into every step of the design process, I know that AI won't be able to replace that. Observing the gap between what people say and what they actually do is something that only humans can do, and it's super important. I think this is a great example of how AI is a huge help, especially when it comes to research and design, while there are also a lot of areas where we need the human being at the center. Another big question that I'm asking is: where is the line between helpful nudges and manipulation? We have all probably heard about dark patterns, these UX tricks that force or mislead users. A little bit like what happened, or actually exactly what happened, to me when I booked the hotel.
It could also be the unsubscribe maze. You have probably all seen that at some point: some websites make cancelling a subscription intentionally difficult. I recently signed up to some product, I forget which one, and when I wanted to cancel, the button was hidden somewhere under a menu that didn't really make sense. It was absolutely horrible. What this shows is that companies push for engagement and conversions even at the cost of user well-being. So maybe UX designers should be held to an ethical standard, like doctors or architects. Another example, one that I really, really like, is the tool Myra, which I already talked about in another podcast episode. They make cancelling a subscription very, very easy: you can cancel on any day. And I think this is very uncommon. Usually when you cancel a subscription, you still pay until the end of the month or the current billing cycle, and only then is it cancelled.
But Myra does it differently. They say you can cancel any day and get your money back for the rest of the cycle you already paid for. I think this is a fascinating approach. They are not trying to trick you into staying on a subscription that isn't a good fit, because they are focusing on building a good product. And they also know that
users come back and forth. And this is the thing about cancelling a subscription: it doesn't mean that you lost this user, it just means that for a certain time the product is not relevant to them. This is something we should definitely keep in mind, thinking more about the long-term effect. Now let's think a little bit about the future.
And the question of how will new psychology research change UX in the next 5 to 10 years? So what does the future of UX look like? First of all, from my experience, it's definitely behavioral AI and hyper-personalized UX. So AI will be able to predict what users want before they ask. So UX will shift from these static designs to more adaptive experiences.
A really good example is Netflix, or Spotify, or even Google Maps. They already do this with content. If you compared your Netflix account to mine, it would look completely different, because all the recommendations you see on your Netflix page, as well as the cover images and preview visuals, would be completely different.
And when we're thinking even further into the future: will we design for brainwaves instead of clicks? Brain-computer interfaces could change interaction design completely. Think about Neuralink, which is already developing devices that let people control computers with their thoughts.
This is basically where you implant a little device in a human's brain, it analyzes the brain waves, and people can control digital devices with it. This is a startup from Elon Musk, and it's very fascinating. For now, it's only used for people who are paralyzed or sick.
But potentially, thinking about the future, when this becomes more common there might be some people who are willing to get an implant themselves. So the future of UX is not just usability. It's understanding the human mind in deeper ways than ever before. UX is more than screens, and UX is more than layouts. It's about understanding how people think. That's how it is right now, and also how the future will look.
And I think it's super important for us to get prepared for everything that's ahead of us. If you want to share your thoughts, feel free to connect on Instagram or LinkedIn. You can find me at ux.patricia or patriciareiners on LinkedIn. Let me know what you think about the future of behavioral design, of UX patterns, and also of dark patterns. And
If you liked that episode, feel free to give us a rating. It's something that I would really, really appreciate. It's a little thank you. So if you had some aha moments, some learnings, please give us a rating here on Spotify, on Apple Podcasts, wherever you're listening to the podcast. And now, thank you so much for listening and see you or hear you in the future.