Hello and welcome to the Future of UX and to today's episode, where we will delve into the revolutionary world of the Apple Vision Pro. As the latest entrant into the realm of spatial computing, the Apple Vision Pro has stirred the tech community with its ambitious features and cutting-edge technology.
My name is Patricia Reiners. I am an innovation designer from Zurich, Switzerland, and I am your host for this podcast. Before we get started with the episode, with the insights, I just want to share something. At the moment, the AI for Designers training is taking place over five weeks.
And I personally really love to see how the participants are learning, developing further and also how they're working on their AI projects. So for me, it's really fascinating to watch them and see how they're growing. If this sounds interesting to you, here is some amazing news: although the AI for Designers training is currently already happening, there will be a second round coming.
I can't tell you exactly when it will start. But if you don't want to miss out on the start, be sure to sign up for the waiting list, which you can find in the show notes. If you're on the waiting list, you will not only get earlier access to the course, but also some exclusive bonuses.
So don't forget to sign up, and I would say now let's get started. In this podcast episode, we are going to explore the impact of the Apple Vision Pro, especially from a UX perspective. We are going to share some stories and insights. We will also talk about what it means for the future of user experience design when we are suddenly working with spatial computing, with the Apple Vision Pro, with AR, VR, with mixed reality glasses.
First of all, what is the Apple Vision Pro? Maybe you're living under a rock somewhere and you haven't noticed that the Apple Vision Pro is now available, for a price of around $3,500, depending on the configuration you select. And this device really represents Apple's first foray into mixed reality.
It offers a blend of high-quality hardware and immersive experiences. It has 23 million pixels across two panels, and it promises unparalleled visual fidelity.
The device has eye tracking and gesture control that really stand out for their precision, allowing users to interact with the digital environment effortlessly. I think it's a super fascinating, innovative way to use eye tracking to select elements and scroll through the interface.
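A quick side note for designers who want to prototype this interaction model: on visionOS, the gaze-and-pinch pattern maps onto standard SwiftUI controls, so a minimal sketch along these lines should be enough to get a feel for it. This is an assumed example, and the PlaylistRow view is purely illustrative, not anything from Apple's sample code.

```swift
import SwiftUI

// Minimal sketch (visionOS + SwiftUI assumed): the system drives selection
// with gaze and a pinch, so a plain Button already works. PlaylistRow is
// a hypothetical view used only for illustration.
struct PlaylistRow: View {
    let title: String
    var onSelect: () -> Void

    var body: some View {
        Button(action: onSelect) {
            Label(title, systemImage: "music.note")
                .padding()
        }
        // Highlights the row while the user's gaze rests on it,
        // before the pinch gesture confirms the selection.
        .hoverEffect(.highlight)
    }
}
```

The interesting design consequence is that targets need to be large and visually calm enough for the gaze highlight to feel precise rather than jumpy.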
And unlike traditional VR headsets, the Vision Pro really addresses some common issues like latency, motion sickness and isolation with its R1 and M2 chips, ensuring a smooth and immersive experience. So from a technical standpoint, this is an absolutely fascinating device. You can basically use it as a MacBook, right? You don't need your MacBook. You can use it as a standalone computer to get your work done, to run any kind of live workshops. And what's so interesting is that you can also
define the level of immersion. So you decide if you really want to shut the real world out completely and dim the background, or if you just want some mixed reality features, like a separate screen, for example, next to your MacBook. For me as a UX and interaction designer, the user experience is of course the most interesting part of this device.
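To make that immersion dial a bit more concrete on the development side, here is a rough, assumed sketch of how a visionOS app could offer both a floating window and an immersive space whose style the user steps through; WorkspaceApp and the scene contents are hypothetical, not Apple's reference code.

```swift
import SwiftUI
import RealityKit

// Hypothetical sketch: one regular window (the extra screen next to the
// MacBook) plus an immersive space whose style controls how much of the
// real room stays visible.
@main
struct WorkspaceApp: App {
    @State private var immersion: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            Text("Floating workspace window")
        }

        ImmersiveSpace(id: "workspace") {
            RealityView { _ in
                // 3D content for the workspace would be built here.
            }
        }
        // Lets the user move between passthrough, partially dimmed,
        // and fully immersive presentations.
        .immersionStyle(selection: $immersion, in: .mixed, .progressive, .full)
    }
}
```

So "how much reality do I keep" is literally a parameter, which is exactly the kind of knob UX designers should be deciding about deliberately.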
And this is probably also one of the most groundbreaking aspects of the Apple Vision Pro: the interaction model. Foregoing traditional controllers, it employs, as I already mentioned, eye tracking, hand gestures and voice commands for navigation.
This innovation is definitely a leap towards more natural user interfaces, but it also opens new doors for UX designers to rethink how we interact with digital content overall. And what this also shows is that the device's Infinite Canvas and the ability to run both specially designed and existing iPhone and iPad apps offer a glimpse into the future of personalized immersive computing. For me as a UX designer, it's super interesting to observe how users are interacting with the Apple Vision Pro. There is always this gap between someone developing something, designing an app, and then seeing actual users interacting with it, using it. And I think this is super fascinating. At the moment, when you open Twitter or X, Instagram or LinkedIn,
basically every feed is buzzing with these quirky videos, a mix of intrigue and a dash of absurdity. You see people driving their Tesla with the Apple Vision Pro on. You see people in a coffee shop ordering coffee with the Apple Vision Pro. And of course, this looks very bizarre, a little bit dystopian.
And it also looks like people are completely shut off from the real world. Because although you can kind of see the eyes from the outside, it doesn't really feel that way. It feels like someone is in their own world. And if you are just standing by and communicating with someone wearing the Apple Vision Pro, you don't really know how much they actually see of their surroundings, or whether they see you at all. So this is definitely a little bit problematic. Another interesting problem that I'm also seeing is people who are using the Apple Vision Pro, wearing it at home and basically preparing their homes for the Apple Vision Pro. So what does that look like? People are placing a screen on top of their stove, right, to watch YouTube cooking tutorials while cooking.
They're placing a notes app next to the fridge to write down a grocery list, for example. They are also placing a music interface next to the TV. They are preparing a dedicated workspace with a desk and a lot of different screens. And I think seeing that is so fascinating for me, because it seems like we all wanted fewer screens, right? We want less screen time.
And what some of the people currently using the Apple Vision Pro end up with is many more screens, because, although they can of course remove them, they set things up so that the screens actually stay there. I think from a UX point of view, it's super fascinating to see that, right? Because what this really means is that although the design looks nice and it's amazing for a first-generation device, so don't get me wrong, the Apple Vision Pro really shows that the UX needs a serious upgrade. It's almost impossible, right, for a user to feel confident, and for this to be a healthy environment, when you wear the Apple Vision Pro or any kind of mixed reality device and you have a lot of screens all around you that are distracting you. And we as designers know that you need focus to be productive, to have a good life, to know where to look, right? No distractions.
So how do I see the future? What do I see as the next step that is probably going to happen? Because the Apple Vision Pro is fascinating. It's amazing to see.
But for me, it's also a little bit shocking to see how people use it in their private, personal life, placing screens basically everywhere, which from a design perspective is not ideal. So how do I see the future? Some of you might remember one of the podcast episodes that I shared, I think about four weeks ago, about the Rabbit R1. The interesting thing about this device is that it works very differently, right?
Its AI model was basically trained on workflows. So it removes the steps between the intention and the final solution. And I think that the Apple Vision Pro, or any mixed reality glasses, need a similar approach, where you don't place content everywhere all the time, but surface it based on situations, based on locations maybe, based on an input, on an interaction, right? Like: play me some music on Spotify that's perfect for dancing. You don't need to see the whole Spotify interface for that. You just don't.
You can start with a voice prompt, then see some recommendations, select one of those, and then the interface disappears again, right? So you don't need to see the Spotify interface all the time next to your TV. And this is what I assume the future will look like: integrating these large language models into these hardware components.
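Just to illustrate what that appear-then-disappear flow could look like, here is a small hypothetical sketch in Swift. None of this is a real Apple or Spotify API; it only models the idea that the interface exists briefly between the prompt and the choice.

```swift
import Foundation

// Hypothetical model of an intent-driven assistant: the UI is only
// present while the user is choosing, then it goes away again.
enum AssistantState {
    case idle                            // no interface anchored in the room
    case suggesting(options: [String])   // small transient card of results
    case playing(track: String)          // audio continues, card is gone
}

struct ContextualAssistant {
    private(set) var state: AssistantState = .idle

    // "Play me some music that's perfect for dancing."
    mutating func handle(prompt: String) {
        // In a real system, an intent model or LLM would map the prompt
        // to a handful of candidates; here they are hard-coded.
        state = .suggesting(options: ["Dance Hits", "Funk Essentials", "House Party"])
    }

    // The user pinches one suggestion; the card can then be dismissed.
    mutating func select(_ track: String) {
        state = .playing(track: track)
    }
}

// Usage: the interface surfaces only for the moment of choice.
var assistant = ContextualAssistant()
assistant.handle(prompt: "play me something for dancing")
assistant.select("Dance Hits")
```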
I think it's a little bit different when it comes to working, for example. This is for me personally the area that I find the most promising when it comes to mixed reality glasses: working remotely from basically anywhere, having screens around you and creating your perfect work setup, half remote, hybrid, whatever, and doing really productive workshops from your home. This is absolutely fascinating to me. And there is so much potential, especially if you compare it to the situation that we have at the moment, where we run workshops on a tiny screen, or maybe we have a big screen at home, but still we're interacting on a digital board
with tiny videos from people and it's so difficult to really interact with each other. So I feel in the work context, there is so much potential when it comes to spatial computing.
A little summary: the move towards spatial computing demands a re-evaluation of user interfaces, prioritizing context awareness and situational adaptability. As UX designers, we stand at the forefront of this new era, where designing for 3D spaces and augmented reality will become the norm, right? Although this is still very futuristic, step by step,
more people are going to use these devices, and although it might look a little bit bizarre at the moment, this will become very common pretty soon. And the Vision Pro, with its impressive but initial steps, invites us to really imagine a future where the digital and physical worlds seamlessly blend, offering more intuitive and engaging user experiences.
So the Apple Vision Pro is more than just a new gadget. I think it's definitely a signal of the transformative changes coming to the UX design landscape. At the moment, it's showcasing the potential of spatial computing, but it also highlights the areas that need refinement for truly immersive and user-friendly experiences. And with AI, a lot of things become possible. And I know that Apple is already working on these topics at the moment.
But not just Apple, also Meta is working on similar devices, so this will be a super interesting year. I would say thank you so much for joining us in this episode as we explored the Apple Vision Pro through a UX lens. Stay tuned for more insights into the evolving world of technology and design, and hear you in the future.