Hello and welcome to another episode of App Stories straight from Apple Park. I am here with Federico Viticci, and this episode is sponsored by Click for Sonos and Elements by RealMac Software. Hey, Federico. Hello, John. It's good to be back here in the podcast studio. It is. Here we are. Just you and me this time. Just you and me. Okay. I'm maybe a little more sunburned than I was yesterday. See, I told you.
I went to the beach right before coming to WWDC so that I could get, like, a base-level tan, so that I wouldn't get sunburned. I had a little bit of a tan before I got here, but, you know, I didn't want to muss my beautiful hair before we did our show, so I didn't want to wear a hat. Sure. Okay. You don't need hat head when you're doing a video podcast. Yeah.
Okay. All right. Tell Chris Lawley about that. Yeah, I'm sure. But it's just you and me, and we're going to do like a regular app stories. We are. We are going to talk about some of the stuff we've seen this week. It's been amazing. So on Mac Stories, we...
thanks to our incredible team, Niléane, Jonathan, and Devon, back at their respective homes, they've been working on all the overviews. They've been working very hard and very late into the night. And we really should thank them here, because they've done a lot for us, because you and I have been running around like crazy people, not able to do any writing yet. Although I hope to this afternoon. Yeah, we were just incredibly busy with other things. But they published the overviews of all the major operating systems.
And you and I are now in this position where we're here, but we're still just processing our first impressions. Yes, and also just trying to catch up with all those details, because, I mean, Jonathan's been sending me some details, things where I was like, oh, I didn't realize that was a thing, you know, because...
It is a little hard when you don't have time to read every last footnote in every last press release, but we're getting there. We're catching up. Yeah. So obviously like I'm going to have thoughts about iPadOS. You're going to have thoughts about MacOS. I kind of wanted to start with the Vision Pro and VisionOS. All right. That's a good place to start. If anything, because like I really am fascinated by the idea of widgets and
And placing widgets almost as objects in the physical space. It was really interesting to me because, in our Discord this morning, I saw Devon had a message. And Devon had tried it last night. And he was unsure of it, I think, at first. Having seen the keynote, he wasn't sure whether he would like to have those widgets embedded in his wall or that kind of thing. But he tried it, and he loved it, honestly.
So he's a big fan, and he'll be doing our visionOS review in the fall. Yeah, that idea was in the keynote, and there are actually some third-party apps that have been Sherlocked by that feature. But the idea of placing a photo on the wall, and it almost becomes a window. There were dedicated visionOS third-party apps. Day one, there were, yeah. They did exactly that, and now you can just put a photo on the wall. And, like, a clock, for instance, that you could put on a shelf or on the wall. I told you yesterday during the keynote, oh, this is useful, because I don't have a clock on my wall. Now I can just have a virtual one. Are you going to go digital or are you going to go analog?
You know me, I cannot read analog clocks. All right, there were a lot of analog clocks that were shown off that looked very nice. I know, I just like numbers. Okay, I hear you. I hear you on that. But this idea of like, you can have some widgets, I don't know, in your living room and some widgets in the office, or like even, you know, you can have some widgets at home and other widgets in a completely different location. Yeah, that's an important factor, I think, that it remembers where things are in space, right? And that even, that goes beyond just...
it goes to restarting the Vision Pro, I believe. Even if, you know, your battery died for some reason and you had to reboot from fresh, those widgets will still be in the same place. And windows, too. I mean, the apps that you have open, too.
The other thing that I saw, and I was actually watching some videos last night: the new Personas look incredible. They really do. Much more realistic. Like the complexion, the detail in the eyes, the hair. A little more three-dimensional too, I think, like more of the sides of people. Because they've really come a long way since those original ones. Oh, I mean, even if you compare the first version to the latest one. To the point where I was looking at one of these videos on YouTube,
And it was like, now that's a pretty good replica of somebody who's not physically there, but can be virtually there with you. Are we going to start doing virtual meetings, Federico? Kind of, maybe. Maybe, I'm thinking we should. Because when you pass that threshold of like the uncanny valley of somebody who looks like John, but he's not quite John, when you pass that,
I think it becomes a lot more... Well, especially the reality. I mean, our reality is you and I, this is our first time together in two years. So I could see some utility for that. You do look better than your persona. Thank you. Thank you. You do too. No, thank you. Appreciate that. And the other sort of big picture topic is...
sort of shout out to VisionOS for laying the groundwork for Liquid Glass. Yeah, absolutely. Absolutely. There's a lot of lineage there. And having like a call out in the keynote saying like inspired by VisionOS and obviously like
Liquid Glass takes on different shapes and forms depending on whether you're using the Apple Watch or a Mac or an iPhone. But Apple actually said it was inspired by the work they did there. Now it's paying off. Yeah, it really is. And I think, yeah, we should talk a little bit about Liquid Glass, because you and I were having dinner last night and talking about it. And it's very interesting, because when you think about glass... I mean, we were sitting at dinner and we took one of the water glasses and we were kind of putting it over the menu, because you get some of those kinds of effects. That light refraction. Right. When you're doing different selections in a tab bar or something like that. And, on the one hand, when you think about glass, glass is not this fluid, liquid type of thing. However, with Liquid Glass, this new design language, it is. So it's kind of its own thing, but it feels very natural and organic. And then the way that it kind of flows and moves around your touch gestures. I mentioned this on Comfort Zone, which I'm going to be a guest on, also from the podcast studio.
And like, I think it's interesting that you have that contrast between something that in the physical world is so rigid and fragile, almost. You know, glass is fragile. But in this case, it's like malleable. It's fluid. It's viscous, maybe even. Yeah, because you mentioned how you've seen glass blowers in Venice and in Italy before and that it was like that.
when they had heated the glass and it had kind of that stretchy quality. Yeah, and it's not something that you typically associate with the physical material, but in this case, I mean, it looks really good. And, you know, from...
You've got to think about when somebody who's not a typical AppStories listener, you know, somebody in the real world, like the teenager in the real world, looks at iOS 26 and says, oh, you know, that looks cool. Yeah. Like, that's the new one. Right. And it's the reflections. I mean, I think what it really shows is the advancement of the chips, the chipsets that we're seeing, both the graphics and the CPU, that there's enough excess power that can be brought to bear on those kinds of reflections and animations without also destroying the battery life at the same time. So that's a hard thing to do, and it's taken a while, but I think it looks great. I have to assume that it all has to be GPU accelerated. With Metal 4. With Metal 4, because that's the sort of thing that
I mean, it's basically like you're doing something video game adjacent, almost. It is like ray tracing, kind of. Yeah. When you're handling all those reflections and refractions, which are two separate things. And you have this element that is fluid and sort of follows your finger at 120 frames per second. That's a lot of video gamey stuff, but applied to just, I don't know, opening Photos, for example. Right. Just selecting an email. Yeah. I mean, oh, this email never looked better with Liquid Glass. So, yeah, I really do think that the spatial Personas with the Vision Pro look really nice. And I am very keen to try the widgets.
There's something about the idea of placing these little objects around you and finding them again. I think what I'm anticipating, at least, is that there'll be a greater sense of permanence: when you use your Vision Pro, it won't be like you're setting things up all over again, because things will be there, the widgets, the apps and whatnot. And that, I think, will make it a better experience overall, even though it's not necessarily changing those apps themselves. Yeah.
This episode of App Stories is brought to you by Click for Sonos, the fastest, most polished Sonos client built natively for every Apple device you own. iPhone, iPad, Mac, Apple Watch, Apple Vision Pro, and now Apple TV. Instant control means no lag, no loading spinners, just seamless playback.
Click was engineered specifically for Apple users, with deep system integrations like interactive widgets, Live Activities, Shortcuts automation, a handy Mac menu bar controller, and Control Center support.
Setup takes just seconds. No Click account and no Sonos login required. Just open the app, connect your system, and enjoy. Effortlessly group speakers, fine-tune volume room by room, or enable volume syncing so the whole house follows your lead.
You can create custom scenes for movie night, your morning routine, or a wind-down mix, and then trigger them with a single tap or a voice command through shortcuts.
Click streams from all of your favorite services, including Sonos Radio, Apple Music, Spotify, Tidal, Plex, and TuneIn. Audiophiles will appreciate lossless and Dolby Atmos support where those formats are available. Plus, Click includes a ton of thoughtful extras, like customizable widgets, subtle haptics, and an immersive visionOS Now Playing view that feels right at home in spatial computing.
The app is actively developed by a small indie team that really cares, shipping frequent updates and listening to user feedback.
For a limited time, AppStories listeners can unlock a special deal. Get one full year for just $9.99. That's 30% off. Or grab lifetime access for $30, which is 50% off. Visit click.dance.com to learn more and claim the offer and experience Sonos Control the way it should be. Again, that's click.dance.com.
Our thanks to Click for Sonos for their support of the show.
Let's switch gears. I want to talk about the Apple Watch and watchOS. We got a few demos of watchOS 26. I want you to talk about Workout Buddy in a minute; I just want to mention a couple of things. Yeah, the Buddy is the star of WWDC. The two things I want to mention that I sort of latched onto: there's a Notes app on the Watch,
which is making me rethink. See, I didn't even bother to tell you this a couple of weeks ago. No, I know you. What? You went back to notes from Obsidian because... To what? To what? To notes. No. No? No, and I didn't tell you because... Oh, you went to Notion. Yeah, because you were going to be upset that I was using Notion again. But now, between some of the Apple Intelligence integration that we're seeing with shortcuts, and we're going to talk about that in a minute, and this thing on the Apple Watch...
I'm sort of thinking, oh, all those things that I was actually doing in Notion, I can pretty much rebuild them in Apple Notes. And then do it in a way that's like fits more with the way you work, I'm sure. Especially with shortcuts. Yeah. And so, I, honest to God, like, I didn't think I would come into WWDC being like an Apple Intelligence fan. No.
I'm leaving WWDC thinking, you know, actually, I'm doing a bunch of things with Apple intelligence. It's funny. Anyway, there's notes on the Apple Watch. And I will say about the notes on the Apple Watch, Federico, is that to me, having that kind of thing on the Apple Watch really kind of makes or breaks what app I'm using because it's like you're out for a walk, you're running errands, you think of something you have to do when you go home. And it's not necessarily a reminder, but it's like,
an idea you have for a story or something you need to talk to somebody about. And it makes a big difference if you can just go to your Apple Watch and input the information right away. It supports dictation. Yes. So you can actually see all of your notes. I saw a demo that was, like, a full note with links. Okay, I didn't see that demo. I was so tempted to ask, can you click on the link? But I think it was hyperlinked, so I'm assuming you can click it. Was it to the web or internal? Do you know?
Okay. I don't know. I don't remember. And the other thing is the wrist action for quickly dismissing a notification. That'll get me to upgrade my Apple Watch this fall, I think. I really do. That's the thing, because I am on an Ultra One. Same here. Which doesn't support the double tap. Right. Will not support the flick of the wrist thing. But yeah, those are the two things that stood out to me.
Workout Buddy. Workout Buddy. So, Workout Buddy. You're a runner. I am. You have been a runner. For my whole life, pretty much. Yeah. Okay. Ever since I was a teenager. Okay. What do you think of Workout Buddy? I like it. I think it's a great idea. I think it is a shame that you have to have your phone with you. Right, because it's doing Apple Intelligence, and obviously Apple Intelligence is, at the moment... Right. It's a combination of things that are going on. I mean, obviously the app is on your watch, and it's communicating with your phone, and it's also communicating with Private Cloud Compute. And with all of that, it's generating spoken words of encouragement for you. It's like, you know, you start a workout and it says, oh, I saw you went for a long walk yesterday, why don't you see if you can crush this 5K today? And you go out and you do your run, and along the way it'll give you things like
your stats, you know, what was your mile split, and, you know, this is your hundredth mile you've run this year; it'll give you little bits of information, or your heart rate's at 95, keep pushing it, that kind of thing. And I like that. I mean, I've seen other apps do these kinds of things before. I've never really gotten that into it myself, but, you know, for me, I have a cellular Apple Watch because that's the whole point: I want to go out for a run and have my
podcasts and my music with me, and having, like, a coach with me too would be really great. Now, especially in the colder weather, I'll wear some sweatpants or something where I can bring my phone with me. I prefer not to, but I also do other kinds of workouts where that'll be totally fine, because I do take my phone with me when I go, say, for a long walk on the weekend with my wife. You know, I will go for, like,
a five-, six-, eight-kilometer walk, and I'll bring my phone. Maybe not with her; I wouldn't listen to Workout Buddy. But if I do that by myself, I would like to get that encouragement. Yeah, yeah.
You can choose between multiple voices. Yeah, there are two voices. Two voices. Voice 1 and voice 2. Voice 1 and voice 2. They're always named Buddy. So it's always the workout Buddy. It doesn't have like a person name or anything. Yeah, I am going to upgrade to a new watch this year. I think it's time because you and I both are in the like mid to high 80% battery health. It's time. It is time, I think. Before we talk about other platforms, I want to mention AirPods.
Because this is a very particular thing for me. I wrote this article a couple of months ago about like me discovering that I'm the sort of person who likes to talk out loud by myself.
to sort of brainstorm ideas and record myself in the car or doing chores around the house. So just talking out loud to, I don't know, brainstorm, you know, save ideas, save some notes. You do this when you're by yourself though, right? Not when other people are at home. Not in front of people. Okay, good. By myself, by myself, in front of the dogs. Sure. Oh, they're okay. They understand.
This year, Apple is going to do better, I'm reading. They mentioned studio-quality audio recordings on H2-compatible AirPods. Okay. I can tell you, having tested a whole bunch of wireless earbuds, despite the fact that I love my AirPods 4, they were not great for just recording myself. Right. Very compressed, very rough,
robot-y sounding voice. I am so glad to hear that they're doing better recording for that kind of purpose, just talking to myself. Right. Well, in the demo, they showed someone doing vocals for singing. And I mean, if it's good enough for that, it would be definitely good enough for dictating your thoughts. Yeah. And for all the influencers out there, you will also be able to start a camera remote from your AirPods. So if you're
recording the video. - Got your selfie stick in front of you. - Got your selfie stick, got your iPhone on a gimbal or whatever, you can start a video from the AirPods. So no need to mess around with that. - Yeah, I think those are cool features. There's a couple of the smaller things, but I think that those are both great.
tvOS. Is there anything that we should say about tvOS? It's getting Liquid Glass. You can't really tell that it's... I mean, kind of. Well, there's a couple of things. I mean, I think that, in a way, tvOS had that kind of look a little bit before, in terms of the depth, not so much with the reflection, I don't think, necessarily. So there will be a difference there, and I think I'll have to see it to really get a sense for how much different it is
in terms of the design elements. But there is new poster art when you're going through sections with lots of movies and TV shows. There's a change in the way the artwork looks that I thought looked really nice in the keynote. You know, when we get home, I'll probably put on the beta and check it out, but I haven't done that yet. Yeah. There's also the consideration that tvOS is the only
Apple platform that doesn't move. Yeah, that's a good point. Right? It's the only static one, because you're putting it on a TV. I mean, unless you're Sigmund Judge
and you have a crazy setup with the portable Apple TV. But by and large, most people just... It sits on their media center and doesn't move. It's not like a laptop. It's not like a phone. It's not like a watch. It's not like the Vision Pro, which is literally on your face. So it's static. And because of that, I'm guessing it loses some of that. Like, for example, on the phone with iOS 26 and Liquid Glass, you get the 3D effect when you move your phone. With the TV, you're not moving your TV. Right. Well, and the TV is obviously not part of the Apple TV itself, so it's like, you know, they're not controlling that. Yeah. Yeah. So,
This episode of App Stories is brought to you by Elements by RealMac Software. Elements is a modern drag-and-drop website builder for the Mac that just launched after more than three years in development. Its powerful WYSIWYG editor lets you build fast, responsive static sites visually. Every project is fully based on Tailwind CSS, which means no code is required, but you can also dive in deeper whenever you want.
When you export your site, you'll have a clean, static website you fully own and can host anywhere. That gives you the kind of control and flexibility that will free you from the whims of platforms that don't have your interests at heart. Elements is the next generation of the beloved RapidWeaver, but completely reimagined from the ground up. It's built exclusively for macOS and features an elegant native user interface.
The app comes from a team with 20 plus years of experience creating well-known Mac and iOS titles like Clear, Little Snapper, Ember, Courier, and Squash. Plus, it's updated every single week based on user feedback.
You can even follow the team's weekly dev diaries on YouTube to see what they're up to. The entire development process happens in the open, so users help shape the roadmap on the fly. Elements is made by indie developers for indie developers, designers, and makers who care about quality and control. Be sure to check out their library of ready-to-use templates, components, and the app's included Markdown CMS.
Visit elementsapp.io today to learn more and use the code MACSTORIES at checkout to save 10%. That's elementsapp.io and the code MACSTORIES to save 10% at checkout. Our thanks to Elements by RealMac Software for their support of App Stories.
All right. Mac OS. Mac OS Tahoe. So it's Mac OS 26, but it's also Tahoe. Yes. All right. So it's got dual names. That's an easier one to spell. And I appreciate that. Which one was not your favorite? You didn't like Sequoia.
Because, I don't know, for you Americans, it's like a problem to... Sequoia is very hard in my head to... I mean, I learned very quickly because I wrote that review and had to spell it out, like, millions of times. So, yeah, I got used to it. Do you remember all of the names? No, not in order. It's just too hard. Sequoia. So, going back in time: Sequoia, Sonoma. No, El Capitan is way back. No, I know. I remember Maverick. That's where I started. Remember Maverick?
Yeah, Mavericks. Nobody remembers. I was at WWDC for Mavericks. I think that was my first WWDC. I think so, yeah. I think I did like a Mavericks review myself on Mac Stories. I used to do macOS reviews before iOS ones. Times have changed. So anyway, what do you want to talk about macOS? I want to talk about Spotlight because...
It's the big one. It's really the big one. I have never really been much of a Spotlight user, because I've always felt like there were third-party apps that fulfilled my needs better. And that has been apps like LaunchBar or Alfred or Raycast in the past, and most recently it's been Raycast. But, you know, you wrote recently about Sky, right?
And Sky has some launcher qualities to it, but also is focused a lot more on automation. And where I think I'm going to kind of head towards, at least for the summer, is a combination of Spotlight and Sky because I want the Sky for some of the AI and automation tools. And I want Spotlight for some of the same things, frankly. Yeah.
but I just like where Spotlight has gone, because for a long time, Spotlight, you know, was able to find information for you on the web and launch your apps. There's a lot of things that it did, but now we're getting App Intents coming directly into Spotlight. And that's a big deal, I think, because one of the things that I've thought about with, you know, Apple Intelligence coming to Siri eventually is that I love that idea, but
I don't always want to be talking to Siri to do automation-type things. I mean, there are a variety of ways; it depends on what context you're in, whether you're walking around your house or sitting in front of your Mac. And having Spotlight, where, if my hands are already on the keyboard, I can very quickly trigger an automation based on the kinds of actions that we're familiar with already from Shortcuts, right?
but in the context of Spotlight is, I think, really powerful. I mean, it's a great mixture of automation in a new part of the OS that we haven't had before, and I think I'll be using that a lot. Yeah. And I think,
If you're a third-party developer, right? It creates this incentive, even more so than before, right? I mean, developers were already incentivized to use App Intents as much as possible, because it really is the API that keeps on giving. You know, it's everywhere. For years now, yeah. For years now. It's in Shortcuts, it's in Siri, Control Center, widgets, lock screen widgets. Lock screen, yeah, home screen. watchOS, like, literally everywhere. And now...
I am so fascinated by this idea that you're going to have all possible commands in Spotlight, not by going to a command gallery, like, say, you need to do with Raycast. But all the... And this is very much like the Apple Intelligence playbook right now. Apple is basically saying, we're letting AI features come to the apps that you use.
And I mean, probably that's a convenient thing when you don't have a large language model chatbot UI. But regardless, they're sort of like making the best of the situation. Yeah, and I think there's a lot to it, though, too. I really do. I mean, I've been using a lot of chatbots, and it is kind of a hassle to flip back and forth sometimes. A lot of that stuff you want in context, because there are apps that you and I have used that do have AI in context already, third-party apps.
whether it's, like, Shortwave for email or Zapier for automations. And it is more convenient to have that built into what you're using. And that's kind of the approach that Apple's taking with Spotlight and with other aspects of Apple Intelligence. And with Spotlight, they're saying, well, you can keep using your favorite apps,
you're just going to see the same commands that you see in Siri, that you see in shortcuts. You can just fill them out in Spotlight. And so it's the kind of UI that is very reminiscent of Raycast, how you can invoke a shortcut with either the full name or a quick key.
which by default is an acronym. Right, and it's automatically applied. Automatically applied, but you can customize it. So if you have a shortcut that is called, I don't know, Add New Reminder, it's going to be A-N-R. But if you want to make it faster, say just New Reminder, you can customize the quick key to be N-R.
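The default quick-key behavior described here, taking the first letter of each word in the shortcut's name, can be sketched in a few lines. This is just an illustration of the behavior as described on the show, not Apple's actual implementation:

```python
def quick_key(shortcut_name: str) -> str:
    """Derive a default quick key from a shortcut's name by taking the
    first letter of each word, e.g. 'Add New Reminder' -> 'ANR'."""
    return "".join(word[0] for word in shortcut_name.split()).upper()

print(quick_key("Add New Reminder"))  # → ANR
print(quick_key("New Reminder"))      # → NR
```

As discussed, the derived key is only a default; Spotlight lets you override it with a shorter custom key.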
So you can do that, and then you can tab through the parameters that are going to be empty. Which are just like the parameters you'd find in a shortcut action, right? Like if you expand a shortcut action and it has various parameters that you can fill out, this will be the same thing, but you'll tab through. Plus, there's some special syntax I know that like, I think it's the slash key maybe, is that right, Federico? I think it is. Where you can then...
Kind of send it to a particular app. Is that right? I think that's what we saw. And we also saw a demo... We saw a lot of demos this week, so our memory is kind of fuzzy. It is a little. Because we have a lot of... That was one from early yesterday, which, I mean, was only 24 hours ago. The combo of keynote, State of the Union, briefings, podcasts, demos...
all together with very little food and very little coffee and jet lag, it's not great. But I'm going to try my best. We saw a demo of that spotlight integration pulling in from the context of the currently active app.
Right. Which is sort of similar to what we saw with Sky, right? So I think it's fascinating that different teams at different companies are all sort of going in the same direction. Because, I mean, it's a good idea. When a good idea is a good idea, everybody's going to do it. Right, right. So I think...
I am so jealous that Spotlight is just on macOS. Yeah, it is a shame that it's not on the iPad. But, you know, you've got to have something to wish for for next year. Well, you've got to leave people wanting more. That is always the strategy, you know. Yes, it is. And this year it's Spotlight. But no, I really do think, and there's a bigger conversation maybe to be had about like third-party developers are going to continue making Apple products.
better because, I mean, Spotlight, sure, it's going to have actions from built-in apps, system apps, but it really shines when you connect that to your third-party apps. Yes, yes. And I don't think, like, launcher apps are really going to go away because of this sort of thing, personally. I mean, for instance, Raycast connects to a lot of web services, which I think it may end up doing better in the long run than Spotlight can do. But, you know, I
All that said, we haven't tried it yet. So that's kind of the caveat to all of this: we've tried very little so far, other than some of the iPad things I know you and I have poked around with a bit. Going just by my recollection at the moment, we got Live Activities on macOS 26,
which is nice. Yep. Obviously, Liquid Glass, which I think, of all the platforms, maybe macOS is the one that needs a little bit more work. It looks kind of janky. Everything looks... It's beta one. Everything, yeah. Yeah. I need to spend some time with it, and I'm not... I mean, one of the problems is, when you're here, it's like, I'm not going to put a beta on my Mac right now, because I've got some podcasts to produce and things like that, and I can't really afford any bugs that might come up.
I put it on my iPad, but I didn't put it anywhere else. I still have my phone as is. Same with the phone. Yeah. But that, I think, will be interesting. The Live Activities are nice because they integrate with iPhone Mirroring. So, say you're getting a Live Activity for an app like Flighty, and I'm tracking you coming in from Italy or something; I can then click on it, and
it'll open up iPhone Mirroring and get me right into the Flighty app, which is nice. You've got the tinting of the UI, similar to iOS from last year. The new clear look... There's a lot more translucency. A lot more translucency, especially on the sidebars. Do you remember anything else? Obviously... Well, there's the menu bar, too. The menu bar. Yeah, more customization available there. And the folder customization. So you can make
colored folders with icons and symbols, and they're going to sync with iCloud. Yeah, I like that. I mean, I've never been a person who's really tagged my files and folders in the past, but
And I have actually started doing that more recently because a lot of times when we're producing a podcast with video now, it's just a lot of files. And I'm trying to keep track of which ones are the ones that I want to use and which ones are kind of like the backups. So I'll sometimes tag them with green, like these are the ones I want to use. But now I can do that sort of thing with the folders, which I think visually will make them stand out even more and be a nice way to kind of
separate different kinds of content that you're using for whatever project you have.
And obviously, when it comes to macOS specifically, you've got to talk about Shortcuts and personal automations. Yes. Well, I don't think they're even called personal automations; they're just called automations. Automations. And so this has been one that I've wanted for a very long time, because what it'll allow you to do is... And there have been a lot of third parties who have filled this gap. However, it'll allow you to run a shortcut on a schedule on a Mac, which
Makes a lot of sense, especially for desktop Macs, which a lot of people will have a desktop Mac that's on all day while they're working or even 24 hours a day. And you'll be able to do it on a timer like that. You'll be able to do it, for instance, if you connect a display. So maybe I have my Mac normally connected to my studio display, but I...
plug in my iPad Pro and I want a different setup for that. I can use a shortcut to kind of customize that. All those kinds of automations, I think, are going to be really powerful on the Mac. And it even kind of Sherlocks some third-party apps, like, for example, Hazel, right? Oh, right. I hadn't thought of that. Yeah, you're right. We should mention that, because the files and folder stuff is really great. You can run a shortcut when a new file drops in a folder. So, like, those...
You know, going all the way back to folder actions from macOS to more recent stuff like Hazel, for example. But now you can just do that with shortcuts. Yeah, and that's the kind of thing that I really value because a lot of times I'll be post-processing audio or something, and it takes a while. It might take 15, 20 minutes.
And instead of having to kind of keep an eye on the app and see where the progress bar is, I can just set up a shortcut to do something as simple as sending me an alert when that file hits a particular folder. Or, if the next app in the toolchain supports Shortcuts, I could
send it directly there and automate the process of starting it. So I'm not kind of in that position where it's like I got deep into something else. I totally forgot that I was processing this audio. Half hour passes and now I could have been completely done with the processing, but I have to move on to the next stage. So I think that that's going to be really good for a lot of workflows. Yeah, yeah.
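The folder-trigger idea they're describing — run something automatically when a new file lands in a folder — can be sketched outside of Shortcuts, too. Here's a minimal, hypothetical Python sketch of the concept (the `FolderWatcher` class and its polling approach are illustrative, not Apple's API):

```python
import time
from pathlib import Path

class FolderWatcher:
    """Tracks a folder and reports files that appear after the
    watcher is created -- a stand-in for the kind of folder
    automation Shortcuts now offers natively on the Mac."""

    def __init__(self, folder):
        self.folder = Path(folder)
        # Remember what already exists, so only new arrivals count.
        self.seen = {p.name for p in self.folder.iterdir()}

    def poll(self):
        """Return paths of files added since the last poll."""
        new = [p for p in sorted(self.folder.iterdir())
               if p.name not in self.seen]
        self.seen.update(p.name for p in new)
        return new

    def run(self, on_new_file, interval=1.0):
        """Loop forever, invoking `on_new_file` for each new file --
        e.g. to alert yourself when processed audio lands."""
        while True:
            for p in self.poll():
                on_new_file(p)
            time.sleep(interval)
```

In practice, `on_new_file` would be whatever the next deterministic step is — an alert, or handing the file to the next app in the chain.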
We should move on to iOS and iPadOS and also mention Apple Intelligence. And I mean, I'm just, you know, going to rip the bandaid off, which is not really a bandaid, because I'm so happy. Oh, I know you are. Yeah. Like everything that I've seen on iPadOS, I've got it on my big iPad Pro, and we've asked a lot of questions, and all the answers, I think, we're pretty happy with.
There have been a lot of great changes, big and small. Yeah. And there's going to be more from me on iPadOS in the near future. Got a lot of stuff to cover coming out soon. What's your favorite?
It's hard. Hard to pick. Look, my favorite is actually the story behind it. Yeah, I get it. The fact that they did this. The narrative is my favorite. Sure, all the features combined are great, but it's the recognition of...
We understand that the iPad is a... They actually said it in the keynote. What was it? It's our most... Our most flexible and portable device. And it's a really hard balancing act, right? Because there are people who really want... And there are certain models of iPad that maybe lend themselves to this more than others, where you just...
You just want the full screen app because, you know, you're watching video, you're reading a book, whatever you're doing. But then there's the iPad Pro, where you're really using it a lot more like a laptop, a Mac replacement. Yeah. And that, like, that has to be my favorite thing, because look, it's easier to design something that has a very specific user and a very specific use case in mind, right? Like if you're...
I mean, obviously all kinds of design is hard and it's easy to talk about design. It's much harder to do design. Yes. But it's even more complicated, I guess, what I'm trying to say is when something can be used by a child or a grandparent, but at the same time can also be used by a YouTuber or a podcaster, or like when you have this incredible spectrum of users, right?
and different activities and different requirements, how do you design for not just one point of the spectrum, but the entirety? All of them. All of them. Not just one end or the other or everything in between. And you're right. I mean, some of the things you do with an iPad are incredibly simple. You know, you tap play and you watch a video. Or, you know, you're trying to record a podcast like you've tried for years. And now you've got the tools to do that a lot better. Yeah.
And so the fact that Apple actually acknowledged, like, we understand that we need to do this and the two natures of the iPad need to exist at the same time. And we think we can do a good job at balancing them. That is my favorite. Like, the narrative behind it is my favorite. But like, obviously, like, you know, having unlimited windows, freely resizable windows, place them anywhere, that's...
They specifically called out, you know, for all the podcasters out there, being able to record local audio and local video when you're doing, like, a Zoom call. That has to be... It's the sort of thing that I mentioned, like, going back how many years ago?
Probably in 2018, I think I did my first story. Me and Jason Snell were both trying to solve this problem of like, how do you record a podcast on the iPad? And it involved all of this equipment and cables and USB modes. Right. And now you can just do it from control center. Yeah, exactly. And that's like really interesting to me is that like, you know, you did solve the problem, but you solved it.
But at what cost? Literally, at what cost? Yes, expensive hardware that was outboard and doing the recording separately from the iPad itself. And now all of that is happening internally. And all it really is is a tap of a button in Control Center. It's a toggle in Control Center that says Local Capture. Right. And it seems like third-party apps can control it. Like, Apple is going to have a default, like HEVC video, and it's going to be audio and video together. Right.
But then I think also third-party apps can control like what... Like the file format. The file format, what quality, the audio track and all that stuff. So that's cool. And then of course, I mean...
The new design, I think it looks interesting. I'll have to see how it sort of evolves over time. But just in general, the overall iPad story of Apple saying, you know, took us a while, but we think we now have this compelling narrative for one, the same device can be both simple and intuitive in full screen with touch, right?
but it can also be more complex for the people that need it without having to run Mac OS. That is not what they said, but it's basically, you know, reading between the lines. Yeah. And there's a lot of other little elements that play into that. The fact that there are a lot of gestures that are built in that allow you to tile your windows. So you're not always just precisely dragging them into a particular, you know, size. And it remembers those things between sessions, between reboots.
Then there's the whole menuing system at the top, which is just going to kind of come for free for developers who had already implemented the keyboard shortcuts and system that was existing already. But I think we ought to shout out the Files app because to me, Files app is a really big deal because coming from the Mac predominantly myself, one of the things that always felt like a huge point of friction for me was the lack of ways to
sort my files. You know, there are always a few categories, but now the list is much longer in terms of, you know, creation date, modification date, size, file type. There's quite a few of them now. Yeah. It's really nice to see more stuff in Files, and hopefully it's more reliable. And you can put folders in your Dock too, which is good. You know, you can drag and drop items out of the folder in the Dock, and...
Yeah, so we'll... - I got one for you. Did you know, you know, we've seen a demo that when you tap on a folder in the Dock, it fans out kind of like the old style on the Mac, but you can also do a grid. Did you know that was an option? - No. - I learned that today from Dan Warren while we were sitting outside at the visitor center just before this. - Interesting. - Dan showed it to me. - What do you want to talk about on iOS? I mean, beyond Liquid Glass, obviously the new design. Like obviously the new design was the star of the show. - Yeah, it really was.
And this is going to be one of those polarizing things this summer. It probably will be. I mean, I think, because the way things look is one of those things where you run into the friction of people not wanting change. I generally like it. I mean, I think...
it's a good direction. I like, uh... On the iPhone, I love that you can take a photo, whether you took it in spatial or not, and make it spatial. You and I have seen that on a lock screen. We've seen it in the Photos app. It looks
really good. You just need to have a photo that has enough definition around the subject compared to the background. You need to have a good photo. Yeah, you have to have a good photo. You have to have a good, sharp photo. But, you know, I think that that'll be really, really good. I do like how the lock screen, how the
clock kind of changes depending on the photo you put on there. For instance, it'll get taller or shorter, kind of nestle behind it. I'm also a big fan of the way the play controls kind of disappear into the bottom when you're playing in the Music app. Yeah, those are all good. Yeah. I think there's obviously going to be a lot of iteration on design this summer, if only because this is typically what Apple does: they go for the most extreme version of anything in the first redesign. But that's what feedback is for, right? That's why we're doing this beta process in the summer. And to be clear, I don't have this on my phone yet. So I don't really have, like... I mean, I've held a phone and seen it in action in a limited way. I don't have a...
totally clear opinion about the design on the iPhone yet, because I haven't lived with it enough. Me neither. But there are other features in iOS beyond the redesign. The phone stuff.
I'm not a phone person, but I really appreciate the phone features because they'll all be in one place, kind of. The reality is that we're not phone people anymore, but the world is forcing us to be in many ways. Especially with these spam calls that we're getting. And it's a problem everywhere, right? Oh, I get them every day. Yeah, same, same. It's a problem everywhere, in every region. There are robocalls just happening everywhere. And what I think is interesting is that iOS is going to...
basically have a little agent that talks on the phone for you, and you will be able to see the transcript, and you will be able to type out a response, or you can just let the thing go off on its own for several minutes. It's going to be able to detect hold music.
So it's going to be able to put that phone call in the background when there's music being detected. And when a real support agent comes on, you will be able to pick up... It rings you back, essentially. It tells the person, wait a second. And then you come back. And I don't know about you, but a lot of times...
Just before I came here, I had to... I forget what it was. I had to call up, you know, a doctor's office or something. And I was on hold for a very long time. And I just kind of, like, propped my phone at the base of my Mac and tried to keep working. And you've got to keep hearing the music, though, right? Right. And you have it on speaker and you have it turned up, and it's kind of distracting you from whatever you're doing. I think that this will be a really nice way to deal with those kinds of calls. Yeah.
In Messages, we got some, I mean, let's face it, WhatsApp-inspired features. Backgrounds. Backgrounds, which apply to the whole conversation, like, to the whole group. And anybody can change them, too. Anybody can change them. So we're going to have some battles, I think. We're going to have some battles, and you will be able to do polls. Yes.
which is another WhatsApp or Telegram feature. Now you will be able to do polls in iMessage. All nice changes, I would say. And also typing indicators in group chats. Yeah, that's nice. And then you'll be able to see exactly who it is that's typing, whether it's Federico or me or whoever. Photos. Mm-hmm.
Unfortunately, the redesign from last year didn't stick. I loved it. I did too. I guess enough people complained. I even liked the very first version. I loved the first version with the carousel at the top. Right, right.
I guess we were not in the majority of people who preferred having a grid of photos by default. So photos is going back to having the grid, but then there's going to be just one tab, which is collections. So basically all the fancy stuff that Apple was doing... It's in collections now, right? It's in collections. Yeah, which is all right. I mean, you know, I can see that a lot of people...
I do use the grid a lot just because... I mean, it's weird, unusual: because of screenshots, right? You just want to get to the most recent thing in your grid. But, uh, I know, like, from my family, they really liked the collections, especially the, like, On This Day, and it'll be, you know, photos of a vacation maybe you took a year or two ago. Yeah.
Safari and the redesign from iOS 15. I haven't looked at that much. What do you think? Yeah. I've only seen it in, I don't have it on my phone. Yeah. I haven't really looked at it yet. It looks to me like it's better than it used to be five years ago.
In the betas. Five years ago, there was a very big redesign, and then it got kind of walked back. But that design with the floating tab bar at the bottom is back. Now, floating tab bars are back in a bunch of places. They are. And I actually, I like it. In Music, it looks nice. Yeah, that's one of the things that I really want to check out this summer, like, spend time with Music and
spend time with, like... There are in Notes, I think, aren't there? No, Notes doesn't have tabs. Photos, Clock, Voice Memos, like, all these apps that have tab bars on the bottom. They're going to have search at the bottom. They're going to have this new style for tabs. And we'll also see when and if Apple sort of changes the look based on feedback. But yeah, Photos is going to have the grid by default.
And now we only have about 10 minutes or so left, and I wanted to talk about Apple Intelligence. Yeah, we should. So there's no Siri LLM. There's no chatbot UI. But there are a couple of key things that I want to talk about. Okay. The first one is Apple is opening up the foundation models to developers. Right. And users, in a couple of different ways. If you're a developer, there's an Apple Foundation Models framework. Right.
They published a new blog post. I was reading the blog post. Hopefully there's going to be a paper. Is this on the machine learning blog? This is on the machine learning blog. There's no paper, like an actual white paper, yet. Hopefully there will be. It's just a post right now. And I skimmed it last night. I was quite tired, but I was spending time reading it.
This is what he does after we go to dinner and it's, like, almost midnight. This is what I do. I read about parameters and LLMs. So they have an updated version of the AFM on-device model, the small one, 3 billion parameters, which compares favorably to some of these small Qwen models, for example. And they have a bigger version, which is the Apple Foundation Model on server, that runs on Private Cloud Compute.
But the thing is, it's based on a new design; it's called Parallel Track Mixture of Experts. Okay.
Now, the MoE, the Mixture of Experts, is not a new thing. We saw it in, I believe, Llama 4 and Qwen. Those models were based on the same architecture. Well, you're not activating all of the parameters. I think Claude might do something similar. I think Claude is also doing MoE. You're not activating all of the parameters of the model at once. You're basically chunking the input from the user into multiple tracks, and you're activating just the necessary parameters.
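The sparse activation being described here is easy to see in a toy sketch. Everything below is illustrative — the expert functions, router scores, and top-k choice are made-up numbers with no relation to Apple's actual Parallel Track MoE design — but it shows the core idea: a router scores all experts, only the top-k are evaluated, so only a fraction of the model's parameters is active for a given input:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, router_scores, k=2):
    """Toy mixture-of-experts step: route input `x` to the top-k
    experts and blend their outputs by normalized router weight.
    Experts outside the top-k are never evaluated, which is the
    source of MoE's compute savings."""
    top = sorted(range(len(experts)),
                 key=lambda i: router_scores[i], reverse=True)[:k]
    weights = softmax([router_scores[i] for i in top])
    y = 0.0
    for w, i in zip(weights, top):
        y += w * experts[i](x)
    return y, top

# Four tiny "experts"; each is just a scalar function here.
experts = [lambda x: x + 1, lambda x: 2 * x,
           lambda x: x * x, lambda x: -x]
y, active = moe_forward(3.0, experts,
                        router_scores=[0.1, 2.0, 0.5, 1.5], k=2)
# Only the two highest-scored experts run; the other two stay idle.
```

A real model does this per token with learned routers and huge expert networks, but the activated-subset logic is the same.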
And the AFM on server is about 17 billion active parameters at once, which is about the same size as Llama 4 Scout, which is a small model. If you ask me, I've tested the responses from Apple's Private Cloud Compute, and I prefer its responses to Llama 4 Scout's. I really do think Llama 4 Scout is a bad model. Anyway.
So developers can use those models in their apps. They can prompt the model. Yeah, that's something that I don't think you and I expected, right? We thought maybe there would be guardrails built in. I mean, there are guardrails obviously built into these models, but guardrails in the sense of
very specific use cases based on APIs that would funnel the kinds of things you could do with the models. Whereas what you can do in code now, a developer can write a custom prompt that's very specific to the kind of thing they're doing in their app. Yeah. And they can do that in Swift. And I think it's fine. I was not expecting Apple to just say, yeah, you know what? Just prompt the thing. Yeah.
It can even create content. Like, there was a demo we saw where it was, like, a travel app, and the person asked for an itinerary for a location, and the prompt went and found a bunch of information about that location and created a nice little summary of what, you know, you might want to do when you visit. Yeah. A lot of changes to visual intelligence.
So you're going to be able to take a screenshot of anything. And if you want, you will be able to use visual intelligence for anything on screen, essentially. Instead of just using the camera, you can just take a screenshot, select a whole page, and then ask ChatGPT or something, for example. Or you can highlight...
something on your screen, and you can either use Google search or compatible image search endpoints from any of your installed apps that support this feature. So they showed Etsy, for example, having that integration. - Yeah, you just select an image and find something similar on Etsy. - Developers will be able to have Swift Assist, finally. After a year, it's back,
and they will be able to choose any model they want. So by default, you're going to have ChatGPT in Xcode for code completion and basically Cursor-level stuff.
But I saw that there's a preference window where you can actually put in any API endpoint and your own API key. Did we see LM Studio even? We saw support for LM Studio if you have on-device local models, or you can just put in your Anthropic API key and use Claude 4 Opus. Which is a very good coding model. Very good model, yeah.
Here's a really geeky developer thing I want to mention before we go: there is a new API for speech-to-text, for doing transcription. Yes, I actually saved it in my notes. Big shout-out to the... No, it's a collaborative effort, I think, from the Notes team, from FaceTime. It's like a cross-team effort at Apple. And they have this technology. It's called SpeechAnalyzer. Okay.
where basically it's doing, I always get tripped up, it's speech to text. Yeah, speech to text, not the other way. So it's similar in a way to the open source model, Whisper. Yeah.
Yeah, and it's doing that. But from what I've heard, it's significantly better than Whisper, because Whisper is at least a few years old now. It hasn't been updated in a while. But to sort of wrap things up with Apple Intelligence and the show, I wanted to call out Apple Intelligence in Shortcuts, which has to be my favorite geeky announcement. In Shortcuts, you actually have this new action called Use Model.
And Use Model lets you choose between three things. You can use ChatGPT, or you can use the on-device model, or you can call the Private Cloud Compute model directly. Get the bigger model. And it's not just that, because I was doing some research this morning, some tests, and then I did some reading. The fascinating thing here is that Apple trained this model and this action to actually
To actually understand Shortcuts variables. Right. To actually understand... Explain what you mean by that. Yes. Because we talked about this this morning over breakfast, and it's kind of hard to explain on a podcast. But you have an example. I think you can probably use one of the examples. I'll use an example, and there are more examples that I can mention. So I put together this very simple proof of concept. Two actions. The first one is: get all my notes from this folder. I have a folder in Apple Notes
with some notes inside the folder. And then the use model action. And you can prompt the model in the use model action. And there's a text field where you can write your prompt. And I asked, hey, you're going to take a look at a folder of notes.
And you've got to tell me which one is relevant to what I'm asking you to find. And later I asked, find me the note about John. Because I have two notes. One was called something-something Chris, and the other was something-something John. And I asked the model, tell me which one is about John, but give it back to me as an actual note, like, as a native Shortcuts variable representing the note.
And it did that. Like, when it gives you the response, it doesn't give you, like, the title of the note, like ChatGPT would do, for example. Right. And it doesn't just give you the text from inside the note pasted into, like, a text field. It's the actual object. A variable with a thumbnail. It's like the object in Shortcuts that contains... That represents a note. That represents a note, that contains additional
variables inside, like, oh, what's the title? What's the body text? And what's the formatting, and all the things that go into a note. Yeah. And I was doing more reading about this, and it appears that Notes is just one of the many types that are supported. For example, there's native understanding of calendar events, reminders, Safari web pages. Even things like, the model knows if it's inside a repeat loop, or if
it's going to create lists. Lists and dictionaries, too, right? Dictionaries. So there are all kinds of things there, and to me, this feels like validation for this idea of what I personally have been referring to as hybrid automation. This idea of combining the best of classic automation and classic scripting
That's very deterministic. Very deterministic. You run something from top to bottom and there's going to be a logical flow and it's going to follow that flow. But what if you mix in the middle of it
a large language model, which is by nature, non-deterministic. It's going to give you responses and every time they're going to be different. But what if you combine the two for natural language understanding and deterministic output? And this is exactly what Apple is doing now. And I couldn't be happier because it's like, it opens the door for so many more possibilities in shortcuts, right? And I think it's going to be exciting. Yeah, I think it's going to be really, really exciting.
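That hybrid pattern — deterministic steps wrapped around a natural-language step that returns a typed object instead of prose — can be sketched generically. This is a hypothetical illustration, not Shortcuts code: `pick_note` stands in for the Use Model action, and the toy keyword match stands in for a real, non-deterministic LLM:

```python
from dataclasses import dataclass

@dataclass
class Note:
    """A typed stand-in for the Shortcuts note variable: the model's
    answer comes back as this object, not as loose text."""
    title: str
    body: str

def pick_note(notes, query):
    """Stand-in for the 'Use Model' step: given notes and a
    natural-language query, return the matching Note object
    (here, a toy keyword match instead of a real LLM)."""
    for note in notes:
        if query.lower() in note.title.lower():
            return note
    return None

# Deterministic step 1: gather the notes (like "get notes in folder").
notes = [Note("Meeting with Chris", "Agenda..."),
         Note("Podcast prep with John", "Topics...")]

# Model step: natural-language selection, typed output.
match = pick_note(notes, "John")

# Deterministic step 2: downstream actions can read the note's
# structured fields directly, because it's an object, not prose.
if match is not None:
    summary = f"{match.title}: {match.body}"
```

The point of the typed return value is exactly what's described above: the deterministic steps after the model can keep using title, body, and other fields as structured data rather than parsing free text.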
Federico, I think that's it. I think that's the show right there. We covered everything. We got everything. Wow. That's a lot. That's a lot. Well, shout-out to the improved coffee situation, compared to the last time I was here. Oh, you think the coffee's even better? The espresso is much better. Oh, all right. All right. It was good coffee. I had a great coffee this morning, too. I had an Americano myself, you know, because I'm an Americano. Yeah.
I'm from here. And obviously, thanks to Apple for letting us do this thing. Yeah, yeah. It's been a lot of fun. There's a lot of work that goes on behind the scenes. There are a lot of people. There are. There are people here with us. You can't see them. It's a whole operation we've got going on. It is. And thanks, too, to our sponsors.
Click for Sonos and Elements by RealMac Software for supporting our coverage at WWDC and doing these podcasts. So it's a big thanks to them. And we'll be back in another week with kind of more of a regular show. We won't have all these fans. In our respective countries. Respective countries. No big mural behind us. But, you know, it'll still be good. All right. Talk to you later, Federico. Ciao, John.