Hello and welcome to Connected episode 545 for March 26, 2025. Sometimes we say the date, sometimes we don't. It's kind of confusing. My name is Stephen Hackett and I'm joined by my friend and yours, our annual chairman, Mr. Federico Viticci. Oh, thank you. Thank you. Hello. Hi, Stephen. How are you? I am good. You doing all right?
I am doing fantastic today. Good. I'm really, really happy to do this show with you. I'm bringing a fun topic that we'll talk about. Actually, two fun topics that we'll talk about later. So yeah, I'm good. Yeah, you got some great stuff in here. But we're going to start with follow-up because that's what you do at the beginning of a tech podcast in our universe. That's right.
Talking about the rumored design of the next generation of iPhones with that camera bar across the top of the pro phones, Evan and many others wrote in. I'm giving Evan credit because Evan was first. Okay. Evan said, I've always assumed, but haven't heard podcasts mention it, that the extra space will be filled with camera improvements. The telephoto lens in particular would benefit from a deeper area to run its tetraprism design across the
I don't see Apple adding extra weight and dimension to the phone needlessly. And this does, this is me, not Evan. There is a rumor that the telephoto camera on the pros will go to 48 megapixels up from the 12 that it is now, which would be awesome. And so maybe they need that extra space to,
For like the 48 megapixel, I know I said it wrong, so I'm just staying with it now. That sensor will be physically bigger than the 12, I think. And so maybe they've got to shift it over or maybe they want to have even further zoom. But in addition to the area, like it gives them depth, right? Like the depth of the phone is really what limits the cameras. That's why we have...
camera bumps to begin with. And so maybe this gives them more space in there. But I do agree with Evan. I don't think they would do this just as a, like, just as a design feature. Like, if these phones come out and iFixit cracks it open on day two and there's, like, nothing in there, or, like, there's no clear reason that that camera bar exists on the pro phones, I would be surprised.
So, but help me understand something. So are we thinking that the tetraprism design will, like, when, when Evan mentioned runs across the area, like, are we saying that Apple is going to extend that camera system, like, horizontally across the surface of the camera bump? Like, is that the idea? I think so. I think so. Like a periscope lens. Huh? Yeah.
Well, that's something I was not thinking about. Yeah. That's interesting. You can do that? Maybe. I mean, I think some of the other phones that have the periscope lens, like, it's more, you know, sort of long and skinny. The tetraprism that Apple designed, you know, the light bounces around a few times.
And it may be that if they stretch that design out and it has fewer bounces or turns of the light, maybe the quality is better. Maybe the layout just has to change if they go up in density on the sensor. But it's definitely interesting to consider what Apple could do with that extra volume in the phone. Hmm. Interesting. Well, I was not considering this option. Hmm. Okay. Yeah.
And then we have to talk about the iPhone Mesa. I'm sorry. An anonymous listener wrote in and said, I worked on some Apple commercials a few years back. At that time, Apple internally referred to the physical camera area of the phone as the, quote, camera plateau, when discussing how they wanted the product to appear.
This name, of course, never appears in public-facing documentation, but it was their, quote, official name inside the company. So my question to you, Federico: Plateau or Mesa? Plateau, because more people understand it, because it's also the name of the Great Plateau in The Legend of Zelda: Breath of the Wild.
So I think Mesa is just a name that, you know, people want to use because they think they sound fancy. You know, a lot of people in the Apple community have Apple marketing syndrome in which they think they work at Apple marketing and they can come up with nicknames because they have been imbued with the culture of Apple marketing. Plateau is a name that more people understand. And I think it's a more reasonable name. Okay. Yeah.
I love the feedback. That's awesome. We got some follow out. That's where we tell you to go listen to other shows when you're done with this one. Episode 428 of App Stories was a special one. Will you tell us about it?
Yeah, we had, you know, this guy, Mark Gurman. I don't know if you've heard of him. I have. We're going to talk about him again in a minute. The sheriff has been, you know, on App Stories. We talked to Mark about his Apple reporting career, how he got started, etc.
You may not know that Mark actually started making dashboard widgets. That's really cool. When he was in high school. And then obviously, you know, when he started reporting rumors for 9to5Mac and the move to Bloomberg in 2016 and sort of like what the Apple rumor industry looks like today, you know, all those things. And I think it's been an interesting conversation with Mark, especially...
Sort of talking to the person instead of the byline that you see on Bloomberg necessarily, like actually getting to ask Mark, how do you actually get your work done? How do you make a decision as to whether you want to report on something that you heard or not?
And I asked Mark, hey, what do you think of this new generation of leakers that you see? It was a fun conversation, and so, yeah, I'm really happy that we were able to have that episode with Mark. Yeah, I thought it was great. I really enjoyed, really enjoyed the interview. I think what was most interesting to me is that he writes in the CMS. Like, Mark, don't do that. That's a terrible idea. Like, don't
Don't write in the CMS. I was also surprised that he's been at Bloomberg for nine years. Like, that feels new to me. You know? I know, right? It feels like, I don't know, three years, but no, it's... Yeah. But no, it was great. We're going to talk about some more Gurman stuff in a second. But congratulations on the interview. You did a great job with it.
I was on Upgrade episode 556 that came out on Monday. Mike, of course, is still on paternity leave, and so Jason has had people filling in. I gotta say, I love being on Upgrade. Like, Jason and I don't get to talk about tech very often in public. We, I mean, we talk about it privately. And Upgrade is such a great show. And so we talked about a bunch of stuff. He's been on a bit of a journey with some TV streaming stuff. We talked about that.
I've talked about some rumors and some legal stuff. It was a lot of fun. So Upgrade 556 and App Stories 428 both get the seal of approval this week. Yeah. Thank you. This episode of Connected is made possible by Ecamm Live, the leading video production and live streaming studio built for the Mac.
Ecamm is great at simplifying your workflow because you can do it all in the Ecamm app. Get started quickly and have everything on hand to create whatever you need to with video. It's great for streaming, recording, podcasting, and presenting. If you want to stand out from the crowd, you need high quality video and Ecamm delivers. With Ecamm, you can screen share, use multiple cameras, and direct the show in real time with their live camera switcher.
Plus you can add logos, titles, lower thirds and graphics, drop in video clips, bring on interview guests, use a green screen and so much more. Ecamm Live truly does it all. And my favorite feature is that the UI makes it easy to manage all of these things. Doing video stuff can be really complicated when you have a bunch of sources and Ecamm has thought about it. They've made it easy to use for a Mac user because it is built as a native Mac application, which I love, of course.
Ecamm's members are entrepreneurs, marketing professionals, podcasters, educators, musicians, church leaders, bloggers, and content creators of all kinds. And if you're on the pro level plan, you can enjoy Ecamm for Zoom. Get this, you can automatically send Ecamm Live's audio and video output into a Zoom meeting, webinar, or event and add up to eight Zoom participants as camera sources in your broadcast or recording.
Plus, you can automatically create individual participant audio and video recordings and add Zoom chat messages to your broadcast or recording as text overlays.
Get one month free by visiting ecamm.com slash connected and use the code connected. That's one month free of Ecamm Live at Ecamm, E-C-A-M-M, ecamm.com slash connected with the code connected at checkout. Go there now and check it out. Our thanks to Ecamm for their support of the show and all of Relay.
It is the end of March, and that means WWDC dates are here. Apple Newsroom yesterday announced that WWDC 25 will be the week of June 9th. So the second full week of June running through the 13th. And it is going to follow the template set kind of in this post-COVID era, right? So the keynote and a few specific things on campus.
For a small number of people, but the rest of the conference will be, will be online. And there was some talk. I don't think we got to it on the show. It was like one of those things that was, like, in my notes and just never made it. But a feeling of, like, if Apple was going to return to the in-person conference, like, this year may be a really good year for it. Like, vibe-wise, like, to kind of bolster the community. Yeah.
They did not take advantage of that. But it's going to be the way it's been. And don't get me wrong. There are lots of positives, I think, to the current sort of version of WWDC. The main one being that it's accessible to everyone and the content's out there basically immediately for everybody, right? They drop a bunch of
videos at a set time each morning. And then if you're a developer, I think they've used Slack in the past for like one-on-ones and you can meet with Apple experts. So yeah, June 9th through 13th, we're going to talk in a minute about some of the things we expect to be there. But of course, we'll be doing our Rickies episode on June 4th. So be sure to take off work in preparation for that. Prepare yourselves for the Rickies.
But the question for me is, what are our plans? Are you going to go? I don't know. I don't know because I would love to go and see people. See people like you or John or developers or folks that I know at Apple. Honestly, my main issue right now is the idea of
of traveling to the United States as a European. Yeah, it's complicated. And the idea of potentially someone, you know, at the border, you know, looking at my social media history and determining, oh, we think this guy is not a friend of the US government and potentially something happening to me because of that.
Like, it doesn't sound great, you know? I totally get it. I don't want to be arrested for doing nothing. My tweets. You know, it goes without saying, like, I love traveling to the United States. I love what I've seen in the United States. I have my closest friend, my business partner, they're all in the United States. We run an American company. We pay taxes in America.
And yet here we are. It doesn't sound like a great prospect right now to just say blindly, yeah, I'm going to America. Yeah. You know, sight unseen. Yeah. I don't know. I don't know.
No, I totally get it. And if you are part of the population that our current government doesn't like, it's genuinely scary. So I totally get that. I've had that conversation with several people that don't feel comfortable coming in right now.
For me, I'm already here. And my plan is to go if I have a press invitation, which I had for years. I did not get one last year, which was disappointing. So I have a refundable hotel room, but no flights. And if Apple PR smiles upon me, my plan is to be there. But if not, I will cover it from home. And that's the plan. And Mike's busy. Mike's got a baby. He's not going. Mm-hmm.
But that all said, it's kind of dumb to read too much into Apple invite art, but let's do it. I think we should do it. Yes, we are dumb. Yes, okay. We can just say it. So you look at the WWDC 2025 logo, right? There is a colorful element, right? And there is a 25 logo.
the number two and the number five, with a sort of frosted glass light appearance going on.
basically almost looks, and it's got a bit of depth, it's got a bit of shadows, it looks like the light mode equivalent of the Vision OS UI. And I'll give you one more, Stephen. I'll give you one more. If you zoom in on the WWDC 2025 logo and you zoom in on the numbers, okay? Those are actually, zoom in on the numbers, those are actually 3D, okay? You see those are actually 3D objects. Yeah, there's like a lip on the left side.
Yes, and pay especially close attention to the number two. It reflects the colors of the WWDC logo next to it. Yes, it does. Yeah, and you know what's the other operating system that sort of has windows that reflect and cast off light and shadows? Vista.
Wait, not Vista. No, absolutely correct. You got it. It's Windows Vista. Yeah, yeah. So, I mean, it's dumb to try and predict where Apple is going design-wise by looking at a logo for WWDC. But then again... But then again... But then again, why not? It's right there. It's right there. I got a couple of things to add. I hate how they squished the two Ws together.
Like it doesn't even look like a letter. It's like three Vs. V, V, V, D, C. You know? Let's go. V, V. Oh, wait. It's V, V, V. That's like an... V, V, V is an old, like, Roman history thing. Isn't it the thing that Caesar said? Like, veni, vidi, vici? Which means I came, I saw, I conquered? I think so, yeah. Yeah. So it's...
I came, I saw, I conquered the developer conference. Is that the message Apple wants to send to the community right now? I'm just asking. Also, we're just going to talk over it, but my generator is running. So if you can hear it, that's why we recorded late the other day. There's a guy here working on it. So I'm sorry. I can't do anything about that. The...
Yeah, the reflection stuff is super interesting. And I wouldn't normally read into it, except that we have all these rumors that macOS and iOS are going to have some visionOS sort of mixed into them. And I think that's really interesting. And that brings us to the dueling rumors. There are few things I love more in covering Apple than having...
People who report rumors disagree with each other. It's just so much fun because you get to pick sides and see who's going to win. I love it. So let me walk you through the timeline. Months ago, Jon Prosser, remember him? He shaved his eyebrows when he was wrong about something. And I think he was the one who said there were going to be Steve Jobs limited edition Vision glasses or something. Sorry, I think you mean Heritage Edition. Yes, I'm sorry. Yes. Yeah. Yeah.
That'll be in the show notes because we've got to be thorough. So Prosser for a while has said that this is going to be in the works. And he had some mock-ups early on that honestly I liked the way they looked. And he then on his podcast had some screenshots or like – it looked like it was actually running on a phone potentially. It's kind of unclear. Yeah.
of, particularly, the Messages app, and there's sort of, like, this rounded rect around the keyboard, and the buttons have circles around them, so buttons look like buttons again. But honestly, when I saw it, I was like, that's honestly a little disappointing. Like, that's not sort of the big sweeping change that Gurman said was going to be coming. But
And then Gurman comes out and was like, hey, those designs that are, quote, floating around are very old builds or based on like descriptions someone heard and aren't really what Apple is going to do. And on his Discord, he said he, referring to Prosser, either has very old screenshots or hasn't seen the real thing. So, you know, shots fired.
But I think you put all this together, like, I mean, I don't think you would lose points in the, in the Rickies to say that, hey, there's going to be a redesign coming. Yeah. I mean, that seems pretty, pretty obvious at this point. Also, Gurman didn't exactly say, like, he didn't say, oh, those screenshots are completely wrong. Right. So I,
I mean, to me, this looks like a mock-up. And, you know, the video that Prosser did, like, he was talking to his co-host, being like, hey, do you want to see a screenshot of iOS 19? Like, that's not a screenshot. I'm willing to bet that it's not, like, an actual screenshot. You can obviously tell that it's a mock-up. The QuickType corrections above the keyboard say Front Page Tech, which is the name of Prosser's channel. So it is a mock-up. It's not a screenshot.
But Gurman is not saying, no, no, this is completely bonkers. This is totally wrong. It's not what it's going to look like. So I think the next few months are going to be interesting. I want to see if Apple in 2025, especially when it comes to software, is a company that leaks more than before.
That's something that I want to keep an eye on. A company under pressure, like it or not, do they have more people talking to external folks than in the past? We'll see. I mean, do you remember back with iOS 7, like the photos icon leaked like right before? Maybe even by Gurman. I'm trying to Google as I talk, but I think we're going to see more of this before it's over. Yeah.
Yeah. Excuse me. It was iDownloadBlog. So Sonny Dickson had it. Okay. Got it. Huh. So I think we will probably see more of this before the time comes, but I'm, I'm excited about it. I mean, there's lots of things to kind of be anxious about right now in the Apple world, but,
But assuming they don't blow it on the Mac in particular, like I like the way this looks. I think adding some depth and honestly making buttons look like buttons, like some of that could be really good on the iPhone and iPad in particular. So like I'm excited. Yeah, yeah, me too. All right, Stephen, I have a couple of topics that are part of sort of me changing as a computer user, which is a broader thing that,
You will hear more and more on this show and on Mac Stories between now and before Mike comes back. Yeah. This is going to be a thing. All right. And there's going to be multiple things. And today I'm bringing two of them. So the first one is that for the past month or so, I've been using a Mac mini server.
This is not mine. I don't own this computer. It's a Mac mini server hosted at Mac Stadium. Used to be called Mac Mini Colo, now it's Mac Stadium. And I started using this Mac mini server. It's a Mac mini with the M4, just plain M4. Just the base. Yeah, it's a Mac mini that I set up about a month ago and that I've been using for a variety of tasks.
Most of them related to automation, different kinds of automation. And I kind of wanted to talk about some of these. The first one, which I think is obviously going to deserve a proper article on Mac Stories at some point in the near future, is I figured out a way to trigger shortcuts running on my Mac Mini from Android.
So I've been using this Android tablet, right? That I mentioned on the show before. I mentioned on App Stories, on MPC. It's this sort of iPad mini, but it's Android. It's made by Lenovo. It's the Lenovo Legion Tab. Really, really good small tablet. But obviously when I do my reading at night or I'm watching some YouTube videos, like there's multiple times where I'm like, I'm reading an article or I'm watching a video. I'm like, I actually want to save this for...
the newsletter or I want to save it for Mac stories. And typically,
If I were to do this on my computer or if I were to do this on my iPhone, I would just run shortcuts, right? Because I have shortcuts that save those links or like even just if I have an idea, for example. I have shortcuts that do all this, that take a link and do a bunch of things and save it in Obsidian, save it in Todoist. Like I have my shortcuts. I have hundreds of shortcuts, but I have my like 10 shortcuts that I use on a daily basis, right?
And, you know, if I'm using Android, it's like, okay, what am I doing now? Am I just, do I need to save them somewhere else? Because obviously there's no shortcuts on Android. But I figured out a system that involves Tasker. So Tasker is an Android application that is often mentioned as like, oh, this is much better and more powerful than shortcuts. Now let me tell you, Stephen, I don't know if you've ever played around with Tasker,
Shortcuts users have no idea how good they have it until they play around with Tasker. With all due respect to Tasker, but it is the quintessential Android app in terms of user friendliness and design and just how things are laid out and how you're supposed to set up shortcuts.
commands and workflows. It's very confusing. It's very powerful, mind you. It's very, very powerful, but it's also very confusing. And, you know, regardless, I figured out a system with just two actions because I couldn't bring myself to do anything more complex than that in Tasker to...
I send off, it's basically like I'm sending off a webhook. Like, it's calling a URL. That URL is on a little server app that is running on my Mac mini. And when the Mac mini sees that webhook, it fires off a shortcut. So that's the idea. So I have a system that is always there in the cloud that allows me to trigger shortcuts from Android, but the shortcuts themselves, they're obviously running in the Shortcuts app on the Mac mini.
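For the curious, that webhook-to-shortcut bridge can be sketched in a few lines of Python, assuming only the standard library and Apple's built-in `shortcuts` command line tool (macOS 12 and later). The `/run/<name>` endpoint shape and the idea of passing input via `--input-path` are assumptions for illustration, not Federico's actual setup:

```python
# Hypothetical sketch: a tiny webhook listener on a Mac mini that fires
# a shortcut by name via Apple's `shortcuts` CLI.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def parse_request(path):
    """Extract the shortcut name and optional input text from a /run URL."""
    parsed = urlparse(path)
    query = parse_qs(parsed.query)
    name = parsed.path.removeprefix("/run/")
    text = query.get("input", [""])[0]
    return name, text

class ShortcutHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        name, text = parse_request(self.path)
        # `shortcuts run <name> --input-path <file>`; piping the text in
        # via /dev/stdin is one way to hand the URL to the shortcut.
        subprocess.run(
            ["shortcuts", "run", name, "--input-path", "/dev/stdin"],
            input=text.encode(),
        )
        self.send_response(200)
        self.end_headers()

def serve(port=8080):
    """Blocks forever; call this on the Mac mini to listen for webhooks."""
    HTTPServer(("0.0.0.0", port), ShortcutHandler).serve_forever()
```

On the Android side, a single Tasker HTTP Request action pointed at something like `http://mini.example.com:8080/run/SaveLink?input=<url>` would be enough to fire the shortcut.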
I have other use cases for the Mac mini. I've been running a variety of large language model tasks on the mini. Okay. Mostly in the terminal.
I've been using this really, really great command line utility by Simon Willison, just called LLM. That's the name. It's a command line utility that you can install manually. You can install it with pip. You can install it with Homebrew. And it basically gives you access to all the modern large language models from the terminal.
And I've been using it to run some experiments to transcribe videos as well as App Stories episodes automatically. So I've created a bunch of commands to transcribe podcast files like MP3s, generate a transcript, and then...
and then take the transcript and fix some common errors and turn that transcript into an SRT file that we can use on YouTube. So it's basically part of me sort of trying to figure out how much can I help John's production workflow when it comes to podcasting and YouTube creation. Just because, you know,
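As a rough sketch of just that last step, here's what turning timed transcript segments into an SRT file can look like. The segment format (start, end, text) is an assumption about what a speech-to-text tool emits; this is not the actual Mac Stories pipeline:

```python
# Hypothetical sketch: convert timed transcript segments into SRT cues.
def srt_timestamp(seconds):
    """Format a time in seconds as an SRT timestamp, HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    hours, rem = divmod(ms, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{hours:02}:{minutes:02}:{secs:02},{ms:03}"

def to_srt(segments):
    """segments: list of (start_sec, end_sec, text) tuples -> SRT string."""
    cues = []
    for index, (start, end, text) in enumerate(segments, 1):
        cues.append(
            f"{index}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(cues)
```

Feed it the segment list from whatever transcription step you use (Whisper-style tools emit exactly this kind of timed output), and the result uploads to YouTube as-is.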
I'm sort of the guy at Mac Stories in charge of coming up with automations and shortcuts for people. Not just for readers, but internally for the Mac Stories team. And I figured maybe I can help John with this sort of stuff. So that's obviously a terminal is not something that you get on an iPad. And it's kind of perfect for the Mac Mini because you can just...
fire off the command, and then leave it running. It's not even running in the background. It's running in Las Vegas. It's running in another country. In a data center, which is just cool. It's cool. How are you getting to the Mac Mini? Are you remoting into the GUI to do this terminal stuff? Yes, sir. And
I want to recommend Jump Desktop. Okay. Jump Desktop is so much better than VNC, in my experience. It's using the... It doesn't use the VNC protocol. It uses RDP, the Remote Desktop Protocol, which I think is a Microsoft thing. And with Jump Desktop, you install, like, a server on the Mac mini, and then you use the client on the iPad or any other computer.
And the quality is just so much better. Like the image quality is so much better. The latency is highly reduced compared to VNC and Jump Desktop in particular has this setting where it automatically uses retina quality and switches the
the resolution of the host computer to match whatever display resolution you're using on your client. So if I'm accessing the Mac mini from an iPad, I get full-screen iPad retina resolution for the Mac mini. The Mac mini is obviously headless; it doesn't have a physical display attached to it. But Jump Desktop sort of takes care of all of that. So that's very nice. I've been using Cursor
to, as the kids say, vibe code some, yeah, some, some internal things. I've created, like, I was actually doing some reorganization today, I've created about, like, 10 sort of personal Obsidian plugins, just for me. I don't think I'm going to share these with people. And I've been creating these plugins sort of just, you know,
chatting with an LLM in Cursor, making the plugin, and then I've been sending these plugins to the one true son, Finn, for actual human manual review, which is a process that makes me feel better about it. But it's the sort of thing where
Actually, Matt Birchler had a great take last night on his blog, like how it sort of feels empowering to be able to do these, like, low-stakes personal projects with these tools now. It's the sort of thing where, like, I couldn't hire Finn to build
10 plugins in the span of a month, because, like, Finn has a day job. Like, he does these things on the side. So it was either I'm not going to have these tools, or... I like to think of these things as, like, the modern equivalent, and the more powerful equivalent, of me spending a week copying and pasting random bits of code from Stack Overflow and Reddit.
It's essentially the same thing, where these models have been trained on the Stack Overflow and Reddit and GitHub threads, but they do it for you. And so, yeah, I've been doing that. I've been sort of leaving the Mac mini running in the background with Cursor doing the coding. And then I go in and I build the plugin, and yeah, I got a bunch of things in Obsidian that are just for me, like very specific Mac Stories things. But it's been nice. It's been nice to do that.
And then I kind of wanted to mention the final item, which will lead me later in the other thing that I'm bringing to the show today. This is something that I'm going to share this Friday in Mac Stories Weekly. I figured out a way to never worry about the shortcuts permission dialogues again. Never again. So when I was triggering the shortcuts from Android,
Initially, I was happy, but that happiness lasted for about 30 seconds. Because what I didn't consider is that whenever you call a shortcut remotely, obviously shortcuts get scared and be like, ah, do you want shortcuts to allow access to this URL? Do you want shortcuts to allow access to this file? What about this folder? Do you want to enable this shortcut to run another shortcut? It was like,
a deluge of dialogues one after the other. Imagine for every single URL from an individual domain that I was sending from my Android browser to the Mac Mini, I was getting for one shortcut five permission dialogues for each URL. It was impossible. It was impossible. I would have to log in, like grab my iPad, open Jump Desktop, allow...
Five dialogues. And then, you know, the next day I would be sending another webpage. It was like another five dialogues. I was like, this is impossible. Like, it's incredible that Shortcuts doesn't have a power user mode where like, I know what I'm doing. Please never bother me again. But yet here we are. So...
I figured out a system, like this week when I was putting together the next topic that we're going to talk about, I was like, I am done with this. I need to figure out a solution. And so, of course, I turned to good old friend, GUI automation, graphical user interface scripting.
to have a little bot, a little shortcut that is always running. In fact, it's running every 10 seconds and clicks the button for me. And so, yes, yes. So I will have the backstory and all the details and the code
on Mac Stories Weekly. But yeah, so all of these permission dialogues, no more because I have a little thing that is always running because macOS lets you build these kinds of things that run every 10 seconds that clicks the button for me. Incredible. Yeah. I mean, it's really dumb that you have to do that, but incredible. Yeah.
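For flavor, the general technique (not Federico's actual code, which is in Mac Stories Weekly) can be sketched as a script that polls via `osascript` and System Events for a Shortcuts permission prompt and clicks its confirm button. The process and button names here are guesses, and the runner needs Accessibility permission:

```python
# Hypothetical sketch: click Shortcuts permission prompts via GUI scripting.
# macOS only; process/button names are assumptions to verify with
# Accessibility Inspector.
import subprocess
import time

CLICK_SCRIPT = '''
tell application "System Events"
    if exists (process "Shortcuts") then
        tell process "Shortcuts"
            repeat with w in windows
                if exists (button "Always Allow" of w) then
                    click button "Always Allow" of w
                end if
            end repeat
        end tell
    end if
end tell
'''

def click_pending_dialogs():
    """Run the AppleScript once; returns the CompletedProcess."""
    return subprocess.run(["osascript", "-e", CLICK_SCRIPT],
                          capture_output=True, text=True)

def run_forever(interval=10):
    """Mimic the every-ten-seconds bot described on the show."""
    while True:
        click_pending_dialogs()
        time.sleep(interval)
```

The same loop could equally live inside a shortcut or a launchd agent; the Python wrapper is just one way to schedule the AppleScript.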
So you seem happy with this. This is a little toe back in the Mac waters. I feel rejuvenated in many ways. In many ways, I feel like myself from a decade ago, but with the knowledge of the things I've done and the experience I've had and the things I've seen. Yeah, I feel lots of potential, lots of potential. We'll see how this evolves over the next few months. But yeah.
This episode of Connected is brought to you by Google Gemini. Gemini Live is the feature where you can just talk to it, and it's really wild to have a full-on conversation with this thing. I was messing around and asked it to give me some ideas for hosting an event, and when it starts giving results, you can just stop it and say, okay, well, what about something low-key for a smaller group? And then adjust to that, and you can keep going until you get an idea that you want.
And I think it's really useful for brainstorming things. It's good when you don't know where to start, or if you hit a wall, you can just open Gemini and it helps you get the ball rolling. But you can use it for all kinds of stuff. If you want to learn something new, or have it give you advice or explain Bitcoin in simple terms, which seems impossible, but it can do it. Or you can have it quiz you on something like microbiology. I mean, imagine being a student and you've got a personal tutor on hand.
It's hard to explain. You really just need to play around with it, see how it listens, responds, and adapts to your style of conversation. Just try it out. It's free. Our thanks to Google Gemini for their support of the show and all of Relay. All right, Stephen. So I recently realized something about me. Okay.
I'm 36, going to be 37 this year. And I've become the sort of person, I've become the sort of person who likes to talk out loud by himself, usually when I'm driving or when I'm doing chores around the house to sort of like brainstorm ideas out loud. Hmm.
So you're, just to paint the picture, you're driving in the car. I'm driving, my two dogs in the back. Okay. You know? Yeah. You got some Dashboard Confessional playing and... No, I have no music playing. Well, you had music playing, but then you have a thought.
Yes. Oh, I need to pause. I click pause. Okay. And then you talk. Yeah. What is listening? Like, what, what happens next? Okay. The dogs remember. Like, if you've gotten your dogs part of your task management system, I feel a little weird about that. So,
It all started, you know, I did this as an experiment about a month ago, and I used this app on my iPhone called Superwhisper, which is also on the Mac. Obviously, on the Mac it does a whole lot more things. It's a really cool app. It's got really good dictation. It's using, like, these new AI models for transcription, like...
Incredible potential for even people with disabilities, for example. It's got the kind of dictation that it doesn't care if you stutter, if you pause, if you lose your train of thought. It keeps listening and transcribes everything. It's incredible. I did this on the iPhone. But then I realized, you know, it's kind of awkward if my phone is in my pocket and I'm driving.
I got to get my phone from the pocket or if I'm doing chores around the house and like, you know, I don't know, I'm doing something. I got to keep my phone in my hand when I'm doing something. It's kind of awkward. So I recently started having this thought that sort of became an obsession. Like, what are the best earbuds for dictation? Very simple question. Like, what are the best earbuds that I can get to get like a really good dictation?
recording of my voice. And by the way, this brainstorming, it could be that I'm thinking out loud about a shortcuts problem or I'm just making a list of the things that I got to do tomorrow. That sort of thing. I'm just talking out loud about work. And you want the cleanest capture of that you can so the robot has the best thing to work with. Precisely. I wanted the cleanest possible version so that the AI would have
a good place to start, right? I tested the AirPods 4 and before I sold them, the AirPods Pro 2
And I realized that they kind of suck for dictation. Like when you record yourself and, you know, it shouldn't have been a surprise given how, you know, when you are on a Zoom call, people tell you, are you wearing AirPods? Because like you can tell. You can tell. You can tell when somebody's wearing AirPods because they don't have great microphones, right? They sound like they're far away. They sound robotic. They have all kinds of compression going on. They're very bad.
So, you know me, I started this long research phase, which mostly consisted of purchasing earbuds, testing them for two days, and returning them, and starting the process over and over. So I started testing a bunch. Started with the Google Pixel Buds 2. Okay.
These are just as bad as the Apple AirPods. First of all, Google doesn't make a version without the in-ear tips. Yeah. They're just like, so I was not a fan, but I was like, okay, I'll try them. But the microphone quality is not...
Not great. Not great. It's, you know, same issues as the AirPods 4 and the AirPods Pro 2. Compressed, robotic, cuts off some words. Like, they were not to be trusted with, like... Here, by the way, we're talking for, like, recordings of, say, 5 to 15 minutes, all right? So long sessions of me just talking by myself. So next up...
I watched some videos. Can you believe it, Stephen, that there are people on YouTube that do tier lists of earbuds when it comes to call quality? That's a thing. That's perfect. I love it. Yeah. Yeah. I saw the Samsung Galaxy Buds 3 Pro. These are the AirPods clones that are like oddly more angular. They have an angular stem to them.
They also have in-ear tips and they are not that much better than the AirPods. Like the mic is not great. And, you know, set aside the fact that like they're kind of weird to squeeze because of that angular design that they have. Yeah. But like the microphone is not good. Now, I have found an absolute winner.
when it comes to the, hands down, the best Bluetooth earbuds for voice calls and local recordings. And this is a name that maybe will surprise you. The Huawei FreeBuds Pro 4. Yeah. These earbuds. I did not see this coming. These earbuds for recordings and for calls are incredible. And that's because they have bone conduction microphones.
I don't know what Huawei did here to make this sound so good, but believe me, they are, when it comes to recording yourself or jumping on a Zoom call or whatever, just they sound so good. It's wild. So if you're looking for Bluetooth earbuds that are good for calls and good for recordings,
that's right now my recommendation. Okay. Huawei FreeBuds Pro 4. I kind of love how these things look, the black and gold. Yeah, yeah, that's what I have. That's what I have. Yeah, they're cool looking. There is the... if you go to the Huawei page, the link is in the show notes, and you scroll down to the controls to, like, look through the images, it looks like a visionOS button. Like, yeah.
I see it everywhere now. Yeah, yeah, yeah. So these are good. These are good. And they actually give you both silicone and memory foam tips. So they're pretty good. There is both in our document and in my story, a giant however. Okay.
I ran into two more issues. The first one I already mentioned: the Huawei have in-ear tips, which I'm not a huge fan of, especially because they bother me if I'm wearing them for more than 30 minutes. My problem was that when I'm driving my car, either I'm using CarPlay or I'm using Android Auto, for reasons that will become clear as part of me changing as a computer user down the road,
In either case, if I'm using the voice recorder on Android or Super Whisper on my iPhone, both CarPlay and Android Auto
When I'm in the car, at that point, they completely ignore the fact that I'm wearing earbuds. They just default to the car's microphone. Which defeats the whole purpose of this. Which defeats the whole purpose of this, right? I was so happy. I was like, yes, now I finally have earbuds that record me with very good quality.
And then I got in my car and I started driving and I listened to my recording later. I was like, oh no, this recorded the car's microphone, which was okay. It was actually better than AirPods, but it was still defeating the purpose of what I wanted to have.
There's no way that I know of in either CarPlay or Android Auto to say for this particular app, ignore the microphone of the car. Yeah. Just use the... There's no setting to do that. Correct. So, yeah. I was back to square one. Got to do more research. Been doing more research.
Eventually, we got to a few days ago. Okay. So mind you, this has been going on for like two to three weeks. Yeah. You're just, you're opening an earbud, you're taking them back, you're trying them out. Yeah, yeah. So we got to late last week. I was kind of, I was like, I was about to give up on this idea, and then I landed on another Chinese company's website, the Xiaomi website. So Xiaomi, turns out they've been making earbuds, right? And
They're called the Xiaomi Buds, of course. And specifically, with the Buds 5, apparently they've been making five of them. So this is the fifth model. There are both Buds 5 and Buds 5 Pro. Of course. Both of them support what I think is a genius feature
that I am convinced more companies, including Apple, will copy at some point. And it's so obvious, I wonder why didn't I think of searching for this before. These earbuds allow you to triple-click the stem of the earbud to start a local voice recording process.
that is stored inside the earbud itself. Whoa. The audio recording for up to 90 minutes per earbud. So for a total of three hours of local voice recordings are stored inside the earbud.
You can either choose to record with one earbud at a time, with both. You can actually even just open the case and press a button and record by talking to the earbuds case. All right? Later, yeah. I love it. Later, when you're done, you open the Xiaomi earbuds app, which is available both on iOS and on Android. You can see the list of recordings that are stored.
in the earbuds and you can transfer them to your phone. And at that point, you have an M4A file that you can do whatever you want with it. And there's an asterisk there, but I'll get to it in a second and I'll leave you with a bit of a teaser for next time. But it gets better. So the Buds 5 Pro, Xiaomi has so far released in Europe a Bluetooth only version.
But they have also announced a Wi-Fi version. And you may wonder, wait, what? Wi-Fi earbuds? That's starting to become a thing, apparently. You get much greater bandwidth, right? And much greater power consumption. But yes, so this is a thing called, it's a Qualcomm thing. This actually was announced three years ago. And it's only sort of becoming a
like a thing now. It's called Snapdragon Sound. And it's like sort of like what Apple has been doing with AirPods when you use the AirPods with Apple products. It's not like plain vanilla Bluetooth. It's like actually better than Bluetooth. These are using Wi-Fi.
But so far, you can only use it with Xiaomi phones. You can only take advantage of the Buds 5 Pro model with Wi-Fi, which costs more than the Bluetooth model, if you have a Xiaomi 15 Ultra, I think it's called. So yeah, it's very much proprietary to the Xiaomi ecosystem. But, and this is where I got so happy.
The Buds 5 are the equivalent of the AirPods 4 in that they don't have in-ear tips. They have pretty much the same design of my beloved AirPods 4. They support noise cancellation, basic noise cancellation, even without the tips. And they can record audio locally. I've been testing this this week, Stephen.
And it's incredible. So I will say this. The recording quality that I get from the Buds 5 is not as good as the quality that I get from the Huawei earbuds. But I can get in my car, start driving. My phone is paired with the car's system. I can triple click the left earbud in my ear to start recording independently from my phone.
And later, when I'm done, I can triple click again. The recording is saved inside the earbud. And when I'm home, I can import the file and do things with it. This is finally the dream realized. The Buds 5 compared to the Buds 5 Pro, from what I've read online, don't have the same microphone system.
I think my understanding is that it's slightly worse, even just from the fact that the Buds 5 has microphones that can cancel up to, like they have wind noise resistance up to 12 meters per second, whereas the Buds 5 Pro have wind noise resistance up to 15 meters per second. Is that how wind noise reduction is measured?
I've never come across that. So I would imagine that it's basically measuring the strength of the wind. That's what it looks like. Right? To mitigate wind noise at 15 meters per second, approximately 33 miles an hour. Okay, that's pretty strong wind. Like you couldn't, I mean, if you were, you couldn't drive with your windows down, I don't think, like on the highway. No.
But if you're just walking around and it's breezy, it should sound perfect. Yeah. Yeah. So I tested this this week. I think out of curiosity, I will place an order for the Buds 5 Pro, even though they do have the in-ear tips that I dislike. Yeah. I just want to test for myself how big of a difference they have in terms of microphones. You've gone this far. You might as well keep going. Exactly. So now... Now...
I was at this fork in the road. Either I could only record with excellent quality on the Huawei earbuds, but that would have meant no earbuds in the car because it wouldn't work or get worse quality with the Buds 5 by Xiaomi, but you're able to record in the car because they are completely independent from your phone.
I decided to go with the second one, obviously, because I love the idea of recording something with the earbud itself. I think that's such a clever idea that I just want everybody to steal. I would love to do this with AirPods at some point. But, and this is where I will leave you today, Stephen, with a bit of a teaser. The files that I was getting from the Xiaomi earbuds app
They had a whole bunch of noise. They were not as clear as the audio recordings from the Huawei earbuds. Okay. And they were saved to M4A with the ADTS format, which is a weird format that, for example, I have a bunch of shortcuts that deal with transcribing M4A audio files.
And they were just giving me an error because they said, oh, this file format is not supported. So I, again, did more research and I found another task for the Mac Mini. I created a system. So now follow me here because we're going to talk about this in the future. This is just a teaser. I created a system where I import the audio file from the earbuds. And there's only one thing I need to do.
Take this file, share it with the share sheet, and save it in a folder of my Google Drive. That's all I need to do. On the Mac Mini, there's a Hazel automation that runs first. And it runs two FFmpeg commands. There it is. One, to convert the audio. And two, get ready, it runs a neural network to apply a filter to the audio to clean it up.
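The two FFmpeg steps described here could look roughly like this sketch. It's a guess at the shape of the pipeline, not the actual Hazel setup: FFmpeg's `arnndn` filter really is a recurrent-neural-network denoiser (it needs a downloaded RNNoise model file), which matches the "neural network to clean up the audio" description, while the codec settings, file names, and model path below are illustrative assumptions.

```python
# Hypothetical sketch of the two FFmpeg commands run by the Hazel rule.
# The exact flags, file naming, and model path are assumptions.
import subprocess

def convert_cmd(src: str, dst: str) -> list[str]:
    # Step 1: re-encode the ADTS-wrapped AAC recording into a plain M4A file.
    return ["ffmpeg", "-y", "-i", src, "-c:a", "aac", "-b:a", "128k", dst]

def denoise_cmd(src: str, dst: str, model: str = "std.rnnn") -> list[str]:
    # Step 2: clean up the audio with ffmpeg's arnndn filter, an RNNoise-based
    # neural-network denoiser; `m=` points at a downloaded model file.
    return ["ffmpeg", "-y", "-i", src, "-af", f"arnndn=m={model}", dst]

def clean_up(src: str) -> str:
    # Run both steps in sequence and return the path of the cleaned file.
    base = src.rsplit(".", 1)[0]
    converted = base + "-converted.m4a"
    denoised = base + "-clean.m4a"
    subprocess.run(convert_cmd(src, converted), check=True)
    subprocess.run(denoise_cmd(converted, denoised), check=True)
    return denoised
```

A Hazel rule watching the Google Drive folder would just invoke something like `clean_up` on each new file before handing the result to the transcription step.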
So it cleans up the audio recorded from the Xiaomi earbuds. Then, and we'll get into the details of this workflow maybe in a future episode, then it fires off a shortcut. The shortcut uses AI to analyze my audio, creates a transcript, creates a summary, comes up with a title for the file, and identifies if in the conversation that I have with myself,
I mentioned things that are supposed to be like actionable tasks. Like if I say things like I have to, or I need to, like it understands if there are items that are actually tasks that I need to do. It takes everything and creates a note in Obsidian that has the title, the embedded transcript, the summary, the actionable items, and an audio player to listen back to that voice recording.
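A note with the pieces just described, title, summary, actionable items, an audio player, and the transcript, could be assembled along these lines. The frontmatter fields and section names are my assumptions, not Federico's actual template; `![[file]]` is Obsidian's standard embed syntax, which renders a player for audio files.

```python
# Minimal sketch of building the Obsidian note described above.
# Field and section names are assumptions about the template.
from datetime import date

def build_note(title, summary, transcript, tasks, audio_file):
    # Render action items as Markdown checkboxes so Obsidian tracks them as tasks.
    task_lines = "\n".join(f"- [ ] {t}" for t in tasks)
    return (
        f"---\ntitle: {title}\ndate: {date.today().isoformat()}\n"
        f"type: voice-recording\n---\n\n"
        f"## Summary\n{summary}\n\n"
        f"## Action Items\n{task_lines}\n\n"
        f"## Audio\n![[{audio_file}]]\n\n"   # embeds an audio player in Obsidian
        f"## Transcript\n{transcript}\n"
    )
```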
And it just happens. It all just works. Basically, I just save a file in Google Drive and, I don't know, two minutes later, I open my dashboard in Obsidian and everything is right there. Wow. Yeah. So, yeah, that's part of me changing as a person. Yeah. I'm still trying to wrap my head around all that. So...
You're in the car in Rome, hit a button, you talk. Yes. I talk. Yes. You say some things. You've got, you get the file transferred, and then it gets pulled to a Mac mini in Las Vegas, which cleans up the audio, transcribes the audio, and you get it all back in your Obsidian vault on your phone in Rome. That's right. That's right. Within two minutes. Within two minutes. It's converting the audio, denoising the audio.
Passing the audio to a transcription app on the Mac Mini. The text from the transcription app is passed to Claude. I have a very complex Claude prompt that I created so that it comes up with a title, with a summary, and with actionable items identified from my conversation, and everything is put together in an Obsidian template.
saved into my Obsidian vault as a Markdown file. And in my dashboard note in Obsidian, I have a Dataview query that shows me all the voice recordings that I recently saved that still have open tasks for me to look into. Yeah. True mad lad status.
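A dashboard query in this shape would list recent recordings that still have unfinished checkboxes. The folder name and `summary` field are my assumptions about the vault layout, while `file.tasks`, `filter`, and `length` are standard Dataview functions:

```dataview
TABLE summary, file.mtime AS "Saved"
FROM "Voice Recordings"
WHERE length(filter(file.tasks, (t) => !t.completed)) > 0
SORT file.mtime DESC
```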
That is the sort of thing you can expect from me as a nerd who's rediscovering macOS and assistive AI tools in 2025. I am like, yeah, I feel like I have superpowers now. And this was a fun, long process.
That obviously will eventually make for, I think, a really fascinating guide on MacStories in the near future. Yeah. Computers, man. They do cool stuff. They do cool things. And yeah. But just as a final thing, it's maybe silly, but I really like this thing where I talk to myself, because I discovered that when I talk about a problem out loud,
I don't know what it is, but there's a part of my brain that sort of connects more dots. Like, like, and maybe it's because like I've conditioned my brain over the past 12 years of podcasting to like,
have conversations out loud and to follow up on something that someone like you or Mike said, for example, or John said, you know. And so talking out loud sort of maybe tricks my brain into entering that mode where, like, like yesterday I was in my car, I was thinking about a shortcut and I was like,
contradicting myself out loud. But like, I was like, but what if I try this other approach instead? Yeah. Which is the sort of thing that, if I'm just thinking, you know, like a normal person, silently in my brain, I probably wouldn't do. But it's the talking that tricks my weird human brain into doing it. So yeah, talk by yourself. It's useful. Yeah.
How is that to read, right? If you're going like the back and forth, like, is it coherent when you get it into Obsidian, like four days later, you finally look at it, you're like, oh yeah, this makes sense. I don't know, let's look. This recording outlines two main projects, finalizing an article about automating shortcuts permission prompts and setting up a voice recording workflow. The speaker details technical components of both systems, including AppleScript automation, dictation apps, and file management considerations.
Action items. One, finalize column about automating shortcuts permission prompts.
Two, verify if the automation works if the schedule is reduced to 10 seconds; currently it's 15 seconds. Three, install deep learning models for audio filtering on the Mac mini server. Four, install Hazel and import rules on Mac mini server. Five, ensure Shortcuts has access to create files in the Obsidian vault. Yeah, I think it makes sense. You know? Okay. Yeah. Yeah. Yeah.
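Action items like the ones just read could come out of a prompt shaped something like this. Federico's actual Claude prompt isn't public, so this is only a sketch of the request: it asks the model for structured JSON with a title, summary, and tasks, using the "I have to" / "I need to" phrasing mentioned earlier as the cue for action items.

```python
# Hypothetical sketch of the title/summary/action-item prompt.
# The wording and JSON schema are assumptions, not the real prompt.
import json

def build_prompt(transcript: str) -> str:
    # Ask for JSON only, so the response can be parsed programmatically.
    return (
        "You will receive the transcript of a voice memo I recorded while "
        "thinking out loud. Respond with JSON only, using these keys:\n"
        '  "title": a short descriptive title for the recording,\n'
        '  "summary": a two-to-three sentence summary,\n'
        '  "action_items": a list of tasks, inferred from phrases like '
        '"I have to" or "I need to".\n\n'
        f"Transcript:\n{transcript}"
    )

def parse_response(text: str) -> dict:
    # Expect a JSON object with the three keys requested above.
    data = json.loads(text)
    assert {"title", "summary", "action_items"} <= data.keys()
    return data
```

The parsed dictionary is what would then feed the Obsidian note template.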
And plus, if I want to, there's an audio player in Obsidian if you want to listen back, or there's a link to read the full transcript. But this is the sort of thing that I think large language models are great at, which is take a huge chunk of text and make sense of it. Summarize it, find actionable items. It's a sort of like...
boring task that an LLM is good for. And because it's my text and, you know, my data, I feel much more comfortable about doing this. And it's a, you know, I think it's a pretty good use case to combine
multiple forms of media and multiple forms of automation. There's like, there's an audio file and then there's some text, there's AppleScript, there's Shortcuts, you know. It's the sort of thing that is very much up my alley, and sort of this new flavor of combining AI and Shortcuts in this new type of automation that I feel is sort of my path as a Shortcuts user,
taking me to this place and I really want to take advantage of it for both me and maybe it's going to be useful for readers and it's going to be useful for you or for Mike or for John, you know, for internal tools like that sort of stuff. Yeah. And it's, you're using some Apple tech, but you're also using a lot of tech that's not Apple.
Apple tech. That's on the web. Yeah. That's either open source or on the web, like FFmpeg. It's, like, open source, and, you know, these AI APIs are web-based. It's a whole combination of different technologies. But very much at the center of it all right now, there's macOS and Shortcuts and Hazel. Those are, like, the main protagonists. Yeah. Yeah. It's cool. Yeah. Yeah.
The most Federico thing you've built in a while. It is a very Federico thing. Yeah, I cannot wait to share this one. But in the meantime, this week I will share how I automated those permission prompts. Perfect. Yeah.
Cool. Well, I think that does it. Thank you to our sponsors, Ecamm and Google Gemini for making the show possible this week. If you want to find more of us, you can find Federico's work over at MacStories.net. And of course, go check out that App Stories episode we mentioned earlier. I write at 512pixels.net and host Mac Power Users. This coming Sunday, we have an interview with David Pierce from The Verge. Nice. Really a cool episode. I think people will enjoy that.
If you're a member, thank you so much for your direct support. If you're not and you want longer ad-free versions of the show each and every week, you can get signed up at GetConnectedPro.co. There's a link in the show notes. This week, you and I, we spoke about Apple and NVIDIA, both historically and then maybe the future because Apple went out and spent a billion dollars on NVIDIA hardware.
Yeah. Wild. You can also leave feedback or follow up at connectedfeedback.com. It's also a link there. And of course, all the show notes are in your podcast player. Until next week, Federico, say goodbye. Arrivederci. Bye, y'all.