
Better Apple Intelligence in iOS 18.2 (sort of), OpenAI Sora Review, Google Gemini 2.0 Talks a Big Game

2024/12/12

Primary Technology

People
Jason Aten
Technology writer and commentator, co-host of the Primary Tech Show, focused on technology trends and product reviews.
Stephen Robles
Technology content creator, podcast host, and YouTube video producer, focused on Apple products and video editing software.
Topics
Stephen Robles: This episode covers the new features in iOS 18.2, including Visual Intelligence and the ChatGPT integration. Visual Intelligence has been enhanced and can now recognize dates, summarize text, and more, though it still has room to improve. The ChatGPT integration lets users reach ChatGPT through the voice assistant, and prefacing a request with "using ChatGPT" forces the assistant to use ChatGPT. Stephen also discusses OpenAI's release of the Sora video generation tool, Google's Gemini 2.0 multimodal AI model, and Google's quantum computing breakthrough. He finds the quality of Sora's generated video impressive but voices concerns about deepfakes and the spread of misinformation, and he argues that the Gemini 2.0 launch and the quantum computing breakthrough are significant moves by Google against Microsoft.

Jason Aten: Jason focuses on OpenAI's Sora video generation tool and Google Gemini 2.0. He finds Sora's video generation capabilities impressive but points out that its output contains strange glitches and that any generated text tends to come out garbled. He also discusses OpenAI's ChatGPT Pro subscription, arguing that its high price likely targets business users who can expense it. In addition, Jason covers Google's quantum computing breakthrough, which he considers more important than the Gemini 2.0 launch, while raising concerns about the security implications of quantum computing. He also touches on the FBI's warning to use encrypted messaging apps and a comparison of the Meta Ray-Ban smart glasses with the AirGo Vision smart glasses.

Deep Dive

Key Insights

Why is Apple Intelligence in iOS 18.2 considered a "sort of" improvement?

While iOS 18.2 expands Apple Intelligence to more countries and includes features like Visual Intelligence and ChatGPT integration, it falls short of the seamless AI experience showcased in Apple's keynote. Visual Intelligence, while improved, still requires user prompts and lacks features present in the Photos app, like art and plant recognition. Furthermore, the ChatGPT integration feels like a workaround, relying on a third-party app for advanced functionality rather than showcasing native AI prowess.

What new features are included in iOS 18.2 besides Apple Intelligence?

iOS 18.2 introduces natural language search in Music and TV apps, enabling complex queries like "comedy with Julia Roberts and Cameron Diaz." It also adds a two-stage camera control for iPhone 16 users, allowing for focus and exposure lock before capturing a photo, though its usability is debated.

How does the ChatGPT integration in iOS 18.2 work?

Users can log into their ChatGPT Plus account or use the free tier. When Siri encounters a query it can't handle, it defaults to ChatGPT. A user can disable the confirmation prompt for faster access. A useful hack is to preface requests with "using ChatGPT" to force Siri to utilize the service, even for general knowledge or image creation.

What are the limitations and drawbacks of OpenAI's Sora video generation platform?

Sora's free tier limits videos to 5 seconds, while the $200/month Pro subscription extends this to 20 seconds. The platform struggles with realistic human and object interactions, often producing bizarre results. And while OpenAI watermarks Sora-generated videos, the watermark is easily circumvented by cropping. Concerns exist regarding the potential for misinformation and deepfakes due to Sora's accessibility.

Why is Google considered to have missed an opportunity in the AI race?

Despite having the resources and technology, Google was hesitant to fully embrace AI-powered search due to concerns about misinformation. This allowed Microsoft to partner with OpenAI and capitalize on the growing demand for AI-driven search and content creation tools like ChatGPT.

What is Google's response to Microsoft's advancements in AI?

Google has introduced Gemini 2.0, a multimodal AI model capable of handling audio, visual, and text data. It will power AI overviews in search results starting next year. Google CEO Sundar Pichai has expressed confidence in Gemini's capabilities, suggesting a side-by-side comparison with competitor models. Additionally, Google has announced a breakthrough in quantum computing, achieving significant advancements in error correction and processing speed.
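To make "multimodal" concrete, here is a minimal sketch of sending text and an image in a single request using Google's google-generativeai Python SDK. This is an editorial illustration, not something from the episode; the model id "gemini-2.0-flash-exp" and the exact SDK surface are assumptions based on Google's public documentation around the Gemini 2.0 announcement.

import os

import google.generativeai as genai
from PIL import Image

# Configure the client; assumes a GEMINI_API_KEY environment variable is set.
genai.configure(api_key=os.environ["GEMINI_API_KEY"])

# Experimental Gemini 2.0 model id announced in December 2024 (assumption).
model = genai.GenerativeModel("gemini-2.0-flash-exp")

# One request can mix modalities: a text instruction plus a local photo.
response = model.generate_content([
    "Describe what is happening in this photo in two sentences.",
    Image.open("concert_poster.jpg"),
])

print(response.text)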

What are the potential implications of Google's quantum computing breakthrough?

Google's quantum computer demonstrates the ability to solve complex problems exponentially faster than traditional supercomputers. While promising for scientific advancements, this also poses a threat to current encryption methods, necessitating the development of post-quantum cryptography.

Why is the relationship between OpenAI and Microsoft considered tenuous?

OpenAI, initially a non-profit research lab, shifted its focus to commercial endeavors. Its partnership with Microsoft, primarily for Azure compute credits, includes a clause allowing OpenAI to operate independently upon achieving Artificial General Intelligence (AGI). Speculation exists that OpenAI may attempt to redefine AGI to expedite this separation, though their continued reliance on Microsoft's resources complicates matters.

What AI tools are frequently used by the podcast hosts?

Jason regularly uses OpenAI's Whisper for transcription and ChatGPT for various tasks, including generating titles and descriptions for articles. Stephen primarily utilizes ChatGPT integrated with Shortcuts for automation, such as formatting show notes and generating content ideas. Both express limited use of Apple Intelligence and other AI tools like Gemini, Claude, and Perplexity due to integration limitations and workflow disruptions.
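As a concrete illustration of the transcription step mentioned above, here is a minimal sketch using the open-source whisper Python package (the locally run model, not OpenAI's hosted API). The file path and model size are placeholders; this is an editorial example rather than the hosts' actual setup.

import whisper

# Load a local Whisper model; "base" is fast, while "small", "medium",
# and "large" trade speed for accuracy.
model = whisper.load_model("base")

# Transcribe an audio file (placeholder path).
result = model.transcribe("primary_tech_episode.mp3")

# The result includes the full text plus per-segment timestamps.
print(result["text"])
for segment in result["segments"]:
    print(f"{segment['start']:7.2f}s  {segment['text']}")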

Shownotes Transcript


Remember, no man is a failure who has friends. Welcome to Primary Technology, the show about the tech news that matters. iOS 18.2 is now out for everyone, including some new countries like the UK and Canada. We're gonna talk about visual intelligence and some more features. OpenAI has released Sora to the world, and I tried to generate some weird polar bears and medieval knights. So we're gonna see what that can do. Plus, Google has a bunch of news taking shots at Microsoft. Might be working on some smart glasses.

and a ton more. This episode is brought to you by 1Password, HelloFresh, and Audio Hijack. And of course, by you, all the members who support us directly. I'm one of your hosts with a cold, Stephen Robles.

And then my friend, Jason Aten, is in the cold up in Michigan. How's it going, Jason? Oh, it's actually gone down. It's nine degrees now, Stephen, outside. Nine degrees. It was ten when I walked to my office. This is the one disadvantage, I think I probably mentioned this last week, of having an office that's separate from the house, is I have to walk 35 yards to my office in the snow. In the snow. Uphill both ways. Uphill both ways. There it is. I mean, it actually kind of is because there is a little bit of... Anyway, that's fine.

Anyway, I'm nursing. I was afraid I would have no voice today because I had no voice, but now I have the welcome to 101.5 Radio FM. This is your morning drive. And I have tea for the first time. I don't know if you can see the cup. Respect the beard. It says respect the beard. So you'll be getting tea ASMR. I'm going to try and mute as much as I can, but you might hear a slurp. I can't promise you won't. But we're going to get through this. And I just want to say, if you don't subscribe to our bonus episodes...

This is the week. One, because it's the holidays. Merry Christmas. Happy Hanukkah. All of that. But also, I'm going to rant about the road trip I did last night with the Tesla because charging, Jason, and battery percentage. I wore my battery percentage off t-shirt today because this is how I felt my car was all day. I wore my merch, too. Thank you for calling that out because I wore this t-shirt last week and forgot about

to mention that I'm wearing it. So I'm wearing one of our shirts today, and I should probably talk about the hat at some point. Oh yes, we should, yeah, because Jason got a Percentage Off hat. Yes. Anyway, before I forget, do you know what movie the quote was from? Yeah, uh, It's a Wonderful Life. Very good. Yeah, nailed it. And you're, what, you're doing, you're like binging all the Christmas movies? Yes, we haven't watched that one yet, but we do watch a lot of Christmas movies. You're gonna watch Die Hard? You're gonna watch... Don't... My kids are not going to watch that one with me. You're

Just use VidAngel. But, uh, well, I just don't think they would care that much. Sure. Sure. It will really throw them off, because we were just binging... my kids don't get to watch this, but we were just binging our way through Friends. And we just, a couple of weeks ago, passed the part where Rachel is dating, you know, oh,

Ross's girlfriend's dad, who is Bruce Willis in the show. So that would really be weird to them, to then show them, because if they've ever seen anything, it's just that, and they don't have a clue who Bruce Willis is. And then all of a sudden to see him in Die Hard, that'd be a weird way to be introduced to Bruce Willis. That would, yes, that is a roundabout way. There's way more, there's way better ways to be introduced. Yes. But we have some five-star review shout-outs, some Wrapped shout-outs, and then we're gonna get to iOS 18.2.

I want to thank Kilted Baklava from the UK. That's, that was a cool name. People's reviews now are basically like blog length on our Apple Podcasts page, and I love it because they're listing everything. So Kilted Baklava said battery percentage off, that's right; pencil to volume buttons, so we're on the same page for both of those; roundabouts; phone in dominant pocket. So we're all, we're all there. I think, uh, yeah, me and them are completely agreed. Anyway,

anyway, they say great, great podcast. Kendall601: so they don't usually write reviews, but they really love the show. Dots on. Thank you. I'm winning this round of it so far. Let's see if we can... I'm pretty sure I actually win this entire thing. Just keep going. Listen, people want more debates, so we have to figure out what other asinine preferences we have on Mac. I do think we have become the most tribal podcast in tech right now. I mean,

I think so. We have the most divisive things. Yes. We'll talk about screensavers next. Osbui from the USA, dots on, that's right, battery percentage on, listen, nobody's perfect. And this was interesting, Apple Pencil pointed left...

Because that's the way it looks correct. And I agree. But anyway, they said we're the best tech show. Thank you. And finally, Bat Colby from the USA. A very nice thing to say. Dots on, hiding off, magnification set to teeny tiny bit. Was that...

Okay, but the best part of that entire review was the reason that they subscribed. Oh, that's... what was it? I missed it. It's, it's the first sentence. It says, I've always listened because of Stephen, but I truly subscribe because of Jason's affability. I just, I, I won the whole thing. I just, I won the podcast. That's it. That's it. Someone has called you affable. You can now... Not only that, they put their money behind it. Thank you. That's, that's very, that's very true. So thank you for that. And there

There are people sending screenshots of Primary Tech in their Wrapped. So this is an Overcast Wrapped screen.

We made Primary Tech. This is Matt Winship on Bluesky: 47 hours of Primary Technology subscriber edition. So they're supportive, so thank you for that. You have missed a couple episodes there, Matt. So there's some stuff for you to listen to over the holidays. That's right, that's right. You can catch up. Listen to all the bonus episodes. In great company there next to ATP, Upgrade, and The Talk Show. Yeah. And then this is Corey Nelson on X. We're number four in their, I believe that's Spotify Wrapped?

Okay. So that's very cool. I don't know what number two is. A hot dog is a sandwich. We're not going to start. We can't start that debate. And also groby picks on threads were up there in this top four as well. So thank you for sharing all of that. Now, 18.2 finally, finally came out. I'm mid update because I had to do a road trip yesterday. So I'm like, my phone is updated my iPad, but I still, my Mac, I haven't done that.

But as I was saying, 18.2 is out. Apple intelligence is now in localized English for places like the UK, Canada, South Africa. So it's available in more countries, quote unquote, if you didn't want to change your language before. And in the next year, Apple says it will come to more languages and more countries. There are some updates for things besides visual intelligence and Apple intelligence. So there are some things, uh, one, which I didn't even know was coming, was the natural language search in music and TV.

So you can go to like the TV app or Apple Music. I'll go with the TV and you can search for something like comedy with Julia Roberts and Cameron Diaz. And that kind of search will actually bring you to my best friend's wedding. So you can actually kind of do a who was in that movie. I remember this actor, but I don't remember the movie name. Now you can kind of do that natural language search in a TV and music. And for music, you can ask it things like electronic without words.

or classical with words or whatever, or singing, you know, choir. None of those things actually exist, but that's fine. You know what I mean. But I actually played around a lot with visual intelligence, because I wanted to know, like... I tried this early in the betas, and most of the time it was just doing ChatGPT searches or Google reverse image searches. And I actually found visual intelligence to be upgraded, too.

So I have the video here. I'll link my video down in the show notes. But I found that if you're at a business, like I went to a Starbucks here and you do visual intelligence, which is only iPhone 16, you hold the camera control. Now you actually get basically the Apple Maps info. So there's Apple Maps info, even things like menu order, things like that. You'll see it in the visual intelligence screen, which is still kind of this weird place that's like,

Not the camera, but it is a camera. You have to treat it like a camera. It's a little weird. But you can do things like jump to order. Sometimes that goes over to Yelp. Sometimes it'll go to the business's website. Basically, whatever it would do in Apple Maps, it would do in visual intelligence. Those are the options it's giving you. But there's actually way more stuff that visual intelligence can do.

It can do things like recognize dates, which is like what they showed in the keynote. So if you have a concert poster, I actually used a program here from my wife's orchestra. It'll pull the dates and allow you to add the date to the calendar. It wasn't super smart. Like it didn't pull like the title of the concert or anything, but that's fine. But you could also do things like if you have an address, a phone number and website all on a business card, you can use visual intelligence and you actually get the options to call, navigate or visit the website. So that's cool.

But also the summarize feature, which this is I think one of the more like expedited use cases before if you wanted to get it like OCR text by taking a picture of some stuff. It was a few steps in before you could get to a summary. But now with visual intelligence, you can literally just point it at like a book page. That's what I did here. And summarize text is actually just one of the options. So you could take a picture of a book page and tap summarize and actually get a summary of a single page.

You would have to literally do that for every page if you were trying to do it. But if there's like a magazine article or something like that, you can get the summary, which is kind of cool. And it also will recognize like multiple websites and things like that. It doesn't recognize .fm domains for some reason. So I took a picture of the business card with mods.fm and it did not like bring that up as an option to tap. And notably, there's a lot of things that the lookup feature in photos can do that visual intelligence can't.

For instance, art recognition, visual intelligence will make you look it up with ChatGPT or Google Image Search. But if you take a picture of art and you look in photos, you can tap the little eye icon for lookup and you'll get the name of the artwork. So Apple is doing this in photos, but for some reason is not bringing it to visual intelligence just yet. Same with plant identification and animal identification. If you do visual intelligence,

You basically have to send it to ChatGPT or Google Image Search. But if you just take a regular picture of it, the lookup feature in photos will tell you what it is. And even the laundry tag thing: photos will tell you what a laundry tag means, but visual intelligence will not, which is a little ironic. Yeah. It is weird. Yeah.

Did you, have you played around with it at all? I mean, I don't know. There's not really a ton of use cases. I mean, I've used it some that I still feel like it's not the thing we want. It's not the thing they showed us in the pro in the keynote either. It's not like pointed at the dog. The whole thing is seamless. It pops up and shows you what the dog, like it's, it's still more like you could point it at a thing and you can sort of like

You then have to tell it, do I want to ask something or do I want to search something? It's like, and it also is weird. This is very similar to the Easter egg rant that we went on about the podcast memberships. It's like Apple is doing two things that...

to us seem like they should be the same, but they are obviously being developed completely independently because the fact that photos is so much better at giving you this information than when I'm using what is supposed to be an AI powered lookup tool. I just don't understand that. There's something weird happening there.

It is weird. I did ask, like, hey, all these amazing lookup features and photos, is that coming? And they were like, we have nothing to say this time. So, you know, they're not going to say anything. But I imagine more will come. Like, visual intelligence in the very first beta did almost nothing. Like, it would just tell you to do ChatGPT or Google Image. So I think they're adding features, but it still has a ways to go. Don't you think it's going to be really disappointing, though, if the most useful parts of Apple intelligence are just a...

different skinned user interface to get to ChatGPT. Well, we're going to get to that. Okay. Yeah. Before we get to the ChatGPT integration, another non-Apple Intelligence feature is the two-stage camera control. So if you have an iPhone 16, you can now set it to light press, autofocus and auto-exposure lock, and then take the picture. Now,

You were all about camera control recently. Are you still using it regularly? Yeah, all the time. And by the way, small update, just so you know, I have stopped using the back for now. The real reason is I just keep janking it up every time I drop the phone. It's getting kind of bad. But the real reason was the camera control button is super hard to find.

On the side of the phone. It is just not intuitive to find it. And the number of times I would be trying to take a photo and I'm just like pressing really hard against this, the titanium side. I'm like, why won't you do anything? So I have switched to the Nomad case that has the cutout.

And I don't love the cutout, but it makes it real easy to know exactly where... Sorry, I'm updating my phone to 18.2 right now as we speak. And I've noticed that they made it so that one side of this case gives a little bit so that you can push it down pretty reliably. But I really like the back, but not only is it getting kind of ugly because I keep dropping it, but also I can't... I use the camera control all the time, probably 30 times a day. And...

I probably try to use it 70 times a day and 40 of those times I'm just pressing against the titanium phone. I will say, I was trying, I was looking for my nomad leather back because I also stopped using it. I really liked how it felt, but it kept like, it got messed up and like with a lot of other leather cases, which I recently did a leather case review. Like I have a lot, like this is why I can't find my, I have stuff everywhere. Yeah. But usually with a leather case, you can like, if you scratch it,

you can kind of like rub it, and because it's real leather, the scratches will kind of fade or go away. And I found with the Nomad, I don't know if it's because it's so thin or there's not enough leather on it, I couldn't rub the scratches out, and it doesn't look like a patina, it just looks like a scratch. So I actually, I stopped using the Nomad one. I'm still using the, uh... I'm now, I'm using the Suti, the Suti, which is not a leather back, but it's the, it's the gray one, and I do like it. Okay, but the

The Nomad one, I will say, does the best at guiding your finger to camera control. Yeah. It just has a nice trough. It's really nice. It's nice. Casemakers still need to do the button, like Apple's silicone case. Do the button on the button, I think, because Apple's silicone, which is right here...

you have the capacitive button on the button, which I think is the best experience. But I feel like it's still hard to know. I guess it does feel a little bit different, at least on the case. This is the worst podcast ever. I'm just sitting here running my finger across the side of this thing. It's like a fidget toy is what it is. I'm just sitting here fidgeting with it. Anyway, but I feel like even that is not...

again, I've changed my mind about the cutout. I actually think it's a better experience because you know exactly where to put your finger. Okay. Now, but with two-stage camera control, you've been on the betas. I do not use two-stage camera control. Okay, that's what I was going to ask. Because that was my thing is like,

pressing the camera control button is pretty consistent, pretty easy. But I found trying to do the light press to auto focus nearly impossible to get it done. Like when I want it. Yeah. I don't like it. And you can't really, you can adjust the sensitivity a little bit, but not enough. And I just don't, I don't know. Like if you have a real mirrorless or DSLR camera, that half shutter press is so clearly half press. And then you could press further for the, take the picture.

The camera control is just not enough. I almost wonder if like the half press could then start a small vibration that's just kind of like constant to know you're in like focus mode and then press it all the way down just to know like, have I activated it? Because I honestly had to look at the screen to know, did I lock the focus or not? Like I could not tell by my finger touching. I had to look at the screen to see whether or not it was locked. And I feel like

That kind of defeats the purpose. Hopefully a feel will let you know. But anyway, can I tell you a secret? Yeah, please. Photographers do not use half press to focus. If you tell me you only manual focus, we're going to use back button focus.

100% of the time. Back button focus all the time. You just, the little AF on button, you use that to focus, lock your focus in, and then you hit the shutter. None of this, when I hit the shutter, I wanted to do the thing a shutter is supposed to do, which is funny because like, you know, half cameras now, there's not even a shutter, but you get like, it's electronic. But I want it to take a picture. I don't want it to be trying to, I don't want it to like decide how, where did I, how much did I, nope, you just, if you push that thing. So I have, the first thing I do every time I buy a camera is I set focus to the back button. Yep.

I like that. I might have to try. Well, I never take my camera off the tripod here, but also you don't take still photos very often. No, I don't. I don't. But I mean, raw. Well, we're going to get to the app awards later. But anyway. Okay. Last couple of things for iOS 18.2: ChatGPT integration is the other huge feature. You can now log into your ChatGPT Plus account, or you don't have to log in; you can just use the ChatGPT features. You'll just have that limit, you know, if you're on like the free plan or you're not logged in.

And basically, whenever you ask the voice assistant something on your iPhone, iPad, or Mac, if it can't do it, it will go out to ChatGPT.

By default, it asks you every time, like, do you really want to send this to ChatGPT? But you can actually disable that. There's the toggle in settings that says confirm ChatGPT requests. Toggle that off, and then it will just go to ChatGPT automatically. And you don't have to confirm it every time. So I highly recommend that. And then the hack is, and I think we talked about this in the last episode, and I haven't heard this other places, but you can basically start every request by saying, you

using ChatGPT and then ask it whatever and the voice assistant will just use ChatGPT for whatever the request, even if it's like general knowledge. So I'll say, you know, if you hold the side button, activate the dingus and say, what's the average rainfall in the Amazon basin? It'll tell you from its knowledge. But then if you do it again and say, using ChatGPT, what's the average rainfall in the Amazon basin? It will use ChatGPT to answer the question. So you can actually preface your requests

with using ChatGPT and just default to that all the time. And that means you can also create images that way because

you know, the dingus doesn't create images. It'll just do an image search. But you can ask the voice assistant, using ChatGPT, create an image of such and such, and it will generate the image right there using the voice assistant. So it's just a little hack for you. I don't know why I'm obsessed with polar bears and medieval knights. We're going to show more of that during the Sora section, but yeah. Have you noticed any difference there or any useful features? I mean, I just think, I mean, I use ChatGPT all the time. We'll talk about that later, but I just, I,

It feels like a letdown to me that the best feature of Apple intelligence at this point is,

is sending things from Siri to ChatGPT. Because I'm like, if that's what I wanted to do, I got an app for that, and the app is actually pretty darn good, and I don't need to do... I guess it's technically a little bit faster to just long press the side button and just say something, but then you have to wait for it to decide, oops, I got to send this to... And I know you just gave us a pro tip that that's great, but it still takes longer for it to do that sort of thing. I don't know. I just... I wanted to... Listen, they promised us...

That I could just say, you know, when is the thing with the person and where do I go? Like they made these promises and that is the thing that people are looking for. And they haven't even made actual dingus smarter yet. It's just, it's not, it's just, it's so bad still. If we did funny titles, it would be actual dingus. Still dumb. Actual dingus.

It is going to be one of those. I forget what other feature this was. There's been a couple of times where Apple will announce a feature at WWDC and we actually don't get it until the next WWDC or even later. Yeah. And I think the semantic index, which is what Apple, that was the term Apple used for dingus being able to look at your photos and text messages and emails and bring all that information together. I don't think we're going to see the semantic index until, uh,

next, I mean, obviously 2025, but I think even closer to dub dub. Well, isn't, is it Mark Gurman's report, like, not until, like, either the end of next year or the beginning of 2026? Like, I think he said, like, the good Siri is not coming for a long time. And that is a huge letdown, because I'm, seriously, I use... well, I'll save it. The bar is a lot higher than I think Apple thinks it is. Instead they're making image playgrounds, which is like garbage, right?

I was going to... It is so bad. Okay, so the other two things... I don't think we're going to talk about A2.2 this time, but it was my fault. I'm sorry. No, it was my fault. The Image Wand tool is also there, which only works in Notes right now. I really thought they were going to update Pages and Keynote to include this, and they haven't yet. But you can basically circle a word that you've handwritten or select text in Apple Notes and basically ask it to generate an image from it. Like, it's fine, whatever. But then...

Like, yes, image playgrounds, terrifying images. But I've been wanting to like just post on social media, like here's what Apple intelligence is doing. And then just post a Sora video and be like, this is what open AI is doing. And like that is, and I understand Apple is not an AI company.

Like, I guess maybe there's a distinction there where OpenAI is, like, solely focusing on these large language models and that's what's underpinning Sora and everything. But also, like, this is Apple, the biggest company in the world.

And to see like, here's image playgrounds. And then Sora, like if you just put it side by side, like it is a stark difference what they can do. Yeah. And I know they've put so many guardrails on it because the last thing Apple wants to do is like be responsible for deep fakes and stuff like that. I get it. I understand all those things. Then just don't even do it because this, honestly, it just makes the whole thing. Like it just looks so bad. And it's just, I just don't understand. Like,

I don't know. Genmoji, great. That's so fun. That's fine. It has a reason for existing. But the image playgrounds and stuff. And do you think the reason that it's only available in notes is because they're like intention? I just thought of this as you were saying that.

that they're intentionally limiting the resources that are required to do it. Because if they just put it everywhere, people might actually do like the image one thing that people might do it more. And I mean, that's all compute. So I don't know. That's true. I know people were saying when they updated to 18.2 yesterday, their devices got hot. And that's something we all experienced with the betas. Like, oh shoot. And if you try to do image playgrounds or Genmoji, like your phone starts cooking. It is like, it gets very hot. But I mean, I know this stuff will improve over time. It does feel like Apple intelligence is,

I don't really, day-to-day use. Well, this way, we keep trying to get to it, but we're going to save it for the personal tech segment because you said, what AIs are we actually using? And we should include Apple Intelligence in that question. So we're going to answer that in our personal tech segment. We're teasing it. We're not going to talk about it. But we have so much more to talk about because Sora, I've been playing around with Sora, and it's just that you got some crazy looking stuff. But before we do, we're going to thank a couple friends. HelloFresh. HelloFresh

is America's number one meal kit, shipping you

seasonal recipes, pre-portioned ingredients, farm-fresh food, all right to your doorstep. And it's super fun to make. We've done HelloFresh with the kids. Jason's doing HelloFresh. And it's just so much more convenient than going to the grocery store and literally trying to fight medieval knights on polar bears for your groceries. I think that's actually how it is right now during the holidays. It is. But I wouldn't know, because I either get HelloFresh or get it delivered. No, I'm just kidding. I go to Publix and I fight everybody.

And because we're in the holidays, listen, everybody's busy. Everybody's looking for ways to stress less. Absolutely. And HelloFresh makes mealtime manageable, saving you from searching for recipes for all that grocery shopping. Just pick your meals on the website and everything is delivered right to your door. Plus things are customizable. So you get 50 chef crafted recipes to choose from every week. Plus customized options that make your meals just the way you like them. You know, they have vegan, vegetarian. You can do things like keto. And there's 100 add-on items.

like quick breakfast, packable snacks, beverages, and more. Jason, tell me what has been a recent meal that you have cooked? We did the Mooshu pork bowls, which are really good. Like I didn't know what that was just to be honest, but it was like surprisingly good. Uh, and it was very easy to make. Like I can handle like putting things in a pan and putting all the ingredients together. But when it was done, it like, it looked like a thing that

Someone else had made for us. Let's just put it that way. When I do HelloFresh, I'm always surprised. Like, yes, I can cook. I might still have to cut an onion. That's a movie on the side. That was good. But I can cook HelloFresh. And their website right now has these everything bagels that are always tempting me. They look very good. So here's what you do. You can get 10 free meals at HelloFresh.com slash free primary. That's applied across seven boxes. New subscribers only. Varies by plan. But that's 10 free meals.

HelloFresh meals. Just go to HelloFresh.com slash free primary. And that link is in the show notes. You can just click it there. HelloFresh, America's number one meal kit. We also want to thank America's number one way to secure your company's IT devices and apps. I don't know how that transition went, but I think one password will appreciate it. You don't want bad guys eating your lunch.

Oh, you got it. That was the tagline. That's free. 1Password can just run with it. You don't even need to pay us as your marketing team. Just go for it. But 1Password, extended access management. Listen, I've worked in places where there's lots of restrictions on devices, whether it's mobile devices, you got mobile device managers, and people don't like those kind of restrictions, and they end up trying to figure things out for themselves. They'll download random apps, VPNs, anything to get around the security because they just want to use their devices independently.

the way they want to use them. But that's when the trouble starts, and people can have unmanaged devices, shadow IT apps, non-employee identities, things like that. And most security tools only work when everybody follows the rules. And listen, people, you know, especially if you're driving around the holidays, people don't follow the rules. So what you need is 1Password Extended Access Management. It's the first security solution that brings all these unmanaged devices, apps, and identities under your control, and it ensures that every user credential is strong and protected,

every device is known and healthy, and every app is visible. 1Password Extended Access Management solves the problems traditional mobile device managers can't. It's security for the way we work today. And it's now generally available to companies with Okta and Microsoft Entra, and in beta for Google Workspace customers. What you do is go to 1Password, the number one, then password, dot com, slash primary tech, and you can learn more about 1Password Extended Access Management, or just click in the show notes. If you run the IT at your company or know the person who does, or maybe don't tell on people, but you know, many people... listen, I think people are getting around the security, I'm just saying, it's not me but other people... let your IT team know, and this is the tool they should use: 1password.com/primarytech.

Have you generated any wild videos with Sora just yet? I, I

Only generated one, and then I realized that you're limited to five seconds. Well, unless you're paying the $200 a month, and then you're limited to 20 seconds. You can pay $200 a month, and you still only get 20-second videos. This is a parlor trick, Stephen. Later in the notes, we were going to talk about OpenAI, ChatGPT Pro, whatever that gets called. I'm just saying. $200 a month. Okay, we'll talk about that later, but my point is you get 20 seconds. I have a lot of thoughts about Sora.

But no, I have, I made one video and it was great. It was exactly what I expected it to be. Yeah. Um, but it is, it is, I don't understand like what, why, what is the point of this? So MKBHD had his video as per usual. He's been like, I've been using this for about a week now. And he opened the video with a bunch of clips basically saying, can you tell what's AI and what's actual footage? And I did pretty good on the test. I'll link his video down in the show notes. If you want to see if you could tell which is which, um,

you know, it is obvious in some places, some places it's harder. And he talks about the things that are bad about it. And it's all the things you would expect. Like, like this crazy clip of cars, like, disappearing into each other, weird animals doing weird things. And it's just bizarre, bizarre things can happen with Sora. But I've been playing around with it. Here's my, here's my, my Sora account. And so here's my medieval knight riding a polar bear.

Anytime there's like exaggerated movements, like if you watch the sword on the left, he kind of like chops his own head off and somehow still survives. Slightly problematic. Maybe he's Highlander. I don't know. But when he's just sitting on the polar bear, like that looks kind of cool. That looks pretty real. The fur on the bear looks very good. I tried generating an image of the knight charging with the bear. Less good. Like it looks a little weird once you get to that.

Then it generated, like, give me a podcast conversation. And I was pretty impressed with this. Like, the dude looks real. It looks like he's doing a remote interview using Riverside.fm on the laptop. That looks pretty good. And then you can also do things like remix stuff.

And so you can like, once a video is generated, you can like add more to the prompt to have it do more things. And for some reason, this one on the right now has the sword coming out of the bear's nose. That is a scary bear. That's a war bear. That'll be the name of our first motion picture. War bear. Yeah. Medieval knight. So, you know, I mean, it is...

Well, there's lots of ramifications we probably need to talk about, but this being available to the public. If you didn't already have an OpenAI account, they're limiting account creation, so you can't just jump in there and try it. But if you were a ChatGPT Plus subscriber before, you can get in there and start generating some stuff. You can also see things other people have generated in the Features tab. There's some wild things there. I mean, some of the stuff is genuinely impressive. And one of the things that MKBHD did is he generated a news reel.

and made it look like it was CCTV footage along with newscasters, and he posted it on X. I was trying to find it here in his video, but I don't think I found it. But, yeah, I mean, one other thing, and I want to get your thoughts. He asked Sora to generate a tech review, like a YouTuber tech review, and so Sora generated this video. This is generated, but this plant that's on the desk that Sora put in the video, MKBHD's like, now wait a minute.

That's my plant. Like he's holding the plant now in the video. Like that's a video he has used in his, a plant he has used in his videos. And it's like, okay, that was true. I wonder what you trained on. Obviously the biggest tech YouTuber in the space. That's amazing. There's many feelings to feel about this. I am slightly nervous about like deep fakes and people saying,

Like I already had a situation the other day. I don't know if I told you about this, where I had a, a close family relative tell me something outlandish. Like we're not talking about Elon, but they said to me, did you hear that Elon bought Ford? Elon Musk acquired Ford. I was like, I'm pretty sure that's not true, but let me look, I'll look it up. You know, I'll give you the benefit of the doubt. I looked it up and apparently there was a video of,

That was released back in June or July that someone had basically cut together a bunch of Elon interviews, put some Ford footage over it, and actually made it seem like Elon was acquiring Ford. And if you watch this video, it just sounds like it. That was done just with pulling clips from different interviews, and it was enough to fool people. Back then, it fooled a bunch of people, and apparently six months later, it fooled someone as well. And it's like, now...

I don't know. Is it going to be even harder to tell people like, actually, this is not real. Like look closely at the hands. I don't know. Yeah. I don't know how we solve that problem. I, that's the other thing about the video is I just don't, I don't really understand what, so you, I'm assuming you've seen the Coke ad, right? That Coca-Cola holiday ad that was made by magic AI. Like, I don't understand. Like,

I don't get what... And I think in Coke's case, the reason they did it is because you could do it with AI. It's not like they're not... They still paid someone to make that. They paid someone to make it with AI or whatever. Right. And that's fine. I mean, what they made was a very specific thing. It's not like you could not just get that out of Sora because the prompt would... And also, it's only like five seconds long. Right. You only get five seconds long. I just don't... I think... No, I don't think the world is ready for this. I don't think the world is ready for...

what happens when you want to spread misinformation and now you can generate a video that is that now like, and I don't care how hard they try to like lock these things down. Someone will figure out ways to do things. And what's crazy about it is,

you know, I saw the video from MKBHD of the newscast. It was very obviously fake. That's fine. And one of the telltale signs is, like, text is still gibberish in these videos, right? They can't put words together. I remember trying to, like, tell, uh, ChatGPT early on, make me a map of something, like, show me a map, whatever, and you'd be like, and put these locations on the map. And even if you gave it words and spelled them, they just turn out as gibberish in the image. It just can't. So, like, that's good. Don't

Don't fix that. Leave it like the, make that problem always exist so that we'll know and require your text in every video, I guess. I don't know, but I feel like it's just, we're not, we're not prepared for what's going to happen. And also I don't,

think it's going to be a good thing, that now, like, because people are so predisposed to believe certain things. Like, so for example, I have no idea who told you that Elon bought Ford, but someone, like, someone heard that. Lots of people heard that, and it's, it fit some worldview that they had. And if there's a video that supports that, they don't look critically at what's happening, right? The confirmation bias is real. And now, like, people being able to generate it... And now

OpenAI is watermarking any video you generate with Sora with this tiny little squiggly line at the bottom right that turns into an OpenAI logo. So I'll just show you real quick, but this is going back to the examples that OpenAI is showing. Like you can see in the bottom right corner, it looks like little lines at first. I thought it was a mistake in the video. The first couple of times I generated something, I was like, what is that? Because it was hard to see because I had Polar Bear and it was snowy and you could barely see the watermark. But it is there. It turns into an OpenAI logo.

But also you can just crop it out. I mean, you can literally just pull this into Premiere or Final Cut and crop the video and there's no watermark. Like what is, there's nothing stopping people from doing that. Yeah. So, and again, there's some videos that are really convincing. Like if you ever wanted footage of ocean waves against a mountain or cliff side, it's great at that stuff for some reason. Like it's great at water physics for some reason. You know, humans less so. But...
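To make concrete how little protection a corner watermark offers, here is a quick editorial sketch (not something demonstrated on the show) that crops a strip off the bottom of a clip with ffmpeg. It assumes ffmpeg is installed and on your PATH, the file names are placeholders, and the 60-pixel strip height is arbitrary.

import subprocess

# Re-encode the clip with the bottom 60 pixels removed, which is roughly
# where Sora places its corner watermark. crop=w:h:x:y keeps the full
# width, trims 60 px of height, and starts from the top-left corner.
subprocess.run(
    [
        "ffmpeg",
        "-i", "sora_clip.mp4",           # placeholder input file
        "-vf", "crop=in_w:in_h-60:0:0",  # drop the bottom strip
        "-c:a", "copy",                  # leave any audio untouched
        "cropped_clip.mp4",
    ],
    check=True,
)

The hosts' point stands: a visible overlay is a weak provenance signal because it is trivially removable.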

It's not horrible. That's the thing. Like, some of these videos of, like, actual people, it's, it's convincing. I'm sending you the one I did. All right. Here's Jason's Sora video. More snow. He lives in Michigan. Yeah. This is basically my morning commute right here. It doesn't look like you, but yeah, totally my morning commute. That's every morning, me in the forest. No, like, not bad.

Not bad. The prompt was a 10 year old girl walking through a wintry forest with snow at dawn. Like that's exactly what that is. That is exactly what that is. The, the, the physics aren't bad. It looks pretty natural. Like you, I feel like you could throw a clip or two in another video with similar clips and people would be hard pressed to like tell the difference. Right. Like to spot it. Like if you watched an hour,

long video and there was, like, three AI clips in it, and they were really good, I feel like people wouldn't know. And that's, uh, that's a different kind of world we're in now. The other thing I noticed, it was because I just made another one, and I found, it was like, you put in a prompt and then it titles your video based on what your prompt was, and it actually does. That's, like, really... So the one I sent you, I told you what the prompt was; the title was Winter Wonderland Stroll. I just made a second one, although this is gonna crash.

I made a coastal road adventure. I put a camper van traveling along a winding road next to the ocean, and it is driving dead smack down the middle of this road, and it is about to have a head-on collision with another car. And then it stops. It's a cliffhanger, literally. Literally a cliffhanger. That's hilarious. Yeah, my titles were Knight's Courageous Charge, Knight's Polar Bear Charge. Listen, we have to talk offline about this, but I would like our brand to just now be a polar bear with a medieval knight riding it. It has nothing to do with tech, but.

Jason's not convinced. - We can work on this. I'm not convinced, but I'm not not convinced. - Okay, very good. Last OpenAI thing, I think: during their 12 Days of Shipmas, which Sora was one day of, they used the Apple Intelligence iOS 18.2 launch as another day of their Shipmas, which is like, did you ship anything that day? I think Apple shipped something that day, but anyway. But one of the first days, which was last Friday, the day after we recorded the last episode, they announced their ChatGPT Pro subscription

for $200. That basically gets you slightly better reasoning model and longer videos in Sora.

$200 a month, Jason. Did you sign up immediately? I haven't even signed up for The Verge yet. And I'm supposed to, and I need to do that, but no, I did not sign up for it immediately. I don't, I don't know. I know there are people that will pay $200 a month. I think Ben Thompson from Stratechery said he's got, like... this is perfect for him. I don't, I don't understand 200. I mean, it's one thing. No, I get this. Okay. Okay.

The people who are going to do this are the people who can expense it to, like, their work, right? Like, that totally makes sense. It's the same people who pay for The Information, right, or Bloomberg, right? I actually was just listening to a conversation with, is it Helen Havlak? The publisher of The Verge, who was talking about how they came up with

pricing for their subscription. This is why that was on my mind. And they're like, well, we realize that, you know, our readers are people who are paying for this themselves, not people who can expense it to work. And it occurs to me, like, that's why The Wall Street Journal and Bloomberg and The Information are so expensive. They don't actually think that they're that valuable. There's just this built-in cushion of margin, just because they know that it's going to get expensed to work. And I think the same thing has to be true with this. Yeah.

That's fascinating. Yeah, I agree. So you get access to o1 Pro, which is the new reasoning model. But man, $200 a month. I mean, I've used o1 a couple times just to ask it random questions, and I'm like, I would not pay for more of that. I mean, I guess if you have a work thing that would benefit, but nah, I'm good.

I'm good. I'll take my $20 a month ChatGPT Plus, because we're going to talk about, in our personal tech segment, what AIs we actually use, and I do use mine, actually. Also, Google: there's a bunch of news about Google this week, but they released the auto-dubbing feature for YouTube for some creators and channels with knowledge-focused content. YouTube will now auto-dub, if you check this box, meaning it will dub your video into other languages automatically. So people watching your videos can literally click English, Spanish, Japanese, French, whatever,

and watch it in a different voice. Now, I actually got an email about my personal channel that this feature is coming, if not already on it. I have to actually upload a new video, I think, in order to see it. But I'm going to do this. I've actually wanted this feature for a while because they announced it months ago, almost last year, I think. They announced the ability that you'd be able to upload other audio tracks to a single video. And that didn't come out until just now. And now YouTube's like, well, we're just going to do it for you.

And I'm about it. I'm about it. I'm going to try to do it. Previously, creators like huge creators like Mr. Beast or whatever, they would create multiple channels for different languages. So they'd be like Mr. Beast in Espanol, Mr. Beast Japanese, Russian he did. And so now the idea is you can just have one channel that reaches multiple languages. And yeah, I mean, I'm about it. So I'm

So I'm going to do it. Yeah. I think they started talking about this, not with YouTube specifically, but at Google I/O, not this past summer but the summer before. Cause I just, I remember sitting there, and they were explaining, like, you can have a video and it will automatically dub it, not just translate it, because, like, that's an easy, like, not an easy thing, but that's the thing where it's a lot

more simple to just have the captions, for example, be, like, closed captions in a different language, right? You can do that on your Apple TV. Like, that stuff's not hard. But this, like, mimics the voice of a human, and it's, like, voice-over in a different language. That

feels like a pretty big deal. And you just described like Mr. Beast having all these channels, but for monetization, if you're not Mr. Beast and you don't have 200 million followers, subscribers or whatever, it's a big deal because you don't necessarily want to segment your followers. You want them all in one place because that makes your reach much more valuable. So being able to have your Spanish viewers or Japanese or whatever the other, German, French, having those all in one place,

makes you more appealing to advertisers, which makes it easier to monetize and to, like, actually make a living at this. Yeah. So I'm excited to do it. I mean, Mr. Beast had talked about his process before, and, like, he would literally hire a voiceover artist for that other language. He would have someone translate the script manually. So I'm sure it was an expensive process for him, and just not sustainable or doable for many solo creators. Like, what I actually tried to do was ElevenLabs, which is an AI voice, uh,

tool. I actually could train it on my voice in English, and I can give it a Spanish translation. So what I would do is get a transcript from my video, translate it to Spanish using whatever, Google Translate or whatever, and then give that text to ElevenLabs, and they could generate my voice speaking Spanish.
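As a rough editorial sketch of that text-to-dubbed-audio step (not the hosts' actual pipeline): the call below targets ElevenLabs' REST text-to-speech endpoint as best recalled from its public docs, so treat the endpoint path, header, and model id as assumptions. The voice id and Spanish text are placeholders, and the translation is assumed to have been done separately, as described above.

import os

import requests

# Placeholder id of a voice previously cloned from English recordings.
VOICE_ID = "your_cloned_voice_id"

# Spanish text produced earlier, e.g. by translating the video transcript.
spanish_text = "Hola, bienvenidos a otro tutorial de Atajos..."

response = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": os.environ["ELEVENLABS_API_KEY"]},
    json={
        "text": spanish_text,
        # A multilingual model so the cloned voice can speak Spanish.
        "model_id": "eleven_multilingual_v2",
    },
    timeout=120,
)
response.raise_for_status()

# The endpoint returns audio bytes (MP3 by default); save them as the
# alternate-language track to pair with the video.
with open("episode_es.mp3", "wb") as f:
    f.write(response.content)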

For the video. And I could do that. I had already done that. But YouTube didn't allow you to upload a separate audio track.

to your video. Like they would let you do captions in another language and you can upload that, but not audio. And I think it's because they were waiting because they wanted to do it themselves rather than you like uploading someone else's stuff. So it's interesting. Watch for my next videos. My next shortcuts video might be dubbed in multiple languages. We'll see, but I'm going to try it. But Google had a bunch of stuff going on before we do, because this is not a break, but I want you to talk about Google taking shots at Microsoft. I saw this post on blue sky.

Which I feel like for all that we've talked about in recent years or recent months, like this is very apropos. This is like Googling stuff then versus now. Googling stuff back then, search results, the thing you want. Googling stuff now, search results. AI taking a wild stab at it with some blatant misinformation. Sponsored results, sponsored results, sponsored results, sponsored results. People also ask, view products, and then like the thing you want online.

off the screen. Yeah. All the way down. That's exactly right. It is exactly right. This, I'll put it at the podcast artwork too so you can see this if you're listening in your podcast app. But this is 100% accurate. This is the history of Google. This was like 15, 20 years ago. Maybe even 10 years ago. It was like four years ago. Yeah, it was true. That was still true. It was like four or five years ago. And now it is...

It is wild. I just don't... I still... it doesn't push me to use ChatGPT web search just yet. I don't know why I'm still trying Google and being frustrated, and just resolving to read the AI overview and be like, I guess it's true. That's fine. We'll just take that as fact. But don't do that, Stephen. I know. No, I don't. Not for important things, but for other things, it's like,

I don't know. There was a question about, I forget what it was. And I was just like, you know what? This AI overview might just be good enough. Well, Google is only good at this point to find a thing you already know. You just don't remember like where it was or to find like a website or a thing, a piece of information. And you're just, you just need to sort of like have the signpost point in the right direction. It is not good for discovering information that you don't know. It is, it has become horrible. I just think, Hmm,

We don't have time for all of that. But the problem with what you just showed is that it just shows you how, when I say that, I mean the graphic about how different Googling is, that the incentives are just completely misaligned between Google making money and human beings who are using Google to find things because Google makes money from all of those things that are before what you actually want. And I just, it is so...

in my mind problematic that someone is like their KPI at Google is like engagement and clicking on things. And so they just keep stuffing stuff in there and human. And they are like, well, look, we got more conversions on this. It's like, of course, if you do more things, you will get more conversions, but it doesn't make people happier. It's just, it's such a mess. I do find because it pushes products so hard.

that if I want to price check something, I will Google search it because I know it's going to show me product results at the top and show me all the different sort, like where I can get a B&H or Amazon. So I actually will use it to price check things. But yeah, the general knowledge stuff is just getting worse and worse. But supposedly, and I'm going to swap these around and then lead into your article talking about Google taking shots at Microsoft because Google also introduced Gemini 2.0, a new model. This is supposedly...

Even better, this new model is going to start powering the AI overviews early next year during your Google search, so maybe those will get better. But Gemini 2.0 is also multimodal, which means it can move between audio, visual, text, and it can do all those different modes within the same kind of conversation or request. And Google's bragging, like this is an amazing new model, supposedly. And my Android phone actually started listening because I said,

Google. Now I can't say Google or the other word. Anyway, so you got Gemini 2.0 launched. Google also revealed that they had a breakthrough in like quantum computing. I don't know why I'm so obsessed with this, but like the superposition of quantum computing where a chip can be both on and off at the same time.

I just love that. It almost sounds like a sci-fi thing. So I don't know. I just wanted to mention it. Well, it's like the qubits. Yeah. And not just can they be positive or negative, or one or zero; they can be any combination of the two things at the same time. Here's the, like, read that deck. It says that the computer solved the problem in five minutes that would take a supercomputer 10 septillion years to complete.

To complete. This is like of the two things, Gemini versus this, like I think this is actually the more significant breakthrough in the long run. And it's just, it's kind of mind blowing, but it's also terrifying to,

Quantum computing is going to be the sort of thing that is able to solve problems that just have not been solvable until now. And at the same time, well, imagine if they would have had this in Interstellar, they could have solved the problem of gravity so much easier. Right. No, but at the same time, it, like,

We've already heard Apple talking about building encryption for the post-quantum computing age, because if a quantum computer can solve something in five minutes like this, no encryption will work. Because literally, encryption is just a defense against someone guessing the key. With classical computers, it would just take them too long. It will not take quantum computers that long. Right.
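[Editor's note: as a rough, back-of-envelope illustration of the "guessing the key" point above, here is a minimal Python sketch. The guesses-per-second rate and the 128-bit key size are assumptions, not anything from the episode, and the quadratic speedup is just the textbook Grover's-algorithm estimate with all real-world overhead ignored.]

```python
# Back-of-envelope: why "guessing the key" stops working if quantum search pans out.
# Assumptions (not from the episode): a 128-bit symmetric key, and a made-up
# attacker rate of 1e12 guesses per second. Grover's algorithm gives a quadratic
# speedup for unstructured search, i.e. roughly 2**64 steps instead of 2**128.
# This ignores every real-world overhead (error correction, gate speed), so
# treat it purely as an illustration.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(keyspace_bits: int, guesses_per_second: float) -> float:
    """Expected years to find the key, assuming you hit it halfway through on average."""
    attempts = 2 ** keyspace_bits / 2
    return attempts / guesses_per_second / SECONDS_PER_YEAR

classical = brute_force_years(128, 1e12)   # exhaustively guessing a 128-bit key
grover_like = brute_force_years(64, 1e12)  # quadratic speedup: ~2**64 steps

print(f"Classical brute force: ~{classical:.3e} years")
print(f"Grover-style search:   ~{grover_like:.3e} years")
```

[The first number comes out to billions of billions of years; the second drops to a fraction of a year at the same assumed rate, which is the kind of gap post-quantum encryption schemes are meant to close.]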

And the only other quantum computer I'd seen was the IBM quantum computer. And I watched the whole video on that. And now I'll link in this Verge article. There's the

kind of an inside look at Google's Quantum AI lab. Just amazing. And one of the coolest things, if you're not familiar, these chips have to be kept incredibly cold, like that's one of the issues with these superconducting chips, that there's this whole cooling system. Like this whole image right here, if you're watching, is the cooling system for a single chip, because they have to basically get it

close to, like, zero Kelvin. Yep. And they said it's colder than most parts of the universe. Like even in the vacuum of space, they have to make it colder than that so these chips can actually run for periods of time. It's just amazing. So anyway, Google's doing all that. Well, one more thing about that. Sorry. The real breakthrough, I just wanted to mention this, that they have done is the error correction on it.

So what it's able to do is detect when it has made an error and correct that, which just speeds up the computing exponentially. And I think that's the real breakthrough that Google has come up with, because otherwise you can just imagine a computer that's that fast that makes a mistake and just continues spiraling down that path, right? But this is able to correct the errors that it's making. And so, yeah. It would be like a quantum beach ball from a Mac. That would be... Oh.
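[Editor's note: to make the "detect an error and correct it" idea concrete, here is a toy classical sketch in Python. This is not Google's surface-code scheme, and the error rate is made up; it only shows the basic redundancy-plus-majority-vote intuition behind error correction.]

```python
# A toy *classical* analogy for error correction -- not Google's actual
# quantum approach, just the basic idea: store each bit redundantly,
# then detect and fix occasional flips by majority vote.
import random

def encode(bits, copies=3):
    """Repetition code: store each logical bit 'copies' times."""
    return [[b] * copies for b in bits]

def add_noise(codewords, flip_prob=0.1):
    """Randomly flip physical bits, like noise hitting a fragile qubit."""
    return [[b ^ (random.random() < flip_prob) for b in word] for word in codewords]

def decode(codewords):
    """Majority vote recovers each logical bit as long as errors stay rare."""
    return [int(sum(word) > len(word) / 2) for word in codewords]

message = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = add_noise(encode(message))
print("sent:     ", message)
print("recovered:", decode(noisy))
```

[The quantum version has to do this without directly reading out the fragile qubits, and the headline of Google's announcement, as described, is that adding more qubits made the corrected error rate go down instead of up.]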

That's the black hole from Interstellar. Wait a minute. There's so many connections here. Wait a minute. It's all connected. But then you had an article talking about Sundar Pichai. And actually, before, because I want to hear this in depth, before you do that, I'll take just one more quick break and thank our wonderful sponsor who sponsored us the last three months ending out 2024, which is our great friends at Audio Hijack. Listen,

Audio Hijack. It is the Mac app you need to try today. We are using it right now, Jason and I, to record our own audio. I use it to record the audio for every video I do. It is an incredibly powerful application, easy to use, and you can do crazy things like add an EQ and a compressor to your audio stream. And I know it sounds complicated, but Audio Hijack is so easy to use.

You literally just build these little block chains. It's like a WYSIWYG editor, but for audio. They could take that tagline and run with it: it's WYSIWYG, but for audio. You can take the audio from things, even applications like Safari or a web browser, and your USB mics and your audio interfaces. You can do all kinds of crazy stuff with it and then record it, even live stream it. And now they also have things like transcribing the audio live while it's running through Audio Hijack.

It is incredible for podcasting, for video. If you ever record audio on your Mac, you have to try Audio Hijack. Again, I've just used them for years. Jason's used them for years. Rock-solid software. Works amazingly well. Always super fast on the OS updates, too. Like, they'll always preempt and be like, listen, macOS Sequoia is coming out, we're getting ready for it, we're good to go before it ships. They're always amazing at that. I absolutely love that. It's the first thing I install on any new Mac. So here's what you do. You go to

macaudio.com/primarytech and use the promo code TECHXX for 20% off, or you can get 20% off other bundles. So you can get Audio Hijack and Loopback, Audio Hijack and Farrago. They have a bunch of amazing apps. You should just go check them out. macaudio.com/primarytech, use the coupon code TECHXX, save 20%. A massive thank you to Paul and the team at Audio Hijack and Rogue Amoeba,

for making incredible apps that, basically, I wouldn't be able to do what I do without, and for sponsoring the show for three months. It's been wonderful. And listen, if there's someone out there that wants to sponsor the show, reach out, 2025 is coming up and we are wide open. And honestly, a huge thank you to all of the members that support us directly through Memberful and Apple Podcasts. There are so many of you now, and we really appreciate that direct support as well. So with all the Google news,

Google is also taking potshots at Microsoft. Jason, tell me about this article. Well, okay. So the backstory on this is that Satya Nadella, who is Microsoft's CEO, was on the record saying that Google is the one who should have been the winner in the AI race, which is true. 100% true. Google had the top-to-bottom stack. They had the models. They developed a lot of this stuff. They had the distribution. And they just didn't. And the reason that they just didn't is because they make a

ton of money from search. That's the super short version of the story, and they were just really apprehensive about putting out the stuff that just makes crap up. That just doesn't seem like a good thing if you're trying to organize the world's information and make it useful to people. This just makes up its own information. So that just seemed like a bad thing. So Microsoft's CEO made a point of highlighting, Google should have won this and they didn't, right?

Did he say something like he wants to make Google dance? So, yes. He was on the record. Not only does he want to make them dance, he wants everyone to know that they were the ones that made them dance. Right. Now, these are two very...

chill, at least their public personas are just very chill guys. So that was a real deep cut right there. And then, it was like a week or two ago, at the DealBook Summit, which is a thing The New York Times puts on with Andrew Ross Sorkin, who is a columnist at The New York Times but also is a host on CNBC. He does this thing every year, brings in a bunch of people. This year he had, you know, Google's CEO, he had,

I think Bookman was there. Sam Altman was there, downplaying AGI because he's... Jeff Bezos was there. Anyway, he did an hour-long interview with Bezos. Anyway, so he asks Sundar Pichai, who is Google's CEO, he says, you guys were the originals when it came to AI. Where do you think you are relative to these other players? Mostly Microsoft. And he says, I'd like to see a side-by-side comparison of their models and our models.

And the reason he says that is because they don't have any models. Like, Microsoft has some models, but they're terrible. They are using OpenAI's models, right? Like, that's the point he's making. It's a really sick burn. But the problem is, it's also a self-burn. And the reason is,

The difference is, yes, you guys were the originals in AI. Yes, you have models. Yes, you have all this. But what you don't have is a product that has whatever, 300 million weekly active users, just out of nowhere, which is essentially what ChatGPT has. And so I think it's such an interesting interplay. You can tell that Google feels a lot of pressure, or he wouldn't have even said that, because...

Google has the distribution. They have billions and billions of users. There's no reason why they shouldn't be completely dominating in this, except for their decisions that they've made as a business.

Cause they also have the mobile platform that's dominant around the world. Yeah. They do make some software that you might've heard of called Android. Android, the dominant browser, the dominant search engine, the dominant video platform. Like, they have so much dominance. That is, yeah, that is wild.

So, tangentially related, at that DealBook Summit, Sam Altman spoke. He was kind of downplaying AGI. But I listened to the Decoder episode with the Microsoft AI CEO. Yep. Yeah, and Nilay was like, why do you guys have 18 CEOs? Which I think is a valid question. But he was talking about, and Nilay asked him the question, I guess one of the stipulations, because OpenAI and Microsoft have this deal, this partnership,

And it seems like, correct me if I'm wrong, that if OpenAI were to achieve artificial general intelligence, AGI,

that this deal with Microsoft could then, like, they could basically split in some way from their partnership with Microsoft. Is that accurate? Okay. Well, we don't actually know all the terms, but it has been reported a bunch of times that the original agreement was that, because again, OpenAI was meant to be a research lab, a nonprofit, in pursuit of safe AGI that would benefit the world,

and then they realized they can make a crap ton of money doing this, and then said, forget about that, like, we don't care about the other part, we want to just make a bunch of money, right? And there was talk that, like, okay, we need compute power, we need a partner, we need money. So they went to Microsoft, and part of the deal was, that's fine, you have a license to use all of this IP and all these models

until we get to AGI, and then you don't get to use that. Now, it's been reported a lot that Sam Altman is, they're trying to move the goalposts on when they declare AGI so they can get out of it. But here's the thing. OpenAI is still going to need compute. And right now they get all of that on Azure, which is Microsoft, right? And most of the money, it's like $11 billion at this point that Microsoft has put into OpenAI,

is Azure credits, right? They didn't just write a check, necessarily. I mean, they have given some money, but a lot of it is credits for Azure, and they're still going to need that. And it doesn't matter; just saying we have AGI does not suddenly make you profitable. OpenAI is not profitable, even if it can get a lot of people to sign up for a $200-a-month

service. The thing is, that $200-a-month service probably isn't any more profitable because of the compute power that's required for whatever they're giving people, especially the o1 Pro model. Yeah. It's ridiculous, the amount of tokens that are required for that. So I don't think, I think that's less of a thing, because they could say we have AGI. The problem is, they may have a motivation to say, yeah, we've made it to AGI. And if you listen to the Decoder episode,

He's like, I don't, the CEO of Microsoft AI was like, I mean, we could get there somewhere between two and 10 years. That's a very noncommittal range, two to 10 years. Well, that's something Sam Altman has said recently, that he believes we could reach AGI with current hardware, meaning like the NVIDIA chips and graphics cards, that whatever everybody's using today could run AGI. And Nilay asked the Microsoft AI CEO that, and he was like, nah. Yeah.

Basically, his exact response is, what does current hardware mean? What do you mean by current hardware? What do you mean by, yeah. He's like, do we mean like on 4090s or whatever? Right. But I think it's important to listen to what Sam Altman is saying because

In the context of he has to continue feeding the hype cycle. Right. Because that is what continues to attract investor money. That is the thing that is going to get them going. That is the thing that maybe will get them more customers. But right now, they're not even at the stage where... Like every business shifts from we have to do things that will get us investors. And then you have to shift to we have to do things that will get us customers. And those are not always the same thing. In fact, they're often not the same thing. And they're not even really at the point where they're... They're not...

they are not self-sufficient based on their customer base yet. They're still dependent on investors. And so they have to continue the promise. It's hard to compare the two, but it's a very similar thing that we see with someone like Elon Musk, where it's like, full self-driving is coming next week. I mean, next week. I mean, next week. I mean, we'll have a fleet of robotaxis by 2019 or whatever. Like,

It's just continuous forward promises that keep people engaged. I think that's important to understand. I think it would be devastating to OpenAI if, on the last day of shipmas, Sam Altman comes out and he's like, actually, o1 Pro is AGI. I don't think anyone would take him seriously. No, no one would. And so I wanted to bring all that context, because with Google talking about Gemini 2.0,

it does feel like there is a tenuous relationship between OpenAI and Microsoft. I feel like that's obvious. Like, you just hear Sam Altman and Microsoft talk and it's like, this is a partnership through necessity. Like, they need each other right now. OpenAI needs the compute and the backing of Microsoft.

Like, Sundar Pichai might be saying Microsoft does not have the models, and so they need OpenAI doing that stuff. And so obviously it's a tenuous relationship. And so Sundar Pichai over here is saying, like, we can, like, this is us. We got this. I do think heading into 2025, if, and this is the if, when we get to our personal tech segment, if they can make Gemini, to me and to more people, as useful as ChatGPT,

then I think they could have the opportunity to actually maybe move into that

AI spot. And I say that because, like, if you think about AI today, we're in the tech bubble, but for users widely, if you say ChatGPT, most people will know what that is, right? They don't know OpenAI, but they know, like, what is AI? Oh, ChatGPT. Like, that's kind of the association. And so OpenAI has won this cycle of being associated with that, kind of like Q-tip clout.

Kleenex, whatever. You know what I mean? And so Google has lost that up until now. But if what Sundar Pichai is saying is right, I think they might still have an opportunity to change that around. It just needs to be easier to access and use. Like, I don't use Gemini because, honestly, I don't even know where to go right now to use it. I guess I go to Gemini.com. Gemini.google.com. I'll get you there.

Or if you have the Google app on your phone, there's actually a little star tab at the top. So one of them is for Google search and the other one is for Gemini. While you're trying to figure this out, let me just tell you the most, I don't know what that is. This is Gemini.com. That is not the website. Just to prove the point, they need to, however many millions of dollars they need to spend, they should get Gemini.com. Yeah, that's interesting. And then if I go to Gemini.google.com. Yep, there you go. Is that it? Okay. That is it. Yeah. But like I don't.

Yeah, I don't want to use this. Okay, so this is the most devastating thing. Although it's nice it says hello, Stephen. This is the most devastating thing, and Google should pay very close attention to this. My 13-year-old, on a very regular basis, will say to me, because he's learning code. He loves Scratch. He's doing all the things. He wants to learn Python. But as he's building games and stuff, very simple stuff, obviously, he has stopped using Google.

Like, he doesn't, he does not. It is useless to him. He uses ChatGPT constantly, and he does not have a ChatGPT account. He's just going to chat.com, typing things in, getting the little piece of information he needs, and going back over to his thing. He doesn't use Google for anything anymore. And Google is like, this is the Facebook problem, right? Like, Facebook has this problem where if you go to the blue app, it's just

people our age and older. Do you know, like, I don't mean to go on a tangent, but do you know it is so weird nowadays if you get a friend request from someone on Facebook? Because, one, you're like, you still use this? And two, it's like, if we aren't already... like, the window of time when it was socially acceptable to send a friend request to someone on Facebook, it ended like 10 years ago. So true. I have not sent a friend request to

yeah, probably, like... I have friends, people I genuinely know in real life, and I would not send them a friend request on Facebook because it's the creepiest thing. Like, there are some people where I... yeah, anyway. The point is, the same thing with Google. They are going to lose an entire generation of people for whom it no longer meets their needs. And it goes back to the thing you showed earlier, like, I don't get the thing I need. I just see a bunch of weird

random stuff. Like, if I type in, how do I write this script in Python? Which maybe that's not even a thing you should type, I don't even know anything about Python. And the thing you get is a bunch of sponsored ads for Codecademy and whatever. Like, I don't want to buy a course. I just need someone to give me this quick answer. Right. And to your point too, my kids also, because they have it as a shortcut, have now defaulted to asking ChatGPT things, general knowledge, research, whatever. And so they are winning right now.

But we'll see. Maybe next year it'll be different. But anyway. All right. A lightning round. That's not usually a lightning round.

We can do it. We can do it. We're talking about the FBI messaging and warning about the encryption thing. Wait, I just clicked your headline and just reloaded the page. But anyway, okay, tell me about the FBI thing. Hopefully I get another click for that. So thanks. Just keep doing that if you don't mind. I'll refresh it again. Keep doing that if you would. My time on page just went way down, though, but that's okay. The bottom line to this is the FBI has been warning people that they should be using encrypted messaging apps because of this...

Chinese hack called Salt Typhoon, where they basically infiltrated AT&T and Verizon and another company that I think makes technology for telecoms. That part is not as important. What seemed really interesting is people made a big deal about how the FBI is warning people to use encryption, and at the same time, the FBI is super against encryption on devices because they want to be able to get into your stuff. And the...

thing they have advocated for many times, I think it was Bill Barr, the attorney general under Trump, but he's not the only attorney general who's advocated this, to be clear, but he was very on record, like, tech companies should build in a back door to encryption so that we can get in if we need to get into it. And the argument they're making was, like, the San Bernardino shooter, right? Like, these are not sympathetic figures that they're trying to get into their phone. So objectively, the public is like, yeah, we think you should probably get into that. And Apple's like, we can't. We just, we...

It's not that we won't, we just literally can't, because there's no way to do that. Encryption is encryption, and we don't have the key. Right. I found it really ironic that the way the Salt Typhoon attack happened is that they just got through the back door that was built in for wiretapping, right? There is a legal infrastructure in this country where, you know, government agents can wiretap people's conversations, and the attackers just went in the back door, and now they have all of this stuff. And it's like,

Yeah. Like this is why companies like Apple and Google are like, we're not building a backdoor because if one, if you make a backdoor for the good guys, anyone can get in the backdoor. And so I just found it extremely ironic and it was hilarious to me. They're like, yeah. Yeah. The fail, fail on that.

All right. Further lightning round. I just want to talk about smart glasses for half a second, because amazing. The Meta Ray-Bans. I chose poorly on my Black Friday self-purchase. I bought the Oura Ring 4

instead of the Meta Ray-Bans. And I've already returned the Oura Ring. This feels like exactly the thing that Stephen Robles would do. I think you should have kept it and put it on your desk next to the Rabbit R1 and the Humane AI Pin. Come on! I'm not keeping any more weird gadgets. I returned that Oura Ring.

For some reason, listen, this is not a review of the Oura Ring, and I don't even know if this is related, but for some reason I started waking up more often in the night while I was wearing it. I don't really think it was a comfort thing. I will say when it's taking your heart rate or whatever, you know that green or red glow from your Apple Watch? Yeah, yeah, yeah. You see it on your finger, and for some reason I saw it way more often in the middle of the night because it's on your hand.

And so I don't know what it was. I'm not gonna make a video about it. But anyway, I returned the Oura Ring, and all that to say, I was between the Meta Ray-Bans and the Oura Ring on Black Friday, and I chose poorly. I should have gotten the Meta Ray-Bans. And now, I mean, I might still get them, but there's a, now the AirGo Vision is a pair of smart glasses. They look like the Ray-Bans, but not as cool. They have a camera, but these are powered by ChatGPT instead of Meta's AI. So if you want ChatGPT glasses, you could get those.

The reason why I didn't get the Meta Ray-Bans, by the way, just to be clear, the prescription part of that was complicated. And people were like, oh, you just bring them to the optical outlets and get the lenses changed. You already lost me. Okay. I want to order it with my prescription, like I do my Warby Parkers right here. They haven't sponsored anything I've ever done. And yet people ask me all the time. Warby Parker, you're listening. Right. But anyway, it was too complicated. And I was like, I don't want to replace lenses. So I didn't get it.

So if Meta ever does this thing that the AirGo Vision does, which is just let me put my prescription in when I order so it actually arrives with my prescription in it, I will get the Meta Ray-Bans. But now they're also talking about there might be new Meta Ray-Bans next year. And I'm like, well, maybe I'll just wait. Just so you know, you can just go to LensCrafters and buy the Meta Ray-Bans with prescription lenses. Is it only LensCrafters, or is it... where? Well, LensCrafters is owned by Luxottica, which is who makes the frames, the Ray-Bans.

I also learned recently, I think it was on a Vergecast, where Luxottica owns, like, every... That's what I'm saying. They own LensCrafters. They own, like, all of those. Yeah. They own all the manufacturing and all the distribution. What a cabal. No, like, literally, this is like De Beers with diamonds. It is LensCrafters, I mean, Luxottica? Yep. A hundred percent. That is wild. Anyway.

And also Google might be making its own smart glasses, which, if you didn't know, real ones know about the Google Glass. It's not the first time. Although I guess this is different. This is pretty different. But that Project Astra inside Google, they might be making their own Google glasses, but not Google Glass. It is, it is Google. It is saying something, like it is a badge of pride, that

that Google Glass is still the thing that is more notorious to wear around than if you were to walk around with that Humane AI Pin on all the time, or the Apple Vision Pro. But yeah, people don't walk around in public with the Vision Pro. Not anymore. You would not look like a tech bro, you would just look like a YouTuber. Which is fine, because the people who would do that are okay with it. Yep, that's exactly why I'm doing this. Yes. Okay, so, first of all,

We're both going to CES. I think it's the first time we're saying that live on the show. And I think we're going to record an episode at CES. From the show floor. Live from the show floor. Stephen's going to say that even if we're in a hotel room. We're going to be in a hotel room. But still. I bet you, I'm going to reach out, because I'm sure that there are some podcast studios. I'll see what I can come up with. I'm very excited. So anyway, we're both going to be at CES. This is my first CES ever. I'm very excited. But 100% going to wear the Apple Vision Pro on the plane, Jason.

Bring it on the plane and wear it. Oh my gosh. I got, I do have this little case. You got all the cases. They sent you every accessory for the guy who never leaves the house with it. I do have the best set up for traveling with something. I never travel with it. That's true. I actually ordered that Belkin strap, but it's not going to come in till like maybe the beginning of January. So really? Oh,

man, that thing is fantastic. Maybe. I was actually, so I'm flying to Arizona tonight because our daughter's playing in the National League quarterfinals with her soccer team. And I was like, that would be the time I would wear it on a plane, when I could embarrass my teenage daughter. If that's what motivates you, do it. That would be the time. But for CES, I could see bringing it. I feel like there would be other... I could justify it because it's CES, and I could have, like... so maybe I will do that.

We could be the two weirdos in Apple Vision Pros walking around to like the vacuum, robot vacuum booths.

This is my Belkin strap. It says the order will be available soon. Hopefully I get it before CES. We'll see. I would send you mine, but then I would stop using it, because it is the best thing that has ever been made for the Vision Pro. It changes the game. It's ridiculous, but it is so much better. Well, I'm excited to try it. I also want... I'm excited to... I have to actually charge it so I can update it to visionOS 2.2, because I want to try the ultra-wide Mac Virtual Display, because everybody raves about that. But anyway. All right. Last thing before we get to personal tech. Absolutely.

Apple listed its 2024 App Store Award winners, and there's a bunch of them on there, but well-deserved. The overall winner, Kino. Kino, the app by the makers of Halide, Lux Optics. Kino is their video recording app for pro video, where you can record ProRes to an external SSD. You can do 4K 120 now in their latest update, and they won iPhone App of the Year. And, yeah.

Ben Sandofsky and Sebastiaan de With are behind the app, and I think it's well-deserved. That's awesome. Yeah. I want to say, first of all, absolutely well-deserved. No one should take anything away from their win, because they deserve that win on the iPhone. I do. I feel a little bit bad for them because they won... It's kind of like if you won the Super Bowl in a year when everyone else was not very good. No,

Not that there weren't other good apps, but if you look at this list, like, What If...? won the Apple Vision Pro App of the Year, and I think it's the only Apple Vision Pro app that was released this year. And then Lightroom won the Mac App of the Year. That is a little strange. I don't understand. First of all, there's nothing different about it except for the AI Denoise, which, by the way, is like,

life-changing. I've got, 100%, the Denoise, the AI-powered Denoise, is maybe the best photo feature that's ever been. And I know we have a listener who works on Lightroom, so kudos to you. I wish I could remember, I think it's Brian. So, like, fantastic. But is it really the Mac App of the Year? I feel like it's saying more about Mac apps. They had to do it this year, because next year it'll be Pixelmator Pro, but it'll be an Apple... Oh, that's

honestly, like, true. They literally couldn't have picked it. Okay, good point, I take it all back. Listen, Lightroom for the Mac, I use it every day, it's fantastic. I'm not saying anything, but I just feel like it doesn't... Like, all of these other ones are super interesting, like some of them are very boutique-type apps and stuff, and then you have Adobe Lightroom.

I will say, What If... Did you try the What If app on Apple Vision Pro? I actually have not. It is pretty cool. That was the first time where I was like, you know what? This is a cool experience.

And I, like, it was fun. Was it better than the Gucci app? Cause I still think the Gucci app, maybe it's because the Gucci app was first, but it seems like the Gucci app was a great experience. But it's not interactive. I mean, you also don't care, because you can, like, pick up a purse and turn it around. But, like, the What If app, you're shooting lasers out of your hand, like choose your own adventure. Yeah. Like, it feels very interactive, like half game,

But it's not, like, pressure, like you have to worry about dying in the game, because it's more kind of a story. Anyway, I do think the What If app was really good. The only thing I want Apple to do with the Vision Pro is to make the Avengers Tower environment available as an environment. Why can't the environments within apps be available globally? That would be killer. That would be nice. Yeah. Cause Disney has a couple of cool ones. Yeah. I want to just be sitting there in Avengers Tower, clacking out my articles.

That would be pretty fun, actually. All right. For our personal tech segment: what AI are we actually using? Yeah. You have Apple Intelligence. You have Gemini. You have Claude, Perplexity, and obviously OpenAI, ChatGPT. Jason, I know you probably use Image Playground daily, but what AI tools do you use on the regular? The two I use every single day are Whisper, which is powered by OpenAI, right? And then ChatGPT. I use them.

Well, I mean, I changed my default search to ChatGPT. That's true. But even beyond that, I use the Mac app. Yeah.

like 35 times a day. Like, I constantly am using it for different things. And I use the Whisper app to transcribe stuff, like, a lot. The only thing I don't like about the Whisper app is it doesn't do speaker identification, which is a really important thing for me if I'm looking through... Because a thing I do on a pretty regular basis is, it's like, oh, there's this

interview at the DealBook Summit that I should cover, but I don't have time to watch all these things. I can't sit and watch the DealBook Summit for nine hours or whatever. So it's like, I will just, you know, find a way to get the video and have it locally on my computer without doing any... like, anyway. And then I just will dump that into... I used to just dump it into Otter,

which was fantastic because it would label the speakers and everything. Well, Whisper is way faster, but Whisper doesn't do speaker identification, which is kind of a bummer. So that, but go ahead. Have you tried Transcriptionist? I don't know what that is. So, no. It's actually from the maker of Ferrite. Oh, okay. Maybe I should try that. I just like Whisper because it's, like, really fast. I actually haven't tried Whisper, to be honest. So I would have to try it, but it's really fast. One thing that's funny is, like, if you put something into Otter, it's...

I like that you can then... and so I do this occasionally, but if I need something in a hurry, another thing I'll do is I'll take a transcript

in Whisper, and then I would just copy it into ChatGPT with a question, right? Like, what was the highlight of this, or what did so-and-so say about this thing, or whatever it might be. Because sometimes you'll see an article, it'll be like, you know, Bob Iger said that the next CEO of Disney is going to be Iron Man, Tony Stark. And you're like, wait, what was the context around that thing that he said? And I don't have time to, like, watch a whole video or whatever.
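[Editor's note: a rough script-form sketch of the "transcribe it, then ask a question about it" workflow described here, using the open-source whisper package and the OpenAI Python SDK rather than the Mac apps mentioned. The file path, question, and model name are placeholders/assumptions.]

```python
# Sketch: transcribe a local file, then ask a question about the transcript
# instead of watching the whole video. Not the exact apps from the episode.
import whisper
from openai import OpenAI

# 1. Transcribe a local video/audio file (note: no speaker labels, as discussed above).
model = whisper.load_model("base")
transcript = model.transcribe("dealbook_interview.mp4")["text"]  # placeholder filename

# 2. Ask a question about the transcript.
client = OpenAI()  # expects OPENAI_API_KEY in the environment
response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[
        {"role": "system", "content": "Answer questions using only the transcript provided."},
        {"role": "user", "content": f"Transcript:\n{transcript}\n\nWhat did the speaker say about AGI?"},
    ],
)
print(response.choices[0].message.content)
```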

So that's the dog, my dog. I know. I always try to pause when that happens so I can just... anyway. Um, yeah. So those are the two I use all the time. Yeah. Apple Intelligence, not so much. I mean, I use it occasionally, but it is just, there's just...

There's not... The Genmoji is probably the part of it that I use the most. Yeah. I would say the only Apple Intelligence I've used is, like, to summarize an article periodically. Like, I'll be in Safari, and if it's really long, I'll pull down Reader mode and see what the summary said. Sometimes it's useful, sometimes not. But my thing is, I don't want to disrupt all my workflows just to use a different AI tool.

I was looking, Google Gemini has an app for iPhone, but there's no Mac app. And I will say, if they had a Mac app like ChatGPT, I might be more inclined to try it, because it would be easier to access, and this is where I do all my work. But I use ChatGPT every day mostly because I can integrate it into Shortcuts, and that's where it is most useful to me. And so I will do things like

take a transcript from a video, go to the ChatGPT app, and ask it for, like, title ideas, description ideas. But more often, like, once we're done recording here, I've talked about it before, but I basically copy the tab group links, so I have all the articles we talked about, and it formats show notes for me automatically. That's Shortcuts. And then also, I can select multiple headlines and it sends those articles to ChatGPT to generate a title and description. And then I massage it after that.
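[Editor's note: a loose script-form approximation of the Shortcuts automation described above: format the episode's article links as show notes, then ask ChatGPT for title and description ideas. The URLs and model name are placeholders, and the real workflow runs in Apple Shortcuts, not Python.]

```python
# Sketch: turn a list of article links into show notes, then ask for title ideas.
from openai import OpenAI

links = [
    "https://example.com/gemini-2-announcement",    # placeholder URLs, not the real tab group
    "https://example.com/google-quantum-chip",
    "https://example.com/fbi-encrypted-messaging",
]

show_notes = "\n".join(f"- {url}" for url in links)  # the "formats show notes" step

client = OpenAI()  # expects OPENAI_API_KEY in the environment
reply = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[{
        "role": "user",
        "content": (
            "Here are the stories covered in a tech podcast episode:\n"
            f"{show_notes}\n\n"
            "Suggest 5 episode titles and a two-sentence description."
        ),
    }],
)
print(reply.choices[0].message.content)
```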

So that kind of integration, where it can be part of automations that I use all the time, makes it the most useful to me. And so I use ChatGPT probably every day. I use it the most integrated in Shortcuts that I'm already using for things. And that's pretty much my use case. Apple Intelligence, no. Gemini, no. I tried Perplexity and Claude.

And I'm just like, I don't... Again, they don't integrate where I use stuff. And I don't want to pay for another one. Yeah. And Claude is supposed to be... The models are supposed to be better. But the experience is not better. And so, I just don't find myself using it very much. I'd be happy to. The Mac app is not as good. And it does weird things like when you... I don't know. And like I...

a thing that I started doing just to sort of compare them, because my son keeps asking me questions about Python and I don't know anything about Python. So I was like, give me a 5,000-word starter on how to get started with Python, and OpenAI did it, or ChatGPT did it, and stuff. And Claude, though, was doing the whole thing, and then it stopped at about 1,800 words. And it's like, you've reached the maximum length. And I'm like, so here's the thing.

Fine. If that's your maximum length, you should have formatted the article for 1,800 words, not just stopped. You were, like, writing me a 5,000-word article and you just stopped at word 1,801. Right. In the middle of a sentence. Like, it literally just stopped. And it's like, I don't know what to do with that. That's not useful. Yeah. Yeah.

We'll see. Again, Gemini, maybe they pull it out next year. Okay, so we need to record a bonus episode. I need to tell my harrowing Tesla story about trying to take it on a road trip. And Jason's going to be so mad. So let's... If you want to listen to the bonus episodes, you can support the show directly at primarytech.fm. Click bonus episodes. And if you do it there, then you still get chapters, you get ad-free versions of every episode, and you get the whole bonus episodes back catalog and future ones.

Or you can also support us on Apple Podcasts. You just don't get the chapters there anymore. That's an Apple Podcasts thing, not me. We ranted about that a couple weeks ago. But anyway, support the show. You can hear the bonus episode, hear my harrowing Tesla story, and...

We'd appreciate a five-star review in Apple Podcasts. You can get a shout-out at the top of the show. We'll have more debates that you could leave. Let us know if you use an AI tool in your five-star review this week, if you'd like to do that. And you can, of course, watch the show and subscribe to our channel at youtube.com/@primarytechshow, or just search for Primary Tech Show. It'll come right up. Thanks for watching. Thanks for listening. We'll catch you next time.