Support for Waveform comes from AT&T. What's it like to get the new iPhone 16 Pro with AT&T NextUp anytime? It's like when you first light up the grill and think of all the mouth-watering possibilities. Learn how to get the new iPhone 16 Pro with Apple Intelligence on them and the latest iPhone every year with AT&T NextUp anytime. AT&T, connecting changes everything.
Apple Intelligence coming fall 2024 with Siri and device language set to US English. Some features and languages will be coming over the next year. Zero-dollar offer may not be available on future iPhones. NextUp Anytime features may be discontinued at any time. Subject to change; additional terms, fees, and restrictions apply. See att.com slash iPhone for details. Support for the show comes from Anthropic.
It's time to meet Claude, the AI assistant from Anthropic that can transform how your organization works. Imagine if your whole team had an expert collaborator who knows your company inside and out. With Claude, you can empower every person at your organization with AI that's both powerful and protected. According to Anthropic, they'll never train their models on your company conversations and content so you can securely connect your company knowledge and empower every employee with expert level support from engineering to marketing. Transform your organization's productivity and visit anthropic.com slash enterprise.
What is up, people of the internet? Welcome back to another episode of the Waveform Podcast. We're your hosts. I'm Marques. I'm Andrew. And I'm David. And this week, we've got a mysterious product that we want to talk about. It's sort of a weird one, but it exists in real life, and you've already seen it if you're watching this podcast. Trust me. We've also got Google Pixel leaks. Surprise, surprise. I know that doesn't happen very much. And also, we'll wrap it up with Google finally announcing some of its plans to challenge Microsoft and ChatGPT.
But first of all, can we talk about the moon for a second? And how the moon landing was fake? You still believe in the moon? Oh. Just kidding. I think the whole internet collectively had like a wave of this Samsung moon photo story. It kind of just washed over the internet for like a week. I made a video sort of breaking it down, explaining what's happening. To rewind a little bit,
Galaxy S23 Ultra has 100x Space Zoom. And it's not the first one. I think the S21 Ultra was the first one to have it, or the S20 Ultra? The S20 Ultra was the first one to have it. It was called Space Zoom. Yeah, Space Zoom. And so Space Zoom...
You know, interesting name, but the idea is you can zoom in 100x to things really, really far away. And what's further away than space? I want to take pictures of things in space. And in some of their earliest and most interesting advertising, zoom in on the moon, take a picture of the moon. Cool.
So that's what people have been doing for years. And I think in the past, we realized that Huawei, they were accused of faking their moon photos, where people would zoom in really far on the moon on some Huawei phones, and they'd take a photo of a blurry moon. And then somehow their photo would be really detailed, and we'd be like, what's going on here? And that had its old wave. That already washed over the internet. Yeah, that was the P30 series, which was like the first phone with that periscope camera. And so they made a huge deal that you could actually get that close to the moon.
Yeah. And there was a huge controversy over whether they were faking it or not. And there's this whole section of the internet that's like, look at how bad the iPhone's camera is. It's just a blob on the iPhone. Yeah. And so that comparison always happens. So now Samsung has this thing happen, where I think maybe a month ago I made a short zooming in. We had a full moon. I zoomed all the way in. I just posted the clip on Twitter because it was impressive. And it had like 20 million views the next day. Like, it kind of blew up.
And in that, I sort of said, well, at least they're not doing the thing that Huawei got accused of doing, of like overlaying the moon. But now they are getting accused of that exact thing. So what's interesting is we had a Reddit user do an experiment because he was sort of thinking that maybe Samsung was enhancing their moon photos a little bit. So he took a printout of a photo of the moon.
And actually, no, not a printout. I think it was just on a monitor. You know, put it across the room, zoom all the way in to 50x, 100x, and take a picture of the image of the moon on a screen and...
It works. It looks like the moon. Then he takes a blurry image of the moon, takes a picture of that, and the end result is dramatically sharpened, with much more detail than the original image could ever have had, which leads us all to believe whatever's going on here, it's fake detail. Samsung is doing something. Turns out they've been doing this for generations, since like the S10 or something like that. There's a whole page of documentation on Samsung's site that talks about this.
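The logic of that Reddit experiment can be sketched in a few lines: a heavy blur wipes out high-frequency information, so any crisp detail in the final shot has to come from somewhere other than the source image. This is a toy illustration only, not Samsung's actual pipeline; the function names are made up, and a 1-D signal stands in for one row of a moon photo.

```python
# Toy illustration of why the blurred-moon test is convincing:
# a Gaussian blur destroys high-frequency detail, so sharp detail
# in the phone's output must come from a learned prior, not the scene.
import numpy as np

def gaussian_blur_1d(signal, sigma=3.0):
    """Blur a 1-D signal with a normalized Gaussian kernel."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-(x ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

def high_freq_energy(signal):
    """Energy in the upper half of the spectrum: a proxy for 'detail'."""
    spectrum = np.abs(np.fft.rfft(signal))
    return float(np.sum(spectrum[len(spectrum) // 2:] ** 2))

rng = np.random.default_rng(0)
moon_row = rng.random(256)          # stand-in for one row of a moon photo
blurred = gaussian_blur_1d(moon_row)

# The "photo of a blurry photo" has almost no high-frequency energy left,
# so a detailed output cannot have been recovered from it alone.
assert high_freq_energy(blurred) < 0.05 * high_freq_energy(moon_row)
```

In other words, once that energy is gone it is mathematically unrecoverable from the image itself, which is exactly why detail appearing anyway points to a learned model filling it in.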
But I always thought it was funny, because we always talk about computational photography, and I did the video on the iPhone's camera, and I did another video called Smartphone Cameras Versus Reality. Smartphone cameras, they just edit reality. They just edit all the time. And this one particular thing people really seem to get attached to, which is moon photos, which is, as far as I can tell, not a very common picture to take. But it's such an obvious use case where people can notice that it's, quote, faking the image of the moon.
It's with AI, et cetera. It's an interesting story. I just felt like that was a fun wave of internet. I think it's not common in the sense that we don't see everyone taking it, but it still winds up being, if you think about all the pictures taken of a singular thing in the entire world, it might be the most common because if you're thinking of people, you still have to think of them as individual people. So it does have this opportunity to be the one thing that everyone in the world can see and take a picture of. Is the moon the most photographed thing?
Of all time? Singular thing? I don't know. I mean, it's funny because it's tidally locked, right? So it's going to look the exact same from all angles of the earth on every single night. Prove it.
I could if you want me to. And so everyone's images are going to look vaguely the same. It's like those things happen on Instagram sometimes where people take almost the exact same photo and then they accuse each other of like stealing each other's images and they're slightly different, but they're almost exactly the same. Like the moon is the one thing that's going to look the same to everyone. So if you were going to run a quote unquote scene optimizer on the moon, it would be the easiest thing to do it on.
Yeah. And Samsung actually ended up releasing a statement this morning, after we put our video out yesterday, or two days ago, or whenever we put that out: "After multi-frame processing has taken place, Galaxy Camera further harnesses Scene Optimizer's deep learning-based AI detail enhancement" (that's a lot of words) "engine to effectively eliminate remaining noise and enhance the image details even further." Which is a lot of words that mean very little. Yeah. Engine. Engine.
They have this diagram of the pipeline of how it apparently works, and it just shows, like, learning data, low-resolution moon photo, and then it's just, like, black box, neural engine. It's just some squiggly lines. Yeah, beep boop. And it comes out and it's like...
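The pipeline that diagram describes (a multi-frame merge, then a scene-conditional enhancement step) can be sketched roughly like this. Everything here is hypothetical: the function names, the detector, and the blending step are invented for illustration, not Samsung's actual engine.

```python
# Hypothetical sketch of a "scene optimizer" style pipeline:
# merge a burst of frames, detect the scene, and only then blend in
# detail from a learned prior. Names are invented for illustration.
import numpy as np

def multi_frame_merge(frames):
    """Average a burst of frames to reduce noise (the multi-frame step)."""
    return np.mean(np.stack(frames), axis=0)

def looks_like_moon(image):
    """Toy scene detector: a small bright blob on a mostly dark frame."""
    bright = image > 0.8
    return bright.mean() > 0.001 and image.mean() < 0.2

def enhance(image, learned_prior):
    """Blend in detail from a learned prior: the contested step."""
    return np.clip(0.6 * image + 0.4 * learned_prior, 0.0, 1.0)

def scene_optimizer(frames, learned_prior):
    merged = multi_frame_merge(frames)
    if looks_like_moon(merged):
        return enhance(merged, learned_prior)
    return merged
```

The point of structuring it this way is the one made above: the output is still derived from your source frames (a red moon stays red, an airplane stays in the shot), but extra detail is mixed in from something the model learned, not something the camera saw.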
They never... I don't know. They sometimes deny things, but they sort of use big words to say very little. I think the only thing that I can definitely tell, which is somewhat interesting, is that most people functionally understand it as: camera sees moon, camera recognizes moon, camera takes picture of moon, puts it on top of my picture. Like, just an overlay. Yeah. And that's not actually what's happening. That would be... I don't want to say what that would be. That would just be, like, more brute force. There was an old...
I think I saw, like, a Vivo phone that wasn't zooming in on the moon, but if it was a picture of the dark sky and there was the moon, it was straight up putting a fake moon in there, and way closer. I'll try and find that photo again. But it wasn't the zoom into the moon; it just randomly would enhance it. So this one's a little more interesting. We kind of talked about DALL-E when it came out, how, you know, generative AI can create a photo from static and find details in however much you want it to find details.
This one is like, if I take a picture of the moon and it's kind of red tonight, it will still enhance it and find all the details of what it knows the moon should look like because Scene Optimizer is on, but it also will still be kind of red because it is still using my source image and running it through the AI. Oh, they thought they could get away with that? They called it Supermoon mode, so I think you were kind of expecting it to be that fake. Oh, it's a mode. Yeah, yeah, yeah. I guess this one's very different. Oh, my God.
So for the audio listeners, basically the image on the left, which doesn't have the optimizer on or whatever, is just like kind of like a glowing white, small glowing white dot. And then the resulting image after the photo was taken is a giant moon. Yeah. It's also a one times photo down like an alleyway. So you're seeing building and then in the sky, it's a small portion of it, but it's still like...
quadrupling the size of the moon. - That one's a little more... - That's hilarious. - Yeah, it's not doing that. But I guess the point is there are tests where you can put a bunch of pixels, or let's say an airplane flies in front of the moon, or a satellite, or something happens where your moon actually happens to look a little bit different; your photo, through AI, will still have that distinguishing detail. So it's not just an overlay. It is actually running it through this engine, this black box of AI, whatever Samsung's doing.
And so that's what's happening. I just think it's funny that we talk about this so much, but there's no articles about how, like, I don't know, you take... Your face? Yeah, the moon is, let's say it's one of the most commonly photographed items. Take another thing like Stonehenge, like a commonly photographed item. Everyone who takes a photo of it with Samsung's Scene Optimizer turned on, all of their grass will be greener in all of their pictures. All of their skies will be bluer. And that's not a...
particularly surprising fact, it's just like the way these phones and their AI process images. They're editing these things all the time. We're just a little more reactive when it's the moon for some reason. Well, it's a particular object versus like a general kind of grass and like scene optimizer versus object optimizer is like a little bit different, I guess. But what happens?
When... like, take somewhere like New York City, right, where we now know of, like, six decades of pretty constant photography in a pretty limited area. Like, at what point can a company like Apple or Google, that has access to these giant photo sets with location data, just start to know what everything looks like from every angle? And then does that not sort of blur the line between scene optimizer and object optimizer? Like if it was like, oh, I've seen 10,000 pictures from this exact intersection, I know that at 2 p.m. the light hits this building and makes it look crazy, so I'm just going to, like...
Do that. Yeah. Yeah. Yeah. I think when I think about that, it's the what is a photo question. There's photos of objects and photos of like people and scenes. Yeah. And I think if like you have a billion images of, say, the Empire State Building from 42nd Street and someone takes a photo from that spot, just like everybody else did, your phone recognizes it and sharpens up the windows and does the thing that it knows it's supposed to look like. I wouldn't be mad because that's
actually reality is what you're getting closer to. But where I would get kind of weirded out is if it's like a group of my friends around the campfire and then the phone's like, oh, I know what your friend's face around the campfire is supposed to look like and changes it. Like, I want to remember the moment, not the object. Does that make sense? I was going to say, like, at one point, your phone basically becomes a search engine. Like, that...
example with 42nd Street. Yeah, we talked about that. If you took a photo of 42nd Street, but it's like, oh, it's a cloudy day, but this actually looks the best when the sun is hitting it because it's 7 p.m., and so we're just going to overlay what it looked like at 7 p.m. in someone else's photo. Oh, okay.
Your phone becomes a search engine at that point. It's searching Google Images and then just basically kind of overlaying. Luminar AI, like giving you a sky replacement. Yeah, like when you take a photo, it's less about taking the photo and more about searching a database and showing people that you are at this location that exists. I think it should always aim to improve the photo towards reality, not towards, like, better-looking photos.
If that makes sense. Like, if I take a photo and it's cloudy, I don't want it to add golden hour. I want it to be more accurate, because it knows what cloudy is supposed to look like, and I'm actually in a cloudy scene.
So I don't want it to enhance it and make it look better or different. Because we talked about this with the iPhone thing where it's like it lights your face evenly, even though there's no light on your face. Remember how terrible that picture of me was? Yeah, at night. And we're like, that doesn't look right because that's not what real life looks like. I would always want it to enhance towards natural, towards reality. And I think it gets kind of weird. Like the moon one is technically, technically enhancing it to be more accurate.
Right, I guess it's like it's making your photo better. Everyone's photo of the moon more or less should look the same, because we all see the same moon. I just don't want them to get to, like, enhancing it and making it look different. Yeah. Counterpoint: the majority of people buying these phones do want... I was just gonna say, from a marketing perspective, they absolutely want your photo to look better. I want my face to look better in photos. If it's a cloudy day and it could be golden hour in front of the Empire State Building, they're gonna make it golden hour
in front of the Empire State Building. - This is really interesting. You've seen this TikTok face filter thing that's going around? Have you guys seen this? - I don't have TikTok 'cause I'm a boomer. - Oh, wait, wait, which one? - Okay, so there's a new face, there's a new glamor filter in TikTok that is actually used, it's not an overlay anymore. It's a little different. It's actually generative AI. - Oh no. - And so it looks at your face. And so you know how on Snapchat, if you cover half your face, it glitches out and disappears? On this one, you can obscure parts of your face and it still perfectly applies the rest of the filter. It's called the glamor filter.
And people are universally weirded out by this. Like everyone's using it and trying it and getting weirded out. And that's the theme is because it looks like a super enhanced version of you. So I don't think people actually want things to look like super enhanced. It's a little weird. They just want it to look...
Accurate. In one month, no one will be weirded out and everyone will be using it. I think people do want to super enhance. In two months, everyone will just be wearing AR glasses and we'll just see it applied to everyone. The makeup industry will collapse and it'll just be AI filters. AR makeup. Are you trying the glamour filter? I'm doing it right now. The glamour filter? I'm pretty hot. I mean, you're just saying. Do I have the glamour filter on, Maya? Wait, I want to screen record this. I just have the glamour filter in real life.
- It really does, it makes my eyebrows look super clean, like I just got them done. - And cover one of your eyes with your hand and it just goes, oh yeah, it's fine. - He's got the white thing in front of him. - Alice looks like a model. That's so interesting. Very weird. - Get ready for our mental health crisis to get even worse. That's all I'm saying. - Of course it's TikTok that pulls this off.
It's TikTok, so it's like Snapchat does the overlay stuff and they'll probably start doing AI generative filters too. Do people want that? I don't know. Yes, they do. I feel like two years. Yes, they do. And then it will be horrible for society in about a year. Do they want a subtly enhanced version or do they just want the golden hour? They want the crazy. 110%. Yeah. Really? Yeah.
That's not the reaction it's getting right now. Online dating is going to suck soon. I think the reason it's getting that reaction is because people are like...
Being like... people will be reactive about this kind of technology, but at the end of the day, people are going to use it every single day. Even if you just take one step back, like to the photos we're taking, though: every single photo we take of our face is getting... like, remember the Pixel 6, how oversharpened it was? That's not much different than this moon shot. Like, yeah, clarity 100. That's what the moon shot really feels like it's doing, and that's what it's doing to our faces. Or it's smoothing it, so then all of the wrinkles are going away. Like, I prefer... Similar, yeah. I guess those are in opposite directions, though. The sharpening is theoretically more realistic, and the smoothening is less realistic. Smooth. Smoothening. Smooth. Smoothing. Yeah, the smooth... smoothing... the smooth... the smoothie is less realistic.
I mean sharpening is not necessarily more realistic. It gets to the point of less realistic. It just depends on like what level of sharpness and contrast our eyes see. Right. It's all based on like... I think our eyes see much more than a camera typically sees. So whenever you can enhance what a camera sees to get more of the detail that your eyes see. Like when you look up at the moon in real life, you see the craters and you see the moon. So when you point the camera at the moon because you want to capture the moon...
You wish it wasn't a blob. You wish you were getting what your eyes were seeing. So I think I like, I personally want to get more of what my eyes are seeing. For sure. Ideally. Yeah. As a tool, I think it's kind of cool because zoom in general on phones is pretty terrible. So like these super zoom into the moon doing that is cool. I hope it can get to the point where if I'm actually zooming in and trying to be like, I saw a grizzly bear at this park and like. Yeah.
I don't want to get anywhere close to it, but now it's just this brown blur on my phone. If they can enhance that, it'll be pretty cool. Don't overlay a bear on it. It's going to overlay a bear eating salmon, because that's what bears do. In the middle of the woods with no water nearby. Yeah. So now we're just making generative art instead of taking pictures. It's just DALL-E. Sick.
Yeah, soon we'll just get the DALL-E phone, the OpenAI phone, where it texts all your friends for you because it knows what you would text like. It takes pictures for you because it knows what your pictures would look like. I have a lot to say about that later in the episode, so stay tuned. You guys might have been wondering earlier when I talked about this crazy, wacky product that you've already seen if you're watching the podcast. Oh, I forgot that we brought that up.
Is this it? It's right here. It's on my wrist. If you're an audio listener, this would be a pretty good time to switch to video, because this is going to be hard to explain or demonstrate. Eric...
Sorry, Derek. I don't know. Thanks, Derek. Who's Derek? Who's Derek? My bad, Derek. He's my favorite. He is someone, he's a friend of mine who admitted that he only listens to the audio version of the podcast. Derek. Derek. I mean, a lot of people do that. No, yeah, they do. I see you guys. You're fine. It's a podcast. I get it. Have we explained what it is yet? No, but I want to. Okay. I'll try for audio people. I'll try. I'm wearing a Huawei watch, smartwatch, and it looks like a pretty normal watch. It's a thicker one. And if you were wearing it like me, you'd feel it's a little heavier than normal.
But it's a smart watch. It's got a nice big circular display. It's got metal all the way around. It's got one button on the side and it's got a metal watch strap and oh my God, it opens up. You already forgot how to open? And if you can see that, there's going to be a Dope Tech video probably soon by the time this goes up that also shows these. But you open it up and there's earbuds inside and you just pop them out.
I was going to put them in my ear, as if I could put them on over my headphones. But these little tiny bullet capsules... there is, like, a size... I could swallow that. It's the size of an actual pill. Like, Mac is underneath this right now; classically, he would think that's food. He would think that's food. So these sort of magnetize into the top, and they snap back down into the watch.
I had very low expectations for this, and it's because my theory about 2-in-1 devices, and I'm sure many people have experienced this, is when you try to combine two different categories of products into one product, something's got to be worse. Usually both. Like when you get a 2-in-1 laptop and it's trying to be a tablet and a laptop,
It's usually not a great tablet or a great laptop, just being honest, but it's two-in-one, so that's cool. This is kind of along the same lines. The smartwatch loses some battery. It still gets...
like, pretty good battery life. What is pretty good? What have you seen? So the Huawei Watch GT claims, like, 10 to 14 days of battery. I don't believe that at all. Maybe we don't believe it, but this claims three days of battery life. So it's worse than the normal one, but it's usable. And I'm on day two right now. I haven't done a whole lot on it, so I'm at 82 percent, because it's not connected to the phone, but it's tracking my heart rate, tracking my calories, tracking my miles, all that stuff. It's just acting like a normal watch. Fine. Um, the earbuds...
They suffer, because they have to draw battery and charge from the smartwatch, so that's going to take battery from the smartwatch. But they're also tiny. They sound, like, really bad. They sound not great. They have active noise cancellation; you can't really tell. It's not that good. And then the controls: you kind of have to tap this very tiny area. And it's supposed to also be able to respond to, like, you tapping your temple or the side of your head. Very hit or miss. Three-hour battery life
Before you have to recharge it. So to me, it's like, it's fine. It's like the convenience factor of you being in the airport and you get a phone call and you just want to grab it real quick. You just pop an earbud out, put one of them in your ear and you're on the phone just like that. That's awesome. You're never going to lose your earbud case because guess what? You're wearing it. Mm hmm.
That's cool. But I'm not about to sit down and listen and watch a movie on these earbuds. This is not for the long-term, high-quality listening experience. This is just a convenience play. And so for that, it was better than I expected, but I would still never get them.
Also, they have to be used on Huawei phones. This particular one, yeah. It has to. It's a Huawei product that pairs with a Huawei phone. Can we compare thickness here? We've got three pretty... This is weird for audio listeners. Sorry, audio listeners. We'll say it out loud. We have an Apple Watch Ultra, a Garmin Epix Gen 2, and then... What is it? Huawei? Huawei Watch Buds. Watch Buds. So it's funny because the watch is called Watch Buds. Yeah. It's the Watch Buds. It's...
It's about the same thickness as your Garmin. It's actually a little bit chunkier. I would say with the bottom of it, no. It's thicker. It's thicker. It's thicker. Yeah. But it's got the shape of a normal Huawei smartwatch. It doesn't look crazy on my wrist. It looks like a normal big... It looks big, yeah. It looks like a big premium smartwatch on my wrist. It doesn't look out of place. I'm actually kind of impressed that it doesn't look ridiculous. And it opens up and there's earbuds inside and you would never...
think that looking at them. I actually kind of want to do one of those, one of those videos. I always want to do these videos where you go into New York City and you hand someone a tech product and you're like,
Tell me about it. What do you think? And I would do that with these, and I don't think anyone would think there's earbuds inside. Nobody would think there's earbuds inside. - Why would you? - Well, yeah. Like, I've wanted to do that video with ProMotion: give people, like, an iPhone, one with ProMotion on and one with ProMotion off, and be like, swipe around. How do you feel about the screen? And just see if anybody notices. 'Cause I always have this theory that people notice as soon as they figure out what's happening.
I don't think anyone would notice. People have run those tests on high refresh rate versus low refresh rate and only a few people notice. Interesting. Yeah. It's like 30% of people notice. John did the like PPI on the XR and the 10R and the 10. Yeah. And like,
It was very even. That one... Like a lot of people didn't really. Yeah. I just want... I like to experiment with the framing of it. Like if you just hand them the two phones and say, notice anything, you probably won't get anything. But if you say, swipe back and forth a little bit, does one of them feel different swiping back and forth? Then you start to get people going, oh yeah, wait a second, this one feels smoother. And then they can't unsee it anymore and it kind of... That effect happens. That's a fun project I've wanted to do and just haven't done yet. But this one...
Nobody would suspect a thing. It's just normal. You're James Bond. It's fun. It is a very 007 kind of fun gadget. Oh my god, it's just boring. Is James Bond the good guy or a bad guy? I think he's a hero. I might be wrong. We're going to take a quick break, but before we do, we should do trivia.
Trivia. Okay. So we spoke about the Huawei watch buds. Sure did. But we want to throw it back to the OGs in the game. Yeah. So before Google acquired Fitbit, Fitbit acquired a company called Pebble.
Sure did. They had a very impressive Kickstarter launch. They raised about $10 million in like a couple months. What year did that happen in? The Kickstarter? Yeah, the Kickstarter. The actual Kickstarter funding. Wow. Okay, I remember some details around my life and I'm just going to have to work backwards. That's what I'm trying to do. Yeah. I'm going to guess. Okay, perfect. Solid. We'll do the answers at the end of the pod like usual. Until then, we'll be right back.
Support for Waveform comes from AT&T. What does it feel like to get the new iPhone 16 Pro with AT&T next up anytime? It's like when you first pick up those tongs and you know you're the one running the grill. It's indescribable. It's like something you've never felt before. All the mouthwatering anticipation of new possibilities, whether that's making the perfect cheeseburger or treating your family to a grilled baked potato, which you know will forever change the way they look at potatoes.
With AT&T NextUp Anytime, you can feel this way again and again. Learn how to get the new iPhone 16 Pro with Apple Intelligence on them and the latest iPhone every year with AT&T NextUp Anytime. AT&T, connecting changes everything. Apple Intelligence coming fall 2024 with Siri and device language set to US English. Some features and languages will be coming over the next year. Zero dollar offer may not be available on future iPhones. NextUp Anytime features may be discontinued at any time. Subject to change, additional fees, terms and restrictions apply. See att.com slash iPhone for details.
This episode is brought to you by Google Gemini. With the Gemini app, you can talk live and have a real-time conversation with an AI assistant. It's great for all kinds of things, like if you want to practice for an upcoming interview, ask for advice on things to do in a new city, or brainstorm creative ideas. And by the way, this script was actually read by Gemini. Download the Gemini app for iOS and Android today. Must be 18 plus to use Gemini Live.
All right, we're back. Let's talk about... this is kind of a rare story in tech. This almost never happens, but we actually have a little bit of leaked information from a new Pixel phone before it's even announced. This is crazy. This is wild. We don't get to... This is like... I try not to pay attention to leaks. I want it to be a surprise, and Google stuff is almost always a complete surprise when it comes out. They go on stage, they go, this is the Pixel, and I go, dang, you did nice. So...
That's all sarcasm. This might be a bigger story than the yellow iPhone. It might be. So this time we just got a little taste of the Pixel 7a. Just a little breadcrumb. And the breadcrumb is someone has the entire phone already and took high resolution, detailed photos of every angle of it and put them online. Yeah.
And booted it up. And booted it up and has lots of interesting things to say about it. Fun. Okay, so we knew there was going to be a 7A because we've had the 5A and the 6A, and this looks kind of like the 6A again. But it's the 7A. The details are kind of just that it's going to have a lot of the same specs that we expected. It's two 12-megapixel cameras.
a single SIM, no headphone jack, flat screen, the metal bar camera on the back. If you look at these photos, they kind of have like this really subtle pattern. I don't know. How do we feel about this pattern? I kind of think it looks like when you take a case off that has adhesive and it just like
left a residue on the phone in a pattern, but maybe that's part of the phone. I think it kind of looks like the S you used to draw in high school or in like middle school in that one corner. I kind of like it. I think it like adds a little to it. Um, I think like since the regular models usually have the two tone, like,
The A series is always one solid color through, so it adds a little something different that differentiates from the top model. But yeah, it's kind of nice. Okay. Case slapped on it immediately. Yeah, that's a fact. But okay, two improvements, two main improvements why you would be interested in this new 7A. One is they're saying that there's an option for a 90 hertz screen.
which I assume is like a software setting that they found. Yeah. Like, it uses additional battery, but would get smoother. So that's... I mean, I'm interested. I think it's big also because the 6a was the one that had the 90-hertz panel but no option, and people were hacking ways to enable it. So it looks like we're actually getting a default option for 90 hertz this time. Which I think is why... maybe there's a little bit of a bigger battery or something, I don't know, but maybe this one has 90 hertz, which is cool. High refresh rate on this level of a phone is becoming more and more common. The other thing: it's had a glass back, but now it finally, allegedly, has 5-watt wireless charging.
Five watt wireless charging, it's not fast, but it is wireless charging. It's better than nothing. And so again, in a world where it's like in budget phones, you either just cut corners and nix features or like maybe barely include a feature. Like you'll see a phone with like four cameras on the back. You're like, wow, multi cameras. And they're like, one of them's a macro, one of them's a black and white camera, and one of them's a depth sensor. And you're like, you didn't have to spend that money. I don't want any of those. This one is like, some people really like wireless charging.
In a world of people buying the Pixel 7a, I feel like a lot of them are coming from a phone that didn't have wireless charging and it'll be nice and convenient to have it. Yeah. Just have it. Yeah. Trickle charge overnight. Yeah. Fine. I'm totally fine with slow wireless charging, if I'm being honest. Really? Yeah. Because most of my wireless charging is done over a long period of time. And I can do it so often because I can just pick it up so easy. Yeah.
I don't know. I don't really care. Mine is not done over a long period of time. So like fast wireless charging is really nice for me. Um, cause I, I don't really plug my phone in physically anymore at all ever. And I don't always remember to put on the charger at nighttime. Um,
Oh, that sounds like a you problem. Well, I usually do, but sometimes I'm like, you know, using my phone and I just pass out on the couch or something. I just forget to plug it in. So fast charging. Yeah. Fast charging would be nice. But like you said, so here's a real world example. My sister wants a new phone, but she only wants small phones.
She bought a Pixel 6 and returned it because she hated how big it was, even the regular 6, because she has girl pants, and girl pants don't have pockets. I asked her if a Pixel 6a would work for her, because it's a smaller device, a little smaller, and she doesn't really care about literally anything except for the fact that it's a Pixel. But she said, oh, I'm used to having wireless charging and I need wireless charging. Oh, that's specific. Okay, so it needs to be small and needs to have wireless charging. Yeah, I was about to go straight to Zenfone. Yeah, I also told her Zenfone. Yeah, um
But she's really hooked on Pixels. OK. Because Zenfone is the closest thing you can get to a Pixel, but it does not have wireless charging. OK. So I mean, I'm assuming pricing is going to be about the same as last--
Pixel 6a, Pixel 5a. It was, like, $350, something like that? I think it was more. I think it was $399. I thought the 6a went up a little bit from the 5a, right? I mean, if they add 90 hertz and add 5-watt wireless charging, I wouldn't be shocked if it was at the top of what they were doing last time. Yeah, $449, yeah. Oh, $449. Yeah, it is now at $300. It's on sale for $300. Okay, see, at $449 you start to get phones that have, like... because this is going to have a Tensor chip... you start to get phones that have like
90... yeah, 90 hertz, sometimes 120 hertz, sometimes already wireless charging. Yeah, this is mid-range territory. Even with these two new options, I still think it should be $449. If they're just going to keep bumping the price up $30 or $50 every time, then it's just going to be the 6 soon, or, you know, the 7, or the low-end 8. I think the Pixel 4a was the best A phone ever, and I think it started at, like,
$349 or $259. They were, like, $350 at some point, right? Yeah, they started at $350. I think it was $349, and it was, like, such a good phone. That was one of my favorite phones ever. The 4a, yeah. It was so good. It was so good. Yeah. Well, we'll keep an eye on that. Now, believe it or not, there is actually more.
There's more Pixel leaks. I know this is crazy. This never happens. Pixel Fold, we've sort of been like anticipating for a while, thinking Google's going to do a folding phone every year. It's like, is this the year? And then the next year we're like, okay, this might be the year. It seems like 2023 is the year we're going to get a folding Pixel phone.
And there are some renders of it by people who have sources that seem to think it's going to look kind of like an Oppo Find N. So it'll be, if you remember the N, it's a slightly shorter and more passport shaped folding phone. So the screen on the front is, I think, more usable for one handed use like a normal phone. And then you open it and it's not a gigantic inside screen, but it is more screen. And you can do two handed multitasking, multi window, fun pixel stuff.
I am firmly, I'm bracing myself for this to be another classic example of Pixel is all about software and don't get it if you're expecting the most premium hardware. If you want the most premium hardware, you'll probably have to get the Oppo Find N or the Samsung or something else. But you basically can't get it in America.
Right. Or at least a Samsung phone that you know is tested from years of having folding phones. Yeah. And the Google advantage is going to be, well, it's a Pixel. So you're going to want all the magic of the software and the cameras and everything else. That's what I'm bracing myself for. I don't expect this to blow me out of, oh my God, there's no bezel. Oh my God, there's no crease. I don't expect any of that. Yeah. That's where I'm at. Yeah. But yeah. Which would make me sad. Yeah.
Jinx. Everybody's sad. Close, close. We're both sad and that's all that matters. Counterpoint though, the Pixel watch was actually really nice hardware. Really nice hardware. The software is almost where it kind of messed up. The bezel was rough. I do think the overall design made the bezel less annoying just because it looked good in the rounded edges. Yeah.
Yeah, it was good design. I would distinguish between good design and good hardware, because when I picture great hardware, I also think, like, big battery, great optimization of space, all these nice materials. And then with design it's like, oh yeah, you've got the bezels nice, the edge is nice and trim, it's a handsome-looking device. I think the Pixel thing, at least with the phones, is just like, oh,
look, it's a fine piece of hardware, but you're here for the software. That's what the Pixel thing is. Yeah. Yeah, but the rumors around this are that it may be announced or may come out on June 13th. I think announced June 13th. Announced. Well, they're saying they might show it off at I/O. That's my guess. Oh, yeah, released in July, potentially. Well, I wrote that. The date I saw was June 13th.
Which I'm guessing is an announcement, because that would be around I/O, right? I/O's in May. Is it in May? Yeah. So maybe it could be. We have seen announced at I/O, released... a month later. Early July. Yeah. A lot of the old A-series devices were always announced at I/O and released in early June.
I remember one specifically that was in July. That's because it got pushed. The 4A got pushed to August. And then the 5A came out in July. Cool. Then maybe it is released June 13th. That'd be dope. That'd be dope. Yeah, if we got an A series and a fold in June...
I was gonna say like the Pixel Fold feels like one of those things that has been rumored for literal years and years and years. - Like the watch. - Like the watch. Just like the Apple Mixed Reality headset. They both seem like these products that are like literally never going to come out, but people talk about them all the time. Also like the Apple Car.
even though that's going to be a long time coming. Shaskin' old. But it would be really amazing if both the Pixel Fold and the Apple Mixed Reality headset actually came out in 2023. That would be kind of sick. I think that would be longer odds, but it still could happen. The rumors point to Tim Cook going, it needs to be out this year. So that'd be cool.
Counterpoint, we'd have a lot less to talk about on the podcast if both of them actually released. How are we going to talk about rumors and speculations if they actually release? Oh, we just talk about the next generation. Yeah, I mean, we already are talking with the second generation Apple mixed reality headset. Yeah, true. Yeah.
Speaking of next gen, there are yet more Pixel rumors. Oh my God. Because of course we need to have a Pixel 8 and 8 Pro this year. That's, of course. This week we got renders and rumors about every single Pixel that is going to come out this year. Isn't that funny? Pixel things leaking? That usually... that's weird.
This reminds me of, we watched the Eddie Burback video, right? Where he's like, what if there were 10 restaurants? I lied. There's this many. It's like, we have a leak. I lied. There's more. So Pixel 8 and 8 Pro, we're expecting. It's actually not too much, but I think the one thing that caught my eye from Twitter is 8 Pro flat screen. Oh.
That's really all I'm here for. That's all that matters. Yeah, that's it. You pre-ordered it already? I basically have, like, verbally. This is my commitment, Google: get me one phone of the year, I'll give you the dollars that you are going to charge for it. Yeah, I think the sort of first two generations of the Pixel having that curved outer screen, on the Pro anyway... Yeah, I would always use the Pro
because it has a higher refresh rate, because it has a telephoto camera, but I would always wish it had the flat screen of the non-Pro, because I actually low-key like that a little more now. And I think this is the first year we're expecting flat on both. What if there's a chance we got flat screen on the Pro and 120 Hz on the regular?
That would be really hard to sell the Pro. Because remember, every time the Pro comes out... Whoa, I don't want to get into this again. We can save getting destroyed for later. I'll summarize it. I'll compress it for you. The only difference between the Pixel and the Pixel Pro is higher refresh rate, bigger battery, bigger screen, telephoto camera. That's basically it. I think David's saying the 8 Pro is a bad buy.
What? You're just putting words in his mouth. I have not said a single word. Everyone tweet it, David. Leave me alone. God. Team Pixel is out for you right now. They're coming to get you. I never said a single thing. No, this is... I'm excited about that. I think we'll look forward to all these Pixel phones this year. I don't think we're going to get any more leaks. Usually Pixel stuff is really sealed, like, tight. So I think this is about all we can expect to get. [laughs]
Probably not going to see any more of this, so enjoy it while you can. Run this joke into the ground. What's this? I would not have been shocked at all. Okay, that's a great place for a break, which means we should do one more trivia question. All right. In 2007, a company called Polymer Vision released an e-reader with a rollable screen. A rollable screen in 2007. I know.
Was this device called A, the Readius, B, the Roller Scroller, or C, the Scrollio? I know how excited you were to make that question. All right, I think we're all going to attempt to picture this device in our heads. We'll see how successful we are. We'll be back after the break. Support for Waveform comes from Coda.
So when you're planning your work for the first quarter of the year, how many programs and platforms do you actually use? And what about for an entire year? Probably way too many. See, when you're building a successful startup, you don't have time to waste staring at a virtual mountain of spreadsheets and disconnected project trackers. That's where Coda's all-in-one collaborative workspace comes in. So with Coda, you get the flexibility of docs,
the structure of spreadsheets, and the power of applications and the intelligence of AI, all built for enterprise. Coda's seamless workspace can keep your team on the same page, facilitating deeper collaboration and quicker creativity. And that can all add up to better results because when your team isn't forced to focus on the organization of minutia, they can instead focus on the bigger picture. So don't spend your holidays dreading Q1. It's time to take that one off your work wishlist.
Get your work from planning to execution in record time with Coda. To try it for yourself, go to coda.io slash wave today and get six months of the team plan free. That's c-o-d-a dot i-o slash wave to get started for free and get six months free of the team plan. coda.io slash wave.
Support for the show today comes from NetSuite. Anxious about where the economy is headed? You're not alone. If you ask nine experts, you're likely to get 10 different answers. So unless you're a fortune teller and it's perfectly okay that you're not, nobody can say for certain. So that makes it tricky to future-proof your business in times like these. That's why over 38,000 businesses are already setting their future plans with NetSuite by Oracle.
This top-rated cloud ERP brings accounting, financial management, inventory, HR, and more onto one unified platform, letting you streamline operations and cut down on costs. With NetSuite's real-time insights and forecasting tools, you're not just managing your business, you're anticipating its next move. You can close the books in days, not weeks, and keep your focus forward on what's coming next.
Plus, NetSuite has compiled insights about how AI and machine learning may affect your business and how to best seize this new opportunity. So you can download the CFO's Guide to AI and Machine Learning at netsuite.com slash waveform. The guide is free to you at netsuite.com slash waveform. netsuite.com slash waveform. Okay, welcome back, guys. So we have some new information about Google's basically ChatGPT slash generative AI Workspace stuff.
They are now officially integrating generative AI into basically all of Google Workspace. It's about time. Yeah. So this is their competitor to all of Microsoft integrating stuff into Office. And there's a lot here, actually. There's like a lot here. They're integrating it into Gmail, into Docs, into Google Slides, pretty much everything that Microsoft had.
But there's a lot of really interesting features. For example, you can say like, help me write an email about this and it'll just write the whole email. That's what I was hoping for. That's exactly what I was hoping for, because I was going to wait for an email client to do this first and like add a text generation thing where I could just be like, write me an excuse to not go to work on Thursday. And then it would just type this amazing email and you just hit send.
And then they'd start charging a fee, because they can write your emails for you. Yeah. So now Google is going to do it, and it's going to be free. Love that. Yeah. The official list that they say is: they have draft, reply, summarize, and prioritize your Gmail; brainstorm, proofread, write, and rewrite in Docs.
Bring your creative vision to life with auto-generated images, audio, and video in Slides. Go from raw data to insights and analysis via auto-completion, formula generation, and contextual categorization in Sheets. Generate new backgrounds and capture notes in Meet. And enable workflows for getting things done in Chat.
Oh, the Google Meet notes thing. I think they already said something about that, where you're on a meeting and it's taking notes about the meeting for you. Which is great. I love that. That's sweet. Yeah, some of these are really, really helpful. Some of these I'm a little bit concerned about.
Which ones are you concerned about? I'm a little bit concerned about the ones where it's writing emails for you. I'm a little bit concerned about the ones where like one of their examples is writing a job listing. I'm just a little bit concerned that the less effort people are going to have to put into these systems to generate
a lot more information, the less likely they're going to be to actually proofread what it's saying. So if you say, write me an email about this, and it ends up being either not exactly what you meant to say, or it added stuff to the email that you didn't ask for and that you didn't know about. And the other person gets that, and they read it, and they go, regarding this point, what do you mean? And you're like, what are they talking about? I never said that. It's like you're passing off
information from your brain to an AI in order to not have to do that work yourself. And this has been a thing that I've thought about a lot in the last few years. It's like we're now starting to offload storage from our brains onto hard drives, you know, to the point where we have to do less work, we have to put in less cognitive effort, and therefore it's being stored on a computer instead.
And so now if it starts writing all these emails and writing these documents that say things that you didn't pay a lot of attention to, it's like, it's like when we were talking about before, when you're doing a test or you're writing an essay or you're doing a research project and you actually learn about the thing by proofreading yourself and going through something and making sure that you understand it, there's going to be less and less of that happening. And people are going to less understand what they're saying. Yeah.
In the future, instead of like, oh, that was autocorrect, it'll be like, oh, transformer error. You'll be like, wait, I thought this job said that it was two days a week that are remote. Oh, no, no, no, no, the diffusion model. Yeah, imagine that. It's like you read a job posting and it's fully remote because it's trained on most jobs being fully remote, but the actual hiring manager didn't want it to be fully remote. So that'll be the thing is maybe the skill becomes training your AI instead of--
I think that's it's also just like giant user error if you're doing it like that though. And I feel like in those situations where if you're messing that up, then the job listing you're posting to is going to be like, what do you mean by this? Oh, I didn't write that.
End of conversation. Like, I'm not going to do this anymore. The whole point, if you want to write a long, formal email, like, the whole point of it writing the email for you is you have to do less work. But you still have to proofread it. It's still less work. I don't think most people can proofread stuff. Well, then that's... I think that's their problem then and, like, ultimately it will weed people like that out. They'll still try and use it and maybe it'll work, but I think, like...
people who are just... that's just the same people who are, like, copying and pasting work online for school projects and stuff like that. They're just gonna fail those instances where they try and do it. I kind of disagree a little bit. I think that actually it's up
to the engineers and the designers of these platforms to make it easier for people to proofread the work in the interfaces. And I also think that a lot of times people who make tools like this actually make it harder to self-check and to self-proofread because that is how you...
admit the flaws in the thing you've designed. Yeah, I guess in this current state where the AI tools aren't perfect, it is a skill to be able to take the output from the AI and turn it into something fully correct or fully useful. Like that's a specific skill right now. And then the question is when the AI tools get better and better and better,
Is your skill going to have to shift out of that correction and into some other phase where you're the one training the AI or you're the one like deploying the AI into like what it should be typing where?
I just think right now it is the skill. You have to be able to take what AI gives you and turn it into something useful; you don't just copy and paste it. My question is, like... because at the end of the day, you're going to have to have people that are trained to proofread whatever the AI generates and make sure that it's tailored to what they actually want. And my question is going to be: is it going to take them longer to proofread what the AI writes than it's going to take them to just write it correctly in the first place?
It depends on their knowledge base. Like if someone's just a copywriter and good at researching and fact checking, then it's like you can give me any output and I can turn it into a useful thing. Yeah. But if it's like I need you to write, you know, job listings and job listings only, then maybe if you just get an expert on writing job listings, you don't need this thing that slows them down. They just do the thing that they're good at. Yeah. So yeah, the tool is just a fun it's a fun like
It's a fun time to be interested in this AI stuff because it's not perfect. And so the skill becomes turning it into something useful. What do you do with it? Where? I just think it's fun. I do think like in terms of you saying, could it take longer? It'll be dependent on the person. Like if you are a great writer, like you've written for so long of your life, it'll probably take longer for you because it will write it all out for you. And you're like, this isn't how I want to say this. This is wrong. This is wrong. This is wrong. But for someone who doesn't write, maybe like me, I could be like, this is a really...
couple quick thoughts I have. Can you write this all out? It's like, wow, that described it way better than I ever could have. And therefore, as long as I just make sure a couple of these things sound correctly to me, that will be way faster for me. Like I've written a couple like very short sub stack articles and I've had to proofread them 10 times because I'm just like, I don't like how I said that. Changed the whole paragraph and changes the whole meaning of everything. And then I had to rewrite the entire thing. Yeah, this is, it's really fun because also using AI,
as a tool, as a creator is interesting now. Like I have a certain writing style when I'm writing a video and
Uh, I don't necessarily script every single word, but what I am doing is getting a list of points that I need to cover and say in a certain order, and getting the facts right. And then the sort of connective tissue is just, like, in the flow of things. And so when I am, like, creating something from scratch and I start with a blank page, that first bit of, like, generating the skeleton of the thing I want to say
actually can be done by AI. Like that actually is a thing where I could plug it into ChatGPT and be like, give me a top five list of the most innovative, let's go, most successful tech products of all time. And then it just goes, hmm, all right. It does the research. It finds a list of five. Maybe I ask for 10 and I narrow it down. And then I take that and I go, okay, let me think. Which one of these 10 do I want to use? And I can narrow it down and pick the five best ones. And then I
write all of the connective tissue that would have taken much less time. So I actually successfully in that instance used the AI to help me write something. Even though it never gets published as written thing, it's just me talking. - And then in the video, David's the fact checker anyway, so he's gotta do all the hard work. - Yeah, so I mean, at least if it's about, this is also what I said in the video, if it's about a topic that I already know about, then when I get the AI output, I can diagnose what I need to fix.
If it's about a topic I don't know about, that's tough. Then I am spending a lot more time fact checking and source reading. MARK MANDEL: I will just say the concerns that I have around a lot of these features are only for a few things. Some of the other features are actually super useful and really nice. For example, the fact that it can take
bulleted notes and summarize meetings for you. Really awesome. - But can it unsubscribe from emails that I no longer want? - That's the first thing that I thought of when you mentioned email. - Gmail can already do that. - No, it can't. - It asks you. - For some of them, it'll have the unsubscribe thing on the top, and you click it, and then two days later I'll still get an email from them. - Yeah, it depends on the website.
But AI should fix that. That's the number one... as soon as an AI can do that... That's all I want. That's all I want. Yeah. I really like the meeting notes, the raw data insights
and analysis via auto-completion in Sheets is really helpful, because you're basically just taking something where it's like, I don't know how to use Google Sheets correctly and I have all this information; it's just a button to, like, do that analysis for you. That's good. Very useful. Oh yeah, being able to, like, brainstorm things in Google Docs? Very useful. So I think
Just like every other AI category, there's probably going to be a large amount of people arguing about what's okay and what's not okay. You could have people say, like, oh, well, people are trained in Excel and you should be hiring an Excel person. Or another feature that's obviously going to be super controversial is that in Google Slides, you can just auto-generate
images and audio. And that's been the kind of highlight of most of the controversy recently because the images it generates are going to be based on a database somewhere. Depending on where they get that database, if it's an ethical database versus a regular scraping database, who knows? But I would say that overall, this is an example of boring computing.
which is where you get to integrate AI into something that just makes your life a little bit easier. Because Google has had really, really, really, really basic versions of all of this stuff for a really long time. When you're writing in Google Docs and you say,
I want to, and then it'll suggest that it finishes the rest of the sentence. You hit tab and it fills it in, right? It's just doing that, but with way more tokens. And then a lot of other random features like this. That's why it kind of feels like a natural evolution to me. Like the Gmail app right now, if I say my address is and hit space, it goes, oh, I know what you're about to write. And it says, if you just want to take that address, like you usually do, just swipe over, hit tab, it appears. And like,
So the skill is just knowing, like, how to trigger that and how to type, and have that save you a little bit of time. And I guess now it's like, okay, autocomplete was just me going... you know how you'll, like, write your first word and then just hit the middle autocomplete word, and it just makes a sentence based on what it thinks? That's just because it's trained off of you typing a lot. Yeah.
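To make "trained off of you typing a lot" concrete, here's a toy sketch of how that old-style middle-suggestion autocomplete can work: count, over your past typing, which word most often follows each word, then suggest the top one. All the text and function names here are made up for illustration; the new generative features do the same predict-what-comes-next trick, just with far larger models and context.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words tend to follow it."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def suggest(following, word):
    """Return the most frequent next word, like tapping the middle suggestion."""
    candidates = following.get(word.lower())
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

# "Training data": the user's (hypothetical) past typing.
history = "my address is 123 main st and my address is 123 main st"
model = train_bigrams(history)
print(suggest(model, "address"))  # -> is
print(suggest(model, "my"))       # -> address
```

Chaining `suggest` on its own output is exactly the "keep hitting the middle word and it makes a sentence" trick described above; it produces fluent-looking but repetitive text, which is why the large-model versions are such a step up.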
It's just a little bit more advanced than that. Maybe I'm just a boomer and I'm a little concerned about this, but like, I just, thanks. I just, I don't know. I, I, I worry about a day where everything is completely written by AI and it removes every single ounce of individualism from our writing to the point where like,
Any email. You know, I know that you don't care when you get an email that's like very, very businessy. 80% of my emails are templates anyway. Yeah. Yeah, exactly. Exactly. Like all the PR emails we get, we get PR emails that say, hello, journalist or like, hello, YouTuber. And obviously they didn't even go in and do the work of filling that in and their templates. But like, I just, I am concerned about a day where you're like, tell Michael why I can't make it to his birthday party. And then it writes this long thing. Like, I'm so sorry. I've got a thing that's going on. And it's like,
You know, it's... I don't know, maybe it concerns me a little bit. But this is, like, a controversial topic, and I'm a boomer. I don't think you need to worry, because soon AI will be able to tell your tone and your characteristics, and it'll be able to write that in. There's literally a Black Mirror episode where, like, the woman's husband dies and they download all his text messages and then they put him in a data-based robot. Here's, um,
maybe my hot take version of what you were talking about, which is like if you just write something and it just works, you just hit send. The question of writing an essay for school
If you put in the prompt into ChatGPT and it spits out an essay and it's like a C- essay and you just hit send, you deserve a C- because the prompt that you decided to give it and your knowledge of how to use the tools gave that and you're deciding to not proofread it, you deserve a C-, not an F, because the goal of...
assigning you an essay is to see how good of a response can you give to the prompt. And that's the response you give, given your tools. And that's what's gonna happen in real life. When you get a job in the real world, they're gonna be like, write an essay on this theoretically, and you're gonna give them the C minus thing, and you deserve a C minus, not an F.
But if you get really good at designing prompts for AI, hear me out, you get really good at getting certain words into ChatGPT and it kicks out just the right essay and then you proofread it a little bit and send that in and it's an A-plus essay, you deserve an A-plus because you did that work to create that thing. I completely agree.
I think that's a skill. I think that's dependent on the thing you're trying to learn in school. If you're trying to be a doctor and have to write an essay about how to save somebody's life, you shouldn't have to take the time, when someone's dying in front of you, to look at GPT. Okay, so... that's a great point. If you have to learn things to memory to help future on-the-job thinking, then you should fail. Like, if it's write a history essay about why this monument was important for this
piece of history and time. Maybe you don't need to memorize all those facts, but if you go back and fact-check and make sure your essay's good, you are going through the learning process a little bit. But if it's like, write an essay about why this type of surgery is better, and then you're doing that to become a doctor and do that surgery, that ChatGPT thing is not a skill you need to be a good doctor. You should actually...
use the skills to be a doctor to get the doctor degree. It's, like, very specific. Something I thought about is, like, when these generative AIs can start doing computer science and coding for people, which they can. And there's all these tools that have been popping up in the last couple years, like no-code, where it can just, like, create websites for you, create code for you. The problem there is that if everyone grows up not having to learn the fundamentals of things, and they just are able to use these tools to make stuff,
There becomes a limit of how much you can actually build out from there. Or if you have a problem, how do you diagnose it? I mean, like photography. What? I feel like people don't really know how photography works, but they just hit a button on their phone and they have pictures. Well, yeah, but photography, it's like a photo. You can do a cool photo, but like you're not like transforming. You're like you're not creating the future of photography by taking pictures.
Like in computer science, if you're like building new tools or building new architectures or whatever, you still have to understand how the architecture works. No, but I mean, there will always be people that will do that. Like you, you are a photographer, but everyone can take a picture.
There's going to be a time where everyone can write code, but there are, there will be engineers that know how to write code. I would hope so. I just don't want us to get to the point where in like, in like elementary school and university, it's like click these buttons and then everything's done for you. And that's how you get an essay kids. Ugh.
I think there will be, there will, I think kind of like what you're saying, there's always, there has to be the general use of a thing and then the behind the scenes knowledge of how it works thing. And for most people in most cases, when you drive a car down the highway, you don't have to know how the car works. You just use the tool. Yeah. And when you, you know, take a photo with your phone, you don't have to know how it works. You just got a photo. Yeah.
But there will always be some group of people that is, I am actually interested in how this works. I want to get behind the scenes and build the things. Then you have to learn those things. You don't get to just use the tool. You actually learn the things. That's true. And so this is just one more tool that a lot of people are just going to use for normal things. And then some fraction of them will be like, I want to get these tools to be better. I'm going to learn how they work. And then that's the other group. I suppose it's an aspect of democratization, right? Where like many more people can build things
things that you would normally need to code for, but the people that want to make new stuff still have to be the ones that understand the fundamentals. It's just, there's a much wider swath of people that are able to create stuff. Sorry. I was gonna say, that's how I felt when you were mentioning how, like,
it can do things in Excel or can do things in Slides, or you might have an Excel expert or a designer. I see it more as: it will let the average person, who's maybe creating the concept of what you need that sheet for or that slide for, create it. And then when it gets into the professional aspect, it's like, this person in Excel can do this a thousand times faster and do all of it. Or this designer, when we need to make slides for a presentation, can see my vision and make it into a real thing. It almost feels...
I just had this thought while you were saying it. It's like ChatGPT is when you call the help desk at any place. It's the first person you talk to that's reading off of a script and telling you all the things on the script. And then the experts are the expedited ticket: you've made it through all the steps and they can't help you, and now there's an actual expert sitting there that is like, oh yeah, I've dealt with this a hundred times before. Let me walk you through it.
I think the difference is that we're getting to the point where ChatGPT has dealt with this a hundred times before. I can walk you through it. That's the question. It's like, that's the question. If it does happen, I don't know if we're there yet. Well, I'm not saying AGI. I'm just saying that, like, I mean, especially with chat, maybe someday, like we're getting there quickly. Yeah. Is what I'm saying. I think the closer we get,
The harder and the farther we'll get, if that makes sense. The closer we get, the harder those last parts are. Yeah, the last parts of getting to the actual human level are going to be so hard that that gap is going to exist forever. It's like cars, a little. Oh, man. Sorry, I love the analogy, but I'm like, everyone knows how to drive a car in the basic form of turn it on,
foot on the gas, and steering wheel. And if you ask somebody right now to, like, build a car, it's just like, there is a giant gap between people who know how to use a car and people who know how to make the car drive better, like build the car. Is that what you mean? I was gonna say, I think it works better like this: a lot of people can drive a car, but look at autonomous driving, how long that's been going on, and it is
It's so much better than it was, but it is so far away from an actual human driving. Yeah, true. Yeah. I think a lot of the Google changes do feel like a more natural evolution of what Google was already doing and not as much of a stark contrast leap away from what they were doing. And I think that that is good and smart. And we have talked about before, Google has everything to lose and other companies have everything to gain. Thanks.
And so them doing that is probably a good move, especially since they low key announced this. They weren't making a big deal about it. Whereas whenever OpenAI releases something new, OpenAI feels like the new Tesla or the new- Same guy. Whatever. They feel like the new Bell Labs. Whenever they do anything, the world pays attention.
And that has only existed for the last six months. It's insane how everyone is just locking on to every little thing they do, whereas Google is being a lot more low-key about all these things. I really want to try all these new Google features. I don't know when it's publicly launching, but it seems really interesting. I can't wait to have it write emails for me.
I cannot wait for that. I'm very excited. I'm nervous. I just think it's going to be funny when you ask it to write long emails and then everyone who receives the email is like, I'm not reading all that. Summarize that for me. And it's just like, within the AI, it's just going like, woo, woo. It's contracting and compressing. I love how you were like, I'm so worried that people aren't going to feel like they're getting my personality in my email. And then those people are going to be like, summarize this. Yeah, I don't want his full email. Yeah.
Amazing. Okay. Well, we've rambled at length about these. I think my summarization take is: AI is a tool. For you, it's to cheat in college. Well, as in, it's a useful tool. I didn't cheat in college when I was in college. I'll put it that way. I wish I'd had this. I would have used so much ChatGPT. Last little thing. OpenAI actually did announce GPT-4. I'm using it now.
That's the entire thing. Yeah, which was in Bing for a long time, and neither Microsoft nor OpenAI would say that it was in Bing, but now everyone knows. But if you go on ChatGPT with GPT-4, it can now parse images.
which is kind of a game changer in a lot of ways because you can literally show it a meme and say, tell me why this is funny. Oh, yeah. And it will tell you why it's funny. I saw that. That's the most boomer thing to do. I thought that was cool because it was like an iPhone charger that looked like a VGA cable and it's like, this is funny because a VGA cable is a clunky old piece of tech and it's being plugged into an iPhone. Yeah.
And then someone at the New York Times took a photo of all the food in their fridge and they said, what can I make with all of this? And it was like, well, you could use this thing that you have mixed with this thing to make this meal, which is pretty amazing. Yeah, now you're in. Now you're in. Samsung refrigerators with ChatGPT so I can look into my smart fridge. You're at the grocery store. What should I have for dinner tonight? What do I need to make this, based on what's in my refrigerator and what's in this store right now? It's like, buy milk.
Smart fridge, smart fridge. Ellis is finally getting a smart fridge. It also does way better on standardized tests. I saw that. They literally made a chart of all the standardized test scores. It's crazy. Yeah. It can process way more text if you put a lot of text into it. And then it's already getting integrated into a ton of mainstream products. So over the last few months, probably since Bing was released, they've been working with a number of different companies like Duolingo or that
Carrot app that you were talking about, that you use for weather, the weather app, to integrate different chat assistants. Obviously Duolingo already had, like, the owl chat assistant person thing. Not person, but owl. Their name is Duo. It is. And so now, obviously, all the chatbots that are already within these apps are just going to get better, which I think is amazing. Right. I think it's great that you can have a more natural conversation with a chatbot that's already in an app. I don't think every app needs a chatbot, but the ones that already have them are going to get better.
Final question. What year of CES are we going to see the first refrigerator with a chatbot? Next year. Yeah. Absolutely, 100% next year. I think that's probably a lock. I think we're all in on that one. CES 2024. Nice. All right. Well, that's about it. We've talked at length for this episode about all these things. But of course, we need to get to the end, which is trivia answers.
Trivia. So scores, Marques has eight. - I'm falling behind. - Andrew has seven. David has 11. - David in the lead. - Okay. First question, then the music plays and you have until the end of the music to write down your answer. So we spoke about the watch buds, the Huawei ones. Before Google acquired Fitbit, Fitbit acquired Pebble after they had an impressive Kickstarter launch. What year did Pebble run their Kickstarter campaign?
And the music begins. Kickstarter. Oh, okay. I remember when I had the watch, so I have to rewind a little bit. Not a lot of time for rewinding here. You better skip. I think I'm thinking of the Pebble 2. I'm going to go with that. Not the 2. I thought the same thing. Because I remember the year the Pebble 2 came out. I have to do a little more Googling to get the original Kickstarter. Guessing is so much faster. Okay, I'm ready. All right, flip them and read. What do you got? Oh, wow. Different answers here. I have 2011.
2014. 2015. We're both too late. All wrong. The year was 2012. That's why I erased. I know, I saw. Oh my god. I did have the correct answer down and then erased it. I needed that point. Cool, cool, cool, cool, cool. Everything's fine. I needed that point. This is a multiple choice one. All right, another trivia question. Big smile. In 2007, a company called Polymer Vision released an e-reader with a rollable screen. In 2007!
This device was called A, hit it, the Readius, B, the Roller Reader, or C, the Scrollio. I'm torn. I'm kind of torn. I'm torn because I know how good Ellis is at making up fake names. Aw, you guys. All right, what do we got? I was torn between A and B, but I picked A. Let's go. Nice.
Well, I didn't. I picked C. The Scrollio? I thought you guys liked Scrollio. Scrollio sounds like an actual Silicon Valley product. Yeah. Well, I was trying to pick something that would be 2007 Silicon Valley, you know? So, like, Roller Reader may have been a Nook competitor. I don't know. 2007 was, like, kind of before Silicon Valley became Silicon Valley.
That's the first iPhone. Yeah. Facebook existed. Yeah. But it was before the whole Valley was, like, startups. Startup culture. Yeah. True. Early. Like, I used to go there all the time and it was not the way it is now. But David, Xerox PARC was there. I was going to say, there needed to be an X in it if you wanted it to sound like 2007. Yeah. Some crazy spelling. All right. Thank you for watching and for listening. And of course, stay tuned till next week. We'll see you there.
Peace.