This is the Everyday AI Show, the everyday podcast where we simplify AI and bring its power to your fingertips. Listen daily for practical advice to boost your career, business, and everyday life.
Sometimes AI is a little bit of a dog-eat-dog world, and it's kind of like that this week for AI news. That's because Meta is essentially going after ChatGPT's slice of the pie. ChatGPT is going after Google's slice of the pie, and Google is just making their pie bigger, adding more and more AI features to some of their best products.
As is every week, this week's AI news was fast and plentiful, and we're going to be breaking it down today on Everyday AI. What's going on, y'all? My name is Jordan Wilson. I'm the host of Everyday AI, and this is your daily AI.
live stream, podcast, and free daily newsletter, helping us all not just learn AI, but how we can leverage it to grow our companies and our careers. I want you to be the smartest person in AI at your company or your department. So it starts here. So what we do each Monday, well, almost every single Monday, is bring you the AI news. We cut through the fluff, because there are so many things that happen every single day that you could literally spend an entire day just trying to keep up.
I got a little bit of a cold allergy thing going on. So bear with me. I don't normally sound this nasally. So hopefully you can still get the AI news that matters, but you can get even that and a whole lot more on our website at youreverydayai.com. So each and every day, yeah, we do the live stream podcast, but then we give you everything else you need to know in our free daily newsletter at youreverydayai.com. All right. Enough chit-chat.
Let's get into it. It starts actually with Amazon. So Amazon and AWS have introduced Nova Premier, their most capable AI model. So Nova Premier is AWS's most advanced AI model designed specifically for enterprise applications, signaling a strategic shift
toward owning more control in the generative AI value chain, according to analysts. So Nova Premier supports multimodal inputs, including text, images, and long-form video, with a pretty big context window of roughly 750,000 words, or 1 million tokens. So finally, Google has some competition on the 1-million-token context window front, now from Meta and AWS as well.
So the new Nova Premier model also supports more than 200 languages. And one of the standout features in this new model is support for model distillation within AWS Bedrock, which allows enterprises to create smaller, more efficient models like Nova Pro, Lite, and Micro by generating synthetic training data without requiring labeled data.
So AWS reports that a distilled Nova Pro model improved API invocation accuracy by 20% while reducing latency and costs, making it attractive for edge deployments and resource constrained environments.
So experts highlight that AWS's move with Nova Premier is less about having the largest model and more about controlling orchestration, pricing, and architecture to appeal to enterprises seeking sustainable and flexible AI solutions.
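For the developers listening, here's a minimal sketch of what calling a Nova model on Bedrock looks like from Python with boto3's Converse API. The region, the Nova Premier model ID, and the prompt are my own assumptions for illustration, so check the Bedrock model catalog in your account for the exact identifier.

```python
# Minimal sketch: invoking a Nova model on Amazon Bedrock via boto3's Converse API.
# Assumptions: the "amazon.nova-premier-v1:0" model ID and us-east-1 region are
# placeholders -- verify the exact model identifier in your Bedrock console.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="amazon.nova-premier-v1:0",  # assumed ID; confirm before running
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the trade-offs of distilling a large model into a smaller one."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

The distillation workflow itself (teacher model, synthetic data, student model) is configured through Bedrock rather than shown here; the point is just that a distilled Pro, Lite, or Micro student ends up behind the same Converse call, so swapping models is essentially a one-line modelId change.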
All right, moving from one trillion-dollar tech giant to another: Meta has introduced its new free Meta AI app powered by Llama 4. So Meta officially launched its standalone Meta AI app, available now on iOS and the web, at its LlamaCon conference, aiming to provide a more personal and integrated AI experience across devices, including Ray-Ban Meta smart glasses.
So the new Meta AI app is powered by Meta's new Llama 4 mixture-of-experts reasoning model, focusing on learning user preferences, maintaining conversation context, and enabling seamless voice-first interaction. So users must log in with a Meta account, and existing Facebook or Instagram profiles can be used for sign-on. So you don't have to sign on with your social media account; you can just use a Meta account.
So what this means is instantly Meta is trying to compete with ChatGPT for users in a dedicated AI app. Because Meta right now has nearly 4 billion users across its platform. So they're trying to funnel as many of them as they can into this new Meta AI app.
Meta AI supports text, image, and voice interactions with a special Discover feed where users can explore and remix prompts and creative outputs shared by the community, blending social engagement and AI use.
Voice features, including a natural full-duplex conversation demo, are currently limited to the U.S., Canada, Australia, and New Zealand. And the app does not yet access real-time web information. So that one is pretty big, because even though right now you can go to Meta.ai and that does have real-time web integration, the new Meta AI app does not. And that is important, and I think it may set Meta back a little bit as they try to encroach on OpenAI's turf, more or less. And Meta right now is also testing document editing and analysis features in select countries, signaling future expansion of AI capabilities beyond conversational and creative tasks. So
what I think is pretty cool here: the new Meta AI app integrates with Ray-Ban Meta glasses, allowing conversation history and settings to sync, though conversations cannot yet be started on the app and continued on the glasses. So I'll have to try that one out. My wife did get me the Ray-Ban Meta glasses, and I've been using them pretty frequently, but this, I think, is something that may get me to use them even more.
All right. Our next piece of AI news, if you can hear it in between all of my allergies: Google, big news here. They are opening up AI Mode in Search Labs to all U.S. users, also adding some new shopping and local business features.
So Google has expanded access to AI Mode in Search Labs, now available to all users in the U.S., moving beyond the previous waitlist system and making conversational, personalized search widely accessible. AI Mode integrates with Google's Shopping Graph, which updates over 2 billion product listings hourly from more than 45 billion total listings, including local store inventories and global retailers, enhancing the ability to find real-time product availability and prices.
The new features include visual cards for local businesses showing ratings, reviews, photos, and inventory updates directly within AI-generated search results, helping users make more informed decisions about restaurants, salons, and other services. So users can now resume past search sessions on desktop through a dedicated panel, allowing them to pick up where they left off and continue multi-step research or project planning without starting over. And Google is testing an AI Mode tab in standard search for a small group of U.S. users, signaling a potential deeper integration of generative AI into its core search product.
AI Mode right now supports multimodal queries via the Google app on iOS and Android, enabling users to search using text, voice, or images, such as identifying products from photos or conducting hands-free searches. These enhancements reflect Google's focus on transforming search into a more interactive and actionable experience, helping users not just find information but also take immediate steps like shopping or booking services.
Our next piece of news, and we covered this on Thursday, but Duolingo has replaced contract workers with AI and just released a bunch of new updates.
So Duolingo is an app, essentially, where you can go and learn new languages. And they made headlines this past week as they announced that they've replaced many contract workers with AI. At the same time, they just released 148 new language courses in less than a year, doubling a course catalog that previously took about 12 years to build.
So CEO Luis von Ahn confirmed that AI now handles work once done by contractors as the company shifts toward an AI-first business model to automate repetitive tasks and reduce reliance on external labor. This workforce change follows a 10% cut in contractors last year, when Duolingo began using AI for translations, signaling a steady move toward automating contract roles.
Duolingo plans to continue phasing out contractor jobs that AI can perform while focusing full-time employees on creative and complex tasks supported by training and AI tools. So,
Duolingo will also incorporate AI use into hiring and employee performance reviews, requiring teams to prove that new roles cannot be automated before adding new staff. This approach reflects a growing trend in tech companies, similar to Shopify's recent directive urging employees to maximize AI use before requesting additional human resources.
It doesn't seem like Wall Street minds very much at all: Duolingo's latest earnings exceeded expectations, with a 38% revenue increase driven largely by its aggressive adoption of AI to accelerate content creation and improve efficiency despite some job cuts, and investors have responded positively. So this is something that I think is going to
continue to be a part of the conversation. And we talked about this, like I said, on Thursday, going over Duolingo and Shopify essentially saying, hey, we're going to start giving as much work as possible to AI. And essentially, before requesting new or additional human resources, hours, or human power, those humans first have to justify why AI can't do it, right? So if you need a bunch of new people on a project, you have to first justify, hey, here's why AI couldn't do it and why we actually need more humans to work on this project. So I know it's kind of controversial, but I think it makes sense, right? A lot of people aren't going to agree with this, but I think this is the way forward.
Again, I'm not out here rooting for AI to take jobs, but the reality is big companies are doing this regardless of how you and I feel about it. So when I say this is the way forward, this isn't me encouraging CEOs to cut jobs. I would actually say the opposite, because, hey, when everyone else zigs, you should zag. But this is going to be happening now,
probably at your company. So be prepared for it. I think through the latter part of 2025 and into 2026, this is going to become the norm, right? If you want to hire for a new role, or if you want to expand a team, you're probably just going to have to justify why AI can't do it, which I think is probably a good justification for all of us to make. It's something I'm constantly doing for roles that I used to hire humans for. I give those to AI all the time: consultant roles, content creation, strategy, et cetera. I still hire humans, especially human contractors, for things, but a lot less. I'd have to go back and look at our hiring, but even with contractors, we used to have dozens on any given big client project. Now that number is a fraction of what it was. But I think this is just the way forward.
So yeah, a lot of people won't agree with that, but hey, as these AI systems get more and more capable, you have to start thinking about what's the best way to grow your business. All right. Our next piece of AI news: U.S. President Donald Trump is facing backlash for posting an AI-generated image of himself as the Pope. This is not a joke. This actually happened. So,
U.S. President Donald Trump posted an AI-generated image of himself dressed as the Pope on official White House social media accounts, sparking criticism from Catholic groups and public figures. The image came shortly after the death of Pope Francis on April 21st, and as the Vatican prepares for a conclave to select the next pope, making the timing particularly sensitive.
The New York State Catholic Conference condemned the image as disrespectful, stating there is nothing clever or funny about this image and urged Trump not to mock the Catholic faith during this solemn period.
So on Twitter, they said there is nothing clever or funny about this image. Mr. President, we just buried our beloved Pope Francis and the Cardinals are about to enter a solemn conclave to elect a new successor of St. Peter. Do not mock us. So the White House defended Trump, emphasizing his previous visit to Italy to pay respects to Pope Francis and highlighting his support for Catholics and religious freedom.
The incident highlights the growing influence and risk of AI-generated content in political and religious contexts, showing how digital images can quickly stir controversy and impact public perception. The controversy also serves as a reminder for public figures and organizations to carefully consider cultural and religious sensitivities when using AI tools, especially during times of mourning or other significant events. All right, y'all.
Put your politics aside: this is not good. This is not good, right? You know, I'm from the U.S., and I know people are going to send me hate mail and say, how dare you cover this, right? But this is world news. This is the biggest news right now, right? When I open my phone, when I open my browser,
This is the biggest news right now. So, you know, I'm just putting that out of the way because anytime, you know, I cover something that's happening with U.S. President Donald Trump for better or worse, I always get a lot of hate mail either way. But let me just say this. This is not a good look for the U.S., right? Yeah.
Not a good time. Not a good time to be playing around with AI imagery, right? President Trump was very heavy into using AI images during the campaign, which was maybe a little more allowable, but still probably not in good taste because of the way he was using it.
But this is pretty bad. This is pretty bad. Let's just can we all agree on that? Right. Regardless of your political leanings, probably not a good idea for U.S. President Trump to put out an image of himself as the pope during this sensitive time. All right. Moving on.
So our next piece of AI news: Apple and Google are in advanced talks to bring Google's Gemini AI model into Apple Intelligence, with Google CEO Sundar Pichai confirming ongoing discussions with Apple's Tim Cook throughout 2024, according to court filings. The deal is expected to be finalized later this year, coinciding with the planned rollout of iOS 19, making it likely that a potential Gemini integration could be announced at Apple's WWDC 2025 and possibly appear in early iOS 19 beta versions.
So this partnership would follow Apple's existing arrangement with OpenAI for ChatGPT integration, which reportedly involved no direct financial exchange between Apple and OpenAI, but mutual benefits. It remains unclear if the Gemini deal would include payments.
So Gemini's integration would mark a significant shift as Apple increasingly incorporates third-party AI models into its Apple Intelligence Suite, potentially offering users multiple chatbot options such as ChatGPT and Gemini.
The move could help Apple navigate regulatory challenges by providing users the choice of a default AI assistant, which is important in regions like China and the EU, where regulatory scrutiny of AI and digital ecosystems is intensifying. So I don't care what models Apple wants to use, but please, actually start using these and updating Apple Intelligence, right? I say this because, well, I didn't buy a new iPhone just for the new Apple Intelligence, right? But some of these features are only available on the newest iPhone, and I was probably already due for a new iPhone, so honestly, it was one of the reasons. I'm like, okay, you know, there's a smarter Siri coming, right? But some of these other features that Apple teased at its WWDC 2024 event are still not out, and they may not be out for multiple years, which is also why Apple is facing multiple class action lawsuits over the reported delays and over marketing tactics showing off Apple Intelligence features that are not yet available. So, hey, if partnering with Google as well as OpenAI helps Apple make Apple Intelligence useful, I'm all for it.
But please, Apple, do something. Right. My devices are not any smarter, really, than they were two years ago. Right. The fact that I can make an AI emoji or something like that, that doesn't help me in my personal life. That doesn't help me grow my business. So please, Apple, do something. Partner with Google. Anyone. Just make Siri useful. All right.
Our next piece of AI news: Anthropic is rolling out Integrations for Claude, at least if you are on their higher-tiered plans, their Max, Team, or Enterprise plans. So Claude is rolling out what I think are some pretty impressive new integration features, but right now they are only available on those higher-tier plans and not the baseline Pro plan. But here are some of the connected apps: Jira, Confluence, Zapier, Intercom, and more. So this allows the AI in Claude to access real-time project context, task statuses, and organizational data from those tools.
I'm interested in the Zapier one because Zapier connects to thousands of tools. So this could really change, I think, the utility of large language models. So by connecting your work apps, Claude moves beyond simple responses. It can perform actions such as creating tickets,
summarizing documentation and automating workflows across platforms within a single conversation. Also, Anthropic is rolling out their new advanced research mode, which is also in beta right now, only available on those higher tiered paid plans, which enables Claude to conduct
deep, multi-source investigations lasting up to 45 minutes. So yeah, this is kind of like the deep research tools, but it can also go across the web and across all of those integrations, which is the part I'm most interested in. So it breaks down complex queries into smaller parts and gathers detailed reports with clear citations from web search, Google Workspace, and different connected integrations.
This extended research capability could save users significant time by generating thorough, source-backed insights that would otherwise require hours of manual work.
Right now, web search functionality is globally available to all Claude plans; they just rolled that out about a month or two ago. But this new, deeper research tool that can work across all of these new connections is only available on the higher-tiered plans. So at launch,
Integrations supports 10 key services: Zapier, Jira, Confluence, Intercom, Asana, Square, Sentry, PayPal, Linear, and Plaid, with more integrations from Stripe and GitLab planned.
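Anthropic says these Integrations are built on its Model Context Protocol, so Claude can reach out to remote tools mid-conversation. For a feel of the underlying pattern, here's a minimal sketch using plain tool use in the Anthropic Python SDK; the create_ticket tool, its schema, and the model alias are hypothetical illustrations of the idea, not the Integrations feature itself.

```python
# Minimal sketch of the tool-use pattern that Claude's Integrations build on.
# The "create_ticket" tool and the model alias are hypothetical, for illustration only.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

tools = [
    {
        "name": "create_ticket",
        "description": "Create a ticket in the team's issue tracker.",
        "input_schema": {
            "type": "object",
            "properties": {
                "title": {"type": "string"},
                "description": {"type": "string"},
            },
            "required": ["title"],
        },
    }
]

message = client.messages.create(
    model="claude-3-7-sonnet-latest",  # assumed alias; use whatever model your plan offers
    max_tokens=1024,
    tools=tools,
    messages=[{"role": "user", "content": "File a ticket for the login timeout bug we discussed."}],
)

# If Claude decides to call the tool, the response includes a tool_use block;
# your code would then hit Jira, Zapier, etc., and send the result back to Claude.
for block in message.content:
    if block.type == "tool_use":
        print(block.name, block.input)
```

The Integrations feature essentially wires that loop up for you against hosted services, which is why Claude can create tickets or kick off Zapier workflows without you writing the plumbing.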
So right now they are saying these features will be available on the Pro plan soon. So hopefully they will be, and I'll do an episode on this. Well, actually, live stream audience, let me know if you'd like to see an episode on this. Maybe I'll just
upgrade my plan to the Max plan. Although, I don't know, I don't feel like doing that, because if I'm being honest, I think Claude's $20-a-month plan is the worst deal in AI. Even, I'd say, a lot of people don't like Copilot Pro, right? Most people don't even know what Copilot Pro is, from Windows or from Microsoft. But there's a web
version of Copilot. It's different from Microsoft 365 Copilot, and a lot of people don't even know about it or use it. But if I'm being honest, I think even Microsoft Copilot Pro for $20 a month is a better value than Claude, just because, and I've been saying this for so long, Claude's limits are so low. I can use Claude for 10 minutes on a paid plan and hit my limit.
So it's like, you know, they're almost forcing you to pay a hundred dollars a month for this Max plan in order to use their simple services, right? You get more out of the free plans from Google and from OpenAI than you do out of the $20 base plan for Claude. So I'll have to see if this new Integrations feature is really worth it. If so, I'll go ahead and bite the bullet for you all and get that $100-a-month plan, if there are enough of you who want to see those integrations at work. All right.
Two more pieces of AI news. We're going all the big labs back to back to back. So this one I'm excited about: Google has upgraded NotebookLM with Gemini 2.5 Flash. So that's pretty big. NotebookLM, by the way, was our 2024 AI tool of the year. And if you have not used NotebookLM, why?
Stop like stop what you're doing. Stop listening to me. Go use it right now. It's amazing. Anyways,
Previously, it was running on their Gemini 2.0 Flash model, but now it has been upgraded to Gemini 2.5 Flash. And that's huge. Here's why. Because it introduces improved thinking abilities. Yeah, 2.5 Flash has the ability to use that hybrid reasoning or logic. Combine that with, you know, millions of words of your own data in a grounded model.
I can't wait to use this a little more. It just came out about a day or two ago, this new 2.5 update. So haven't had a ton of time to play with it yet, but already it's pretty impressive.
So this upgrade allows NotebookLM to provide more thorough and nuanced answers, particularly for complex multi-step reasoning tasks, enhancing its usefulness for users who rely on it for detailed information and analysis. NotebookLM also, and this one's pretty big, added support
for more than 50 languages in the incredibly popular and often viral audio overviews or the Deep Dive AI podcast enabled by Gemini's native audio features, making content more accessible and easier to consume for users around the world.
So let's talk a little bit more about the Gemini 2.5 Flash model. It's currently available as an experimental feature within the Gemini app. And if you really want to know the difference between Gemini 2.0 Flash and Gemini 2.5 Flash, you can go inside Google's AI Studio and run the two models back to back with the same prompt, and you'll see Gemini 2.5 Flash is much, much better.
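If you'd rather compare the two models from code than click around AI Studio, here's a minimal sketch using the google-generativeai Python SDK; the exact model identifiers are assumptions, so match them to whatever your AI Studio account lists.

```python
# Minimal sketch: run one prompt against Gemini 2.0 Flash and 2.5 Flash and compare.
# Model names below are assumptions -- use the identifiers AI Studio shows you.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # free key from Google AI Studio

prompt = "Explain, step by step, why a grounded model is less likely to hallucinate."

for model_name in ["gemini-2.0-flash", "gemini-2.5-flash-preview-04-17"]:
    model = genai.GenerativeModel(model_name)
    response = model.generate_content(prompt)
    print(f"\n===== {model_name} =====")
    print(response.text)
```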
So if you really enjoy NotebookLM like I do, you can probably understand why I am super excited about this. So Gemini 2.5 Flash becomes the standard inside NotebookLM, and Google may change the access model for users, potentially ending free access to the 2.5 Pro version of NotebookLM.
For developers interested in integrating or experimenting with NotebookLM's underlying tech, Google has released a preview version of Gemini 2.5 Flash inside Google's AI Studio. So these updates suggest that Google's commitment to evolving NotebookLM remains strong. Yeah, because early on, I'd say NotebookLM wasn't getting a lot of fanfare, and I think it was actually the AI audio overviews, right, the Deep Dive AI podcast, that really blew NotebookLM up. But yeah,
we did a video review on it. We talked about it on the show well before that. It was the first tool that made it simple to ground a model in just your sources. So let's say, as an example, you upload a hundred different sources about AI, all your information, and then you ask the model about basketball. It's going to be like, yo, I don't know basketball.
It's not going to hallucinate. It's not going to go use the web. It's just going to say, hey, I only have information about AI, or whatever it is that you uploaded sources and documents about. So NotebookLM, I think especially for beginners, is one of the best ways to get to use generative AI and see its value. Because one of the big problems with using other models that are connected to the internet
and that rely on their own internal training data first is hallucinations. So you either have to be, you know, pretty knowledgeable about how large language models work, how they're made, how they think, how they reason, how they apply logic, how they search the web, right? Or
use something like NotebookLM that is grounded. So if you ask it a question that you haven't uploaded information on, it's just going to come back and say, hey, can't help you there. So it's a great way for beginners to get a ton of utility, but also just cut down on hallucinations; the quick sketch below shows that grounding idea in its simplest form.
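Just to make that grounding idea concrete, here's a minimal sketch of the simplest possible version: stuff your sources into the system prompt and tell the model to refuse anything outside them. NotebookLM does far more than this under the hood; the OpenAI SDK, the model name, and the my_ai_notes.txt file are all illustrative assumptions.

```python
# Minimal sketch of grounding: answer only from supplied sources, refuse otherwise.
# This illustrates the concept, not how NotebookLM is actually built.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

sources = open("my_ai_notes.txt").read()  # hypothetical file of your own material

system_prompt = (
    "Answer ONLY using the source material below. If the answer is not in the "
    "sources, reply exactly: \"Hey, can't help you there -- that's not in my sources.\"\n\n"
    f"SOURCES:\n{sources}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Who won the 1998 NBA Finals?"},  # not in the sources
    ],
    temperature=0,
)

print(response.choices[0].message.content)  # should be the refusal line
```

All right. Our last piece of AI news.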
Are you still running in circles trying to figure out how to actually grow your business with AI? Maybe your company has been tinkering with large language models for a year or more, but can't really get traction to find ROI on Gen AI. Hey, this is Jordan Wilson, host of this very podcast.
Companies like Adobe, Microsoft, and NVIDIA have partnered with us because they trust our expertise in educating the masses around generative AI to get ahead. And some of the most innovative companies in the country hire us to help with their AI strategy and to train hundreds of their employees on how to use Gen AI. So whether you're looking for ChatGPT training for thousands,
or just need help building your front-end AI strategy, you can partner with us too, just like some of the biggest companies in the world do. Go to youreverydayai.com slash partner to get in contact with our team, or you can just click on the partner section of our website. We'll help you stop running in those AI circles and help get your team ahead and build a straight path to ROI on Gen AI.
Remember I said Meta is going after OpenAI, and OpenAI is going after Google? Well, this is how OpenAI is going after Google. They've introduced something which I think could be either hot or a flop; we'll see how the market responds and how much OpenAI continues to invest in this. But they've added shopping to ChatGPT. So OpenAI has announced, and has already started to roll out this feature to paid users, that soon all ChatGPT users, whether they're signed in or not, on a free account or paid, will be able to buy products through shopping buttons integrated into AI-powered search results, through to checkout. Yeah, so that's pretty big, but the purchase would happen on the merchant's website, at least for now. So you won't be buying anything inside of your ChatGPT account, although they say that could be coming down the road.
So OpenAI demonstrated how this feature helps users research items. In the demo they did, if you're looking up, like, a new office chair or an espresso machine, instead of spending 45 minutes on Amazon or, you know, Google Shopping, you can just ask.
And ChatGPT via this new integration is going to go through and look at reviews, look at articles, look at product write-ups, et cetera, to answer those questions conversationally. So ChatGPT already handles over a billion web searches weekly, which is bonkers.
including many related to shopping categories such as beauty, home goods, and electronics showing demand for AI-assisted product research.
Unlike Google Shopping, though, which mixes paid placements and organic results, OpenAI's product recommendations are currently all organic and not sponsored, focusing on genuine user reviews, which I think people might actually like, right? How many times have you put something into Google Shopping, seen something at the top that you think is amazing, and then realized, oh, this is a dud, it's just some company that put an ad out there? Because you see what's at the top, and if you don't notice it's clearly marked as an ad, you just automatically assume, oh, it's got to be good if it's coming up at the top of Google Shopping, and then you realize it's not.
So right now, OpenAI is not going to have that. It's just going to be organic content. And their product suggestions will prioritize conversational personalized recommendations by understanding the pros and cons, as well as user preferences, rather than relying solely on keyword algorithms. So yeah, OpenAI also announced a couple of weeks ago its new ability to kind of understand all the information from your chats. So let's say...
Let's say, as an example, you always prefer higher-end items, but you don't want to overpay, right? So if you ask for, you know, a coffee machine, it might not show you a Keurig, right? But it might show you a Nespresso machine on sale. Nespresso, sponsor the podcast, come on. I'm drinking Nespresso almost every single show. All right.
Just throwing that out there. But you know, that's a good example, right? So if you're constantly having chats inside ChatGPT about your preferences, and it remembers them, it's also going to work those into your shopping experiences. So ultimately this is like having a personal shopper that understands your preferences, and you don't have to waste a bunch of time. So I'm pretty excited to start using this a little more. I'm not someone that does a lot of online shopping, but I actually do a lot of product research. I don't know, maybe I'm like 70 years old, because I like to go into a store and buy something with my hands, but I like to do a lot of the research online first. So I think, even for myself, even though I don't do a lot of online shopping, I'm going to get a lot of use and a lot of time back out of this as well. The reviews ChatGPT pulls include a mix of editorial sources and user-generated content, such as Reddit, and users can specify which types of reviews they want prioritized in their product recommendations.
So this comes at a pretty important time for OpenAI financially, because OpenAI is aiming to grow revenue significantly, with projections of $125 billion in revenue by 2029, up from just under $4 billion last year. And affiliate fees may become a pretty big part of that strategy, though details remain unclear. So what does that mean? Well, you know, if you are using ChatGPT and you click on a product and end up buying it, OpenAI is likely going to get
a part of that sale, right? So kind of affiliate marketing, which is very common for big companies to do. So this one could be something that if it's useful or if it's as useful as I think it could be, number one,
I think it could be a huge source of revenue for OpenAI. But number two, I think this could change shopping. And as much as the headlines are saying that OpenAI is challenging Google, which on the surface is true, you know who should really be worried? If I'm Amazon, this is what I'm scared of, right? I don't think Google is going to be as concerned about this as Amazon should be, because, obviously, OpenAI's AI is leaps and bounds better than Amazon's, right? Amazon, yes, we talked about their new Nova models. They're fine; I don't think they're anything great. Obviously Amazon is a big financial backer of Anthropic, but I don't know. Has anyone used Amazon's AI on amazon.com, in the Amazon shopping experience, Rufus?
It's not good, right? And I think where we're going as an AI-first society is we don't want to waste, you know, 10, 20, 30 minutes
researching a single product. So I think Amazon was trying to get ahead of it with their Rufus AI, which is okay. But if I'm being honest, like from what I've used of it, it's more of like you're searching just certain keywords, right?
inside reviews, and the reviews on Amazon, I think, are a little gamed anyway. So I like that this new ChatGPT shopping feature uses multiple sources for reviews, and you can steer it, and it's personalized. So yes, I think this is going to chip a little bit into Google Shopping, which I use a ton, but ultimately I think this could hurt Amazon's pocketbook just as much, if not more.
The new shopping feature follows other AI shopping efforts by OpenAI, such as the Operator agent, which can also control browsers for tasks like grocery shopping, though initial feedback on Operator noted some clunkiness. Yeah, Operator: great in theory, not quite there. But, you know, here's another means to that end if you're trying to use AI for shopping. Hey, also,
Should we do a show just on this? I think there's a couple of updates this week that I think could just be dedicated episodes on. So live stream audience, let me know what you think. Or, you know, if you're listening on the podcast, you know, you can always reply to the daily newsletter and be like, yeah, Jordan, cover this or don't cover this. Right. Ultimately, we do what you want. So you should just let us know. All right. That is a wrap.
the AI news that matters. Let me do a very quick bullet-point recap of the biggest AI news stories for the week. So first, Amazon and AWS launched Nova Premier, their most capable AI model. Meta launched their free Meta AI app, powered by Llama 4, to bring personalized AI to, they hope, millions or maybe billions of people eventually, kind of going after ChatGPT there and trying to get in on that action. Google has opened up its new AI Mode to all U.S. users, as well as adding some new shopping and local business features. Duolingo was in the news for its somewhat controversial call to replace a lot of contract workers with AI, but it looks like they're crushing it from a revenue perspective
And, you know, putting out a ton of updates. U.S. President Donald Trump is facing backlash for releasing an AI generated image of himself as the pope, as the Catholic faith mourns the passing of the pope.
Next, Apple and Google are reportedly near an agreement to integrate Gemini AI into Apple devices that could be announced this year at WWDC. Anthropic has rolled out some new and pretty impressive integrations.
But right now, those are only available on the higher-tiered paid plans. Google has upgraded NotebookLM with Gemini 2.5 Flash, as well as bringing support for over 50 languages to the AI audio overviews. And OpenAI has introduced a shopping feature, with shopping buttons, inside of ChatGPT.
All right. That was a ton, y'all. Thank you for tuning in. Thank you for, you know, sticking with me, even though I sound like this. Maybe this is your first time. I don't normally sound like this. My allergies are absolutely kicking my butt. And, hey, if you're in Boston,
Let me know. So I'm actually going to be partnering with IBM this week, starting, I guess, technically today and tomorrow. I'm in Boston right now for their IBM Think Conference. So pretty excited about this one. I'll be having a lot more information this week about IBM, maybe an interview or two. I'll be at the keynote on Tuesday. So make sure to tune in for that. So
While you're making sure to do things, make sure to tell your friends that this was helpful. Repost this. Tell someone about it. But also, most importantly, please go to youreverydayai.com and sign up for the free daily newsletter. We're going to be recapping all of the biggest AI news stories for the week, as well as keeping you up to date with everything fresh that is breaking right now. So thank you for tuning in. Hope to see you back tomorrow and every day for more Everyday AI. Thanks, y'all.
And that's a wrap for today's edition of Everyday AI. Thanks for joining us. If you enjoyed this episode, please subscribe and leave us a rating. It helps keep us going. For a little more AI magic, visit youreverydayai.com and sign up to our daily newsletter so you don't get left behind. Go break some barriers and we'll see you next time.