
Sky and Our 2025 Shortcuts and Apple Intelligence WWDC Wishes

2025/6/1

AppStories

People
Federico Viticci
John Voorhees
Topics
John Voorhees: I think what sets Sky apart from other large language model chatbot apps is its built-in constraints, such as shell scripting, AppleScript, and Shortcuts. That makes it well suited to automation, scaling from simple chatbot queries to complex automated workflows. Sky lets users create automations in the moment, while they work, without the abstract up-front planning that Shortcuts requires. By leveraging large language models, Sky can execute tasks on the fly or suggest creating a script for them. It makes creating and saving custom shell scripts so easy that even users who aren't comfortable with shell scripting can complete complex tasks quickly. Sky can create tasks from the contents of the active window and pull information from the iTunes Search API, streamlining workflows. It can automatically append affiliate tokens to iTunes and Apple TV URLs and synthesize web page and email metadata into Obsidian notes. Sky acts as a bridge between Shortcuts, scripts, and tools, which gives it a unique advantage over other Mac automation utilities. Federico Viticci: I think Sky's combination of the simplicity of large language models with the power of automation has a lot of potential. Its flexibility means every user's use case will be different, which makes it very interesting.

Chapters
This chapter explores the initial reactions to Sky, a Mac automation app leveraging AI. It compares Sky to other automation tools and highlights its unique capabilities.
  • Sky, created by former Workflow/Shortcuts developers, integrates AI with shell scripting, AppleScript, and Shortcuts.
  • It offers contextual automation, allowing users to perform tasks while working, rather than abstract workflow building.
  • Sky's ability to generate scripts based on user prompts simplifies complex tasks for non-programmers.
  • Sky's contextual awareness allows it to interact with the active application, extracting data for automation.

Transcript

Hello and welcome to another episode of App Stories. Today's episode is brought to you by Notion and Pee. I'm John Voorhees and with me is Federico Viticci. How are you? Hello, John. How are you? Good, good. Did you see what I did there? It was like Pee rhymes with Viticci, kind of, sort of. Sure, sure. Okay. All right.

All right. Federico, Federico. I want to remind everybody who's listening today that they can go over to our website, macstories.net slash podcasts, where they'll find a link to give us feedback. If you want to, you know, tell us what you think of our wishes, share some of your own wishes for various things. We'd love to hear it. Because today we're going to have our final

episode of the wishlist series. We're going to talk all about Shortcuts and Apple Intelligence, because those things now are kind of, you know, linked together in a lot of ways, thanks to App Intents.

They kind of go together, and so we're going to have a bit of a hybrid episode, so to speak. But I think before we get to that, we need to talk about Sky. We do. We really do. The new app from the creators of Shortcuts, Ari Weinstein, Conrad Kramer, and Kim Beverett. They

left Apple a couple of years ago, and they revealed what they've been working on, this AI-powered assistant for the Mac called Sky. Now, I published an in-depth preview of Sky on MacStories, and I talked about it on Connected, and I shared a lot of thoughts on MacStories. So, you know, I've covered my...

My end of the bargain. But what maybe people don't know is that you have also been able to test Sky. I have. And this is a chance for you to talk about Sky from your perspective, which I think is very different from mine because we do different things on our computers. And so I wanted to get these first impressions about Sky from you, a different perspective about this app.

Yeah. And I think for me, one of the things that has really set Sky apart from just kind of your typical LLM chatbot apps, you know, something like ChatGPT or Claude or Gemini, is that it has built-in constraints. It has constraints for things like

shell scripting and AppleScript and Shortcuts. And in some ways that makes it almost like an app like Cursor, but for your average, everyday person

doing automation, because it's perfect for doing all kinds of automations. It scales really nicely, I think. You can do general chatbot kind of queries. I can hit the hotkey and say, really quickly, is Stack Overflow one word or two? I don't remember. And get an answer, you know, little questions like that.

Or you can create automations and use those automations as part of your workflow. And those can be based on shell scripting, AppleScript, just general prompting, as well as Shortcuts. And I've been using it that way an awful lot,

and have really liked it, because you're trying to do these things contextually, in the moment when you're working. You're not doing the kind of thing you would necessarily have to do with Shortcuts, where you're like, let me think about my workflow in the abstract and build something which is deterministic and rigid that will fit best with how I do this task.

Whereas with Sky, because it can leverage large language models, it allows you to kind of be in the moment, working and doing a task, and then just ask it to do it and see what happens. And sometimes it can just do it, because it's a relatively simple thing and it's got a built-in process for that, whether it's related to Calendar or Reminders or Mail or something like that. Or

it will suggest putting together some kind of script. And what I've been doing a lot, and I think one of the most powerful things, is I'm not a big shell scripter, but I can go into the custom tools section. I can choose shell script, and I can describe what I want my shell script to do. I can say,

take this URL on my clipboard and replace the tokens in the parameters of the URL with these other things. And then the best part about it, because it's automation, is I can save that with a simple name. And next time I have to do that, I can just run it again. And it makes it incredibly easy to perform everything from the simplest small tasks to much more complex things. I mean, I've done things like

open up the App Store, and because Sky is contextually aware of what's in the window, I can tell it, create a task in my to-do list

with the name of the app in the title and the details about the app in the description. And it just does it. It pulls it out of the window. Or, what I've actually done is hooked it up to the iTunes Search API, which is a, you know, freely available API, and just pull all that out

and synthesize it into the pieces that I find important and have it dropped into the task. That's a lot faster than the way I used to do it. I've done things like, as I said, I find an iTunes URL, an Apple TV URL on the web, and I can automatically append affiliate tokens to it. You can take a Sky shot, which is a

form of screenshot with a bunch of extra metadata attached to it, of a couple of different things, say, like a web page and an email. The web page is for the app, the email is from the developer, and I can synthesize all that into one summary and drop it into an Obsidian note, so I have all that data in one place for when I sit down and want to write about the app.

Obviously, a lot of what I've been doing is app related, since we write about that kind of thing so much and we get information coming at us about it from so many different directions. But you can use it for all those kinds of things and combine it as kind of a glue between your shortcuts, your scripts, and your tools, which is very different to me than

what we already have with other automation tools on the Mac.
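As a rough illustration of the kind of script John is describing, here is a minimal Python sketch of the two URL tricks mentioned above: replacing query-string tokens in a URL and appending an Apple affiliate token via the `at` parameter. The clipboard handling is omitted, and the specific URLs and token values are hypothetical; only the string manipulation is shown.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def replace_params(url: str, overrides: dict) -> str:
    """Swap query-string parameters in a URL for new values."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    params.update(overrides)  # swap in the new token values
    return urlunparse(parts._replace(query=urlencode(params)))

def add_affiliate_token(url: str, token: str) -> str:
    """Append an affiliate token as the `at` query parameter,
    the parameter Apple's affiliate links use."""
    return replace_params(url, {"at": token})

# Hypothetical values, for illustration only
print(replace_params("https://example.com/app?id=42&token=old", {"token": "new"}))
# → https://example.com/app?id=42&token=new
print(add_affiliate_token("https://apps.apple.com/app/id12345", "1000abcd"))
# → https://apps.apple.com/app/id12345?at=1000abcd
```

The point of Sky's approach is that a user never has to write this by hand: describing the transformation in a prompt is enough, and the saved tool can be re-run by name.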

Yeah, that's interesting. Using it with the iTunes Search API is totally something that you would do. I have a long history with that API. Yeah, I know. It's a terrible API. Yeah, really, really bad. But no, this is what's so great about Sky. I think the idea of combining the simplicity of LLMs and prompting and just having a chat

with a little bot, with the automation. This is something that I tried to explain in the story, trying to cover this spectrum from the simple tasks and the simple interactions, extending all the way to doing things like AppleScript and shell scripts and Shortcuts. I think there's so much potential in that. And now,

sort of the waiting game begins to see, can they actually deliver this app by the summer? And over the next few months, I'm sure new models will come out from the companies that Sky and Software Applications are using as their backend. So it'll be interesting to see what happens from now, in this closed alpha version that we have,

to the point where the app needs to launch to the public, will they have more integrations? Like, for example, MCP is not working in the version that we have, especially with the Zapier flavor of MCP that we like to use. So it'll be interesting to see, and I look forward to more people getting their hands on Sky and getting their impressions because it's one of those apps where, because it's so flexible, everybody's use case will be different.

And that makes it interesting, in my opinion. Yeah, I would love to see something set up where third-party developers can kind of contribute hooks into their apps

that Sky can use, similar to what it's doing with some of the Apple apps, or maybe a way for users to share the things they've built. I mean, we'll get to Shortcuts in a minute, but just the fact that I can go into the custom tools and use a prompt to create a short shell script that does something for me

is such a breath of fresh air, because unfortunately Shortcuts on the Mac is kind of buggy a lot of the time, and I feel like I'm fighting with the editor more often than not. Whereas if I just open up the preferences in Sky and ask for a script, I get what I want. It works fine.

And I'm off and running, which is just really, really nice. And also why I've been using shell scripts probably more than hooking in existing shortcuts that I have. Yeah. All right. Well, thank you for the first impressions. Thank you.

This episode of App Stories is brought to you by Notion. Notion AI is the all-in-one AI powered by your work, all in one place. It automatically captures meeting notes, instantly finds the exact content you need, and it drafts detailed docs for you. Plus, you can chat with the best AI models. Notion AI just became twice as powerful for teams, making it the best AI tool for work.

Notion is a terrific product because it allows you to gather information all in one place and view it in a variety of ways. Plus, you can collaborate with all of your coworkers. It's the kind of tool that really does it all. And with Notion AI, it really takes it to the next level because the AI lets you do things like create complex natural language searches, generate new content from things like meeting notes where you can pull out

tasks or summarize what everyone had to say at the meeting and share it with colleagues who couldn't make it. It's a really remarkable product at this point that can do all sorts of things. Plus, it's got, you know, there's new calendar and email integration too. So it's really the complete solution for helping you get your work done.

And with Enterprise Search, you can use a few keywords or ask an open-ended question for a single, powerful search experience across all of your connected tools, unifying scattered knowledge right within your workspace, plus a quick AI summary of results.

You can search across apps too, including apps in the Microsoft ecosystem, Google Workspace, and common business tools like Salesforce. It can even search within PDFs.

Notion has top AI models built right in. Choose the best AI model for the task, like GPT-4.1 or Claude 3.7 Sonnet, and chat with it directly in Notion. No separate subscription or tab needed. Notion's used by over half the Fortune 500 companies, and teams that use Notion send less email, cancel more meetings, and save time when they're searching for their work.

They reduce spending on tools and they keep everyone on the same page. The fastest growing companies, like OpenAI, Ramp, and Vercel, use Notion AI to make processing faster and help their teams stay ahead. Check out Notion, the best AI tool for work, right now at notion.com slash

AppStories. That's all lowercase letters, n-o-t-i-o-n dot com slash AppStories, to try the powerful all-in-one Notion AI today. And when you use our link, you're supporting our show: n-o-t-i-o-n dot com slash AppStories. Our thanks to Notion for their support of the show. Let's talk about Shortcuts and Apple Intelligence. All right.

Okay, so surprisingly, at least to me, I have more wishes about Apple Intelligence than I do about Shortcuts. Okay. If anything, because the things that I shared about Shortcuts, you know, my Shortcuts wishlists over the years, have been pretty consistent. Yeah. And the things I wished for never came true. So on one hand, I could say, well, you can go back and listen to our previous Shortcuts wishlist episodes, and many of those things still need to happen.

From personal automations on the Mac to more debugging tools to better stability to, I don't know, there's so many things that I mentioned. Let me ask you this, Federico. Given that history, the long history of unfulfilled wishes, do you think, with App Intents being the focus now, that Apple has kind of put Shortcuts on a shelf and it's not going to change much?

I don't know. Yeah, kind of, maybe. I do worry about that a little bit. I worry about that. I do think that Apple has made Shortcuts on the Mac, especially the editor, slightly better over the years. It's more stable. It's faster. It's easier to work with. I personally find it easier to work on Shortcuts for Mac than I do on Shortcuts for iPadOS, for example. Yeah, for sure. I am concerned that if this year...

The focus of WWDC will be a redesign, that, once again, the Shortcuts team will just implement the redesign rather than, once again, address the underlying limitations, especially for power users. And I do think that I kind of understand that, too, given that the priority should be App Intents and should be Apple Intelligence.

Given the tech landscape right now and the position that Apple is in, I think I would understand. I mean, I would love to see power-user Shortcuts features, but I would also understand why that's such a niche market for Apple right now that it's not a priority. True. Although I think, given where Apple is with App Intents and Siri, if they get that working and if you...

are using this, you know, smarter Siri to do something in multiple apps at one time and it works, I'd really like the ability to save that as a shortcut. I mean, I think that there's a world where those two things go hand in hand because having to ask for the same verbal request over and over

It's not great. I mean, it'd be great if it's a smarter Siri, but not if I have to say the same incantation every day, over and over and over. Being able to turn it into a shortcut that I can click or tap as a button would be a lot better in some circumstances. Yeah. Yeah. We'll see. I don't know what they're going to do with Shortcuts, but I do want to talk about Apple Intelligence.

All right. And the first thing I would love to see in Apple Intelligence is support for actually letting developers use the Apple Foundation Model that they have. So, AFM. And this, according to Mark Gurman, is something that Apple plans to do, to open up their own large language model to developers this year. So in case people don't know, Apple does have a large language model. It's called the Apple Foundation Model, AFM for short.

It comes in two versions, AFM on-device and AFM server. So AFM server is the one that lives on Private Cloud Compute.

It's the cloud-hosted, server-based version of AFM that runs on Apple's clusters. And there's AFM on-device, which is the Apple Intelligence on-device processing. That is an LLM. It's not a chatbot. It does not currently power a chatbot, if you're thinking of ChatGPT or Claude or that sort of thing. But it does power on-device processing for things like summaries,

Genmoji, the searches in the Photos app, that sort of stuff happens with a small model that is on your device. Now, that model is used for a variety of tasks, but from Apple only. And the rumors suggest, and I think it's a good idea, and it's something that I would like to see, that Apple is going to open up

AFM to third-party developers. That is all that Mark Gurman said, that Apple is going to open up their model to third-party developers. I think that's a good idea and I want it to happen, especially if there's a major update to AFM. So I do know that AFM gets updated on a regular basis. For example, AFM was updated over the past few months to add more support for languages for the Apple Intelligence rollout.

And so even though we only have the one technical white paper from last year about AFM, you can read it. It's online. There's a PDF from Apple's researchers that you can read.

I'd be interested to see if there's a major new version of AFM, like AFM 2 or something, that can be used by third-party developers to add LLM processing to their apps, instead of having to use ChatGPT or Google or Claude for everything. Right. I'm not saying that Apple is going to roll out an LLM capable of chatbot-like features.

I don't think AFM is a big model that is going to support, especially AFM on device. It's not a big model. It's only like a few gigabytes. It's like, I don't know, like a 3 billion. Is it a 3 billion parameter model? It's a really small model that can run on iPhones to give you an idea.

But it's a sort of model that can be used for summarization. So if you have an app, say you're making an RSS reader and you want to offer a way to summarize articles in RSS, well now you, in theory, you could be able to use it locally in a privacy conscious way at no cost just by using the official Apple frameworks and the Apple APIs. And faster too because it wouldn't have to go out to the internet. And it would be offline so it wouldn't need to use the internet.
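Apple hasn't published an API for AFM, so there is nothing real to call here yet. As a stand-in to make the idea concrete, this toy Python sketch does naive extractive "summarization" by keeping the sentences whose words occur most often in the text; a real on-device model would be abstractive and far better, but the shape of the call, text in, shorter text out, is the point.

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> str:
    """Toy extractive summary: keep the sentences whose words are
    most frequent across the whole text, in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Score each sentence by the total frequency of its words
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"\w+", sentences[i].lower())),
    )
    keep = sorted(ranked[:max_sentences])  # restore document order
    return " ".join(sentences[i] for i in keep)

article = "Apple makes models. Apple models run on device. Bananas are yellow."
print(summarize(article, 2))
# → Apple makes models. Apple models run on device.
```

An RSS reader could expose exactly this kind of one-function interface and let a system framework supply the actual model.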

There's also an argument to be made for the fact that Apple could also open up web APIs for using the bigger...

better version of AFM server. I'm not convinced that Apple is going to roll out Private Cloud Compute APIs for third-party developers. Happy to be proven wrong, obviously. I could see it happening someday, but it's barely being used, I think, at this point. And I think that they would probably be concerned about load and all kinds of other things in that scenario. Yeah, but a small version of AFM, let's call it AFM 2. Yeah.

I could even see a distilled version of AFM that is like AFM, but trained on DeepSeek or Qwen or Mistral or any of these other open-weights models. But a new version of AFM that runs on device and can do things like summarization, image classification, basic search, that sort of small, on-device, offline, private stuff,

they should do it. And if you're a developer, you should consider it, especially if it works well. Yeah, I think so too. Although I do think that politically it would be hard for Apple to use a Chinese

model to distill their model. I don't think they would. And that, you know, makes the case for something like Mistral, perhaps. I mean, there may be others that would be usable. I can't see them using any of Meta's models either, just because of the competition between the companies. You know, I am going to dive into a few things with Shortcuts for you, Federico, because one of the things that struck me with Sky, that we just talked about,

is the ability, when you're putting together a custom tool in the script editor, to use the models to generate your script and test it. And, you know, Apple has had a Run Shell Script action on the Mac since the first day that Shortcuts arrived on the Mac,

but it would be so much better if it had a similar tool built into it, whether it's its own model or another one where you could get some help in putting together that script. And, you know, I'll take it even one step further. I feel like an AI developer

sidebar tool, a helper app, something like Cursor, an app that a lot of developers use to write their code, could work with Shortcuts. Have something that is trained specifically for Shortcuts, that knows all of the actions on your device, all the parameters, and can suggest ways to accomplish things, where you say to it,

I want to be able to take all of my tabs in Safari and convert them into Markdown and then send them to an RSS client. What apps and actions do I need to use and how would I put it together? I think that that would be a terrific way to kind of take

take people from the beginner step to a more intermediate or advanced step in building shortcuts. I mean, one of the things I've found in using Claude Opus 4, and I wrote about this for the Club, is I spent a day over the weekend just fiddling around with writing a script that takes

all of the items that I star in my RSS reader and deposits them in a specially formatted document in Readwise Reader. It's a 500-line script that I built basically through trial and error with Claude. And I think if you had Shortcuts with something on the side that could help you, it would go a long way to helping people get more comfortable and dig in a little deeper with Shortcuts in a similar way.
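John's Claude-built script isn't public, but the general shape of that kind of glue code, collecting starred items and rendering them into one formatted document, can be sketched in a few lines of Python. The field names and the Markdown layout here are invented for illustration; they are not Readwise Reader's actual schema.

```python
def format_digest(items: list[dict]) -> str:
    """Render starred RSS items as a single Markdown digest.
    `items` uses hypothetical keys: title, url, optional note."""
    lines = ["# Starred Articles", ""]
    for item in items:
        lines.append(f"- [{item['title']}]({item['url']})")
        if item.get("note"):
            lines.append(f"  - {item['note']}")
    return "\n".join(lines)

# Hypothetical starred items
digest = format_digest([
    {"title": "Sky Preview", "url": "https://example.com/sky"},
    {"title": "WWDC Wishes", "url": "https://example.com/wwdc", "note": "read before Monday"},
])
print(digest)
```

A sidebar assistant trained on Shortcuts could generate exactly this kind of scaffold from a natural-language description, leaving only the details to fill in.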

Yeah, I really like what Zapier has done with their Copilot feature. So when you go to Zapier now, especially if you want to create a complex, multi-step automation, you can just type what you want. And the Copilot that they have, I'm not sure if it's based on Claude or something else. I think it is based on Claude, but it's been trained on Zapier,

and it can assemble the skeleton of an automation for you. And it goes a long way toward making it easier to create an automation in Zapier. And it even guides you through the multiple steps. Like it tells you, okay, now go there, fill in the stuff that you need, and then come back. And it's a walkthrough for putting together an automation based on natural language that

dramatically simplifies the process of creating automations in Zapier. I would totally love to see the same thing in Shortcuts. Even if I do know Shortcuts quite well, there are those times where I have this idea in my head, and to go from a long idea to the... It's basically like blank-page syndrome. If you're a writer, it's the same problem if you're a Shortcuts creator. And so...

Having that in Shortcuts, with a little bit of help from the AI, would go a long way, I think. But I'm not sure if Apple has a model that has been trained on Shortcuts, given how they're behind on everything else. I know, I know. But they should, and I hope they consider it in the future. This episode of App Stories is brought to you by Pee, a different kind of water app. Feel better every day by adopting one simple habit.

Most hydration apps ask you to tediously track your intake. Pee takes a simpler approach. Just tap a button when you pee. That's it. You'll get reminders to drink water when you need it most. This one small habit helps you get in tune with your body. It sounds weird, but it really works, and users love it.

The Apple Watch app is a favorite too. Log in seconds on the way to the restroom and keep an eye on your hydration right from your watch face. You can even add friends privately through iCloud so you can ping each other to hydrate. It feels great to get a nudge from a loved one when you're slacking or slumping. Pee is crafted by a solo indie developer who cares about thoughtful design, privacy, and helping people stay hydrated. It's a free download.

So try it today and see how great you feel after just a week. Stick with it and you'll find it's more than just hydration. It's a whole experience packed with thoughtful features and surprises.

Search for Pee Water app in the App Store. And here's a fun promo for App Stories listeners. Once you've downloaded Pee, you can send the developer a secret message through the app. If you're one of the first 10, you'll get a free year of premium features. Pee, elevate your hydration habits. Our thanks to Pee for their support of the show. In Apple Intelligence, I would love to see more third-party integrations.

In addition to ChatGPT, I would love to see a Google Gemini extension. And I would love to see an Anthropic extension to use Claude. I think using Claude with Apple Intelligence, Claude with Writing Tools, especially given that, you know, Claude is the best model for prose in English. And it's also very good in Italian. I don't know about other languages. Claude is the best model for writing. I would love to see Apple Intelligence become a provider

for all of these things, where you can use them for free, or if you have an account, you can log in with your account, and maybe you can use different LLMs for different tasks in Apple Intelligence. It seems like Google Gemini is pretty much a lock for WWDC and iOS 26. Right. It's not...

It's not going to be called 19, it sounds like. But I hope that Apple Intelligence, especially now that Apple is behind and they won't have their own large language model, why not embrace the third-party ecosystem?

I still kind of wish Apple would buy Anthropic. Anthropic is probably too valuable, maybe too big of a, you know, too big of a company for Apple to swallow at this point. But it's also very good at code. And one of Anthropic's problems is resources, because you run out of tokens fairly quickly. And I have run into the same problems you have, where it's like, oh, we just can't reach our server. You're going to have to come back later and try this.

And if you're in the middle of something, that's kind of a problem. Yeah, it sounds like the folks at Anthropic...

spare no expense for training. They are one of the best in class for training and preparing new models, but they always struggle with actual delivery of inference at scale. They just don't have enough capacity. Anthropic is often down, and the API has more outages than other providers. So they could use a company with... Trillions of dollars. Trillions of dollars, like Apple.

I don't know, maybe it's too big a fish, you know? Yeah, that's kind of my thought. Okay, let me give you another Shortcuts wish, which is I'd like to be able to search for actions based

both on the kind of action, but also the app providing the action. So I want to be able to say, find me RSS-related actions from Unread, or, I don't know.

I feel like there's both searching and filtering that could go a long way if you have a lot of apps installed and a lot of overlapping actions. Sometimes it's very hard to find the one you want, or to find the best one, or just see what all your options are, without just kind of scrolling through endlessly long lists by app. That's a nice one. I will go with...

I keep mentioning this idea of Apple embracing the third-party ecosystem. I mentioned cloud-based models. Now I'm going to mention local open-source models. I want to be able to install local models on my devices, especially on phones and tablets, with a unified framework that doesn't require me to download the same model, the same

three-gigabyte file, over and over in different apps. Just let me download it once and use it across my operating system, across multiple apps, from one common source of data. I'm basically thinking of a system that is similar, and you may think this sounds strange, but bear with me. I'm thinking of a system that sounds similar to how you can install fonts on iOS and iPadOS:

you download and install a font once and it's available everywhere. Now imagine if you could download an open source model once and it's available everywhere. So imagine having a model picker as a UI element that lets you choose one of your locally installed models on your iPhone or on your iPad or on your Mac.

and just not having to waste space and your internet connection to download the same local models multiple times, over and over, because they're all in different locations. And going one step beyond, Apple should do this with their own MLX framework. MLX is Apple's framework for running machine learning models efficiently on Apple Silicon.
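To put a number on the waste Federico is describing, here is a small, purely illustrative Python sketch that measures how many bytes are lost to byte-identical copies of the same weights stored under different app sandboxes (the file contents below are toy placeholders, not real models). A shared, font-style model store would drive this figure to zero.

```python
import hashlib

def duplicated_model_bytes(files: dict[str, bytes]) -> int:
    """Bytes wasted by byte-identical model files stored at
    different paths (e.g. one copy per app's sandbox)."""
    seen: set[str] = set()
    wasted = 0
    for blob in files.values():
        digest = hashlib.sha256(blob).hexdigest()
        if digest in seen:
            wasted += len(blob)  # this copy is redundant
        else:
            seen.add(digest)
    return wasted

# Two apps bundling identical (toy) weights, one app with different weights
sandboxes = {
    "AppA/model.bin": b"w" * 10,
    "AppB/model.bin": b"w" * 10,
    "AppC/model.bin": b"x" * 5,
}
print(duplicated_model_bytes(sandboxes))
# → 10
```

With real multi-gigabyte models, each redundant copy costs that full download again, which is exactly the duplication a system-level model directory would avoid.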

They should totally have an MLX app, like a little directory that doesn't require you to go to Hugging Face, which is a popular third-party website for the open-source community to discover open-source models. Just have an official MLX directory where you can go in and find all of the MLX models from the community, and install them, and make them available everywhere on your devices. And maybe this is wishful thinking, because,

you know, the idea of a proprietary gallery maybe runs counter to the idea of open source. But I think there's a way for Apple to simplify this and embrace open source, but also make it easier for developers and consumers to install local, offline AI without having to waste multiple gigabytes and multiple gigabytes

of their internet connections. Yeah, yeah, I agree. I mean, I think the idea of having that kind of app is a really compelling one, because it could be a little bit like the font app. I think the idea of having a gallery where you can choose from things that are going to work, you know, and it could even tell you whether it would work on your

particular model of Mac or not, depending on whether you have enough, you know, memory installed, that kind of thing. And it could also be used to manage what you've already installed. Maybe you need to free up some space, so you're like, all right, I'm going to get rid of this one because it's bigger than this other one and I don't use it that much anymore. I think that that's a really, really good idea. I'd like for Shortcuts, Federico, to have a way to visualize them, where I can see them in sort of a mind map.

And especially if it's a shortcut where maybe I took the app, one of the apps that's a dependency off my iPad and I built the shortcut on my iPhone. Show me what those dependencies are that are missing. Show me. I mean, yes, when you try to run it, it'll tell you you don't have this app. You can have a button that says, you know, go to the App Store.

I want something that's a little more visual than that, and kind of gives me a rundown of, oh, you've got to install these things first, then this will work. Or tells me, alternatively, this has some Mac dependencies and it'll never run on this device. You know, something that really makes that aspect of Shortcuts easier. Yeah. Let's see. What else do I have? If there's going to be an AFM framework to use the Apple model on device,

I would love to see some Shortcuts actions for it. And I know that you probably won't be able to chat with this model, especially if it's the kind of model that is optimized only for specific tasks. Like,

By that, I mean, if it's a model that is optimized for, say, summarization, and you try to chat with it, you're not going to get a response. You're going to get a shorter version of your question because it only does summaries. That's what I mean by specialization.

But I would love to see those as Shortcuts actions. If that's possible, give me an Apple Intelligence summarize action in Shortcuts that is going to take my text and spit out a shorter version of that text, so that, once again, I won't have to use a third-party LLM or a third-party API for that sort of task. So having that kind of on-device natural language processing, even if it's just for some tasks, I think will go a long way toward

opening up a new kind of shortcut creation on all devices. Yeah. Let me take it one step further, Federico, in terms of opening up new kinds of shortcuts. And that is: open up Shortcuts to MCP. Because, you know, a lot of what we're seeing, and Sky is a great example of this, is that automation is happening more and more through the web, because that's where the AI models are based. And MCP is a web-based

protocol for connecting those models with databases. And if you could find a privacy-first, secure way to hook Shortcuts up to MCP, that would open that whole world up to Shortcuts on device in a way that would be really meaningful. Because there's part of me that feels like

Shortcuts is getting left behind by the web and AI tools, because we're just not there with a smarter Siri. App Intents may get there someday. It's not there today. And by the time it gets there, will anybody care anymore? I mean, there is a timing issue here, I think. And opening up Shortcuts to be more friendly to large language models would go a long way to kind of filling that gap and keeping interest in Shortcuts going

in the meantime. Yeah. Well, I think that's about it for my list of Apple Intelligence and Shortcuts wishes. I really don't know what to expect. I think it's going to be interesting. I saw some people say, no, Apple will stay quiet about Apple Intelligence this year. I don't think they will. I think they won't make the same mistakes again. I don't think they will show off the App Intents. I don't think they will pre-announce things that are not ready.

But I do have a feeling that we will see Apple Intelligence features. And I don't know, just my personal theory, I think Apple will open up to the third-party community some more soon.

Also because that's one of the few things they can do at this point. Yeah, they should lean into MLX. MLX is a bright spot for Apple in the AI world. And there's things they could do with that, I think, as you've kind of suggested, that could go a long way this year at WWDC. Yeah.

Yeah. All right. Well, we've done it, John. We have our wish lists. Boy, we have so many wishes. Let me just say it one more time: for iOS, iPadOS, macOS, tvOS, watchOS, and visionOS 26.

Oh, yeah. I'm trying to get used to it. I'm trying to get used to it. Yeah, get used to it. Because, you know, developers may still have to deal with other numbers. There's a possibility that under the hood they'll still have to deal with those individual numbers. But you know what? It's going to make our lives easier. So that's good. That's very good.

All right, everybody. Thanks for joining us this week. This episode was, of course, brought to you by Notion and Pee. Federico and I will be back next week, and we'll be back

from WWDC. Don't expect an episode on Monday. We're going to be doing things a little differently during the week, because there's a lot going on. But we'll be back with some fun stuff. In the meantime, you can find us on MacStories.net. We are also on social media. Find Federico by looking for @viticci. That's V-I-T-I-C-C-I. And I'm @johnvoorhees. J-O-H-N-V-O-O-R-H-E-E-S.

Talk to you next week, Federico. Ciao, John.