Hello and welcome to another episode of App Stories. Today's episode is brought to you by BetterTouchTool and Notion. I'm John Voorhees and with me, always and forever, is Federico Viticci. Hello. Hi. How are you, John?
I think Always and Forever is from a song, and I can't remember which one, but I'll have to figure that out. Sounds like a song from the 80s, maybe? Probably. Something like that. I think this is a more recent vintage, but we'll... I don't know. We'll see. We'll see. How are you doing, Federico? I'm doing great. We have planned an entire episode just about automation, so that's...
That's always fun. And I've been playing around with a lot of different automations lately, so I'm very excited to do this. Oh, that's good. That's good. Yeah, we did an episode about Shortcuts helper apps a couple of weeks ago, and we put a little palate cleanser in between with Mark Gurman, which was a really great interview. I had a fun time talking to Mark.
But today, we're going to get down to talking about automations. And this is not strictly a shortcuts episode, because I think one of the things that...
I see as a theme for us, and for the App Store and the Mac platform in general, or the Apple platforms in general, is that automation is really about a lot more than Shortcuts. Shortcuts is an important component of it, but there's a lot of other things going on. And you and I have been experimenting with other things, including some AI-related things, as well as other apps and tools.
So we thought we would talk about some of those here today, both during the main show and then for App Stories Plus subscribers. You can go to appstories.plus to learn more about that. But for the post show for App Stories Plus subscribers, we'll be talking about some bonus, maybe a little bit more complex automations that we're doing. So we'll start off with picking three each during the main show, and then we'll move on to a few bonus ones after that.
Yeah, that sounds great. So the thing that I've done lately for the past month that has really changed my relationship with automation is the fact that I've been using a Mac mini server.
as part of my automation workflow and automation setup. So this is part of this journey that I've been on in terms of rethinking and optimizing all the things that I do. But obviously, having a Mac Mini for automation... A Mac Mini as a server can...
It can serve all kinds of purposes. There are people who use it as a file server for things like Plex, for example. There are plenty of use cases for having a Mac mini that is always on, always available. In my use case specifically, this is an M4 Mac mini
that I've been renting at MacStadium. This is just a personal Mac mini that I've been using for the past month. And I've had so much fun extending my Shortcuts-based automation with the many more automation tools that you have access to on a Mac, right? And I'm going to talk about some of those in the post-show for AppStories Plus listeners. And one of the things that I realized lately is that
I don't want to give up on shortcuts. I think for certain types of workflows that I have, shortcuts continues to be the most intuitive UI of all the automation platforms that I've seen, right? I've been...
playing around with Tasker on Android, for example, which is often mentioned as a more powerful alternative to shortcuts. And it is in some regards more powerful than shortcuts, but boy, is it less intuitive. Like the UI, even the fact that, you know, when you're using Tasker and you have a similar setup, right, with actions sort of flowing from top to bottom, but even just the fact that you don't have visual variables, like you don't have a UI to determine variables in a shortcut, right?
All you can do is you can invoke a variable by manually typing the percent sign and then the name of the variable. So you know how in shortcuts you have a... Yeah, it's more like regular coding in that sense.
It's more like regular coding, but it means that there's not... It's all plain text, right? So if you have a URL variable, you don't have a nice looking variable picker. You just got to type percent URL and that's your...
manually assigned variable. So it's much more programmer-y from that point of view. And I think one of the greatest aspects of Shortcuts is the fact that it combines, especially for advanced users, the programmer-y nature of putting together a complex shortcut with a really well-designed, really intuitive UI that, you know, owes a lot to
Automator, for sure, also made by Apple. And I think that's something that I realized: I want to keep using Shortcuts for that style of top-to-bottom execution of multiple actions. But on a Mac, I can have a corollary of other automation tools that can either come before a shortcut or after a shortcut has run.
This episode of App Stories is brought to you by BetterTouchTool. BetterTouchTool is a powerful macOS application that enables users to completely customize their various input devices, such as keyboards, the Magic Mouse, and Magic Trackpad, the Touch Bar,
the Siri remote, and even things like a Stream Deck. We love BetterTouchTool here at MacStories. It's one of our favorite applications that everybody on the team uses because it's just so powerful when creating automations. It allows you to take complex multi-step tasks and compress them into simple gestures and other automations.
I have some very cool news to share with everybody today. BTT Mobile is coming soon. It's the iOS, iPadOS, and visionOS version of BetterTouchTool. It's going to allow you to create completely customizable dashboards to control your Mac straight from those devices. Whether you're on your iPhone, iPad, or Vision Pro, you'll be able to kick off automations on your Mac. I can't wait to get my hands on this.
It's not out yet, but starting today, you can get on the TestFlight beta. Just go to community.folivora.ai. That's community.folivora, which is F-O-L-I-V-O-R-A, dot A-I.
There you'll find all sorts of information about BetterTouchTool. It's a great, vibrant community full of people who are making the most out of BetterTouchTool. And that's where you'll find a link to the TestFlight beta, to jump on board and start providing feedback on the ways you want BetterTouchTool to be usable from those devices like your iPhone, iPad, and Vision Pro.
From everything I've heard, this is going to be a big deal. So check it out today and let the developer know what you think. You can learn more about BetterTouchTool by going to folivora.ai today. Again, that's folivora.ai, F-O-L-I-V-O-R-A dot A-I, to learn more about BetterTouchTool, to drop into the community, and to grab that TestFlight beta for BTT Mobile. Our thanks to BetterTouchTool for their support of the show.
And I want to get into a little more detail about something that I hinted at on Connected a few days ago, which is this new system that I have to transcribe audio notes.
So something that I got really into is the idea of talking by myself and recording my thoughts out loud. Usually when I'm driving the car. For example, say I'm going to go pick up Silvia: I have about 10 to 15 minutes in traffic
to get from our place to Silvia's workplace. That's when I usually listen to podcasts or some music, and I've got my dogs in the back and they're just sleeping, you know. But I realized that I really like recording myself, sort of either brainstorming ideas
Or talking out loud about a shortcut that I want to put together or just, you know, doing a brain dump of sorts for things that I got to do during the week or things that I got to do tomorrow. And I don't know what it is, but, you know, at this stage of my life, talking out loud about those problems is faster and more effective than sitting down and typing those thoughts.
Do you do this in English or in Italian? In English. I do it in English. I had a feeling you did, but I was kind of wondering. That's interesting. I feel weird talking about my work in Italian. I was going to say, because your work is in English, I would think that it would feel...
seem most natural to do that. And even the simple things, like the name of Shortcuts. I call the app Shortcuts, right? Yeah. It's called something completely different in Italian: Comandi Rapidi, which means quick commands,
rapid commands, you know, whatever you want to call it. And like, I just cannot bring myself to do it. It's just weird. Uh, so yeah, I talk in English and then I don't know what it is, but there's this part of my brain that sort of, and maybe it's my podcaster nature. But when I talk about a problem or an idea out loud, it's like, I come up with more ideas. I come up with more thoughts almost as if I was having a conversation, but I'm having a conversation by myself. So yeah,
At least you have the dogs in the back, right? You know, you have the dogs in the back. You can pretend you're talking to Zelda and Ginger. To Ginger. She's like, hey, Ginger, you want to talk about this shortcut with me? And she'd be like, yes, dad. But like, so on Connected, I talked about this journey that I've been on to find the ideal recording device.
And I really wanted it to be earbuds. And specifically, I just wanted to wear a single earbud for dictation, because I realized that the microphone quality of the AirPods 4 and also the AirPods Pro 2 is not great. It's not great recording quality. Whether you are recording yourself or you're on a phone call or a Zoom call, people can tell. I've noticed this. I've even used a dictation app that says, we can tell you're on AirPods; you should switch to something else because it's not going to be as accurate with AirPods. Yeah.
So on Connected, I talked about this journey, and I discovered this excellent pair of earbuds made by Huawei, the FreeBuds Pro 4. I'm sorry, in America you probably cannot purchase them, because Huawei is banned in the United States. But these are really, really good earbuds. They have bone conduction microphones, which sort of explains the quality of the recording that you get.
But I was running into this issue where, when I was getting into my car, my car's system, either CarPlay or Android Auto, was defaulting to the car's microphone. It wouldn't let me use the microphone of the earbud, because the car's microphone was taking over. And so that was sort of defeating the whole purpose, until I found these Xiaomi earbuds. Xiaomi makes the Buds 5 line.
These are available. It's the regular Buds 5 and the Buds 5 Pro. On Connected, I talked about the Buds 5, which are basically the Xiaomi version of the AirPods 4. They don't have in-ear silicone tips or foam tips, but they do support noise cancellation, like
you know, sort of low-quality noise cancellation, because obviously they don't have a good seal in your ear canal. But what these earbuds do is they support local audio recording in the earbud itself. The audio recording is saved to a file that is stored inside the earbud, because they come with a little bit of storage. How much storage? 90 minutes per earbud. So I thought, oh, that's great.
A total of three hours, which is totally a feature that I hope other companies can steal from Xiaomi. That's really plenty for like what you're doing. It's plenty for what I'm doing. On Connected, I mentioned the Buds 5, but just today I received the Buds 5 Pro. So these are the Xiaomi Buds 5 Pro, and these have the in-ear tips, which I am not a huge fan of, but I can tell you from a quick test just before doing AppStories,
the microphone quality is better. So it is. And they also support local
recording in the earbud itself, which is kind of funny. You can actually even record if you talk into the case. So you can record by talking into the case of the earbuds. Talk about having voices inside your head. You literally have an earbud inside your ear and you're recording your own voice into your own head, basically. Yeah. So I'm going to test this out over the next few days. But all this to say, I end up with this file,
this M4A file. Let's talk about the goal. The goal was to take this file, transcribe it, and summarize it. And from that file, in addition to the transcript and the summary made by a large language model, I also wanted the large language model to understand the contents of what I was dictating. I wanted it to be smart: in the thing that I dictated, can you identify any tasks or actionable items? Like, follow the conversation, transcribe the conversation, summarize the conversation, but also extract the actual things that I'm saying I'm supposed to do. And this is the automation that I put together.
I had to use FFmpeg on the Mac mini. It's a really popular command-line utility. And I'm using it because of something I discovered this week: FFmpeg can apply neural-network-based
filters to clean up the audio of audio recordings. There are different models that you can run totally local on your computer to denoise a voice recording. You can choose different models based on your environment, based on what you're using. I chose one
from this GitHub repo. I saved it on my Mac Mini. So the first step was to set up a Hazel automation on the Mac Mini server that monitors my Google Drive folder for an audio file. So the only thing I got to do when I import the audio file from the earbud is to save it into a folder.
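For anyone curious what that denoising step might look like in practice, here's a minimal sketch. FFmpeg's arnndn filter and its .rnnn RNNoise-style model files are real; the file names, folder layout, and function are hypothetical, not Federico's actual script.

```python
# Sketch of the denoising step: given a new recording, build the ffmpeg
# command that applies an RNNoise-style model via the arnndn filter.
# File and folder names below are made up for illustration.
from pathlib import Path

def denoise_command(src: Path, dst_dir: Path, model: Path) -> list[str]:
    """Return the ffmpeg argv that denoises `src` into `dst_dir`."""
    out = dst_dir / (src.stem + "-denoised.wav")
    return [
        "ffmpeg", "-y",
        "-i", str(src),
        "-af", f"arnndn=m={model}",  # neural-network denoise filter
        str(out),
    ]

cmd = denoise_command(Path("Inbox/memo.m4a"), Path("Denoised"), Path("std.rnnn"))
# A Hazel rule could run this via an embedded shell or Python script:
# subprocess.run(cmd, check=True)
```

A Hazel "run script" action watching the Google Drive folder would be the natural place to call something like this.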
And that's the only manual step of the process. I'll say that right away. Then Hazel sees that file, runs the denoising automation. And once the file has been denoised and saved in a different folder,
a second Hazel automation kicks in and passes that cleaned-up audio to a shortcut. This is where Shortcuts comes in, and the shortcut uses one of the apps that we mentioned a couple of episodes ago, AI Actions by Sindre Sorhus. Actually, first, it's using another app by Sindre called Aiko, A-I-K-O. It's a transcription app that has a shortcut action
And on a Mac, Aiko supports background processing. So you can run a shortcut that takes a file and transcribes that file in the background while the shortcut is running. Is it using its own model to do the transcription, or is it using something like Whisper? I think
I think it's using a local version of Whisper, because when you download Aiko from the App Store, it's like 1.5 gigabytes. - Yeah, it's probably one of the smaller Whisper models. - So yeah, I'm assuming it's one of the smaller Whisper models. So the shortcut creates a complete transcript of the entire file and saves it as a variable. Then AI Actions uses Claude to summarize
and create a list of actionable items. So at the end of the shortcut, I end up with three variables, the entire transcript, the summary, the list of actionable items, and I still have the denoised audio file. At the end of the shortcut, these four things are all assembled in a template for a document that gets saved in Obsidian.
And in Obsidian, I end up with this beautiful document that has my AI-generated title of my recording, the summary, the actionable item, an embedded audio player inside Obsidian that lets me listen back to the denoised audio file, and also an internal link to open the full transcript if I want to read it.
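As a rough sketch of that final assembly step: the `![[...]]` embed and `[[...]]` internal-link syntax are standard Obsidian, but the headings, template layout, and function here are guesses, not the actual shortcut's template.

```python
# Hypothetical reconstruction of the note template assembled at the end
# of the shortcut. Section names are invented; the ![[...]] embed and
# [[...]] link are real Obsidian syntax.
def build_note(title, summary, actions, audio_file, transcript_note):
    lines = [f"# {title}", "", "## Summary", summary, "", "## Action items"]
    lines += [f"- [ ] {item}" for item in actions]
    lines += [
        "",
        f"![[{audio_file}]]",                      # embedded audio player
        f"[[{transcript_note}|Full transcript]]",  # internal link
    ]
    return "\n".join(lines)

note = build_note(
    "Shortcut brainstorm",
    "Ideas for a transcription pipeline.",
    ["Test the Buds 5 Pro microphone", "Clean up the Hazel rules"],
    "memo-denoised.m4a",
    "memo-transcript",
)
```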
I have tested all of this for the past week. It's a complex automation, but it runs in a minute. It just takes a minute for a 10-minute audio recording to be denoised, transcribed, summarized, analyzed, and saved in Obsidian.
Are you saving the audio files so that you can go back and listen to the details and the context? Yes, sir. Okay, that's what I was wondering. Just in case the summary is wrong or there's a hallucination or something like that. Exactly, exactly. And I'm saving the source files, so the audio and the full transcripts.
in a subdirectory of my Obsidian folder where I'm keeping these audio summaries. I have a folder where I store the source assets, so that if there's a hallucination or something like that, I can just double-check the source. But yeah, this is my favorite automation that I've put together lately.
Oh yeah, that's a really cool one. I like that. I think it's a great idea. I've been toying with dictation myself. I haven't set up any big automations yet, but you're right: we type a lot, and beyond saving the stress on your hands and your wrists and your arms, speaking ideas out loud is, I think, a good way to get them out sometimes.
This episode of App Stories is brought to you by Notion. There's no shortage of helpful AI tools out there. We've talked about a lot of them here on this show.
But using them means switching back and forth between yet another digital tool. So instead of simplifying your workflow, sometimes it can make it more complicated. Unless, of course, you use Notion, where it's built right in. I love Notion for that very fact: you can use Notion AI right there to deal with your notes. You can summarize data that you've collected from websites. You can pull out
tasks from meeting notes that you've taken. You can do all sorts of things. Plus use it to analyze the data that you've collected over weeks, months, and years all at once without you having to go searching for it. It can find relevant information based on your natural language prompts, which makes it incredibly powerful.
Notion combines your notes, docs, and projects into one space that's simple and beautifully designed. It's your one place to connect teams, tools, and knowledge so you're empowered to do your most meaningful work.
The fully integrated Notion AI helps you work faster, write better, and think bigger, doing tasks that would normally take you hours in mere seconds. Notion is used by over half of the Fortune 500 companies, and those teams send less email, cancel more meetings, save time when searching for their work, and reduce their spending on tools, all of which helps Notion keep everyone on the same page. Try Notion for free when you go to notion.com slash appstories. That's all lowercase letters, n-o-t-i-o-n dot com slash appstories, to try the powerful, easy-to-use Notion AI today. And when you use our link, you'll be supporting our show. Notion.com slash appstories. Our thanks to Notion for their support of the show.
My first automation that I want to mention is a shortcut I call Fetch Podcast Details. This is something I set up over the holidays, because one of the things that I do a lot is handle the social media promotion of the various podcasts that we do at MacStories, and not all of these are podcasts that I'm on.
I have to go to a bunch of different places, or at least I used to have to go to a bunch of different places to get information about the URL for Apple Podcasts, the YouTube URL, the title, the description, all these little bits of information that you can put in either a social media post, or maybe I need to use them for the club newsletter or a post on Mac Stories. There's lots of places. I need this stuff every week, and I wanted to find a way to gather it all in one place
that I could go to throughout the week as I needed this stuff, and have it centralized. So what I did was I started out with a couple of APIs. I worked with the iTunes API, which has been around forever and is a good way to get the Apple Podcasts URL for a podcast; the Podcasts actions in Shortcuts can help there as well. But then Google has some APIs for YouTube, which are a little more complex, but they allow you to get all sorts of things. You can get the thumbnail, the URL, the description, all the metadata associated with a YouTube video.
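The iTunes side of this is a public, keyless API, so a minimal sketch of the kind of request such a shortcut might make is easy to show. The query term is just an example; the helper function is an assumption, not John's actual shortcut.

```python
# Build a query against the iTunes Search API, which returns podcast
# metadata as JSON, including the Apple Podcasts page URL
# (collectionViewUrl) and the show's RSS feed (feedUrl).
from urllib.parse import urlencode

def podcast_search_url(show_name: str) -> str:
    query = urlencode({"term": show_name, "media": "podcast", "limit": 1})
    return f"https://itunes.apple.com/search?{query}"

url = podcast_search_url("AppStories")
# Fetching `url` (e.g. with Shortcuts' Get Contents of URL action) yields
# JSON whose results[0]["collectionViewUrl"] is the Apple Podcasts link.
```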
And I took those things. And what I do is when...
A new episode is published. Zapier, and this is outside of Shortcuts, sends me a notification that a new episode has hit one of the feeds of one of the podcasts. That way I know it's about time for me to do some kind of promotion. But the thing is that those episodes don't hit Apple Podcasts or YouTube immediately. A lot of times Apple Podcasts takes around 20 or 25 minutes. So instead of
getting that notification and then forgetting what I'm doing, I have Zapier set up a Todoist item for 20 minutes later that'll give me an alert and remind me in 20 minutes to do the promotion. Because by then I can run my Fetch Podcast Details shortcut. I run it, and I actually have it send me some alerts as it's running, so that I can see that it is in fact fetching the last item, and that the last item is the new episode and not the one before it.
And if it is, I just let it run through, and it takes the information from the iTunes API and the YouTube API, combines it along with a short description of the episode using the ChatGPT actions in Shortcuts, and assembles it into two different
clipboard items. First, it populates one to the clipboard, which has the URLs and some of the very basic information, and that gets dumped into PastePal, because PastePal has Shortcuts actions to dump it into a PastePal folder specific to that show. Then it creates a second
clipboard item that has more detail. It has the thumbnail for YouTube, the URL for that. So I can easily grab that. It has the description, the title, the episode number. It even has an iframe embed, which
is something that, you know, is structured and always the same except for the information about the ID numbers and things for that particular episode of the podcast. So I took the time to kind of reverse-engineer those iframes over the holidays. And now those get generated automatically for me, because those end up being the little mini players that you see in the posts on MacStories that recap the podcasts that we publish.
those get automatically generated and also loaded into PastePal. And so now, whether I need it then at that moment to create a social media post to say, hey, there's a new episode of this particular podcast, or I need it later in the week,
when I'm putting the iframe into an article for Mac Stories or for the club, it's all there and waiting for me in PastePal, which then syncs, of course, not just to the Mac where I tend to run this thing, but to all my devices. So I have it wherever I am. So it's easy for me to get to and be a little more timely with both the social media posts, but also being able to grab those details for other uses throughout the week without having to kind of recreate the wheel every time. Nice, nice.
I want to mention, this is not a particular shortcut, it's more of a technique that I've used in a whole bunch of my shortcuts lately. And I want to talk about, again, the AI Actions app by Sindre Sorhus. This allows you to use either ChatGPT or Claude. Okay. They have two separate actions in Shortcuts: one is called Ask AI, the other is called Ask Claude. I've been using Claude just because I just prefer...
I just prefer Anthropic as a company, and I also prefer Claude as an assistant. It's a whole bunch of tiny little things about the way that Claude works, the way that it talks, the way that Claude can also be used by teams. I just prefer Claude over ChatGPT. And I've been using summarization, like, as a person who deals with text,
and a lot of text every single day, both my own text, or text from the App Store, or text from email. We deal with a whole bunch of text on a daily basis. And so the Ask Claude action, I've been using it a bunch to summarize my own stuff. For example, those voice recordings that I mentioned before, I'm summarizing them with Claude. But I've also been using Claude summarization for, say, when
I'm running into an app update that has really long release notes. And there are some of these apps that have really long and really technical release notes. I have a shortcut that gets the changelog for an app from the App Store using the native App Store actions. But then, in the case of a really long changelog with a whole bunch of details that I don't care about, instead of doing the manual cutting and pasting myself,
I'm using the Claude action to extract the key points of an app update for me. And I've done that thanks to a prompt that instructs Claude in terms of what the things I care about are, so that it understands the types of new features and updates that I'm interested in and cuts everything else from the changelog, so that I end up with a much
cleaner and more actionable change log that I can use in the newsletter, that I can use as a jumping off point to write what I'm supposed to write. So it's basically a different type of automation based on large language models. I keep saying this, but large language models are great at dealing with long text, and
And so, you know, you feed them a whole bunch of text and they're great at that. And they are more, from that point of view, they are more reliable than regular expressions because like text can be different, you know, and regular expressions are static. They work with patterns and they match specific patterns. But if you don't have the same patterns, well, that's where a large language model comes in, which can sort of
reason over a long block of text, kind of like a person would, and be like, is this a thing that I'm interested in? Probably not.
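To make the regex comparison concrete: a pattern tuned to one changelog style silently misses another style carrying the same information, which is exactly the gap a language model fills. The release-notes formats below are invented examples.

```python
import re

# A regex tuned to one release-notes style: lines like "- Fixed: crash on launch".
pattern = re.compile(r"^- (New|Fixed|Improved): (.+)$", re.MULTILINE)

notes_a = "- New: dark mode\n- Fixed: crash on launch"
notes_b = "* dark mode is here!\n* squashed a launch crash"  # same info, different format

hits_a = pattern.findall(notes_a)  # matches both lines
hits_b = pattern.findall(notes_b)  # matches nothing; the pattern is static
```

The second changelog says the same things, but the static pattern returns nothing, while a model asked "which of these are new features I care about?" handles both.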
And in the AI Actions app, even though it hasn't been updated for a while, you can still use the latest models. This is kind of confusing, but in Shortcuts, if you use the Ask Claude action, you can expand the action and click on a parameter that says name of the model. And if you choose custom, you can enter
the exact name of the model that you want to use. So even though... Well, like Sonnet 3.7, or just a totally different... Could you use Llama, or would it just be another Claude one? It'd have to be another Claude one, right? Yeah. But right now the Ask Claude action doesn't have 3.7 built in, so you can just enter it yourself. Oh, that's good. Yeah.
That's great. Yeah, that's that I have found that to be true with summarizing text, too, because I have the same kind of problems where it's like either a gigantic email from a developer or long release notes. And I really want to get down to like the four or five major things that I want to really focus on. And doing that kind of summarization is great.
So the next thing that I want to mention is really a shout out to the unread RSS client. Oh, yeah. Oh, yeah.
The Shortcuts integration in there is fantastic. Unread has a very long list of integrations for various read-later services, and it's really good that way, including Readwise Reader and Instapaper and all the other different things. But what you can do with its Shortcuts integration is you can have it fetch
your shortcuts, the ones you created yourself, by name, run them, and then come back to where you started in Unread, which is a real time-saver. And I've set up a couple of these.
The two that I want to mention are Send to Reader. So even though there is already Readwise Reader integration in Unread, I sometimes use the built-in integration if I don't need anything more than to save it. But if I have something that I want to save, for instance, for NPC, the show with Brendan Bigley,
or for the club, for the weekly interesting links, I run this shortcut, because it does the same thing, it saves the article, but it saves it with a tag. It gives me a list of things that I can choose from to tag the item in Reader, so that when I go at the end of the week,
I have a saved filter in Reader for NPC, another saved filter for Weekly, and I can just click on that, look at what I saved, and it gets me off to the races a lot faster. The second one is similar in that it takes links from Unread and sends them to Todoist instead. It takes those items and gives them a project and a label. So it's very easy, using the Todoist
integrations with Shortcuts and their API, to do that kind of thing. And so I'm doing that from Unread as well. Nice, nice. And the final one that I will mention is this new shortcut that I put together to transcribe YouTube videos.
And initially, I thought, oh, I'm just going to use the Google API myself. But it turns out that working with the Google API is really difficult and really challenging, because it's a very complex API that has to deal with OAuth authentication, which is not something that is super intuitive to do with Shortcuts. That's why in my first shortcut of this kind, I got a little help from Finn on the API side.
See, that's cheating. It is cheating, I know. It's very complex to deal with the Google API in Shortcuts. Yeah, it's not easy. You don't even have a redirect URL unless you spin up a private web page and that web page redirects to Shortcuts. So, in doing some searches, I really wanted to have a way to get a transcript from YouTube videos. And I found this incredible API
called youtube-transcript.io. There's going to be a link in the show notes. So this is like, it basically acts as a middleman in between you and YouTube. And it's a very easy to use API that allows you to pass the ID of any YouTube video and get back the entire transcript of the video.
I don't know how they're doing it. Obviously, these folks must have some enterprise pricing with the YouTube API, I don't know. But it's very easy to use in Shortcuts. The only thing is that it returns this JSON object with thousands
of individual items, one for each line of text in the transcript. I assume because the transcript in YouTube is timestamped, right? So the longer the video you're trying to get a transcript for, the more individual lines of text you're going to get. So I had to design the shortcut that talks to the youtube-transcript.io API, gets the JSON response, and then iterates over it. If you're feeding it a 30-minute video, there are going to be thousands of lines of text. And so you need to get that text and combine it with a Combine Text action to get back a single block of text
for the entire transcript of the video. And yeah, that's what I've been using. I tested it with our own transcripts from YouTube. And it's also a very nice solution to put together your own summarize-a-YouTube-video
shortcut type of automation. But yeah, if you are a programmer, you could do this yourself using the Google API. But once again, it's a really challenging API to use, especially in Shortcuts, because of the authentication that YouTube and Google use. This one is basically a bridge between Shortcuts automation and YouTube, and it works great.
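The iterate-and-combine step Federico describes can be sketched like this, using a simulated response. The exact field names youtube-transcript.io returns are an assumption here, based on typical timestamped transcript APIs.

```python
# Simulated API response: one JSON item per caption line, as described.
# The "start"/"text" field names are assumptions, not the documented schema.
segments = [
    {"start": 0.0, "text": "Hello and welcome"},
    {"start": 2.4, "text": "to another episode"},
    {"start": 4.1, "text": "of App Stories."},
]

def combine_transcript(items) -> str:
    """Mirror Shortcuts' Combine Text step: join every caption line
    into one block of text."""
    return " ".join(item["text"] for item in items)

full_text = combine_transcript(segments)
# → "Hello and welcome to another episode of App Stories."
```

In Shortcuts itself this is a Repeat with Each loop feeding a Combine Text action; the Python version just makes the shape of the data explicit.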
Oh, that sounds really good. A couple of other ways you can do that, right off the top of my head: Readwise Reader, I know, has transcript support for YouTube. And so does NotebookLM, if you are using that from Google. I actually did that recently for a video I was watching for NPC. I dropped the URL in there and it gave me a transcript, and I was able to do a summary and a timeline and stuff like that.
The last one I am going to mention for today is a Claude project that I created called Podcast Clips. Now, one of the things when I'm finished editing the audio and the video of a podcast, I can sit as I do that and come up with what I think seem like good clips for sharing in YouTube Shorts or on Mastodon or whatever.
But I've found that for whatever reason, when I'm listening for errors and things I want to edit to make it sound better, I'm not as good at listening to the content. Probably because I lived a lot of that content because I'm only editing shows that I happen to be on. So what I've done with Claude is I gave it a long set of instructions about...
each of the shows, NPC and App Stories and Unwind, and asked it to, and I feed it the transcript of the show with timestamps. And I say, I want clips that are no longer than two minutes long.
And that are interesting quotes, basically. And it comes back, and it's remarkably good at this. I usually ask for five different clips. And what it does is it gives me the text of the clip, and it gives me the reasons why it decided to give me that back as an interesting clip.
Sometimes they're not that great, but usually out of five, two or three are good. And two or three is all that I really want because I'm not going to like flood the YouTube reels or shorts or anything like that with a bunch of clips, maybe two or three a week at most. And so it's been a really great way to kind of do that quickly because once I do that, because I have the timestamps too, because I have the transcript with timestamps, I can go back, search for that text, see exactly where it starts and
drop my video into something like Adobe Rush, Adobe Premiere Rush, and very quickly create a short snippet from the full video that we have without a lot of extra work. And I can do that. You know, I can do three or four of those for each show in about a half an hour. It's a lot faster than kind of doing it manually and scrubbing through the video, looking for something that sounds good. And so, yeah, that's my, that's my final one.
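The instructions John gives his Claude project might look roughly like this prompt builder. The wording is a hypothetical reconstruction, not his actual project instructions.

```python
def clip_prompt(show: str, transcript: str, n_clips: int = 5) -> str:
    """Assemble a prompt asking a model for promotable clips from a
    timestamped transcript. The wording is a reconstruction."""
    return (
        f"You help pick promotional clips for the podcast '{show}'.\n"
        f"From the timestamped transcript below, suggest {n_clips} clips, "
        "each no longer than two minutes. For each clip, quote the exact "
        "text, give the starting timestamp, and briefly explain why it "
        "works as a standalone clip.\n\n"
        + transcript
    )

prompt = clip_prompt("NPC", "00:00 Welcome back to NPC...")
```

Keeping the timestamps in the transcript is what makes the round trip back to the video editor fast, as John notes.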
Nice. Well, we're going to talk about a whole lot more automations and things that we tried in the post-show for App Stories Plus listeners. But the main thing that I will just say at the end is that I'm finding the real power in being able to continue using shortcuts, but
have an ecosystem of other utilities on macOS sort of around Shortcuts. In MacStories Weekly, I wrote about Lingon X, which is this other Mac utility that allows you to... Is it launchd? Yeah, it allows you to run shortcuts on a schedule. And by on a schedule, I mean, like, I have a shortcut that runs every 10 seconds, all the time, in the background, that sort of thing. But you also have apps like Shortery, which is sort of like Lingon, and, you know, a little
more limited, but also has a bunch of other triggers to run Shortcuts automations on macOS. There's another excellent one called Stacker. Stacker, right. Stacker allows you to run shortcuts when USB devices are connected or disconnected from your computer. There's a whole ecosystem of Shortcuts utilities on the Mac that
don't require you to stop using Shortcuts. I mean, you can stop using Shortcuts on a Mac if you want; you can use Keyboard Maestro, you can use Alfred workflows, whatever. But I think there's real power in being able to have these other power-user command-line tools, or these other third-party apps that just cannot exist on an iPad, and sort of build your custom automations around Shortcuts. That's something that I really like.
Absolutely. Absolutely. All right, Federico, that's a good place to stop for today. I want to thank our two sponsors again for this episode: BetterTouchTool, also a great automation tool, and Notion. You can find the two of us over at MacStories.net. And of course, we're on social media, where Federico is @viticci. That's V-I-T-I-C-C-I. And I'm @johnvoorhees,
J-O-H-N-V-O-O-R-H-E-E-S. Talk to you next week, Federico. Ciao, John.