
618: Type System Says No

2024/12/19

Accidental Tech Podcast

People
Casey
Co-host of Accidental Tech Podcast.
Christian Kent
DeMarco
John
Co-host of Accidental Tech Podcast; currently developing a Mac disk utility.
Marco
Co-host of Accidental Tech Podcast; tech podcast host and Apple product expert.
Topics
Thomas Alvarez proposed a conspiracy theory about laundry: washing printed T-shirts with the printing facing out wears the print down faster, prompting people to buy more ATP shirts. John responded that it's no conspiracy; the printing on these shirts eventually comes off no matter what. Christian Kent corrected John's earlier joke about app names, pointing out that "Space Doubler" should have been "Disk Doubler"; he explained the difference between the two and riffed on John's app with the name "Risk Doubler." John read through the many listener suggestions for his app's name and said he has already chosen one, but is not revealing it yet. Marco shared his favorite suggestions and was curious about John's final choice. Casey endorsed the conspiracy theory joke and weighed in on the name suggestions. John explained his reasoning for the name he chose, said that he personally likes it, and gave a progress update on the app: the release date is still unclear, and he explained why development is slow.

Deep Dive

Key Insights

Why is there a lot of feedback about laundry on ATP?

The podcast discussed laundry techniques, and listeners, notably Thomas Alvarez, shared theories and jokes, such as the idea that ATP was trying to get people to ruin their shirts faster to sell more ATP shirts.

Why was the suggested name Space Doubler corrected to Disk Doubler?

Disk Doubler was actually the correct name for a utility from the classic Mac OS that saved space by compressing files. RAM Doubler and Speed Doubler were other utilities, but they were not related to disk space saving.

Why is John's app development taking longer than expected?

John is a slow Mac developer and is new to many of the APIs he is using, which makes the process more challenging. Additionally, implementing UI and in-app purchase functionalities is taking a significant amount of time.

Why is Apple's documentation frustrating for John?

Apple's documentation often lacks comprehensive explanations and error handling details. It also assumes prior knowledge, which can be problematic for new developers. John finds this particularly frustrating when trying to implement in-app purchases.

Why might the new Blackmagic camera for immersive video be important for content creation?

The Blackmagic camera is designed specifically for recording Apple immersive video, offering high resolution and frame rates. This could significantly improve the quality and accessibility of immersive video content, making it more appealing for professional and smaller production houses.

Why is the Vision Pro not well-suited for gaming, even with controller support?

The Vision Pro is expensive, heavy, and not designed for motion. It has high latency and motion blur, making it unsuitable for popular VR games that require physical movement. Additionally, Apple has historically been reluctant to support dedicated gaming hardware, limiting the platform's potential for gaming.

Why is Marco excited about the Terminal with No Vowels?

Marco likes the e-ink display technology and the minimal, stylish design of the Terminal. It allows for ambient displays of information like weather and countdowns, and it's a well-designed product that doesn't look like a DIY project.

Why is John considering developing a disk health app?

John finds existing disk health apps unsatisfactory, either because they clutter directories with checksum files or use a central database that can diverge from the disk's state. He believes there's a need for a better approach to disk health monitoring.

Why are 360-degree cameras like the Blackmagic design different from traditional movie cameras?

360-degree cameras capture a wide field of view without the need for interchangeable lenses. Traditional movie cameras require a variety of lenses to capture different perspectives, but 360-degree cameras are designed to capture the entire environment from a fixed position.

Why is Siri expressing more emotion in iOS 18.2?

Siri now uses inferred emotion based on the text being dictated, such as capitalization and punctuation. This feature was likely introduced to make the voice assistant more engaging and realistic, though it can sometimes be surprising or frustrating for users.

Chapters
The podcasters discuss feedback received on their laundry technique, specifically the suggestion that it might be a conspiracy to sell more shirts by damaging them faster. They humorously dismiss it as not a conspiracy, but acknowledge the interesting theory behind it.
  • Feedback on laundry technique
  • Thomas Alvarez's conspiracy theory
  • Discussion on shirt printing and washing

Shownotes Transcript


We also didn't mention this yet, but we did a new member special. Oh, that should have been in the notes. Oh, yeah. My bad. That's on me, too. Because it was released like an hour before we started recording. See? See, Casey, I need to be on top of this. Well, I know. Usually I add that one. No, that's not true. No, usually you do, too. All right. Here we are, you know, an hour and a half into the show. Go ahead, DeMarco. So it's about how we make the show, all the organizational techniques we use to prepare very well in advance for making our show. Right.

Except for when the new special episode drops an hour before we're recording. And I was in the shower.

All right, let's do some follow-up. We had a lot of feedback about laundry. Mostly, oh my God, I can't believe that I listened to laundry stuff and had opinions. But nevertheless, I think our favorite piece of laundry follow-up was from Thomas Alvarez, who writes: I told my wife these guys are talking about washing shirts with the printing right side out, to which she said, they want to sell you more ATP shirts by telling you to ruin the printing faster.

I love this. I wish I could tell you it was some big conspiracy that we had plotted, but no, that had nothing to do with it. But this is hilarious to me. So this is very well done. Yeah, this is the theory that with the printing facing out, it will rub against other clothes more than it would with the printing facing in. I'm not even sure that theory holds up because, you know, although the clothes are all smushing around there, if you have it inside out, the printing is still touching another piece of fabric. It just happens to be the other side of its own shirt. It doesn't know that it's its own shirt versus another shirt.

But who knows? People have ideas about laundry. But yeah, it was not a conspiracy to get you to buy more shirts. The printing's going to come off those shirts eventually, no matter what, if you keep washing them.

Of course John had feedback about a joke, but here we are. Christian Kent, speaking of jokes, has a correction, and according to someone who put this in the show notes, who I'm sure is named John Siracusa, Christian Kent apparently had the best joke about John's app. So the correction is this: when I was talking about names for the app, which we'll get to in a second, I was mentioning...

I think I said Space Doubler or something like that because there was a series of utilities for the classic Mac OS called RAM Doubler or Disk Doubler. Well, Disk Doubler is what Space Doubler was called. That's the correction. Sorry, but it was RAM Doubler and Speed Doubler. Believe it or not, they sold a software product called Speed Doubler. It did make your computer faster, perceptively faster in a few interesting ways. But anyway, the space saving one was not Space Doubler. It was Disk Doubler.

And Christian made an image of, I presume, the old Disk Doubler floppy disk that it came on, with the little styled label or whatever. But he changed the word "Disk" in "Disk Doubler" to a potentially appropriate name for the app that I'm working on, which is Risk Doubler. Yeah.

A little bit pessimistic, but you know, I laughed. I thought it was funny. Risk Doubler, Disk Doubler, it's right there in front of you. Good job, Christian. I like that one. And speaking of names: oh, so many names were suggested for John's app. I made myself kind of a top list. Have you made one, or do you want to hear mine? Well, so yeah, are yours ones that have been suggested by other people, or are they new? By other people.

Oh, okay. So I have a list here in front of me. Now, I want to say that this list is not exhaustive. There were just so many suggestions, I could not collect them all. So I'm sorry if you don't hear your suggestion on this list. This list is not ordered in any way. Oh, come on. You got to order it, man. You got to pick your favorites. I'll get to that in a little bit. I mean, there are favorites mixed in, but it's mostly kind of like...

the order that I got to putting them into this document together. There's no theatrics in that. You got to have like a countdown from worst to best. So as I, as I go down this list though, what it will actually be is an exercise in me trying to pronounce people's names. Like it's not about, I just realized this has nothing to do with the name of my app. It has entirely to do with the fact that I decided to put the people's name who suggested it in here. And now you're going to hear me. This is like payback for Casey. Now you're going to hear me try to read 50 people's names.

Anyway, I'll try to do this quickly. Here we go. Ben McCarthy suggested deduplicity.

Todd Hoff suggested free disk space because it has a double meaning. It's like you're freeing it as a verb and you're also talking about it as a noun. Alexander Morris Terry said spacey list. I like that one. Ben O'Maddock said Final Frontier. Mark Edwards said Double Down and Clone Cut. Gabrielle or Gabriel said D-Duplo. F10 said Dupty Dupe, which I thought was fun. That's very good. Yvonne Cavero Ballonde said Dupe Nukem.

which is good. I mean, a little bit of a copyright trade dress, whatever. Graham K. said Storacusa.

L. Neal said Ding. John Muir said Safe Space and Bigger Mac. I'm sure McDonald's would love Bigger Mac. Thomas Hall said Saving Spaces, Bit Biter, and Byte Impotent. Byte Impotent, it's like "byte" in front of "idempotent." That's a little too obscure. Jack Wellborn said File Marshal. Like, yes, like fire marshal: File Marshal.

Yeah, File Marshal. He said, let me show you something. File Minion, Copy Optimizer, and Copy Miser. Michael Horns... Michael Onore, I'm going to say, said Space Cowboy, which I like because it's a Cowboy Bebop reference. Ted Duffield said Obiterate. Timo Gruen said Duper Blooper, Duper Trooper, and Mirror Space.

Dan Engler said Archivedes, in honor of one of history's great minds, famous for messing with volumes because he was figuring out, like, how to tell the volume of spheres or something. I remember. And he lived in the city of Syracuse.

Cliff Connell said Duplarace or Duperace. Matt Johnson said Clone War. Obviously, there's lots of clone things that are going to run afoul of Star Wars here. Kelly said Clone Zone. Mustafa Hamoumi said Conjure, as in creating something out of nothing, like free storage. Steven Bernard said Disget, like Disget with a G. Uh...

Jack Uyane said, Duplikiller. These are getting kind of violent here. Squozin said, Optimus. Marina Eppelman said, Space Forager. Claude Zien said, Repeat Defender or Repeat Offender. Duplicut, Double Down, Duper Scooper. Rene Banas said, Storage Sweep. Jonathan Augusto said, Storage Consolidator, Disk Consolidator, Consolidator.

Consolidate Storus. Sounds like a dinosaur. Storage Liberator, Disk Liberator, Byte Liberator, Freedom Biter. Schumstra said Space Rescue and Space Machine. Nathan Galt said Cow Candidate because cow stands for copy on write, which is how the cloning stuff works under the covers.

Mike Corilico said Doppelganger, and Jose Vasquez said Space Scout, One File, Stack File, Bed Bunk, and Mac Half Empty. Those are all the ones that I have in my document. That is not all the ones that were suggested, but it gives you a feel for where the names are going. Now, Marco, you have a favorite list. I do, from least favorite to most, but I think these are all excellent. That's why they made the list. Jonathan LaCour: Copy Right.

Like two words, "copy" and then "right," as in the opposite of left. Copy Right. Jonathan Augusto's Storage Liberator. Ben Curtin's... that's a reference, which is amazing.

That's like an extremely inside joke for an app that you're trying to sell to people. Yeah, but that's a reference. It's a great name. Tom English suggested Space Finder, which I think is pretty straightforward. I did like Graham K.'s Storacusa. That's excellent. Jack Wellborn's File Marshal is also excellent. I also wrote down Yvonne Cavero Ballonde's Dupe Nukem. But my favorite, which I don't think you said, was via email by Sean Flynn: Forage Space.

Yeah, there were lots of foragers. I didn't have all the forage ones on the list because there were so many, like Space Forager. Lots of foraging. Yeah, Space Forager I thought was decent, but I think Forage Space is even better. What is that? What's the pun there? Like storage space? Like storage space: Forage Space. Come on, that's a great name. That is good. If you're not going to go with Storacusa, I think...

Or Dupe Nukem. Dupe Nukem. I do like Dupe Nukem and Bigger Mac, all the ones that you can't possibly use because of, like, McDonald's and Apple and whatever. Those are funny jokes. Yeah, of course you can't use most of those, but you can use Storacusa or Forage Space. I'm not going to use Storacusa. I know. There was no way you were ever going to use that. The Bytes of Syracuse County. You know, Casey, do you have any favorites?

I think Dupe Nukem was my favorite, which obviously has problems. I think you could get away with it, but it definitely has problems. I don't know. Other than that, there were a bunch of really good ones. Nothing specific is leaping out at me as the rightest answer. Come on, Forage Space. And again, I apologize if I didn't read yours. There were just so many of these, and I'm sure lots of them fell through the cracks. Actually,

That's true. Forage Space is excellent. Of the ones that you can actually use, that's probably my favorite. So the reason I'm not giving you top lists here is because I have indeed chosen a name for my app since last we recorded. Oh, no. And I'm not going to reveal what it is, and I'm not even going to tell you whether it was one of the ones on this list or not. Okay. Will you tell us privately at another time? No. No.

No, you know that has to be an on-show reveal. Here's the thing about the name, and maybe, Casey, you can relate: I chose this name, and I just know that everyone else is not going to like it. Oh, been there. Been there. Casey lives there. Only I have to like it, and that's all that matters. So yeah, eventually I will reveal the name. I'll probably reveal it once I have, like, an icon, which I'm nowhere near having, or whatever. But, you know. Anyway.

And on that front, I'll just give a brief update on my progress on the app. When we recorded last week's episode, I had created the Xcode project like a couple of days before that. So if you're expecting that app to be released soon... yeah. When last we recorded, I had a working app, and it would do what it's supposed to do when you click the button. But it takes longer than a week to make an app, just FYI. And it takes me personally way longer, because I am so slow as a Mac developer; there's just so much I don't know.

It's frustrating when you're a fast developer in other contexts and other languages. At this point, I'm faster in PHP than I am in doing what I do. There's just so many things to do. And there's only one way to do a lot of things because you've got to use the APIs to do them. Anyway, it's going to be a little while. This app will not be out in 2024 for sure. Who knows when in 2025 it will be out. What am I doing with all this time? If you already have an app that quote-unquote works, well, I've got to put a UI on it that takes a surprising amount of time.

I've got to do all the store and in-app purchase stuff, which takes a surprising amount of time for someone who has never done it before. I'll complain about Apple's documentation, maybe next episode. Oh yes, please.

And I'm like, basically, I'm trying to figure out and implement all of the appropriate guardrails to make this not be a risk doubler. Right. That's the hard part when it's easy to just make something for yourself. You're like, well, you know, I know what I'm doing and it's fine or whatever. But to make something you're going to give to other people to use, you really have to nail the sucker down. And that is taking a lot of time because I'm trying to think of what are the best mitigations? Which one should I allow to be disabled? Which one should I not allow to be disabled?

How locked down should it be? Because if you lock it down too much, it's not going to find any duplicates, because it's not allowed to look anywhere where the duplicates are, you know. So I'll give updates as I go along here. But it's going to be a while, so don't hold your breath.
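John doesn't describe how his app detects duplicates, but the general technique he's gesturing at, finding files with identical contents before doing anything with the extras, can be sketched. This is a hypothetical illustration, not John's code: bucket files by size first (cheap), then confirm matches with a content hash.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return groups of paths under `root` with identical contents."""
    # Pass 1: bucket by file size. A file with a unique size
    # cannot have a duplicate, so it is skipped cheaply.
    by_size = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.islink(path):
                continue  # don't follow symlinks
            by_size[os.path.getsize(path)].append(path)

    # Pass 2: within each size bucket, confirm with SHA-256,
    # read incrementally so large files don't blow up memory.
    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue
        by_hash = defaultdict(list)
        for path in paths:
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
            by_hash[digest.hexdigest()].append(path)
        groups.extend(g for g in by_hash.values() if len(g) > 1)
    return groups
```

A real space-saver on APFS would then replace each extra copy with a clone of the first (clonefile(2), which `cp -c` exposes on macOS) rather than deleting anything, and the guardrails John describes are about deciding which locations such a tool is even allowed to touch.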

We also got some feedback. Some experts have weighed in. A friend of the show, Dave Nanian, has some thoughts. Dave writes SuperDuper, which is what I use every other day to do a full backup of my computer. Dave writes: the big problem with SpaceSaver, which I guess is the name he's assuming, we'll see if that's what John has chosen,

is that it's going to be run by people low on space, not just the curious. And the side effects, more data on the drive than capacity, will cause post-run pain for the user, confusion, and more. Think migration, moving to new Macs, etc. Even though we know how to do this, obviously it's a brute-force thing, we don't do it on backups, for performance or sanity reasons, on Smart Update, which is a feature within SuperDuper.

Erase, then copy replicates to preserve these relationships, even for data only, on version 3.10. So this is more about, again, SuperDuper. It's not going to be so much your support, which may occur or may not, but downstream support effects that you're not fully, or at all, anticipating. At some level, of course, it's not your problem, but it's a significant issue that's certainly prevented me from doing something similar. I don't want to lead users down a primrose path.

Yeah, so this, I mean, obviously, maybe I'm making Dave's life miserable. So even though I'm not calling it SuperDuper, he's still going to have problems. What he's talking about is: if you have lots of clone files on your drive, and then you try to use SuperDuper to periodically clone that drive to another drive that is the same size, you may be surprised to find that it says, oh, not enough disk space. And you'll be like, what do you mean, not enough disk space? I have one four-terabyte drive that I'm trying to clone to another four-terabyte drive. How can it not fit? Clearly it fits; I'm looking at it right here. Well, the drive you're looking at has a whole bunch of clones on it that are only taking up one amount of the space, even if there are five copies of the file. But when you copy it to another disk, if you don't faithfully reproduce those clones, you will take five times the space for those five files, instead of the one times the space they're taking on your drive. That's what he's talking about, about drives

growing or whatever, like migrating to a new Mac, for instance. If you have tons of clones in your drive and you migrate to a new Mac and it doesn't reproduce the clones, your stuff might not all fit on the thing. Now, my response to this, I had a couple of responses. First of all, obviously, Dave's perspective is he is doing support and writing an app that does disk cloning. So yeah, it's going to be probably more an issue for him than it is for me. But second of all, ever since Apple introduced this feature...

The finder does it every time you copy a file onto the same volume anywhere, which is why people have clones all over the place. People have no idea how many clones they have. They're just blindly and instantaneously copying giant files around and putting them into folders and organizing things and not realizing they already had a copy of that somewhere over here or over there.

So I'm not the only thing creating clones. People are doing it themselves every time they duplicate a file in the Finder or copy it somewhere, right? Again, on the same volume only, right? Still, I'm obviously making the problem worse, which is kind of a bummer. But this feature exists for a reason, not just to make copies instantaneous, but to save space. Like I said, I leverage it myself, because one of the big things in my home directory is the giant folder

full of all my audio recordings. Every time I record a podcast, it's a couple hundred megabytes and I copy it to a bunch of different places to organize it. And I don't worry about those copies because they're all clones. They don't take up any more space because I'm copying them in the finder, right? And that's one of the reasons when I duplicate my drive with SuperDuper, sometimes I run out of space. Now, what he was saying is when you do the initial copy with SuperDuper, it faithfully replicates the clones. No problem. When you do subsequent incremental copies, it does not faithfully reproduce the clones because it would take too long or whatever.

I told him, if you added the option for Smart Update to reproduce the clones, I would take that option, because I want that to be leveraged. It's leveraged on the initial copy, but not on follow-up copies. But anyway, I'm still making it. We'll see how it turns out. It may be disastrous for me, maybe disastrous for Dave, maybe disastrous for users. But right now I'm still a go.
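The arithmetic behind Dave's warning is easy to model. In this toy sketch (an illustration only, not SuperDuper's actual logic), each file is a reference to an underlying extent; a clone-aware volume charges for each shared extent once, while a copy that loses the clone relationships pays for every reference.

```python
def bytes_used(files, clone_aware):
    """Total bytes consumed by `files`, given as (path, extent_id, size).

    Clones of the same data share an extent_id. A clone-aware
    volume stores each extent once; a naive copy materializes
    every file separately.
    """
    if clone_aware:
        extent_sizes = {}
        for _path, extent_id, size in files:
            extent_sizes[extent_id] = size  # shared extent counted once
        return sum(extent_sizes.values())
    return sum(size for _path, _extent_id, size in files)

# Five copies of a 200 MB podcast recording, all clones of one extent:
recordings = [(f"audio/copy{i}.wav", "extent-1", 200_000_000)
              for i in range(5)]
on_source = bytes_used(recordings, clone_aware=True)   # one extent's worth
on_naive_copy = bytes_used(recordings, clone_aware=False)  # five times that
```

This is why a drive whose contents "fit" can fail to clone onto an identically sized disk: the destination pays five times for what the source stored once.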

This week, we are sponsored by Aura Frames. Imagine you wanted to give the gift of happiness. Unfortunately, you can't literally do that. But thanks to Aura, you can effectively do that. How? Because you give people the ability to get their photos out of their devices, out of their memory banks, if you will, and put them on display in the middle of your home. Aura Frames is a great way to do that.

Aura was kind enough to send a couple of these photo frames to me and I kept one and put it on my living room wall and I sent one to my mom and dad. I could pre-set it up before it even left the box by scanning a little bespoke QR code, give it their house's Wi-Fi password so that when they plug it in, magic happens. It's incredible. That's a great example of how well Aura Frames thinks about everything. They're just so well designed from top to bottom. There's no chintzy looking branding on the front or anything like that. In fact, I'm not even sure there's branding anywhere on these things.

But if there is, it's got to be on the back because I never see it. These are so well done. And the thing of it is, is that every time I walk in my living room, I see a picture of the kids or Aaron or Penny or what have you. And I get to be happy. I get that little dopamine hit of a happy memory that I shared with my family and my parents. Oh, Nellie, they love putting pictures of their grandkids and their kids and their friends and

whatever. And they've set this frame in the focal point of their living room. It is like the side table that you can see from the front door. That's how important it is to them. These things really genuinely are great. They're super easy to upload photos to. I can upload photos from my phone directly to my parents' frame or my own or both at the same time. Again, just really well designed top to bottom. So what can you do? You can save on the perfect gift and

be it for now, or for later, or for yourself, by visiting auraframes.com to get $35 off their best-selling Carver Mat frames by using the promo code ATP at checkout. That's auraframes.com, A-U-R-A-F-R-A-M-E-S dot com, promo code ATP. This deal is exclusive to listeners, and the promo code only works for Americans, but still, even at full price, these things are bargains. So go get yours. Now, terms and conditions do apply. Visit auraframes.com and use the promo code ATP. Thank you to Aura for sponsoring the show.

Tim Shoof writes, do you think that an APFS style switch from JPEG to JPEG XL in the Apple photos library is planned imminent or for some reason unlikely? I'd certainly like a 50% smaller photo library. Are there third-party tools to achieve this? And would you trust them?

I don't think this is likely anytime soon, although to Tim's point, they've shown that they can change the wings while the plane is flying, so anything's possible. I don't know. I would want to keep my photos...

kind of as they are, even though there's really no academic reason why this wouldn't be a good idea. I don't think I would do it, though. If I remember correctly, isn't one of the benefits of JPEG XL that it can actually store existing JPEGs at smaller size and then losslessly go back to them if it needs to? I believe that's right. Yeah, so that's what he's getting at. The idea that the specific feature of JPEG XL he's talking about is take an existing JPEG that's not an XL JPEG

and make it smaller with no loss in quality. It's not recompressing it at all. It's just found a cleverer way to store the exact same JPEG. So it doesn't improve the quality of the photo, but it also doesn't make it any worse. It is not recompressing it. JPEG XL has a way to just take your existing JPEGs and make them smaller without changing the quality of the display, nothing about them. It's essentially a lossless conversion, just saving space. Apple could...

do that to all of our JPEGs if they wanted to. I think probably the reason it is not imminent is because development in the Photos world is slow, sometimes for good reasons, because it's a very important thing to get right. How long did it take for us to get iCloud Shared Photo Library? Like a decade, right? But I'd rather have them get it right than, you know, ship it and destroy all my photos. So I think this is not imminent, but it definitely would be cool. As for

Are there third-party tools, and would you trust them? I would absolutely not trust them, because when you do this thing, you are converting from a JPEG file to a JPEG XL file. It would probably have a different file name extension. It would be a different format. Apps have to read it. You can't do that behind the back of the Photos app.

Like, Photoshop expects JPEGs to be JPEGs, not JPEG XLs. I would absolutely not trust a third-party tool to do this, because that's not what the photo library is expecting. Even if you just think of something as simple as the metadata: the SQLite database in your photo library will no longer match. The file on disk will be a different size, right? It'll just...

No, if you see a third-party tool that says it will do this, don't do it. I'm talking about changing your photo library in place. Now, if you want to export originals from your photo library, put them in a third-party tool, change them to JPEG XL, and re-import them, by all means. But you're going to lose all your metadata for the photos if you do that. So anyway, it would be cool if they did it. But as we saw with iOS, they didn't even go whole hog on JPEG XL on the phones. It's just a thing they use for RAWs; they didn't even switch the regular photos, which are still HEIC. So, baby steps on the JPEG XL front.

I mean, when you think about it, though, nobody has a higher incentive to shrink photo JPEGs than Apple, because Apple probably stores more photo JPEGs on their own servers than anybody else in the world. But they charge you for it.

I don't think they have that much incentive. Well, but still, I think they would benefit substantially from the resource savings on their end alone. And I mean, who knows how many people really pay that much for iCloud who wouldn't still pay. That's the question: are people simply not paying, and by making it smaller... Well, libraries are never going to fit in the free amount. They give you five gigs for free, and no matter how much compression they do, that's not going to help anybody, right? So...

I don't... yeah, I see what you're saying, but right now, anybody who pays for any iCloud storage gives Apple, you know, market rates for it. I think last time I reviewed it, they were charging about the same amount as Google. So they're not ridiculous, but it is ridiculous that the amount you get for free is basically useless.

But I could totally see, because keep in mind, occasionally what Apple will do is, in order to justify still selling iPhones and Macs that have much too small a base storage for the era in which they are sold, they will sometimes, you know,

do fancy compression tricks to save space, in order to make them not suck as much for a little while longer, so they can keep selling their 256 gig or their 8 gig or whatever. We see all the work they did to try to fit modern LLMs,

to try to shrink those models to fit onto these small amounts of RAM that are in iPhones. A lot of that's because it's just more efficient. A lot of it is because the iPhone RAM has been too small for a little while, and so people... They kind of had to make it work. In this case, maybe...

Maybe they'd be motivated to do one of these in-place migrations over time for everyone's photo libraries as JPEG XL support improves. Maybe they'd be motivated to do it simply to keep selling 256 gig Macs for a little bit longer. There are other, cynical reasons to do it. Remember last episode, they said we're going to get the 1 terabit NANDs?

We're going to be off 256 next year. That person promised. I hope. Yeah. But by the way, just to be clear: to get the real benefit of JPEG XL, you should be saving your photos in JPEG XL. This thing we're talking about, where it can take your existing JPEGs and save them without loss of quality at a smaller size, that's an off-to-the-side nice-to-have. The advantage of JPEG XL we talked about before is that it will make you better quality pictures in the same size, or, you know, like it's,

you should just be... the camera should be producing those if you want to get the real benefit from it. So I would imagine they wouldn't do the recompression until or unless they decided, oh, we're switching from HEIC to JPEG XL, and the space savings is like, well, since we're in there. Because if they did this, every part of the Apple ecosystem would have to fully support it, which I think it mostly already does. I think we had a follow-up item about that, where most Apple things can currently read JPEG XL, but Apple's cameras don't shoot them.

And I'm not sure it's entirely universal. So we'll see. I'm rooting for it. I love saving space. I don't know if you heard. You know, I was aware of that, as it turns out.

Unrelated to this, we got in a little bit of anonymous Apple Genius feedback with regard to self-service. So this anonymous genius writes, regarding buying upgrade parts from Apple's self-service store, you can't simply insert the main logic board of one configuration of MacBook Pro into a different configuration of MacBook Pro, nor can you do this for the storage modules in Mac Studio, because Apple requires their configuration tests to be run for self-service repair and for Apple Store and mail-in repairs.

These configuration diagnostics will fail unless the specs match what's already assigned and configured to your serial number. Apple locks storage and memory configuration to your serial number, preventing you from using a different logic board configuration to increase storage or memory, which is why you enter your serial number on the self-repair website. Womp womp.

Bummer. All right, John, apparently you're still trying to make pinstripes happen. What's going on here? Not me. I just was using my computer one day and I looked down at the finder and it was looking a little under the weather. Maybe like I had a rough weekend and just woke up too early. I guess we'll put a picture in the show notes or maybe it'll be chapter art or something because it's hard to describe what it is. But it's clearly a...

a mistake, like a rendering problem. It doesn't look like it was done stylistically. It's the Finder icon in the dock, on the left side of the dock, and it's got kind of vertical stripes, where there's like a dim area and a lighter area, dim and lighter, maybe, I don't know, 10 or 15 vertical stripes across the thing. But as it gets towards the edge, the stripes break up and it's like melting the colors behind it, and it just looks awful. I posted about it on Mastodon and I said, you feeling okay this morning, little guy?

And I thought it was just me, who knows, whatever. I have exotic hardware, sometimes weird stuff happens. But then Jeff Johnson posted that he saw this on Reddit as well. Someone who had the exact same problem, only on their computer, it was Safari and the Finder. You can see in my screenshot, the Finder's messed up, but Safari right next to it is fine. And then in this Reddit screenshot, Safari and the Finder are messed up, but the icon between them, which is like Launch Center or whatever, is fine.

So anyway, this is a 15.2 thing as far as I can tell because it started after we all updated to 15.2. I think this is an Apple bug. Just FYI, if the icons in your dock start to look wonky, it's probably 15.2 and hopefully Apple will fix it. All right. That was quick and easy. Speaking of quick and easy, I have a question for you. So I don't drive around town very much. And when I do, it's usually 10 or 15 minutes. But it seems like some way, somehow, every time I get in my car,

That is the moment that any of my friends decide they would like to talk to me via text message. And so because of that, I am often reading, if you will, and responding to text messages via Siri on CarPlay.

In 18.2, I swear that Siri has gotten more emotional. What does that mean? So, like, as an example, when I said something, gosh, I wish I remember what specifically it was, but I said something like, I said to, I kept saying the S word, I'm sorry, I said to the dingus, you know, I'm dictating a text message, right? And I said to the dingus,

all caps, NO, all caps, FREAKING, all caps, WAY, which the dingus in the car interprets as, you know, NO FREAKING WAY written all in capital letters, right? You hope. And normally, normally what would happen is it would read it back to me, because it's CarPlay, and it would say, okay, sending to Aaron: no freaking way. And that's all I would get.

But ever since 18.2, I swear to you, what I get instead is: sending to Aaron, NO FREAKING WAY! Would you like to send it? And so I feel like Siri is actually expressing some amount of inferred, I guess, that's the word I'm looking for, inferred emotion based on what you're sending. And the most obvious example that I can think of is when I said something like that that was...

uh, surprising. And, I keep saying the S word, the dingus read it back to me in a surprised way. I said something angry, I think similar to what I just described, and it read it back in, like, an angry way. You know, maybe it was the same words, maybe it was different, but it was like, NO FREAKING WAY. You know, I'm exaggerating some, but you get the idea. It is surprising to me that it's showing a bit of emotion, and, and,

I don't remember this being a thing before. I don't remember reading anything about this anywhere, but I swear to you it's happening. So if you guys don't have anything to say about it, that's fine. But listeners, reach out on like Mastodon or something. You don't need to email us or anything like that, but reach out to me on Mastodon. If you've seen a similar experience, had a similar experience online,

Or if there's been coverage about this that I missed, I would love to get an email about that because I totally miss this if this was stated and I just didn't realize it. WWDC was so long ago that we've all forgotten about what was advertised for iOS 18 because it's been rolling out over the course of this whole year, but I'm pretty sure this was one of the features that was advertised way back at WWDC that was coming in iOS 18 and we all forgot about it because at this point, I'm not even going to blame Casey because it's just been so long. This year, they have just been...

It's been a slow roll. These features are not coming fast and furious. It's like when they get around to it. Someone in the chat room is saying it's all of 18, not just 18.2. But yeah, I'm pretty sure this was in fact an advertised feature of the new Siri. I think when I first upgraded iOS 18, I noted that it did that as well.

And I did not like it. I wanted it to go back to being on an even keel. I don't need it to play-act the things that I'm saying. But yeah, I think it's not just you, Casey. As long as it gets it right, which so far it has, I like it. But I don't feel like I noticed it before 18.2, but it very, very well could be that it was 18.0 and I just didn't notice.

Yeah, that's another thing with the 18 release. I'm just going by what some people said in the chat room, but it's like which 18 release had which thing in it and which things are still to come on which platforms? This year, man, like there's, we didn't put it in the notes here, but there's been some stories about like the people who at Apple who are supposed to be working on iOS 19 are delayed because so many people are still working on 18 because 18 is not done. Like we're getting into this type of thing where it's like they will announce a WWDC and

basically all the features that they're going to release over the course of the next year, and only by the next WWDC will they all actually be out. Which is a strange way to do things, because then the next WWDC comes along: do you have another year's worth of features to announce at that point? Are you just going to start announcing things before anyone has begun working on them and just assume you'll get them done within a year? Obviously the AI thing sort of catching Apple, uh,

not by surprise, but sort of like they, they had other priorities and they shifted priorities quickly. Like, I think that has been an issue. Hopefully they'll get back on track, but this definitely has been like the longest release I can remember, and on all their platforms, of just, like, it's not going to be done, certainly not within the calendar year that the WWDC was. So they'll still be releasing features into 2025. Um,

Whereas in past releases, by the time the year turned around, it was just bug fixes. Well, if it was 18.0 and I didn't realize it, that's my bad. But it is weird. I think I like it, like I said, but it's weird. Honestly, I think it's probably for the best that Siri tries to accurately reflect the emotion that we, or the people who send stuff to us, are choosing to express with capitalization, punctuation,

Like, we're choosing to use this stuff. You know, maybe, you know, now I believe we now have support for some basic, you know, bold italics kind of formatting. You know, hopefully Siri could take advantage of that when reading things aloud. I think that's all good. It just has to be done well. And so, you know, once we get used to it, you know, the shock of it being different will wear away. That being said...

There are still certain things that Siri reads. Like if I get a message while I'm walking my dog and it comes in through my AirPods and Siri dictates the message to me over my AirPods, I'm not sure what part of the computational chain there is choosing how to read those out. But wow, it's really dumb in certain ways. I've had it do things like read out tracking numbers.

individually. Like, yeah, yeah, yeah: one, Z, two, five, six. And boy, when you're, like, walking your dog and it's cold and it's winter and your hands are all bundled up in your pockets and you have a hat on and all this stuff, there is no good way to tell it, stop, for the love of God, like, please stop. Well, no, there is, there is. If you enable the head-shaky stuff, you can shake your head left to right, which is, you know, the standard "no" head shake, and, and,

English-speaking languages anyway, or English-speaking cultures, you shake your head laterally left to right and you hear do-do-do-do-do-do and it'll stop talking. Oh, really? Once it has already started? Yeah, yeah, yeah. I'll have to try that. You can also say, hey, dingus, stop when it's talking as well. That's true. Yeah, I know, but usually by the time you get that out and by the time it recognizes it, it's a slow process. Sometimes it also decides to do, you know, so-and-so wrote a long message. Do you want me to read it? Yeah, but...

So far, I assume it's waiting for me to do the head tracking. I turned off the head tracking, the yes-no tracking, because I thought it would be annoying. But I will actually say out loud, yes or no. Turn on the head tracking. Like I said, I hear the little rattles a lot, but I've never accidentally triggered it yet. All right, yeah, because so far, just by vocally saying yes or no, I've never had it once do the right thing. I've never had it recognize it. Fun.

Marco, you have a new toy you wanted to tell us about. Can you tell us about it? I do, yeah. So this is something that breezed by our friend Charlie Chapman. I saw him buy one, and I'm like, oh, that looks kind of interesting. He posted about it on Mastodon, and I immediately ordered my own. This is the TRMNL, the terminal with no vowels.

T-R-M-N-L. It's basically a little, kind of iPad-mini-sized e-ink screen that you use to just display data. It's like a little e-ink dashboard. It's about, it's a little over a hundred bucks. And I got one, and, you know, when I was buying this, I'm like, all right,

My family could use some basic ambient display of information in the kitchen, like on the kitchen counter next to the HomePod. I want to be able to show the weather and some countdowns to important dates and stuff like that. And this does that excellently. But I thought for sure, like, all right, I'm going to put this up sometime when Tiff is not home.

and just hope that it can last as long as possible before she sees it, 'cause I think it's a huge risk to put this up and hope it gets approved. I played to family favorites. I had it do like a three-pane layout: one big pane was weather, one small pane was a date countdown to the next important date, and the other small pane was a quote from The Office, the TV show The Office, 'cause I'm like, my family loves The Office. So I'm like, all right, I'm playing to the crowd here,

maybe that can win people over. I put it up there and a day went by and I'm like, there's no way I got away with this. Because when I change anything in the house, like,

You could move an entire room in my house to a different spot and it would take me a week to notice. But if I change anything in the house, Tiff and Adam both notice immediately. Like they can sense any disturbance, anything being rearranged, anything missing, anything being new or removed. They know immediately. They're like dogs. You ever take your dog for a walk and the neighbor has like a new, like a tiny flag on their lawn and the dog freaks out about it because something has changed in their environment. Right. Exactly. Yeah.

So anyway, so about a day into this, I'm like, there is no way I'm getting away with this. And then eventually, you know, somehow a conversation, somebody's asking something and Tiff is like, oh, is that what that new thing on the wall is doing? I'm like, oh, you saw that? Yeah, yeah, I like it. It's fine.

Whoa. Look at that. It's the magic of e-ink, because it's not emissive. It's not a quote-unquote screen. I'm assuming it doesn't have, like, any kind of front light like the Kindles. Not at all. Yeah. So that's how it sails through, because now it is just like a piece of paper. Yeah. Like, so what I like about this thing. So I just love e-ink. I've always loved e-ink.

I hardly ever read books. And so I don't really have a chance to use E-ink. I also like, you know, when you look at stuff like the Remarkable tablets, those are great. I also never take notes like with pencils or pens. So I don't have much of a reason to use E-ink in most of my day-to-day life, even though I just love it as a display technology.

And it's, I mean, it's very limited. It's very weird, but it's really cool. And so I'm like, all right, for this kind of purpose of like an ambient display, it's actually perfect. I think let's try it. And what I like about the terminal product is, you know, you've been able for years, you've been able to go buy like a basic e-ink screen. That's like, you know,

two to eight inches wide for almost no money, if you get it in like a raw state that you can plug into a Raspberry Pi or something. Those are great; a lot of people have done some fun stuff with those. But that would never pass muster as a thing that looks nice enough to put in my house. Like, that would be the kind of thing that would be, you know,

constrained to my office. And even then, I wouldn't even want to look at that in my own office. But what I like about the terminal is that they've basically taken that type of hardware, like inexpensive mid-sized e-ink screen,

It's not super high resolution. I think it's like 160 pixels. So it's not super high res, but for viewing it at a distance, which is what you're mostly doing, it's totally fine for that. And they've put it in a nice looking, small, minimal white enclosure.

They have a battery in it that lasts, like, a couple of months. So it charges by USB-C, but I just took some Velcro Command Strips, charged it up, and stuck it on the wall, and, you know, throttled back some of the refresh settings so it lasts even longer. And so, yeah, every three months I'll take it down with those Velcro things and charge it for a few minutes and put it back up.

So far, this is great. And it's not an ad, but I strongly suggest it; I think people in our audience would enjoy playing with this thing. I do have an incentive for everyone in the audience to buy these.

The current developer ecosystem of, like, plugins to display things, it's very friendly; you can very easily make your own plugin, but there just aren't that many yet for different services. So I would love for more people to buy these, so that maybe more people would develop plugins for more stuff I might be able to do with it. But yeah,

I'm very happy with this because it's finally taking all of this amazing cheap tech; hardware is so cheap now, especially if you don't need, like, top-of-the-line cutting-edge stuff. And so I'm excited to see products like this, you know, even if this one particular thing isn't exactly what you need.

There's so much amazing cheap hardware now that just makes it so much more accessible. You can use it in so many more places. More people can buy it. More people can afford it. More people can afford to replace it if it breaks. If you have a need for some kind of ambient display in your house, check out TRMNL. I think it's a really cool idea. Their heads are in the right place. It isn't some kind of weird, creepy startup trying to VC you up the rear end or anything. It's just a nice-seeming effort to make a nice product. Yeah.

So far it passed muster as, like, something that can live in our house, and that's, that's pretty impressive. So the three things you're showing on it, is that just, like, built-in stuff? Like, does it, I forget what you said, weather, is it like a built-in weather thing? And then countdown, you just put a date in and it shows you a countdown? Yeah, I haven't yet made any custom things. Frankly, I just haven't had time. I do intend to, because this, it's so, it's so nice and good-looking and

useful that I think, if I were ever to want to make, like, some dashboards for my business metrics, say, this would be great; to have, like, you know, two or three of these in my office, just next to my desk or something like that. But right now I'm just using their existing library of plugins that they already have. I'm using, yeah, the weather, the date countdown, and the Office quotes. Those are all built in.

I think I mentioned a while back that my brother had the need for the same type of thing to show, like, the family calendar. It wasn't e-ink, but it was, like... I don't know. Maybe it was color e-ink, or maybe it was just a really bad-looking LCD. But it looked very similar to this. Similar size, similar proportions, even, you know, similar, like, the little bezel was similar as well. Although I do kind of wish this thing didn't have the Terminal logo on the front and the bezel was equal on all sides. But whatever. It's cheap. But, yeah, like, having...

E Ink, one of the good roles of E Ink is to basically be configurable paper with all the advantages and disadvantages of paper. You can't see paper if it's dark and you don't have light shining on it, but if there is light shining on it, it looks real good and doesn't take a lot of power and yada yada. Um,

So, yeah, I'm not sure if I would use this, but another question: does it come with bundled support for, like, Google Calendar or other calendar things, if you want to show your calendar and not just a calendar? It doesn't have any kind of iCloud support yet. So that's kind of what I want people to make. Otherwise, I'll be forced to make it for myself and I don't want to. But it does have, I believe it does have the Google Calendar support already. It does. If you go to slash integrations, usetrmnl.com slash integrations, you can see the integrations.

What they're currently advertising, for lack of a better word, and yes, Google Calendar's on there, but I also am an Apple Calendar person. So yeah, I like the idea of this quite a bit. And on my personal to-do list of projects that I'll never get to and probably talk about a lot, the top of that list is still Fiverr, but somewhere not too far below that is writing my own dashboard-y things.

thing, like Marco said, probably with Raspberry Pi, because hey, what else can I solve with Raspberry Pi? But I feel like it's a lot of work, especially if I want to make it look good, which is not my strongest suit. And I would have to get a bunch of integrations or find third-party code that does these integrations. It just seems like

way too much effort. I love the idea of this, but the particular integrations I want don't seem to exist. I still use AnyList, which, I think it might have been a sponsor years and years and years ago, but I use that for our shopping lists, and I use Apple Calendar. So this does not fit my needs today, but I concur with Marco's implied point that it could fit my needs tomorrow.

Yeah, and also if you are so inclined to write your own plugins, you pay an extra, I think, $20 for access to the developer keys and everything, and you can do basically whatever you want with it then. And it seems like it's pretty easy because it's all just web page rendering, basically. It's like rendering HTML and CSS to an image. So it seems like, I haven't delved into that yet, but it seems like it's pretty easy to make your own plugins.
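Since the device itself just displays an image the service renders from HTML and CSS, a plugin really does boil down to a function from your data to markup. Here's a minimal, purely illustrative sketch in Python; the class names, layout, and function are all invented for illustration and are not TRMNL's actual plugin framework, which has its own templating conventions.

```python
from datetime import date

# Hypothetical sketch of a TRMNL-style plugin body: data in, markup out.
# The service renders this HTML/CSS to a static image for the e-ink panel.
# All names and classes here are invented, not TRMNL's real framework.
def render_countdown_pane(label: str, target: date, today: date) -> str:
    days_left = (target - today).days
    return (
        '<div class="pane countdown">'
        f'<span class="label">{label}</span>'
        f'<span class="value">{days_left} days</span>'
        "</div>"
    )

# Example: a countdown pane like the one Marco described.
pane = render_countdown_pane("School recital", date(2025, 1, 15), date(2024, 12, 19))
print(pane)
```

The point of the sketch is just that there's no firmware work involved: if you can produce a string of HTML, you can produce a dashboard pane.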

I'm still using Panic's Status Board. I had the app that was discontinued many years ago, but somehow it's still running on my iPad. Um,

Um, yeah, that is very limited. But Casey, if you're thinking of doing this, you might want to try doing it on iOS or iPadOS, because there's that cool Swift Charts library that exists. It depends on what you want to do with it. Like, what this thing sounds like it's doing, rendering HTML to an image, is way more flexible. I kind of wish Status Board did that, but it doesn't. But anyway, yeah, this is cool. I'll definitely keep it in mind if I have a need for something. I already said last time that I didn't think I had any need for a, uh,

a digital picture frame. And apparently I do. So maybe this will find a place as well. Like, I'm thinking, because, you know, we do use Google Calendar, maybe having this... Well, so the thing is, we have a paper calendar on our fridge, but the reason we do that is just to see the pictures, essentially. I don't think anyone... we should just take the paper calendar and fold it over, so all you see is the picture on the top page and not the calendar. No, actually, I do look at it. Here's when I look at the calendar that's hanging on the fridge. I see the pictures, and I like seeing them; I make the calendar out of my own pictures, you know, one picture for each month. Um,

The main time I look at it, and I did it today, was when I'm taking the milk out of the fridge and I see the expiration date. I glance at the calendar that's on the fridge door to see what day it's today. Is this expired? How close is this to being expired?

So yeah, maybe, I don't know, maybe there's a role for this somewhere. I'll think about it. Interesting. I'm looking at the API documentation, which we'll also link in the show notes. And they have a bring your own device section. And they say, look, the components are probably going to cost more than what we have put together. Yeah, we share this not to dissuade or pitch you, but rather as a friendly FYI. Making your own terminal from scratch is not an economically rational decision, but a labor of love.

love. I mean, John, you could make a custom plugin that would, you know, every time you buy new milk, you enter the expiration date in your API somewhere, and then it just counts down the days on the terminal on the fridge: you know, the milk has three days left. There's more than one thing of milk in the fridge. But yeah, it's just easier to pull it out and glance at the calendar as you close the door. The system works fine. But that's when I use it, that's when I actually look at the calendar portion. The rest of the time I'm just looking at the image part.
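Marco's milk joke is, as it happens, about five lines of code. A hedged sketch of what that hypothetical plugin's data side might look like, with the data hard-coded here; in practice it would live behind whatever API the display polls, and handling multiple cartons just means sorting by expiration date:

```python
from datetime import date

# Hypothetical sketch of the milk-countdown plugin: record an expiration
# date whenever you buy milk, and the fridge display shows how many days
# each carton has left, soonest first. Data is hard-coded for illustration.
def milk_report(expirations: dict[str, date], today: date) -> list[str]:
    lines = []
    for name, expires in sorted(expirations.items(), key=lambda kv: kv[1]):
        days = (expires - today).days
        lines.append(f"{name}: expired" if days < 0 else f"{name}: {days} days left")
    return lines

report = milk_report(
    {"whole milk": date(2024, 12, 22), "oat milk": date(2025, 1, 3)},
    today=date(2024, 12, 19),
)
print(report)
```

Whether that beats glancing at the fridge calendar while the door is open is, as John says, another matter.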

I like the idea of this quite a lot. I really, really do. And honestly, it's the first one that doesn't look like somebody's DIY project. Like, it's actually a legitimate product. It isn't, you know, flimsy 3D-printed. Flimsy 3D-printed, yeah. And it isn't, like, you know, held together by tape and screws. It's just, it's actually just a nice-feeling and nice-looking product. Ooh,

We are sponsored this episode by Masterclass. For a gift that's always on time and lasts a lifetime, you can't do better than Masterclass. Your loved ones can learn from the best to become their best. This can be a gift for unlimited learning. You can learn from any Masterclass instructor anywhere, on a smartphone, computer, smart TV, even audio only in audio mode. So this can do all sorts of wonderful things for you or as a gift for your loved ones.

You can prioritize like a boss with multimedia icon Elaine Welteroth. You can learn how to invest in the stock market with Ray Dalio and some of Wall Street's best. You can build a billion-dollar company with self-made billionaire Mark Cuban. And these classes really make a difference. 88% of members feel that Masterclass made a positive impact on their lives. So this is a great way to just...

Get yourself into new things this year. Develop skills. Try new things. This is the time you have both the holidays with gifting season and you have the new year. And this is a great time to kind of take some, evaluate what you want to be doing with your life and say, maybe I do want to get better at this hobby I've always wanted to do but never quite felt right. Or maybe I do want to get better at work or develop my skill in these areas. Masterclass is great for that.

Masterclass always has great offers during the holidays, sometimes up to as much as 50% off. Head over to masterclass.com slash ATP to see their current offer. That's up to 50% off at masterclass.com slash ATP. Masterclass.com slash ATP. Thank you so much to Masterclass for sponsoring our show.

Last week, there was, I think it was last week, a new episode of Apple's immersive video wildlife show. The previous episodes were, I think, Rhinos first and then Elephants second. And now there's a new episode about sharks. And...

It's pretty good. It's like, I don't know, less than 10 minutes. And you should check it out if you're one of the dummies like me that bought a Vision Pro. So check it out if you haven't. I just wanted to call it to people's attention because I feel like because not that many people seem to have Vision Pros...

When new things like this drop, I often am unaware until I put my face computer on my face again and just go, you know, spelunking through the Apple TV app. So this is the service I provide to you listeners for all six of you that have one of these. You're welcome. And you also provide a personalized recommendation service to me because you say, hey, there's a new thing out in the chat, like before we record.

So I also, I watched the Sharks. Oh, what did you think? It was delightful. Again, this is what I want to see more of. So I'm happy to see that it's happening. It's very slowly, but it's happening. We are getting more immersive video. It still feels like...

demos of content, not content. Like, it still feels like these are little snacks and little previews of what could be done if somebody gave us, like, full-length content. Because I think, I think the sharks thing was maybe, like, what, seven or eight minutes long? Something like that. Yeah, like, these aren't long. These aren't even as long as sitcom episodes. They're just, like, you know, seven-to-fifteen-minute chunks of

example content, and it's good example content, but it still feels like example content. That being said, there was also news, I think yesterday: there was that Blackmagic camera announced. You see that? Oh yes, yes. I'd actually like to talk about this for a second. So basically, Blackmagic released, or announced rather, a camera that's shipping in, I think, a couple of months.

It's a dedicated hardware camera specifically for recording Apple Immersive Video. It looks basically like, you know, two giant lenses, like two eyes, on the front of what otherwise looks like a fairly common Blackmagic camera back. It's two, you know, 180-degree kind of fisheye-style lenses, each one having an 8K image sensor behind it.

So it records 8K per eye at, I think, up to 90 frames a second. And then, of course, you know, all the different video specs of color depth and everything else. Before this, I'm not sure what people have been using to record the immersive video to date. I know

Canon has that, like, we talked about it when they announced it. Canon released, like, a dual fisheye lens that you could attach to some of their more recent cameras. The Canon setup is about $5,000 all in. If you look at the reviews of people who actually try to use it, first of all, it is not as good as the Blackmagic thing in specs in that the Canon lens is using two eyepiece lenses.

you know, lenses to project an image onto one single 8K sensor. So it kind of just divides the sensor in half: it projects the left eye onto half the sensor and the right eye onto the other half of the sensor. So the Canon one is lower resolution, and I believe only up to 60 frames a second. So the Blackmagic camera is higher-specced. It is, you know, dedicated for this purpose, designed right from the start for this purpose. And instead of costing five thousand dollars, it costs thirty thousand dollars.
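The resolution gap being described is easy to put rough numbers on. This is back-of-the-envelope arithmetic only, assuming a nominal "8K" frame of 7680x4320; the actual sensor dimensions of both products may well differ from that.

```python
# Rough comparison of the two capture designs, assuming a nominal "8K"
# frame of 7680x4320 (real sensor dimensions on both cameras may differ).
FRAME_W, FRAME_H = 7680, 4320

# Canon-style: both eyes share ONE 8K sensor, projected side by side,
# so each eye gets only half the horizontal pixels.
canon_per_eye = (FRAME_W // 2) * FRAME_H

# Blackmagic-style: a full 8K sensor behind EACH lens.
blackmagic_per_eye = FRAME_W * FRAME_H

print(canon_per_eye)                          # pixels per eye, shared sensor
print(blackmagic_per_eye)                     # pixels per eye, dedicated sensors
print(blackmagic_per_eye / canon_per_eye)     # dedicated is 2x per eye
```

So under this assumption the dedicated-sensor design delivers twice the pixels per eye before you even get to the 90-versus-60 frames-per-second difference.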

Oh, how cheap? A bit of a jump. Now, this sounds ridiculous, right? I know how this sounds, a $30,000 camera to shoot Apple immersive video. But again, if you look at the Canon reviews, people trying to use the Canon double lens to do this,

The main challenges seem to be on the software end, like whatever data format the camera is generating, it's like a special format that you need their special software to transform into anything else. And apparently, not only does that require a subscription that nobody wants to pay, but apparently it also is not easily compatible with Apple Vision Pro or Apple Immersive Video and is also not as good of quality because it's not having 8K per eye or more than 60 frames per second. So,

I am actually excited to see this Blackmagic camera release. And I wonder, I don't think we know what Apple has used to shoot the existing Apple immersive video. Maybe they were just using like a preview beta version of this camera. It's possible. That wouldn't surprise me at all. Like maybe they worked with Blackmagic to develop it. And part of that process was they were able to use pre-production versions to shoot all this other content. That was all shot on iPhone.

Yeah, okay. Because you can buy a $30,000 camera or you can just use your iPhone. Yeah, right. What's the difference? But the reason I'm excited about a $30,000 camera is not because I'm going to buy it. Not only is that obviously $30,000, but also it's like a black magic professional camera. I have no idea what the process is to shoot with that camera and then whatever data format it makes to take that onto...

whatever computer, however it's edited, I have no clue. I was gonna say, you probably need a PC and Adobe Premiere or something. No, well, the Canon one you do, actually. But, uh, no, the Blackmagic, I think, I think it goes into DaVinci Resolve. But I've, you know, I've heard of that as a name; I don't know what that means. I've never seen the app. But, you know, you're, you're practically an expert.

So the point is, right now, whatever this camera's going to be when it comes out in a couple of months, it's going to be $30,000. It's going to require who knows what else in the ecosystem around it to get that video from shooting it to actually being able to play on a Vision Pro. Who knows what app...

plays those? How do you get the data into the Vision Pro? Or do you have to put it in the Apple TV marketplace, however that works? Does Apple Photos support it? Can you put it in your photo library? Like, who knows. Those are all going to be worked out over time. But what's exciting about this is the existence of this camera being publicly released, and probably being significantly better than the Canon solution, because the Canon solution was just kind of, you know, bolting an existing lens onto cameras that weren't designed for this. This is designed from the start for this.

I'm not going to buy a $30,000 camera to do all my video projects that I do. All zero of them.

But other people will be able to who know what they're doing. Video production companies will be able to buy or rent these and shoot a concert, shoot a play, a nature documentary, whatever it is. Getting these cameras out there will be a huge boost to getting more immersive video content made. Whether it's the high-profile content snacks that Apple keeps giving us

Or, I think more interestingly, get these into the hands of YouTubers and smaller production houses. Because again, they'll be able to rent these. So if you're doing a shoot, I'm sure you can rent one for a few thousand bucks or something like that. And that makes it more accessible to smaller productions. So this, I think, is an important milestone. I think this is more important than just breezing through the news for an hour. I think this is actually going to be a much more important deal than that. And

yes, this is $30,000 today. The Canon solution is, you know, quote, only $5,000, and that's giving you much of the functionality of this. Granted, you know, the reviews say it's not very good in different ways, but those are mostly workflow problems and software and support problems. It's only a matter of time before

Maybe it'll be two to four years when there will be maybe one or two other options that are substantially in the ballpark of that Blackmagic camera and capability, but maybe a little bit more consumer and price friendly. Maybe in four or five years, maybe we'll be able to get something like that for $2,000 or $3,000 in the prosumer market.

And that becomes something that like maybe nerdy people like us start to buy and experiment with. Or we start to rent it from camera rental companies for like important family events. I would love the ability to buy or rent something like that and capture like

basic, you know, when my kid has a band concert. I mean, obviously that would be a little disruptive to the audience members behind me, so maybe I work with the school on that one. But, you know, or a family event, Christmas Day. Like, when Apple Immersive Video capture becomes accessible to prosumers and consumers, I think that's going to be a big deal, if the Vision Pro still exists by that point, because I hope it does. Because

Yeah, it's great to be able to have a shark video here and there. But what I want is to capture the people I love and the moments of my life in that format. Because it really does feel like you're transported right there again. And that's so much better than little postage stamp dream blobs you get from the iPhone immersive video. And I'm glad to have those dream blobs. It's better than not having anything at all that's spatial. I'm glad we have that option.

But the immersive video with the 8K per eye is so much better. It's like it's an order of magnitude different in, you know, in so many ways, in quality, obviously, in frame rate, in just how much of the scene you're capturing and how it feels to watch it. And you really feel like you are in it. That's a very different thing. And so when that becomes accessible to regular people in some way, even if it's, you know, in a few thousand dollar camera, like,

Photo enthusiasts have been buying few-thousand-dollar cameras for many years. That's not something everyone will be able to afford, but it is something that gets it into the prosumer market in a much better way than a $30,000 custom pro-everything camera. So this, I think, is a very important step. This camera being released that shoots directly to immersive video, apparently, or at least can be easily processed into it, that's a really big deal, because it signifies that we are on that path.

And that maybe in a few more years, and that now, production companies now will be able to rent that and shoot a concert or whatever, and that's going to be huge. But then it's going to be even better when we can shoot our own memories with that. Yeah, and I just want to reiterate, I know you just went over this, but I harp on this regularly because it's very hard. It seems like a distinction without a difference, the different kind of modes that you can get in in the Vision Pro for video. And I just want to reiterate, there's things...

3D movies where you have a rectangle, but a rectangle that has depth to it, right? So it's just a rectangle, but you can see into the rectangle. It's just like a 3D movie. Then there's these dream blobs, I think you called it a minute ago, which is a very accurate description, which is where you have a very small, often square, if not rectangle, that the way they represent it and render it, the edges kind of like

are a little like wispy, if you will, and kind of fade out, sort of, kind of. And that is when you've recorded something on one of your devices, like an iPhone or whatever, and it also has depth. But unlike a 3D movie where the canvas, if you will, is multiple feet wide and multiple feet tall by default, you know, the canvas or the viewport, for lack of a better way of describing it, is very, very small by comparison.

And with both of these, with 3D video or the, excuse me, the 3D movies and with spatial video,

You don't get to control where the camera's pointing. You are along for the ride. There's depth, but you're along for the ride. And what we're talking about, the key that makes this so impressive is that immersive video, and I'll continue to harp on this for a long time now, immersive video, as you move your head and as you look around, you're changing the perspective of the camera. You're effectively moving the camera. You're looking at different things. And the jump from that,

a 3D movie or a spatial video to immersive is just night and day. It is wildly different. Wildly, wildly different. And even if you can reason through what that difference would feel like, I assure you, actually living it is very different than what you think. And at the end of the day, I couldn't agree with you more, Marco, that...

Being able to capture full-on immersive video would be such a game changer. And I'm not going to spend $30,000 to do it. I don't think I'm even going to spend $1,000 to do it. Like Underscore had tooted, I'll probably forget to put the link in the show notes, but he had tooted, you know, hey, I can imagine being able to rent this for $1,000 and just goof off and try some things. And I agree with Underscore.

for me, $1,000 is still a bit rich for my blood, but the point is still fair. And I absolutely concur with you, Marco, that how cool would it be if you could, like, set this up, and I don't know what the mechanics of this would look like, but set this up when everyone's opening Christmas presents or a Christmas dinner or something like that. Yeah, or like,

Or you're at your wedding. God, yes. And you can look around, and, let's take your wedding as an example, you can watch, you know, yourself exchange vows with your partner, but you can also turn your head and look and see how your parents are reacting. And,

you're not losing anything because it's not the camera. Like it's, it's not the recording that has moved. It's your, your viewport into the recording. So you could turn your head right back and look at your, you know, partner. And then you can look at their, their parents, you know what I mean? And so it is,

so incredibly powerful, and I cannot wait until that is in the hands of someone like a prosumer, like you've been saying. But it's important to understand: if you've seen a 3D movie and you think, oh, that's it, this is not that great: no, no, no, no, no. This is more like 3D IMAX than it is anything else.

Yeah, this, more than any other format, makes you feel like you are there. You are in that room with the people. So, you know, I think back to videos I've taken of relatives who have since passed away. Like, I'm so happy I have those videos, because

This you feel like you are in that room. So like imagine the emotional weight of, you know, if I had like the video of my grandma with my kid, like if I had that video in this format, I mean, obviously I probably would not have hauled.

a $30,000 giant camera into the assisted living facility for that video. But like, you know, so it's obviously very important for the phones to keep getting better at capturing that stuff because they're always with you. But we don't know what media will be important to either us in the future or our future generations.

You know, like when we look back at our own childhood, you know, there's almost no videos of us generally. There's, you know, we're lucky to have like, you know, a few photos here and there. And then when you go to like our parents or our grandparents, there are usually like there's maybe one picture of them when they're young. Maybe it's their wedding picture and that's it. Like, you know, like as we move forward, there's,

you know, it becomes more commonplace to have more media captured of people and in more rich formats, whether, you know, moving first to first from photos into like color photos and then better photos and then more photos and videos and more videos. I think it will be, I think we will, if you, if you start capturing this kind of video somehow, when it becomes available to you, capture important moments here and there, you know, while it's still expensive and hard to do, maybe it's only things like weddings or whatever, but like

Over time, that will become more common and you'll be so happy you have those videos. And so I'm looking forward to that because I thought back before the Vision Pro was out, I thought that the version of immersive video that the phones were shooting now and the Vision Pro itself as a capture – because the Vision Pro itself can capture video, like spatial video –

But the quality is not great, and the immersion, therefore, is somewhat limited. As Casey said, you're only capturing a rectangle in front of you, and the resolution and frame rates are pretty rough. They're pretty bad, honestly. So this is like a leap above that. This is a huge leap above that. And so, yes, it's nice to have some spatial video of people here and there. It would be a heck of a lot nicer to have this.

So I look forward to this being more available over time. I think the realities of dealing with dual 8K video streams with 180 degree fields of view, there's always going to be some difficulty in shooting that. That's always going to be a little bit of a skilled operation to do it well. And I think for a foreseeable future, that's going to be somewhat expensive to do also.

But $30,000, it sounds ridiculous now, but trust me, to a production house, that's not that ridiculous. And again, there's the rental option. And it's only going to come down from here. So I look forward to this. I think this is a big deal. I am tempted to buy one. I'm not going to. So you say? I'm not going to. But I might. But like what Underscore was saying, I might rent one someday. I mean, obviously, I think it'll be a while before they're even available for rental, like from the rental sites. And they will probably cost, you know.

three to five thousand dollars for a few days with them. So even that's probably going to be more than I want to spend on my curiosities here. But I can see the future coming. Like, the sun is rising in this area; I can see the little glimmer of light. I think this is going to be really cool in not that much time from now.

On the price of these things, like you said, I know $30,000 sounds like a lot, but just the regular digital cameras they shoot movies with get into those prices, and they have multiple ones of those cameras. So this is in line with prices of fancy cameras that professionals use to shoot movies, although professionals use iPhones to shoot movies too. But anyway, I know it sounds ridiculous. It's just not a consumer device. It's a pro device. Just go look at how much the equipment costs to shoot your average Marvel movie or whatever. They're expensive. Another thing is,

You know, so as you said, there's the professional cameras, and then we've got our iPhones with what they're doing. And obviously it's not great, because the cameras are dinky and small, low resolution, and they're like five millimeters from each other. But I am looking at the picture of this camera and I'm looking at my phone, and I am wondering:

if the iPhone 20 had cameras on the far ends of the phone, and you were to line those cameras up with basically the interpupillary distance of this camera, would it match? Because I think those camera things look like they're about the same distance apart as your eyeballs. Like, that's what they're going for, right? Big lenses, but the centers of them are like human eye distance from each other. Well, an iPhone held sideways might.

The corners of that are also within human-eye interpupillary distance from each other. And if you put...

a 180-degree fisheye camera on either end of your iPhone, and if they were 8K, on the iPhone 25 or something, obviously it's not going to be as good as a thirty-thousand-dollar camera, but you're getting closer to what we're looking for. Instead of the little tiny rectangular field of view, you know, the blob or the postage stamp, where you turn your head and what you see is not the thing you recorded, because it's not there, because the field of view is so narrow, right? Yep.

And the field of view being narrow is mostly due to the lenses. They simply can't capture images that are off to the side of you, right? I mean, if you look at these lenses, you'll see they look like spheres poking out of it. Your iPhone can't capture that. The light coming from the side just hits the side of the camera lens. They're flat. That's not going to work, right? And the depth perception being slightly off on your phone is because the cameras are just so darn close. The left and right images aren't that different from each other, right?
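The depth point here is just stereo geometry: for a simple pinhole stereo pair, the on-sensor disparity between the left and right images scales linearly with the baseline between the two cameras, so phone-width lens spacing gives a much weaker depth signal than eye-spaced lenses at the same subject distance. A minimal sketch of that arithmetic, assuming illustrative numbers (the ~63 mm interpupillary distance is a commonly cited human average; the focal length and phone lens spacing are guesses, not measured values):

```python
# Disparity for a pinhole stereo pair: d = f * B / Z
# (f = focal length, B = baseline between cameras, Z = subject distance).
# All values in millimeters; the constants are illustrative assumptions.

def disparity_mm(focal_mm: float, baseline_mm: float, depth_mm: float) -> float:
    """On-sensor shift between left and right views of a point at depth_mm."""
    return focal_mm * baseline_mm / depth_mm

FOCAL = 6.0            # assumed focal length, mm
SUBJECT = 2000.0       # subject 2 m away
HUMAN_IPD = 63.0       # average human interpupillary distance, mm
PHONE_BASELINE = 15.0  # rough guess at spacing between adjacent phone lenses, mm

human_like = disparity_mm(FOCAL, HUMAN_IPD, SUBJECT)
phone_like = disparity_mm(FOCAL, PHONE_BASELINE, SUBJECT)

# Disparity is linear in baseline, so the ratio is just 63/15 = 4.2x.
print(f"eye-spaced baseline:   {human_like:.3f} mm of disparity")
print(f"phone-spaced baseline: {phone_like:.3f} mm of disparity")
print(f"depth signal ratio:    {human_like / phone_like:.1f}x")
```

Whatever the exact numbers, the linear relationship is why cameras built for this put their lens centers roughly an eye-width apart.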

We want, when we're looking through our eyeballs, the left and right images to be different from each other based on how far apart our eyeballs are. And everyone's eyeballs are a little bit different distance apart, but that's basically what we want out of it. And this camera achieves that. And another thing that I have a question about. Maybe someone who knows about these cameras can write in. So I'm looking at this camera, and I'm thinking about, like, you know, the professional movie cameras that you see, the

Arri Alexa or the RED cameras, all the companies that did well in the modern digital camera space, and

they sell the camera bodies, which cost as much as a car, and they're just like a cube. And then you buy their incredibly expensive twenty-thousand-dollar lenses that you stick onto that cube, and then you buy their SD card that is really just a ten-cent thing from a drugstore, but they charge you a thousand dollars for it. Although, did you see what they charge for an eight-terabyte SSD? No. Is it less than Apple or more? It's like half what Apple charges.

Oh, lovely. I think the RED cameras had... Back in the early days, the RED cameras had flash storage that was more expensive than Apple's. Like, again, it's the same storage, but it's like, well, this is the special RED stuff, and we guarantee it. It's quite a racket. I mean, they're selling these to Hollywood studios with multi-million dollar budgets, so it is what it is, right? But my question about these cameras...

With like the regular ones, like I said, you buy the body, but then you buy the lenses and the lenses tend to cost as much or more than the body because you can buy 17 different lenses depending on what you need for your shot. Are lenses a thing on this camera or can you not use lenses at all because...

Like, what's the point? Like, you need to get... You need to get 180-degree field of view. That's the... Like, so if you put two lenses on there and they also got 180-degree field of view, like, is being... Like, I don't understand how lenses work with that wide of a field. Maybe it just works the same. Maybe it's like you're looking through binoculars and it's like you're sitting...

You know, if you put this at a concert and you had, like, a lens that magnified, it would be like you were sitting five rows up from where the camera is placed, because it's magnified, but you'd still get 180. That can't possibly be true, right? Because the things from the side... With a regular lens with a narrow field of view, you can essentially zoom in and it seems like you're closer to the concert. But with a 180-degree field of view, the thing that would be directly to your left if you were in the front row is never going to be directly to your left

If you're in the back row, like you'll never have, you'll never be able to see that person's ear dead on because they're in the front row and you're in the back row. You know what I mean? So I think like I'm looking at this camera. I'm like, it doesn't look like there's any place where you could put a lens on this thing.

I mean, maybe. I don't know.

I think it just has to be that because what you're trying to do ultimately is you're trying to match the visual perspective of what our eyes see. Well, our eyes don't see 180. Our peripheral vision is garbage, but you're trying to make it so that when Casey turns his head, there's something there that he can see.

Yeah, and I mean, by the way, the resolution on the edges of immersive video is also garbage. Like, we say we can turn our heads around to look at the sides, and we can, but it's very blurry and low res when you get to the edges. It's better than our peripheral vision, I'll tell you that. Yes, but for the most part, you want to focus straight ahead. But what you get in the field of view is useful for, like, just additional immersion and context.

So, yes, you can look over to the sides and the top and the bottom and everything, but you generally don't spend much time doing that. Also, they're out of focus. I mean, you don't have to look 90 degrees. You can just look off center a little bit. But, like, I mean, they obviously have the 360 cameras as well, 360 cameras that, like, erase the tripod that they're on, which is just, you know, sort of the end game of this. Like, I capture video in every direction and, like, you know, cleverly remove the monopod that my camera is sitting on.

you know, the resolution is lower on those. They're even more limited. But with things like this, it seems to me that different lenses for this camera

are not useful or a thing, both based on my reasoning about it and also by looking at the physical item, I don't see how lenses would attach to it. Like the two lenses that are there are too close together. So that's an interesting... If that's true, someone write in and tell me, do they have lenses for them? Do they not? Would lenses just massively narrow the field of view and maybe that's a useful thing to do or not? But I don't even think it can physically attach. But anyway, someone who knows, write in and tell me. But either way, that's like, if it's the case that lenses are not as important on this, that's quite a change from...

decades and decades of, you know, movie cameras, where it was all about which lens do you choose for this scene, and how do you do this. And that was just, like, we reached the end game of a 360-degree camera in all directions,

there's no more... like, lenses are not a thing. We put the camera in a position and light comes from everywhere toward it, right? I mean, yeah. Maybe it's not really the end game, because Casey's saying, like, you know, you can turn your head and stuff like that. But one thing you can't do, unfortunately, with these cameras and any of the immersive video we've seen here, with the exception of, I think, the immersive environments that Apple puts you in, is, if you stand up in your concert seat, your perspective on the people on stage does not change.

Because guess what? The camera didn't stand up. The camera was the same height the whole time. So if you wanted to see the top of that person's head, to see if there was a piece of confetti that landed on their head, and you can't see it when you're sitting, and you stand up, you still can't see it, because you can't change your perspective in that way. All you can do is look at different parts of the frame that was captured, right? With the Apple immersive environments, where, I don't know what they do with, like, the 3D modeling part or whatever, you basically need to be essentially in a game engine, and

where, oh, if you're in a game engine, when you stand up, you can look on top of the dresser that previously you couldn't see the top of because it's all 3D rendered in real time and it's literally changing essentially the quote-unquote camera's position in space. So for immersive video to work well, and it does work well,

You stay in the same position. You stay seated. You can look up, look down, look left, look right. But what you can't do is stand up or step 10 feet to the left because that will not change your perspective on the video. Yeah. But I think the question about whether we need lenses on this or whether you can, I think that's immersive video, like immersive 180-degree stereo video.

I would almost treat it as a different medium than cinema or cinematic video. What's important is...

your perspective, but what's more important is, like, what your environment has in it. There's less camera work that can be done, because, first of all, if you move the camera too much, it makes people motion sick. And that's still a problem that I had a little bit with some of the shark video, but less so; I think they're getting better at that the more they make. It's almost like, you know, asking what kind of lens concertgoers use to see a concert. It's like, that's

That's kind of an invalid question. That's missing what this medium is. People who stage a theater play don't need to care what kind of lens people might be using in the audience. It's not about that. This medium, there's actually less camera trickery potential to do because what you're mostly doing is just capturing a scene in a way that the viewer is in more control over it than you are.

So what you want to do is capture as much as you can because you're going to get the 180-degree field of view no matter what. So capture whatever you can. Capture it and give people the freedom to look at whatever they want in the frame and guide them by putting the important stuff in the middle. But for the most part, they're going to be looking around a little bit more than you might expect. And I think it's just going to take a little while before people really get that who are producing it. But this is getting better. I just –

Like, you know, all these Apple videos, as I kind of said earlier, they're like snacks of content or previews of content. I kept wanting to just like sit on a shot for a while. Like the shark video is in the Bahamas. And there's one shot where they're doing like a quick helicopter flyover of like one of the islands in the Bahamas.

I've never been to the Bahamas. I was looking around. I'm like, oh, is this what the Bahamas look like? And I'm looking around, and before I know it, it's gone to another shot. Because it's like, oh, well, there's three seconds of what it's like to be in the Bahamas. I want more. They would have like, they're feeding some sharks under some water, and you can look up and you can see the boat in the water.

above you, like on the surface of the water. Then it explodes. Yeah, and I'm like, I'm looking up at the boat and then it's gone because the shot changes. Like, I want longer shots. I want more, I basically want more immersive content in the sense that

I want to just know what certain environments are like just to sit in. Let me just sit in one of these. Give me a 10-minute long fixed shot of something that's not that interesting, but just an environment to be in. Give me 10 minutes of sitting on a beach in the Bahamas. Give me 10 minutes of a boat ride or something like that kind of thing. For a nature documentary, how about just stick a camera in a nature preserve where there's some animals off in the distance and just give

give me a 10-minute shot of that, so I can just sit there and look at the animals and just enjoy it. That's why they've got to do a full 3D engine of it, so you can actually get up and walk around. And that's what the environments are, if they can fool you into thinking it is. Although, of course, you could also do the... I don't know if you remember this, back from the, uh,

The weird old days, this is before your time in the Apple world, but one of the things they would do, do you remember QuickTime VR? Of course you don't. That was way before your time. I remember it being a thing. And I remember using it. I remember that I have used it, I should say. I don't remember what the actual use of it felt like, for lack of a better way of describing it. Was that where you could point and click and hold and drag the viewport around, basically, with your mouse? Is that right? Yeah, it was a 360-degree field-of-view, extremely low-resolution photograph.

Right. And so once you have a 360 degree photograph, minus the tripod, which they didn't really know how to erase, so it was kind of down there, right? Then you could look all around the photograph at your different places. And one of the things people did with QuickTime VR is, you know, they did it for real estate, but they did it for other things too. They would...

take the QuickTime VR camera, the 360 camera, and put it in 17 different spots within an area and then allow you to move between individual, essentially individual spheres of 2D imagery. We like cross fading between, kind of like Google Maps when you click forward, like on Google Maps and you go to the next part in the street, right? It was exactly like that. Do you never look at,

houses on Zillow just for grins and giggles? What is it? No, but this is... like, QuickTime VR was the nineties. It was so long ago. Yeah. Um, but I was thinking of it from Marco's Bahamas thing, uh,

If you had 360-degree, 8K, two-eye video, you could shoot it forward, backward, up, down, left, right, and then move two feet and shoot forward, backward, up, down again. Like, the data would be massive. It would be better just to do it in a 3D engine. But you could brute-force this into providing...

some semblance of, essentially, like, 3D Google Maps Street View, where you could move around in the space, and within each space you could look anywhere. You could even do the thing where, when you stand up on the couch, your perspective does change, because they also shot from a foot higher and they could fade between them. It's probably just easier to do it in 3D, but I was just thinking about the QuickTime VR thing. Or, like you said, the current real estate things are essentially that QuickTime VR thing, but it's so cheap to do now. You buy a ten-cent 360 camera

on Amazon and you just stick it in a bunch of rooms, and you jump between the spots, like QuickTime VR. I don't think that's the case. Like, Matterport, I think, is the company that does this. And I,

I think it's more involved than you're giving it credit for. You could do the fancy version too, but the tech is cheap enough now that you can do the janky version real easy. Well, fair. But yeah. I mean, it is good for real estate. It does give you a perspective on things, but that's as close as we can get to having like, okay, but what if I wanted to have a different perspective? Well, we had six spots in this room. So you can jump between those six spots and look from those six spots. Yeah, but I mean, it is very important though. That's getting...

almost ubiquitous now in new real estate listings. I've even seen it on certain hotel room bookings, like, here's what this type of room looks like. That kind of stuff is actually very useful, and I'm looking forward to seeing more of that. Yeah, it is, but it's not... it's a video, and it only has six spots in the room. So for your Bahamas thing, if you wanted to, like, be in the Bahamas with enough room to maybe walk around in a 10-foot circle or something, as opposed to being confined and not being able to change it, like Apple's immersive environments, which

I'm assuming they use 3D-plus-photo stuff or whatever, and you can actually move around a little tiny bit. And moving around, I believe, even if you move a little tiny bit, does actually change your perspective on the things that are in there. Yeah, because those are just rendered 3D environments; those are not videos. Yeah, I think there's probably some photographic things used for textures and backgrounds in those as well. But, yeah, that's kind of like what you're going for, like the stuff in the...

What is it called? The ILM thing where they did the Mandalorian, the big LCD screens. Oh, yeah, I know what you mean, like the big 360-degree screen. Yeah, if you can think about that all happening inside, because the way that works is they have cameras shooting actors on a stage in front of a screen, and the screen is just a wraparound screen, but the things that are projected on that screen are...

only looks sane from the perspective of the camera. If you're just off to the side of the camera, looking at what's on the screens, it just looks like garbage, because what they're doing is, as the camera moves, all the imagery moves behind the camera, driven by, you know, a 3D engine like Unreal. I don't think they use Unreal anymore, but whatever 3D engine they're using, it's basically a game engine, with the camera being a real physical camera in the real world, and real actors and sets in front of it. It's very clever the way they do

it. You can imagine doing that in VR, only in VR, your head is the physical camera that's in that thing. What is it called? Someone in the chat room, tell me; it's really annoying me now. It's not the Sphere, because that's the Las Vegas thing. That's something else. Yeah, the Volume. Thank you, Dr. Calhoun. If they can do that in a way that is good enough to fool Marco into thinking he's seeing the real Bahamas, you know, you've accomplished the goal.

I mean, that's what I want. Like, you know, part of what I have been kind of begging Apple for since this came out is like more environments, please. Because the environments that are 3D, you know, game engine rendered kind of things. But like with this Blackmagic camera and like with, you know, more immersive video cameras, hopefully down the road that are more available to people.

Like, when you look at what kinds of videos people watch on YouTube, it's everything. It's stuff you would never even think people watch. They want a 24-hour yule log video in immersive 3D. That's the thing. Like, you can think, oh yeah, you can make nature documentaries on YouTube. Yeah, you can. But there's also people who just stick a camera on something: here is eight hours straight of this thing that people find relaxing.

That's what I want more of. I want to get these cameras into the hands of everybody possible who might make video. Because the reason they were able to make videos like that on YouTube is because video cameras got cheap and widely available so that everybody...

could, you know, take out the phone they already had and stick it in front of a fire for eight hours, and record that video and put it on YouTube for free, and somebody would find that and be like, oh, this actually is what I'm looking for right now, thank you. The more we can get immersive video cameras out there into the world, the

the closer we will get to the world in which it is worth somebody's time to stick one in a vacation destination or a relaxing mountaintop or whatever and give us those eight-hour YouTube videos of just that. You want dual perspective, 180-degree, immersive, 8K, ASMR videos. Basically? I mean, I'm not an ASMR person, so I can't say for sure, but I think that's what I want. Yeah, the 3D audio is real important for that, too.

Yeah, but like, and again, it's all like, this is all super easy stuff to capture all at once when you have the right equipment, which they do. Because like, what's interesting is like, this does not require them to set up a whole studio or a whole set or have a bunch of actors or staff.

Somebody could take one of these cameras and literally just, like, bring it to an attractive environment in nature and just capture a day there and, you know, figure out what to do with it afterwards. I mean, well, they'd probably fill up all those eight-terabyte storage modules pretty fast, because this video is huge. But let us stream it live. Your grandchildren... the cheap bird feeder camera that they buy online will have this in it. Yeah, right.

And there are challenges too, again, with just the size of this video right now. That's very challenging. But when 4K came out, hell, when HD came out, those were very large for the computers and networks and disks of the time. Now you can stream 4K live and it's fine. It just works in so many places now, and it's not that big of a deal anymore. So it's only a matter of time before this dual 8K format

becomes a little more wieldy. I know, unwieldy. Is wieldy a word? Make up words. Yeah, right. Like I always say, we are...

We only need probably a few more doublings to do what we've already done with audio, which is max out human perception. For this particular format of can't change your perspective, immersive video shot from a single point, we're not there yet. But we're one or two or three more doublings away from there's no point in making this higher resolution because human eyes can't distinguish it, just like we were with audio, where there's no point in making this higher resolution because humans can't hear a difference. And we got there with audio because it's easier because there's less data.

And even when we get there with still video, then it'll be like, okay, but what about the camera that takes 8,000 different perspectives from all over the place? And what about the 3D engines and yada, yada, yada? So there's still a ways to go, but you can see it now. You can see that like, all right, if you could give me four times, eight times the resolution,

there's no need for more at that point. And then it's just about dynamic range and other issues, but not resolution. And resolution is mostly what's giving you the data-size issues. And even within resolution, like, poor Casey, sports are still in 1080p most of the time, which is sad. Sports! And yeah, in terms of streaming, we can mostly do it, except the internet is not made for broadcast the way radio waves and cable television were. So when everyone tries to watch Mike Tyson, there are problems. Yeah.

Yeah, I mean, I know I've said it 100 times. I'll say it again, that I think the holy grail for this would be, you know, live streaming sports. It would be just unreal, just truly incredible. But getting that much data, you know, quickly in real time down to a Vision Pro, I imagine is a large engineering hurdle, to say the least.

It would be upsetting to have a 180-degree immersive... I know they have those wire cameras flying over the football fields and everything, but a lot of sports, especially football, you do want to have that...

narrow field of view perspective so you can see the whole field sometimes. In fact, maybe most of the time because the view from the quarterback's perspective when he's about to get sacked, that's exciting but also kind of upsetting and doesn't really give you a view of the entire game. It's kind of like your F1 thing that you were talking about, Casey, where it's cool to see the driver's perspective, but what's really cool is to see 17 other perspectives, not to say...

oh, the whole race is going to be from the perspective of one driver because that would be incredibly fatiguing and you wouldn't have any sense of what's going on. And so sports really needs a hybrid approach. But again, Vision Pro is there for that because it's like, oh, you want five screens? I can put them wherever you want. I can put them any size you want. I can have whatever I want on them. I can have an immersive screen that when you stare at it and pinch your fingers, you jump into that screen and now it's immersive and you're seeing from that driver's perspective, like so many things are possible.

Yeah. I mean, again, I've said it before, but I'll quickly recap. When you're watching an F1 race, you have the main feed directly in front of your face on the equivalent of, like, a 70-inch TV, but you have two or three accessory feeds on either side of that. So you have, like, the in-car feeds from a couple of drivers. And then on the bottom, you have a 3D representation of what the racetrack looks like and where every driver is on the track. Yeah.

It is mind-blowingly cool. And that is another great way, you make a great point that I hadn't considered, that's another great way to get quote-unquote immersive sports. All of these are 2D rectangles, like the most basic version of video that you can get on a Vision Pro. But because...

the whole of them end up being immersive and they're in an immersive space. It's a different way of kind of sort of reaching the same goal. Now you can't turn your head and, and, you know, change the way the camera's looking, but you can choose which one of your screens you're looking at. It's like, you know, the prototypical, uh,

man cave sports dungeon-y thing where there's 14 TVs in the wall. Well, you can do that with 14 TVs in front of your face and strapped to your face in a way. And that's really, really cool. And I would love it. I would absolutely love it. I love it when F1 is live and I have the opportunity to use the Vision Pro to watch it. But that is another way of accomplishing the same thing.

I don't know. I'm happy that, as Marco said, that we're moving forward. You know, the sun is rising. Progress is being made. And that's nothing but a good thing.

We are sponsored this episode by Squarespace, the all-in-one website platform for entrepreneurs to stand out and succeed online. Whether you're just starting out or managing a growing brand, Squarespace makes it easy to create a beautiful website, engage with your audience, and sell anything from your products to your content to your time, all in one place and all on your terms. Squarespace makes it just so great to run your business.

They have an amazing design system. They've evolved this over the years and they're constantly making it better. This is design intelligence now from Squarespace, combining their two decades of industry-leading design expertise with cutting-edge AI to unlock your strongest creative potential. This empowers anyone to build a beautiful, more personalized website tailored to your unique needs and craft a bespoke digital identity that you can use across your entire online presence, all built into Squarespace.

And of course, when you're selling your products on Squarespace, this can be anything. This can be physical goods with inventory management and all these different integrations. My wife uses that to run her own site. She has a store site on Squarespace. I've seen this for myself. I've seen how it runs. And it's so easy. She never has to come to me. She's not a programmer, but she never has to come to me for help. She runs it all herself, all on Squarespace.

And all the payment support is easy. They support every kind of way people might want to pay, with popular methods now being added like Klarna, ACH Direct Debit, Apple Pay, Afterpay, Clearpay. And as these get added in the industry, Squarespace is always right there adding them also. So this is just a great way to run your business, whether it's physical goods, digital goods, memberships, downloads. Maybe you're a consultant and you can book

time slots, or maybe you're a trainer. There's all sorts of businesses you can run on Squarespace, and they make it all super easy, and it looks fantastic in the process. Go to squarespace.com to see for yourself with a free trial, and when you're ready to launch, go to squarespace.com slash ATP and you'll get 10% off your first purchase of a website or domain. Once again, squarespace.com slash ATP for 10% off your first purchase of a website or domain. Thank you so much to Squarespace for really just being awesome and for sponsoring our show.

Speaking of VR, Apple is allegedly working with Sony to bring PlayStation VR2 controller support to the Vision Pro. Reading from, I believe it was, Mark Gurman at Bloomberg: Apple is now working on a major effort to support third-party hand controllers in the device's visionOS software and has teamed up with Sony Group Corp. to make it happen.

Apple approached Sony earlier this year, and the duo agreed to work together on launching support for the PlayStation VR 2's hand controllers on the Vision Pro. Inside Sony, the work has been a months-long undertaking, I'm told, and Apple has discussed the plan with third-party developers, asking them if they'd integrate support into their games.

Apple doesn't have any imminent plans to launch its own controller, but the company's design team spent a few years prototyping what is essentially a wand for the Vision Pro. This would be more of an Apple pencil-like tool for precise control rather than gaming. As for supporting the PlayStation VR 2 controllers, Apple and Sony originally aimed to announce this capability weeks ago, but the rollout has been postponed.

One hiccup is that Sony doesn't currently sell VR hand controllers as a standalone accessory. The company would need to decouple the equipment from its own headset and kick off operations to produce and ship the accessory on its own. As part of the arrangement, Sony would sell the controllers at Apple's online retail stores, which already offer PS5 controllers. The move is meant primarily for games on the Vision Pro, but the company's also created support for navigating the device's operating system. The controller's thumbstick and directional pad could be used for scrolling, while the trigger button could replace a finger pinch when clicking on an item.

So on the one hand, you can see the arguments surely people are making inside Apple, which is why should we bother trying to make a controller? These gaming companies have been making controllers for decades. They're really good at it. They're making them anyway. They're going to make them with or without us. Why don't we just make sure our Macs, our iPads, our phones are compatible with Xbox controllers, PlayStation controllers, so on?

Hell, we'll sell the PlayStation controllers in the Apple store. Done and done. What a clever, good business thing we did. We didn't try to make a controller because we're probably bad at it. They're already making them. We support it. People already have these controllers because they already own a PlayStation or an Xbox. Problem solved. And this is just one more example of how Apple's approach to gaming is...

Let's say wrongheaded, misguided because it's,

The reason all those people make controllers is if you want to be remotely serious about gaming, you have to make and ship your own controller. You can't just say, oh, it's a third-party opportunity. Apple just does this forever. Third-party controllers for your phone, for your iPad, for your Mac. Like, just other people make them. We support them. Isn't that good enough? And the answer is no. If you bought a console and it didn't come with controllers and they said, oh, just buy them from a third party. Other people make them. They'd be like, what?

the hell? Is this a gaming console or is this not a gaming console? It was bad enough when they used to come with one controller, which is criminal, but, like, they still do. Like, Apple, you need to make controllers. So anyway: Vision Pro, no hand controllers. People are like, Apple doesn't believe in hand controllers. It's all going to be with your hands. You pinch your fingers together, you do gestures. You don't need controllers; they're cumbersome; no one's gonna... It's bad enough that you're putting a thing on your head; you have to be able to use it without them. And I agree with that. But also, from day zero of the Vision Pro, we were like, okay, but

what kind of games do people like to play in VR already before Apple introduced this product? And how would those games work on the Vision Pro with no hand controllers? And the answer is poorly. It doesn't mean there can't be good games without hand controllers, but that we know that there's a whole bunch of games that people already like that require hand controllers. And Apple's like, oh, you can find them somewhere. I mean, and they're doing the same thing here. Like, oh, well, you know, we'll team up with Sony. We'll make support for their controllers. They already did all this work. It's fine. It'll be...

It's so frustrating. It's like the, whatever, the Alan Kay quote: people who are really serious about software should make their own hardware. I forget, I may be reversing that, so someone please Google that. But anyway, it's like: if you're serious about gaming, you have to make your own controllers. And I agree, Apple will be terrible at it. There's a story, we'll probably get to it next episode, about Apple working on human input peripherals and how that has not been their strength for a long time. But

But, like, you gotta try, Apple. You can't sell the Apple TV with the stupid diving board remote and say, here's your game controller. No, that's not a game controller. Like, it's never gonna work. It's just killing me. I can't take it anymore. They're doing all this. They're putting amazing GPUs in. They have the game porting toolkits. They're doing all these things, and they're just like, but we're not gonna make a controller because...

I don't know. How do you guys feel about this? Well, I think if you look back at the history of how this has gone, the Apple TV, great example of this. The Apple TV has all of the hardware needed to be a fun game console, but it really hasn't stuck.

why didn't the Apple TV ever become a fun game console? All the computing hardware, you mean. Yes. And the problem, the reason it never became a game console, there's multiple factors for sure. But one of the biggest reasons, and there were different changes over time that affected this, but you couldn't require a game controller for a long time by policy. But how many people who have Apple TVs

bought extra game controllers for them? I did, because I'm a fool, but no one else did. So if you're making a game for the Apple TV, you cannot assume that almost any of your players will have a controller. So you have to design for the crap little diving board thing that it came with, which is very limited for what games could do with it. And that's it. It wasn't enough that it was possible for people to buy controllers. Since every Apple TV didn't come with a controller...

Effectively, zero of Apple TV's market would have one. That's what's going to happen here. The Vision Pro, I mean, look, there's so many ways that it's not a game platform. The one thing they have going for them is they haven't sold that many yet. Yeah, but like, suppose this goes through and all of a sudden you can buy Sony PSVR controllers separately that will work with the Vision Pro. How many people are going to do that? Like 10? Yeah.

And I'm not exaggerating. Like, actually, 10? We're talking about a fraction of a fraction of an already small market.

No developer is going to port a game to the Vision Pro that requires controllers. The Vision Pro is already a tiny market, and then you're saying, we're going to only target people, or have a game that pretty much only works, or only works well, with these add-ons. After you've spent four grand, you're going to also now spend another, whatever it would be, hundred bucks to get some VR controllers, and then...

be able to play our game? That's going to be almost no addressable market. Almost no one is actually going to do that. For a platform to have games that run on game controllers, the platform's hardware has to come with the controllers. If Apple TV started shipping with a game controller in the box, which they will never do... but if for some reason they would do that,

Then you would start having a lot more games on Apple TV that were designed for the controller and could therefore accommodate more game types pretty well. You look at iOS. iOS is as big of an addressable market as you can find. There are lots of third-party game controllers that work with iOS. And yet, no major iOS game requires a controller. Why?

Because effectively almost no one has them, even though it's a huge market. And by raw numbers, that's going to be way more people who have iOS devices plus game controllers than Apple TVs or Vision Pros. But because the iOS devices don't come with an official controller with everyone sold...

Game developers cannot count on there being a very large market that has that. So they have to design for touch mainly, and maybe you can also work with an external controller. So that's what we're going to see with Vision Pro. If anyone is making games for it, which they're not and they shouldn't, but if anyone's making games for it really, what you're going to keep seeing is, well, maybe you can use a controller once the support is there.

But you're not going to get anything shipping on the Vision Pro that is only good with a controller because it makes no sense for anyone to develop that. I mostly agree with that. I do find this idea to team up with Sony to be less...

frustrating than I think the two of you do. If you look at the situation, you know, based on rumors, uh, Apple hasn't sold a lot of vision pros. It doesn't seem like by and large, it's catching on as a productivity device with the possible exception of Mac virtual display, which we talked about, uh, I don't know, last week, the week before, um,

What is the purpose of the Vision Pro, other than to sit there and let movies wash over you? And I think the obvious answer could be to play games. But it's obvious as well that doing that with hand tracking just isn't cutting it. And if you're Apple and you want to solve this problem yesterday, then you team up with Sony and make a

pairing out of it and make it work; make the Vision Pro work with this PSVR2 controller. I think that makes perfect sense. Now, that doesn't mean that John is wrong by any means. I think in a perfect world, Apple would have already come out with their own controllers, and even if it was optional, it would have been a launch device with the Vision Pro. Similarly to the head strap that Belkin just came out with, which probably should have launched with the Vision Pro, probably should have even been in the box with the Vision Pro. But that's neither here nor there.

So at the end of the day, I'm not put off by this partnership, and I think it does make a lot of sense, probably for both companies. But I also concurrently agree with John that really, Apple should be solving this problem themselves. Yeah, they should have first-party controllers and also support third-party ones. Like, in that scenario where they ship a first-party one, it's getting to Marco's point: at least...

the software developers know everybody has a controller, and those people can choose not to use it because they don't like it; they want to use... I mean, there's third-party controllers available for every system, you know. Consoles, you can use third-party controllers. PC, you can use third-party controllers. And these days most PCs support all the controllers that work with consoles as well. Like, that's great; that's a great ecosystem. But you've got to come with one. It's like...

I was going to say, it's like a computer not coming with a keyboard and mouse, but the Mac mini does that. But anyway, game consoles come with a controller. The PSVR comes with controllers. Sometimes it's optional; I think some of the motion things have said, well, you get it with this kind of controller but not with that kind. But in the VR headsets, the Meta Quest, I believe, Marco can correct me, that comes with controllers, right? Yes, of course it does. Because here's the thing, like,

The way you design a game console or hardware meant to play games as its primary or one of its main functions...

It has to come with the controllers that it needs. That's what, of course, of course the Quest comes with controllers. They all, they all have for like, you know, the whole Quest is only like, I think the entry level one is like 300 bucks now, including the controllers. Like, of course it does. I think the single Xbox Elite 2 controller I have costs close to that much. Yeah, probably. It's not a headset. It's just a controller. Trying to make the Vision Pro have gaming as one of its, you know, significant uses is,

I think is never going to work, because the Vision Pro is not a good game console, even if it came with actual game controllers in the box, which, if they want to be serious about it, it needs to. But even if it did, the Vision Pro is, first of all, way too expensive to be a game console.

It's also way too heavy. You've got to get rid of the dangling battery. Because what games are popular in VR? Most of them are motion games where you're moving. You have things like the rhythm games, like Beat Saber and whatever the version of it is for visionOS. You have rhythm games. You have kind of virtual shooting games where you're turning around constantly. You have stuff like Gorilla Tag, which is massively popular on Quest, where you're being physical. You're physically moving around in the headset.

You have the wonderful ping pong game, Eleven Table Tennis, where you're physically playing ping pong. You have games where you're moving a lot. The Vision Pro is so not made for motion on so many levels. It's terribly physically designed for it. The screens have too much motion blur and have too much latency for that. I mean pass-through latency. There's so much about the Vision Pro that is...

clearly not designed for the kind of games that people enjoy on the Quest series right now. So this whole idea of trying to make game controllers compatible with it is not gonna go anywhere. I'm very optimistic, if they get good content out there, like video and sports and experiential content. The Vision Pro is designed for that kind of thing. It is not designed for motion and games.

And you know, Apple will, you know, hinder itself even further. Because even, like, a kind of grassroots, behind-the-scenes thing, where some developer makes a really compelling game, and they're like, oh, this is a really good game for Vision Pro, and now the $1,500 one is out, and I know Apple doesn't ship controllers, but you can use this third-party controller, and, like,

it's just so popular that it gains momentum all on its own, and despite Apple, it starts to make the Vision Pro into a gaming platform. But that will never happen. You know why? Because Apple, which controls all software that ships on the Vision Pro, will say, oh, you can't ship this game that requires a controller. Sorry, rejected from the App Store. They will never even allow someone to, like, help them. Like, help me help you. And Apple's like, no, you cannot help us; your game has to support the Apple TV remote. Right? I know they went back on that one, but it was too late. But, like, you just know they would

never allow a game on the Vision Pro that requires a controller, because they'd be like, oh, that's not the way we think about our platform. It's like, well, then we can't help you. You're never gonna... You won't ship your own controllers, you won't let people ship games that require them. It's just...

they cannot get out of their own way. Because if it was like if there was sideloading or if there was third-party stuff or if the EU decided that the Vision Pro was too dominant in the market and you had to allow third-party... Because if people can do whatever they want on it, you can get that kind of like...

grassroots phenomenon, surprise viral hit that makes people go out and buy the controller. But with Apple being the gatekeeper for these platforms, it'll never happen on the Apple TV, and it'll probably never happen on the Vision Pro, because there's no room for a third party to do a thing that will definitely be janky and low-interest at first but might catch on. Because Apple will just be like, no, that's not... that's not how they would like our platform to behave.
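For reference on the "they went back on that one" aside: tvOS did eventually let games declare that a physical controller is mandatory, via a single Info.plist key from the Game Controller framework. A minimal fragment, with the surrounding plist structure assumed:

```
<!-- Info.plist: declare that this tvOS game requires a physical game controller
     (rather than supporting the Siri Remote) -->
<key>GCRequiresControllerUserInteraction</key>
<true/>
```

There is also a companion key, GCSupportsControllerUserInteraction, for games that merely support controllers without requiring one.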

Thanks to our sponsors this week, Squarespace, Aura Frames, and Masterclass. And thanks to our members who support us directly. You can join us at atp.fm slash join. One of the perks of membership is ATP Overtime, a bonus topic every week.

This time in Overtime, we're talking about the Mac monitor situation. There's been a number of recent hardware releases from other companies for Mac-appropriate monitors, and we're going to talk about that in Overtime. You can join to listen: atp.fm slash join. Thanks for listening, everybody, and we'll talk to you next week.

Now the show is over. They didn't even mean to begin. Because it was accidental. Oh, it was accidental. John didn't do any research. Marco and Casey wouldn't let him. Because it was accidental. It was accidental. And you can find the show notes at ATP.FM.

And if you're into Mastodon, you can follow them at C-A-S-E-Y-L-I-S-S, so that's Casey Liss, M-A-R-C-O-A-R-M-E-N-T, Marco Arment, S-I-R-A-C-U-S-A, Siracusa. It's accidental. They didn't mean to. Accidental. Tech podcast so long.

All right, John, you want to update us a little more specifically about your app? Yeah. So, you know, I'm here plugging away. I guess the main thing I probably want to talk about today is, I mean, something near and dear to Casey's heart. We were always complaining about poor documentation with Apple stuff. Oh, yes. Yes.

Yes, baby, I'm here. So here's the thing. Like, poor documentation, you know, the lack of documentation, there doesn't seem to be a lot of it. Some things have like a one or two sentence description.

That is frustrating, but like you really don't feel the full force of the pain until you are in an area that you have no knowledge or experience of. So if you're in some area that you know about, like say I'm doing app kit stuff, which I know a little bit about from my other two apps, right? I know a whole bunch of app kit stuff and there's some stuff I don't know, but like

I need to, like, fill in the blanks. You know, it's like the fog of war on, you know, RTS maps: fill in the areas that I don't know, where I know a bunch of the surrounding stuff. And then you go and find some cruddy documentation, but because you know a whole bunch of the surrounding stuff, you're like, oh, I see what they're probably getting at here, and I see how this is related to that thing, and you can put the pieces together.

But for what I'm doing in this app, what I've been spending some time doing, you know, in-app purchase type stuff, I've never done that on any Apple platform ever, ever before. So I have zero knowledge other than watching WWDC videos about it. And I've watched a lot of WWDC videos, but there's nothing like actually programming it, which is actually part of the problem here. So...

In that situation, when you're just, you just got nothing, you're starting from zero and you land on one of those documentation pages that doesn't have anything except for like a single sentence that uses like five proper nouns that you don't know the definitions of and there's nothing else. You're just like,

What? I don't, you know, and so here I'm in this situation and I, you know, I have watched just so many hours of WWDC videos about APIs that I have never used, right? I have a lot of this knowledge in my head, but they're WWDC videos for all, for as wonderful as they are, they really are kind of like math class in school where if you have kids or if you've been a kid and remember math class, if you like don't understand a concept,

And you never learn it. That's going to be a big problem, because next year they're going to talk to you as if you already know elementary algebra. They're not going to reteach you algebra. They're going to assume everyone here knows algebra. Or, like, multiplication tables: they're going to assume you know your multiplication tables, you know how to do addition, subtraction, multiplication, division. Like, they're not going to reteach that.

If you miss one or two things, when you watch or when you attend, like, you know, the junior year or high school class and they start teaching you calculus, if you don't know, like, pretty much everything that came before it, you have a serious problem. They like to say that math is cumulative. You really need, you can't, like, forget the old stuff. They build on what you knew before.

Many WWDC videos are like that. Not all of them, but many of them. Like, they'll say, What's New in StoreKit 2. They assume you already know what was old in StoreKit 2, and they're just going to tell you about the new stuff. It's an incremental update. And so you're like, okay, but that's not all the videos. What you should do is go and find the WWDC 2021 or whatever video that's introducing StoreKit 2.

Here's the problem with this approach. You know, Apple spends a lot of money on WWDC. The production values are great. The people presenting them are great. I think the presentations are great. But you go find Introducing StoreKit 2, and you watch that video. If there's a conceptual part about, like, big picture, what StoreKit is about, hopefully that's still relevant.

But with the current rate of development and the transition to SwiftUI, almost all the code in that presentation is not what you want to be doing today. Because, you know, in future years there'll be a session called What's New in StoreKit 2 and SwiftUI or something.

And those are the APIs you want to be using because those are the new ones that work really well with SwiftUI. The old ones, you could get it done. So the intro course is telling you a way to do it that you don't want to do it. And the new course...

expects you to already have experience with the old API, but you have neither. And like, I'm going through this and I'm thinking as I stumble my way in the dark through this, just making mistake after mistake, as I stumble my way through this, I start fantasizing about writing documentation. Like I'm already, I've already like written it in my head based on my like current, what is surely my current misunderstanding. It's like,

I could explain this to somebody who has zero knowledge. Granted, my information is probably wrong at this point, but I could explain my wrong information and say, look, here's the problem you're facing. Here's the things you're going to run into. Like, here's how you have to think about it. Here's a way to arrange it, stuff like that. Right. And as far as I've been able to determine that kind of documentation almost doesn't exist at all anywhere in the Apple ecosystem, like first party.

It provides a third party opportunity for people to do that. But even the third party ones, because things are changing so rapidly in particular, because so many APIs like predate SwiftUI and then work with SwiftUI, but just barely. And then there's the new ones that are like made in an age where SwiftUI is the expected default and they don't work anywhere else. Like if you make that tutorial two years ago, maybe it's out of date now. It's extremely frustrating. And so one of the things that Apple usually is okay about is providing the sample code. Although they do this weird thing where,

the sample code, from year to year they'll keep enhancing the same app, like the Food Truck app or the Backyard Birds app or whatever. And the download links to the sample code will be like, download the Food Truck app. But I think they only have one copy of the Food Truck app, and it's the current version. So if you download it from, like, a video from four years ago, you don't get the four-years-ago Food Truck; you get the current one. Which in some ways is good, because it's more updated, but in some ways it's bad, because the app doesn't match the video that you're watching.

But anyway, my final complaint on this is, you know, again, about WWDC videos. I feel like WWDC videos are, they're not like movie trailers, but they're like... they were great for me before I was programming with these APIs. Do you just want to get an overview of what Apple's doing with the APIs and what they're capable of? These are the videos for you. But if you actually have to implement an app,

man, these videos, like, they're not going to give you what you need. To give just one example that drives me batty: anytime in a WWDC video, and again, I understand why they do this, anytime in a WWDC video where they say, you know, we're going to do X, Y, and Z, and they say they've omitted error checking for brevity. That is useless to me. Error checking is the most important part of the program.

I've omitted error checking? Are you freaking kidding me? All I want to know is: where can this go wrong? What do I have to check for, and where, so I can tell that the transaction is valid? Do I have to do it? Do I not have to do it? What are the possible error scenarios? Where do I... Like, "I've omitted error checking for brevity"? I know you have to fit it on a slide, but that is... just throw it out the window. Like, it's useless. It's only useful to me as a casual dilettante viewer of, like, I'm not writing an app, but I just want to learn about APIs. Isn't this great? But when it's time to write an app...

the code is all error checking. There's one line that does something in 75 lines of error checking, especially with something like in-app purchase. I just wish... and this made me think, this is what labs are for. Like, if I'm still doing this at WWDC, I should go to a lab, and I should just come with the WWDC session and say: explain this stupid app to me. Explain Backyard Birds. Why are you doing this? What does this API do? What in the world? I know.
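To illustrate the "one line of work, 75 lines of error checking" complaint, here is a hedged sketch of what a StoreKit 2 purchase handler tends to look like. The API calls (`Product.purchase()`, `VerificationResult`, `transaction.finish()`) are real StoreKit 2; the surrounding structure and print statements are purely illustrative.

```swift
import StoreKit

// Sketch: the "all error checking" reality of a StoreKit 2 purchase.
// One line does the work; everything else decides what went wrong.
func buy(_ product: Product) async {
    do {
        let result = try await product.purchase()
        switch result {
        case .success(let verification):
            switch verification {
            case .verified(let transaction):
                // Unlock content here, then tell the store we handled it.
                await transaction.finish()
            case .unverified(_, let error):
                print("Purchase could not be verified: \(error)")
            }
        case .userCancelled:
            break // not an error; don't nag the user
        case .pending:
            print("Awaiting approval (e.g. Ask to Buy)")
        @unknown default:
            break
        }
    } catch {
        print("Purchase failed: \(error)")
    }
}
```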

It took me like a full day to figure out they were calling map dot map on something. And it wasn't, you know, the array dot map. It wasn't that map. It was a different map. It does basically the same thing, but it was literally like a different signature, a different

freaking function because it wasn't an array. You can't tell that by looking at it because it's not Perl with an at sign in front of it. Oh, relax. I'm like, and then so the documentation for that map function was a single sentence that I can't make heads or tails of. Single sentence. And it's in their sample app and they gloss over it.

It's just loosely incorrect. And surprisingly, the thing that I thought would be tripping me up so far, I mean, I'll get to it, but so far hasn't been, which is: oh, how do you test transactions, and, like, pretending to purchase things, and blah, blah, blah. That actually, because I waited, you know, eight or nine years or whatever to do my first in-app purchase, that's actually pretty good. And, I'll say this, that's better than I thought it would be. And I haven't had problems yet,

partially because I'm still just doing it locally, which is a feature they introduced two or three years ago, the local StoreKit configuration thing. Love it. Pretty easy. Mostly works. It doesn't really work, you can't really... Hey, you can do tech support for me now: is there a way to cancel a subscription in the local Xcode thing? Yeah, there's a special bespoke window for it in Xcode. Hold on, I've got to open it up. I found that window. How do you cancel? Right-clicking doesn't do anything. They do have a thing where you can click and it says show options, and then it has...

You mean, like, the transaction list? Yeah. In the transaction list, I thought there was a way to do it, but now I'm having second thoughts about cancellation. But I go to the options list, and when you click options on it, it brings up a thing. There is a cancel option, but those are, like, transactions that already happened. And when I cancel, um,

uh, my app doesn't see that. Like, it doesn't, you know, hit one of the update handlers or whatever. So anyway, when I try it with the sandbox, I'm sure it'll be easier, because I'm assuming I can cancel it in App Store Connect. Because basically what I'm trying to do is simulate a user canceling: they go to their subscriptions in their iCloud whatever, blah blah, and they hit cancel. That's what I want to simulate from Xcode, and I haven't figured out how to do it yet. But that's a minor complaint. So I thought that would be the big problem.

But they actually have pretty good options for, like, you know, renew every 30 seconds, each 30 seconds is one month, or you can easily delete the transactions to reset the world. Having no problem with that. Having massive problems being like: where do I put the stuff in my app to make sure I'm doing all the things that I need to do for all these freaking transactions? And all the types, where just the type signature makes the line go over 100 characters by itself. It's like, you know, what is it? Yeah.

VerificationResult, with angle brackets around it all, plus the type that you put inside, but that one inside has a question mark. And you have to, like, massage that value through 17 different things to figure out if it's verified and extract the real value with an if case let. And then it's just like, oh my God, who made this API? Who made it? Like, it's so good in so many ways, and then, just when you get down to where the rubber meets the road, where you have to figure out, do they actually own this?

It's like, guess what? Angle brackets for days. No documentation. Good luck. Oh, and by the way, there's the subscription update task thing to monitor for updates to subscriptions. Is there an equivalent for things that aren't subscriptions? No, that one takes a single product ID, because screw you.
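For anyone following along at home, the unwrapping dance being described looks roughly like this. The `VerificationResult` enum and its two cases are real StoreKit 2; the helper function itself is just an illustrative sketch.

```swift
import StoreKit

// Sketch: extracting the real Transaction from StoreKit 2's
// VerificationResult<Transaction> wrapper before trusting it.
func transaction(from result: VerificationResult<Transaction>) -> Transaction? {
    switch result {
    case .verified(let transaction):
        return transaction          // StoreKit already checked the signature
    case .unverified(_, let error):
        print("Failed verification: \(error)")
        return nil                  // treat as not owned
    }
}
```

The same unwrap can be spelled `if case .verified(let transaction) = result`, which is the "if case let" John mentions.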

I wasted at least 15 minutes trying to figure out how to programmatically apply multiple of the same view modifier with different arguments. And the type system did not like what I was doing. And I gave up. No, no, no, no, no, no. I gave up. I felt like I was so close. Like I can smell it. I can use reduce. I can make it happen. It's like, nope. Type system says no. Type system says no. And I was like, nope, this is beyond my Swift skills. And I just bailed out.
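For the curious, one common (if inelegant) way around this particular "type system says no" problem is type erasure. This is a hypothetical sketch, not what John's app does:

```swift
import SwiftUI

// Sketch: applying the same modifier repeatedly with different arguments.
// Erasing to AnyView gives every iteration of reduce the same type,
// at the cost of losing SwiftUI's structural identity (and some performance).
let radii: [CGFloat] = [2, 4, 8]
let stacked = radii.reduce(AnyView(Text("Hello"))) { partial, radius in
    AnyView(partial.shadow(radius: radius))
}
```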

Anyway, StoreKit. Yeah. And so the funny thing is, you're doing StoreKit 2. StoreKit 2 is light-years better than StoreKit 1. I know. I saw the old SK APIs, and I'm like, boy, I'm glad I'm not doing that, because working that with SwiftUI... first of all, that will never work with Swift 6. Never. Like, it would be so angry at you about everything you're doing.

I don't think there is a way to cancel, by the way, within this little transactions dialog. I apologize. There's a thing you can do to cancel that makes a transaction disappear, but the point is, it doesn't have the effect that a user canceling their subscription in iCloud would.

Yeah. But with regard to documentation, I mean, I could go on and on and on about this. Let me start by saying it actually has gotten a lot better over the last few years. I doubt I had anything to do with that, but I certainly, it was late 2020 that I wrote my blog post about this, which I stand by pretty much. I mean, I think it has gotten better, but it is still not great. And, you know, the name of this blog post was on Apple's piss poor documentation. Yeah.

Which, yeah, pretty much. And I think there are two things that really chap my bottom about Apple's documentation these days. First of all, when you have, what is it, "no overview" something something... No Overview Available, that's what it is. You see that less these days, which is great.

But a lot of times what you'll see is just like a regurgitation of what the function signature is and very, very few other words. And this is like what you were describing, John, with like, here's a single sentence with a bunch of proper nouns that I'm not familiar with. Also not helpful. Yeah.

And that really is frustrating. Really, really frustrating. Or I love seeing, like, enumerations where each of the cases is just a restatement of the case. Like: yes, no, maybe. Like the comments that add one to i. Right. It's just bananas. But the other thing that I find deeply frustrating, and Apple has an affordance for this in their documentation, but a lot of times there's not a really great

overview of: here's this system, here's how it works. Or, secondarily, here's the overview, which sometimes does exist, but here's how you actually execute, right? So take StoreKit, for example. They have a pretty great overview of: here's all the things you need to consider, and here's what this does and that does, and subscriptions and non-subscriptions, and this, that, and the other thing. But

They don't do a particularly good job at all of really explaining why the API is as kooky as it is. Because StoreKit 2's API is actually really good. It's actually not that kooky. But when you want to figure out, is this app purchased? Yes or no?

In order to figure that out, Apple's basically like, you got to look at like 300 different arrays of information and I don't know, piece it together yourself, have fun. And I understand why that is. But first of all, it would be nice if Apple kind of solved this for us. But second of all,

There's like maybe a couple of sentences that explain why this is the way this is, and that's it. And what I really want is a deep dive about, okay, here's the reason this is the way it is. You might have family sharing or you might not. You might be entitled via family sharing or you might not. You might be entitled via family sharing and your own entitlement or you might not. You might be entitled via family sharing something else and your own entitlement or you might not. Or the family sharing has...

has expired, but your own entitlement hasn't, or vice versa. And so suddenly you realize, oh no, I really do need to do the work of traversing all of these arrays of subscriptions and information and whatnot. But you come upon that, oftentimes, by users reporting in to you: your app doesn't say it's purchased, but here's a receipt showing I purchased it.
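The traversal Casey is describing looks roughly like this in StoreKit 2. `Transaction.currentEntitlements`, `revocationDate`, and `expirationDate` are real API; the product IDs and the "what counts as owned" policy are placeholders, and real family-sharing logic has more wrinkles than this sketch shows.

```swift
import StoreKit

// Sketch: answering "is this app purchased?" by walking the
// current entitlements. Refunds, revocations, and subscription
// expiry all surface here, which is why the traversal is unavoidable.
func isUnlocked(byAnyOf productIDs: Set<String>) async -> Bool {
    for await result in Transaction.currentEntitlements {
        guard case .verified(let transaction) = result else { continue }
        guard productIDs.contains(transaction.productID) else { continue }
        // Skip refunded/revoked purchases and expired subscriptions.
        if transaction.revocationDate != nil { continue }
        if let expiration = transaction.expirationDate, expiration < .now { continue }
        return true
    }
    return false
}
```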

And that's not frigging helpful, Apple. That's too late. It's too late at that point. And that just drives me bananas. Another great example of this is I started adding widget and intent support to Call Sheet. It hasn't shipped yet. But one of the things you can do is with widgets, I'm pretty sure it's widgets, you can communicate between the widget and the main app. And

And there's not really a good overview page of your different techniques for doing this. And I pieced together that, oh, you can do, like, URL schemes, but it has to be a secure URL; it can't be, like, an x-callback-style URL. And I forget, there was one other... oh, you can put things in user defaults, but even Apple seems to think, like, I wouldn't recommend that; avoid it if you can. And I thought that's all there was. And

And come to find out, there's actually, you can flip a switch. There's a Boolean somewhere in your widget where you can say, no, I need you to open the app when the widget is interacted with. And at that point, you're effectively in the app's, you know,

and you can do kind of whatever you want. But there's no good overview of, okay, here are all of the available options, and here's why you would choose each one. It's not useless, but it's almost useless. And in some ways, it's worse than useless because they give you just a little nugget of information and then basically say...
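One of those options, the URL route Casey mentions, looks something like this. The `widgetURL(_:)` and `onOpenURL` modifiers are real WidgetKit/SwiftUI API; the URL scheme and view names here are invented for illustration.

```swift
import SwiftUI
import WidgetKit

// Sketch: widget-to-app communication via a deep link.
// In the widget's view, attach the URL the app should be opened with:
struct SheetWidgetView: View {
    var body: some View {
        Text("Open show #42")
            .widgetURL(URL(string: "callsheet://show/42")) // hypothetical scheme
    }
}

// In the main app, catch the link with onOpenURL:
struct MainView: View {
    var body: some View {
        Text("Main app")
            .onOpenURL { url in
                print("Widget asked us to open \(url)")
            }
    }
}
```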

Have fun. You'll figure it out. You're smart, right? And then when you search for, like, third-party stuff, you'll find lots of really good posts that do the kind of documentation I'm fantasizing about. Like: let's start from the beginning. Let me tell you the problem. Let me tell you the solutions. Let me tell you the different approaches, like you said, and when you would use them, or whatever. But the problem is, that blog post was written four years ago, and now it's all out of date, and it doesn't talk about the APIs you want. And maybe it's wrong. Right. And I just, I want to give credit here: if the StoreKit people are listening, StoreKit 2 is so much better.

Like, I can see that. I've seen all the WWDC sessions, especially for using it with SwiftUI. It really does do a huge amount of work. currentEntitlements does a lot of the work for you, like, a huge amount. The fact that, you know, products are separate from subscriptions is still a little janky, even though they're both represented by products; there's weirdnesses there, I get it. But it has come such a long way, and that's why it's all the more frustrating that, like,

I'm sure there's a correct way to use these tools. They seem powerful. They seem way more powerful than the things they're replacing. They do a lot of stuff in a small amount of space, but there's one or two things that I'm missing that I don't understand. And especially like Swift 6 really hasn't been much of a problem, but it's a little bit of a problem in that smuggling data between the various islands is

especially when you're stuck in some kind of non-async SwiftUI thing where you can't even await something from an actor. You end up smuggling a lot of stuff through environment. Apple's example smuggles stuff through the environment. So I'm like, is this just the way? Because like, what's the best practice? And they're like, yeah, we're totally going to smuggle. Like, do we have an actor doing a bunch of store stuff? And that actor is like,

invoked asynchronously from a view modifier, and that view modifier shoves things into the environment, and then other views see it because you shoved it into the environment and they have the @Environment. It's like, all right, I mean, is that it? Like, am I... Because I have a lot of data that I want to move around, and now I have, like, six environment variables, and I'm like, maybe I'm doing it wrong, and it's just...
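The "smuggling through the environment" pattern John describes sketches out something like this. All the names here are invented; this mirrors the general shape of Apple's sample structure, not John's actual code.

```swift
import SwiftUI

// Hypothetical actor owning store state.
actor StoreModel {
    func loadOwnedProductIDs() async -> Set<String> {
        // ...talk to StoreKit here...
        return ["com.example.pro"] // placeholder
    }
}

// Custom environment key used to "smuggle" the result to distant views.
private struct OwnedProductsKey: EnvironmentKey {
    static let defaultValue: Set<String> = []
}

extension EnvironmentValues {
    var ownedProducts: Set<String> {
        get { self[OwnedProductsKey.self] }
        set { self[OwnedProductsKey.self] = newValue }
    }
}

// Modifier that asynchronously asks the actor, then publishes downward.
struct StoreStatusModifier: ViewModifier {
    let model: StoreModel
    @State private var owned: Set<String> = []

    func body(content: Content) -> some View {
        content
            .environment(\.ownedProducts, owned)
            .task { owned = await model.loadOwnedProductIDs() }
    }
}

// A distant view reads it back out of the environment.
struct PaywallBadge: View {
    @Environment(\.ownedProducts) private var owned
    var body: some View {
        Text(owned.isEmpty ? "Locked" : "Unlocked")
    }
}
```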

Anyway, I'll put a link in the show notes. I just put it in the document, Casey, in the link section, for the map method. The documentation for this map method says the method is map, open paren, underscore colon, close paren. And it says: "Returns a new state, mapping the entitlement value if successful." Thanks. That's helpful.

It maps a new value. Oh, mapping the entitlement value. That's great. It returns an EntitlementTaskState, angle brackets, NewValue. So the value... like, I kind of see how they're using it in their sample app. I'm using it in my app. I kind of understand what it's doing. But when it came time for me to do something similar, not in subscriptions, but with, like, the entitlement part, for trying out, like, non-subscription purchase-type things, I

I just was baffled, and I was like, no, I'm just not going to... like, is this just a convenience method? I'm just going to, like... Anyway, I long for a first-party thing that just explains it from start to finish, because you can't do it in a WWDC session. It's not long enough; that's not what it's for. Like, I'm not saying WWDC sessions should be different. I'm saying there needs to be something to augment them. And I do appreciate all the third-party stuff out there. Hacking with Swift, for, like, basic Swift stuff.

And some SwiftUI stuff. Those things are great. There's tons of people writing blog posts about it. The Swift forums, like, you can find stuff. But the problem with the internet is, you're just going to find so much out-of-date stuff. I have been using, like, ChatGPT to try that, but, like, ChatGPT doesn't know about Swift 6. It thinks it knows about Swift 6, but it does not.

It absolutely does not. So it'll just give you this code that's just not going to run. I've been trying Claude to see what it can make. They will give you code that doesn't work for, just, days, right? Code that will never work. APIs that do not exist. They can be helpful sometimes, but, A, they're just chronically out of date, because they have to be, because it takes so long to make these models that by the time you get them, they don't have WWDC 2024 info in there. It's just...

And then, B, yeah, they'll just make up something. And it's a shame. Like, I use it because, look, you're going to try to run the code anyway; as I said before, it's one of the ideal cases for this. I do. If I had to pick the best new programming tool of 2024, it's these AI things.

but they are, in their own way, extremely frustrating, and they absolutely do not help for things like: I need a human being to conceptually explain to me all the moving parts. Because there's not 8,000 different ways to do this, I would imagine. The people who made StoreKit have an idea. Say you have an app that wants to use every single feature that StoreKit has. You have...

you know, consumables, subscriptions... that's what their, you know, Backyard Birds app is trying to do: everything that StoreKit can do, this does all of it. And if you're going to do all of them, this is how you should arrange your junk. But, like, I don't blame the Backyard Birds sample app. It does do all those things, but it is kind of, like, strange and idiosyncratic, and it doesn't do

every single possible thing you can do. In some ways it's limited, and there are some weird choices made by, like, the programmer who just really likes using enums to smuggle information around. There's, like, an enum where seven of the cases are product IDs and one of the cases is the group ID. Like, I guess you just avoid the group ID, because you know that one's called "group" and it's not a product ID? And it's like, well, it's convenient when we throw... and also, one of the states is "not subscribed," and we're going to... it's just,

what are you doing? Like, you know, every programmer has got their own quirks, right? And it's fine to be quirky, but when you're making, like, the canonical sample code for how to use StoreKit, maybe rein in those quirks and just be like: I'm not going to do anything weird or make any weird decisions. It's just going to be, here's how you use the APIs. Um,

It should be incredibly well commented, which it is not. And it should have all the error checking, like, all of it. Because there's so many theories. You go on, you Google for it, and it's like: I think you should do this, and I think you should do that, and you have to check for this, but you don't have to check for that. But now you have to check this, and you check the revocation date, but if it gets through here, it's already verified, but you have to verify it yourself. Oh, StoreKit 2 does the verification, and the old StoreKit didn't, but now you still need to verify. I just...

It's just exhausting. So anyway, the frustrating thing, obviously, is that this has nothing to do with the functionality of my app. It's all just about the monetization. And it's not the only thing I'm doing, really. I'm spending more time on UI than I am on this, but UI, you know, just is what it is. It's a slog, because you don't quite know the best way to make it, and stuff like that. And yes, I'm also working on the engine. Like, I'm doing all the parts. But I just wanted to rant about StoreKit in particular, because it was what I was working on today. It's brutal. And I can't stress enough that

it is light-years better than StoreKit 1. I know I've said it two or three times. It totally is. I want to say it two or three more times: it is so much better than StoreKit 1. But it is still a lot.

And some of that, maybe even most of that, is the domain that they're trying to cover because there's so many gotchas and what-ifs and this and that and the other thing. But again, it would be so much better if there was better documentation that walked you through, okay, here's the plain vanilla version where you have a single...

in-app purchase that's unlocking your app. Okay, now what if you had a single subscription that unlocks your app? Okay, now what if you had multiple different in-app purchases that could unlock your app, and multiple different subscriptions? Well, okay, now let's introduce consumables, you know, and build...

step-by-step over all of these different problem domains, if you will. But there's just not that. And from what I can tell, there does not seem to be an institutional value placed on documentation. From what I've gathered, internally or externally, documentation just does not matter the way I think it should. And it's really unfortunate, because Apple, at its finest, really wants to, and empowers,

us, the developers, to make world-class apps. And this is flying directly in the face of that, right? Like, if they want Marco and John and me to make the best apps we possibly can, then they need to document their APIs in the best way they know how. And I don't think that's the way it is today. Yeah, I feel like that's one of the goals of StoreKit 2: they basically said, people are using StoreKit 1 wrong, probably because we didn't explain it well enough and there's not good docs. So in StoreKit 2, we're...

we're going to do a lot of the work for them, because we know how the API is supposed to work. So we're going to take away a lot of that. That's one of the things that's great about it. Like, inside this API, it's doing a bunch of stuff because, basically, from Apple's perspective, you third-party developers proved you can't do this right. Yeah.

Like, on average, you can't do it, right? So we're going to do it for you, and then we'll just give you a response. But still, the response they give you, it's like: oh, and by the way, for the things we give you, you have to implement a bunch of business logic here, and it's super important you do it exactly right, otherwise you're going to screw things up, but we'll only kind of vaguely explain it. Again, the Backyard Birds app, they go through all the things you said, Casey, but it is still, in the end, just one app, and if you watch that session,

you cannot come away from that session and say, now I'm ready to implement my app. You're not. You're absolutely not. Like, even if you memorized every line of code, or you copy and pasted the code samples, they have them in the transcripts... that's not sufficient. I know, I just did it. Like, that information... I have the Backyard Birds app open in Xcode next to me.

That is not sufficient for you to do even the most basic thing. Like what, like just a single thing that unlocks your app. There are still so many more things that you have to know and think about to actually make that work because the code snippets they give you and like the backyard birds app itself is so weird and,

differently structured than your app would be that you can't just use it as a one-to-one thing. And the WWDC session just glosses over so much stuff. Like, it gives you an outline of what's possible, but when it comes time for you to type stuff in, you're just like, uh... Like, it took me so long to figure out, just banging my head against it and just randomly Googling to find all the other people who have hit the same roadblocks and, you know, discovered things like, you know,

non-renewing subscriptions: Apple doesn't track anything having to do with those, so that's all you. All right? I know they do it for all the other ones, but those ones, just... yeah. Like, okay, it would have been nice if that was in the documentation in big red letters: hey, if you do this, everything's on you, unlike subscriptions, where we keep track of it all.

But, you know, that's development for you, right? You know, it sounds like I'm being fresh, and I'm actually enjoying the fun parts of it, like the parts that actually make my app work. And I do enjoy the UI stuff a little bit, although I have a separate rant about that for another time. But, yeah, working on the app. So, this is probably not a good time to suggest this, but I was thinking...

If you are making an app that's going to, you know, crawl people's files on their disks and everything and take advantage of the knowledge that you have and the, you know, things you care about, why not make an app that scans your files and keeps hashes in a database and periodically can rescan to detect bit rot?

People have already made that program. What is it? There's a whole bunch of ones that will do that. Yeah, they'll put little checksum files in your directories. Like, what I've gathered from the discussions I've had about that thing is...

Is that people who, like... Carbon Copy Cloner does it, actually. If, instead of using SuperDuper, you use Carbon Copy Cloner, they have an option to put checksum files in there, and every time you do a copy, it re-checksums everything. Or you can do it manually. But what people have told me who do this type of thing is that they're super diligent about it. They do it, they have a program that does it, they have a third-party program, whatever. And they say it just never finds any errors.

And I don't know if that means those programs aren't working right, or if it means the modern storage stack with SSDs is so good that you're very unlikely to find those type of things. And if you just keep the data moving from one fresh SSD to another, it won't be a problem. But yeah, that's, you know, it's something I have considered. But one of the things that bothers me is

You know, the map is not the territory. You can put the checksum files in every directory, but then you're putting turds all over the disk, and there's all sorts of issues with that. You can keep a central database, but then your central database diverges instantaneously from the disk that it is databasing, and you're constantly chasing it around to try to keep the database up to date with the state of the disk. It's the type of thing where, like, my principles and tastes say this should be, it has to be, implemented in the file system. Many file systems do implement it;

APFS does not. And so I just sit around here with my arms folded saying, well, you should eventually do this in the file system. Because it's, like, wrong to do it anywhere else. Like, you can, but it's just wrong. Put it in the file system. ZFS showed you how. This is a thing that exists. Just do it well. But even if it was in the file system, wouldn't the file system not know until it tries to read the file?

Yeah, but like you don't know until you try to scan the file. It's the same thing. Like you have to, you can't get around the fact that to tell whether the bits are right, you have to read them.

Sure, but... There's no getting around that. But wouldn't that still then leave an opportunity for something that would scan periodically and alert you? Yeah, but the thing that scans has to read all the bits to find out if they're still right. Yes. That's a tremendously heavyweight operation. The thing that ZFS did was, optionally, you could store data redundantly. Right?

Right. And so the thing, when it was scanning in the background, just crawling your entire disk, like painting the Golden Gate Bridge, from start to finish, and as soon as it got done, it just starts over again with a low-priority thread, is it would fix the things that it found,

because you have redundant copies of the data and it would repair them and it would alert you if there were problems. I just feel like it's a file system thing. Maybe I'll add it to my list. If this goes disastrously bad, maybe I won't. But if not, again, there's already apps that do this. But I don't like either of the approaches. I don't like the turds in the directory approach and I don't like the central database approach because both of them are just like, bleh.
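For a flavor of the "central database" approach being dismissed here, the scanning half is conceptually simple. This is a sketch with invented names, using CryptoKit (Apple platforms), and it makes no attempt to solve the staleness problem John objects to.

```swift
import Foundation
import CryptoKit

// Sketch: record a SHA-256 per file, then rescan later and compare.
// A file whose contents changed while its modification date did not
// is a bit-rot suspect.
struct FileRecord: Codable {
    let checksum: String
    let modified: Date
}

func scan(root: URL) throws -> [String: FileRecord] {
    var records: [String: FileRecord] = [:]
    let keys: [URLResourceKey] = [.isRegularFileKey, .contentModificationDateKey]
    let walker = FileManager.default.enumerator(at: root, includingPropertiesForKeys: keys)!
    for case let url as URL in walker {
        let values = try url.resourceValues(forKeys: Set(keys))
        guard values.isRegularFile == true else { continue }
        let digest = SHA256.hash(data: try Data(contentsOf: url))
        records[url.path] = FileRecord(
            checksum: digest.map { String(format: "%02x", $0) }.joined(),
            modified: values.contentModificationDate ?? .distantPast
        )
    }
    return records
}

// Rot suspect = same mtime, different bytes.
func suspects(old: [String: FileRecord], new: [String: FileRecord]) -> [String] {
    new.compactMap { path, record in
        guard let previous = old[path] else { return nil }
        return (previous.modified == record.modified &&
                previous.checksum != record.checksum) ? path : nil
    }
}
```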

I mean, honestly, I think the central database approach is the right one for that. But like, I don't know. I'm kind of thinking this could be like your thing. You could become like Storacusa suite of apps. Like imagine, honestly, imagine if like, what if you had one app that

that would kind of just care for your disk, and it would have a few of these functions built in. Yeah: TechTool Pro, Norton Disk Doctor, yeah. No, it's Storacusa or ForgeSpace. From a sandboxed app, none of that stuff is possible, let me tell you. Who has more credibility or passion about the file system than you? Just because I'm interested in the file system doesn't mean I'm good at it. I've already explained what a terrible Mac developer I am. You don't want me doing too much stuff. Leave it to the experts.

Honestly, I think people would love to see the Storacusa suite of functionality: a few annoyances or shortcomings of APFS, or of data storage on the Mac, that you could help out a little bit with. And we'll say, one app at a time.