Jiter is a fast, iterable JSON parser that serves as the backend for Pydantic and is used by OpenAI, including in their Python library for ChatGPT. It enhances the speed and efficiency of JSON data parsing, which is crucial for applications like Pydantic that validate and transform JSON data into Python classes.
Python Build Standalone, a project providing prebuilt, portable Python distributions for many platforms and architectures, has moved to Astral, which should mean better stability and stewardship. This transition matters for tools like uv, pipx, and Hatch, which rely on it for Python installations. Astral aims to keep the project up to date with Python releases and improve its build and release processes.
moka-py is a Python binding for the high-performance Moka caching library written in Rust. It offers features like thread-safe, in-memory caching, TTL and TTI support, size-based eviction, and concurrency optimizations. moka-py lets Python developers leverage Moka's efficient caching mechanisms, including a decorator for function-based caching, improving performance in multi-threaded environments.
UV is a fast and ambitious package manager for Python that simplifies various tasks including Python installation, virtual environment creation, package management, and dependency pinning. Key commands include 'uv python install' for installing Python, 'uv venv' for creating virtual environments, 'uv pip install' for package installation, and 'uv sync' for managing project dependencies and lock files.
Supporting open-source projects financially can significantly contribute to their sustainability and improvement. For instance, if every user of a widely-used project like Flask or Gunicorn contributed a small amount, it could transform the project's development and maintenance. This support can ensure that critical tools remain healthy and well-maintained, benefiting the entire developer community.
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 413, recorded December 9, 2024. And I'm Brian Okken. And I'm Michael Kennedy. This episode is sponsored by us, so check out the links in our show notes, but also check out Talk Python Training and pythontest.com. There are courses over there. And of course, thank you to our Patreon supporters.
And we have links. If you want to get ahold of us, you can reach us on Bluesky or Mastodon. The links are in the show notes. And if you're listening to the show, thank you, and please share it with a friend. Also, if you'd like to participate in the discussion while we're recording,
you can head on over to pythonbytes.fm/live and see when we're recording next. Usually it's Monday at 10 a.m. Pacific time. Sometimes it shifts, though, and during the holiday season, who knows what we might do, but so far we're sticking with that. And if you'd like to get the links in your email inbox, go ahead and sign up for the newsletter at pythonbytes.fm, and we will send you all of the links in the show notes right in your inbox. So.
Michael, let's kick it off. Let's kick it off. I want to talk about a little bit of jitter. Maybe I've had too much coffee this morning or something, I don't know. What do you think? Jiter is a thing from the folks at Pydantic. And the idea here is they need really fast JSON parsing as the foundation of Pydantic, right? Basically, Pydantic is all
about how do I exchange, validate, and transform JSON data with Python classes, right? Into Python classes and types. Yeah. So you want that to be fast. The folks over at Pydantic created this thing called Jiter, J-I-T-E-R, and it is a fast, iterable JSON parser. Now, if the Pydantic usage does not catch your attention, OpenAI is also using Jiter, which is pretty interesting. Ask ChatGPT about it. So...
The reason that they were interested in it is they want to be able to work with, I believe Pydantic as well, but they want to work with
responses coming out of LLMs and anyone who's used LLMs until maybe very recently knows that they kind of like spit out the answers in a little progressive way. Right. And so with this, you can parse parts of data as it comes down, which is pretty cool. So there's some examples of partial in here. You can go look for somewhere, I think maybe on the docs website or something like that, but you know, you can give it like a,
a partially formed string and it'll come up with perfectly good answers for it.
That's pretty neat. And that's one of its features. The other is that it's faster than what I think the default Rust JSON parser is, even for non-iterable, just-straight-parse-it use, which is pretty impressive. Okay. And then there's also, this is why we are talking about it, a Python parse function, which parses JSON strings into a Python object. So you can go and run that as well, which is pretty cool. Jiter examples.
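To make that concrete, here's a minimal sketch of calling it from Python, based on jiter's documented from_json function; the partial_mode flag and the exact outputs shown are from memory, so treat this as an approximation and check the jiter README.

```python
# Minimal sketch of jiter's Python API (assumed from the project README;
# verify the exact flag names against the current jiter docs).
import jiter

# A complete document: parse bytes straight into Python objects.
data = jiter.from_json(b'{"name": "pydantic", "fast": true}')
print(data)  # {'name': 'pydantic', 'fast': True}

# A partially streamed document, like tokens arriving from an LLM.
chunk = b'{"title": "Python Bytes", "topics": ["jiter", "python-bui'
partial = jiter.from_json(chunk, partial_mode=True)
# Incomplete trailing values are dropped, e.g.:
# {'title': 'Python Bytes', 'topics': ['jiter']}
print(partial)
```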
Yeah. Anyway. So you can go and parse it into different pieces, or basically, if you need a really fast JSON parser with Python, you can use that parse function and it'll parse into a structure. Right? Yeah. So awesome. I thought people might be interested in both an iterable,
iterating JSON parser and also a really fast one. Plus it's being built by the folks at Pydantic, Samuel Colvin and team. And yeah, excellent. Nice work. Oh yeah, I think I've got several uses for this. This is cool. Yeah. Cool. I recently had Samuel Colvin on
with David Seddon to talk about building Rust extensions, or integrating Rust with Python and things like that, over on Talk Python. And he talked about this as one of the things they're building, which is like, oh, okay, this is pretty interesting. Yeah, definitely. Well, I'm going to talk about Python pre-builds a little bit.
This is big news, Brian. I'm glad you're covering it. So Python Build Standalone is a project that we've talked about on the show, but mostly we talked about it in association with uv. Because if you use uv, uv sync or uv python install or uv venv, and then install and use Python there, if it can't find the Python on your system, it's going to pull it from
python-build-standalone, which is a separate project, not part of uv. So we've discussed that, but the big news right now is that Python Build Standalone is now part of Astral, or under the Astral umbrella, which is huge. So yeah, we're going to link to an article from Charlie Marsh, head of Astral, saying "A New Home for python-build-standalone." There's also a,
it just says, we'll be taking stewardship of this project from Gregory Szorc, I don't know how to say it, cool last name. Anyway, it's the foundational project for building and installing portable Python distributions. And there's a link to Gregory's announcement also. And the discussion around that: python-build-standalone powers uv, powers Rye, also pipx and Hatch and more.
And it's got like 70 million downloads so far. Wow. Pretty big project and definitely instrumental going forward with Python, or with Python packaging and using Python. So Astral is really trying to make uv,
along with this python-build-standalone project, the new way to install Python. And for me, it is. I'm using it every day now. 100%. Same for me. So it's a pretty short article talking about this, but it is kind of interesting. It talks about what the project is at first and talks about the future of standalone Python distributions. Also, what
they have in mind for the project. It looks like they want to keep the project up to date with Python releases, of course. And then upstream changes to the CPython build system, possibly. And third is to remove some of the project's existing limitations. For example, it ships some musl-based Python builds that are incompatible with Python extension modules. I don't know what that means, but...
But I don't know what musl is, so I'm going to move on from that. OK. And then improve the project's Python build and release process. Just good stewardship for this project, and I'm really happy about that. Along with this, I was interested to read a thread from Charlie Marsh; it said python-build-standalone's downloads have exploded in popularity, with over 70 million downloads all time.
I'll put the link to this thread on Bluesky into the show notes also, because it's an interesting discussion. And I learned something here that I didn't know before. It said that the python.org download, the download from python.org, actually downloads an installer that builds Python from source on your machine. For Linux. For Linux. Okay. It says for Linux,
Okay, so for Linux. Yeah, because the macOS and the Windows ones install way too fast. Building Python from source is like a 10-minute deal if it runs the tests and stuff. Okay. Yeah. Because I didn't think I was doing that. Anyway. You didn't get the error that vcvarsall.bat couldn't be found? Oh, okay.
I can say that for a while. So, yeah, I guess a bigger deal for people that are not running Windows or Mac, but that's really like all the servers and stuff. So, yeah. Well, I think the other thing that's really non-obvious here is like, what is this build standalone anyway? Why don't we just download the installer and just run it or just...
take the result of the installer and clunk it out into your machine or something. So my understanding is the non-standalone one depends on other foundational things in the system, especially in Linux, but also in other places. If you want to be able to just copy it over, you can't do that. And so one of the things that they're talking about, one of the four points of...
of the direction that they're trying to go that Charlie laid out was trying to upstream some of these changes back into CPython itself. I think it might be number one of the four. Yeah, upstream, no, number two. Upstream the changes to the CPython build system, because they have to patch Python in order to make this actually build,
which is why it's a pain in the butt to maintain. And then how many combinatorial variations of that do you get for different platforms and stuff, right? - Yeah. - And so trying to say like, look, we've done these things to make it build more simply with fewer dependencies. Let's maybe make that part of Python. I don't know about you, but I have not seen a single problem with UV Python, Python build standalone Python compared to system Python. It's not like, oh, well, the certificates don't validate or this thing doesn't work or it doesn't have
or some weird thing like a dependency might be missing. It seems fine to me. And actually, I'd be more, if I'm running a server or something, I'd be more worried about installing it separately
separately and building it on each of the machines I'm installing it on than I would having, you know, one install that goes everywhere. Yeah. Anyway. And I can tell you that pythonbytes.fm is powered by Python 3.13.1, derived from, or gotten from, this method here. Yeah. Anyway.
Big news that actually probably doesn't mean much to individual users, other than, I think, we had a little bit of a concern about whether or not this one project was sitting heavily on one person, one developer, to maintain, and I'm glad that it's Astral helping out with this now too. Yeah, I agree. And if you read Greg's announcement
about transferring python-build-standalone stewardship to Astral, he talks about how the Astral folks have actually been core contributors to the project for a while, and they've been working from the outside to help keep this thing going, because they realize how important it is to this whole ecosystem, right? Yeah. And I know also, I read, I don't know if it was in this or somewhere else, but essentially Astral was really working on it for several months anyway. Yeah, so this is mostly
an official announcement, is all. Yeah. One final parting thought, Brian, right there where you are. It says, this is in Greg's announcement: as I wrote in my Shifting Open Source Priorities post in March... This is an interesting challenge that people can run into with projects that are run by one person, right? Yeah. The guy had a kid, wanted to spend more time with the kid, was feeling worn out by the projects, and decided,
Well, and also talks about how he really just cares way more about Rust than he does about Python these days, which is fine. You're not married for life to a technology. Go where your heart takes you. But that's a challenge for projects that are run by one person. I think it's worth reading this thing as well, just for people to get a sense of when open source projects take off, but it's not necessarily a good fit.
Yeah. Yeah, but thanks to Gregory for creating this and keeping it going. He's also known for the PyOxidizer project, which came close but didn't quite get us a single binary for our Python apps. Yeah, interesting. Okay. It's really cool that he made sure that this was in good hands before shifting it over. Yeah, absolutely. Absolutely.
All right. On to the next thing. So I talked about, there's a theme here, I talked about the jitters from having too much coffee. Well, let's talk about Moka. Maybe if we can put some hot chocolate and some sugar in with it, it'll be better. No, probably not. So this project is by deliro, and...
And it's called moka-py. So Moka, let's work our way inside out. Moka is a high-performance concurrent caching library for Rust. Not a concurrent caching server like Redis. Think SQLite, but for caching, right? SQLite's written in C, not Rust, but it's an in-process sort of deal, which is pretty neat. And Moka itself is inspired by Caffeine for Java, right? This is kind of like turtles all the way down, like...
like ports all the way down. So it provides caching implementations on top of dictionaries. They support full concurrency of retrievals and a high expected concurrency for updates. All right, so: thread-safe, highly concurrent in-memory cache implementations, sync and async, that can be bounded by the maximum number of entries or the total
weighted size. Size-aware eviction, like kicking large things out versus small things. You can have the cache controlled by least frequently used, by least recently used. Like, I want to kick out things that are over two minutes old, but if you've got room based on something, that's fine. You can give them a time to live, a time to idle. Right, idle is a really cool, interesting one. Like, when was this last accessed? So if you've got something that's old but is used all the time in your app,
and then something that's somewhat new but kind of hasn't gotten used that much, it'd be better to kick out that new one rather than the old one, right? - Yeah. - Okay, so that's all just straight Moka. moka-py is a Python binding for this, here we go again, Rust library, for Python.
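To make the TTL versus TTI distinction concrete, here's a tiny toy sketch of the bookkeeping idea, my own illustration rather than Moka's actual implementation: TTL expires an entry a fixed time after it was written, while TTI expires it a fixed time after it was last read.

```python
import time

class ToyCache:
    """Toy illustration of TTL vs. TTI bookkeeping (not how Moka implements it)."""

    def __init__(self, ttl: float, tti: float):
        self.ttl = ttl    # max seconds since the entry was written
        self.tti = tti    # max seconds since the entry was last accessed
        self._data = {}   # key -> (value, written_at, last_accessed_at)

    def set(self, key, value):
        now = time.monotonic()
        self._data[key] = (value, now, now)

    def get(self, key, default=None):
        entry = self._data.get(key)
        if entry is None:
            return default
        value, written, accessed = entry
        now = time.monotonic()
        # Expired if it's too old overall (TTL) or hasn't been read recently (TTI).
        if now - written > self.ttl or now - accessed > self.tti:
            del self._data[key]
            return default
        self._data[key] = (value, written, now)  # reading refreshes the idle clock
        return value

cache = ToyCache(ttl=300, tti=60)
cache.set("user:42", {"name": "Brian"})
cache.get("user:42")  # stays "warm" as long as it keeps getting read within 60s
```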
They're probably getting VC money from this, I'm telling you. Okay. No, just joking. Sort of. So for the moka-py thing, it has a synchronous cache, which supports basically thread-safe memory. It just wraps the thing, so...
So time to live, time to idle, size, concurrency, all these things that you can imagine. And there are a couple of interesting ways to use it. You can just say cache.set with some value, or you can say cache.get for some value. That's one way to use it. Another one, and this is actually pretty amazing, is you can use it as an LRU cache function decorator alternative. Oh, wow. Right? So one of the things that's really easy to do to speed up
Python code, without writing much code you have to maintain, is you just put a decorator, functools.lru_cache, onto it, and it'll look at the hash value of all the inbound parameters and say, if you pass me the same parameters, you're getting the same output, right? And it just does that, straight in Python memory. But this would be backed by this high-performance concurrent Rust library. It's still in-process, right? So you can say... Yeah, go ahead, sorry. With the time to live and time to idle, you know...
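For reference, here's the decorator trick being described, shown with the standard library's functools.lru_cache, which is the part I'm sure of; moka-py advertises a similar decorator backed by the Rust cache, with TTL/TTI knobs, but check its README for the exact import and parameter names.

```python
import functools

@functools.lru_cache(maxsize=256)
def expensive_lookup(user_id: int) -> dict:
    # Pretend this is a slow computation or a database hit.
    print(f"computing for {user_id}...")
    return {"id": user_id, "name": f"user-{user_id}"}

expensive_lookup(42)  # computes and caches
expensive_lookup(42)  # same arguments, so the memoized result comes straight back
```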
Yeah, especially. That's cool. Yeah, this is pretty cool. And there's so much talk about Moka itself, the Rust version, supporting asynchronous behavior, right? I'm like, okay, if it has all these asynchronous capabilities, what's the story with Python and its async and await, right? So I filed an issue, which I don't really like to do, but that's how you ask questions apparently, and then you close it.
So I said, hey, cool project. Since it says thread-safe, highly concurrent in-memory implementation, what's the Python async story? And they responded: this will work if you put the decorator on there. So remember how I was complaining that it's sort of weird that functools and itertools don't support async? This functools-like thing
supports async and sync functions as well, right? So they just have an implementation in the center that says, is it a coroutine? Do this, else do that. So you can use the caching decorator, like we talked about, like the lru_cache thing, on async functions and sync functions. So that's fine. And then I said, well, what about cache get and set? And deliro says, it probably doesn't make sense to do it. It takes 230 nanoseconds, so you can do 4.4 million calls a second.
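Looping back to the sync-and-async decorator point for a second: the "is it a coroutine? do this, else do that" dispatch is easy to picture. Here's a rough generic sketch, my own code rather than moka-py's, of a caching decorator that works on both kinds of functions:

```python
import asyncio
import functools

def cached(func=None, *, store=None):
    """Sketch of a decorator that caches both sync and async functions.

    Not moka-py's implementation; just the dispatch idea: inspect the wrapped
    function once, then return a sync or async wrapper accordingly.
    """
    cache = store if store is not None else {}

    def decorate(fn):
        if asyncio.iscoroutinefunction(fn):
            @functools.wraps(fn)
            async def async_wrapper(*args):
                if args not in cache:
                    cache[args] = await fn(*args)
                return cache[args]
            return async_wrapper

        @functools.wraps(fn)
        def sync_wrapper(*args):
            if args not in cache:
                cache[args] = fn(*args)
            return cache[args]
        return sync_wrapper

    return decorate(func) if func is not None else decorate

@cached
async def fetch_user(user_id: int) -> dict:
    return {"id": user_id}  # imagine an awaited I/O call here
```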
And set is 1.3 million sets per second for a cache size of 10,000 that's fully occupied, on a simple M1 Mac. So, you know what? Probably not. But there might be some ways to expand this in the future, I don't know. But yeah, I would say probably not. Probably not needed, because you're probably going to add more overhead just to juggle the async stuff, right? Yeah. And also, if the supported method is through the decorator, well,
for whatever you need, you could just put your code in a function. Yeah. I mean, if that were Redis, you would absolutely want an async version because you're talking to another server and there's that latency in the network. But yeah, if you can do 4 million a second... I doubt you can do 4 million awaits a second; it's much lower than that. So the cache get and set really are just... The benefit of those is probably just because we want a really fast caching system or something. Yeah. Yeah, exactly. And you...
There's plenty of times where you say, in this situation, I want to get this out of the cache and then keep it for a while. Like if I had a user who logged in and I want to just hold their user account with all their details and I've used their ID as the key and their actual user object as the object that goes in, that's fine. But you wouldn't use that as a cache decorator because typically you might see that coming out of a database, something like that. And then if you pass the same user in, it's like it's similar, but it's a different database object.
Right, you can run into really weird situations where they're equivalent but they're not equal, you know, and then you end up not using the cache. So anyway, I think that might be where you would do it. But anyway, I think this is pretty cool; people can check it out. Yeah, and it is not, I don't believe, super popular yet, you know, 100 stars, so this kind of shines a light on it. But if you go over to the Moka project, you know, it's got 1,700 stars, and this is kind of just a Python
API on top of it. Yeah, but it's pretty recent. I mean, it's a few weeks old, looks like. It's just a baby. It's just a baby. It's okay to have 100 stars. It's pretty good. Yeah, it's pretty good, actually. It looks cool. So now you know. All right. I want to shift back to uv. I'm kind of in a UV mood. I'm missing the sun, apparently. But there's an article from the SaaS Pegasus blog
about uv: an in-depth guide to Python's fast and ambitious new package manager. And a lot of people have written about uv already, which is great. But I have been really excited since I learned about uv sync and started using that, and all the different ways to use uv. It's a pretty powerful tool. So it's not really one thing; it's designed to be a lot. So I appreciate articles like this, but also I really,
I really like this one. So it starts out with a funny meme of a whole bunch of different commands to install Python and update it, create a virtual environment, and sync your requirements. And all of that is just done with uv sync now. You can do it all in one, which is pretty sweet.
I don't use uv sync. I use uv venv --python 3.13 or something. But, you know, same. Yeah, I'm using both, depending on whether or not I have a project set up already. So it talks about what uv is, why use it, and we're just going to
assume that you already know, if you listen to this podcast, because it's really fast. But there's a lot of discussion of different workflows: installing, adopting uv into your existing workflows, doing installs. But I'm going to pop down to the end.
Adopting uv into your workflow. There's this cool cheat sheet. This is pretty much what the entire article talks about, the different parts: you can use uv python install to install Python. You can use uv venv to create virtual environments; it's really fast. And then install packages with uv pip install, but then also,
you can pin your dependencies. Like, where we would have used pip-compile, you can use uv pip compile, but it's all in one place, all these different commands. And the commands listed in this article really are the way I use uv as well. So that's why I appreciated it. And then there's a discussion about how to adopt this into your workflow and what that means,
you know, talking about, I mean, a lot of people might not have used lock files before, but using lock files with uv is so easy that, you know, why not? And pinning your dependencies, that's just good workflow, good Python project practices anyway. So why not?
Yeah. That's great. And there are even a few more that you could throw in, for uv tool, like the equivalents table there. Yeah. You know, there's uv for installing CLI tools. You could say pipx, yeah, and it'll just create a virtual environment and install things and put that on the path and all those sorts of things, versus uv tool install. Right.
And uv run, right? Those kinds of things as well. So, yeah. It's missing that, which, you know, I'll feed back to Cory. So one of the reasons why this came up on my radar is I'm working on a project that uses SaaS Pegasus, so I'm in touch with Cory a lot. Oh, nice.
Yeah, but like the uv tool thing, instead, I'm not using pipx anymore. uv tool install is super cool. Yeah, it's super cool. It is. I've also started using Docker for certain things as well. Yeah. I don't know.
It's kind of similar. But like, for example, Glances, which is a Python-based server monitoring tool with a UI, you can just say docker run glances versus installing Glances, and you just leave the machine a little more untouched. Yeah, one of the interesting things about this article was the point of view. Because at the start, Cory talks about how...
He's not usually somebody to jump on multi-tool fads, like pipenv or pyenv, for installing Python or doing virtual environments better, big-project-wise. And I like Hatch, but I'm not really a use-Hatch-for-my-entire-workflow sort of person. I was using it just as a packager.
So I'm in the same boat of, I didn't really need an all-in-one tool, but this one changed my mind, and I really like this all-in-one tool. So, yeah. I'm still not bought into the project management side, but I love using uv for the other stuff. Yeah. Anyway. Well,
What do we got next? We have a quick bit of follow-up here that I just did; I did some searching. So over on pipx, one of the things, you know, you say you could use pipx, well, there's an open issue on pipx that says integrate uv in some way, right? Because pipx is really just a wrapper around create a virtual environment, pip install package, pip install -U package, right? And so if they just changed the internals to say uv pip install, then
pipx would all of a sudden become super awesome. This recommendation is unfortunately over half a year old, but it does have 21 upvotes, so, you know, who knows. It's there. Yeah. Okay. But that's not what I want to cover next.
Come on, computer, respawn. There we go. I think that's it for our items, right? We're on to extras. Yeah, let's have extras now. Yeah, let's extra it up. Extra. So, I registered for PyCon. Oh, cool. Yeah, registration came out two days ago, I don't know, whenever I posted a message on Bluesky and Mastodon saying, I registered, how about you? Whenever that was, that's when the announcement came out. So, I think a day and a half ago or something like that. So, there's early bird pricing and all the details on there.
If you wanna go and check it out, it's normally 450 bucks for individuals, but you can save $50 if you register before January, which is pretty cool. There's a bunch of stuff. It has the whole detailed timeline, which is always interesting. Like, if I wanna make sure I attend the PyLadies auction, when do I need to do that?
When is the main thing, when is the job fair, et cetera. So most importantly, the main conference is May 16th to May 18th, 2025. So there it is. And, congruent with current times, mask policy: hooray, optional and encouraged, but not required. Yeah. How about that? Okay. I got a few more real quick ones here. Something I came across just thinking, like, why don't I support more open source projects? Looking at my...
my dependencies and stuff that I'm working on. Like how much, you know, if everybody who used Flask put $1 towards it per month, everybody who used it in an important way where it's not just like, oh, I did a tutorial with Flask, but like, no, I have a project that is important to me and I use Flask. If everyone put $1 towards it, it would transform-
That project. If everyone who used Gunicorn put $1 towards it, that would transform it, right? So I decided, you know what, I'm going to just go to some projects and find the ones that I use most. And yeah, I found four that had sponsorships available. I was going to support uv and Pydantic as well, but for some reason they do corporate sponsorships, or I tried to do individual ones and it didn't work. And then some other ones, like Beanie, don't currently have sponsorships, but, you know, are really important for the...
database layer stuff. But just think about, you know, put a couple of dollars towards some of these projects. It'll make zero difference to you if you have a job as a software developer. And in the aggregate, it'll make a big difference to the health of the ecosystem. Yeah. It's interesting to think about it like that. Like just, you know, a couple less coffees a month and you can help out. You probably cover like three or four projects. Yeah. Yeah. Yeah.
Anyway, I want to encourage people to do that, but you know, if you can't, obviously don't; I don't think it's a big deal. Uh, come here, computer. Very slow for some reason, don't know why. There we go. All right. This is the joke, so I'm skipping the joke for a second; we'll come back to it. There are two things that I wasn't planning on covering, but I'll throw them out here really quick. Yeah, here's my registration for PyCon. Also, I wrote a quick thing. People said, oh my God, Hetzner, we moved to Hetzner and they changed this huge thing where they changed their bandwidth and their price. And it's like a no,
no nothing sort of deal, like $5 a month more. Anyway, I wrote that up so people can check that out on Mastodon. And then, yeah, that's it for all my items. And then I've just got the joke when you're ready for that. So let's do yours. I don't have much commentary on these; I just have a few extra things I wanted to point out. Pydantic AI was announced. Pydantic AI is a Python agent framework designed to make
"Is it less painful to build production-grade applications with generative AI?" I don't have really any commentary about this other than I didn't see this coming, but interesting. - Yeah, very interesting. I've seen messages or tweets or whatever from people who do machine learning stuff saying, "You just need Pydantic." I mean, a lot of this is like, "I got a JSON thing here, and then I'm gonna call some other thing with other JSON," and just suggesting, "Hey, you could probably use Pydantic to make these connections." I bet the Pydantic team noticed that as well. - Okay.
A couple of commentaries on, maybe, society? Anyway, I'll leave a couple of other articles I thought were interesting. Bluesky announced, I guess this is old, this is from August, but, anti-toxicity features on Bluesky. And I actually appreciate some of these. I already had a troll come by. And so there are some things where, if people quote you,
you can detach a quoted post. If somebody quotes you and you don't want them to, you can detach yourself from that. Oh, interesting. There's also hiding replies. I had a troll; you can't delete replies, but I had somebody post a just idiotic reply to something I said, and it was obviously just a bot or a troll, so you can hide that. And,
you know, as Bluesky grows, we'll get trolls also. If they're not affecting you yet, they may in the future. So I do appreciate that there are features around to protect yourself. So there's that. And then, this, I don't know what to make of this really, but Wired, a fairly mainstream magazine, I think, released the WIRED Guide to Protecting Yourself From Government Surveillance. Wow.
I'm just, this is a head shaker of, I guess we need this. I wish we didn't, but wow. Yeah, there's that.
Yeah, you can probably say that about some state governments as well. Every state's different. Yeah. Depending on your gender and things, you know, it's touch and go some places. Yeah. Anyway, so that's a little bit of a downer, so maybe we need something funny. We do. I don't want to spend all the time going down that deep rabbit hole. Instead, let's go infinitely down the rabbit hole. Yes. So check this out, Brian. Somebody who goes by Bits very...
very personal, on Bluesky posted what the comments seem to indicate is probably a textbook, this is printed by the way, a printed textbook,
LaTeX, okay? Okay. In the index at the back, on page 252, there's an entry for infinite loop. And it says, see page 252. I love it so much. It's so simple. I love it. Yeah. It's a really good, just like a little Easter egg in there, isn't it? Yeah. Yeah.
I haven't seen it for infinite loop. I saw that somebody did that for recursion. Yeah, if you look in the comments, it says that Kernighan and Ritchie has the same thing, I guess that's probably the C book, the same entry in the index for recursion. And it's pretty good. People love it. Yeah, that's funny. And there's somebody that says, for those who can't be bothered: Google "recursion" and you get, did you mean recursion? No.
Yeah. I kind of feel bad for people that actually really need to know what that means. Good luck. Yeah, good luck with that, huh? So, well, yeah, all good. All good here. We know what recursion and infinite loops are, but we're going to break the loop and get out of here, right? Yeah. Let's break the loop and say goodbye until next time. So thanks a lot. Bye. Bye, all.