People
Host
A podcast host who helps learners improve their Chinese skills through rich content and interactive formats.
Topics
The host argues that while the AI and cryptocurrency fields are related, their roles in government should be considered separately to avoid conflating the two. Senator Peter Welch's new bill aims to streamline copyright enforcement around AI model training, but risks tilting too far in one direction; a balance is needed between protecting creators' rights and advancing AI technology. The artist group PR Puppet argues that OpenAI's Sora early-access program is not genuinely about artists' creative expression and critique but about PR and promotion; they object to how OpenAI ran the program, saying OpenAI pressured early testers to speak positively about Sora and failed to fairly compensate their work.

Deep Dive

Chapters
The potential appointment of an AI advisor in the Trump White House could significantly shape AI policy in the coming years. This chapter discusses the role's potential responsibilities and the implications of having a safety-minded versus an acceleration-minded advisor.
  • Donald Trump is considering naming an AI advisor (an "AI czar") to focus public and private resources on keeping America at the AI forefront.
  • The role could involve coordination across government agencies and support for new AI-enabled processes.
  • Speculation surrounds potential candidates like Max Tegmark, an MIT professor and AI safety advocate.

Shownotes Transcript


Today on the AI Daily Brief, it appears that OpenAI's video generation model Sora has been leaked. Before that in the headlines, President-elect Trump is apparently thinking about adding a White House AI position. The AI Daily Brief is a daily podcast and video about the most important news and discussions in AI. To join the conversation, follow the Discord link in our show notes.

Welcome back to the AI Daily Brief headlines edition, all the daily AI news you need in around five minutes. One of the big things that people are watching right now is how presidential appointments might impact AI policy in the years to come. Now we have rumors of what might be the most direct position to influence the space, with Axios reporting.

The president-elect, Donald Trump, is considering naming an AI czar. This is coming from sources inside the Trump transition team, and the way that they framed it to Axios is that the role is likely, but not certain. In terms of the details, they are, of course, sparse.

Sources suggest that this won't be Elon Musk himself, but that he, along with Vivek Ramaswamy, the two of course leading the new Department of Government Efficiency, or DOGE, will have a big role in determining who the AI czar is. That is somewhat concerning for other tech leaders with whom Elon has a touchy relationship. Interestingly, this is not the only czar being considered for the Trump White House.

Bloomberg reported last week that the Trump transition team had also been vetting cryptocurrency executives for a similar role for the crypto industry. There is also a possibility, say the sources, that the AI and crypto roles could be combined under a single emerging technologies czar. Everyone, I am hugely hoping that that doesn't happen.

I think that these spaces, while having a relationship with one another, and while I'd love to see those two czars working in close concert with one another, are fundamentally different and require different things. And I'd like to see them get their own consideration in terms of what the AI czar will do.

Axios says the AI czar will be charged with focusing both public and private resources to keep America at the AI forefront. Something that was established under President Biden's AI executive order, and which might be kept even though Trump wants to repeal that order, is that government agencies have all named chief AI officers. Theoretically, the White House AI czar could play a coordination role across all of those individuals. Another potential area of activity: we've discussed how DOGE, the Department of Government Efficiency, might not only be focused on trying to find obvious waste, but also think about how new AI-enabled processes could make things more efficient. Reports speculate that both of those functions could be supported by the AI czar.

In other words, using AI to root out, quote, waste, fraud and abuse, but also thinking about how AI could reshape processes going forward. It also seems likely that if this role is established, it will have to be closely connected to energy policy, given that one of the big constraints for future AI leadership is going to be the availability of energy. And Axios notes that an AI czar would not require Senate consent, allowing the person to get to work much more quickly. Speculation has, of course, started ramping up.

One name turned around, for example, is Max Tegmark, who's an MIT professor and AI safety advocate, who some have reported has been influential in shaping Trump's views on controlling AI development. Then again, all of that is just speculation. The more interesting thing here is that Tegmark is a reminder of how much this role could shape the way the U.

S. approaches things. The difference between someone who is acceleration-minded versus safety-minded could be enormous when it comes to how different policy is pursued. And so if this is a real thing, it is going to be worth following very closely. In the meantime, politicians continue jockeying to make AI policy.

The latest comes from Senator Peter Welch, who has introduced a new bill aimed at making copyright enforcement easier when it comes to AI model training, called the Transparency and Responsibility for Artificial Intelligence Networks, or TRAIN, Act. The bill theoretically increases transparency into data sets. Copyright holders would be able to subpoena the training records of AI models.

If they have a good faith belief that their work was used to train the model, developers would only need to reveal training material to the extent that it is, quote, sufficient to identify with certainty whether copyrighted works were used. Failure to produce would create a legal presumption that the AI developer did, in fact, use the copyrighted material in question. Welch said that the country needs to, quote, set a higher standard for transparency around AI training.

Adding: this is simple. If your work is used to train AI, there should be a way for you, the copyright holder, to determine that it's been used by a training model, and you should get compensated if it was. We need to give America's musicians, artists and creators a tool to find out when AI companies are using their work to train models without the artist's permission. So far, attempts to sue AI labs around copyright infringement have been progressing at a fairly slow pace.

The New York Times lawsuit against OpenAI is probably the most advanced. In that case, the judge ordered OpenAI to produce a searchable version of their training data for New York Times attorneys to scour through.

We don't know whether they found anything at this stage, but the process was marked by controversy when OpenAI accidentally deleted search logs, setting the process back. This law is aimed at streamlining a similar process, but the concern, of course, is that it swings too far in the other direction. We don't have right now solid legal precedent on whether using data to train AI models constitutes copyright infringement.

Many labs have been signing licensing agreements in order to avoid lawsuits and the associated PR damage, but no court has had an opportunity to make a ruling on this point of law. This bill also doesn't settle the question of law. It simply introduces a clear subpoena power when copyright infringement is alleged.

Meanwhile, jurisdictions like Israel, Japan and Singapore have created laws that classify training data as fair use. a16z and others in the AI industry have likened use of training data as closer to reading a book than copying a book. Ultimately, this is going to be one of the most challenging balancing acts that we face: protecting authors, musicians and creatives on the one hand, while advancing strategic AI on the other.

Moving back to the technical side of things, Anthropic has launched a new tool for connecting AI assistants to external data sources called the Model Context Protocol, MCP. Anthropic is proposing it as an open-source standard for data connectivity.

MCP allows any model, not just ones produced by Anthropic, to draw data from business tools and software or content repositories, they wrote in a blog post. As AI assistants gain mainstream adoption, the industry has invested heavily in model capabilities, achieving rapid advances in reasoning and quality. Yet even the most sophisticated models are constrained by their isolation from data, trapped behind information silos and legacy systems.

Every new data source requires its own custom implementation, making truly connected systems difficult to scale. Alex Albert, head of Claude relations, provided a series of examples of MCP being used to connect to GitHub and a generic search engine to demonstrate its flexibility. He wrote: we're building a world where AI connects to any data source through a single, elegant protocol.

MCP is the universal translator: integrate MCP once into your client and connect to data sources anywhere. Get started with MCP in less than five minutes. We built servers for GitHub, Slack, SQL databases, local files, search engines, and more. Like LSP did for IDEs, we're building MCP as an open standard for LLM integrations.

Build your own servers, contribute to the protocol, and help shape the future of AI integrations. Even if that sounds like Greek to you, what's important to know is that open connectivity standards have a long history of being a powerful unlock once they reach mass adoption. Even something we take for granted, like the standard USB port, used to be dozens of different proprietary variants that were all incompatible.
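To make "integrate once, connect anywhere" a little more concrete: MCP is built on JSON-RPC 2.0, and an MCP server advertises tools that any compliant client can discover (tools/list) and invoke (tools/call). The sketch below imitates that request/response shape in plain Python. It is a toy dispatcher, not a real MCP server, and the tool name and schema are invented for illustration.

```python
import json

# Illustrative tool registry. A real MCP server would register actual
# integrations (GitHub, Slack, a SQL database, ...); this one is made up.
TOOLS = {
    "search_files": {
        "description": "Search local files by keyword (hypothetical tool).",
        "inputSchema": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
        },
    }
}

def handle(request: dict) -> dict:
    """Dispatch a JSON-RPC 2.0 request the way an MCP server conceptually does."""
    method = request["method"]
    if method == "tools/list":
        result = {"tools": [{"name": name, **meta} for name, meta in TOOLS.items()]}
    elif method == "tools/call":
        args = request["params"]["arguments"]
        # A real server would execute the tool here; we just echo the query.
        result = {"content": [{"type": "text", "text": f"results for {args['query']}"}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# A client discovers tools, then calls one; the same two verbs work against
# any server that speaks the protocol, which is the whole point.
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
reply = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
                "params": {"name": "search_files", "arguments": {"query": "roadmap"}}})
print(json.dumps(reply, indent=2))
```

The value of a standard like this is that the client side never changes as you swap servers; only the tool registry behind the protocol does.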

It's unclear whether OpenAI and other frontier labs will adopt an open standard, but Block, Apollo, Replit, Codeium and Sourcegraph are all building MCP support into their platforms. Anthropic also shared prebuilt MCP servers for Google Drive, Slack and GitHub. The company wrote: instead of maintaining separate connectors for each data source, developers can now build against a standard protocol as the ecosystem matures.

AI systems will maintain context as they move between different tools and data sets, replacing today's fragmented integrations with a more sustainable architecture. What I think is relevant here is what it tells us about where we are as an industry. So much of what's actually exciting right now is not big, huge advances in model capabilities; it's these fundamental infrastructure building blocks that are coming online, such that in a few years or even a few months, it will be very hard to imagine a time before them.

They're going to unlock a huge number of use cases and new opportunities. And so as small as they might seem relative to getting Orion or GPT-5, this is, I think, very big news indeed. For now, that's going to do it for today's AI Daily Brief headlines edition. Next up, the main episode. Today's episode is brought to you by Plumb. Want to use AI to automate your work, but don't know where to start?

Plumb lets you create AI workflows by simply describing what you want. No coding or API keys required. Imagine typing out: analyze Zoom meetings

and send them to Notion, and watching it come to life before your eyes. Whether you're an operations leader, marketer, or even a non-technical founder, Plumb gives you the power of AI without the technical hassle, with access to top models like GPT-4o and Claude 3.5 Sonnet.

Don't let the technology hold you back. Check out useplumb.com, that's Plumb with a "b", for early access to the future of workflow automation. Today's episode is brought to you by Vanta. Whether you're starting or scaling your company's security program, demonstrating top-notch security practices and establishing trust is more important than ever.

Vanta automates compliance for ISO 27001, SOC 2, GDPR, and leading AI frameworks like ISO 42001 and the NIST AI Risk Management Framework, saving you time and money while helping you build customer trust. Plus, you can streamline security reviews by automating questionnaires and demonstrating your security posture with a customer-facing trust center, all powered by Vanta AI.

Over 8,000 global companies like Langchain, Leela AI, and Factory AI use Vanta to demonstrate AI trust and improve security in real time. Learn more at vanta.com/nlw. That's vanta.com/nlw. Today's episode is brought to you, as always, by Superintelligent.

Have you ever wanted an AI Daily Brief, but totally focused on how AI relates to your company? Is your company struggling with AI adoption, either because you're getting stalled figuring out what use cases will drive value, or because the AI transformation that is happening is isolated to individual teams, departments and employees, and not able to change the company as a whole?

Superintelligent has developed a new custom podcast product that inspires your teams by sharing the best AI use cases from inside and outside your company. Think of it as an AI Daily Brief, but just for your company's AI use cases. If you'd like to learn more, go to besuper.ai/partner and fill out the information request form. I am really excited about this product, so I will personally get right back to you again.

That's besuper.ai/partner. Welcome back to the AI Daily Brief. We have a spicy one today, as the internet is exploding with the report that OpenAI's Sora video generation model has just been leaked.

Now, Sora is probably the most anticipated AI product that we've heard about this year that we haven't gotten yet. All the way back at the very beginning of the year, OpenAI blew people away with what was possible when they demoed Sora. For many, it transformed their sense of what AI video could do. And it really did feel like video was going to have its Midjourney/Stable Diffusion moment and become a big part of the texture of 2024. And that is sort of what happened.

But it wasn't led by OpenAI, and it wasn't led by Sora. Pika Labs came out with a new version of their model, which was much more advanced. Luma Labs' Dream Machine became a popular option for artists and creators, and Runway, in addition to releasing a new version of their model, also started forming partnerships with big Hollywood studios like Lionsgate.

And what made that extra interesting is that even as OpenAI's Sora got delayed, there was a sense that maybe it was because they wanted to roll it out first to Hollywood, to have it be a product that came through the traditional entertainment industry rather than as a bottom-up sort of service. Now there are a ton of reasons why an advanced video model might not get released.

It's an extremely expensive proposition, for one. And there are a lot of safety concerns when it comes to deepfakes and the use of AI-generated video for nefarious purposes. But still, most people have spent the back half of this year wondering: where is Sora? Well, now we apparently have access to it via a model that was uploaded to Hugging Face.

You can generate a video in the PR Puppet Sora space, which is at the moment of recording having a seriously hard time, presumably being crushed under the weight of people hitting it. But the group has also published an open letter.

The letter reads: Dear corporate AI overlords, we received access to Sora with the promise to be early testers, red teamers and creative partners. However, we believe instead we are being lured into art washing to tell the world that Sora is a useful tool for artists. Artists are not your unpaid R&D.

We are not your free bug testers, PR puppets, training data, or validation tokens. Hundreds of artists provide unpaid labor through bug testing, feedback, and experimental work for the program for a $150 billion valued company. While hundreds contributed for free,

A select few will be chosen through a competition to have their Sora-created films screened, offering minimal compensation which pales in comparison to the substantial PR and marketing value OpenAI receives. Denormalize billion-dollar brands exploiting artists for unpaid R&D and PR. Furthermore, every output needs to be approved by the OpenAI team before sharing.

This early access program appears to be less about creative expression and critique, and more about PR and advertisement. Corporate art washing detected.

We are releasing this tool to give everyone an opportunity to experiment with what around 300 artists were offered: free, unlimited access to this tool. We are not against the use of AI technology as a tool for the arts. If we were, we probably wouldn't have been invited to this program.

What we don't agree with is how this artist program has been rolled out and how the tool is shaping up ahead of a possible public release. We are sharing this to the world in the hopes that OpenAI becomes more open, more artist-friendly, and supports the arts beyond PR stunts.
As you might imagine, immediately everyone tried to figure out whether it was real or not. One developer writes: why I think it's real.

This is using the OpenAI Sora API endpoint to generate and download videos, with hardcoded request headers and cookies from the Hugging Face space environment. Another user on Twitter writes: confirmed, OpenAI's Sora really has been leaked. However, on the other side, a skeptic writes:

I mean, if it's just an API request directly to OpenAI, it gets shut down in two minutes, if not ten seconds. If it doesn't, it's BS. The other big discussion, of course, is what the artists wrote.
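The skeptic's reasoning is easy to make concrete. If the "leak" is just a Hugging Face space forwarding requests to OpenAI's own servers with credentials baked in, then revoking those credentials on the server side kills every copy of the space at once. The sketch below is purely hypothetical: the endpoint URL, header values, and request body are all invented, and the request is constructed but never sent.

```python
import json
import urllib.request

# Hypothetical hardcoded credentials of the kind such a passthrough would
# need. Placeholders only; these are not real values.
BAKED_IN_HEADERS = {
    "Authorization": "Bearer <token-harvested-from-the-space>",
    "Cookie": "session=<hardcoded-session-cookie>",
}

def build_proxied_request(prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a request the way such a passthrough might."""
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        "https://example.invalid/sora/generate",  # invented endpoint
        data=body,
        headers={**BAKED_IN_HEADERS, "Content-Type": "application/json"},
        method="POST",
    )

req = build_proxied_request("a turkey getting up off the table and running away")
# Everyone who can see the space shares the same credentials, so the
# provider's only remedy is to rotate them, shutting the whole thing down.
print(req.get_method(), req.full_url)
```

That is why "does it stay up?" became the litmus test for whether this was a real model leak or a thin wrapper around someone else's API access.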

There is a little bit of a sense of, well, of course OpenAI is behaving like this. A writer at TechRound wrote: they claim that OpenAI is pressuring Sora's early testers, including red teamers and creative partners, to spin a positive narrative around Sora, and failing to fairly compensate them for their work. But she added: who would have thought OpenAI would do that? Duh, of course.

The other discussion is people sharing what they've generated, and the video quality does seem high, although at first glance it's a little harder to tell how much better this is than other options out there, given how much the rest of the video generation space seems to have caught up. I tried to create a video of a turkey getting up off the table and running away, but alas, the site was too crushed. Anyway, this is certainly something that I'm going to keep watching, and I will report back tomorrow on whether this is confirmed as an actual leak or if it's just a big PR stunt.

Ironically, it came on the day when Luma AI released a massive update to their Dream Machine video model platform. The end-to-end upgrade includes a new interface, a new mobile app, and a new image generation foundation model called Luma Photon.

To get a sense of how many people have been excited about this space, Luma says they have over 25 million registered users since they launched in June 2024. CEO Amit Jain said: we built Dream Machine as a visual thought partner, powered by a whole new image model called Luma Photon, creative, intelligent, and designed for the people who build our world, namely designers, creators, and folks in fashion, media and entertainment. The model also has improved functionality with natural language and can also draw from reference images, Jain said.

Unlike prompt engineering, where you have to carefully craft specific commands, Dream Machine lets you talk to it like you're talking to a person. This conversational interface makes editing and creating intuitive. With Dream Machine, you can give it reference images, colors, structures, or textures, and it will intelligently combine and iterate until you get exactly what you want.

Luma also says that they've cracked consistent characters, which is going to be essential for creating longer content pieces that have a coherent through line. So maybe, taken together, the story of this plus OpenAI is that the video generation space is very much in full swing. Like I said, a spicy one today. But for now, that is going to do it for the AI Daily Brief. Till next time, peace.