
Police use new AI tool that can identify someone without facial features

2025/6/2

Marketplace All-in-One

Topics
Megan McCarty Carino: Facial recognition systems are controversial, so new tools have emerged that no longer rely on facial features, instead identifying people by analyzing other characteristics of the body.

James O'Donnell: As an AI reporter at MIT Technology Review, I investigated a new AI tool called TRAC. TRAC is being used by a growing number of police departments and federal agencies, raising concerns about privacy and surveillance. While TRAC is less invasive than facial recognition, it broadens the scope of surveillance and may be more prone to false positives. Regulations are a patchwork from place to place, and police departments rarely consult communities in advance, which makes it hard for communities to learn how TRAC is being used. I worry that TRAC raises the same concerns as facial recognition, and raises the stakes by expanding the volume of footage these tools can be used on.

Nate Freed Wessler: As an ACLU attorney, I firmly oppose the use of facial recognition technology in policing; no improvement or increase in transparency would satisfy me. Like humans, technology will always make mistakes in real-world conditions, but identifying and correcting those mistakes can be more difficult. People tend to believe the results of computer systems, so it is very hard to stop police from believing a facial recognition match is correct and building an investigation around it. The use of technology in policing must be handled carefully to prevent violations of civil rights.





Facial recognition systems are controversial, so new tools skip the face. From American Public Media, this is Marketplace Tech. I'm Megan McCarty Carino.

Facial recognition systems use artificial intelligence to analyze patterns in faces, and they've come under increasing scrutiny, particularly in policing. There have been multiple instances of false positives leading to the arrest and detainment of innocent people.

There's no federal regulation of this technology, but at least a dozen states have laws that limit its use. So some law enforcement authorities have turned to a new system called TRAC, made by a company called Veritone. It doesn't analyze faces but looks at the rest of the body for clues: things like clothing, body type, or hair. That's according to recent reporting by James O'Donnell for MIT Technology Review.

The company wouldn't give me an exact figure or name specific customers, but it's used by roughly 400 state and local police departments, as well as universities around the country, and they have some clients abroad. It's also being used increasingly by federal agencies. They have agreements with the Department of Homeland Security, which, of course, houses immigration authorities, as well as the Department of Defense and the Department of Justice. So U.S. attorneys are using some AI tools by Veritone, including TRAC. It's not only state and local police departments, but also federal agencies.

And what kinds of concerns does this raise?

Well, one big concern that people at the ACLU raised was the idea that police departments now have a tool that lets them make use of any video footage where someone is present. Even if their back is turned or their face isn't fully visible, police departments can track that person across different video feeds. So the concern from a civil liberties standpoint is that a tool like this isn't just making certain policing tasks more efficient; it's creating a whole new scale at which police departments can analyze and ingest video feeds, something they've never done before. And the concern is that, should certain police departments or federal agencies wish to do so, this can cross well over into the realm of surveillance rather than just efficiency.
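Veritone hasn't publicly detailed how TRAC works under the hood; the reporting only describes it matching people on non-facial attributes like clothing, body type, and hair. As a loose, hypothetical sketch of that general idea, matching a person across camera feeds by attributes might look something like this (the attribute vectors, camera IDs, and threshold are all invented for illustration):

```python
# Hypothetical sketch of attribute-based person matching across feeds.
# Veritone has not published TRAC's internals; this only illustrates
# the general idea of matching on non-facial attributes.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two attribute vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend each detection in a video feed is summarized as a vector of
# non-facial attributes (clothing color, build, hair, gait, and so on).
query = np.array([0.8, 0.1, 0.3, 0.6])  # the person being tracked

detections = {
    "cam2_frame_0412": np.array([0.79, 0.12, 0.28, 0.61]),
    "cam5_frame_0033": np.array([0.10, 0.90, 0.40, 0.20]),
}

THRESHOLD = 0.95  # arbitrary cutoff for declaring a "match"

for det_id, vec in detections.items():
    score = cosine_similarity(query, vec)
    if score >= THRESHOLD:
        print(f"{det_id}: candidate match (score={score:.3f})")
```

The threshold is where the civil liberties tension shows up: lower it and more bystanders in unrelated footage clear the bar; raise it and the tool misses the person it's looking for.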

On one hand, this seems clearly less invasive than facial recognition. It's looking at features the human eye could probably identify, more so than the kind of unique identification that facial recognition algorithms do. On the other hand, it's opening the aperture on all of the various types of surveillance that police might be doing. And given that it's not looking at uniquely identifying characteristics, I would imagine it might be prone to more false positives.

Yeah, you certainly have a larger volume of data that you can work with, right? So one could imagine that the incidence of false positives would be really significant. If the tool sweeps in people who share the same clothing and general body features, if you happen to share the same backpack and shoes as someone the police are investigating, then you very well might be lumped into the set of subjects the police are tracking. So it does increase the scale, and I'm not sure it fundamentally changes the problem of false positives that all facial recognition tools deal with, right? These aren't perfect tools. These are algorithmic tools that look for patterns, and the accuracy of those tools isn't always clear. That's not always made transparent to the police and detectives using them, let alone the subjects and defense attorneys who are left to deal with the arrests that come after the fact.

We've seen plenty of incidents where facial recognition identifies the wrong person, or where facial recognition isn't disclosed to the court when it was used in an arrest, even though there are certain rules that are supposed to force police departments to tell judges and prosecutors and juries and defense attorneys that facial recognition was used. So in some ways, a tool like TRAC raises a lot of the same concerns as facial recognition. But in another way, it raises the bar a bit by expanding the volume of footage it can be used on exponentially. We'll be right back.
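That scale point can be made concrete with a back-of-envelope calculation (the false-positive rate and counts below are invented for illustration): even a matcher that rarely errs on any single comparison produces a steady stream of spurious hits once it's run over enough people, and most flagged "matches" can then be innocent.

```python
# Back-of-envelope: expected false matches grow linearly with the
# number of people scanned. All numbers here are invented.
false_positive_rate = 0.001  # 0.1% chance a random person "matches"
people_scanned = 50_000      # detections swept in across many feeds

expected_false_matches = false_positive_rate * people_scanned
print(expected_false_matches)  # 50.0 innocent people flagged

# If only one person in all that footage is the true subject, most
# hits are false: precision = true hits / all hits.
true_hits = 1
precision = true_hits / (true_hits + expected_false_matches)
print(f"precision: {precision:.1%}")  # about 2.0%
```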


You are listening to Marketplace Tech. I'm Megan McCarty Carino. We're back with James O'Donnell, AI reporter at MIT Technology Review. How would you describe the landscape when it comes to regulation of these cutting-edge technologies in policing?

It's a serious patchwork of regulations. Plenty of cities, municipalities, and states don't have any regulations curbing the way police can use facial recognition. Some at the city level, like San Francisco and Oakland, have pretty near complete bans against facial recognition being used by police. And then there are some middle-ground states and cities that ban it for live video, so police can only use it on recorded video rather than analyzing live video feeds, in an attempt to reduce the threat of surveillance. But there is certainly no overarching federal law that governs the use of this sort of technology. And there are thousands of police departments in the U.S. that have a lot of independence in how they choose to spend their budgets.

One thing I would say, from the perspective of the ACLU, which I included in the story, is that it's fairly rare for police departments and tech companies to go to the community first and say, hey, we'd like to implement this facial recognition technology or a tool like TRAC. What do you think? What would we have to do for the community to give us permission to use a tool like this? And if they did, there might be some restrictions on how those tools could be used, or some obligation for police departments to hand over data and make public reports about how these tools are being used. But oftentimes, what we see is police departments signing contracts with these technology companies and using the tech first, and then communities have to react after the fact.

So is there any way for someone to know if their local law enforcement is using a tool like this, or if it's been used in an investigation?

Well, in theory, the budgets and relationships that police departments have, and what they spend their money on, are public information. It's not always presented clearly to the public. But in theory, this is something everyone should be able to ask their police department about and get an answer to, should they wish. There's no guarantee that the answer will be timely or very clear; they may take a while to get back to you. So it's possible to know, in theory, whether police departments are customers of this tool. But it's certainly not likely that you could find out whether you've come up in these tools, particularly whether your face or body has been identified by a tool like TRAC. It's not likely that you'd be able to get an answer on that.

That was James O'Donnell at MIT Technology Review. Veritone CEO Ryan Steelberg told James, quote, "I hope we're exonerating people as much as we're helping police find the bad guys."

We'll have a link to James's full reporting at our website, MarketplaceTech.org. And if you want to know more about some of the problems with facial recognition software in policing: a couple of years ago, we spoke with Nate Freed Wessler, an attorney with the ACLU who represented Robert Williams, a Detroit man wrongfully arrested due to a false ID by facial recognition software. Williams settled with the city last year for $300,000.

As part of the settlement, the Detroit Police Department agreed to reforms in their use of facial recognition, including a prohibition on making arrests solely based on facial ID matches, disclosing its use in an arrest, and conducting an audit of every case in which the system was used going back to 2017.

We spoke to Wessler before the case was settled, and I asked him at the time if any improvement or level of transparency would make him comfortable with the use of this technology in policing. He said no. And one of his reasons is, I think, relevant to this new tool, TRAC, as well: he said technology, like humans, will always make some mistakes in real-world conditions, but identifying and correcting those mistakes could be more difficult.

There's extensive research by social scientists and psychologists now about human cognitive bias toward believing the results of computer systems. People just have an inherent, deeply programmed proclivity to believe algorithms and the results that they spit out. And so it is very hard to come up with systems that will stop police from just inherently believing that a face recognition match must be right and structuring their investigation around that.

Jesus Alvarado produced this episode. I'm Megan McCarty Carino, and that's Marketplace Tech. This is APM.
