
Mini Episode: AI Therapists, Facial Recognition in Detroit, Decolonialism in AI, and Deepfakes for Corporate Training

2020/7/12

Last Week in AI

People
Daniel Bashir
Topics
Daniel Bashir: This week's AI news roundup covers AI therapy bots, pushback against facial recognition technology in Detroit, decolonial theory in AI, and the use of deepfakes in corporate training. AI therapy bots are convenient and accessible, but they also raise data security and ethical concerns and should be approached with caution. Detroit's use of facial recognition has sparked controversy over racial discrimination and privacy violations, and activists are calling for public input into government decisions about surveillance technology. Researchers from DeepMind and Oxford University recommend that AI practitioners draw on decolonial theory to keep algorithms from being used for exploitation and oppression. The use of deepfakes in corporate training also poses ethical challenges that will require norms and standards.


Transcript


Hello and welcome, this is Daniel Bashir here with Skynet Today's Week in AI. This week, we'll look at AI therapy bots, pushback against Detroit's use of facial recognition, decolonialism in AI, and how deepfakes are becoming a new corporate training tool. Earlier this week, James Dinneen wrote for OneZero about his experience with Woebot, one of the digital mental health tools proliferating during the pandemic.

In a time when users are becoming more comfortable with telemedicine and other virtual tools, chatbots like Woebot may be able to lend a hand to therapists, many of whom are overburdened with patients. The federal government, optimistic about this prospect, eased restrictions on these tools. While some tools such as Woebot have been vetted by researchers affiliated with the chatbot companies, the deregulation has allowed unproven and potentially damaging digital tools to flourish.

A 2019 study in npj Digital Medicine found that only one of the 73 mental health apps studied cited published scientific literature, although almost half used scientific language in claims about their effectiveness.

Even for those tools that have been deemed effective, studies are inconclusive, and as Dinneen found, the tools make obvious mistakes that patients are likely to find off-putting. Experts agree that while chatbots cannot replace therapists completely, they may help with scalability and give patients some reprieve when they're not able to meet with therapists. We are living in a time when some people are turning to therapy bots, and others are fighting against racial violence and facial recognition.

On July 24th, Detroit's facial recognition contract with the company DataWorks is set to end. CNET reports that residents of Detroit, which has the highest percentage of Black residents of any major city in the United States, have been pushing to stop the expansion of facial recognition services since the contract began in 2017. While Detroit's city council was expected to vote in favor of renewing the contract, public outcry has delayed the vote.

Activists point to a number of issues with facial recognition systems, including studies demonstrating racial and gender bias, the wrongful arrest of Robert Williams after Detroit's facial recognition system misidentified him as another Black man, and evidence that the technology has not meaningfully reduced crime. Even if the contract expires, that doesn't mean the end of surveillance in Detroit. The city could reach out to partners like the Michigan State Police to run searches with facial recognition technology,

But activists are pushing for policies such as a Community Input Over Government Surveillance ordinance, which would require public input on any surveillance technology purchased by Detroit. They hope to make Detroit a leader in opposing the use of racist technology rather than allowing surveillance of its own citizens to continue. Stories from places like Detroit indicate that we have a long way to go in making sure that AI is used and deployed equitably.

VentureBeat reports a recent step toward that goal by a team of researchers from Google's DeepMind and Oxford University. Their paper recommends that AI practitioners draw on decolonial theory to reform the industry, put ethical principles into practice, and avoid further algorithmic exploitation and oppression. The paper details how to build AI systems with colonialism in mind, and posits that power is at the heart of ethics debates.

In order to design AI systems to avoid perpetuating harms like racism, classism, and heteronormativity, the authors argue that we need to recognize power dynamics. VentureBeat highlights a quotation from the paper:

It states: "Power imbalances within the global AI governance discourse encompasses issues of data inequality and data infrastructure sovereignty, but also extends beyond this. We must contend with questions of who any AI regulatory norms and standards are protecting, who is empowered to project these norms, and the risks posed by a minority continuing to benefit from the centralization of power and capital through mechanisms of dispossession."

That one was a mouthful. Now, ever wish that your job training was delivered by Adam Driver instead of your manager? That's understandable, and it might just happen someday. Wired reports that as COVID restrictions make it more difficult to shoot videos, companies are turning to synthetic media instead.

Advertising giant WPP will send corporate training videos to tens of thousands of employees this month. In these videos, a presenter will speak in the recipient's language and address them by name, while explaining basic AI concepts.

WPP used the services of London startup Synthesia, one of a number of companies that specialize in creating synthetic video. Given the ethical issues surrounding deepfakes, Synthesia has posted ethics rules, says it vets customers and their video scripts, requires a person's formal consent before synthesizing their appearance, and won't touch political material. As synthetic media works its way into the corporate mainstream, we may be seeing many more stories like this one.

That's all for this week. Thanks so much for listening. If you enjoyed the podcast, be sure to rate and share. If you'd like to hear more news like this, please check out skynetoday.com, where you can find our weekly news digests with similar articles.