Former NPR Host Accuses Google Of Copying His Voice For AI Offering
Google Accused of Cloning Podcaster David Greene’s Voice for AI Tool NotebookLM
In a case that’s sending shockwaves through the tech and media industries, veteran NPR podcaster David Greene is taking legal action against Google, alleging the tech giant used his distinctive voice without permission to create one of the artificial intelligence voices in its popular research and note-taking platform, NotebookLM.
The controversy centers on NotebookLM’s Audio Overviews feature, which launched in mid-2024 and allows users to transform their notes and documents into brief podcast-style episodes. The AI-generated content typically features two co-hosts—one male and one female—engaging in surprisingly natural-sounding conversations about the source material.
The Voice That Launched a Thousand Podcasts
Greene, who co-hosted NPR’s award-winning Morning Edition for nearly a decade and currently hosts KCRW’s political roundtable Left, Right & Center, says he first learned about the alleged voice cloning from colleagues who noticed striking similarities between his delivery and the male AI co-host’s speech patterns.
“I was stunned,” Greene reportedly told associates. “This wasn’t just a generic voice—it had my cadence, my pauses, my distinctive way of emphasizing certain words. It was like listening to myself, but saying things I’d never say.”
The lawsuit, filed in Santa Clara County Superior Court in California, claims Google “sought to replicate Mr. Greene’s distinctive voice—a voice made iconic over decades of decorated radio and public commentary—to create synthetic audio products that mimic his delivery, cadence, and persona” without his consent, compensation, or even notification.
Industry Praise Turned Personal Nightmare
When NotebookLM’s Audio Overviews debuted, the tech press was effusive in its praise. Forbes described the feature as “eerily human,” while WIRED noted that the virtual podcasters’ use of filler words and natural phrasing made the product “stand out” in a crowded field of AI voice tools.
Google itself has touted NotebookLM as one of its “breakout AI successes,” highlighting how the tool has transformed the way users interact with their research materials. The lawsuit contends that Google “misappropriated a beloved public radio and podcast host’s career, identity, and livelihood as raw material for a tech company’s bottom line without any compensation.”
The Forensic Evidence
Determined to prove his case, Greene commissioned an AI forensic analysis firm to examine the voice in question. The results were striking: the tests returned a confidence score of 53% to 60% that the voice was Greene’s, with anything above 50% considered “relatively high” confidence in the industry.
“The CEO of the forensic company eventually concluded that it was their confident opinion that the Google Podcast model was trained on David Greene’s voice,” the lawsuit states.
While Greene hasn’t publicly disclosed which specific episodes or segments he believes contain his cloned voice, the lawsuit suggests multiple instances across NotebookLM’s Audio Overviews feature, implying that Google may have used extensive recordings of Greene’s work to train its AI model.
Google’s Swift Denial
Google has categorically denied the allegations. Company spokesperson José Castañeda told Gizmodo, “These allegations are baseless. The sound of the male voice in NotebookLM’s Audio Overviews is based on a paid professional actor Google hired.”
The spokesperson declined to name the actor or provide details about the hiring process, citing privacy concerns. This response has done little to quell speculation about whether Google may have used publicly available recordings as supplementary training data, a practice that exists in a legal gray area.
The Broader AI Ethics Crisis
Greene’s lawsuit arrives amid growing tensions over intellectual property rights in the age of artificial intelligence. The case highlights a fundamental question facing the creative industries: when AI systems can replicate human creativity with increasing fidelity, who owns the resulting output?
The issue extends far beyond voice cloning. Similar controversies have erupted over AI-generated artwork, music, and writing. In 2024, Disney issued cease-and-desist orders to multiple platforms hosting AI-generated content featuring its characters, while The New York Times sued OpenAI for allegedly using millions of its articles to train AI models without permission.
The Scarlett Johansson Precedent
Greene’s case bears striking similarities to the high-profile dispute between OpenAI and actress Scarlett Johansson in 2024. Johansson accused the AI company of using a voice that closely resembled her own for ChatGPT’s voice interface, despite her having declined the company’s request to license her voice.
The controversy was particularly notable because Johansson had famously voiced an AI character in the 2013 film “Her,” making the replication feel especially pointed. OpenAI eventually paused the voice feature, though it maintained it had not intentionally copied Johansson’s voice.
These incidents have sparked calls for stronger regulations around AI voice and likeness rights. Currently, only a handful of U.S. states have laws specifically addressing the unauthorized commercial use of someone’s voice or image, leaving many creators vulnerable to exploitation.
The Economic Stakes
For Greene, the stakes extend beyond personal recognition. As a working journalist and podcaster, his voice is not just an artistic instrument but a professional asset that has helped build his career over decades. The lawsuit argues that by cloning his voice without permission, Google has potentially devalued his unique vocal identity and created unfair competition.
“If an AI can perfectly replicate my voice and use it to create content I would never produce, what does that mean for my future work?” Greene’s legal team asks in the complaint. “It’s not just about money — it’s about control over one’s professional identity.”
The Technical Challenge
The case also raises complex technical questions about how AI voice models are trained and deployed. Voice cloning technology has advanced rapidly, with modern systems capable of replicating not just the sound of a voice but its emotional inflections, pacing, and distinctive quirks after analyzing relatively small sample sets.
Industry experts note that even unintentional similarities can emerge when AI models are trained on large datasets of publicly available audio. However, the forensic analysis in Greene’s case suggests something more deliberate may have occurred.
What’s at Stake for Google
For Google, the lawsuit represents more than just a potential financial liability. NotebookLM has been positioned as a flagship AI product, demonstrating the company’s ability to create practical, consumer-friendly applications of its artificial intelligence technology. A finding that Google violated Greene’s rights could damage the product’s reputation and raise questions about the company’s AI development practices more broadly.
The case could also influence how tech companies approach voice licensing and attribution in future AI products. Some industry observers predict that if Greene prevails, it could lead to a new market for voice licensing, similar to how music licensing evolved with digital distribution.
The Road Ahead
Legal experts watching the case say it could take years to resolve, with potential appeals extending the timeline even further. In the meantime, Google shows no signs of discontinuing NotebookLM or its Audio Overviews feature.
For Greene, the lawsuit represents a stand not just for himself but for all creators whose work might be used to train AI systems without their knowledge or consent. “This isn’t about stopping progress,” his legal team stated. “It’s about ensuring that progress happens ethically, with respect for the people whose creativity makes it possible.”
As AI continues to blur the lines between human and machine-generated content, cases like Greene’s may help establish the legal and ethical frameworks that will govern this new creative landscape for decades to come.
Tags: #Google #AIethics #VoiceCloning #NotebookLM #DavidGreene #NPR #IntellectualProperty #ArtificialIntelligence #TechLawsuit #AIvoices #CreativeRights #DigitalEthics #Podcast #Technology #LegalBattle #AIcontroversy #VoiceSynthesis #TechNews #MediaRights