AI Facial Recognition Error Jails Innocent Tennessee Grandmother for Months in North Dakota Bank Fraud Case
In a chilling example of the real-world consequences of artificial intelligence gone wrong, a Tennessee grandmother spent nearly six months behind bars after being misidentified by facial recognition software in a North Dakota bank fraud investigation. The case of Angela Lipps, 50, highlights growing concerns about AI accuracy, racial bias, and the devastating impact of technological errors on innocent lives.
A Life Turned Upside Down
Angela Lipps, a mother of three and grandmother of five, had never been on an airplane until authorities flew her to North Dakota to face charges in a case she says she had nothing to do with. Living most of her life in north-central Tennessee, Lipps found herself at the center of a nightmare that began when US marshals arrested her at her home while she was babysitting four children.
“I’ve never been to North Dakota, I don’t know anyone from North Dakota,” Lipps told WDAY News, her voice still carrying the weight of disbelief months later.
The arrest came in July when Lipps was taken away at gunpoint and booked into a county jail as a fugitive from justice from North Dakota. She remained there for nearly four months without bail while awaiting extradition, charged with four counts of unauthorized use of personal identifying information and four counts of theft.
The AI Mistake That Changed Everything
According to Fargo police records obtained by WDAY News, detectives investigating bank fraud cases in April and May 2025 reviewed surveillance video of a woman using a fake US Army ID to withdraw tens of thousands of dollars. The officers allegedly used facial recognition software to identify the suspect as Lipps.
A detective reportedly wrote in court documents that Lipps appeared to match the suspect based on facial features, body type, and hairstyle. However, Lipps maintains she had never been to North Dakota and did not commit the crimes.
Her attorney, Jay Greenwood, expressed grave concerns about the investigation’s methodology: “If the only thing you have is facial recognition, I might want to dig a little deeper.”
The Cost of a Technological Error
The consequences of this AI misidentification were devastating. Lipps spent 108 days in jail before authorities in North Dakota transported her from Tennessee at the end of October. She appeared in a North Dakota courtroom the next day, her life already shattered by the experience.
While jailed and unable to pay bills, Lipps lost her home, her car, and her dog. The emotional toll was equally severe, as she spent months away from her family, including her five grandchildren, for a crime she didn’t commit.
The case finally began to unravel when Greenwood obtained Lipps’s bank records and presented them to investigators. The records showed that Lipps was more than 1,200 miles away in Tennessee at the time investigators said the fraud occurred in Fargo.
Abandoned After Exoneration
Even after being released on Christmas Eve, Lipps’s ordeal wasn’t over. Fargo police did not pay for her trip home, leaving her stranded in North Dakota. Local defense attorneys helped cover a hotel room and food on Christmas Eve and Christmas Day, and a local non-profit, the F5 Project, was able to help her return to Tennessee.
To this day, Lipps says no one from the Fargo police department has apologized for the error that upended her life.
A Growing Pattern of AI Failures
Lipps’s case is far from isolated. In October, an AI gun-detection system apparently mistook a Baltimore high school student’s bag of Doritos for a firearm and alerted local police that the student was armed. Taki Allen was sitting with friends outside Kenwood High School in Baltimore when police officers with guns approached him, made him get on his knees, and handcuffed and searched him – finding nothing.
Earlier this year, police arrested a man in the UK for a burglary in a city he had never visited after face-scanning software confused him with another person of south Asian heritage. Authorities had used automated facial recognition software which matched him with footage of a suspect in a £3,000 burglary 100 miles away.
The Broader Implications
These cases raise serious questions about the reliability of AI systems, particularly in high-stakes situations like criminal investigations. Facial recognition technology has been shown to have higher error rates for people of color and women, leading to concerns about racial bias and discrimination.
The technology’s defenders argue that it’s a valuable tool for law enforcement when used correctly and in conjunction with other investigative methods. However, critics point to cases like Lipps’s as evidence that the technology is not yet reliable enough for use in criminal justice.
Moving Forward
As AI technology becomes increasingly integrated into our daily lives and critical systems, the need for robust oversight, testing, and accountability becomes more urgent. The case of Angela Lipps serves as a stark reminder that behind every technological failure are real human lives being affected.
For Lipps, the journey to rebuild her life continues. She’s working to recover from the financial devastation of losing her home and car, reconnecting with her family, and processing the trauma of being wrongfully accused and imprisoned.
Her story stands as a cautionary tale about the dangers of over-reliance on technology without proper safeguards and the human cost when AI systems fail.
Tags: AI error, facial recognition failure, wrongful arrest, grandmother jailed, North Dakota fraud, Tennessee woman, mistaken identity, AI bias, criminal justice technology, technological error, facial recognition bias, AI consequences, wrongful imprisonment, bank fraud investigation, AI reliability