
A “true crime” video about a Littleton man’s “secret gay love affair” with his murderous stepson is going viral. It’s fake.

A YouTube video that’s garnered nearly 2 million views and 8,000 comments details a salacious “true crime” saga in Littleton that “exposed a hidden world of secrets and lies shaking the town to its core,” according to the video’s description.

The only problem?

The “gripping true crime story” of a Littleton real estate agent murdered by his stepson after they had a “secret gay love affair” is, by all appearances, complete fiction. The 25-minute video was likely created with, and narrated by, a generative artificial intelligence program.

“I’m not surprised to see this kind of thing,” said Casey Fiesler, an associate professor at the University of Colorado Boulder who researches and teaches technology ethics, internet law and policy, and online communities.

Fiesler said she’s seen similar AI-generated content created to spread false conspiracies.

“True crime makes a ton of sense as a genre in the same way that conspiracy theories do because people watch this kind of content,” Fiesler said. “The motivation for it is money.”

The YouTube channel True Crime Case Files has been pumping out similar-looking and comparably themed “true crime” content for eight months, all relying on still photos rather than any actual news footage. On July 30, the channel posted a video about Richard Engelbert, a supposed real estate agent who was killed by his stepson, Harrison Engelbert, in a “grisly murder” in 2014.

According to the video, Harrison Engelbert was convicted of second-degree murder after a police investigation and jury trial, and sentenced to 25 years in prison without the possibility of parole. The community, the video says, was left “deeply shaken.”

Yet there’s no evidence any of this happened. The video’s narration says the case was subject to local and national media attention, but Google searches yield no coverage from 2014. Local law enforcement officials say they’ve found no records corresponding with the purported case. And the Colorado Department of Corrections lists no inmate named Harrison Engelbert.

Eric Ross, the media relations director for the 18th Judicial District Attorney’s Office, said the story appeared to be fabricated because none of the names popped up in a search of Colorado’s court records.

Sgt. Krista Schmit with the Littleton Police Department said the agency never investigated the crime described in the video, and that none of the people named in the segment have had any contact with the department.

Yet comment after comment under the video features viewers expressing shock, outrage and disgust over the heinous nature of this bogus crime.

“The way that people believe something just because they see it on the internet is going to be increasingly a problem,” CU’s Fiesler said.

Fiesler estimated this one video could have easily generated tens of thousands of dollars for the creator, whose contact information is not listed.

When she watched the video, the AI red flags jumped out at her immediately.

The voice narration was off, she said. The photos of the people used in the video — all studio-like portraits — have an “uncanny valley” appearance, she said. Google’s reverse image searches of the photographs don’t turn up real results.

The facts in the video don’t add up, either.

For example, the stilted narration initially says the alleged murder happened on Bleak Street in Littleton — a street that does not exist in the town — but later refers to it as being on Oak Street. The narrator constantly changes the pronunciation of “Engelbert.” The video also says Richard Engelbert’s wife, Wendy Engelbert, was a school principal who had to drop out of her campaign to be “school superintendent,” which is not an elected position in Colorado.

The other names used in the video don’t come up in Google searches: District Attorney Laura Mitchell, Detective James Cattle, neighbor Luby Johnson-Guntz.

What Google searches do reveal are dozens of knock-off YouTube videos, TikToks and webpages summarizing the original fake video.

Creating misinformation is not new, Fiesler said, but generative AI levels the playing field, making it easier for more people to make and spread false content.

“The thing that generative AI has done is sort of democratize these types of bad actors in the sense that the more people who are able to create this kind of content, the more of it we will see,” Fiesler said.


Originally Published: August 27, 2024 at 10:32 a.m.
