What is the 'AI homeless man prank'? Police say it's dangerous

BIG RAPIDS, Mich. (WOOD) – An AI-driven TikTok trend is prompting 911 calls from panicked people who think a man has broken into their homes.

The prank uses artificial intelligence to create a picture or video of a "homeless man" entering a person's home, going through their fridge, or lying in their bed. The prankster sends the fake image or video to a loved one, who believes the convincing footage is real.

Police departments in at least four states have received calls for reported home intrusions only to find out the "intruder" was an AI-generated person, The New York Times reports.

The West Bloomfield Police Department, near Detroit, Michigan, said it has received reports of people being fooled by the videos. The department warns that the "AI homeless man prank" wastes emergency responders' resources.

"Here’s the problem: officers are responding FAST using lights-and-sirens to what sounds like a call of a real intruder — and only getting called off once everyone realizes it was a joke," said New York's Yonkers Police Department in a Facebook post. "That’s not just a waste of resources… it’s a real safety risk for officers who are responding and for the family members who are home if our officers get there before the prank is revealed and rush into the home to apprehend this 'intruder' that doesn't exist."

“It’s frustratingly easy to do,” said Greg Gogolin, a professor and the director of cybersecurity and data science at Ferris State University. He created a program in a couple of hours to show how AI technology can manipulate images.

“This is a natural language processing, machine learning program called face swapping,” Gogolin said.

The program made the images look realistic by taking features from one person’s face and combining them with other images.
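For readers curious about the basic mechanics, the sketch below is not Gogolin's program; it is a rough, simplified illustration of the face-swap idea using the open-source OpenCV library: detect a face in one photo, crop it, and blend it into another photo. Modern AI face-swapping tools rely on learned models and are far more convincing, but the underlying workflow of lifting features from one image and merging them into another is similar.

```python
# A minimal, hypothetical face-swap sketch using OpenCV (not the professor's tool).
# It finds a face in a source photo, resizes it to fit a face in a target photo,
# and blends it in with seamless cloning so edges and lighting roughly match.
import cv2
import numpy as np

def naive_face_swap(source_path: str, target_path: str, out_path: str) -> None:
    source = cv2.imread(source_path)
    target = cv2.imread(target_path)

    # Detect faces with the Haar cascade that ships with the opencv-python package.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    src_faces = cascade.detectMultiScale(cv2.cvtColor(source, cv2.COLOR_BGR2GRAY), 1.1, 5)
    dst_faces = cascade.detectMultiScale(cv2.cvtColor(target, cv2.COLOR_BGR2GRAY), 1.1, 5)
    if len(src_faces) == 0 or len(dst_faces) == 0:
        raise RuntimeError("No face found in one of the images")

    sx, sy, sw, sh = src_faces[0]   # face box in the source photo
    dx, dy, dw, dh = dst_faces[0]   # face box in the target photo

    # Resize the source face patch to the target face's dimensions.
    face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (dw, dh))

    # Blend the patch into the target image around the target face's center.
    mask = 255 * np.ones(face_patch.shape, face_patch.dtype)
    center = (int(dx + dw // 2), int(dy + dh // 2))
    result = cv2.seamlessClone(face_patch, target, mask, center, cv2.NORMAL_CLONE)

    cv2.imwrite(out_path, result)
```

Even this crude cut-and-blend approach shows why the results can fool someone glancing at a phone screen; the AI-driven versions circulating on TikTok go much further, generating whole scenes rather than pasting one face onto another.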

Once a technology like this is developed, it often gets used in ways the original creators never intended.  

“They share that out or sell it. … It’s dispersed and that’s where the real danger is because people without any technical background can then utilize that the way they wish,” Gogolin said.

In some cases, there are telltale signs that an image is AI-generated.

“You might generate something and an arm will be off, the elbows are in the wrong place. It used to be you would often see people with like three arms. A long arm, a long leg, the dynamics were not correct. A lot of that has been corrected or at least drastically improved with the newer versions,” Gogolin said.

Gogolin said investigators and law enforcement also need more advanced training. "There are very few degreed investigators that have a cyber security background, let alone a computer science background particularly at the local level, even at the state level."