WINSTON-SALEM, N.C. — Experts are warning of a sophisticated, quite convincing tactic ahead of this year’s election cycle. Similar to face-swap apps, more complicated programs can take images of someone’s face and – with the help of a good impressionist or audio manipulator – make it appear as though that person is saying something they never did. These videos are best known as deepfakes.

“Deepfakes are very difficult to identify,” said Ananda Mitra, professor of communication at Wake Forest University. “They actually look like it might actually be you speaking.”

Through a simple Google search, you can find deepfakes showing anything from a scene from the movie Office Space recast with Keanu Reeves’ character from The Matrix, to impressionist Jim Meskimen imitating several well-known celebrities, to actor Jordan Peele using AI to create a fake news PSA featuring what appears to be former President Barack Obama.

“Most of us have so many pictures of us out there that those pictures can now be fed into certain forms of computer programs where it would appear like you’re saying something,” Mitra detailed.

Mitra says these videos are just the latest example of how misinformation can be created and circulated.

“I think it’s gonna be increasingly more difficult as we go towards the election cycle where more and more of this is gonna come,” he said. “No doubt.”

These false videos, stories or statements can come from individuals, institutions and other countries. Yet they don’t have to be as complex as a deepfake to influence a person’s thinking.

“If it sounds too good to be true, or on the other hand too bad to be true, there’s a reason to suspect it,” Mitra explained, adding that the best way to stop the spread of misinformation is to pause before you share something and first talk to friends and family about the validity of the story.

“You’re spreading it to people you know, and the people who are getting it actually trust you, and thus trust the story,” Mitra says. “You can’t stop the production of the thing, but you can certainly stop spreading it.”

Mitra also weighed in on President Donald Trump’s executive order aimed at curbing protections for social media companies.

“That becomes an incredibly slippery slope,” he said. “I mean, where do you draw the line? Can I not say something in my own home?”

There are several fact-checking websites which can help identify false information.

“There are well-maintained and well-intended websites which will catch a falsehood,” Mitra said, offering Snopes as an example.

While avoiding spreading false stories can help protect the people around you, there are also steps you can take to protect yourself.

“Google yourself and figure out what’s going on,” Mitra said. “You might find things about yourself and think, ‘Jeez, I never said that.’”

Last week, Twitter announced that it will test a new feature that will ask users to open a link before sharing it. Twitter support says the platform will only check if a user recently clicked the link through Twitter, not elsewhere on the internet. The company says when a user sees the prompt, they will always have the option to retweet.