Culture

This journalist isn't real. His destructive stories are.

A deepfake persona has been publishing articles in the Jerusalem Post and the Times of Israel that bash academics.


When Oliver Taylor accused London-based academic Mazen Masri and his wife Ryvka Barnard of being “known terrorist sympathizers,” they were understandably taken aback. What they didn’t realize at the time is that Oliver Taylor is not a real person.

According to an investigative report by Reuters, Oliver Taylor is a deepfake — an entirely fabricated persona, with a computer-generated profile photo and a full backstory to boot. “Taylor” even has published bylines in noted media outlets like the Jerusalem Post and the Times of Israel.

Reuters spoke to six experts about Taylor, all of whom agreed that the only known image of him is definitely a deepfake.

“The distortion and inconsistencies in the background are a tell-tale sign of a synthesized image, as are a few glitches around his neck and collar,” said digital image forensics pioneer Hany Farid.

But identifying Taylor as fiction isn’t the difficult part; it’s figuring out who’s behind the deepfake that’s proving nearly impossible. Though the case of Oliver Taylor is fairly tame, it’s a dark glimpse of the mayhem deepfake technology could cause in the future.

Very unremarkable, very fake — If you didn’t know Oliver Taylor was a fictitious creation, you’d assume him to be just another average 20-something with brown hair and a bit of stubble — meant, most likely, to be quite forgettable.

Most of Oliver Taylor’s internet presence has now been wiped from existence, save for a few random blog posts. Taylor still has a profile on question-and-answer site Quora, which reads:

24-year-old student, barista, and political activist. During the week I work on completing my masters in political science, and come the weekend you will find me engrossed in all things coffee. Whether I am brewing it or sipping it.

He supposedly attended the University of Birmingham in England, though, of course, the school has no record of him.

Nothing’s real any more — Deepfaking is very much the next evolution of catfishing, the practice of using another person’s photos to deceive or scam people online. Except that with deepfakes there’s no real person whose photos are being used — the persona’s photos are computer-generated.

And the most pressing problem with deepfakes is that they’re essentially untraceable. One expert interviewed by Reuters said investigators chasing the origin of such photos are left “searching for a needle in a haystack – except the needle doesn’t exist.”

Well, this is concerning — The spread of misinformation is rapidly becoming one of the internet’s most pressing issues, with every social media network on high alert as it tries to separate genuine posts from conspiracy theories. And as misinformation becomes more prevalent, deepfakes are proving an easy way to peddle it — a perfect cover for spreading claims without fear of being identified.

Oliver Taylor’s slander did only minimal damage. As deepfake technology continues to improve, though, it will only get easier for the average person to create and hide behind a deepfake persona. And those personas could cause much more trouble than false accusations.

Even the best deepfake detectors only work about 65 percent of the time. The case of Oliver Taylor is a reminder that we need to stay vigilant about deepfakes, and that we’ll need far more accurate detection methods if we hope to mitigate their damage in the future.