At face value, adding color and quality to old videos seems like a win-win situation. Viewers love watching historical footage without the scraggly camera view and greyscale tones, and creators love finding inventive ways to make the changes look more realistic.
But some people — especially historians and scholars — say the technique is inherently problematic. Wired reports that experts worry upscaling and colorization could do more harm than good for viewers. The problem stems from the common misunderstanding that removing imperfections shows us what the media would have looked like had it been captured in 2020.
“The problem with colorization is it leads people to just think about photographs as a kind of complicated window onto the past, and that’s not what photographs are,” says Emily Mark-FitzGerald, a professor at University College Dublin’s School of Art History and Cultural Policy.
The enhancement process used by companies like Poland-based Neural Love relies on complex neural networks to create images and videos that look strikingly like reality. And therein lies the issue: they’re so convincing that they become reality for many.
Not a new issue, but a more pressing one — Restoration of old photos is nothing new. Colorization in particular has been around for a very long time: by hand since the early 20th century, and digitally since the 1970s.
The advent of more complex algorithms and better machine learning techniques has made the restoration process far more realistic-looking. Now you can take a piece of film from 1903 and add enough frames and color to make it look like it was shot last week. It’s never been more difficult to pick out the true source material.
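To make the idea of “adding frames” concrete, here is a deliberately simple sketch of frame interpolation: generating in-between frames by linearly blending two adjacent ones. This is a toy illustration only — modern enhancement tools like the ones described above use neural networks (often guided by optical flow) rather than plain averaging, and the function name here is hypothetical.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Generate n_new in-between frames via linear blending.

    A toy stand-in for the neural frame interpolation used by
    real restoration tools; it simply cross-fades pixel values.
    """
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)  # blend weight slides from frame_a toward frame_b
        blended = (1 - t) * frame_a.astype(float) + t * frame_b.astype(float)
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Two tiny 2x2 grayscale "frames": black, then brightness 90
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 90, dtype=np.uint8)
mid = interpolate_frames(a, b, 2)  # two new frames at t = 1/3 and t = 2/3
```

Doubling or quadrupling the frame count this way smooths motion, which is part of why enhanced footage reads as “shot last week” — even though every inserted frame is synthesized, not recorded.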
You can’t control what people perceive — Researchers at Neural Love say they’re upfront with clients about the line between restoration and artificiality.
“We consider our work to be an adaptation of the original, similar to a modern take on Shakespeare,” says Neural Love’s Denis Shiryaev. “Our work seeks to transform access to and awareness of the originals, not pose challenges to their authenticity or artistic merit.”
What Shiryaev is missing, though, is the factor of outside perception. Once an image or video has been posted to the internet, viewers are going to believe what they want to believe. Neural Love’s intentions are good — but the company can’t control how people understand its enhanced images.
Mark-FitzGerald says she’s already had students submit falsely colorized images without realizing it. The artificial image takes on a life of its own once it’s released — often overshadowing the original.
More to discuss — There’s a larger discussion at play here, about how we can validate the truth of a digital image. It’s the same ongoing conversation we’re having around deepfakes, those tricky pieces of misinformation pretending to be real media.
Right now the best tool we have for making sense of manipulated media is to practice media literacy — both in our own lives and in our interactions on the internet.