This is apparently the next big debate brewing: whether AI should be used in film restoration, usually one of the major steps in re-issuing a movie (in theaters or on home video).
To be clear, there is a big difference between generative AI and the kind of AI that is being used to restore films digitally.
Here's an article from the NYT for those who may be interested in reading it.
A.I. Made These Movies Sharper. Critics Say It Ruined Them.
Machine-learning technologies are being used in film restoration for new home video releases. But some viewers strongly dislike the results.
By Calum Marsh
Published April 13, 2024
In 1998, Geoff Burdick, an executive at James Cameron’s Lightstorm Entertainment, was hunched in front of a 12-inch monitor at a postproduction house, carefully preparing “Titanic” for release on LaserDisc and VHS. A state-of-the-art computer process had made it possible for Burdick and his team to scour the film frame by frame, removing tiny imperfections embedded in the original negative: little scratches, flakes of dirt, even water stains that smeared the image. The computer could erase these blemishes using a kind of copy-paste tool, concealing the defects with information from another frame.
Burdick, now a senior vice president at the company, told me that this process “seemed like freaking magic at the time.” And yet the results were not entirely well-received. “There were a lot of people who said that this was the most beautiful VHS they’d ever seen in their life, because we’d gotten rid of all that gobbledygook,” he recalled. “But there were a lot of folks who said, ‘This is not right! You’ve removed all of this stuff! If the negative is scratched, then we should see that scratch.’ People were really hard-core about it.”
In the decades since, home video formats have reached higher and higher resolutions, with VHS and LaserDisc giving way to DVD and Blu-ray, and eventually to ultra high-definition 4K discs, known as Ultra HD Blu-rays. As the picture quality has improved, restoration tools have evolved with them, making it easier than ever for filmmakers to fine-tune their work using computers. Several of Cameron’s films, including “The Abyss,” “True Lies” and “Aliens,” were recently released on Ultra HD Blu-ray in newly restored versions that are clearer and sharper than ever before — the product of painstaking attention from Lightstorm and Cameron himself. “I think they look the best they’ve ever looked,” Burdick said.
But as with the old “Titanic” home video, these restorations have proved controversial, with many viewers objecting strenuously to their pristine new look. What has caught the particular ire of critics is the fact that these versions have been restored, in part, using artificial intelligence. Park Road Post Production, the New Zealand company owned by the filmmaker Peter Jackson, helped clean up Cameron’s films using some of the same proprietary machine-learning software used on Jackson’s documentaries “The Beatles: Get Back” and “They Shall Not Grow Old.” The images in Cameron’s classic blockbusters were refined in a way that many felt looked strange and unnatural.
The level of detail is eye-popping. Water looks crystalline; colors are bright and vivid, while blacks are deep and inky. Some surfaces, however, do look a little glossy, with a buffed sheen that appears almost lacquered. It can be hard to pinpoint what is changed. But there does seem to be a difference, and depending on the viewer, it can feel slightly uncanny.
“It just looks weird, in ways that I have difficulty describing,” the journalist Chris Person said of these releases. “It’s plasticine, smooth, embossed at the edges. Skin texture doesn’t look correct. It all looks a little unreal.”
Person is among a number of viewers who are skeptical of the need to use A.I. to “enhance” the appearance of films that seemed to look fine to begin with. Although he said that there were “legitimate use cases” for A.I. in restoration, such as when a film’s original negative has been lost or badly damaged, he suspected that with something like “True Lies,” they were “using it just because they can.”
The recent Cameron releases, and particularly “True Lies,” have become the subject of intense scrutiny and fervent debate online. Home video reviewers have described it as an overly sanitized presentation, with one faulting its “routinely odd-looking images” and another arguing that it appears “almost artificial.” Web forums are teeming with complaints, often vicious, while social media posts criticizing it have spread widely.
Dan Best, the general manager at Park Road Post, acknowledged the debate around remastering movies. “The thing is, technology is changing,” he added. “People are viewing things at a lot higher resolutions at the moment. Therefore, a lot of recent films are being enhanced for these new viewing platforms.” Traditional home video releases were adequate for the days of tube TVs and 1080p video, in other words. But in the era of OLED screens and 4K smart TVs, restorations need a little more to meet increasingly high standards.
Burdick, who has been dealing with this kind of criticism since the “Titanic” days, seemed resigned to the fact that “you can’t please everybody at the end of the day,” though he accepted that the response to these Ultra HD Blu-rays was especially heated. The dissenters, he argued, were mainly just disappointed that “Aliens,” “True Lies” and “The Abyss” no longer look like they did in the VHS or DVD eras.
“People love these movies, which I think is great,” he said. “And they take that love to heart. So when the movie suddenly doesn’t look like they remember it looking, or the way they think they remember it looking, or it just doesn’t look the way they think it should, they get upset. What can you do?”
It doesn’t help that there is a stigma around the technology: Dissenters not only bristle at the appearance of these restorations — they are also unhappy that it is A.I. being used to make them appear that way.
But, Burdick said, that disapproval is based partly on misconception: “People hear, ‘Oh, they’re using A.I.,’ and they’re thinking about pirate ships and the cup of coffee,” — a reference to a recent viral video of a miniature ship sailing in a coffee mug, all generated with A.I. — “and they’re like, ‘What are you doing to it?’ But nobody is doing that to these movies,” he explained. “It’s not the same A.I., conceptually. It’s more like, this piece of negative looks kind of cruddy, and we can use some software to improve it, carefully.”
Best, at Park Road, said that this kind of A.I. upscaling was “definitely not the same” as the kind of generative A.I. used in apps like Midjourney or ChatGPT. Generative A.I. is a type of machine learning model that creates information, including images and videos, from users’ prompts. A.I. upscaling is subtler and less intrusive, using machine learning to refine an image without inventing new material from scratch. Generative A.I. could, say, add more aliens to “Aliens.” A.I. upscaling just adds more pixels, polishing the pre-existing images.
Eric Yang, the founder of the A.I. upscaling company Topaz Labs, said that one of the main differences between A.I. upscaling and generative A.I. was fidelity to the original source: With upscaling, “the enhancement that you get does not measurably change the meaning or the content of the image.” Nevertheless, he said that misunderstandings about the technology have given the whole enterprise a certain ignominy.
“People try not to talk about it,” he said. “Nobody likes to say that their film was A.I. upscaled or that a certain release had A.I. applied to it.”
The reluctance to admit to using A.I. is understandable given some recent controversies. In 2021, the filmmaker Morgan Neville came under fire when it was revealed that his documentary “Roadrunner” used A.I. software to create a deepfaked version of Anthony Bourdain’s voice for narration; last month, the horror film “Late Night with the Devil” was criticized for using A.I.-generated imagery, with some critics going so far as to call for the film’s boycott.
Although “Get Back” and “They Shall Not Grow Old,” which involved footage from World War I, made extensive use of the same A.I. processes, they did not receive as much criticism. That’s partly because of the condition of the source material: Both films took damaged archival images and appeared to reverse the deterioration, and in one case, to also colorize it. By contrast, the recent Cameron restorations were based on new 4K scans of the original negative, none of which needed extensive repair of that kind.
“It’s not a question of the negative being damaged,” Burdick conceded. “But back on the set, maybe you picked the shot that had the most spectacular performance, but the focus puller was a bit off, so it’s a bit soft. There could be a million reasons why it’s not perfect. So now there’s an opportunity to just go in and improve it.” The A.I. can artificially refocus an out-of-focus image, as well as make other creative tweaks. “You don’t want to crank the knob all the way because then it’ll look like garbage,” Burdick said. “But if we can make it look a little better, we might as well.”
For viewers like Person, the problem is what those minor enhancements entail: That uncanny smoothness, though perhaps more in focus, can look oddly fake. “I don’t want to sound anal, but it really is egregious,” Person said. “It’s the same thing as TV motion smoothing — they say it’s better, so you feel like you’re the one person cursed with vision who can see that it looks bad.”