Friday, June 2, 2023

AI decision-making in Hollywood is already here, now what?


Every maker on Earth now feels the guiding hand of AI.

On social media, TikTokers are rewarded with huge views for matching content to an algorithm carefully designed to trigger dopamine release. In Hollywood, producers are rewarded with lucrative movie deals for developing projects that feed the black box AI at studios and streaming platforms, which keep valuable viewer data insights to themselves. That viewer data is built through feedback loops created by recommendation engines, reinforced by the very viewer behavior they shape in the first place. This is value creation increasingly usurped by machines, and between TikTok and the streaming platforms, the precious space that enables human-centered innovation is shrinking. TikTokification is spreading.

The Writers Guild rightly urges protection against AI, but nowhere is this protection more urgent than in the documentary and non-fiction world, where I have worked as both a producer and a writer.

There’s a lot of money, and many creative careers, at stake. But the biggest threat these machines pose to the wider culture isn’t the bottom-up AI-generated art that populates social media (think: Wes Anderson directs Star Wars). It’s the top-down AI-powered platforming of art that we’re already seeing across the media landscape, with algorithms deciding on a global scale which stories get told and how, and it’s especially insidious in the realm of non-fiction.

“The danger is not so much in the use of AI in documentary making, in the actual production, but more in the curation of it,” said Amit Dey, EVP of nonfiction at MRC, which has untitled documentaries about Sly Stone and Rudy Giuliani in the works. “It’s one thing if human-made movies compete in the market with robot-made movies. It’s quite another when data in the form of artificial intelligence or proprietary algorithms drives decisions about what the human public is exposed to. In other words, what is being bought and when. What is being platformed and where. What stories are being told.”

Media veteran and producer Evan Shapiro, who just headlined MIPTV, says outsourcing accountability is a time-honored tradition in Hollywood. “From dial testing to focus grouping to ‘my kids didn’t like it,’ a certain breed of TV executive has long relinquished greenlighting decisions to a series of third-party safety nets that protect against actually making the choices themselves,” Shapiro says. “These devices allow executives to take credit when shows work, and easily abdicate when they don’t… AI is just the latest excuse craze.”

And yet, AI is already hard at work at every level of filmmaking.

At XTR, which produced the Magic Johnson doc They Call Me Magic for Apple TV+, CEO Bryn Mooser has built a proprietary algorithm called “Rachel” to guide the company’s development process. He calls it a “zeitgeist machine” that combs through social media to see what’s trending and then focuses development around those cues.

“It bothers me a lot,” says Mooser. “Then came ChatGPT and the world changed overnight. We’ve always seen it as a tool, and as a tool it’s incredibly useful. Which conversations are trending. What people are talking about. We built it so that we could overlay that with historical data in the documentary world. What works, what doesn’t. Its application as a tool to enhance what filmmakers can do is incredibly powerful and important. And I hope it is embraced.”

It’s also true that human executives still make the final green-light decisions on these platforms. But as the wealth and power of AI-generated data insights grow, insights proven to drive viewer engagement, for better and for worse, the willingness of executives to die on the hill of their own (human) opinion fades. Why take risks on fresh concepts when, according to the data, the true crime genre, for example, is a guaranteed hit factory? It is human nature, especially in this job market, for an executive to cover themselves. I don’t blame anyone. But in Hollywood’s rampant CYA culture, now AI-powered, executives can hide.

And is viewer engagement on most of these platforms, absent clever (human) executive intervention to challenge our baser instincts as viewers to relentlessly tap puppy videos, even that great a metric? Maybe for TikTok. From a more sophisticated aesthetic point of view, the unchecked race to maximize viewer engagement is a race to the bottom. Worse, from a journalistic ethics point of view, in the realm of non-fiction it is a race to ignorance and delusion.

In 2021, filmmaker Morgan Neville famously used AI to recreate Anthony Bourdain’s voice in the documentary Roadrunner, and the move received backlash. For his part, Neville took only real quotes from actual print interviews of Bourdain and used the deepfake technology “to make them come to life.” And last year’s Netflix docuseries The Andy Warhol Diaries waded into similar territory in recreating Warhol’s voice for narration. That kind of controversy will feel much less incendiary in 2023, now that AI technology has advanced to make completely fake audio, video or photos appear lifelike.

There’s a lot to be said about the moving goalposts of ethics within the documentary profession these days, with or without the use of AI as a tool in filmmaking. The more sinister force at play, and the one leading to what could be considered widespread ethical breaches, is the surrender of human curation to algorithms and the exploitation of data to decide which projects to purchase and even how to shape them on an act-by-act basis. Yes, there were focus groups and dial tests in the past. Yes, there was Nielsen data. But the processes behind those insights were transparent. There was human accountability. As the industry turns more of these decisions over to black box AI, the technology is no longer a tool to streamline development and maximize profits; it becomes the decision maker itself.

And I don’t think we even need a Black Mirror episode to outline the horror of this scenario, especially in documentary.

Non-fiction storytelling shapes our understanding of the real world. For this reason, preserving human curation in documentary is more urgent than in other genres. Hollywood has always tried to balance commerce and artistic expression, which has historically allowed it to forge its own brand of art for the masses. But now, more than ever before, the world’s relationship to reality is at stake. The disinformation plague is already rampant on social media, and curation algorithms are largely to blame.

Furthermore, to fulfill its commitment to truth, nonfiction requires trust from its audience, trust rooted in transparency and integrity, and only end-to-end human control can build it.

Take deepfake technology, for example: if the viewer cannot trust the veracity of the images they see or the audio they hear, the film loses its power. Unlike narrative film or television, a documentary falls apart when audiences cannot trust its integrity as a work of non-fiction. “Joe Hunting. The Ross brothers. Jessica Beshir. These are filmmakers making changes with their artistry,” adds Mooser. “It will be a long time before AI can compete.”

When it comes to accountability, the same can be said for the executive level of non-fiction, the editorial role played by the human decision maker (and increasingly shaped by AI-generated data insights). With a human at the helm, audiences can question a studio or platform’s motives for green-lighting a movie, whether commercial, political, or both. But audiences can’t question the motives of an algorithm that shows them something because it believes the content is “popular.”

For Josh Braun, co-president of documentary behemoth Submarine, there’s a deep-seated appetite for breaking the rules that defines us as human beings, and it manifests itself in a never-ending desire for the fresh and new. “This is the savior of the possible nightmare scenario. Any way you slice it, people have visceral reactions to things. This pushes the most interesting documentaries back toward the distribution companies. The Neons. The A24s. The Magnolias. The IFCs. These are the places where we see deals,” says Braun.

And the indie market could be the stronghold. “The more esoteric titles that people want will rejuvenate the theater market,” adds Braun. “You don’t get the same level of choice on the algorithm-driven platforms.”

As the industry integrates AI into every aspect of the business, technology must remain a tool, not a substitute for human judgment and accountability. That’s what the Writers Guild is currently striving for in its deadlock with producers and studios. And protecting the integrity of non-fiction stories is paramount, as it is one of the few remaining realms where truth and faith in a shared understanding of the world are sacred. Ultimately, anything done under the influence of AI should be done with a strong moral compass, guided by the principles of honesty, transparency and respect for the dignity of the people involved in the stories being told.

But maybe it’s already too late.

Merry C. Vega is a highly respected and accomplished news author. She began her career as a journalist, covering local news for a small-town newspaper. She quickly gained a reputation for her thorough reporting and ability to uncover the truth.
