Red vs. Blue is officially over. On Tuesday, Warner Bros. Discovery released Red vs. Blue: Restoration, the latest installment in the long-running saga that was once at the forefront of an entirely new form of entertainment: web videos created from in-game footage. Machinima pointed to a new world where that footage (of Halo, in Red vs. Blue's case) could fuel viral clips. That was in 2003. Now it seems Restoration could be machinima's swan song.
“Machinima directors use game engines that allow them to shoot a scene from any angle imaginable, much like a Hollywood director uses a cinematographer,” WIRED wrote in a 2002 article announcing the potential of this new filmmaking technique. When it debuted a year later, Red vs. Blue exemplified those possibilities. The series was made by networking several Xboxes together, recording footage of a Halo multiplayer match, and then adding voiceover. The absurd, existential tone of the dialogue was a hilarious counterpoint to (and commentary on) the run-and-gun gameplay of the first-person shooter used to create it. The show's creators founded a production company, Rooster Teeth, and made episodes for more than a dozen seasons.
Red vs. Blue went on to develop a huge fan base and become a geek touchstone over the next two decades, which is why Restoration's release feels like an ignominious farewell. In March, Rooster Teeth general manager Jordan Levin announced that Warner Bros. Discovery, now the parent company of Rooster Teeth, was shutting the studio down, and it soon became clear that its intellectual property was being divided up and sold off in pieces. Now the final installment of Red vs. Blue is being unceremoniously released to streaming platforms with minimal fanfare or promotion.
It's a sad moment for fans of Red vs. Blue and Rooster Teeth, but it's a good time to reflect on the impact the web series had. Machinima isn't talked about much these days, but across the media landscape you'll find people using games to create everything from streams to clips to GIFs to art films, and they're doing it in ways that were unimaginable 21 years ago. “Machinima is no longer a word we use, and it's really no longer something we think of as a medium or a genre,” says Adam Bumas, editor of the internet culture newsletter Garbage Day. “But it's still going strong. In fact, it's everywhere.”
What has machinima wrought? Start with the phenomenon of Fortnite concerts. In recent years, major recording artists such as The Kid LAROI, Ariana Grande, and Travis Scott have performed sets for millions of people logged in to the game's world. (Lil Nas X held a similar virtual event inside Roblox.)
“The reason those concerts happened is because Epic realized that people were just hanging out in Fortnite and not even playing,” says Bumas. “It's like an evolution of a social space.” And since Fortnite's gameplay centers on building and creating things as much as shooting each other, it was natural for Epic to also develop tools that help people express and entertain themselves within the game world.
The game publisher has also developed tools that let filmmakers use the underlying game engine that powers Fortnite in their own production processes. For example, Industrial Light & Magic has used Epic's Unreal Engine in its StageCraft virtual production process since the first season of The Mandalorian. For the most recent season, the company used Unreal to help actors and filmmakers visualize how a computer-generated droid character would interact with flesh-and-blood performers.
“When you're faced with a sea of green and representations of characters on ping-pong balls or tennis balls, it becomes a pretty daunting experience for the actors and the director,” Epic Games CTO Kim Libreri tells WIRED. “I think what we've been able to do here is give control back to the filmmakers.”
In a different galaxy far, far away, artist Tim Richardson recently collaborated with fashion designer Iris van Herpen on the CG short film Neon Rapture, which was also made with Unreal. The technology allowed van Herpen to push her striking concepts and designs further than she ever could in the real world, and Richardson says the game engine was his “soundstage” for the production. Where the Red vs. Blue creators simply had to capture footage of themselves playing Halo, Richardson had a toolset designed specifically for someone who intended to render content rather than have a gaming experience. It allowed the filmmaking team and the fashion designer to prototype every aspect of the shoot, from layouts to lighting, costumes, and sets, and to mix motion capture data with a digital environment on the fly to block out their shots.
“It was the closest thing to shooting live-action footage that I've experienced in VFX-based filmmaking,” says Richardson. “I was able to share ideas and collaborate with Iris on a timeline that would be impossible in linear VFX. I see game engines as an essential aspect of my future work.”