CineMontage

Q3 2022

Issue link: https://digital.copcomm.com/i/1475547


Up and Away CONTINUED FROM PAGE 20

so did my exposure to high-profile clients, such as "Star Trek: Deep Space Nine" (DS9) and "Star Trek: Voyager." I got to know all the assistant editors and editors on the two shows, as well as several other series I covered. I made a point to tell anyone and everyone I met that if a position opened up, or if they heard of anything, to please keep me in mind. I did this endlessly, and I'm certain I irritated many people, but I knew I needed to get my name out there. I had carefully clocked all the necessary hours to join the union but had no offers for work, so I waited.

In 1997, an assistant editor position opened up on DS9, and the supervising producer, Peter Lauritson, generously offered me the position. It was my first big break. In between my assistant editor duties, I watched and learned from the talented editors on the show. This period became the foundation of my editorial career.

In 2000, while working on "Resurrection Blvd." for Showtime, I met and became longtime friends with post producer Zoila Galeano. Galeano gave me my first full-time editor position on MTV's "Tough Enough," and, as they say, the rest is history.

Twenty-five years later, very little has changed. I still turn to literature, music, and television to take epic journeys and meet fascinating people. I never imagined the very thing that provided solace to a shy young child would become a career and a lifelong creative outlet for expressing myself. ■

Noel A. Guerra is a six-time Emmy-nominated editor and producer with a career of more than 20 years in the television industry. Some of his career highlights include "Project Runway," "Survivor," "Last Comic Standing," "Treasure Hunters," "Project Greenlight" and "The Voice." He can be reached at noel_guerra@mac.com.

Tech CONTINUED FROM PAGE 45

it is now. It's already there. And I can see that this is where it's going to go. Everybody else is going to have access to this ability in the future."

The 4D Fün approach simplifies motion capture. It doesn't require mo-cap suits, gloves, or specially made rigs for facial capture. "You can walk into the capture area as you are, wearing any outfit and playing any instrument – violin, guitar, and even drums. We accurately capture a much larger dataset than traditional mo-cap ever can, and this offers us limitless creative options to generate future content – e.g., Deep Real™ experiences in our Scenez platform," Master said.

And the 4D Fün stage doesn't have to be green (although green screen is still useful in conjunction with traditional media production tools). In fact, it doesn't even require a stage. "We can go out into actual, real, live locations. We're literally out in the world doing 4D volumetric capture on a city-scale level," Master added.

4D Fün is all about democratizing the VR/AR content creation industry, giving artists control over their assets so they aren't beholden to those holding the purse strings. Musicians, artists, and storytellers can create what they want, edit it to be what they want, and distribute it when, where, and how they want.

"4D Fün co-founder Paul Vowell and I feel strongly about allowing everybody to have the same ability to have an outlet, to express themselves, to show what they're up to. Our focus is on education, entertainment, and the preservation and restoration of human legacy. The studio's built on that premise; we need to allow people to have access to this technology, so we're making it easily accessible. People don't have to spend a million dollars just to do this," Master said.

4D Fün is currently focused on making its captures ultra-realistic – capturing humans and representing them as true to life and in as much detail as possible. But the potential uses for animation – or for creating lifelike experiences for the metaverse – are profound.

Imagine capturing the fine details of all micro-expressions – distinct movements like gait, posture, arm swing, and subtle facial expressions – and using that 3D model to create a digital character. And since audio is captured as well, this opens the door to AI voice generation – creating a custom voice clone that can make your avatar say whatever text you type, or using speech-to-speech AI voice generation to have your avatar speak in whatever voice you want.

"It all comes down to how we collect, analyze, and process our large datasets using our custom AI and ML tools," said Master. "We haven't explored this technology for traditional animation production, but it is possible to do. We have only done this in our VR/AR experiences for our Scenez platform. Scenez can be used to create traditional video media. We could easily create characters with highly stylized attributes combined with our ultra-realistic human renders, with a very small team of creators maintaining control of the creative and production process," concluded Master.

Jennifer Walden is a freelance writer specializing in technology.
