RAY TRACING FOR THE WIN

         Earlier this year I photographed a short film for Chaos Group, a special effects software company with offices located around the world.  Its purpose was to demonstrate the efficacy of Project Arena, a new fully ray-traced virtual production technology that reduces the time needed to put an image on the LED volume from days and weeks to mere minutes.  More exciting in terms of its use by cinematographers, it also allows precise, localized manipulation of the visuals as they’re happening.  Invented by Chaos co-founder and current Head of Innovation Vladimir Koylazov, it marks a revolutionary advance that will deliver tremendous benefits to every corner of the industry.  He’s also responsible for the Academy Award-winning renderer Chaos V-Ray, but in this case, he has outdone himself.

         Ray Tracing FTW was directed by effects veteran Daniel Thron, with whom I enjoyed one of the most rewarding collaborations of my career.  Shot predominantly within an LED volume on an Arri Alexa 35 with Panavision P-Vintage lenses, the film features a cast of other notable special effects geniuses, including Scott Ross, co-founder with James Cameron of Digital Domain.  All of the live-action backgrounds were computer-generated specifically for the volume.  Supplemented by the performers’ engagement with real-life foreground items, the results looked so genuine – even to the naked eye – that I was continually blown away!

         My main takeaway from this experience was how much quicker and easier Project Arena made the compositing process and how much more creative it allowed me to be.  By correctly matching the color and density values of the volume to the live foreground action, I was able to bake in the exact look I wanted.  This also eliminated the guesswork that normally burdens a special effects supervisor.  I’m certain my colleagues will embrace this tech as soon as it becomes available.

         The film premiered at ACM SIGGRAPH’s Rainbow Conference in London just days ago and runs for a brisk ten minutes.  The producers are eager to get the word out, so have a look and let me know what you think…

9.24.2024

10 thoughts on “RAY TRACING FOR THE WIN”

  1. Very cool and fun. Looks like everyone has a great time showing off the new tech and obviously lampooning a lot of the steps along the way. Bravo!

  2. It’s really amazing to watch, but it’s a bit confusing to me as to what was AI-generated, as they say in the dialogue, and what was classic CGI. How long was the actual shoot, how long was the post, and which volume did you shoot on? What was the size of the pixels on the volume, and how close did you ever physically get to the LED wall?
    It is very impressive and nicely lit and framed. Is there a chance you could do a presentation at the clubhouse and break it down shot by shot? That would be very interesting to see.

  3. Hey Roberto – In short: all the backgrounds to the live action were computer-generated, but the selling point of this software is how dramatically it reduces the time required to get those images onto the volume, as well as providing instant fine-tuning of the same. We shot for 3 days on the volume stage at Orbital in DTLA. Post lasted a few weeks, as they were all engaged in other projects at the same time. The pixel pitch on the volume was 1.4mm; the closest I got to it was probably in the vicinity of 8′ to 10′. No moiré issues whatsoever, from any angle. I’ve mentioned to the guys at Chaos that they need to present this to the ASC. Stay tuned for that, hopefully sometime soon!
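    As a rough sanity check on those numbers (this rule of thumb is not from the post, and it’s only a guideline – in-camera moiré also depends on lens, focus, and sensor resolution), a common guideline for LED walls is that the minimum comfortable viewing distance in meters roughly equals the pixel pitch in millimeters:

    ```python
    # Hypothetical sketch of a common LED-wall rule of thumb (not Chaos/Arena
    # documentation): minimum viewing distance in meters ~= pixel pitch in mm.
    def min_viewing_distance_ft(pitch_mm: float, factor: float = 1.0) -> float:
        """Estimated minimum viewing distance in feet for a given pixel pitch."""
        meters = pitch_mm * factor   # guideline: 1 mm of pitch ~ 1 m of distance
        return meters * 3.28084      # convert meters to feet

    # For the 1.4 mm pitch mentioned above:
    print(round(min_viewing_distance_ft(1.4), 1))  # prints 4.6
    ```

    By that estimate, an 8′–10′ working distance sits comfortably beyond the ~4.6′ guideline minimum, which is consistent with seeing no moiré.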

  4. Thanks, Richard, for the information. One last piece of the puzzle – how much AI was used to either create or tweak the CGI backgrounds? And if AI was used, do you know which programs were used?

  5. HA! Loved every moment of this. Exciting capability to be sure. There you are with your feet planted firmly in the future, still getting the shots to tell the story. I am excited to see what this opens up.

  6. Roberto – As you know, AI is a very broad term. Project Arena uses a special denoiser called Ray Reconstruction, which is seriously miraculous. The denoiser is essentially heavily-trained AI. This is what enabled us to get such clean images on the volume. Without the denoiser, it would’ve looked like we shot everything at ISO 12,800 using an ISO 100 sensor!

  7. I wish you had seen it last week on the Onyx screen at the Culver Theater. It really was impressive!
