Video: 2 min recap of Ultra Natural 2013. Check out the full episode here
In February, I was in Nelson, British Columbia, Canada, coordinating post-production for Red Bull’s Ultra Natural event. We set up a 4-bay post facility in a hotel in Nelson, ingesting, organizing, and editing footage from roughly 20 cameras. We edited 3 pieces there – 2 of which had to be turned around within 24 hours of the event’s completion (including the video above) – and then took all the footage and equipment back to Brain Farm’s home base in Jackson, WY to edit the 1-hour TV show, which aired on NBC as part of the Red Bull Signature Series on March 30th. You can see the full episode here.
Equipment Breakdown:
Server:
Mac Pro 2009 (4,1), 2×2.26GHz 8-core, 40GB RAM
- Maxx Digital 24TB 6G EVO – 18TB at RAID-6, ATTO R680
- Small Tree 6 port Ethernet Card – each edit system connects directly via Gigabit ethernet
- CalDigit FASTA-6GU3 – eSATA and USB 3.0 Combo Card for Mac and PC (Amazon)
- nVidia GTX570 from MacVidCards on eBay
- Glue Tools PhantomCine converter
- Shotput Pro
- Panasonic P2 card 3-card reader with USB3
- RED card reader with USB3
- CalDigit VR2 with removable drives and many LaCie Ruggeds for field media
- Elgato Turbo.264 for H.264 conversions for posting screeners online (Amazon)
Edit Stations:
MacBook Pro Retina 15″ with 30″ Apple Monitor
Mac Pro 2009 (4,1), 2×2.26GHz 8-core, 40GB RAM, 30″ Apple Monitor
- nVidia GTX570 2.5GB from MacVidCards on eBay
- Blackmagic Design Decklink Extreme 3D+ (Amazon)
- Davinci Resolve
- Euphonix/Avid Artist Color (Amazon 1) (Amazon 2)
- Flanders Scientific 24″ monitor (this model is discontinued, but this is the current version; I prefer the older model as it has better blacks.)
MacBook Pro 17″ or MacBook Pro 15″ Retina with Maxx Digital Mobile Rocket
Each computer was connected to the server via Gigabit Ethernet. Many of us also connected our personal laptops to the server to multi-task.
I ran Phantom conversions on the server using the Glue Tools PhantomCine plug-in, which is GPU-accelerated. I also re-wrapped the AVC-Intra files in FCP7 on the server; since it ran at 600MB/s on the Maxx Digital EVO, the conversions were nearly instantaneous. Using the CalDigit USB3/eSATA card, I was able to import P2 cards and RED mags faster than ever before, pushing 150MB/s. In the past, using FW800, I could only expect 50-80MB/s. If you’re using the RED card reader with the CalDigit USB3 card, definitely use the additional USB-to-5V cable or a power adapter, as the CalDigit card can’t supply the full power the RED reader needs. ShotPut Pro accelerated the transfers further, since its verify stage could run at the EVO’s top speed of 600MB/s. I organized all the media on the server and acted as Assistant to the Assistant Editor, and later as Colorist and Finisher/Uploader.
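The offload-and-verify step that a tool like ShotPut Pro performs can be sketched as a checksum-verified copy. This is a minimal Python illustration of the idea, not the actual tool; the function names and paths are hypothetical:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path, chunk: int = 8 * 1024 * 1024) -> str:
    """Hash a file in large chunks so reads stay sequential and fast."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def offload(card: Path, dest: Path) -> None:
    """Copy every file from a card to the destination, then verify each copy."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        out = dest / src.relative_to(card)
        out.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, out)  # copy with timestamps preserved
        # Verify stage: re-read both files and compare checksums.
        if sha256(src) != sha256(out):
            raise IOError(f"checksum mismatch: {src}")
```

The verify stage is why fast RAID throughput matters even after the card-to-server copy finishes: the re-read of the destination runs at the array’s speed, not the card reader’s.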
Premiere Pro Issues
We decided early on to run Premiere Pro CS6 for these edits to take advantage of multiple raw camera formats, but we found a lot of areas that Adobe needs to address. Here’s a short list of the problems we ran into (pardon the change in tenses as I wrote some of these during the event and some after):
– Glitchy playback of P2 media. This one really took me by surprise. We have 9 P2-based cameras for this event. Last year, we transcoded everything to ProResHQ in FCP7’s Log and Transfer, which was a huge bottleneck and made it very difficult to meet the initial deadlines. So this year, we thought, Premiere Pro! But in my testing, P2 media had glitchy playback. Online, I found others complaining about this problem, and it looked like there was no solution. So I used FCP7’s Log and Transfer to create .mov re-wraps of the AVC-Intra media from the cameras. This would allow us to rename files into our naming scheme, and have spanned clips properly dealt with before going to editors, and just make the folder/file structure cleaner. FCP’s Log and Transfer to native AVC-Intra was almost instantaneous, so it didn’t slow our workflow too much, but it was surprising that we had to do this as Premiere is sold as a P2-compatible NLE. It wasn’t an issue for this event, but I found out a few months later that Davinci Resolve won’t play AVC-Intra files. Strange, as it can play just about everything else.
– Can only have one project open at a time, which makes opening projects from the Assistant Editors difficult. Adobe’s answer is to “import” a project into your current project, but then…
– Importing a project ignores all the bins in that project and dumps all the media into a single bin in the current project. So if you bin shots out by athlete or run or something like that, that organization is gone.
– Last month we tried Prelude for logging, but when you transfer the media over to Premiere, the logs (comments) are lost. So logging with Prelude is almost pointless. We don’t really need to log for turnarounds this quick, but we will need to log before editing the 1 hour show.
– We have one system with a Blackmagic Design Decklink card, a Flanders Scientific 24″ HD broadcast monitor, and a GTX570 for CUDA processing. If we turn on CUDA processing, the HD playback on the SDI output drops frames like crazy, making it almost completely unusable. We have to shut off CUDA processing to get smooth playback on the Decklink SDI output, which then requires rendering in the timeline. So what’s the point of having a CUDA GPU? Because of this, we decided not to use R3D files in the edit, and instead to transcode them to ProResHQ with a Red Rocket in a Mobile Rocket enclosure, which adds a time-consuming step. Why are we previewing on an HD-SDI monitor? Because we have to deliver 60i for network, and we need to make sure that all our mixed frame rate formats are looking correct at 60i.
– You can’t run Warp Stabilizer and speed ramping on the same clip in Premiere. Strange. Actually, after further testing, you can’t apply any effect to a clip with a speed change, even opacity. BTW, the speed controls in Premiere are AWESOME! So much easier and more intuitive than other NLEs. I also think Warp Stabilizer is one of the best stabilizers available, if not the best. But the fact that the analysis stage can only use a single processor core is a huge bottleneck. We have 8-core Mac Pros. It’s the same in After Effects, too. Yes, Warp Stabilizer can use the GPU for acceleration, but only when applying the effect after the analysis is complete.
– Premiere Pro’s multicam works pretty well, but now I wish we had better than Gigabit Ethernet for the SAN, as a 9-camera timeline is too much data for the network to handle. The solution here was to use ProRes Proxy.
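The back-of-the-envelope math on why the 9-camera timeline choked Gigabit Ethernet is simple. Using Apple’s approximate published bitrates for ProRes at 1080p29.97 (the exact numbers vary with content and frame rate, so treat these as rough figures):

```python
# Approximate Apple ProRes bitrates at 1080p29.97, in megabits per second.
PRORES_HQ_MBPS = 220
PRORES_PROXY_MBPS = 45
GIGE_USABLE_MBPS = 900  # rough real-world throughput of Gigabit Ethernet

cameras = 9
hq_total = cameras * PRORES_HQ_MBPS        # 1980 Mb/s: roughly double what GigE can carry
proxy_total = cameras * PRORES_PROXY_MBPS  # 405 Mb/s: fits with headroom to spare

print(hq_total > GIGE_USABLE_MBPS)     # True  -> 9 streams of HQ will stutter
print(proxy_total < GIGE_USABLE_MBPS)  # True  -> 9 streams of Proxy play fine
```

That headroom is why cutting multicam with ProRes Proxy worked over the same network that couldn’t sustain the full-resolution media.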
As we were running into issues in the field and back in Jackson, we had a line open to Adobe. They were very receptive and helped us fix the problems we were having, and made notes of problems they couldn’t fix for future releases of Creative Cloud. Many of the issues we ran into (many more than listed here) have been fixed in the CC releases and updates.