Christopher: So thank you for making time to discuss your background and the kinds of projects that you guys are up to at Cortina Productions. I’m excited to hear about it because… it’s really pretty impressive what you are creating, in particular for other national museums.

Andrew: Yeah, it’s a very unique kind of niche that we have here.

Christopher: Yeah. So let’s start off in the very beginning. I really like to hear, what was the spark? Was there some moment, maybe when you were a kid or you saw a certain film, or some light went off like, I want to do that, I love that.

Mr. Prasse

Andrew: I’ve always been interested in film; I remember going to Blockbuster, and that kind of dates me a little bit. We’d rent VHSs and then DVDs, and I’d always skip immediately to the behind-the-scenes sections, because that part of moviemaking just fascinated me. At a young age, I had an old DV camera and I’d go outside. I’d mock up these little sets and put firecrackers in them to do little explosions. We even hung a fishing line from a top window down to this barn, painted up a Japanese Zero model, set it on fire, and flew it into the barn. I did some crazy things as a young kid that really set the foundation for what I wanted to do, but it was just an interest of mine, kind of like a hobby. It wasn’t until probably five or six years ago that it actually became a career for me.

Christopher: So at school, did you study design?

Andrew: I didn’t. I went to school and I studied business and geology; two very different subjects. And business was important for me, I wanted that degree. Geology was just because business was so boring that I needed to do something that was a little bit more stimulating. It was the science side of things.

Christopher: Right, everyone knows studying rocks is a lot more interesting than business.

Andrew: Oh yeah. After that, I got into marketing and I worked at a ski resort, doing marketing for them. That’s where I started to get a little bit more exposure within media. They had a little television station there – I’d do very basic, Final Cut editing. And then we’d do live satellite feeds when local TV stations would come up onto the mountain. So I got a little bit more acumen with dealing with that type of video media. Then, I left that job and started making a documentary in my hometown that aired on a local PBS station, and it actually garnered three regional Emmys. So that’s when I realized that okay, I’m not awful at this, I should keep pursuing it, and eventually found my way to Cortina Productions.

Christopher: Great, what was the name of the documentary?

Andrew: The documentary is called “Hunter’s Raid: The Battle for Lynchburg.” It’s a Civil War documentary. It was very homegrown, in the sense that we had over a thousand people work on this thing over five years. I got involved in the editing portion at the very end stages, but I had known about it for a long time. It tells the story of a Civil War battle that’s not very well known in my hometown, and it played on the local PBS stations around Lynchburg. I think every year, it still plays around the anniversary of the battle. So it still gets airtime.

Christopher: Wow, that’s a good story. It ties in nicely with Cortina because if I’m not mistaken, there’s at least one major piece that you guys did with Civil War reproduction.

Andrew: Yes, we did the Turret Theatre, in the North Carolina Aquarium.

Cortina Productions – Turret Theatre Battle of Hampton Roads

Andrew: In the Civil War, the Union ironclad Monitor fought against the Confederate ironclad called the Virginia in the Battle of Hampton Roads. It was a draw. The U.S.S. Monitor went out into a storm months later and sank off the coast of North Carolina, and the wreck site later became, I believe, the first national marine sanctuary in the United States. That’s why we did the film for the North Carolina Aquarium: they have a reproduction of the turret within one of their aquariums, so it’s a film that plays in conjunction with that.

Christopher: A draw! Were they just blasting cannonballs at each other?

Andrew: Oh yeah, it was just a slugfest. They both had a bunch of iron on each one.

Christopher: Can you imagine being on either one of those ships and being hit repeatedly by a steel cannonball on steel?

Andrew: The sound design for that was one of the most fun parts; coming up with what would it sound like to be in almost a tin can, getting rocked by massive artillery shells.

Christopher: Insane. So that brings you out of college and in this marketing role, and where did you go from there?

Andrew: From marketing, after the documentary…

Christopher: The documentary, that’s right.

Andrew: I was just doing random freelance gigs here and there, and there’s not much freelance work in Lynchburg, Virginia that I could find. So I decided to cold-apply to Cortina. There wasn’t an open position, but I sent my resume because I knew the work they did aligned well with my interests in historical recreations and working in museums. So it really all came together when they hired me.

Christopher: That’s great. It seems like a great match, Andrew. When we first discussed an interview, I said let’s focus on a project that you’re either really proud of, or one that had significant technical hurdles, and how you addressed those.

Andrew: Sure.

Christopher: So like a before and after – project before you had your i-X2 Mediaworkstation, and the first one with it. I think this will give us a sense of how you were working on projects before, and then a more recent one perhaps you’re really proud of, where you had the i-X2 at your disposal.

Andrew: Definitely. Maybe a year after I started at Cortina, we began working on a project for the American Revolution Museum at Yorktown.

Christopher: When did you start at Cortina?

Andrew: I started in 2012.

Christopher: Okay, great. So the Museum at Yorktown?

Andrew: Yeah, the American Revolution Museum at Yorktown, and the piece that we were doing was called “Siege Theatre”. It is a 180º 4D theatrical experience, so it’s got scents, smoke, butt kickers, and all kinds of fun effects that play as the film plays. It’s a 180º screen that encircles you, projected across five edge-blended HD projectors, giving you a resolution of around 8K.

Christopher: Quite immersive.

Andrew: Yes, and this was a live action piece, so we had a big shoot where we had 60 extras, dressed them all out in Revolutionary War attire, and recreated some actions that happened around the Battle of Yorktown in the 1780s. This was a challenge because we needed to get the full 180º in 8K. The solution came in bolting three RED Scarlets onto a cheese plate; then we’d have to stitch those videos together seamlessly into one 180º image. Needless to say, there was a lot of processing in that, and a lot of roto. We also had to composite visual effects on top of that: adding gunfire, matte paintings, all of the typical stuff you do in a live action piece, but at 8K.

Christopher: So when you were using the Red files was that 4K or 5K?

Andrew: It was around 4K.

Christopher: So stitching with 4K, were you transcoding first? What were the steps to your process?

Andrew: We had to transcode them into a different format and then run that through a batch process that would stitch it together – basically using photo stitching software that allowed us to distort the image to create the 180º piece.
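The stitched-canvas arithmetic behind a multi-camera strip can be sketched roughly; the sensor width and overlap fraction below are illustrative assumptions, not Cortina's actual shoot numbers:

```python
def stitched_width(cam_width: int, num_cams: int, overlap: float) -> int:
    """Width of the stitched strip when each adjacent camera pair
    overlaps by `overlap` (a fraction of one camera's width).
    Illustrative only; real stitchers also warp and blend the seams."""
    if not 0.0 <= overlap < 1.0:
        raise ValueError("overlap must be in [0, 1)")
    # The first camera contributes its full width; each additional
    # camera adds only its non-overlapping portion.
    return round(cam_width * (1 + (num_cams - 1) * (1 - overlap)))

# Three ~4K (4096 px) sensors with an assumed 50% overlap land at ~8K:
print(stitched_width(4096, 3, 0.5))  # 8192
```

With less generous overlap the strip comes out wider, which is why the delivery resolution depends as much on the rig geometry as on the sensors themselves.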

Christopher: Right. And you had said earlier to me that pretty much the entire studio, or your production studio, is all Mac. So you guys had been doing this all on Mac Pro towers?

Andrew: This was done on Mac Pro towers; they were relatively new at that point in time, but this was still a very, very bleeding edge type of technology. We also got one of the first RED Dragons ever in production. We had met somebody who had a test camera from RED, who came and shot with us. The Dragon was around 5K at that point, so that’s how we were able to get some single camera stuff, but the majority was done through stitching.

Christopher: In terms of compositing, were you just using After Effects?

Andrew: Compositing was done in After Effects on the stitched files. The good thing about it being 180º is that you don’t have extensive camera moves; everything is pretty much locked down, because A) getting a rig on some type of moveable device would have been pretty difficult, and B) if you’re within this totally immersive space and you have fast camera moves and cutting, it could actually make the visitors sick.

Christopher: That’s a different venue – more amusement park.

Andrew: Yeah.

Christopher: Or a Halloween show in the Bronx.

Andrew: Yeah. It was a challenge. We had massive amounts of data running through a Mac Pro, and it took a long time to handle that data. On top of the sheer file size of the stitch, we had to throw in compositing elements: smoke, fire, all of that type of stuff, and green screen elements. Comps got big very quickly, and processing power diminished after that. It came out great, and the piece is really impressive to see in the space. It’s just that at that point, we saw that if we were going to continue to do these big, huge, immersive projects, we’d need a little bit more oomph in our computing.

Christopher: I’m looking at the picture on the homepage for the American Revolution Museum at Yorktown. I mean, I would love to see that. So I assume it is in Yorktown; I guess I should see exactly where.

Andrew: Yeah, really close to the battlefield, basically on the battlefield. It’s a very cool museum, we did a lot of the media for them. They also have a space outside where they have historical re-enactors portraying people from the time period, so it’s a really fun museum.

Christopher: Wow. So this was done 2012 and maybe 2013?

Andrew: Yeah, more 2013-2014. And the museum opened in 2015, I believe…

Christopher: So that kind of brings us to the next stage for your work at Cortina. We shipped your i-X2 Mediaworkstation to you in May of 2016…

The i-X2 Mediaworkstation

Andrew: Just in time. If Yorktown was 180º, we said, why not double that? So we did a piece for the National Museum of African American History and Culture, which is the newest Smithsonian museum to open in D.C.; I believe it opened in the fall of 2016. The piece is called “Cultural Commons 360º.” It’s roughly 27K x 1080 – so 27K in horizontal resolution – projected across 17 edge-blended projectors, creating a seamless 360º strip of content. It showcases African American influence in style, sport, music, artistry, food, and more. It’s a three-to-four-minute loop, and it plays above display cases for artifacts, so it’s meant to be very experiential in the space. It was originally specced at 29,000 pixels; I think After Effects only goes up to 30,000 pixels, so we went very close to that mark. That’s when we immediately started looking for a more powerful computing solution: a workstation that would allow us to edit 27K quickly and effectively, and then export something out equally fast. In terms of the process, we built it in halves, or 180º sections. It included hundreds of photos and videos that we’d then composite. So we went through a rough cut in the 180º, and we showed that as a single strip. Then we split it into two strips, so that content was easily seen and approved. Those we would render out at around 1080, because the client would not be able to play something super large. We noticed, even working with the offline footage, that the i-X2 was super fast; we had no issues, no crashes, which was extremely important to get through the rough cut and fine cut on a huge piece that had tons of content. And then from there…
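As a rough sketch of how a seamless 360º strip divides across edge-blended projectors (the 27,200 px total and 200 px blend below are assumed round numbers, not the installation's real specs):

```python
def projector_slices(total_width: int, num_proj: int, blend_px: int):
    """(start_column, width) of the source region each projector shows
    in a full 360° wrap, where every adjacent pair, including the last
    projector back to the first, shares `blend_px` columns for blending."""
    if total_width % num_proj:
        raise ValueError("this sketch assumes the strip divides evenly")
    step = total_width // num_proj
    # Each projector covers its own step plus the blend into its neighbor.
    return [(i * step, step + blend_px) for i in range(num_proj)]

slices = projector_slices(27200, 17, 200)
print(len(slices), slices[0])  # 17 (0, 1800)
```

The seam logic is the reason stitch points have to be exact: every projector boundary is a place where two slightly different images must agree.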

A 360º film “Cultural Commons” at the NMAAHC

Christopher: Quick question, Andrew.

Andrew: Sure.

Christopher: Did you have crashing issues with the Mac Pro towers?

Andrew: I had tons of crashing issues. I would have crashes all the time. And yeah, that’s where it gets really frustrating, when you’re in a groove and something happens. You might have autosave and it might ultimately save your work, but just the constant interruption of having that happen was really — it takes a toll on you as an editor.

Christopher: Understood. Underpowered or inappropriate hardware can have a dramatic negative impact on workflow. So you were able to do the editing, and the different stages, in halves. So you were taking the footage — but it’s all archival material, so it’s a little different.

Andrew: It’s all archival; a little bit different than the live action, but when we got to the online stage, that’s where the i-X2 really shined. We had photos from the Library of Congress and they’re huge, huge scans. Some of them are up to 10K pixels and usually, for most HD or even 4K content, you’d scale that down, create a proxy file that you could probably use in your final piece and you’d be fine. Here, at 27,000 pixels, we had to use that 10,000 pixel image to cover most of the screen and make incredibly stunning compositions. So we needed a machine that could take those extremely high res TIFFs from the Library of Congress and chug through it and allow us to edit, and see what it looks like and move it and animate it. And that’s where the i-X2 definitely shined as a computer.

Christopher: That’s great. So when you had assimilated all of those pieces into the file, what was your workflow? Were you setting up stuff in Premiere and using After Effects, or was it more complex?

Andrew: The application workflow for 27K … we’d online in Premiere, so replace all of our low res with high res. And then we’d break that up into sections of the film in After Effects, and that would get distributed to myself, as well as a team of animators to work on, in a total 360º canvas. My i-X2 acted as the render station, as well as my workstation. So I’d receive projects from our freelancers, bring them into After Effects, render out low res QC files and then from there, I would do the final tweaks. And then, I rendered out to a full res targa sequence. We actually had to split the full res targa sequence in half, because the player couldn’t handle it. And then the player would disseminate across the 17 projectors and edge blend everything. So that was kind of the last process.
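The half-splitting step can be pictured as computing two crop boxes and output names per frame; the base name and 27,000 px width here are hypothetical, not the project's actual values:

```python
def split_plan(width: int, height: int, frame: int, base: str = "strip"):
    """Crop boxes (left, top, right, bottom) and output names for
    splitting one full-res frame into the two half-width targa
    sequences a player could ingest. Names and sizes are illustrative."""
    half = width // 2
    boxes = [(0, 0, half, height), (half, 0, width, height)]
    names = [f"{base}_A.{frame:05d}.tga", f"{base}_B.{frame:05d}.tga"]
    return list(zip(names, boxes))

print(split_plan(27000, 1080, 0))
```

An actual pipeline would hand these regions to the renderer or an image tool; the point is simply that each half stays within what the playback hardware can decode.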

Christopher: QC, you mean quality control?

Andrew: Yes. Just making sure everything looks alright. But the interesting thing with doing a 360º film, and that’s also where the machine helped out, was final visualization. We’d churn out a skinny little strip at 27K x 1080, so it’s very difficult to watch and really understand what you’re doing from an animation perspective. So we used the i-X2 and Cinema 4D. Because we had CAD drawings of the space, we knew what the screens looked like before they were even built. We would take that strip, apply it in Cinema 4D onto a model of the space, and render it out at different perspectives, and that’s what really allowed us to dial in the animation and compositing of the piece. Make sure things aren’t moving too quickly, make sure our hero images and assets are coming up in the right places. It even allowed us to position our captioning, because all of our pieces are ADA compliant.
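That previsualization check (where does a hero image land for someone standing in the room?) boils down to mapping an azimuth on the cylindrical screen to a column on the flat strip; the 27,000 px default below is an assumption for illustration:

```python
def angle_to_column(angle_deg: float, strip_width: int = 27000) -> int:
    """Map a viewer-facing azimuth (0-360°) on the cylindrical screen
    to a column on the flat 360° strip. Strip width is an assumed value."""
    return round((angle_deg % 360.0) / 360.0 * strip_width) % strip_width

print(angle_to_column(180))  # 13500: halfway around the room
```

A 3D previz render does the same mapping in reverse, projecting strip pixels back onto the room geometry so the team can judge pacing and placement from a visitor's viewpoint.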

Christopher: Yeah, I can imagine that. If you had a physically accurate CAD model, onto which you could project a low res copy of your target file, or your content.

Andrew: Also it allowed us to understand the piece better; it allowed our client to understand the piece better. So it was a real win when we took it into the 3D environment.

Christopher: Were there any specific hurdles on that project that you came up against, in addition to what you just described?

Andrew: Oh yeah. The trickiest part about doing 360º is that it’s got to be seamless. Your stitch point has to be dead on, and to determine where that stitch point is, you have to do a bit of math. So that was definitely a challenge. But what wasn’t a challenge was moving all of the assets into compelling compositions, and that’s what we really wanted from this piece. Before, if you had a lot of huge images in After Effects and you had to animate them, you’d move your mouse and then get the spinning wheel of death. And you’d wait five or six seconds to see where it ends up. It’s no way to work.

We wanted to essentially eliminate lag time in After Effects. We knew that we were going to be dealing with hundreds of photos on the screen at the time. There might be like, 10 or 15 high res photos that are all animating in different ways, have color correction applied to them. So we definitely needed some way to do it quickly and effectively. Not having that headache during animating was huge.

Christopher: Yeah, that’s great. So you basically also had the heavy lifting done in C4D, is that right?

Andrew: Yes.

Christopher: So when you were using Cinema 4D, and I guess this would go for other projects, maybe there’s another instance that you want to mention, are you doing physical render or are you guys using Octane, Redshift or V-Ray, as your render engine?

Andrew: We use V-Ray. Sometimes we’ll do physical render if it’s not anything super intensive, but a lot of times, for our visual effects shots, we’ll use V-Ray.

Christopher: Yeah, it’s quicker, right?
Andrew: Yeah and it looks better.

Christopher: Yeah, it’s a better tool.

Andrew: I have an example of that too. In January of this year we opened a project at Mount Vernon called “Be Washington”. It’s an interactive theatre, where visitors listen to different scenarios that Washington faced during his time as president.

Christopher: George Washington.

Andrew: Yeah, George Washington.

Christopher: Oh wow.

Andrew: So the visitor has to choose what choice Washington should make, by listening to a bunch of advisors that appear on the touch screen kiosk in front of them. But the main scenario plays on a 30 foot LED wall, at close to 6K resolution.

Christopher: Wow.

Andrew: Those scenarios were shot on a RED Dragon, so we did have the resolution; we didn’t have to do any stitching or anything like that, but we did have over 40 visual effects shots — because I’m the senior editor as well as the visual effects supervisor here, those fall in my realm. So we had to do previs, after scouting locations and then during the shoot, we had to get plates. One of the most challenging shots we had was a dock scene in Charlestown. And I have a couple of before and afters of these that I could send you.

Christopher: Oh, that’d be great, yeah.

Andrew: We didn’t have a set that worked, so we had to use green screen. We had great set decorators and set designers, so we had lots of amazing props to work with, we just didn’t have a background. So we had to find lots of ship models and other assets that we’d had from previous projects. We had to reskin them, alter them slightly, and then we created a fully populated harbor.

Christopher: So when you say reskin them, you’re talking about surface textures on the ships themselves?

Andrew: I am.

Christopher: What were you using, what application? Were you doing all of that in Cinema 4D also?

Andrew: I was doing all of that in Cinema 4D and using V-Ray materials, but a lot of it was just a UV map so I edited the textures in Photoshop.

Christopher: Right, got it. So you were assembling all of that also. Did you shoot that on the Dragon as well, is that what you said?

Andrew: The plate, the green screen was shot on the Dragon.

Christopher: And again, compositing everything in After Effects?

Andrew: Yeah. Everything was composited in After Effects. We also had a ship battle that required us to do a lot of digital cannon blasts and those we rendered within Cinema 4D using Turbulence FD. That’s another pretty intensive simulation that definitely saved loads and loads of time on the i-X2. We had run it before on the Mac towers and it rendered at a snail’s pace, but on the i-X2, the bar just constantly moves. So that was really great to see. And these historical pieces have tons of black powder explosions and the smoke behaves very uniquely in them; it lingers for a long time. And Turbulence FD was a plugin that we found that could best replicate that.

Christopher: Interesting, so you could get it and you could tweak the actual smoke dissipation constants, just vary them.

“Be Washington” Dock

Andrew: Exactly. So I’d have to track the shots and then bring the cameras into Cinema 4D, or I’d track it in Cinema 4D, and do a lot of cannon blasts from different angles. And when we composited them with the live action plates, they looked great. There’s this one shot where we have a cannon going off in the foreground and it kind of wipes the frame; it’s shooting right at your face. Obviously that’s difficult to get in live action, which is why we did it in CG, but it’s not in HD or 4K, it’s in 6K. That’s an extremely intensive render, to do a smoke sim in 6K that has to be detailed enough to look real. So we used the machine to do that, and it was a very effective shot; it came across pretty seamlessly. It’s quick, but it’ll freak you out when it’s on the 30 foot screen, right in your face.

Christopher: Yeah, I imagine capturing that in live action would be pretty difficult.

Andrew: Yeah.

Christopher: You’d have to be pretty fast.

Andrew: Yeah, you have to be quick and the best part of all of it is now I’ve got a library of 6K cannon EXR image sequences that I can reuse on tons of other projects. So I’ve almost created an in-house smoke library from just doing a couple of cannon blasts at different angles.

Christopher: Wow, that’s great. I can’t wait to see this. I’ll come and see one of these, it really sounds incredible. I’m going to New York actually, flying out today, but I don’t know if — I didn’t run through and see if you have anything in New York.

Andrew: New York, I don’t know. We’ve got some stuff out in L.A. We’ve got the Museum of Tolerance.

Christopher: Oh wow, I can go see that and I’ve been meaning to see that forever.

Andrew: Yeah. There’s a film called “Anne’s Room” that we did. It’s a very powerful piece, it goes through Anne Frank’s diary entries but tries to recreate them in a stylistic way.

Christopher: Wow, and you worked on that as well?

Andrew: I didn’t work on that one. That was right as I started Cortina, but it’s an amazing piece. It’s really powerful.

Christopher: Wow, I want to see it. That sounds great. Very good, so moving onto the future in terms of maybe what Cortina is doing but also personally. Are there skills that you want to learn that you’re kind of excited about exploring? Is there a project upcoming for Cortina that you’re also excited about?

Andrew: Totally. We’re constantly on the bleeding edge of technology here in terms of displays and content. I’m really excited to add some more capabilities in-house and learn a couple of new programs. So we’re potentially going to start implementing 3D Studio Max, which would obviously run on this machine. And then take a good look at Nuke and figure out if we can add that to our compositing repertoire. I know it’s pretty industry standard and I want to get my feet wet with that one.

Christopher: It’s terrific, I mean the toolset Nuke has. Really amazing.

Andrew: And that in conjunction with Octane, just to really push the effects side of things.

Christopher: And Octane is obviously pretty low cost and really fast rendering. You want to get photorealistic fast and it’s very user friendly. V-Ray is much more complex by comparison.

Andrew: Yeah, V-Ray is pretty complex.

Christopher: Are you guys doing anything with virtual reality or VR?

Andrew: We are, we’ve got a couple of projects coming up. VR is big, but what we also have really sunk our teeth into here is AR.

Christopher: Yes.

Andrew: So we’ve done a couple of demos at SXSW, demoing our VR and AR capabilities. We have an AR…

Christopher: Andrew, just wanted to stop there … SXSW was just a couple of months ago. Do you remember the names of the projects that you demoed there?

Andrew: I believe we had a VR experience called Shark Cage, where you’re literally inside of a shark cage and you feed sharks through the inside of the cage. And then we had another one that we did in conjunction with the Bullock Texas State History Museum. Previously, we had done a piece about the La Belle, a 17th-century ship that sank off the coast of Texas; the explorer La Salle headed the expedition. They found the ship, and its skeletal remains, all of the timbers, are actually in the museum where you can see them. We implemented an AR experience using a 3D model that we created for the film. We altered the model in Cinema 4D and then created FBX files to be used in Unity and, ultimately, the Microsoft HoloLens glasses. In the physical space, we have the model pinned on top of the skeleton, so wearing the glasses, you can physically walk around the skeleton of the ship and see a realistic 3D depiction of what the ship would have looked like, at its true scale.

Christopher: Underwater, sunk at the bottom of the ocean.

Andrew: No, this is a full version of the ship, on top of the water, in all of its glory before it sank. So you have a better idea of what the ship looked like if it was sailing out on the water.

Christopher: Because that would be very kind of incredible too, to be in the watery depths. To be wandering around down in a shipwreck. I just looked it up, 1685 is when it wrecked.

Andrew: There are tons of incredible applications for AR, especially in that sense. It really allows you to take an artifact and then put another layer on top of it. It can really wow the visitor, and that’s where AR is extremely powerful within the museum industry. I think it’s going to be around for a while, and there are going to be new and exciting ways to implement this technology in a lot of different museums. In conjunction with the La Belle, we also created an AR demo for the LBJ Presidential Library, which is also in Austin. There’s this incredible wall of records at the LBJ Library; you don’t technically go through them, but they’re on this extremely huge wall within the museum. Wearing the glasses, you’re able to recognize the different windows and select one with a pinch gesture, and that will generate content from that collection to fly into your view; then you can pinch-zoom, scroll, and go through the content. So it gives you a way to physically go through files in the augmented reality space.

Christopher: So when you say pinch, you’re talking about going up to a bookshelf and pinching it and grabbing a book.

Andrew: Yeah, it recognizes your gesture within the glasses, so it’s basically like a mouse click but within the 3D space.

Christopher: Do you have a couple of favorite AR apps that you’re working with right now?

Andrew: I don’t typically do a lot of AR, our programming department does. They do a ton in Unity. Typically, I’ll take a 3D model and put it through Cinema 4D, clean it up, possibly retexture it, export it out as an FBX and then get that to programming to put into the actual software.

Christopher: Got it. Well Andrew, this is exciting, is there anything else to wrap up? I’m really — I’m going to go see Anne Frank when I come back from New York and I want to see some others. The African American exhibit sounds spectacular as well.

Andrew: It’s amazing, it was incredibly powerful. It was a real privilege to work on that museum. Extremely poignant stories and architecturally, it’s amazing. The exhibit design is amazing. It’ll take you like three hours, four hours, to properly go through it, but it’s a powerful addition to the Smithsonian.

Christopher: Anything else you’d like to add?

Andrew: I use the i-X2 every day. It’s been a huge asset to me. We have freelancers too, that have complex 3D work that may take them an hour to render a frame on an older machine. This machine can cut that down to nine minutes a frame. So that’s where a lot of added value is coming in, the i-X2 renders so fast, it reduces the amount of time it takes to see something that you create, and get it out into the final format. That’s really, really helpful.

Christopher: Yeah, and that might be a good thing to close on. I don’t know if enemy is the right word, but time can definitely have that negative impact and ultimately affect your project scope, and ultimately, the creative process itself. You referred to this moment where you move the mouse and then you’re waiting six to eight seconds to get back online. I’m just curious about maybe in terms of project scope, is there a good example of something that you can think of you’re now taking on that you wouldn’t have before? Like something where you were just saying … you know what, we can do this now because of what we have….

Andrew: Before we ordered the i-X2 we’d have to either go out of house or just settle with using 2D images if we were going to create an environment that was immersive. Now, we have a machine that allows us to buy 3D models or alter 3D models and create a richer, more historically accurate world – a better interpretation of history. I have the extreme latitude to move models around, position them, do smoke simulations, all of these things that we weren’t able to do that really make the projects stand out and come alive, and I think that’s what visitors respond to, too.

Christopher: I think that’s it, providing an immersive experience that takes you somewhere. Emotionally, spiritually, physically, just on all of those different levels. On some level, when it comes to something like a historical event, and knowing where we came from, but also who we are, those kinds of experiences can illuminate what it means to be alive.

Andrew: Yeah! Seeing people’s experiences that you wouldn’t ultimately ever get to see because they weren’t captured in a photograph or using video, we’re able to tell people’s story in an extremely immersive and visually rich way.

Christopher: So listen, it’s been a pleasure, Andrew. It’s so great to get a much deeper view into the work that you’re doing. We’re so proud to be supporting the work you’re doing, I can’t even tell you, it’s incredible.

Andrew: Yeah, it’s made life a lot easier, not waiting that eight seconds. If I had the timer, I’m curious as to how much time it’s actually saved me, probably days.

Christopher: Maybe! Not to mention what is provided by being able to iterate creatively without waiting. Andrew, again, really a pleasure to talk to you this afternoon, have a great weekend.

Andrew: You too.