Part I of II: Our interview with Jeffrey Jasper, CEO of nphinit, LLC, a company doing entertainment technology development for international markets. He is the former CTO of New Deal Studios, winner of the Academy Award for Best Visual Effects on Interstellar.

Christopher: Jeffrey Jasper.

Jeffrey: Hey, how’s it going?

Christopher: It’s going good man, how are you doing?

Jeffrey: It’s going good.

Christopher: Thank you for your time to do this interview. I think the place to start is, where was the spark? Was there a certain movie that you saw that you walked out of, or some other experience that you had when you were young that led you in this direction?

Jeffrey: I’d say the one that actually got me actively looking to get into the industry was Jurassic Park. The original Jurassic Park. I’d seen the older movies like Tron and The Last Starfighter when I was younger, but Jurassic Park came out when I was in college and seeing the dinosaurs in that movie was something else. They didn’t feel like CGI, they felt real, it was game changing. I was actually studying environmental science but I wanted to switch into something more artistic, and I was looking at product design and graphic design. I was going to Ohio State University, which has a really good design school and had an innovative computer graphics program using computers from the movie.

Christopher: Oh wait a minute, are we talking about Silicon Graphics? Octane?

Jeffrey: Yeah Octanes, Indigos and these big Crimson workstations, the size of mini fridges under desks. I was doing a work study at the Advanced Computing Center of Arts and Design, which is in the Ohio Supercomputer Center building, and it just kind of struck me. I was like, this is really cool stuff; it appealed to me because it was both technical and artistic. At the school, if you wanted to do computer graphics, it was a graduate degree program. There were people there working on their Master’s and PhD degrees, so I talked to some of the professors from the program, and they were thinking about doing an undergraduate program, so we set up an independent study program. I was the first undergraduate computer graphics graduate. I was lucky in that I got to take graduate level classes as an undergrad and then combine that with some of the other stuff I was doing, like engineering graphics, design and art.

Christopher: OK. You were at Ohio State, I think I’ve seen that before.

Jeffrey: Yes, Ohio State University, it was a cool place. I got to work with cutting edge equipment and software, which was unheard of for an undergraduate. The Ohio Supercomputer Center and ACCAD were heaven for a geek like me. SGI used to visit in their huge truck that had military style flight sims and “holographic” displays. It was insane.

Christopher: It’s kind of amazing in a way that they said—you were the catalyst for them creating a new program or major at the school.

Jeffrey: Yeah, I think they thought it was something that the school needed, there was a lot of interest from the faculty to do an undergraduate program.

Christopher: After school, where did you go? What was your first…

Jeffrey: I graduated right when a bunch of movies bombed and big studios were shutting down. So there were literally hundreds of artists with years of experience looking for jobs along with me when I came out to LA for SIGGRAPH. There was nothing in the industry at the time unless I worked for free and moved to LA, which is not a sustainable life unless you have a huge amount of money saved up to burn through. So I went back home with my tail between my legs and just tried to figure out what I was going to do. Then there was an opportunity, a job posting I saw to work at a computer consulting company, and it was at Children’s Hospital in Cincinnati. So I went and interviewed and got the job. Children’s Hospital had tons of Macintosh systems as part of their research center, and throughout the hospital and in various departments. But they didn’t really have anybody who knew how to support them, so they had a huge backlog of support tickets. I came in and jumped into it as the only Mac support person for almost 2,000 systems. I also handled UNIX tickets that came up in the research division. It was very rewarding work helping set up and configure cutting edge medical and research equipment, and then seeing the impact the staff were having on the patients on the hospital side. After that I went and worked in telecommunications in Northern Kentucky. We handled pre-paid calling for a bunch of the telecom companies. We had a large data center with insane amounts of bandwidth, and the people were amazing. We had these custom built systems for doing the call routing on the calling platform. Each system had two incoming T1 lines and two outgoing T1 lines, and we had hundreds of them. We were right at the point where voice over IP was starting to take over the telecom industry, and we were helping them build solutions for that. I was doing mostly Linux administration, Linux and UNIX servers and systems, and just heading up solutions for cost savings.

Christopher: It sounds like you were getting quite a background; UNIX and then you went to Mac for a while and dealt with that for a long time, and then you were building Linux servers.

Jeffrey: When I was at the hospital, Apple had just purchased NeXT and started integrating the Mac OS with UNIX. We had lots of technicians and I was the only one who knew UNIX and Linux, because in that day everyone wanted to be Microsoft service engineers. Very few people in the area knew Linux back then, and even building LAMP machines (Linux, Apache, MySQL and PHP) was painful and geeky to set up, so that was kind of my specialty, building servers and that kind of thing. And then for the hospital, Apple went through a kind of painful transition from the classic Mac OS, which was horribly outdated as an operating system but super easy for people to use, over onto a modern UNIX platform. But the original version of OS X felt more like the NeXT operating system than it did anything Mac OS. So going through that transition wasn’t easy for people used to the Mac OS.

Christopher: And where did that lead to next?

Jeffrey: Like I said, I did telecom, and then after the telecom company I ended up actually going back to Children’s Hospital and worked there for a while for the hospital instead of with a contract company. When I was working there, I decided, I really miss being creative. I was really happy with the technical things I was doing, building custom solutions for medical researchers, telecom and cost savings solutions for the enterprise, but I missed the creative aspect of what I was doing in college. I decided to look at the film industry again, since I love films, and see if there was an inroad. So I started looking more holistically at what goes into the movie post production process. Instead of doing what I think a lot of people do when they want to get into the industry, which is try to be a Pixar character animator, I found digital compositing looked like a good inroad. It was still pretty niche, and movies were doing a lot more compositing, even on non-CGI heavy films, than 3D animation work. So I flew out to L.A. and took some classes in Apple’s Shake. At the time it was the compositing software that everyone used in big studios. After that I just started pretty much begging to work on any project I could. So I was flying back and forth between Ohio and LA, working on these low budget commercial pieces and just kind of building up my Shake skills. The work paid enough that I was breaking even, but it helped me get comfortable with the software and post workflows. I then met Shannon and Matt from New Deal Studios through a friend of mine. They had a well known practical effects house…
Visual Effects Digital Masterpiece
Christopher: When was this, what year was this roughly?

Jeffrey: I want to say 2004 to 2005. They were thinking about starting up a digital division for their company. They had a ton of experience building model miniatures and doing practical effects, but they were kind of tired of handing the work off to digital effects companies, and those companies ended up doing their own take on New Deal’s vision for how the shots were supposed to end up looking. So they were starting a digital division, and I kind of wooed my way into New Deal, and the first project I worked on with them was X-Men: The Last Stand. I came in and just worked my butt off on that project, and they liked me enough to keep me on for the next project, and I worked my butt off on that one too. After X-Men: The Last Stand, the digital supe at the time wanted to have a much smaller team. We were working on The Good Shepherd starring Matt Damon. It was that movie about the start of the CIA. So I worked on that, and it was all freelance-based at the time. After I finished that, I got the opportunity to help out a guy who’s kind of like one of the fathers of compositing. His name is Steve Wright. He was doing a new edition of his book, Digital Compositing for Film and Video, and he needed illustrations for it, and he had a whole section in the back with compositing exercises. He had come from Kodak, so he had worked on the Cineon compositing system, but at the time he was new to Shake (he is now a super guru in both Shake and Nuke), so he asked me to go through all the examples in the book and all the training material to make sure that it made sense for doing them in Shake. So I helped him out with that, and that turned into an opportunity to work on Pirates of the Caribbean: At World’s End.

Christopher: Never heard of it.

Jeffrey Jasper and Steve Wright

Jeffrey: [Laughs] I went up to San Francisco with him and worked there with a small subcontract team. When I came back to L.A., New Deal’s technical lead and digital supervisor had left. Their servers and the digital department were in a bad state. I let them know I had an IT background and offered to come in and help get things working again.

Christopher: You put on your IT cape and swooped in to the rescue.

Jeffrey: [Laughs] Well, I got things back up and running, but then talked with them about how everything was set up and how it could be set up better and more efficiently for the workflow they were using. They bravely let me rebuild the entire digital department. With that, they brought me on full-time as the Digital Supervisor and also as the Technical Lead, and eventually the Technical Lead position evolved into being the CTO of the company. So I stayed CTO and Digital Effects Supervisor there all the way up until just a little over two years ago, a little over a decade with them. In that time New Deal evolved from their early days of getting into digital, to working on a bunch of movies, including Interstellar, on which they won an Oscar, to doing a bunch of VR work in the very early days of VR’s rebirth. The companies who were just getting started building 360 camera rigs, like Nokia and Jaunt, came to us with their prototype cameras and asked us to do projects for their launches and also help test the cameras in a production setting.

Christopher: One thing I just wanted to insert here before we move forward. I think it’s really remarkable that you approached the guru of compositing at the time and helped him with his book (Digital Compositing for Film and Video), based on your own digital experience with the leading app of the day, Shake. I’m just thinking about people who are new and really skilled, who are looking for an angle. What did you do? Can you say a little bit more?

Jeffrey: A lot of people who are skilled want to get in the industry. I tell people it’s all about the people that you know and the opportunities that you get afforded through those relationships. It was through a friend that I connected with New Deal, and I was part of a guild called the Pixel Corps, which is run by a guy named Alex Lindsay, an ex-ILM person, and it was through him that I met Steve Wright, the author of the book. It’s just one of those things. It’s like, Steve is looking for somebody to help out with the illustrations and testing the workflows and stuff like that. Would you be interested? I’m of the mindset that you never turn down a good opportunity; even if you’re super busy, you find a way to make it work. As a young artist at the time, it was an amazing opportunity not only to meet somebody who was such an icon but also somebody who had a deep, deep knowledge of how compositing works and the core principles of compositing. So it was really great working on the book, because I actually learned tons of stuff going through and doing the illustrations and trying it out in Shake. A lot of what I learned taught me to build solutions rather than just looking for a “make pretty” button.

Christopher: Jeffrey, I couldn’t quite make that out. You said building your own solutions?

Jeffrey: You can go into programs like Shake, or now Nuke, which is the predominant one, and use the built-in tools, but as you run into issues you can use the nodes that are built into the software, which are just doing the math, or use scripting, and actually build your own workflows to do just about anything you want.
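To make the “nodes are just doing the math” point concrete, here is a minimal sketch of the single most fundamental piece of node math in compositing, the “over” operation that layers a premultiplied foreground onto a background. This is an illustrative example, not New Deal’s tooling or Nuke’s actual implementation; the function name is hypothetical.

```python
# Sketch of the classic "over" compositing operation (premultiplied alpha):
#   result = FG + (1 - FG_alpha) * BG, applied per color channel.
# Packages like Shake and Nuke expose this as a Merge/Over node; this is
# just the underlying arithmetic, written out for clarity.

def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a premultiplied foreground pixel over a background pixel."""
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent premultiplied red pixel over a white background:
result = over((0.5, 0.0, 0.0), 0.5, (1.0, 1.0, 1.0))
print(result)  # (1.0, 0.5, 0.5)
```

Chaining small operations like this, one node per mathematical step, is exactly the kind of “build your own workflow” flexibility described above.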
All of the opportunities that came up were cases where I knew somebody who introduced me to somebody who gave me a chance. When I got that chance, I took full advantage of it and gave my best to have a good outcome. That was kind of how I kept moving forward in the industry and got a lot of the great opportunities that I did. I never wasted an opportunity that came my way.

Christopher: It’s a great lesson and reminder also, of how to look at your life. Not just from that you want to have a successful career but you want to have a certain experience as well, it’s really great. It’s awesome. Interstellar is a great example. To touch on that before we move a little closer to the present, what was your exact role? You’re managing the effects team I imagine, but you can talk a little bit more about what your core responsibilities were and what you think made a difference in that particular film, that project?

Digital Department

Jeffrey: Not quite. For New Deal, when I was working on Interstellar, being the CTO and digital supervisor I pretty much oversaw everything that happened in the digital department. So if it was something that went through a computer, it was something that I was overseeing or managing. So for that one, it was interesting. We were doing tons of 3D printing at the time, so I helped New Deal switch to in-house 3D printing from using external services. We also brought things like laser cutting in-house. So we had systems dedicated to those workflows. I pretty much had to become knowledgeable on all of those processes. So I would learn it and then I would teach others to do it if they had not done it before. We used various kinds of 3D printers, so we used laser based SLA printers as well as the cheaper fused deposition printers, depending on the kind of parts that we were making. The big part of my job, since New Deal was a very small company that worked on really big films, was cost and efficiency management. They were always able to do a lot with a little [laughs]. So their whole unofficial motto was, you want to see the money on the screen and not have it wasted. So all of the money that goes into a film, you want it to show up in the film and have people be wowed by it, not to have a bunch of waste in the system. So my personal philosophy when I took over New Deal’s digital department was the idea of KISS, or “keep it simple, stupid.” So we had a really simple pipeline. We didn’t have in-house developers having to custom code anything…

Christopher: When you talk about the really simple pipeline, can you talk a little bit—I think you’re really talking about workstations and servers but also the application mix, right?

Jeffrey: Yeah, the whole end to end process pipeline from idea to delivery. So what we did was we worked with our software partners instead of doing a bunch of in-house stuff. We participated closely in alpha and beta programs with all of our vendors, and that gave us a say in the features and workflow that went into the software. So we were able to push things in the direction that we needed, and it also got us very early access to new workflows as they were coming up, especially with our relationship with companies like The Foundry. When we got into VR, they were just thinking about VR for Nuke. I was building my own tools on top of Nuke to do VR work, since there wasn’t anything in the market yet. So I worked really closely through the alphas and betas of what eventually became their Cara VR product. It was incredible having that close collaboration with their research folks.

Christopher: So are you talking about that–so you were there in the beginning and you helped build them? Did you help build Cara VR, or is that the program that it became?

Jeffrey: We didn’t build it. We worked with the head of research from The Foundry and the development team who were developing Cara VR, and we would come up with ideas to solve the issues I needed to solve. Pretty much, this is how I think it can be done using your tools, or you might have to build a new tool to do it. And then they would do the heavy lifting of actually building that to best work in their software. I had built some tools before in Nuke for doing VR work, and they took some of the same principles of my tools, built those concepts into Cara VR and made them much, much better and more optimized. They were actually able to build native tools with GPU acceleration that just totally outperformed my stuff and were much easier to use.

Christopher: I want to hop back to Interstellar for a second, because you mentioned an interesting concept and I just wanted to be clear. You were saying that on the film it sounded like you had companies, post production companies in particular, creating applications specifically for your work that were beta versions?

Jeffrey: For Interstellar, we were contracted to do tons of practical work. Most of the scenes in the movie where you have a ship in space, and specifically the scene where the main ship gets blown up, are all practical model spaceships. Christopher Nolan is very keen on things looking and feeling real, so he wanted to do them practically instead of digitally. So a big thing that we did on the film was take the previs and convert all of those into motion control shots. The computer controls the motion control rigs that drive our cameras. It was actually shot on celluloid film and not digital, because the movie was IMAX, so we were shooting film cameras on motion control rigs. So me and another artist programmed every single one of those camera moves for all of these scenes. A lot of times when you get previs, cameras are doing things that they just can’t do in the real world. They’re moving through parts of the ship, so we’d have to rework the camera moves that we got from previs to actually work in the real world. But yeah, we programmed, I forget how many it was, but I want to say somewhere around 150-180 MoCo moves, and then like I said—

Christopher: MoCo – Motion Control?

Jeffrey: Yes.

Christopher: Right. So to program the cameras, what cameras did you use but also what software to program the moves?

Jeffrey: So the cameras are film cameras, so non-digital, Super 35 cameras, and I think we were using a Graphlite motion control rig for that, which is a large motion control rig. It kind of looks like a crane on railroad tracks, and it’s controlled by proprietary software. What we do is we have an accurate model of our sound stages, the motion control rig, and any other shooting areas, and we actually do a digital take for each and every shot in our 3D software. Then we have an exporter that converts the 3D camera position into the real world position of the cameras for motion control. Motion control needs special formatting for camera position, so we export these text files with camera height off the ground, camera roll, camera tilt, etc. So we took all of that, formatted it, and made nicely formatted text files that would then go to the motion control operator, and the MoCo rig would be able to run through the move. If they made any changes on the set, so if they decided to change the shot a little bit, we could then take the changes from the MoCo operator and load them back into our digital camera in our 3D software and update our shot with the changes from set.
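The exporter described above can be sketched in a few lines. To be clear, this is a hypothetical illustration: the actual column layout a MoCo rig expects is proprietary and rig-specific, and the function and file names here are invented. It only shows the general shape of the job, turning per-frame camera parameters from 3D software into a plain text file for the operator.

```python
# Hedged sketch of a 3D-camera-to-motion-control exporter. Each frame of the
# digital camera move becomes one line of a text file: frame number, camera
# height off the ground, roll, tilt, and pan. The format is illustrative,
# not any real rig's specification.

def export_moco_move(frames, path):
    """frames: list of dicts with keys frame, height, roll, tilt, pan."""
    with open(path, "w") as f:
        f.write("frame height_m roll_deg tilt_deg pan_deg\n")
        for fr in frames:
            f.write(
                "{frame:d} {height:.4f} {roll:.3f} {tilt:.3f} {pan:.3f}\n".format(**fr)
            )

# Two frames of a hypothetical camera move, as they might come out of the
# 3D package's animation curves:
frames = [
    {"frame": 1, "height": 1.5000, "roll": 0.0, "tilt": -5.0, "pan": 90.0},
    {"frame": 2, "height": 1.5021, "roll": 0.1, "tilt": -5.2, "pan": 90.4},
]
export_moco_move(frames, "shot_042_move.txt")
```

The round trip Jeffrey mentions, loading the operator’s on-set changes back into the 3D camera, would just be the inverse: parse the edited text file and write the values back onto the digital camera’s animation channels.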

In parallel with that, since we were building the spaceships as practical models at multiple scales, we were getting concept designs for them. So we would convert the concept designs into actual CAD accurate, buildable digital models. We used Rhino and Modo to break the model out into all of its pieces and convert them into buildable assets. Just like if you were going to build a house, you need accurate real world blueprints or CAD for that. We were doing that for the models and figuring out how all the various parts should go together on a real physical model, as well as the materials. We also converted the parts that we were going to 3D print. To 3D print stuff, you have to make what’s called a watertight model, so the model piece has to be solid and clean in such a way that if you were to put virtual water into that model it would hold it, and you also have to make sure all the geometry and supports are clean or the prints tend to fail. So we did all of that conversion of the concept work into buildable, CAD accurate models.
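The “watertight” requirement has a neat geometric test behind it: in a closed triangle mesh, every edge must be shared by exactly two faces; any edge owned by only one face is a hole the virtual water would leak through. Here is a minimal sketch of that check, an illustration of the principle rather than the actual tooling a studio would use (real mesh-repair software also checks normals, self-intersections, and so on).

```python
# Sketch of a watertightness check for a triangle mesh: count how many faces
# share each edge. A closed (watertight) 2-manifold mesh has every edge on
# exactly two faces. Illustrative only -- production mesh repair does more.

from collections import Counter

def is_watertight(faces):
    """faces: list of (i, j, k) vertex-index triangles."""
    edge_counts = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            # Sort so (u, v) and (v, u) count as the same edge.
            edge_counts[tuple(sorted((u, v)))] += 1
    return all(count == 2 for count in edge_counts.values())

# A tetrahedron (4 triangles over 4 vertices) is closed:
tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(is_watertight(tetra))      # True
# Remove one face and the mesh has a hole, so it would fail to print cleanly:
print(is_watertight(tetra[:3]))  # False
```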

Christopher: That’s incredible. The whole set, all the models, were all 3D printed.

Jeffrey: Not everything, but tons of ship detailing and parts. And then personally, there was the scene where the one lander docks unsuccessfully with the main ship; Matt Damon’s character docks and he’s trying to escape. When he pressurizes the dock, it causes a failure in the system and the ship collapses into the main spaceship, causing a huge amount of destruction, and we needed to do this practically on Earth. The explosion and destruction are happening in zero gravity, so I took the previs, took all the paths of all the debris that was exploding off the spaceship and traced the paths they were traveling. Using that I figured out the main direction of most of the debris, and that became our gravity orientation. So our real debris falls with gravity. We oriented the model accordingly, and that model was gigantic; it was taller than our building.

Christopher: Wait, the physical model that you used was bigger than the building you shot in?

Jeffrey: Yeah, it was taller than our building, so we had to shoot it outside at night. So I took the direction of the debris coming off, and we oriented the spaceship so that the debris was heading toward the ground. The spaceship was mounted on this big lift that held it at this odd, tilted angle, so the main debris would fall in the direction of the previs, and for the other stray pieces we had our mechanical engineer build a system to pull bits of debris off in the direction they needed to go on these special computer controlled cable rigs. So that’s how we were able to do a zero G explosion here on Earth.

Christopher: Wow. That’s really, really intense. That’s an incredible story in terms of the lengths you went for realism, just awesome. When you talk about then putting everything finally together in a digital environment, what were the core applications and what operating system?

Jeffrey: We used a lot. We used both MacOS and Windows depending on the tools that we used. The digital CAD side, our main programs were SolidWorks and Rhino, which are two CAD programs.

Christopher: Was that P.C.?

Jeffrey: Those are P.C. based, yeah, although Rhino now has a Mac version; when we were using them they were Windows based and the Mac version was in beta. On the digital side, we had Modo from The Foundry, which was one of our main modeling applications, and we also used it as our main render tool. So when we did rendering, we actually used Modo’s renderer, which is very unusual for our industry, but it worked out really, really great for us, saved us tons of money and had great quality.

Christopher: Here’s a quick question about that for Modo, so you used the built-in renderer; is that all CPU or does it harness GPU rendering?

Jeffrey: Modo is all CPU, yes. So at the time, GPU rendering wasn’t really a thing. The GPUs weren’t quite there yet to handle film production scenes.

Christopher: That brings up an interesting point, and I’m just wondering about your experience as a reference. It just occurred to me that Octane still doesn’t quite have all of the feature film ready tools that something like V-Ray has…so it’s important to consider what you’re using it for.

Jeffrey: I’d say Octane is used more in areas where it makes the most sense. So if you’re doing architectural visualization or design visualization, say you design a laptop and you want a beautifully rendered version of that design, Octane makes perfect sense for that. Or if you want to do a little camera fly-through of your architectural interior, then Octane is perfect for that. Octane excels when you need physically accurate lighting and stuff like that. If you’re doing a scene with millions of assets and many, many gigabytes worth of textures, Octane and GPU rendering are not going to be your thing, because with the limitations of GPU memory and I/O, the ability to handle that level of production scene is just not there…although that’s slowly fading away, I think. I think there’s a new AMD GPU with 1 terabyte of memory, which is crazy.

Christopher: Yeah, the Radeon Pro SSG card, have you used it? We had an early model in an i-X here in LA. Amazing, but AMD were still working out driver kinks.

Jeffrey: I have not. So the limitations of just what you can fit into GPU memory and stuff like that were kind of the biggest limiting factor for film production using it, and GPU rendering generally. Although those limitations have been falling away, and you’ve seen stuff like the opener to Westworld, which is stunningly gorgeous and all GPU rendering in Octane from Cinema 4D.

Christopher: Hold on one second. Yes the woman in a big wheel and there’s a horse, are you talking about that scene? It’s almost like a mosaic, a visual mosaic, but it’s beautiful. Very slow moving as well, right?

Jeffrey: Yeah, very slow moving, beautifully rendered. The opening to Westworld was done with Octane and GPU rendering.

Christopher: I’ve seen that, gorgeous, I agree.

Jeffrey: I’ve known some TV productions that have been picking up GPU rendering as part of their pipeline. So they use it when it makes sense, and both Octane and Redshift have gained steam there. It’s coming on strong. I think it’s the future, I think it’s definitely the future. When we did our project for HBO, we were all about GPU rendering; for me it’s definitely the way moving forward. Not only real time rendering, like game engine based rendering, but also GPU renderers like Redshift.

Christopher: That’s going to be where I think we need to leave it and it’s maybe a good place because I do still want to talk about the present, and about the Rise of Game Engines.

Part II of our interview with Jeffrey Jasper is now live! View it here.