Christopher: First off, thank you for making time in your schedule for this interview.  So to start — was there a moment, was there a single thing, where … the light went on for you about becoming a digital artist, a visual effects artist? And on that note, what do you call yourself?

Colie: Yeah. Well, today, I call myself a concept artist, with a slant on modeling. So that’s what I would — that’s how I call myself. So in some ways, I’m almost like a little art director. So I think that’s — it’s sort of — I guess a new age art director, where I can help somebody come up with an idea, and then I can help them flesh it out, for a set build, or something like that. Or something that needs to be 3D printed. So I kind of help people design things. So not necessarily an environment artist, but if you need an interior set design, I can design interiors. Like let’s say you’re building a spaceship, and it’s got landing gear, a ramp, doors that need to open, and levers that need to move, and engines that need to turn. I can help figure out how all of that stuff moves through animation. And if you need shader work done, trying to figure out what a look is on a paint job, or a material, I can help you with that. A little bit of a jack of all trades, based on what — you tell me what you need, and I’ll figure out a way to do it. And that’s kind of the way that I got into the industry as well.

Christopher: So was there a moment for you when you went, wow, I want to do that, exactly that.

Colie: Yeah. Well, I was an architect in South Carolina. I went to Clemson. Yeah, no kidding. About time. I’m a football junkie.

Christopher: Yeah, it’s really good for them. I know another girl who went to Clemson, lives here in L.A. She’s just nuts, she wears orange every weekend.

Colie: Yeah man, I’m off the grid come weekend time. But I enjoyed — the school to become an architect is a lot of thinking, and a lot of philosophy of design, etc. etc. I got out, and I started doing architecture for real, and it wasn’t quite as fun as the heuristic design aspects of school. It wasn’t as rich to me, which was a good thing to admit to myself. And a friend called me and asked me if I wanted to come out to L.A. and work in multimedia. If we put a timetable on this, this was about the time that Mac clones were coming out.

Christopher: Oh yeah.

Colie: So Power Computing, things like that, were brands that existed…

Christopher: 1993, ‘94, ‘95…

Colie: Yeah, I moved out in ‘96. Yeah. So I drove across the country, kind of packed my bags, and said, “I’ll give it a shot.” And the sale was, you’ll be able to go back to architecture if you want to, and you’ll probably know Photoshop. So that was sort of the, why not, let’s try it. So I moved to Hollywood, and learned — I was doing multimedia at that time, which is — then, it was just — it’s essentially a touch sensitive screen maybe, or click the button, go to a new screen type stuff. It was like user interface design, for whatever the subject matter was. And it was a government job. And it was great, because I was living in Hollywood, and I was working in Burbank, at a little — you’d call it a startup now, but it existed to do some of this type of work, during the — I think Clinton was closing down military bases, and each military base wanted a presentation of why that military base needed to stay open. And I think McClellan was the one that I did the most work with. So whenever these groups would come through and need to know — like they’d go to McClellan and want to know — instead of having to go through presentation through presentation, they could go out to these kiosks and go through our multimedia presentation, and they would explain to them why McClellan needed to stay open. So I learned Photoshop, and I learned some simple programming and stuff like that inside of some multimedia tools, on the PC and on the Mac. And layers had just been introduced in Photoshop, so that’s when I came in as well. So that’s another kind of dating tool that you would be saying, oh yeah, that was like Photoshop 3; so I came in around Photoshop 3. And the group that I was working with — there was about four of us, and we wanted to expand our ability to do more visual effects work, not just stills for multimedia. And I got handed kind of a suite of tools…

Christopher: Was it like Macromedia and Director and all of those?

Colie: We were using Macromedia at the time for some of our programming. But what we started to do was, at night, we were designing our slides for our multimedia presentations by starting to model stuff, and paint stuff, paint maps, and composite stuff, with little animations. And the suite of tools that we were using were Electric Image, and FormZ, and — Electric Image for rendering and texturing and animating. FormZ for modeling. It was CoSA After Effects at the time, but it’s Adobe After Effects now, for compositing. And Photoshop for painting maps and stuff like that. So those were our tools. And we could run them on Macs. And it was easy to get Macs at that time that wouldn’t break, because of Power Computing and other little companies like that. So you weren’t having to buy straight Macintosh machines, Apple machines. So we could afford to do that, and we had a little studio in Burbank that we worked with and for. And we were outfitted with these things, and we trained ourselves on this suite of tools, and I immediately began to realize that the stuff that I enjoyed in college was modeling. Not on the computer, but actual architectural modeling, with wood, and things like that. And I would get that flavor again using FormZ to model. And I…

Christopher: So just to pause there for a second. So on some level, you knew that you liked the design work, but then working with FormZ in particular, a light went on for you too, like this is my sweet spot, modeling.

Colie: Yeah, modeling. I like modeling, and I wanted to do more of it. And then I realized, oh my gosh, I can paint these things digitally as well. So I focused on that, and then I got better at texturing, and kind of learning that — back in the day, everything got spit out of the computer really clean, and you needed to rough it up; stuff that you take for granted now. And we had actually studied imperfections and things, and that was — to me, that was a lot of fun. Everything wasn't clean, white. Even white had little things that were imperfect in it, like a fingerprint in a spec map or something like that. And stuff like that was becoming more and more attractive to me, and I was like, you know what, I don't know that I really want to go back to South Carolina. People out here seemed to like what I brought to the table, not to mention that I was doing a lot of drawing for set building companies, because I sketch a lot. So I was working during the day at my government job, and then at lunch, I would go from Burbank to North Hollywood and draw for a set design company. And then come back and finish my day after lunch. And then at night, I might go over to Pasadena, where Electric Image was, and learn some of their tricks. So I had very full days, because I was essentially in what I look back on now and call graduate school. Because I realized that I wasn't really having the fun that I thought I was going to have in architecture, and got this opportunity to explore, via my architectural design background, a new way of working — a 2D deliverable versus a built structure, where I had to worry about shedding water, and whether the client was going to be happy with the windows, things like that. And it was a faster process; I got a faster return on investment. In other words, projects didn't last a year and a half like they do if you're building somebody's house.
They lasted six weeks at most. So that's the long answer to, when did the light hit that this is what I wanted to do. So I moved into the, I think I'm going to be stuck here in L.A. And I enjoyed it, but I didn't know what I was going to do exactly. But I knew that I liked it. So I ended up designing a spaceship for fun, and entered it into a contest — oddly enough, it was through Electric Image. And I won the contest, and one of the judges was John Knoll, who wrote Photoshop with his brother. And he was a visual effects supervisor at ILM. He was busy working on a Star Trek movie, and he was getting ready to start doing some supervision work for George's first re-release of the Star Wars movies. So episodes four, five, and six were getting re-released by George, like in 1999, I think. And he needed somebody to help some of the matte painters at ILM in building models, and I was like, alright, let's do it. So me, and oddly enough, the guy that called me up to invite me to L.A. in the first place — we got jobs at ILM. And the rest is sort of history. But I just happened to have entered a contest; I wasn't aiming to work at ILM or anything like that. My goal was to find something that I could enjoy doing with design. I just didn't know that anything actually existed out there that I could truly enjoy as much as I did working in the architectural lab at Clemson. And I found it.

Christopher: Yeah, well our client Billy Brooks actually grew up in North Carolina.

Colie: Yeah, Billy always wanted to work for ILM. That was one of his goals. And when I was flying up to ILM for my interview, there was a Cinefex issue that had come out that month, and it had the Millennium Falcon on the cover; it was kind of like the history of ILM. And I was really glad that that particular Cinefex had come out, because I knew nothing about ILM. I was like, oh cool. I might get to go work on a Star Wars movie, but I know nothing about ILM. So I was researching who Dennis Muren was, and all of these other people that were pretty important in the field, while flying out there. So I was kind of like, I just want to get that same high that I got when I was working in the architecture studio.

Christopher: Dennis Muren!

Colie: Mm-hm. These were all of the important people in special effects; I just didn't know who they were. To be honest, it never occurred to me that I should even care. Like I said, I was in my early or mid 20s, and all I cared about was — am I supposed to be a designer? Because we all feel like we're not valid for so long. Am I just a pretend designer, or am I just a pretend architect? When do I become someone that's essentially valid? And I started to feel some validity when I was in Hollywood, because people wanted me to draw for them. And I would contribute something that I had gotten better at over a lifetime of drawing. And people would say, "we like what you're doing, can you do this for us again?" It wasn't like, can you do me a favor; it was, we want you, we want to pay you. And I was like, what the hell? So that was all new to me. I didn't feel like I was getting taken advantage of in a negative or passive way… I was part of an effort, I felt like I was actually getting utilized, and contributing. And I had never felt that way before.

Christopher: That is a big deal. So … you're up there in Marin, you're working at ILM. Now we're talking about the late '90s, '98, '99?

Colie: ‘99, 2000, 2001. And I worked for ILM for about 11 years at that point.

Christopher: Got it.

Colie: So that was — my first stint was from the re-releases, through the prequels, and Men in Black, and Galaxy Quest, and a bunch, a bunch of movies. All the way through — the last movie I worked on was the first Transformers movie.

Christopher: Yeah, the big robots.

Colie: Yeah, Michael Bay's first Transformers movie, and then maybe a little bit of Iron Man. And then I spilled over, and I left ILM at that point. And the reason I left ILM — when I first started working at ILM, I was part of what was called the Rebel Mac unit, or the "Mac unit." The Rebel Mac unit was a small group of people using a set of off-the-shelf tools — John Knoll was using the Macintosh outside of the ILM pipeline. He was using FormZ, Electric Image, After Effects, and Photoshop to do shots that took on most aspects of the pipeline used to execute a shot — shots that were not creature shots, or shots with dinosaurs. The shots were like the space-battle, Star Wars-style shots, where you have a bunch of ships flying around in an environment. That particular suite of tools was super valuable for that kind of work, and it took a fraction of the time, because one or two artists could present a vast amount of information visually for review. It was a generalist, boutique approach outside the pipeline there. The main pipe at ILM was built on Softimage, which wasn't getting the updates it deserved from whoever had bought it… maybe Microsoft? So the pipeline's foundation wasn't antiquated, but it was getting close to needing some attention, because the big post houses at the time built their proprietary tools on top of an existing base. And of course Maya came along, and on Star Wars: Episode I, we could see that a smart move would be to rebuild that foundation on top of Maya. ILM kind of scrapped the entire pipeline's base, which is a massive undertaking, and rebuilt everything to work on a base of Maya, versus a base of Softimage. And that was — I'm still using Maya in this space, and most studios do. So that was fun to be a part of. But yeah, the Rebel Mac unit kind of disappeared, because boutiques have a hard time competing with industrial pipes, and we all got rolled into the big pipeline at ILM. A lot of us left; some stayed.
Weta Digital had just popped up in New Zealand, and that attracted a few of us. I integrated as a modeler, viewpainter, and matte painter at ILM over the next few years, but the writing was on the wall in my head. It was time for me to leave ILM at that point. That's when I went to The Orphanage.

Christopher: Can you talk more about the tools and transition, the new tools that you integrated into your work?

Colie: Yeah, so definitely it was Maya; Maya was the one. We had a couple of tools that were ILM-specific, that allowed us to use Photoshop, and ViewPaint, which is like what Mari does now. And the tools that we were using were in-house, all proprietary. And because our rendering and our shader development were still mostly Renderman, we had TDs, which really did nothing but wrangle the shaders and our paint work and our maps, and light the objects to show us what we'd done… called look dev. Because I was doing texture painting and modeling at that time, we had to have a TD to assign all of our maps to the shaders, and tweak the shaders. So whoever was running the show would update certain characteristics of a shader, and it would break all of our maps. It's just what can happen on a big show. The CG head of a show would tweak a variable, and it would spill downstream to all the other TDs, and all of a sudden… bam… my texture maps weren't valid. And that kind of stuff got kind of frustrating, albeit "that's what I'm getting paid to do." And that's when I was ready to go, because I was doing paint jobs, and I couldn't see what my paint jobs were doing on the surfaces of my model. I'd have no idea what they looked like until two days later. So the turnaround time was just ridiculous to me as a generalist, though very valid inside the pipeline, and I'm like, "man, this is not a healthy workflow for me… I've seen the value of near-immediate turnaround in my flow."

Christopher: And, the two day render jobs were happening in Renderman?

Colie: Yeah, I think it was Renderman at that time still. But it wasn't that it took two days to render; it was that I had my model, I was using it, I would paint some maps, and I would name the maps properly — and if my TD was busy, they could not get to putting those maps on my model to render for two days, so I'd have to wait two days for that to get updated. And I was like, I need to be able to test this stuff and figure it out for myself. And I didn't see the benefit of learning how to be a TD at ILM — back then it took all of your creative time, because you're wrangling so much shit. So I was like, this is just beyond me now. I'm not up for this gigantic learning phase on proprietary software… I want to be doing this on off-the-shelf software. Companies were doing look dev on smaller shows in Maya. That pushed me.

Christopher: Sorry Colie, did you say TD?

Colie: Yeah, technical director. Billy Brooks folded into the big pipeline initially as a TD. And you've got to really, really understand, and keep up with a lot of emails, because updates to the shaders would go out. It was a true industrial pipeline, but with R&D going on on the side of it. Remember, ILM was ahead of everyone then developmentally (early 2000s)… R&D was part of the DNA, and they made room for it in both time and budget. So as a creative, you had to understand that your maps not working on a model, because a shader got a significant update the night before, was part of the process, and that ILM prided itself on the very changes and decisions made at those higher levels in the pipeline. They weren't bad decisions; they were just changes to a prototype that trickled down to the prototypes below it. It took patience and time to be in the pipe.

Christopher: But it also sounds like there’s — it’s becoming almost like a bureaucratic form of production, where you’ve got all of these touchpoints that need approval, or integration.

Colie: If you were trying to get a real feel for what something might look like, you were not getting it, because you had to wait for so long. You were like, oh, I don't even remember painting that map. You're like, oh, wait a minute, I updated that map, and I checked it into our check-in system. It was something that was going on all the time; I just wasn't really up for it. And I knew that if I wanted to do it all myself, I was going to have to do it by myself. So I left a company that had the luxury of people doing one job and one job only, and went to a place where they needed generalists — which I realized the Rebel Mac unit had prepared me to be…

Christopher: And that was The Orphanage.

Colie: Yeah, so that's what I did at The Orphanage. I was put in with a bunch of people who taught me how to use Mental Ray, and Mental Ray was competitive with Renderman at that time. Not as network-robust, but the renders were competitive. And then V-Ray came about, and these are ray tracers that worked really well. And you didn't have to know all of the switches in Renderman. I mean, granted, these things were, no doubt, based on the same philosophies as Renderman, but the names of the dials had changed, and were a little bit simpler. This was when I began to realize that the VFX software industry had started to catch up with all the bells and whistles of what ILM had been developing. And they were based on true environmental lighting situations, where you could put an HDRI background in, and have an environment light your object or your scene. You could go out and shoot chrome spheres, and unwrap them, and let that be your environment. You could be fairly certain that the image you were using as you were lighting an environment would actually light your object properly. And it was all based on reflection, because most of the light that we see lighting objects is reflected light. The surface of something has a reflective quality; is it a dull reflective quality, or is it a glossy reflective quality? But it's all reflection. And that was different from the paradigm we were working with at ILM, which was not reflection-based at my time of departure; it was more specular-based. So we were actually doing some guessing. Oddly enough, now we've kind of moved away from reflections again, and we're moving back towards more of a specularity-based lighting scenario. The switches and tools and knobs and things that have come about with Unreal, and KeyShot, and that type of stuff, have been even further simplified.
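[Editor's note: the image-based lighting Colie describes — unwrapping a chrome sphere or HDRI into a lat-long environment image and letting reflections light the object — can be sketched in a few lines of plain Python. This is an illustrative sketch only, not any renderer's actual API; the function names and the equirectangular convention (forward = -Z, up = +Y) are assumptions for the example.]

```python
import math

def reflect(d, n):
    """Reflect a direction d about a unit surface normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def equirect_uv(direction):
    """Map a unit direction to (u, v) in [0, 1] on a lat-long (equirectangular)
    HDRI. Convention assumed here: forward = -Z, up = +Y."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# A view ray looking straight ahead (-Z) hits a surface facing the camera (+Z)
# and bounces straight back toward +Z:
r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))   # (0.0, 0.0, 1.0)
```

Sampling the HDRI at the (u, v) returned for the reflected ray gives a mirror reflection; a duller surface would average many samples spread around that direction, which is why "it's all reflection" covers both glossy and matte looks.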

Christopher: Mental Ray was probably the best renderer for Maya at the time. And so from The Orphanage, where did you go?

Colie: So I got an invite from Doug Chiang to move up to ImageMovers Digital, up in Novato, just a few miles from my house. We worked on a few movies up there. I was in an art department, so I was using my suite of tools to do still art, versus full-blown shots like I was doing at The Orphanage. So I was kind of moving away from shots and more into stills. So I was taking my visual effects background and knowledge of the way that light behaves on surfaces, and applying it to more of what I'm doing now, which is concept art. And that was through an art department, and I enjoyed that. That was another one of those scenarios where I was introduced to a lot of current tools, like KeyShot — and I started learning Mari up there. And with Mari, you really were able to see all of your maps on an object. Mari was sort of the game changer as far as having all of your maps work while you painted, to see what was going on. And then that place closed down, and I went into 3D printing for a while. I worked for a company called 3D Systems, and spent a lot of time doing industrial design, like two years, I think, and then testing printers, and understanding tolerances. A lot of stuff that was related to what I was doing, but not quite the same. Because I wasn't using a lot of precision modeling tools — parametric modeling tools, like Rhino. I always found it really difficult to design inside of tools where you were supposed to already know what the thing looked like before you started modeling it. So there was no flow like with, say, ZBrush. With ZBrush you just pick it up and you start banging stuff out, and all of a sudden, you've got something you like.

Christopher: Well speaking of tools again, before we go on to that, maybe leading in to — I think you were working on the Dune film, is that right?

Colie: Yeah, I worked on Dune last year.

Christopher: Could you just take a sec maybe and kind of acknowledge what your current primary toolset is? Like these are the five apps I’m in all the time, or maybe it’s just three.

Colie: So the five apps I'm in all the time are, let's see. Definitely Maya. OctaneRender. And if I'm not inside of Octane, I might be bouncing over to KeyShot. And Mari. Let's see. After Effects. Photoshop, and I guess a little bit of Unreal. And then anything in VR, like Gravity Sketch. Those are my big tools.

Christopher: Octane may be the best renderer for Maya.  So do you not need the firepower of Nuke, is that why you don’t use it?

Colie: Well, I don't have to do a lot of compositing of footage and animation, because I'm typically only doing builds. If I'm doing anything where I actually do have to composite, After Effects is simple. I know it, and it gets the job done for me. I know Nuke, but I don't need to use it.

Christopher: Yeah, it’s kind of overkill; it’s got tools you just don’t need.

Colie: Yeah, right now I’d say — if I were still heavy in FX, I’d definitely have it. And I still do FX work. But because my slant is more on modeling, I’m kind of at a point in my career where I have the luxury of saying, I don’t really want to do that. I don’t really want to composite anymore. Or if I’m going to learn something new, I’d rather learn Substance. Or if I’m going to learn something new, I’d rather spend more time using Unreal. If I’m going to spend my money, I’d rather spend it on Mari. I’d rather have a good paint job than a good compositing tool.

Christopher: So in terms of like Dune, your work on Dune, what did you do there?

Colie: I probably shouldn’t talk too much about Dune quite yet.

Christopher: Got it.  When is it supposed to be released?

Colie: 2020.

Christopher: 2020, okay. Well and that’s fine. It’s a big project.

Colie: Yeah. Sorry.

Christopher: A lot of people are excited about it and interested. So … to talk a little bit about hardware … you were in the Rebel Mac unit at ILM, using Macs. How did that change? Did that change at The Orphanage, or later? Because you ended up purchasing the a-X Mediaworkstation, which is a highly multithreaded AMD Windows workstation.

Colie: Yeah. So after the Rebel Mac unit went away, we typically had a Linux machine set up at ILM and a PC. And all of the scripts that were written at ILM for the pipeline always ran really well on the Linux machine, and over time, they ran really well on PCs. The bridges that were built between those two pieces of hardware, they disappeared. If you were a new painter using Mari, you needed a beefier graphics setup. When I got out of ILM this last time, and was buying my own machine, I knew I needed something that would handle that, because that would be what I would consider my worst-case scenario as far as taxing the GPUs and the CPU. And then of course RAM is RAM; the more you get, the better. And RAM is not going to break the bank like it used to. So I got those three RTX 2080s, and they're great.

Christopher: Yeah. And that’s probably worth talking about too, because going from your traditional renderers, like probably the first modern renderer you used was MentalRay. What does working in Octane, on your a-X Mediaworkstation with 3x RTX 2080 GPUs – what does that allow you to do as an artist that you, say, couldn’t really do before?

Colie: The goal for me is to be able to model and see the shapes as I design in the model. I want to be able to shape them in the environment that I think they're going to be in, because the environment might influence the look and the shape of things. So if I want to move some CVs, or move some control points, or vertices, I can see what that looks like while I'm modeling in my lighting environment; I'm actually getting that feedback inside of Octane. And I don't have to break that flow; there's none of that flow break we were talking about earlier, when I was having to wait two days for my map to be thrown on the model to see what it would look like. And I don't just want to do it in a shaded render out of the modeler, like a flat shade, or an ambient occlusion shading, or something that comes standard inside of modeling packages like Maya. If this object is going to be seen at night, I want a nighttime environment. I want to be modeling in a nighttime environment.

Christopher: In a way — it’s like you want to be on set when you’re creating.

Colie: Yeah. As much as I can. And the other thing that’s really great is, I don’t have to stop what I’m doing, save that file, open it in another renderer and go. If I’m working, I’m — at any point in time, I can say, okay, because my GPUs are so fast, I’ll just let this — I’ll let this image on my screen finish rendering, and then I’ll just take a snapshot of it and save it out as a — all of that is a finished render of a model. And maybe it’s grey shaded, maybe it has a few maps on it, who knows. It doesn’t really matter. It’s just that I’m modeling something that I used to have to do a lot of preparation for a presentation. Now I’m just — there is no prep for the presentation. It’s just like where I am, I’m done, and I let the screen finish rendering for a few seconds and I’m done.

Christopher: So were you using your a-X Mediaworkstation for the concept art and modeling on Dune?

Colie: Yeah.

Christopher: So that really makes a difference in terms of — I guess what you’re really saying, it’s like, just let me iterate in real time.

Colie: Pretty much. I need to see — if I need to see how something might look, I don’t have to say, okay, let me bounce over into KeyShot, give me ten minutes. Or let me save out this file, and reapply these shaders; give me ten minutes. No.  When I start my model I’ve got Octane shaders on it, and all of the Octane settings are set. And whenever I say I’ll just turn — in my viewport, I just turn Octane render on, and my render starts. It’s just ready to go all the time.

Christopher: Yeah. You mentioned CV as one of the data points. What does that refer to, CV?

Colie: CV stands for control vertex. Essentially it's a vertex, a point on a line. So if you can imagine a curve with three points on it, I can grab any one of those points and move it. In certain types of modeling programs it's called a CV, and in other types of modeling programs it's just called a vertex, or even simpler, a "point."
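[Editor's note: Colie's "grab a point on a curve and move it" can be sketched in plain Python, independent of any modeling package. Maya does this through its own NURBS curve nodes; the Bézier evaluation below (de Casteljau's algorithm) is just a stand-in to show how moving one CV steers the whole curve.]

```python
def bezier_point(cvs, t):
    """Evaluate a Bezier curve at parameter t (0..1) with de Casteljau's
    algorithm; cvs is a list of (x, y) control vertices."""
    pts = [tuple(p) for p in cvs]
    while len(pts) > 1:
        # Repeatedly interpolate between neighboring points until one remains.
        pts = [tuple((1.0 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Three CVs defining a flat curve...
cvs = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
# ..."grab" the middle CV and pull it up two units; the curve bends.
cvs[1] = (1.0, 2.0)
midpoint = bezier_point(cvs, 0.5)   # (1.0, 1.0)
```

Note that the curve's midpoint rises to y = 1.0 even though the CV sits at y = 2.0: a CV is a handle that influences the curve rather than a point the curve passes through, which is why it gets its own name.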

Christopher:  Got it – thank you.  So, to wrap up, when you’re creating, what is your mission, or your intention? What is that thing that lights you up? And then from there, where do you want to go? Where do you see your practice, your art, taking you?

Colie: My goal is to evoke. So everybody wants to evoke. If you’re going to communicate a form — I’ve learned how to do that. And I can draw a form, I can paint a form, and I can model a form. I can put that form under a bunch of lights, and everybody gets to look at it and go, oh, that’s what that is. But what I’m pushing myself to do now is not just letting that form exist alone. But helping — putting it in an environment, and giving it a mood that gives it a little bit more of a context, a history, or a story. And a lot of times, that’s just in a still. It was easy when you had — putting stuff in shots, because I would build these objects and then throw them into a plate, that was pre-shot on some helicopter or something like that. So that made it easy. But now I’m having to kind of come up with what that background is. And I’m finding that there’s a lot of tools out there including — Octane is really good at it, but a lot of subsidiary — or not subsidiary but tertiary tools, like Quixel and Substance Designer, and — I’d say mainly Quixel right now, Quixel Mixer. Where you can get these pretty hefty displacement maps and environments like a forest floor with puddles and stuff like that. Being able to use these with Octane and have that work, along with making like a — putting environment fog in your scene really easily and quickly inside of Octane, and having my model be in Maya and Octane, makes things a lot easier. And I’m able to adjust and kind of get that mood going pretty quickly. And that’s important, because once I get a vibe, then I can start adding — that’s when I start sweetening things up. That’s what I want to do.

And I think that with Unreal, that’s another thing that I need to get into a little bit more, because I think I’ve got everything I need to be able to move forward on what I would call a VFX path. And I’ve already been down the VFX path, where all of the models are high resolution, and I’ve carefully painted something in Mari and carefully modeled it. Now I want to be able to take all of that stuff and maybe dial it back. Instead of spending all of that time painting a perfectly UV’d model, maybe I send it through Substance and let Substance do the paint job for me. And then I’ll adjust it a little bit inside of Mari. I want to move from a VFX side of working to more of a new game-style of working. Where maybe in the end, if I’m working on a still, I do a lot of paint work over the top of something. But I’ve got it rendered 90% there.

Christopher: And you’d say you’re in Unreal Engine, rather than Unity? Is that what you use?

Colie: Yeah, I prefer Unreal. They’re both good tools.

Christopher: And Unreal — I mean, it’s kind of an all-in-one solution, from my understanding of it. Working with this tool, what does it bring? How does it support what you’re trying to express? You mentioned evoke, which is maybe it: when you see a still, if it has a mood, something, you get it. It’s unmistakable, you feel it.

Colie: Yeah. So as I train in Unreal, what I’m finding is that — the environment, lighting — well, you can kind of do a lot of this in Octane already. But to be fair, you can do all of it in Unreal. And outside of integrating something into VR, I can’t quite figure out how, in my world, I’m going to be working in Unreal, other than generating a complex environment. Like let’s say I’ve got 30 or 40 pieces of a hangar, and I want to just hack them all together and see what it looks like. I’ll throw my VR headset on and run around in there; I can do that. There are a lot of terrain-building tools inside of Unreal that I’ve not really explored yet, but I know they’re in there, and I can easily do stuff with them. I’ve already played around with it a little bit. But to be fair, I haven’t integrated it fully into my workflow. Unreal does seem to be the next wave of how filmmakers are going to work, though.

Christopher: Do you see — or do you sense something in the future that you are headed towards? Like you sense that Unreal is providing something in the tool arena for expressing something, or a way of working. But is there anything else out there that you feel you’re gravitating towards, or that you want, ultimately, to create?

Colie: I think I’d want to be modeling more in VR. That’s where I’d like to see things go, in a dream world. I see myself being able to import cockpits and environments into VR, and I’m going to have to do that through Unreal. I think this is partly due to the fact that so many people are working remotely, meeting together through a VR setup. Once that’s really nailed down, we’ll be able to walk around in environments for critiques and reviews, which we really couldn’t even do when we were all in the same studio a decade ago. So for me, that would be the next great thing. And it seems like we’re right on the cusp of being able to do that anyway.

Christopher: Yeah, yeah. Well Colie, I think I’ll have some follow up questions which I’ll email to you, but this is a really…

Colie: Yeah, I think I was starting to fade on you a little bit at the end, sorry, man.

Christopher: No, I don’t think you’re fading at all! I think interviewing is an art form in itself. I’ll go back through the interview and maybe see some things that I forgot to ask you. I really want to provide an accurate picture, but also talk about the specifics, the tools, and we did that.

Colie: Yeah, I get that.

Christopher: And how they help support what you do. Colie, again, thanks so much. It was a great interview and wonderful talking to you.

Colie: Right on, brother.

Christopher Johnson

Christopher Johnson founded Mediaworkstations.net in 2010, after a career as a successful producer and business development professional. He saw a challenge then facing content creators and technical professionals: where can you find in-depth, current hardware and software expertise that maximizes productivity and digital processing power cost-effectively?

He found that many of the people he worked with were hard-pressed to find representatives at major manufacturers such as Dell, HP, or Apple with in-depth awareness of professional optimization, or knowledge of how best to configure hardware for exact computing needs. In addition, many such companies’ infrastructures and products present obstacles to communication, optimized configurations, procurement, and implementation.

Christopher’s company, Mediaworkstations.net, builds custom hardware optimized for the work you do. Mediaworkstations.net is a performance computing company, and this is reflected in each part selected for each workstation and server. Whether you use Autodesk Maya, Octane Render, and ZBrush; Cinema 4D, Redshift, and After Effects; DaVinci Resolve, Premiere Pro, and Photoshop; Houdini, V-Ray, and Unreal Engine; or have apps like AutoCAD or MATLAB at the core of your workflow, Christopher’s state-of-the-art hardware and software knowledge provides performance- and reliability-based solutions to your computing needs.