The Future of Visual Effects with John Gajdecki (Special)

John Gajdecki, the man behind some of Stargate’s most amazing effects, returns to address the elephant in the room — AI — and explore how it will shape the visual effects industry in the years to come.

Share This Video ► https://youtube.com/live/Sjl9YlGvb-8
Visit John’s Web site ► https://www.johngajdecki.com/

Visit DialtheGate ► https://www.dialthegate.com
on Facebook ► https://www.facebook.com/dialthegate
on Instagram ► https://instagram.com/dialthegateshow
on Twitter ► https://twitter.com/dial_the_gate
Visit this episode on IMDb ► https://www.imdb.com/title/tt38652517
Visit Wormhole X-Tremists ► https://www.youtube.com/WormholeXTremists

SUBSCRIBE!
https://youtube.com/dialthegate/

Timecodes
Coming Soon!

***

“Stargate” and all related materials are owned by MGM Studios and MGM Television.

#Stargate
#DialtheGate
#turtletimeline
#wxtremists

TRANSCRIPT
Find an error? Submit it here.

David Read:
Hello, everyone, and welcome to Dial the Gate: The Stargate Oral History Project, Episode 359. My name is David Read. Really appreciate you tuning in for this one. John Gajdecki, the visual effects supervisor for Stargate SG-1 and Stargate Atlantis, is joining us this hour. Our discussion is the future of visual effects. Now, that’s a very broad term and I’m hoping, John, with your help, that we will be able to narrow this down a little bit because, John, I’m seeing– How are you?

John Gajdecki:
I’m good, by the way, actually. I’ve been rushing to get stuff ready for today so I have as much stuff to show you guys as possible.

David Read:
Really appreciate it. Yes, Philippe Canat: “David, you’re late but it’s OK, I appreciate it.” I’ve been seeing what could only be described as this, for right or wrong, mass psychosis, or perhaps hysteria, surrounding AI and its transformation of the entertainment landscape in particular. And I’ve had people, to be perfectly honest, push back against me; they’ve seen me use some of this stuff on the show and pulled out of certain episodes over my using it. I’ve also had other people in the same episode on the backend be like, “Automation has been with us for centuries.” You have a ton of blue-collar jobs that are constantly being eliminated decade after decade. We don’t really bat an eye at it because we’re all used to it. We get into more of the white-collar territory and everyone tends to freak out a little bit more. I’m curious, with those broad strokes in place, and knowing that a lot of the footage we’re about to see was generated using AI, what your thoughts are on the cultural perception that we find ourselves in.

John Gajdecki:
Well, shit. Look–

David Read:
Don’t leave anything out, John.

John Gajdecki:
No. I’m just gonna need a bigger shovel. Look, I’m gonna start with a metaphor. And that is, years ago, I was trying to explain to my mom what I did and the kind of work that we did. And I explained how we would create these shots for visual effects and we would do a version, we would do another version. On House of David, actually, we got up to 173 versions of one shot, which is by far the highest that I’ve ever achieved, and I don’t know if it’s really a great thing to say. But my mom said, “Why don’t you just do the good version first?”

David Read:
She said that?

John Gajdecki:
She did say that. And I thought, “What a revolution in filmmaking that would be.”

David Read:
She solved it in one sentence.

John Gajdecki:
She really did and if I could only explain to her how much that sent us over budget. That’s my mom. My daughter is studying CG animation right now. And she fears for her job, the job she doesn’t have yet. And to be honest, so do I. There is a sea change coming. I lived it a year ago on House of David. I’ve seen what it can do. Everyone was dubious, and it was spectacular, and it’s only gotten better. That’s the thing. In the same way that when I started in the business, which was a couple of years ago, I filmed everything with this movie camera and there’s a can of film there. We used to build models. And we would shoot them and we would composite them photo-optically on optical printers. When digital came in, which was around 1991, a lot of people who were even a few years older than me didn’t survive the transition. So I was lucky to be really young at the time and I did everything I could to learn. I’m the guy who got 11% in calculus when I was in high school so I’m not always the smartest guy but I did everything I could to learn about computers, which we didn’t even have when I was in high school. That was something that I did. I really made sure I did it. Very successful. Now, ironically, years later, AI definitely favors people who are willing to invest emotionally, personally, intellectually in new processes and new technologies. But it doesn’t necessarily favor the young people, because it’s so easy to use that it favors the established people in some ways. I think it’s important to remember, when I do visual effects, I don’t use the tools. I’m the supervisor. I don’t do anything. If I do my job correctly, I’ve prepared everybody for what we’re gonna do. When we go on set, everyone knows what they’re gonna do, everybody knows what’s required that day. And as a team, the film unit does it. When I work with my artists, everybody knows what shots we’re working on. They’re gonna show me the works in progress. 
They’ll show me the Maya renders for the environments, and we’ll be showing you some of these. They’ll show me the Houdini renders for the people. They show me the AI renders, the AI generations. But I personally don’t do any of those things. I’m the supervisor. So, really what I do is I apply the process. I speak with the producers. I speak with the artists. I tell everybody what I want. I tell everybody what we’re doing, and I work with the artists to make sure that it looks right. They’re the foot soldiers. They have to know it inside and out. And as people do, it’s gonna change the world. Those artists who are gonna use those tools, it’s so easy now. I do appreciate, and I do understand that it will unemploy people in the same way that the digital revolution unemployed people, in the same way that the Industrial Revolution unemployed people. And there’s a really interesting theory. I could go on about the Industrial Revolution all day long, but that’s not what we’re talking about today.

David Read:
It’s part of what we’re talking about today, but go ahead.

John Gajdecki:
Metaphorically it is, actually. There’s a theory that the changes in– The Roman Catholic Church created an environment where it became a merit-based society. Before the changes of the 1100s and 1300s, from the fall of the Roman Empire until the end of the Dark Ages, they lived in a world where things seemed to be more family-driven. If you were a blacksmith, your son would be a blacksmith. Your daughter probably wouldn’t be a blacksmith. And then these changes happened where, if you wanted to be the blacksmith, you would have to be an apprentice blacksmith, and then you would become a journeyman blacksmith, and you would go away and you would work in other places. And then when you became a master and you came back, or stayed wherever you were, it was all based on merit. That was a spectacularly important change, and that was one of the things that …

David Read:
It wasn’t based on who you were born to.

John Gajdecki:
… led to the success of the Industrial Revolution, which has metaphorically, I guess, a lot to do with AI, because it’s gonna be the people who are really good at it. It’s not necessarily gonna be my daughter. Except she has a really good eye, and I suspect she’s gonna be really good at it. But that’s not a hereditary thing. That’s because we worked on it for her whole life, and let’s see how it goes. But for the film industry, the people who embrace this technology, there will be significant rewards. And I don’t think it’s just the film industry. It’s sort of society at large. But there’s definitely gonna be downsides to it, as we’re all aware. And hopefully society at large can come up with a way of guiding us through this.

David Read:
I see all of the data coming out about the neural activity of people who are using AI tools to do the work for them. And in the same studies, the neural activity of the people who are using AI as kind of a sounding board for the creative content that they’re developing. And the firing patterns are like night and day. Basically, some in one and none in the other. It’s very clear what it’s doing to us, because when you’re forced to think, you build pathways. And it’s those old sci-fi episodes of Star Trek and all these others where the society lost how to create. And then you have the other side: “I’m not using this at all. This technology be damned.” Isn’t there an argument between the two paradigms? Because I think it’s incredibly naive to think that if enough of us speak up, it will all just go away. And I know that that’s not what their intent is. Their intent is regulation. So that you can box it in. But whose regulation? Especially when you have some industries and even some governments who are gonna be using this technology and similar, and have no moral qualms whatsoever. This is not going to slow down. And I think that all of us are looking at which bronco to jump on to see if we can catch up to this moving train and keep pace with it long enough that we can grab on and get ourselves aboard.

John Gajdecki:
There’s a lot of social issues that we can talk about, and there’s a lot of social issues that are gonna be affected. I don’t think that’s the purview of this episode.

David Read:
Absolutely. Just wanted to establish what’s going on.

John Gajdecki:
But dude, it’s absolutely critical because you can’t not get on it unless that makes you feel better, and that’s only gonna last for so long. You can’t use it to exploit your fellow person. But there’s always a portion of the population that’s gonna do exactly that ’cause they can.

David Read:
No problem with it.

John Gajdecki:
So, unless there’s some incredible change, which is unlikely, this is here. It’s a tool. It’s gonna keep getting better. It’s gonna keep getting faster. It’s still driven by artists.

David Read:
That’s right.

John Gajdecki:
And I think that’s really an important thing to remember because the people who are gonna do this ultimately are the ones who are younger than me. Look, I’ve been doing this for a really long time. I’ve had a really good career. I love what I do. Some days, maybe I wanna stay in bed a little longer. But I’m doing everything I can to learn this technology and the art behind it. Not because I’m ever gonna use it, personally. Remember I talked about this. I don’t do Houdini. I don’t do Nuke. I don’t do Maya. But I need to know it in order to talk to directors and producers. And I need to know it in order to talk to artists so I can be useful, ’cause really I am the guy in the middle. But the artists have to know it. They’re gonna have to know it inside and out, and they’re gonna have to keep learning. And it’s gonna be spectacularly cool. The one thing is– Here, when we went from shooting miniatures to building things digitally– And in fact, interesting story. My company in Toronto had a whole model shop. We had this huge model shop. We built all kinds of models. We used them a lot in Stargate. The Goa’uld ship was a model in the early episode, as was the– Lots of the pyramids were–

David Read:
“Within the Serpent’s Grasp.” “The Serpent’s Lair.”

John Gajdecki:
When we went digital, it became faster to do that work. One could argue it didn’t quite look as good, but there’s two sides to looking as good. And one of them is, if you build a model and you take it outside, it’s gonna look real. But it’s hard to match camera moves. So it’s hard to track it into live action plates. If you build a CG object, it’s really easy to track it into the shots, and camera moves have become a big part of the language. So that’s an equally valid part of what makes something look real. It’s just sometimes the lighting wasn’t quite right or the contrast wasn’t right. But over years, all the artists got used to using the tools. There was an explosion in the amount of visual effects that were in projects. The content went way, way, way up for basically the same monies for the same amount of work. You’re gonna see that again. Except now, eventually when the tools get there, there will be a huge explosion in the amount of work that’s on screen, not necessarily in the amount of work that’s available to people doing it.

David Read:
Every piece of technology has to start somewhere. Just like every artist has to start somewhere. And I’ve heard a lot of people say, “You’re only doing this because you put in a prompt in AI and you’ve never done anything like it before.” An artist has to start somewhere. We’re all creative when we’re brand new. You have to be inspired by something. Have you seen the documentary on Disney+ called Light & Magic?

John Gajdecki:
Yeah.

David Read:
One of the single best documentaries I’ve ever seen. Anyone listening, if you are remotely interested in this, please drop everything and go and watch it. I think it’s six, seven episodes long and there is this section when they move from miniatures and models and stop-motion and go-motion and the Dykstraflex and everything into digital. And there were people at ILM who just walked away. They weren’t interested in the transformation. And then there were others like John Knoll who came in and said, “OK, what are we going to do with this? I’m on board.”

John Gajdecki:
“We gotta figure this out.” I’m gonna interrupt for one second. Lately I’ve been getting a lot of emails or contacts from people who used to work with me thanking me for helping them get into the business. And one guy wrote to me a couple of days ago and he said, “I used to work in your model shop. And I know that you pushed the guys to learn 3D, but most of the model shop guys resisted.” One of the guys who ran the shop, he’s a visual effects supervisor in Toronto now. Another guy who ran the shop, my brother actually, left the business entirely, went to Ottawa. Now he works at the War Museum and he builds models for the War Museum and he builds displays for the museum of history. He recently did a display on guitars. So I got to see, and I wasn’t allowed to touch, the guitars from Rush, and Bryan Adams. Worked with him a couple days ago. Got to see his guitar. Of course, you’re not allowed to touch it. This one guy in the model shop, he wrote to me and he said, “Thank you, thank you, thank you for getting me started.” But more than that, he said he actually did go into 3D animation. And that’s been his career for his whole life. And really, the model shop was his way to get into the business, but 3D animation worked better for him. It worked better for the way that his mind worked. And he got to contribute more, and that was his career for the last 30 years. And I think we’re gonna see a lot of that now. There’s some people who, like you said, don’t want anything to do with it, and that’s great. There’s all kinds of different ways to be unemployed. There’s a lot of people who are interested, but may not have the skills. I don’t know why I said it like that, sort of like William Shatner. But then there’s people, it’s gonna work for them. It’s just gonna work, and they’re gonna have tremendous careers.

David Read:
You cannot– I’m gonna push back a little bit about what you said with your daughter. You cannot teach instinct. One of my favorite sequences from all film is Mr. Holland’s Opus. And you’ve got Richard Dreyfuss trying to teach– I forget his name. He was Rhodey, the first—

John Gajdecki:
I don’t know.

David Read:
In Iron Man. I forget his name. They fired him because they replaced him with someone else. “Find the beat, Mr. Russ,” is all I can remember. And he couldn’t beat the drum until he found it. He had to work at it and work at it and work at it in order to find the beat. You cannot teach instinct. You can only find it yourself and hope that you have it within yourself. I think that it’s certainly– It’s no coincidence that my godson is just as amazing at pounding a snare drum as his dad is. There’s not a coincidence there. You can learn to be proficient at something and may be just as good as someone else. He’s done it in eight months. So, there is that component. The other thing that I think really interesting is I keep on going back to a quote from Gabe Newell, who runs Valve. And he’s specifically talking about utilities that help you generate code. And one of the things that he focuses on, and I’m interested in your thoughts on this, is the fact that he believes that for a time there will be people who have never touched code before who will be ahead of people who have been coding for years, because their instincts and their intuition are not buttressed by, “You have to do it this way.” Because they can spit something at a machine, and the machine will spit it back out in the way that they want it, not knowing the subtext of anything that’s going on, because it’s not intelligent. It’s just a vending machine for content. So, new breakthroughs and new ways of doing things can manifest with people who have the innate instinct and feeling for something, but aren’t built in with that latent structure from their instruction that says, “No, you can’t because this.” In many cases, there’s a very good reason for that. But those things, once you lock into your phonemes, as a speaker for whatever language you’re in, you’re locked in. You can’t even physically manifest certain phonemes out of your language, because your brain is already developed that way. 
And it’s the same way for learning. And people who are approaching this technology who have never done it before, but have an instinct for story, will have a greater threshold of opportunity for a little while and will have a chance to really get in under that closing door.

John Gajdecki:
We should probably start showing people pictures soon, or they’re gonna rebel.

David Read:
Guess so.

John Gajdecki:
But there was an interesting saying a long time ago, and that was, “Hollywood scripts were not written by people who knew how to use word processors.”

David Read:
That’s it.

John Gajdecki:
As the technology changed, they still needed the artists. The artists had to learn the new tools. That’s all it is. You’re still an artist inside, and that’s gonna be your special sauce. That’s gonna be what sets you apart. And I think with the AI revolution, which it’s not really an evolution, the people who have it are still gonna have it. They just have to learn new tools.

David Read:
And be willing to learn the new tools.

John Gajdecki:
That’s the rub, isn’t it? And that’s fine if you don’t. There’s still lots of stuff to do.

David Read:
That’s it. 100%. Those of you who are live with us right now in the YouTube chat, my moderators are standing by. You can submit any questions. Terrence Howard (thank you, Raj) was the name of the actor. The chat is wide open for you to submit questions to John, anything you want to ask him about the industry or his work or thoughts regarding this conversation. Before we get into the stuff that we have, you provided me with a beautiful clip that you recently discovered from “Rising.”

John Gajdecki:
Yes.

David Read:
So what are we about to see?

John Gajdecki:
Yes. I’ve been cutting some new demo reels. So I’ve got hundreds and hundreds of hard drives, and I’ve been going through them, and I found an old previs of the space battle from “Rising,” so the previs that we did and then the final cut. And we have both of them here.

David Read:
I can’t show the final cut because it got flagged last time.

John Gajdecki:
Nobody wants to see the final cut. We all know what it looks like.

David Read:
We know the sequence.

John Gajdecki:
But we can look at the previs. Although, I bet you if we put the final cut together right beside it, it would probably work. Why don’t you–

David Read:
Absolutely. I wish I had that ability. But let’s see here …

John Gajdecki:
I can do it in about two minutes.

David Read:
… what we’ve got. This is so cool. So this was once the shots were already established and figured out?

John Gajdecki:
No. This is– So, what we did, we had a small in-house team that we set up, so it’s fun. Stargate, Outer Limits, Poltergeist, all of those shows were done at the Bridge Studios in Vancouver. And there was this building way in the back, and it became the effects building. So, I was the first one there. I was doing the effects for The Outer Limits initially, and they gave us this building. It was like, “You work in here.” And it’s a shame it’s so cold in the winter. But when we did “Rising,” we got one of the rooms in the back and I put a little team together, and there was one guy that I worked with, I’m trying to remember his name. I can see him clearly ’cause my brain works in pictures. But he brought all of these World War II books and he brought all of this dogfighting history to the scene, and he sat down and he worked this whole thing out. Damn, I can’t remember his name yet. And it was pretty impressive. And I think what was really the most impressive is he did all of this work, he designed the sequence really based on what was in the script. We would have conversations, but his job is more interesting if I stay out of it. And my job is more successful if I let him do a really good job, ’cause really one of my theories in the movie business is my job is to make the person directly above me look good. So, they work again and they hire me. So his job is to make me look good, and the best way to do that is to stay out of his hair. But he put this whole scene together on his own. We cut it together and looked at it going, “Well, this is really good.” And we took it to production, we showed it to Brad, and I don’t know if there were any changes. If we could cut this together one day and look at it in the future, it’s almost a one-for-one. And he just nailed it. Absolutely nailed it. The guy was a great animator. He could see the scenes in his head and–

David Read:
This wasn’t Dan Mayer, was it?

John Gajdecki:
No, it wasn’t. But it could’ve been. It was really interesting how quickly it came together and how successful we think that it was.

David Read:
When something feels right, you just know it. And it’s not something– Now, to borrow from Ellie Sattler in Jurassic Park, there’s some things that you don’t think through, you just feel them.

John Gajdecki:
No, it’s true, and sometimes …

David Read:
Certain things feel right, because you’ll–

John Gajdecki:
… you’re on a show, and we’re all going, “This is good.” Like, “What we’re doing, this is good.” And sometimes you’re on a show going, “I hope nobody sees my name at the end.”

David Read:
“I’m not crazy about this.”

John Gajdecki:
That doesn’t happen very often. Normally there’s so many talented people, and it’s not really talent so much as enthusiastic. The movie business rewards success, of course. But you get there through enthusiasm often. People who are really talented but are sort of dickheads, they don’t always survive in the movie business.

David Read:
You have to be able to communicate with people.

John Gajdecki:
It’s a team sport.

David Read:
That’s right, which means that politics, for better or for worse, is a part of it. One of the things that really got to me when I was going through this content was something that I saw online, and I’m gonna show an example of it here. This is from the YouTube account ghost3d, at the lower right there. And this is using Houdini with VEX, V-E-X, procedural animation. And it is AI, but it’s also an artist using the tools. You have the software which is examining the shot. And the tendrils are being generated by the software as it’s recognizing the collision in the environment. Hand animating this would be impossible.

John Gajdecki:
It would be–

David Read:
You can’t do it. This is the same technique that is used in Venom for the tendrils of those space slime creatures. I don’t know what they’re called, the Venom people, whatever you wanna call them. You can’t hand animate this, and it looks absolutely real. I was blown away with what I was watching. I had a very hard time believing that what I was seeing wasn’t actually existing in that space. But you can use this thing to create all kinds of stuff now alongside the artist. So, the computer is actually doing a lot of the work, as it always did, all the way back to Photoshop. But the artist, the user, the end user is selecting the tools to make it work and deciding what looks right. And what we’re about to see here is something that surprised me: how much your team was using AI for your day-to-day visual effects work. I don’t know why it surprised me, but it did. I guess I’ve been consumed by all the dogma. And what we’re gonna see from House of David in particular here in just a couple of minutes, a ton of it was augmented AI shots or some AI shots entirely. So, should we start off with your demo reel first before we get into individual shots?

John Gajdecki:
Yeah, you were thinking we should play the Emmy submission reel.

David Read:
Let’s go ahead and do that first. So, when was this submitted? Was that–

John Gajdecki:
It was for last year’s Emmys. We weren’t nominated. I’m always looking for a new way to lose at the Emmys, and there’s certainly no shortage. These three documents behind me, those are all Emmy nominations. So, I’ve got three Emmy nominations and I got some other of those things, too. When you submit to the Emmys, you have to prepare a document. And it takes a lot of work because you have eight minutes. It’s gotta be eight minutes or shorter. It can’t be 8:01. They’re very, very particular. And that’s the game. Four minutes has to be the material from the episode, not altered in any way. You can’t change the sound, you can’t change the effects to make them better. You can’t switch the order of shots. You can cut out between scenes to move things along. That’s really what you can do. And then the second four-minute block is: you can show behind the scenes. You can show befores and afters in any way that you want. And you submit a PDF as well. We’re gonna look at the four-minute behind the scenes, and it’ll include my narration. So, I’ll just stop talking for a bit.

David Read:
We’ll be right back with everyone. Check this thing out.

John Gajdecki [clip]:
The first tent city shot was filmed from a cable cam. We shot multiple passes with the extras and time-warped and grid-warped everything to get them to fit in. Then, of course, we extended the shot with flags and tents and smoke and green screen and CG people. The second shot was a really big version of the same thing. To this drone plate, we added CG tents and fire, people and also green-screen people on the parapets. Gath was created both as a CG build and an AI generation. That was the first version. This is the second. If you look around, you can see lots of the usual AI artifacts, which were cleaned up through paint and then the shot went into comp for extra people and flags. The problem we bumped into was the AI team could iterate so quickly. We would be working on a 3D model for a while, we’d get their new generation and suddenly square windows were round and details had changed. Goliath’s origin story fell to the AI team. We were gonna try and shoot something live. Then the AI team showed us some tests that looked really great. It was gonna be a struggle. We knew that. One of the things we did was we shot actors on green screen so that we could train the AI on the real people so that everybody who was in the show was on production. The lion plates were shot with a reference. We animated over that, but editorial could use it for timing. Then the lion itself was built in Houdini. We spent a lot of prep determining how we were gonna shoot Goliath. So, these tests, we were shooting handheld, moving in, moving sideways. How much can we move? How fast can we move before we break the illusion? Turned out, we were in pretty good shape. We didn’t shoot any motion control at all. These tests were to determine size. We liked 10 feet, which is a 1.65 scale, which the props people used to build everything that we needed. Frame rate tests. We thought we were gonna over-crank, but we ended up shooting at 24 because that gave us the athletic look that we were after. 
Eye lines. We had a giant on a stick. We trained the extras on the ball and then we’d shoot the pass with Goliath. This was Goliath’s big reveal. We shot the first pass with the giant on a stick, giving all the extras their eye lines, and then we shot Goliath on a green screen, which we set up over on the side. We comped it together with a ton of roto, but it really made it look realistic with him tied into all the foreground people like that. House of David did not shy away from roto. We’re always trying to sell Goliath’s size. We were done shooting at the time, so we took a few of our CG soldiers and detailed them up, got them blowing in the wind. Looked really great. And that was really all it took. We did motion capture with a suit and basically did version after version until the showrunners were happy with the performances. We shot parts of the training fight on green because we needed to shrink and enlarge Goliath and the soldiers. Most of the big environment shots extended the landscapes with DMPs and Houdini armies, but the majority of the shots were done using green-screen armies. We shot on a gigantic green screen that we put up on shipping containers. Shot for days, got an immense amount of material. We used them in a lot of shots. But the end of the season was all about big wide shots, CG armies and an endless amount of roto.

John Gajdecki:
They all have my contact info.

David Read:
John, that was amazing. That was really cool.

John Gajdecki:
I’m glad you liked it.

David Read:
How long did that project take you?

John Gajdecki:
It’s funny. It went by in a flash, but it also felt like forever at the same time. I’m not sure how those things both worked out. They called me in July, I think, of 2023 actually, and said, “Hey, would you like to do this show? It’ll start in a couple of months.” I’m like, “Yeah, it sounds really cool.” It’s big and it’s epic. And it’s hard, and I like hard. I knew doing Goliath would be difficult from a camera point of view, and I wanted to push how far we could shoot or how far we could push the camera trickery to shoot regular-sized people and make them look big. But I knew also–

David Read:
A long way from Peter Jackson.

John Gajdecki:
Yeah, but not that far, ’cause his name came up a lot when we were filming, and we can talk about that a little bit in a second. But the other thing that came up right from the beginning is they wanted to use a lot of AI.

David Read:
They wanted to?

John Gajdecki:
They wanted, meaning the showrunner, a guy named Jon Erwin, who we can talk about a bit, because he actually had a lot to do with pushing this whole process forward. And they’re like, “We want to use a lot of AI. We think it’s going to be a big thing.” Now in 2003, the AI thing hadn’t started. We knew that it was out there, but we had no idea how you would go about it. Compared to the way the tools are now, they were much harder to find. You needed to know people, and that’s what was going on. Before the show started, there was a commitment on the part of production to use as much AI as they could. And I’m really glad that that was the case.

David Read:
You said 2003. Did you mean 2023?

John Gajdecki:
I did mean 2023.

David Read:
OK because I’m thinking in 2003 they had Massive at that point for crowds and things like that.

John Gajdecki:
That’s right. Thanks for catching that.

David Read:
Absolutely. Let’s go ahead and look at it again, but let’s hear you this time.

John Gajdecki:
OK, so these shots were done with Houdini and Maya. Pretty well Maya for the tents and everything, and Houdini for the smoke. That shot took one artist probably four weeks, five weeks. This shot took a couple of artists two months. So, a lot of work to make them happen, to get them looking real. The tracking was difficult.

David Read:
You can tell me to pause it anytime.

John Gajdecki:
Let’s pause and go back.

David Read:
You know what. David, how are you going to pause? There it is.

John Gajdecki:
So one of the things about House of David is there was this really strong belief in the transformative power and the disruptive power of consumer technology. So we filmed it all on Sony FX3s. If we were doing it this year, who knows, they might use the new Nikon ZR, but at the time the Sony FX3 was the way to go. It was basically a high-end prosumer camera. That first shot we were looking at, it’s a cable cam shot. This one here. So we had a cable cam rigged between cranes. It would travel through the air, and what we did–I’m pointing at the screen as if you could see me point–what we did is we had 100 extras that day, and we moved them from section to section to section as we ran the cable cam through, and then we painted them together. Now, it was a consumer tool, so we had to retime it, and stretch it, and fight with it. But it was a tool they saw at a trade show; they literally bought it off the trade show floor and brought it out to Greece. We shot the whole show in Greece in 2024, from around May to August. So, lots of work on these shots, and it was pretty cool. The next one we see, the nighttime shot, is actually one of my favorites. It’s Chlemoutsi, I think, was the name of the castle. We had permission to go. We shot the plate from a drone, then all the tents, all the fire, that was added. We shot huge green-screen libraries of people.

David Read:
Is any of this shot real?

John Gajdecki:
No. This is the first of the AI shots.

David Read:
Oh my God.

John Gajdecki:
Now, this is the early, early, early version, before we made it look nice. And there’s an interesting story behind it. Two things: politically, legally, on House of David we were really the first show to use this technology in a big way. At first, the studio was like, “No, we can’t use AI. We don’t know what it’s been trained on.” All kinds of legal uncertainty, so we weren’t allowed to use it. At some point, about a third of the way through, maybe halfway through the production–

David Read:
The winds changed.

John Gajdecki:
The winds changed a little bit, and they said, “OK, you can use AI to do concept work.” So we were doing an establishing shot of Jerusalem; might have been Bethlehem. And we were working on it; we had been doing this, that, and the other thing. But we started using AI to generate ideas. And it’s like, “Oh my God, these are so good. These are better than what we were coming up with.” And it was embarrassing, but it can generate ideas so quickly that we could look at them and go, “You know what? This is really good. We’re going to take the city walls from this one, we’re going to take the layout of the city from this one, we’re going to take the smoke from that one.” We couldn’t use what the AI generated; it was just too low in resolution, but also legally suspect; we weren’t allowed to use it. We then put it all together and built it in Houdini and Maya.

David Read:
So you used it as a reference to rebuild yourself.

John Gajdecki:
Correct. So that’s how it started. A month or so later, they came up and said, “Maybe we could use some AI, but any AI shot, you have to do a duplicate traditional visual effects shot.” And if you were to open the border village folder, that’s a good example of it. We can come back to this later, or we can finish up with this reel and go to the border village later. OK. So, this is the rough cut. You can see, ’cause it has all the timecode and the names and descriptions. That shot that you’re looking at now, let’s just scroll back and stop at it.

David Read:
Let’s go back.

John Gajdecki:
That shot was fun. It said, “Remove the swords, keep the arms.” It was one of those murderously difficult paint shots because of all the camera movement and the smoke. Anyway, this shot here was one of the first AI shots that they were working on, and that’s what the AI team was able to put together. There was a visual effects supervisor, Sean Devereaux, and he was in charge of the AI team and did a wonderful job spearheading that whole movement. This was the AI shot, and there were about 30 shots at that time that we had to duplicate through traditional VFX shots. If you can play one of the next AI border village shots, the name of it is HDAV_101.

David Read:
I think that’s this one here.

John Gajdecki:
No.

David Read:
It is not.

John Gajdecki:
No. Sorry, everybody. We’ve put things in– It’s all a story.

David Read:
Let me pull this up here. HD-101, the next one–

John Gajdecki:
009.

David Read:
009

John Gajdecki:
Yep.

David Read:
OK, let me pull that here. OK, let me switch modes.

John Gajdecki:
This is the traditional visual effect shot. If you look at that, of course you can’t see them together.

David Read:
Just a sec.

John Gajdecki:
Maybe you can.

David Read:
Not side by side, but… So, this is the AI.

John Gajdecki:
It’s a good-looking shot.

David Read:
It is.

John Gajdecki:
It is. That’s the one that we came up with, and that’s the one that ultimately went into the show. In this case, the traditional visual effects went into the show. But that was a rarity. Of the 30 or so AI shots initially done, probably 10 to 12 of the traditional visual effect shots went in and more like 18, 20 of the AI shots were used in the show when Amazon said, “OK, go ahead and use them.” And once Amazon said, “Go ahead and use these,” then the floodgates were open. And from a studio point of view, ’cause I was talking to Netflix about another project that hopes to use a lot of AI, Amazon is a little more comfortable, perhaps? It’s a little easier to use AI shots on Amazon shows than perhaps at other studios, and I think House of David has a lot to do with that.

David Read:
So much of this is, “What does everyone else think? I think it’s good, but will this get us canceled if we use it?”

John Gajdecki:
The lawyers were not idle.

David Read:
I’m sure they weren’t.

John Gajdecki:
They had to make really serious calls about what they could get away with and what they could defend, and they decided that it was OK, for whatever reason.

David Read:
Has there been any legal blowback from this technology?

John Gajdecki:
I do not know. Now, to be honest, I only did Season One. We were still posting Season One when Season Two started, and they went with–

David Read:
Different vendor?

John Gajdecki:
They went with a different supervisor, and then they went to another supervisor, and then they went to another supervisor. So I don’t really know. But I’m sure it was a lot of fun for everyone involved. Why don’t we go back to the…

David Read:
You wanna go to 102 Gath?

John Gajdecki:
No.

David Read:
Or you wanna go to the demo reel?

John Gajdecki:
Let’s go to the demo reel. I don’t know if you can hear it, but my dog’s best friend just arrived, and they’re running around the apartment destroying it. OK, scroll back a little, ’cause this is really interesting. This is one of the things that we learned. Excellent. Back to the slightly brighter, wider shot. Yep. Nope. Yep. Perfect. Love it. Gath was a fortress, one of the five Philistine cities in the Middle East. Nobody knows exactly where it is; it’s assumed that it’s in the desert somewhere. This was one of the first render generations that we were given. If you look at the detail, there’s sand, like snow, being blown up into the corners. There are people walking, and they pass through each other.

David Read:
In the shot?

John Gajdecki:
All the usual stuff. People walking through walls, people appearing–

David Read:
Collision

John Gajdecki:
For any visual effects people in the crowd, this is the pixel equivalent of a large language model. It’s not designing something like a 3D model and rendering it. It’s just showing us pixels that are likely to sit beside other pixels. So, what I’m getting at there is, when we film a plate, we can track the plate, we can get the geometry out of the plate, and we can lay objects on top. But in AI, there’s no underlying geometry. And when you put it into a 3D tracking program, it can’t do it, because the geometry that you see in the scene is not rigorous enough, let’s say. So, when we added new people, when we changed the environment, when we changed things, you’d have to do it little section by section.

David Read:
‘Cause it’s not the same technology.

John Gajdecki:
There’s a lot of technology that has to adapt to the AI workflow. But if you look at this, there’s two things I’m going to point out. You’ve got some battlements in the foreground. You’ve got basically square windows, and on the corner of all the buildings, there’s no ornamentation. If you roll to the next shot… This is really the next version that we were given. Now all the windows are arches, we’re on water, and there’s these weird asparagus things on the corners of everything, and all the battlements were gone. It might have been two months between when we got the first version and when we got this version.

David Read:
This is the same city?

John Gajdecki:
It’s the same shot, they just did a new one.

David Read:
Wow, just a different version.

John Gajdecki:
It’s a completely different version, but it’s the same kind of angle. Aside from the water, the sound effects people’s work wouldn’t be wasted. And it’s certainly a better shot. And that’s what we do in visual effects. There’s this idea that we have. It’s called the CBB, or “could be better.” And the way it works is, you work on every shot, and you get them all up to good, and you flag them as CBBs. Because sometimes, some shots are just resisting. Remember earlier on, I told you we had a version 173 on something. That shot was just resisting. We couldn’t make it… It just …

David Read:
Wouldn’t wanna go.

John Gajdecki:
… wouldn’t come together. It wouldn’t wanna go. All the other shots, we can’t keep working on those. From a budgetary point of view, from a scheduling point of view, you have to get everything good. You don’t wanna just say, “It’s good enough.”

David Read:
You can’t get tunnel vision on a shot.

John Gajdecki:
You get everything up to that level, and then you try and make everything better. There was a real push on House of David to push beyond the CBBs. That’s a good thing. But it was an enormous amount of work to get things from CBB level up to where everyone wanted them. So, this is what I wanted to show you next, because AI gave us the big honkin’ wide shot. But Maya gave us the 40 shots of the castle for the scenes that take place at the doors. And what we discovered as part of the AI workflow is, because they can iterate so quickly, they gave us the one castle. We built the whole model to match. It was several people for several weeks. And then very close to delivery, they gave us a new castle shot, and nothing we did matched. So, in the new one, the doors were all archways. So, you can see here, we went through, and we made as many doors as we could arches so that it would feel like it’s the same castle. We had battlements on everything. We kept those. But we made those weird asparagus things on the corner of some of the buildings, and that’s really as far as we could take it in order to be able to rerender everything and comp it and deliver it in time for our deadlines. So the AI team could generate stuff really, really quickly, faster than the visual effects team could. That’s really one of the takeaways from that.

David Read:
Makes a lot of sense.

John Gajdecki:
But it wasn’t expected. You didn’t think about it. It’s like, “Oh, there’s a new version. That’s great. Thanks, everyone. I’m gonna go and cry into our vodkas.”

David Read:
And develop for a fraction of the cost, in terms of time and resources.

John Gajdecki:
For sure. It’s very, very cool. So, this scene here, this is the origin story for Goliath. Basically, the script called for cave paintings that would come to life as Goliath’s mother narrated what was happening. We were working on that for a long time. Hey, we’re talking about the things that didn’t come together. Most of the shots went together, and it was great. But it’s always fun to hear about the train wrecks. So it wasn’t really coming together, and we knew that there was work with the AI team to create this alternate approach, and this is ultimately what we went with. And I’ve gotta say, it was way better than what the visual effects team was doing. And we were delighted that they did it, because we would’ve got bogged down and it would’ve cost us a lot of quality, I think, on the final episode. This is so cool. But everything’s AI. So, she’s AI.

David Read:
But not the actors? She’s AI?

John Gajdecki:
No, no. Yeah. Dude, everything’s AI. Now, there’s a few close-ups of hands and things like that. And those were shot green screen. And probably the rain and some smoke and things. But these were generated, and this was more than a year ago. Now what I want you to take–

David Read:
This person doesn’t exist?

John Gajdecki:
No. And that’s really important. Can you stop on the green screen of her?

David Read:
Yes, I can.

John Gajdecki:
So, the process… So, your prompts, you’re saying, “OK. Give us this giant, muscular guy with wings,” and I would imagine, at the time, the AI kept pumping out somebody who looked suspiciously like Jason Momoa. And she probably looked suspiciously like Demi Moore or whoever, ’cause that kind of–

David Read:
Natalie Portman.

John Gajdecki:
So, what happened… The process was, you get it started, and then you shoot green-screen people from matching angles that you train on. You say, basically, “This is the face we want you to use.” While that wasn’t completely successful, in that it doesn’t look exactly like these people, what it does is create a direct line from a human being that you shot, that you have a contract for, where money changed hands, to the AI shots. And it was really important for us, obviously, and it was really important for legal to say, “These are the people represented in the scenes.” The AI is not hallucinating you from your pictures on the internet.

David Read:
That’s right. You gave it the raw material to start crunching. A lot of people are like, “Why don’t we just pop this thing open and see what the AI is thinking?” So many people are under that false impression. It’s a box. You feed it the information into the box and something pops out. But you can’t open it up to see what’s inside of it, because it’s doing its own thing in there. It can’t explain why it thinks the way it does. We just delete the ones that don’t give us what we want and keep the ones that do and iterate the next ones. And that’s amazing and terrifying at the same time.

John Gajdecki:
Sort of like the eugenics for pictures. But it’s also like people. You can’t crack someone’s head open and look at what’s inside. There’s no correlation between what’s in their head and what they say.

David Read:
It’s electricity and synapses.

John Gajdecki:
So, that was really important. Now, let’s scroll backwards a little bit, because this is fun.

David Read:
Just a second. Let me scroll.

John Gajdecki:
If we look at the actress, you can see that she has a dress that has no straps. But if you go back a little bit further, sometimes there was a dress with straps. Her arms, the way AI rendered it, it’s not realistic. What I’m getting at here is, there’s always paint work. So, even though we’re using AI to generate the shot–

David Read:
Touch-ups.

John Gajdecki:
Always touch-ups that happen, and for continuity, but also, could they have gotten there? Probably. But at some point, it’s quicker to send it through the traditional effects pipeline. “We’ll put a dress over her arms. We’ll get rid of the arms. We’ll get rid of the straps.”

David Read:
Frame by frame.

John Gajdecki:
As it needs to be. And that’s frame-by-frame artwork, essentially.

David Read:
Which shot was ultimately used?

John Gajdecki:
Not that one. Probably this one.

David Read:
My, oh my. OK.

John Gajdecki:
In the final cut, her dress has continuity. We can look at that cut. It’s somewhere, but that’ll slow us down too much. So, the takeaway is twofold. You need to shoot elements. You need to shoot as much reference on set as you can to train the AI on your actors. And if you are generating stuff out of nothing, you still need a legal–

David Read:
You can point to your raw material and say, “Hey, here.”

John Gajdecki:
This is where it’s coming from.

David Read:
If nothing else, plausible deniability. I’m not saying that that’s the right way to think about it. But if nothing else, you could argue that.

John Gajdecki:
I think legal felt comfortable that the way the work was being done was going to be okay.

David Read:
Now, this is my favorite from the whole thing here. Fantastic. Talk about reference for the AI.

John Gajdecki:
So, we did this scene using a lion. Funny story: my daughter brought a cat home from Greece, and we called him Lioncat because of this character, and because they were the same color and possibly had similar temperaments. When we shot the plates, we had this gentleman in a furry suit. He was a dancer we hired for his ability to move, and he would perform some of the shots. We’ve got some funny shots also just outside where the camera’s filming over this guy. I don’t think we even did any effects, ’cause he was out of focus in the foreground. So, this gave us placement and lighting reference and timing. So, when editorial was working on the scene—and we didn’t do this for every shot—but editorial could cut with this scene. We would then have to paint the lion out. But it really helped us with the reality of the lighting and the timing, and it was easier on the crew when we were filming, and it was easier for editorial when they were cutting. I thought I’d show it ’cause it’s funny.

David Read:
100% it is. And there’s nothing wrong with low levels of entertainment in that sort of way, because this can get, I imagine, monotonous after a while. What I’m really seeing here, John, is you are taking the AI and you haven’t thrown the baby out with the bathwater. You are adding it to your existing tool set, at least at this stage, and using it to, at the very least, with the folks at the end of the line, A/B compare and say, “OK, which is better?” But you’re incorporating these elements into shots, and then turning the work over to people to give them more work to do, because the AI doesn’t provide exactly what you want anyway. So that, in itself, is encouraging because it generates more work.

John Gajdecki:
Correct. Now, I’m gonna jump in for a second. In truth, the lion was done in Houdini.

David Read:
OK.

John Gajdecki:
We looked into doing it in AI. It wasn’t ready. So, we ultimately had to go Houdini. A company called The Embassy did it here in Vancouver. Did a spectacularly nice job.

David Read:
Wow. Hello, Goliath.

John Gajdecki:
So, these were camera tests. No AI here, but I can discuss it briefly as part of the process. We didn’t know how big we wanted Goliath to be. We knew the bigger we’d make him, the more difficult it would be to create the shots. But we also knew the bigger we made him, the more dramatic it would be. So, there was always that trade-off. Make him bigger ’cause it’s cool, but it’s harder. And I’m a fan of harder, so these are some tests we did, and really the goal behind this test was, how much can the camera move?

David Read:
Exactly. This technique has been around for decades.

John Gajdecki:
Correct, but they always locked off. And I know that nobody locks off. And one of the things that makes a shot look real… There’s obviously the lighting and the perspective being right and the colors being correct. But one of my theories is, I have to make my visual effects look not real. I have to make my visual effects look photographically real.

David Read:
Then there is a difference.

John Gajdecki:
There is. You need to put in the same artifacts that get captured by a camera. So, that’s why we put in lens flares, and that’s why we …

David Read:
Film grain.

John Gajdecki:
… make the depth of field and film grain, for sure. But here, photographically real meant moving the camera in the same way so that our visual effect shots felt like main unit’s photography. And what I was trying to do here was break it. So, we move a little bit, we pan and tilt, and then the camera comes back, and it goes sideways, and really, we decided that there’s nothing the camera can’t do that we can’t track and make work. So, if you roll to the next one, we probably have a different size of Goliath. See there, we pushed way in, and yet our artists were able to track him in. So, here’s a couple of different sizes, and we probably chose the upper right-hand size. So, these tests were all about how big is Goliath. The test we’re looking at now, what frame rate are we gonna shoot him at? Are we gonna shoot him at 24? Are we gonna shoot him at 30? Traditionally, you’d shoot him at something like 72 or 96 frames a second. 120, probably too much. But we did a scale.
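The traditional overcranking numbers John mentions (72 or 96 frames a second) come from a classic rule of thumb in miniature and giant photography. This derivation is standard practice, not something stated in the interview:

```latex
% Gravity-driven motion: a fall from height h takes t = sqrt(2h/g),
% so scaling all lengths by a factor s scales times by sqrt(s).
% To play back at 24 fps with the apparent weight of an s-times-larger
% figure, overcrank the camera by the same factor:
\[
  t_{\text{fall}} = \sqrt{\frac{2h}{g}} \;\propto\; \sqrt{h}
  \qquad\Longrightarrow\qquad
  f_{\text{shoot}} = 24\sqrt{s}
\]
% s = 2 (a double-size Goliath) gives roughly 34 fps;
% s = 9 gives 72 fps.
```

As John goes on to explain, the production shot at 24 fps anyway, because Goliath’s dialogue had to stay in sync.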

David Read:
It’s all about scale.

John Gajdecki:
The faster the camera runs, the slower Goliath moves, and that gives him a sense of weight. But we wanted our Goliath to be an athlete. And we needed Goliath to talk, so if we were gonna shoot him even at 48 frames a second, none of the sound would sync up. So we decided, after looking at all of these tests, “We’re gonna shoot him at 24. We can slow him down if we have to.” The sound guys will be able to match. But we filmed Goliath at 24, and he was approximately double size. So what that means is, when you film Goliath, you put the camera at half height. If he was triple size, you’d put the camera at one third the height. But you also put the camera half as far away. So any linear value, you scale by the inverse of his size. But any radial value, you keep the same. So if we were 25 degrees to the side, we still shoot at 25 degrees to the side, only we’re half as far away and the camera’s half the height. And when you do that, it just works.
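The half-height, half-distance arithmetic John walks through can be sketched as a tiny calculation. The function name and units here are illustrative, not from any production tool:

```python
# Forced-perspective camera math, as John describes it:
# linear values (camera height, camera distance) scale by the
# inverse of the giant's size factor; radial values (angles) don't.

def scale_camera_for_giant(height_m, distance_m, angle_deg, size_factor):
    """Camera setup for shooting a normal-sized actor so he reads
    as `size_factor` times larger on screen."""
    inv = 1.0 / size_factor
    return {
        "height_m": height_m * inv,      # linear: scale down
        "distance_m": distance_m * inv,  # linear: scale down
        "angle_deg": angle_deg,          # radial: unchanged
    }

# A double-size Goliath: a camera planned at 1.6 m high, 8 m away,
# 25 degrees to the side becomes 0.8 m high, 4 m away, still 25 degrees.
print(scale_camera_for_giant(1.6, 8.0, 25.0, 2.0))
```

Being “relentlessly rigorous,” as he puts it, just means applying these same two rules to every single setup.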

David Read:
It’s multiplication and division.

John Gajdecki:
There’s not much to it, except being really rigorous, relentlessly rigorous. Don’t practice your alliteration on me. If you stick to the rules, you do it all the time, it’s gonna work. So, here, this shot, we shot Goliath on green where he was half the height by moving the camera down. Then if you scroll back a little bit, we walked a gentleman through the shot using what we call the giant on a stick. It’s just a pole that, if the shot had continued, tilts up and you can see his head.

David Read:
His head.

John Gajdecki:
There’s basically a toy soccer ball on the end. And that’s to give everybody their eye lines. Now, he’s wearing a blue thing, not ’cause it’s a blue screen, but actually ’cause it was raining that day. No one really carries raincoats in Greece. But they had these kind of blue garbage bag things that just came out of nowhere.

David Read:
It’s a poncho.

John Gajdecki:
Poncho. Good use of the word poncho.

David Read:
That’s what it is.

John Gajdecki:
It is. Here is the final shot. All right, I guess you can scroll forward a bit. Very successful. And one of the things we did is we sat on the shot for a really long time.

David Read:
Why?

John Gajdecki:
Because the longer something’s on the screen, people feel it must be real.

David Read:
I repeat my name long enough to the point where it no longer sounds real. Are you saying that that’s not correlative in this case?

John Gajdecki:
No, I don’t know where that came from.

David Read:
OK. I’m probably just foolish.

John Gajdecki:
Good on you. No, if a shot goes on and on and on and on and on, people stop looking for the trick. They stop looking pretty quickly. Philosophically, we tried to sell Goliath right away, right off the top in really convincing shots. These shots turned out great. I’m really happy with these shots. There were no restrictions on the camera. There were no restrictions on the director. “You guys shoot what you need and we’re gonna make it work.”

David Read:
At the end of the day, he got exactly what he wanted.

John Gajdecki:
I hope so.

David Read:
Probably for a fraction of the cost of what it would’ve taken to do it a decade ago.

John Gajdecki:
Absolutely.

David Read:
Minus inflation.

John Gajdecki:
This shot here, this is fun. Again, early shot of Goliath. We had to sell– In the upper left, that’s what we shot on set. We went through and we pulled all the big rocks out. We went through and we pulled all the big grass out and things like that. And he’s pulling a cart and you can’t really see it here, I guess ’cause we don’t see the wheels. There’s a guy behind the cart pushing it. So the cart actually has feet like the Flintstone car. So, of course, we always had to paint that out. But the scale wasn’t working. So what we did is we created these two guys in Houdini. So they’re not real at all. And we had a Rokoko suit, so that was a little motion capture suit that our artists could wear. That’s the model that we built. And we made sure that the wind would blow their kilts, for lack of a better word, and their hair. It’s very windy in Greece. Then we created this little scene between the two of them and Goliath. And of course we did the shadows. And the shot took a couple of days, but not as long as you’d think, and it really– It’s the first time we saw him in this episode, and it really set the scale.

David Read:
That’s just extraordinary. They don’t even exist.

John Gajdecki:
It was hot. Then here’s some other examples. Sometimes we would use green screen to sell Goliath’s height, which became massive paint jobs, because you have to paint everything out and then scale it and put it back together. So, a lot of paintwork went into these shots, even if they were half a second long. Goliath, the actor, funny guy by the way, just a really good guy. He’s six foot four, so he’s the same height as me. And we would always try and cast the shortest people to act against him. Now, in the story, for the senior roles, you had to pick the best actor, so we knew we were gonna have to deal with that. But in this scene here, we found some gentlemen who were shorter. Shorter people tend to be proportioned better, and they look better. So, I was really happy with the way that it went. Except there was this one shot where he’s walking out past some guards, and somehow they cast the biggest guys they could find. It’s like, “Why? Why would you do this?” We could have found somebody a foot shorter. Probably lots of the extras came from the military. You want that kind of training. So, here’s a bunch of shots. If you could scroll back to that green screen– Or there. We basically shot 100 extras for several days on green screen from nine cameras. And that allowed us to build these armies up without using AI, without using 3D. Just quickly throwing green-screen people in.

David Read:
And that’s what it makes me think of: Robert Zemeckis did that in Forrest Gump, at the reflecting pool in Washington, DC.

John Gajdecki:
That’s right. In Forrest.

David Read:
He locked off his shot and moved the people from section to section. This is a little more complicated, or less, depending on how you look at it.

John Gajdecki:
Some of these shots are locked off ’cause that’s how they shot it. But there was never a requirement to lock off. And I like that the camera’s always moving a little bit. It makes the shots look very real. It’s not something that you see so much as you feel.

David Read:
That’s it.

John Gajdecki:
That’s what it was here. Lots of armies. There was a big Goliath. Here’s CG people in the background. Those guys were all created in Houdini. One of the things that we found at the time that we did House of David, and even now using the AI tools that are available: you can load plates and do the work to the plates, but the resolution is still low. The color depth isn’t what it needs to be. AI mostly generates 8-bit. There’s a few models that generate at a higher color depth, but not many and they’re not necessarily the best-looking ones yet. So, we couldn’t easily generate armies like this at the time. That’s gonna change, but it’s not there yet. There’s an AI fortress.
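The 8-bit limitation John mentions is easy to quantify. These numbers are general facts about bit depth, not specifics of the show’s pipeline:

```python
# Distinct levels per channel at common bit depths. 8-bit AI output
# has far coarser gradations than the 10-bit or 16-bit plates a
# traditional VFX pipeline grades and comps in, which is why the
# color depth "isn't what it needs to be" once you push a shot.
for bits in (8, 10, 16):
    print(f"{bits}-bit: {2 ** bits} levels per channel")
# 8-bit: 256 levels per channel
# 10-bit: 1024 levels per channel
# 16-bit: 65536 levels per channel
```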

David Read:
Want me to share your screen? Or what?

John Gajdecki:
Probably could.

David Read:
OK. Give me a second everybody. Let me authorize. OK.

John Gajdecki:
I’m going, OK.

David Read:
Just a second here. OK. Go ahead.

John Gajdecki:
I’m gonna turn down the volume. There we go.

David Read:
You need to share it though.

John Gajdecki:
I’m gonna share my screen.

David Read:
OK. Perfect.

John Gajdecki:
Clicking screen one. Sharing. Moved over there.

David Read:
Awesome. Got it.

John Gajdecki:
There we go. So, this is similar to the other castle we saw, only now you can see there’s battlements everywhere, which is what really triggered– This is probably the first one that we based our work on. So, this is an early AI test. I’m gonna push play, see if it plays. So, what would’ve happened is this is what we’d get from the AI team, or this is what would go to editorial. They would cut this into the show, as you can see here. It’s a nice placeholder. Gonna close this, go to the next version. Let’s have a look at this. That’s the final.

David Read:
Much more dynamic shot.

John Gajdecki:
Much more dynamic shot. The AI guys kept riffing on the shot. One thing I did want to point out is that the showrunner, a guy named Jon Erwin, actually did a lot of the prompting. He was unbelievably good at this. He did have an effects background. I think a lot of the success of the shots was because of his ability and his relentless pursuit of AI as a tool for the show. I’m gonna go to the next one to see if it’s a previous version. This is a previous version. This would’ve been the raw AI that we got. And there’s a few things to note. One of them is there’s really no people. There’s very few people. This is probably after we painted the people out, because there would’ve been people walking. Again, the standard AI problems: people sort of passing through each other, people walking into walls. One thing that we noticed, and this was a surprise, is that this looks a little too much like the Palestinian flag. The AI was doing a fortress, and when we asked for Philistine, it put that flag on.

David Read:
It’s pulling data from Middle East.

John Gajdecki:
Correct.

David Read:
Whatever bucket that’s in, in its mind.

John Gajdecki:
That would’ve been the wrong thing to do. The war was sort of full on at the time and not too far down the coast.

David Read:
That’s it. Exactly it. It’s from that part, that kingdom of the world.

John Gajdecki:
Correct. Of course it is. That’s what AI does. At the last minute, we had to paint out the flags, replace them with our– ‘Cause we had the flags for each of the Philistine’s cities. All of the people that you see here are 2D comps.

David Read:
They don’t need to be complicated.

John Gajdecki:
No, they’re just green-screen people that we comped in to give it some life. The flags we put in. But the shot itself started as AI and it’s really beautiful.

David Read:
That’s amazing. I love that you could take the computer’s AI output and get a chunk of the work done. And then you guys go in and you clean up what you want to clean up once you’ve decided on the shot that you want that it did, like someone storyboarding.

John Gajdecki:
Actually, that’s a good metaphor. That’s a good way to put it.

David Read:
And I also think that it’s an interesting metaphor that we’re using this particular program of David and Goliath and discussing this issue. I don’t know if that’s occurred to you yet.

John Gajdecki:
No.

David Read:
It’s brilliant how that came together.

John Gajdecki:
I was gonna play this. This is something that I put together and it’s kind of fun because on screen left is the cut. This is the locked cut of the show as it came from editorial. On screen left is the final shot.

David Read:
Left is final? Where it says Property of House of David Productions.

John Gajdecki:
I have a real problem with right and left.

David Read:
That’s all right. Sorry, right is the final.

John Gajdecki:
I really suffer from dyslexia.

David Read:
No, you’re OK. The right is definitely the final.

John Gajdecki:
The right is definitely the final. This is what we shot, this is how they edit it together, and then this is what we came up with. We’ll watch it through and then I’ll step you through some of the shots because it’s kind of fun.

David Read:
I’ve got another 25 minutes or so.

John Gajdecki:
We can wrap it up whenever you need. Blah, blah, blah. This one’s interesting. And then off we go. I’m going to head back to the head and talk about a few of the shots. You can’t really see it, but we added sort of a river and we made this look more like our location when we were done. Actually, I think we let this one go.

David Read:
We can see your cursor, by the way.

John Gajdecki:
These shots, in a few of them, there were cameras and camera crews and things that we had to paint out. But really, this was just establishing. This shot is really interesting. This is not an AI shot. This was done through traditional digital effects. But if you look at what we had to cut out and then replace… If, in the Emmy Awards, there was a category for outstanding achievement in rotoscoping…

David Read:
Exactly, with all of the tendrils of hair.

John Gajdecki:
And the wind and the fur and everything. This is the best piece of roto I’ve ever seen in my life.

David Read:
Frame by frame. How long did it take someone to do that?

John Gajdecki:
I suspect it took a week or more, and possibly longer than that. But they did a spectacularly good job. Here you can see, all we did was cut the three giants out. We made them bigger.

David Read:
Made them bigger.

John Gajdecki:
The way they interacted with the tree was messy, but you work your way around.

David Read:
There you go.

John Gajdecki:
For some reason, there’s a new guy here, but one of the artists would have said, “You know what? It’s easier for me if I put this guy here.” It probably sells the height difference.

David Read:
It fills in the space.

John Gajdecki:
I bet you it’s also continuity. Let’s see what the previous shot is. I think it was continuity. There was a gentleman standing in front of them. Here they are again. We didn’t even change him ’cause nobody would see him. Nothing here, nothing here. This was gonna be a traditional visual effect scene. It was shot in a different location. It didn’t match the location. We were tracking people all over this. We had armies down there. We spent a fair bit of time, but the AI team put this together, again, much quicker than we could. The process was– ‘Cause you need to train the AI on your location. We can’t show them this shot, because it’s not what we wanted. What we did was we took a drone to the location where we shot all the battles and we shot stills. We shot hundreds and hundreds and hundreds of stills.

David Read:
Like plates.

John Gajdecki:
At the height of noon, at sunset, and in blue hour after the sun had gone down. And then the AI team used that to build an Unreal model of the location. And that’s what they trained the AI on. They built it up and then they went from there. Now, you can see there’s some things that the AI did that we didn’t really want. These tents are enormous. These tents are the size of a Walmart. But that’s what it gave–

David Read:
Goliath tents?

John Gajdecki:
Yeah, these tents down here. But no one’s gonna notice.

David Read:
That’s true.

John Gajdecki:
We talked about this a lot, and this is where The Two Towers came in. The visual effects were done to a very, very high standard on the show. If we held AI to the same standard, this isn’t the layout that the army is in, for instance. Our location doesn’t have those mountains in the background. The tents wouldn’t be that big. But then we wouldn’t have had the shot, so there was a conscious decision to allow the AI team a little bit more freedom because what we got from them was so cool. That’s what gave House of David the tremendous scope that it had. This shot was done in Maya, and with Houdini armies, and gigantic matte paintings back here which took weeks and weeks and weeks. And they were able to turn something similar out in not half the time, maybe a quarter of the time?

David Read:
That’s extraordinary. We all see where this is going. If we’re gonna cut to the chase here, we’re gonna hand more and more of this over to the computer. And that’s just what’s happening. The YouTube software is beginning to flag some of this stuff, John. Just so you know.

John Gajdecki:
Let’s put it away.

David Read:
No, this is exceptionally, exceptionally cool. I really appreciate you sharing it. I have some questions that fans have submitted here. And I do want to get into this. Bravo. Very well done. And it only makes sense that this is where things are going as these tools become more photorealistic, as you put it. It has to… We accept when we sit down in front of a movie or a TV series that we’re going to be suspending our disbelief to a degree. What we’re seeing has not actually occurred. And once you have that mindset in place, it’s just a question of how far are you, as an audience member, willing to go? You, as the supervisor, have to place yourself in the position, in the head space, of that audience member. From a narrative perspective, it’s like, “OK. This is what the director wanted to achieve. This is what the story, what the script, what the documents have to say about what it is that we’re trying to achieve.” And you get to ingest all of this and make sure that from shot to shot, it is consistent. Like you were talking about, “Well, this guy was standing in this shot because it’s to be consistent from the previous shot.” You’re not looking at any one piece, any one shot, any one frame in a vacuum. The structure of storytelling and narrative, I suspect, will be the same for quite a while. It’s just the tools that we’re using to get there, and that’s what you guys are doing. And I think House of David is a great example of it, and I think it’s perfect that David and Goliath is the story that was used to spearhead that and be used in this example here. It’s wild, if you think about it. Marocmaster, this is interesting. He says, “I hope AI gets so good that I can generate my own fake version of Season Six of Atlantis,” which didn’t come out, “using the scripts to fix my first heartbreak. What is your opinion about it being used to show what might have been?” And I’m guessing by fans. 
Joseph Mallozzi went and created a fanfic version of Season Six, but he also created Atlantis Seasons Four and Five. At some point, someone’s going to be able to go into an AI and say, “OK. Here are your outlines for these stories that he’s made.” It’s 20 stories, just like each season was. Animate them, bring them to life. And software will be able to do that. Now, I’m assuming at some point, I’m curious as to your line of thought on this, that something like iTunes will eventually come along to facilitate providing an end user with content of something and getting the money directly to the right person. Just the same way as an end user could go in 20 years and say, “Put John Sheppard and Todd in a scene, and royalties from that will be connected to Joe Flanigan and Christopher Heyerdahl and MGM, Amazon, or whatever next, and they will get two cents if it’s a two-second shot, for instance.” I’m assuming you would agree that that’s probably where this is ultimately going for the legitimate stuff.

John Gajdecki:
There’s a saying we have: it’s not show friends, it’s show business. And that is what it is. So, until there is that sort of screen-to-bank-account pipeline, you can only use the tools so far. It can’t have a gigantic market, because the distribution… It’s gonna get shut down, because you can’t use somebody’s likeness without a contract covering it, whatever the consideration, whatever the remuneration turns out to be. Can it be done?

David Read:
How many times have they tried to shut down The Pirate Bay?

John Gajdecki:
If they didn’t, it’s ’cause they gave up, or somebody came to an agreement, or they decided it’s not enough of a threat, or anything. But the music business isn’t what it used to be. In any way.

David Read:
My point is to say that if I want something for free, I can still get it. Having said that, there are legitimate pipelines in place where I can still get the brand new song that I want for $1.99 or $2.99. And as far as I’m concerned, that’s still great. And I’m without a doubt gonna get it at great quality.

John Gajdecki:
It is. Look. Everyone I know, ’cause I have no friends, but everyone that I work with, we all do this for a living. These are our jobs, and the only reason we’re employed is because money changes hands and a small portion of it ends up with each of us. And we make a good living, and I love what I do, and I can’t stress any of that enough. So, when things are free, it’s a bit of a problem if they’re taking people’s likenesses or if they’re taking other people’s work or intellectual property. In our society, we trade money for work, essentially.

David Read:
“Money can be used for goods and services,” as Homer Simpson said.

John Gajdecki:
It can, and it is. But I think there’s gonna be a huge market for this. If you wanna see Season 6, if you wanna see Season 66, you’ll make it yourself, your friends might make it, people you’ve never met might make it. The computer itself one day might go, “You know, I’d like to see what happens,” and it’ll make it. But it’ll be hard to get unless there’s a formal legal framework for it.

David Read:
That’s it, and the bigger it is, the bigger its exposure is, the more likely that the lawnmower’s gonna cut that blade of grass first.

John Gajdecki:
If there’s money to be made, you can bet that Hollywood’s gonna be all over it. And I can’t blame them, ’cause it’s a business. And it’s a wonderful business. What it creates, super cool. And when I say Hollywood, I’m now referring to the global entertainment industry, and not just the one in Los Angeles.

David Read:
Kathiescall wants to know, “What happens to digital assets like this after the production is completed?” So, are you guys required …

John Gajdecki:
That’s a good one.

David Read:
… to copy them and get copies over to the studio?

John Gajdecki:
Yes.

David Read:
Do you have a contract where you’re allowed to keep your stuff for your own future work? How does that work?

John Gajdecki:
OK. Netflix really pioneered not necessarily an efficient system, but a very good system, to get all their assets. When we work on a show, there’s a very clear set of guidelines for what they would like at the end, much clearer than the other studios’. So, yes, when we finish a show, all of the assets get copied and sent to Netflix. Sometimes some artists will then take what they’ve built, and it seems to end up on TurboSquid in a modified form so that people can buy it in the future. I’ve never done that. I’ve never condoned that. But sometimes I look at a model going, “That looks suspiciously like something I saw over there.” I have about 90 terabytes of stock that I’ve collected over my 30-plus years in the business. It cost me a lot of money and time to collect it, organize it, and put it in places where we can find it again.

David Read:
And verify through checksums that none of it is corrupted.

John Gajdecki:
Correct. And I have this one favorite piece of smoke and somehow it’s gone missing, and I know it’s in one of those hard drives over there. So the next time I need to find it, it might take me two or three days to dig through that. Or let’s say it will take somebody two or three days to dig through those and find it. But I do remember years and years and years ago, this is a mildly funny story. I was working with a producer, he was a bit of a carnivore, and we were in a production meeting and they were talking about a certain type of effect. And I said, “I have some stock of water that does just that, and this is what I charge for my stock library.” And the producer said, “Wait a minute. You shot that on my last show. You can’t sell it to me again.” I said, “If you can find it in your archives, then we’ll use it for free, but I spent a lot of money to organize it and to find it and to put it on these tapes and to build a database so I can find it. So my fee for bringing it back is this.” No one at the table said, “That’s outrageous.” They’re like, “Of course.”
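The checksum verification David mentions can be sketched in a few lines. This is a hypothetical illustration of the idea, not anything from John’s actual pipeline: build a manifest of SHA-256 hashes for a stock library once, then re-run verification later to catch any file that has silently rotted on a drive.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so multi-gigabyte elements don't fill RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict[str, str]:
    """Record a checksum for every file under the stock-library root."""
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the files whose current hash no longer matches the manifest."""
    return [
        name for name, digest in manifest.items()
        if sha256_of(root / name) != digest
    ]
```

In practice the manifest would be written out as JSON alongside the archive, so a periodic `verify()` pass can flag corrupted elements before you actually need that one favorite piece of smoke.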

David Read:
You’ve curated this collection.

John Gajdecki:
On every show that I do, there’s a small fee to cover my stock library, and I bring the stock library with the show, and there’s 90 terabytes of smoke and wind and rain and explosions. Of course now there’s stock footage companies who have really good stuff. And sometimes we buy their licenses, and of course you always have to buy the enterprise license, not the cheap little indie license, ’cause sometimes their stuff is better than mine.

David Read:
And now you’ve got systems that can create them from nothing.

John Gajdecki:
Yes, we turn all the assets over to the studios. We also keep them ’cause maybe I’ve got a car shot coming up and I’ve had a similar car in the past so I can take that model and reuse some of it or reuse some of the textures or something. Visual effects are very expensive. AI is gonna change that. But we use the assets that we built to make things quicker and faster in the future.

David Read:
Lockwatcher: “Do you prefer practical effects when possible still even when it’s more expensive?” Or are we finally at a point where photorealism can be captured just as easily 100% within a computer? I guess it would depend on the shot?

John Gajdecki:
It’s a shot-by-shot basis, of course. I prefer what looks best and is cheapest, and you have to do that because, again, it’s a business that we work in. There’s the good, fast, cheap triangle you’ve probably heard of.

David Read:
We talked about it.

John Gajdecki:
We pull those levers on every single shot. When I was doing Superman & Lois, we would often shoot fire elements from custom angles because we didn’t have a piece of stock that would match. And while the 3D guys could probably generate it, the way it works is as a visual effects supervisor I have a budget for a certain amount of money to work on a season. Some studios say you can’t move money between episodes. Other studios say, “Of course you can move money between episodes.” If I get production to shoot something for me, it falls under a different budget. And art aside, there’s politics and there’s finance in visual effects. If the team is busy and I don’t have time to create that element, I would go to the producer and say, “Hey, what’s the chance of us doing a fire shoot in the parking lot and you getting me these elements?” And they will then weigh it from their point of view, “What’s it gonna cost you? What’s it gonna cost them? And who’s gonna do it faster?” Every month or two we’d have a big element shoot. We’d sort of gather up the elements that we needed and we’d go in the parking lot at night and blow stuff up.

David Read:
Shoot ’em up.

John Gajdecki:
It’s usually explosions and fire ‘cause there’s a real look to those; while we can do them, the artists are busy doing other stuff. So, I’ll shoot elements when I need to, I’ll use CG when I can. Elements tend to be slower, although sometimes, if we need some smoke for something, I’ll grab my Nikon and set something on fire. There was a show I did years ago called Van Helsing; it was a lot of fun. We needed some bubbling flesh, so we made up, in a pan, something that looked vaguely like flesh, and we heated it up, and it would pop and boil and smoke would come out. We just shot that from a bunch of angles and gave it to the artists.

David Read:
That’s so cool.

John Gajdecki:
In they go. We use elements more than you think, sometimes just for little tiny things because we’re in a hurry. We have deadlines and we have budgets and that was the best way to do that particular thing.

David Read:
CristinaGraziella, “How are the Stargates created digitally?” Is that Maya or was that Lightwave?

John Gajdecki:
That’s a really good question.

David Read:
Which tools were used for that?

John Gajdecki:
Different companies have different pipelines. I’m comfortable with Maya, I’m comfortable with Houdini. And we comp in Nuke. We would use Flames and Infernos, but even back in those days we used Maya, which was then called PowerAnimator, and Houdini, which at that time was called Prisms. Other companies used Lightwave, maybe? I don’t know. You see Blender being used quite a bit. Depending on the company doing the work, they would use different tools. The models were not necessarily transferable. So, for instance, when we were doing Stargate, the puddle pass-throughs: there was the kawoosh, and we filmed that using this very movie camera. Just water shot, just air blasted into– In fact, I’ve got some pictures of that floating around somewhere. And then the pass-throughs were done in Maya, and at the time it was limited to three ripples, which seems ridiculous, but using that toolset you could have three ripples.

David Read:
That’s fantastic.

John Gajdecki:
If you go back and watch the first season or two, you’ll see that the number of ripples are limited before we cut away. Other companies couldn’t make it work as well as we could, and MGM “politely” asked my artists to explain to some of the other companies how we created the puddles and how we made them look so good, and it was a bit of a rebellion. People were kind of upset about that.

David Read:
You politely give me 10,000, 20,000, 30,000, maybe?

John Gajdecki:
No money changed hands.

David Read:
Wow. That’s, I guess, a team player thing?

John Gajdecki:
I was a supervisor for the show. I’m artistically, creatively, and financially responsible for the show. There was no effects producer; it didn’t work exactly the same way back in those days. These days there’s effects producers and they look after the finances. So, as a supervisor, I was asking this company to explain to this other company how to do the work. In real life I was asking my own company to give the competitors the secret sauce. I was watching The Godfather last night and MGM made me an offer I couldn’t refuse.

David Read:
Exactly. There you go.

John Gajdecki:
And that was fine because at the end of the day that was the right thing for the show, and as the show supervisor that was what I had to do.

David Read:
Rudi says, “Do you end up retaining ownership of some rights to the assets yourself or the company which created them which you worked under, or not?” Is it all contract to contract? Where does that…?

John Gajdecki:
All the rights belong to the studio. They’re paying the check, they get everything. We store things like smoke, like I said, I’ve got this stock library.

David Read:
The elements.

John Gajdecki:
There’s not a single actor in my stock library. I can’t reuse a person, I wouldn’t reuse a person; they’re not gonna be doing what we want. But if I shot smoke on one show and I use it on another, technically that’s someone else’s smoke, but everybody does it and if we didn’t do that, effects would be more expensive and generally that’s what we do.

David Read:
Makes a lot of sense.

John Gajdecki:
But we don’t own the rights to it.

David Read:
Raj Luthra–we’re gonna steer this towards the end of this conversation here, and I think he sums it up. He says, “Surely there must be a best of both between visual effects and artists and using AI so no one loses their jobs.”

John Gajdecki:
Huh.

David Read:
People are gonna lose their jobs.

John Gajdecki:
People are gonna lose their jobs. Look, every 10 years or so there’s a writer, director, actor strike and it absolutely blows a hole through the industry. There’s a few reasons why that might happen, some of them political, some of them financial. Obviously there’s the legitimate concerns of the people who are striking, and the concerns of the studio as well. But every time that happens there’s a huge drop in the amount of work. Now, I used to have a Toronto company, a Vancouver company, and a Santa Monica company, and around 2000 there was a strike like that and it wiped out a lot of people, including me. So I lost my house and my savings and my favorite car and I started again when I was 40. And that happens about every 10 years and it ripples through the industry. A portion of the people who are the best, whether they’re the fastest, or the nicest to work with, or the most creative, whatever metric is being used at that time, those people survive the strikes. And the other people who are in the business, but are maybe difficult to work with, maybe their art isn’t what we’re looking for right now, they often don’t survive the strikes, and they move to another career. And it may be more lucrative, it might be less lucrative. It’s not fair. Fair is a funny word, isn’t it?

David Read:
Fair’s a funny word.

John Gajdecki:
But that is what happens, and it’s not just the movie business. So, I’ve been lucky. When these things happen, and honestly, they happen every 10 years or so, I somehow survive each one.

David Read:
You produce quality work, so the people you’ve worked with previously remember you.

John Gajdecki:
People comment on my sense of humor a lot. That I work really hard.

David Read:
It’s definitely unique. Definitely very unique.

John Gajdecki:
I work hard, I have a quirky sense of humor. I learn my material inside and out. When I started Stargate, I read everything I could about that period of Egyptian history, and also watched what I could. When I did Project Blue Book, I read everything I could about the military and also UFOs. So I internalize the subject matter; I become a subject matter expert. But I also like what I do, and I try and let that show. People like having you around.

David Read:
100%, man. I sure do appreciate you. Really quickly, I gotta share with you here something that I’ve been working on. Herbert Duncanson is coming on the show tomorrow. We’ve been wanting him for ages. And this was a headshot that I had of him, and obviously, I have an equilateral square. I guess that’s redundant in this case. But I needed to– What I’ve been doing for years is Photoshopping on the edges, because a lot of these actors’ headshots, they think it’s cool to be off-center. In this case, I don’t have information. So, I tried about two or three years ago to get ChatGPT to do something like this, and it was a mess. So yesterday, I approached it again, and I said, “Here is the base image. Add on the shoulders.” And watch what happened here. It made him clearer.

John Gajdecki:
Yes, it did, didn’t it?

David Read:
To the point where I was really gonna be suspect with that. It’s, OK, it’s gonna give him new eyes, new nose, new shapes. And I don’t know about you, but I cannot see the artifice. It really just made him clearer. I was like, “I wonder how far I can push that?” So, this is Joseph Mallozzi’s first image that I originally had when he first came on 16 episodes ago. He’s done 16 since. I’m like, “I wonder if it can refine this for me.” This is just, again, ChatGPT. It spat out this.

John Gajdecki:
Interesting.

David Read:
My whole point is to say, it’s not there yet. But it will be.

John Gajdecki:
Dude, it will be. There’s no doubt about it.

David Read:
Technology, you cannot stop the wheel of progress. You can only grab on and hold on for dear life.

John Gajdecki:
I’m gonna throw a tiny hand grenade in there before you go. Didn’t mean to cut you off.

David Read:
Be my guest.

John Gajdecki:
I was listening to this podcast. And it was about something else, but the guy made this really interesting statement. And he said, “We’ve been told in the last couple of decades that if we love what we do, and if we get really good at what we love to do, that’s gonna be enough.” But it’s not.

David Read:
Enough for whom?

John Gajdecki:
Enough to be happy in life. Enough for your career to take off.

David Read:
Prosperous? Yeah.

John Gajdecki:
Be prosperous in the way that we all imagine.

David Read:
Exposure is a huge deal.

John Gajdecki:
But it’s not. What is needed is for you to get really good at something other people need. You might wanna do something, but if there’s no market for it, there’s still no market for it. You can be happy. Let’s separate money from happiness. But if you want that career, if you want a job in a field that you enjoy, you’d better find something that you love to do that somebody needs. And I’ve been really lucky. Don’t get me wrong. I love what I do, and I’m lucky that I’m good at it, and I’m lucky that it pays pretty well.

David Read:
That’s it. You are in a sweet spot there.

John Gajdecki:
I really am.

David Read:
I don’t know if I’ve been very transparent about this before, but Dial the Gate brings in about $1,200 a year. It makes about $100 a month on Google Ads. I don’t do a Patreon. I don’t believe in taking money from the fans. This is free for fans. I do this show because I love it. I have other things in my life that function to perpetuate the house I’m sitting inside of. So that I can continue to do the things that I love. I’m consciously aware of that. You guys creating what you’re creating, it takes up so much time and energy. You have to be fairly compensated for it, because the amount of energy it takes to maintain yourselves on the cutting edge of that field is almost everything in and of itself. We’re not even talking about the creative eye and the talent and everything else. You have to keep up with it. Otherwise, you get left behind. John, it has been wonderful to discuss that process with you.

John Gajdecki:
Cool. It’s been wonderful.

David Read:
And thank you so much for taking me through all of your pieces.

John Gajdecki:
Excellent. I hope we do it again and get some more questions from the fans.

David Read:
Absolutely. Thank you, sir. I’m gonna go ahead and wrap it up on this side. Thank you. John Gajdecki, everyone, visual effects supervisor for Stargate SG-1 and Atlantis. My name is David Read. You’re watching The Stargate Oral History Project. We have David Rich coming up in just a moment here. He wrote “Upgrades,” SG-1 Season 4, before superheroes were really cool. We’re gonna speak with him in just a moment here. If you enjoy Stargate and you wanna see more content like this on YouTube, do me a favor. Click the Like button. It does make a difference with the show and will help us continue to grow our audience. Please also consider sharing this video with a Stargate friend. And if you wanna get notified about future episodes, click Subscribe. And if you give the Bell icon a click, it’ll notify you the moment a new video drops, and you’ll get my notifications of any last-minute guest changes. My tremendous thanks to my moderating team. I have a whole host of people who make this show possible week after week. Let me pull up who was involved in this particular one so I give everyone their due here. Thank you to Kevin and Antony for being our chief moderators today. Jeremy, Lockwatcher, Marcia, Raj, and Jakub, you guys all make this happen. Thanks to Matt “EagleSG” Wilson for his Puddle Jumper in the opening, and Frederick Marcoux at ConceptsWeb who keeps DialtheGate.com up and running. A number of shows are heading your way. They are all organized right now on DialtheGate.com, so go and check ’em out. David Rich is joining us in about five minutes, so stay tuned. My name is David Read for Dial the Gate. I appreciate you tuning in, and I will see you on the other side.