Image courtesy of Industrial Light & Magic

Rear projection on Solo: A Star Wars Story

We’ve looked a lot over recent months at the burgeoning technologies being used in modern virtual production workflows.

However, one of the techniques being used in parallel to these (albeit with a modern twist) harks back over a century.

Foundry Trends caught up with Rob Bredow, Head of ILM, about his work using advanced projection techniques as VFX supervisor and co-producer on Solo: A Star Wars Story.


Foundry Trends (FT): How has rear projection advanced since the old days?

Rob Bredow (RB): Rear projection is one of the oldest VFX tricks in the book, literally used for 100 years or so. So it’s not a new creation to project media on a screen - whether front projection or rear projection - and then re-photograph the result to create an in-camera composite.

This is a tried and tested technique, but the latest digital technology allows us to do it at a much higher degree of sophistication than we could have done 100 years ago. On the film Solo, for example, we actually used multiple high-res 4K laser projectors - so they were very bright, they had very good contrast - and we seamlessly stitched those multiple projectors together.
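Stitching adjacent projectors seamlessly typically relies on edge blending: in the zone where two projectors overlap, each one ramps its brightness down so the two contributions sum to full brightness. A minimal sketch of that idea (the gamma value and ramp shape are illustrative assumptions, not ILM's actual calibration):

```python
import numpy as np

def edge_blend_ramp(width, overlap, gamma=2.2):
    """Per-pixel gain for the right edge of the left projector.

    In the overlap zone the two projectors' contributions must sum to
    full brightness, so each side gets a smooth falloff. The ramp is
    defined in linear light, then encoded for a gamma-applying display,
    so that (signal_gain ** gamma) equals the intended linear gain.
    """
    gain = np.ones(width)
    ramp = np.linspace(1.0, 0.0, overlap)           # linear-light falloff
    gain[width - overlap:] = ramp ** (1.0 / gamma)  # encode for the display
    return gain
</antml>```

The mirrored ramp goes on the neighbouring projector's left edge; in linear light the two ramps sum to exactly 1.0 across the overlap, which is what hides the seam.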

In the case of the Millennium Falcon cockpit, when you looked out, you saw a screen wrapped 180 degrees around the cockpit. It was very much how you might do a live film, except we were doing it on location, and projecting digital media.

Much of that media was pre-rendered, so as we were hopping into hyperspace we were shooting the transition at final picture quality - playing back an 8K x 4K clip - and the really nice thing is, we had a bunch of interactive elements on top of it that we could cue at a moment’s notice.

In some cases, Bradford Young would call out that he wanted more of the green blaster fire backlighting the actors for a particular moment, because of where the camera was. We could interactively cue that blaster fire exactly where we wanted it on the screen, and continue to achieve full, photorealistic in-camera finals with those shots.
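The core of that workflow - pre-rendered media running on its own clock, with named interactive elements that an operator can fire at any moment and layer on top - can be sketched very simply. This is an illustrative toy, not ILM's playback system; the overlay names and durations are hypothetical:

```python
import time

class CuePlayer:
    """Toy model of cueing interactive overlays on top of a
    continuously playing pre-rendered clip."""

    def __init__(self, overlays):
        self.overlays = overlays   # overlay name -> duration in seconds
        self.active = {}           # overlay name -> time the cue fired

    def cue(self, name):
        """Fire an overlay immediately, e.g. on a radio call."""
        self.active[name] = time.monotonic()

    def layers(self):
        """Overlays still inside their playback window right now."""
        now = time.monotonic()
        return [n for n, t0 in self.active.items()
                if now - t0 < self.overlays[n]]
</antml>```

A compositor polling `layers()` each frame would draw whichever elements are currently live over the background clip, which is why a cue can land "exactly where we wanted it" without re-rendering the media.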

We did that for the entire Kessel Run sequence, and in fact for most of the sequences where we were flying in the Millennium Falcon for the show. That combination of very high quality media, very high quality projections, and the ability to add interactive elements, really pushed things forward for us.


FT: Why is rear projection now coming to the fore again?

RB: The fun of combining these old school techniques with cutting-edge technology is really getting back to a more complete experience on-set. For a while, we found it incredibly convenient (and it still often is convenient), to use a blue or a green screen, and to fill in what’s happening outside the windows later.

That’s still a viable option. But more and more, as audiences are used to that trick, it’s nice to surprise them with a new one - or in this instance, an old one - and one of the things you get when you put everything together on-set in real-time, is a more creative and immersive experience.

A great example of a shot that was different because we used this technique is the moment when Lando and L3 are sitting in the front seats of the Millennium Falcon, and Han walks into the cockpit for the first time. The shot starts in that quintessential forward shot, over Lando and L3’s shoulders as they push the bar forward to go into hyperspace. But you’ll notice, most Star Wars movies cut at that point, because you've exhausted the shot.

On Solo, because DP Bradford Young and 1st camera operator Sylvaine Dufaux were able to experience this all in the moment, they actually designed a new shot which started over the shoulder, then pushed to camera right where Han was standing, and you can actually see the reflection of hyperspace in Han’s eye. It is a really nice moment in the film, with a shot that was designed on set, mostly because we had all of those interactive elements together.

The projections helped the camera department, but they also helped the actors, who had this fantastic environment to react to, putting them in the mood of the scene.

Hopefully audiences pick up on the subtle differences that we can achieve now in camera that we couldn’t before.

FT: How did the actors find working in a rear-projected environment?

RB: One of my favorite moments on set was the very first time we put the actors in the cockpit, and they were rehearsing the scene, and at the end of the scene they go into hyperspace.

When they first walked in, we just had stars up on the screen. We had the lights low, and we didn't tell them in advance that I was standing-by ready to cue hyperspace.

When they finished the scene, they pushed the levers forward, and I cued hyperspace over the radio to Phil (our operator who was triggering the media) and we put them into hyperspace - and the special FX team simultaneously moved the hydraulic rig to shake the Falcon, and the cast just flipped out - they just couldn't believe it. They felt like they were really in the Millennium Falcon cockpit.

Donald Glover - who plays Lando in the film - was in the cockpit at the time. After everyone had calmed down from that initial surprise reaction, Donald was just sitting there talking to himself, and I could hear him over the headphones - and he says quietly to himself, ‘This is the coolest thing I have ever done.’

I realised that we were really giving them an experience that was changing their reaction in the movie, which is fantastic.


FT: Tell us a bit about the different projection techniques used on Solo.

RB: For the film Solo, we used rear projection, front projection and LED screens, all dynamically driven by pre-rendered and real-time media. In the case of the Millennium Falcon cockpit, the space requirements on the set made rear projection the number one choice. LEDs could have worked as well, but it was a collaborative decision between Bradford Young and me to choose projectors in this case.

You're able to get higher resolution with projectors than you can with LEDs right now. That’s slowly changing as LEDs get a finer and finer pitch, but to get to that same really high picture quality, rear or front projection still gives you the most pixels per inch. Bradford Young, our DP, likes working at relatively low illumination levels - he likes working with relatively dark scenes - so in our case the projectors gave us plenty of light to work with. It was more intuitive for him.
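The pixels-per-inch trade-off comes down to simple arithmetic: pixel pitch is the physical screen width divided by the horizontal pixel count. The figures below are illustrative assumptions (not the actual Solo set dimensions) to show the scale of the difference:

```python
def pixel_pitch_mm(screen_width_m, horizontal_pixels):
    """Physical distance between pixel centres, in millimetres."""
    return screen_width_m * 1000.0 / horizontal_pixels

# Hypothetical example: one 4K projector filling a 6 m wide screen
# gives roughly a 1.5 mm pitch; LED panels of the era were commonly
# coarser (e.g. 2.5 mm or more between emitters).
projector_pitch = pixel_pitch_mm(6.0, 4096)
</antml>```

Stitching multiple projectors across the same screen shrinks the effective pitch further, which is why projection still won on pixels per inch.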

Whenever you’re in Dryden’s ship - that’s the giant space yacht you see throughout the movie, with the party scene and the scene up in Dryden’s office - everything you see out the window is done with front projection. The projection screen wraps around almost 360 degrees. Those are all in-camera final shots as well.

For some of the speeder chase work, we used poor man’s process. We did a lot of those shots on location with CG enhancements in the background, but for some of the acting moments we shot back at the studio. When we did the stunt work, we mostly used LED screens for illumination. So it was a combination of all those techniques.

FT: How do you handle the huge amounts of data generated from projections at those really high resolutions?

RB: It does require a lot of data, and a lot of data prepared in advance. I think one of the key differences with this workflow is that you need to be prepared with everything you want to shoot before the photography takes place.

So if you’re doing a scene like the Kessel Run, that's pretty significant in that you need to create all that media at full resolution, at full quality, in advance of the photography. So it takes the traditional post production process and flips it on its head, where you have to do what you would traditionally do in post in pre-production.


FT: Did you use any other Virtual Production tools on Solo?

RB: Yes, we used an array of techniques on this show. We used VR as a pre-visualization tool. One example of that was when we were trying to work out the scale of the train in the train heist: whether the train cars were a good distance from each other to make the jump between them, and whether it felt like the right scale in real life.

We built all that up in VR and let the creatives on the show experience it, including James Clyne, the designer who led all that work. He actually put the headset on and said, ‘Can I try jumping from train car to train car?’

He took this big running jump to see how that would feel as an early test when we were playing with the scale of that.

Also, when we were shooting the train heist sequence, we shot with blue screen in the background, rather than using projection techniques, because with the extreme angles and scale it was more practical to do it with blue screen.

And for that scene, we did real-time, on-set compositing - we had the virtual environments and the photography available, so that we could composite the sequences together in real time while shooting.
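At its core, an on-set slap comp like this pulls a rough matte from the blue-screen plate and lays the plate over the pre-rendered environment. A minimal sketch of the idea - the chroma-distance keyer and its tolerance value are crude illustrative assumptions, nothing like a production keyer:

```python
import numpy as np

def slap_comp(plate, environment, key_color=(0.0, 0.0, 1.0), tol=0.4):
    """Rough composite: pull a crude chroma matte from the plate
    (per-pixel distance from the key colour in RGB, floats 0-1) and
    lay the plate over the pre-rendered environment."""
    dist = np.linalg.norm(plate - np.asarray(key_color), axis=-1)
    alpha = np.clip(dist / tol, 0.0, 1.0)[..., None]  # 0 where pure key colour
    return plate * alpha + environment * (1.0 - alpha)
</antml>```

Run per frame against the live camera feed, even a matte this crude gives editorial something far more complete to cut with than raw blue-screen plates.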

Then when we left from a day’s shoot, we actually had the slap comps (the rough composites) of every one of those shots. We walked off the set that day with those in the can, so editorial could cut with material that was much more complete, which really helped with the process.

FT: Do you think rear projection is going to have a renaissance now?

RB: I think there will be a bit of a resurgence of these and other newer techniques. We’re taking some of the work we built, starting with Rogue One, where we used LED screens as dynamic lighting sources to illuminate sets and cockpits, and building on that with Solo.

We have new shows that we’re working on now where we’re doing even more advanced things with real-time visual effects, and I think for directors and creative teams that really like to capture things in-camera, and like having the plan laid out in advance for getting that all on the day, it’s a really valuable tool.

And of course, it’s nice to have all of the options available to us. When you have those projection screens up, or LED walls, if you need to float in a digital green screen or a blue screen, you can do that too.

So you get to choose in the moment whether you do something and capture it in-camera, or whether you work on it later in the post production process if something comes up on the day. So it does give us quite a lot of nice flexibility.