A Career Empowering Colorists: Talking with FilmLight's Peter Postma

Hi everyone, I'm Kali from Mixing Light and I'm

here today talking to Peter Postma

who's the Managing Director

for the Americas at FilmLight.

Thank you so much for joining me

Peter, it's such a pleasure to have you on.

Yeah, thanks for having me.

So you've had such an interesting career.

I met you in your capacity as

Managing Director at FilmLight, but

you have worked at Kodak as a

colour engineer prior to that.

And you're an alumnus of

Rochester Institute of Technology.

So I just wanted to begin by talking to you about

this career trajectory moving

from Rochester to Kodak and then to FilmLight.

These are three real staples of our industry.

So can you tell me a little bit

about what you studied at Rochester and

how that led you to Kodak?

Yeah, I mean some of it was just kind of being in

the right place at the right time,

I guess.

But initially, my interest was actually more in

just kind of computer graphics and

video games in particular.

So I initially was

studying computer science at RIT.

But I quickly found that on the technical side of

things, I wasn't learning

anything in school that I couldn't, I kind of

quickly learned myself just by

reading textbooks and

playing around and stuff like that.

It was the creative side of making images and

making movies that I needed

feedback on.

So after a year in computer science, I actually

switched to the Department of

Film and Animation and ultimately

graduated with a Bachelor of Fine Arts

in Film and Animation.

But as part of my electives, of course, we

have the Center for Imaging

Science right next door to the School of Film and

Animation, and I said, "Oh, there are

some interesting classes about

colour and vision over there."

So I started taking those and then, yeah, just kind

of saw how I could kind of

really tie together my interests on

the technical side of things and my

interests on the creative side of things by really

focusing on colour for our

industry.

Because a lot of the resources out there for

learning about colour and stuff like

that are much more focused on print press work and

stills and stuff like that.

There wasn't a lot focused on

the motion picture industry.

So it's kind of an exciting area to study.

Definitely.

And it still feels a little bit like that.

I mean, I think, like you

said, right place, right time.

Rochester is one of the only universities that I'm

aware of that does teach colour

science and imaging science

in a really meaningful way.

I certainly know that here

locally, we don't have anything like it.

And so after that, like Rochester and Kodak have a

bit of a crossover of some

kind.

Is that right?

Yeah.

I mean, Rochester is an interesting town because it

is kind of in the middle of

nowhere, upstate New York, but it's the

headquarters for Kodak, the headquarters

for Xerox, the headquarters for Bausch & Lomb,

which is a big lens company, and

Corning, who makes glass for

all kinds of optical devices.

So it really is this kind of hub of kind of optics

and colour science and stuff

like that.

So, yeah, luckily, it was actually one of my

professors in my last year at RIT who

worked for Kodak and was just working as an adjunct

professor who said, "Hey,

there's this opportunity at

Kodak you might want to apply for."

And so I did.

And yeah, I ended up being

with Kodak for five years.

But my first job there, as just kind of an

intern, was actually looking at

different digital image processing

techniques for their laboratory, because it

was really right at that cusp where things were

starting to go from film to digital, and

what kind of tools they'd want to have just for

their own internal team to

assess and process images.

Oh my goodness, that's

fascinating, to be there during that transition.

And without implying anything about your age, can

you tell me what year it was then?

Yeah, this was around 2000.

Yeah, around 2000.

OK, so a good 12 years before that

change to digital really

happened in a commercial way.

Yeah, so it was actually 2001 that I moved out to

Los Angeles with Kodak, because I

was initially just in Rochester.

I was still officially at Kodak, but I

worked right next door to

Cinesite, which was the place

where some of the first DIs were done.

So it was right after 'O Brother, Where Art Thou?'

and one or two other films had

been finished. So really right at

the forefront of that whole DI thing.

And I remember, yeah, I was on the team that helped

build and deploy the first hardware

that could apply a 3D lookup table in real time.

So it was this crazy Sun Microsystems computer with

all these FPGAs in it, just to be

able to process a 3D LUT in real time so the colorist

could see what the film would look

like once they went to record and print it out.

Which is such a simple thing that every color

corrector does now; you apply LUTs all

the time, you apply three or four.

But yeah, back at the time, that was quite novel.
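For readers curious what "applying a 3D LUT" means mechanically, here is a minimal sketch (Python with NumPy; the identity LUT at the end is just an illustration, not a film-print emulation) of the trilinear interpolation that hardware was doing in real time:

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Apply a 3D LUT to one RGB triple (values in 0..1) by trilinear
    interpolation. `lut` has shape (N, N, N, 3), indexed [r, g, b]."""
    n = lut.shape[0]
    # Scale the 0..1 input onto the LUT grid, then split into the
    # integer cell corner and the fractional position inside the cell.
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.minimum(pos.astype(int), n - 2)
    f = pos - lo
    out = np.zeros(3)
    # Blend the 8 lattice points surrounding the input color.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0])
                     * (f[1] if dg else 1 - f[1])
                     * (f[2] if db else 1 - f[2]))
                out += w * lut[lo[0] + dr, lo[1] + dg, lo[2] + db]
    return out

# Identity LUT: output equals input, so the interpolation is exact.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity))  # ~[0.25 0.5 0.75]
```

A real print-emulation LUT simply has measured film colors at those lattice points instead of the identity; the interpolation is the same.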

So that's unreal.

I mean, we really take it

for granted now, don't we?

And was it during that time that you were part of

the ASC committee that developed the CDL?

Yep. So right when the ASC Technology Committee

formed, luckily, the first meeting just

happened to be on Kodak property.

So I managed to get in the room, so

to speak, and start talking about it.

And yeah, because I already had some experience

supporting digital intermediate at that

point. Yeah, I was part of those early discussions

about building the CDL, because it

was quite clear at that time

that things were going digital.

But also just for supporting film. You know, it

used to be you'd either rent out a movie theater on location

or have a trailer with a

projector, and actually do film dailies.

And so this is around the time when film dailies

were going away and digital dailies were

starting to take over.

So you were scanning the

film on a telecine every day.

But the directors of photography saw that they were

kind of losing a key bit of

feedback that they got from the lab because it used

to be, you know, you could look at

what your printer lights were every day and know

kind of how things were going, whether you

needed to overexpose a little more, underexpose

a little more, or whether the film

from the last day was good.

But when you just get, you know, a DVD or something

with your dailies on it, you don't

really know, did the telecine operator have to do a

whole bunch of work to make those

dailies look good, or were they just, you know,

sliding right into place as they should?

So the CDL was originally conceived as a way

to be able to give feedback to DPs

to say, here's some specific numbers so you can see

if it is in fact being consistent

every day or if something changed and then, oh,

maybe there's a problem with the film or

a problem with the camera or something else with

exposure, we need to go back and check

to provide that kind of key bit of feedback.

I love that idea, that the CDL isn't

so much a grade as it is a

communication tool between the departments, because,

you know, that kind of ties into

how CDLs are used now throughout the whole process.

They're communicating that idea from set through to

VFX and through to DI, if

you want to start there.

And to have people who could look at those numbers

and say, oh, this is what they must

have done, you know, it just never occurred to me

that that was happening on the camera

side. So, yeah, that's really cool.

I mean, with printer lights, it does seem a little

bit more straightforward to say, oh,

they put in, you know, a stop of red or took out a

stop of blue or what have you.

But, you know, with only 10 numbers in a CDL, you

could look at it and say,

oh, yeah, they changed the

contrast and they offset it up a bit.

So. Right.

Well, and that's actually one of the reasons why we

went with slope, offset, power for the

CDL rather than lift, gamma, gain: in

normal lift-gamma-gain controls, there

is no straight offset.

So if you wanted to do just printer lights, you

actually need to combine, you know, two

or three controls to do that.

But when you have offset,

that can be your printer light.

So if you do want to stay more in

that kind of printer-light mode, and only

have three numbers to worry about,

you can just use the offset part of the CDL.

And so you only have to track three numbers still.

So that would all still work.

And the beauty of that as well is

that it's completely invertible.

You know, anything that goes up can

just come down again or vice versa.

So you maintain the linearity of the negative,

which is, you know, just a really great and

useful thing to have for other departments anyway.
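The slope/offset/power transform being discussed is compact enough to sketch. A minimal Python illustration of the per-channel math and the invertibility of a pure offset (the clamp before the power is a common implementation choice, an assumption on my part, not something from the interview):

```python
def cdl(x, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL per-channel transfer: out = (x * slope + offset) ** power.
    Negative intermediates are clamped so the power stays defined."""
    v = x * slope + offset
    return max(v, 0.0) ** power

# A printer-light-style move uses offset alone...
graded = cdl(0.40, offset=0.10)       # 0.50
# ...and the opposite offset undoes it exactly, which is the
# invertibility Peter mentions:
restored = cdl(graded, offset=-0.10)  # 0.40 again

# Slope and power bend contrast instead, so round-tripping them
# takes division and a reciprocal power, not a simple negation.
contrasty = cdl(0.40, slope=1.2, power=1.1)
```

That is why offset alone can stand in for printer lights: it is a plain up/down move that "anything that goes up can just come down again."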

Right. Yeah.

Yeah. So it's interesting to me that you've kind of

been a part of these committees along

the way that have in a really large way shaped the

workflows and the way that we work

every day.

And yet you're sort of not like

a household name, so to speak.

But you've certainly been there and kind of

developed the way that we work today digitally.

And you're continuing to do that.

And I know I was skipping ahead a little bit, but

you're part of the ACES project at the

moment as well, aren't you?

Can you tell me a little bit about that?

Yeah, I mean, actually, I started with ACES again

very early on, probably over 16 years ago.

I mean, when the first seeds of ACES were planted.

So it was actually while I was still at Kodak.

And there was already this recognition that, again,

things are going digital.

We should have some kind of industry

standard for how to process images.

And it took, I think, close to 10 years

before ACES 1.0 finally came out and

everyone could agree: OK, this is at

least a good first step, if not, you know,

everything we wanted it to be.

So, yeah, that was really just kind of

encapsulating some of the best practices of the

industry at the time to how to

set up a color managed workflow.

And that work continues because, again, when ACES

started, it was still very, very film

centric. Like, a lot of time

was spent on, OK, how do we bring film

scans in to ACES and how

do we get back out to film?

And of course, these days, that's not so much of a

concern as dealing with all the different

digital cameras, dealing with high dynamic range

display devices and all that kind of

stuff. So, yeah, the ACES committee is getting

pretty close to releasing ACES 2.0, which is

a kind of major upgrade that more directly takes on

those concerns of, yes, super wide

gamut devices, both on the camera and the display

end and dealing with different dynamic

ranges and all that kind of

stuff in an even better way.

And that would be, I mean, there have been recent

developments in LED walls, you know, and Unreal

Engine, utilizing backgrounds that

have very saturated and potentially

out-of-gamut luminosity and colors to them.

Is that part of what you're

looking at with ACES 2.0 as well?

Yes, I mean, ACES was always designed to be able to

kind of accommodate any possible color.

So any color the human eye can see, you know, is

able to be put in an ACES container.

And there's certain issues that,

you know, ACES can't solve for you.

But certainly it could be, again, that good

foundation, that color managed

workflow that things then fit into.

So it's not going to have like a push button simple

fix to make your screen match

your camera and stuff like that.

But it's part of the workflow you set up that makes

that stuff easier and should make

it consistent so that you can use the

same approach no matter which LED wall you're

shooting with or which camera you're shooting with.

That, I think, is the biggest thing: not having to

be dependent on, like, you know, specific

manufacturer solutions, so that,

OK, if I'm shooting with a Samsung wall one day and

a Sony wall the other, I don't have to use completely

different techniques just to bring it

back to a basic industry standard. Then you can

put your special sauce on top of that.

But at least the basic workflow is the same no

matter what you're doing.
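That "same basic workflow, any device" idea can be sketched in code. Everything below is illustrative: identity matrices stand in for the real input/output transforms, and the names are made up, not actual ACES components:

```python
import numpy as np

# Device-specific transforms on the ends; one shared working space in
# the middle where all creative decisions live.
INPUT_TRANSFORMS = {"camera_a": np.eye(3), "camera_b": np.eye(3)}
OUTPUT_TRANSFORMS = {"sdr_monitor": np.eye(3), "hdr_monitor": np.eye(3)}

def render(rgb, camera, display, grade):
    """Normalize the source, grade once, then target any display."""
    working = INPUT_TRANSFORMS[camera] @ np.asarray(rgb, dtype=float)
    graded = grade(working)
    return OUTPUT_TRANSFORMS[display] @ graded

# The creative intent (here, "one stop brighter") is expressed once
# and survives swapping the camera or the display:
one_stop_up = lambda v: v * 2.0
a = render([0.1, 0.2, 0.3], "camera_a", "sdr_monitor", one_stop_up)
b = render([0.1, 0.2, 0.3], "camera_b", "hdr_monitor", one_stop_up)
```

The "special sauce" Peter mentions would sit inside `grade`; the device ends stay standardized.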

It's fascinating, like the complexity of that and

just trying to deal with all of the variations.

Has there been anything particularly challenging

technology-wise along the way, like, you

know, in terms of film going out and

digital coming in or any particular technologies

that have been tough nuts to crack?

That's a good question.

I mean, I consider myself very lucky to have

started my career when film was still around, so

that I could learn a lot from the limitations,

really, of the photochemical process, and from how much

the engineers at Kodak were able to do to get

great-looking images out of it.

And to me, you know, being the young guy in the

room, a lot of times in those days, it was quite

clear that digital was the future.

That film only had so much life left on it.

But there were a lot of lessons learned that could

then be applied in digital and

digital should be a lot simpler because,

you know, there's so much intricacy involved in the

photochemical process for film.

You know, there's this whole industry that supports

all the different technologies that you need to be

able to lay down the chemicals

on a strip of film to expose it.

But then also the laboratories which can then

develop it and turn that into

an actual image and everything.

You know, if someone were to try to build that

infrastructure today, it would just be impossible.

Whereas digital, I thought, was

going to be a lot simpler.

You know, it's just like you've

got pure numbers you can look at.

You can go back to those same numbers.

You know, you have that magical

undo button that we have in digital.

But yeah, because as you say, there's constant

technology changes, there's new sensor

technologies, there's new display technologies.

That's, I think, the trickiest part, really: it's not

any one particular device or technology

that's presented a problem.

It's just you have to keep up with all these

different things that are being produced in all

these different corners of the workflow that then

keep pushing it and evolving it.

And I think the biggest change, obviously, in the

last few years has been high dynamic range

displays, both, you know, laser projectors in the

theater that can get a lot brighter and a lot

darker, so much better

contrast and HDR displays in the home.

And we did some very early work at FilmLight with

Dolby, you know, on their early HDR

technology and stuff like that.

But the biggest hurdles, I think, for HDR actually

ended up not being technical.

They ended up being creative. It's like, OK, we now

have this widely expanded palette.

How are we going to take advantage of it?

And I think that's something we're still kind of

figuring out as an industry is,

yeah, what's the best way to use HDR?

And that answer is often

different for every project.

You know, how much of that extended highlight

range, how much of that extended saturation range

do you want to take advantage of?

And when do you know to be reserved

and not push it just because it's there?

Yeah, the aesthetics of

standard dynamic range have

been around for 100 years.

And, you know, we kind of have a sense of where,

culturally, we think skin tones should sit and

where highlights should sit.

But we're kind of like you say,

it's a new frontier creatively as well.

And I think another level of complexity, which I'd

love to understand as well, is that, you know,

working for companies like Kodak and now FilmLight,

you're in a position to be

supporting and contributing globally.

And, you know, different cultures and different

locations have different standards.

So, you know, sometimes I forget that there are

whole other workflows out there and whole other

displays, you know, in China and in,

you know, the East and other places.

Can you comment at all about dealing with those

kinds of culturally

specific technologies?

I don't know how to say that properly.

Yeah, no, I think, you know, because I

am more on the technology side,

you know, helping the creatives do what they do.

My biggest thing is just to provide the tools so

people can take advantage of

those new technologies, and to guide them a

little bit as to how we think

they're supposed to be used.

But then listening to the creatives once they say,

oh, I like this, I don't like this and just

constantly evolving that.

And yeah, you're right. It can be

very different for different cultures.

I mean, even if you look at just like stereoscopic,

you know, which had a big resurgence a couple of

years ago, it's still actually quite huge in China.

For whatever reason, in China, they really like

their stereoscopic 3D movies.

So we still have to maintain our 3D tools, even

though it's much less common, you know, in the US

and Europe and some other parts of the world.

So, but yeah, so we maintain all those tools.

And if, you know, someone on our side of the world

wants to take advantage of those tools,

they're there.

But if they don't, they don't have to.

So we've talked about being at Rochester and Kodak.

What was the work that you were

doing at Kodak as a color engineer?

Like, what would a day look like in that role?

So initially, as I said, it was kind of just

evaluating software tools that were in the industry

for people at the research lab at Kodak to use.

So it was because Kodak, of course, had developed

the Cineon software, which was one of the first

digital compositing packages.

And that used to be their

own internal tool as well.

But once Kodak abandoned that, you know,

technology and the industry moved on.

So they needed to move on, too.

So I was helping them evaluate, OK, what can they

do with Shake, which is another compositor, or

Rayz, which was a new compositor

coming out at the time.

And also just different tools for

dealing with digital still cameras.

And so actually, once I had done my initial

evaluation of those different devices, that's when I

got involved in a project that would eventually be

called the Kodak Look Management System, where you

could take an image with a digital still camera and

then put it through a film emulation to see,

well, what would this look like if I

shot with, you know, 5218 negative?

Or what would it look like if I

shot with a slower speed film?

What would it look like if I did a

bleach bypass or things like that?

So we had these digital emulations

of all these photochemical processes.

And so when the director and the DP are on set

deciding, you know, what stock they want to go to,

what film process they want to use, they could

shoot an image with a digital still camera on the

day, put it through the software and

emulate all those different things.

So for a couple of years, I was involved in that

project, profiling

different SLR cameras to get them into that

software and then making sure we were accurately

emulating those different processes.

What kind of tools did you use to do that?

Were you like, you know, looking at things through

a microscope or, you know,

like how does one do that?

It really is a mix of things.

And a lot of it was kind of just homegrown by some

of the color scientists I worked with, and just kind

of evolved over the years.

So a lot of it was just standard densitometry.

So, you know, like you record out a bunch of film

patches or different densities on film and then

measure that with a densitometer.

And you take that information, or you measure it

with a spectrophotometer so you can get the full

spectrum of the light shining through the

film, and capture that and put it into your model.
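For context on what a densitometer reading means: optical density is the negative log of transmittance, so every 0.30 of density is roughly a stop of light. A standard-formula illustration (the general relationship, not Kodak's internal model):

```python
import math

def density_to_transmittance(d):
    """D = -log10(T): density 0 passes all the light, density 1
    passes 10%, density 2 passes 1%, and so on."""
    return 10.0 ** (-d)

def transmittance_to_density(t):
    """Inverse of the above: T back to status density."""
    return -math.log10(t)

print(density_to_transmittance(0.30))  # ~0.501, about half the light
print(transmittance_to_density(0.10))  # 1.0
```

So a film patch measured at density 3.0 is passing only a thousandth of the light shone through it, which is why log-style measurements are the natural way to characterize film.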

For profiling like the digital still cameras, we

actually came up with this little device that had a

piece of dichroic glass in it that you could slide

up and down to basically

capture the full spectrum of light.

So we had a full spectrum white light behind this

dichroic glass that you could shift in like exact

increments and basically take a

whole bunch of stills to profile.

So basically you could see how it responded to each wavelength of

light and get a full profile of how that

digital sensor reproduced light, then

match that up with your measurements of the film.

And that's how you got

the emulation between the two.
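The "match that up" step is essentially a fitting problem: given each device's response to the same series of narrowband lights, solve for a transform from one set of responses to the other. A toy least-squares version with fabricated numbers (nothing here is real Kodak data; a full camera-to-film model would be far richer than a 3x3 matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = narrowband test lights (e.g. 400-700nm in 10nm steps),
# columns = the R, G, B response of each device to that light.
camera = rng.uniform(0.0, 1.0, size=(31, 3))
true_mix = np.array([[0.90, 0.10, 0.00],
                     [0.05, 0.90, 0.05],
                     [0.00, 0.20, 0.80]])
film = camera @ true_mix.T  # pretend film-measurement data

# Solve for the 3x3 matrix mapping camera responses to film
# responses in the least-squares sense.
matrix, *_ = np.linalg.lstsq(camera, film, rcond=None)
max_err = np.abs(camera @ matrix - film).max()  # ~0 on this clean data
```

On real measurements the residual would not be zero, and minimizing it (and deciding where the remaining error should live) is where the color science comes in.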

So that's unreal. I mean, that kind of physical

thing, you know, I just don't think a lot of us

actually take into account that there are like

prisms and bits of glass.

Like, you know, that feels like something that

happened in like the 1800s when they were first

learning about the different

colors of light and things, you know.

But obviously it's still a part of practice today,

just in a more sophisticated way.

But I've never once in my life, you know, gotten a

prism and looked at light through it.

I just think, wow, that's really interesting.

And so, when you moved over to FilmLight, were

you doing similar kinds of work there

or was that a bit of a shift for you?

It was a bit of a shift, but not radically.

So my first job at FilmLight was supporting our

Truelight color management software.

And it was going in and

helping people calibrate their DI theaters

to match, you know, what they were

ultimately going to see on film, because at that

time film was still considered

the hero deliverable for most shows.

Digital cinema was already starting to roll out.

But since most directors were

used to seeing film in a theater,

they kind of considered film

their primary deliverable.

So they wanted to make sure that when they were color

grading in DI, they were seeing as close as

possible what the film was

ultimately going to look like.

So that was, again, yeah, measuring film with

densitometers to make sure we were accurately profiling

the output of, you know, that facility's specific

film recorder and the lab

that they were working with.

And then measuring with a probe the projector to

see how it's reproducing light.

And again, you know, meshing those two together to

make the digital projector

look as close as possible to film.

So, yeah, there was a lot of running around.

It was just fun; as you know, FilmLight is a very

international company, and a lot of

different corners of the world

were starting to get into DI.

So I traveled around quite a bit,

just going into these different DI

theaters, helping them calibrate

their rooms and set them up for DI.

I bet they looked good, you know, like once they'd

all been calibrated, I bet they were

looking the best they'll ever look.

Yeah, of course, the trickiest thing with film is

that because it's a photochemical process, you

know, it's not 100 percent locked down.

So that was always a frustrating thing:

you know, we could always exactly

match what the lab did yesterday.

The question is, is that what's

going to come out of the lab tomorrow?

So sometimes you'd have these awkward conversations

where you'd have to figure out, you know,

is the lab at fault here, or

are we doing something wrong?

And so, yeah, it's kind of funny that in those

days, towards the end of the film era, I think

film laboratories were at their absolute best,

because, you know, you had this rock-solid digital

projector to compare them to.

So the post house could say, you

know, hey, the lab was a little off today.

We've got to reprint this; you've

got to fix what you're doing.

So the labs actually got better.

They were running much tighter controls as well.

So yeah. Yeah.

Oh, that's so interesting to

be able to side by side it.

You know, I find today that I'm constantly

hearing back from clients that, oh, we just watched

the DCP and it's not looking the

same as it did in the grade suite.

And then you have to say, well, you

know, have you measured the screen?

Do you know if the lamp is at full brightness?

You know, you've got so many questions and

commercial cinemas, I don't think,

are doing that level of calibration.

But it's that thing of just, you know, being able

to say to somebody, look, when we looked at it

during the D.I., it was correct there.

And if it's deviating, at least we

know where it's deviating from.

Right. Right.

Because you can't control every screen.

But wouldn't it be nice to be able to send someone

out and say, OK, it's going to get calibrated

before you screen for the

short film festival or something?

Well, I'm lucky enough to live in Los Angeles where

some facilities, certainly

for larger titles, will do that.

You know, if they're going to have a big film

premiere or, you know, they know

that it's it's an important screening,

they will actually send someone out to measure the

screen and make sure it's properly set up because,

yeah, that is one of the most frustrating things, I

think, about what we do:

you can always trust that a

calibrated environment is going to look great.

But once it gets onto someone's screen in their

home, who knows if they'll actually

be seeing what you intended or not.

So, yeah, that's right. Yeah, I do joke that it

would be nice to send my Sony monitor out for

everyone to look at things on and just ditch their

iPads and phones and TVs.

But unfortunately, we can't

provide everyone with that.

At the same time, one of my favorite anecdotes was

from a DP I talked to who

watched digital dailies on his laptop.

And he said it was great

because he could grade them himself.

If he wanted them a little darker,

he'd just tilt his laptop one way.

If he wanted them brighter,

he'd tilt it the other way.

So he'd just grade his dailies by tilting the screen.

Oh, my goodness. Oh, that gives me

the chills. I don't like that at all.

For dailies, that's fine, I think. But yeah,

certainly when you get to the final, you don't want

to be leaving it to that.

It worries me, especially in short form, that some

people are watching things on their phones, you

know, at the pub or something.

Yeah. So, going all over the world, calibrating

things, and using the Truelight system.

I mean, was the Truelight system in

existence before Baselight,

or part of it? How did that work?

So FilmLight actually got started as kind of

the R&D group from the Computer Film Company,

and early versions of our first three main products, the

Northlight scanner, the Baselight

color correction system and Truelight color

management, all existed at CFC

before it split off and became a separate company

called FilmLight, where we

said, hey, we can, you know,

bring these tools to the industry at large.

It doesn't have to be, you know,

just for this one company.

So Northlight, our scanner, was actually the

first product that really took off.

Truelight and Baselight came out at

around the same time, but Truelight definitely got

traction a lot quicker, because Truelight would

work with, you know, any color

correction setup, any kind of theater.

And obviously, you know, if you don't have that

basic color management in place, you can't even

begin to do a DI.

So yeah, actually, when I joined FilmLight, I

didn't really know much about Baselight at all.

I said, oh, well, this Truelight

stuff is rock solid.

I've seen it before. I'm

happy to go out and utilize it.

This Baselight thing is kind of

interesting, but I haven't seen it.

I don't know what it does.

But as I was working with the other people at

FilmLight, I said, wow, there's actually, you know,

something really great here.

And, yeah, so Baselight was kind of the last

product to grow up and get widespread adoption.

And you hadn't done color

correction yourself prior to that?

Only technical kind of color correction.

So I knew how to drive a DaVinci 2K and how to

drive a Pogle, but mostly just for how to set up a

telecine the right way.

Another one of the

projects I worked on at Kodak,

I've actually forgotten the name of it, was

a product to, again, make the film

transfer process more consistent and set up a

telecine more like a scanner.

So whether it was a Spirit DataCine or an ITK

or whatever, we

developed some reference calibration film that

you'd put on the telecine, with specific targets

that you'd have to, you

know, adjust the knobs to hit.

And then you would know that your Telecine was

actually making kind of standard Cineon scans much

more like a film scanner that would

be used for VFX and things like that.

So I knew how to, you know, drive the controls to

do that kind of technical grading,

but never creative grading.

Did you ever get the bug for creative color

grading, or have you always felt more like the

technology side was who you are?

Yeah, I'm very cognizant of the fact that, you

know, being a colorist, a huge part of your job is

kind of client management and client communication.

You know, it's developing that

language with them about what it means when

they say they want something redder or brighter or

darker, that sort of stuff.

And I fully respect that.

And I kind of decided that that's interesting,

but it's not what I want to do.

I want to focus more on kind of

what's going on under the hood.

And, yeah, certainly like one of the great things I

think about BaseLight is that there's strong color

science underpinning all our

tools and everything we do.

And that's part of why, I think, we can explain why

the tools are the way they are: because we've

really thought it through. We don't just want to

make some knob that makes the image look zazzy.

And, you know, sometimes it

works and sometimes it doesn't.

It's actually like, why are we

doing the image processing this way?

Is this the best way to do that?

That's going to give you the most control over the

image and keep things looking natural and not just

like, you know, all electronic and weird.

So, yeah, I really like being a part of that

process of developing those creative tools and then

handing them over to people who can get inside

other people's heads and figure out how to actually

get the images they want out of it.

Yeah, it is such a different, like, discipline,

isn't it, to understand the technology that

underpins it as opposed to being

able to deploy it in a session.

But you've trained a lot of colorists.

I'd be surprised if you couldn't sit

down and grade something pretty well.

Yeah, I could.

I've done a few, like, indie projects and for

friends and stuff like that.

But, yeah, again, I just decided

that wasn't what I wanted for my career.

So when I spent some time with you and you trained

me, you showed me through the current features as

well as some of the new features that are coming

out in version six of Baselight.

And I just wanted to talk about a few of them

because it's actually quite fascinating to me

because prior to that, I hadn't had such a huge

kind of interaction with Baselight, and I was kind

of taken aback by just how, like you

said, thought-through the tools were.

So one thing which I really enjoyed were the tools

that worked in the perceptual color space.

Can you talk to me about those?

That was Base Grade, mainly, in version five.

Yeah.

So, I mean, the big evolution, I think, in

Baselight over the past few years has been focused

on tools that really emulate

the physics of light and the

way our eye reacts to light.

That's as opposed to our early tools: Film Grade tried to emulate what happens in a film lab when you adjust printer lights or, you know, vary the processing a little bit to adjust contrast.

And Video Grade emulates what happened in a telecine, which, of course, was originally kind of analog voltage adjustments.

So really just trying to work with the limited

tools they had at the time to adjust the color.

So with Base Grade, which debuted a few years ago,

that was really one of our first major attempts to

say, okay, we have this

color management in place now.

We can know what the original light

coming into the camera looks like.

We know how the human eye works.

So let's make a grading tool that acts more like a

camera does and more like the human eye does.

So instead of grading in some RGB space, whether that's, you know, based on the camera's sensitivity to RGB or your display's reproduction of RGB, let's say we'll just work in human perceptual space, because that's what color is.

It's the name we give to

how the human eye sees light.

So let's focus on that.

And so, yeah, we brought it into this human

perceptual space and we just made

the tools work consistently with the way

DPs and other filmmakers think about color in terms

of, you know, stops of exposure or color

temperature rather than just kind of like, you

know, arbitrary RGB numbers.

So.
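To make the "stops of exposure" idea concrete, here's a minimal sketch (my own illustration, not FilmLight code) of what a photographic exposure control does on scene-linear values: one stop is a doubling or halving of light, so the knob maps to a gain of 2 to the power of stops rather than to an arbitrary RGB offset.

```python
# Hypothetical sketch: adjusting exposure in photographic stops on
# scene-linear RGB, instead of nudging arbitrary RGB numbers.
# One stop = a doubling or halving of light, so the gain is 2**stops.

def adjust_exposure(linear_rgb, stops):
    """Scale scene-linear RGB by a number of stops (+1 doubles the light)."""
    gain = 2.0 ** stops
    return [c * gain for c in linear_rgb]

pixel = [0.18, 0.18, 0.18]          # mid-gray in scene-linear space
print(adjust_exposure(pixel, 1.0))  # one stop up -> about [0.36, 0.36, 0.36]
```

The point of the sketch is that the control's units are meaningful to a DP: "up a stop" means the same thing on set and in the grade.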

And so in order to do that, because there has been

a lot of research into how the eye sees, did you

take that research and translate it into like a

digital map of some kind?

Like, how did that happen?

Yeah, yeah.

I mean, there is a lot of research done on human vision, and there still is stuff that we don't really completely understand.

I think like we have a complete, you know, physical

model of the eyeball and how the cells in our retina work and all that.

But then there's this big squishy brain behind it.

And that's the part that you can't exactly, you know, just pick apart.

So there's still some stuff that happens in our

visual processing that we're not quite sure why it

happens or what part of the brain

it happens in and that sort of stuff.

But certainly for the initial part of the eye,

that's all very well modeled.

And so, yeah, we were able to take some academic research papers that model that behavior and

kind of trim the bits we didn't need to simplify it

to get it running in real time and that kind of

stuff to actually do the conversion

into the visual space and

then adjust the colors in there.

And so you were talking about color management and

color management workflows have been really

important to you throughout your whole career.

And one thing which I was interested in with Baselight was that you're always working color managed.

You can't get outside of that.

Right.

So within that color management system, you have

like another little bubble for this tool.

So it's working within another color space, but that's all handled automatically within Baselight.

Is that right?

Yeah.

So of course, the way the color management system works is that it first has to know what camera was used to acquire the image.

So most of the time for camera original footage,

you know, we'll know if it was shot on RED or Sony

or ARRI or GoPro or what have you.

So we've profiled all those cameras and can bring

them from that camera native color space into

whatever color space you want.

And yeah, for what I call our legacy tools, like Video Grade for lift, gamma, gain, or Film Grade, or stuff like that, you can choose

what color space you want to work in.

So you could say, you know, I'm primarily on an ARRI show, shooting Alexa.

So even though there's a few shots done on Canon

camera or a few shots done on RED, I can actually

bring them all into the ARRI

color space and grade with it there.

So my tools feel consistent.

But then beyond that, yeah, when you get to

Base Grade and some of our new tools like X Grade and Chromogen,

they work in their own internal color space, which

is that human perceptual space.

And so, yeah, that basically lets us emulate those real world processes, regardless of what

camera you shot with at the start.

In terms of the way that it looks, you know, it's

hard to describe it, talking about it.

But when you're actually using it as a colorist, I

was so, so happy with how like

mainly saturation was mapped.

If you saturate an image in Base Grade, you never get into those kind of electric colors that you can reach when you've got Rec.709 primaries controlling your saturation.

So it's definitely something that's worth every colorist, no matter what system you use, just jumping into the Baselight Look software, which is now available; you can just download it to learn and play.

It is worth jumping in and just seeing what Base Grade does and how it reacts, because it is a

slightly different experience when you're grading.

You know, you might be used to offset grading, but,

you know, to have these

controls like Flair and the ranges,

like you've got Bright and Dim and Dark and Light,

I think it is on the two panels.

I hope I got that right. Yeah.

They all operate just a little bit differently from

other tools we might be used to using and

especially the Flair control.

I mean, it's just worth having a play.

Well, thanks. Yeah, I mean, that's good to hear. It means we did our job.

So it is super good fun.

And you also mentioned another couple of tools that work in that perceptual color space, which are X Grade and Chromogen, these version six tools that are coming out, they're in beta at the moment.

Is that right? That's correct.

Can you tell me about those

because they are just something else?

Yeah, so X Grade is a tool that, again, kind of plots all the colors in a human perceptual space.

So it's not exactly the same as the space that Base Grade uses, actually, but it kind of plots the

colors out so that an equal amount of change in that tool will be perceived the same by your eye,

because we're, for instance, more sensitive to

changes in like blue and purple hues than we are in

green hues or for saturation the same way.

We're more sensitive to saturation in some colors than in others.

So that color space, first of all, kind of smooths everything out so that, you know, if you're making the same amount of change in X Grade, you're going to perceive the same amount of change with your eye.

And basically, when we built it, we looked at, you know, other tools which allow you to manipulate the full color space as kind of a mesh, to be able to, you know, basically just push colors around and mold them like clay.

And we found it was a really interesting way to

adjust colors, but also a really quick way to

destroy the image, because you can quickly get colors crossing over with each other, causing banding artifacts and tearing and noise and stuff like that.

So we said, how could we do this, you know, more

smoothly and more

consistently to get a good result?

And the first was moving it from an RGB color space

into this human perceptual space where everything

is evenly weighted for all the colors.

And then the second was rather than, you know, like

just manipulating a 3D lookup table directly, let's

make it so you have kind of areas of influence

almost like you are molding clay and pushing stuff

around so that, you know, when you push really hard

in the green and then you push, you know, the red

direction in another way, it kind

of pulls a little bit of it with it.

So you don't get that tearing, you're just

constantly kind of molding the colors around.
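As a rough illustration of that "area of influence" idea (an assumed sketch of the general technique, not FilmLight's actual algorithm), a push applied at one point in a perceptually uniform space can be given a smooth falloff, so nearby colors are dragged partway along instead of tearing apart:

```python
import math

# Illustrative only: an "area of influence" warp in an assumed
# perceptually uniform 3-component space. Pushing one color drags
# nearby colors along with a smooth Gaussian falloff, so neighbours
# move partially instead of crossing over and tearing.

def warp_color(color, center, offset, radius=0.3):
    """Move `color` by `offset`, weighted by its distance from `center`."""
    dist2 = sum((c - p) ** 2 for c, p in zip(color, center))
    weight = math.exp(-dist2 / (2.0 * radius ** 2))  # 1 at the center, ->0 far away
    return tuple(c + weight * o for c, o in zip(color, offset))

green = (0.2, 0.7, 0.3)
push = (0.0, 0.0, 0.2)                             # push greens toward blue
print(warp_color(green, green, push))              # the grabbed color moves fully
print(warp_color((0.25, 0.65, 0.3), green, push))  # a neighbour moves only partway
```

The Gaussian weight is what gives the "molding clay" feel: the influence fades continuously with distance, so there's never a hard edge between moved and unmoved colors.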

And yeah, it just ended up kind of being this really organic way to quickly do what would normally be secondary color correction.

So normally, if you wanted to make, you know, your

reds a little bit more orange and make your greens

more saturated and make your, you know, blues

darker, you'd have to do three different keys and a

bunch of windows and stuff to protect for exactly

what you want it to affect.

But with X Grade, you just kind of grab that part of the color space, push it the direction you want to go, do the same in other parts of the color space, and it basically automatically interpolates and smooths between them so that you're not getting any of those artifacts.

Yeah, I mean, I've tried pushing it every way you can imagine. And like you say, one point will always affect the others, and it'll protect you from going into crazy town.

Another feature that I really like about it is the

neutral line that runs through the center.

So it did take me a little while to get my head

around exactly what that was doing.

But it looks almost like

degrees Kelvin running through.

Yeah, exactly. We kind of highlight where the

standard black body white points are so that, you

know, if you do want to keep stuff more neutral, you can. Of course, there's no such thing as being truly neutral, because you're always neutral with respect to some white point.

So you can decide kind of what that white point is.

But we also have, yeah, kind of like an anchor so

that it keeps stuff that is

close to neutral more neutral.

So when you're pushing other stuff around, you're

not suddenly pushing, you know, your grays towards

red, your grays towards green and stuff like that.

So that was kind of another key component we

realized early on: normally you want to keep your neutral scale kind of grounded and be pushing the colors around it rather than shifting the whole image.

But you can take that pin out if you do want to go

crazy and push everything around.

You can do that, too. So it's also really nice to just have that representation on screen, because it can show, like, I was playing with it the other day and looking at, you know, I wanted to grab this warm color, and which direction do I want to push it in?

Well, if I push it down the neutral line, it's

cleaning up and becoming neutralized.

If I push it the other

way, it's getting more orange.

So, you know, for me, it's almost part of a map for me to know where I am in space as well.

And it's also got ranges, which I was having a play with and discovering the other day as well.

So you can do a little bit of work more so in your

dim zone or more so in your

bright zone if you switch it over.

So I think, like any of these tools, they start off looking quite simple, like, OK, it's a color warper.

And then you start to change a few parameters and

you're like, oh, wow, OK, this can really do some

pretty, pretty cool stuff in terms of look.

But yeah, definitely, you know, utilizing it, that's another thing which I really like about the tools in Baselight as a whole,

is that there are a lot of tools in the main

grading options that can save

you from going to secondaries.

So Hue Shift is one that I would use every time.

It might be my favorite tool.

Being able to alter the value of a hue and, you

know, brighten skin tone a little bit and perhaps

darken a background a bit rather than adding a

window for a face or darkening

off a background with a vignette.

You know, I can just do that

right there in the main page.

So, you know, is that something that you guys are

interested in particularly is

keeping away from the secondaries?

Yeah, that was definitely a thing we consciously

worked on, because any time you do a secondary,

you're literally like carving out a part of the

image, treating it totally differently and then

trying to slap it back into that original image.

So that's always going to be problematic.

That's why, yeah, any time you pull a key, you have

to go in and watch it through, check for noise,

smooth it out, blur it, you

know, throw a window on it.

So it's only affecting the

part of the image you want.

You know, there's all these kind of extra steps you

have to do that eats up a lot of time.

And again, it's just not the best thing for the

image because you're really now kind of taking

things down a different path.

And that's something that, you know,

ideally you should never have to do.

You know, if you're able to capture an image you're

happy with, you should be able to just kind of mold

it to what you want to do

without having to tear it apart.

So, yeah, we're not going to completely replace

secondary color correction.

There will always be those times where you do want to be super isolated with what you do, or you do have to attack a very specific part of the image where they weren't able to get light in on the day on set, or that sort of thing.

But for the vast majority of stuff, you know, if

it's lit pretty well, you shouldn't

have to carve up the image like that.

And not having to constantly go back and check, oh, is the key noisy here?

Oh, is it tracking through

the whole length of the shot?

Just saves a tremendous amount of time in color correction, because with these primary tools you quickly develop the trust that, OK, if I grade it well on this frame, that's going to translate well to all these other versions of the shot.

And you don't necessarily have to carefully QC them

all when you're setting the look, you just know

it's, you know, it's going to work.

So they're definitely time savers for sure.

I can see X Grade adding a lot of productivity to people's workflows.

Chromogen is another new tool in V6, which to me is just like its own beast.

It could be its own program.

Can you tell me a bit about Chromogen?

Yeah, so Chromogen is a look development tool.

So as opposed to X Grade, which is very much for kind of shot by shot grading, you know, adjusting each shot individually, Chromogen is a set of tools

to help you develop a look that you might apply to

a whole sequence or even to a whole show.

And we developed the tools again in a human

perceptual space so that when you're manipulating

colors, you're doing it based on sound photochemical and photographic principles and the way the human eye works.

And the tools are specifically designed in broad enough strokes that they do translate well across a variety of different shooting conditions and images.

So you can't be too isolated with the controls.

Like you can't do what you would do with a color

key or stuff like that specifically so that it will

hold up to a wide range of shots.

But what we found was that even with all the color

tools in Baselight, there are some color operations

which are often a part of look development that

aren't easy to emulate with current color tools.

So particularly when you think about like emulating

a film look, there's a lot of crosstalk that

happens between the color channels.

So like when you expose something that's green on

film, it's not just affecting the magenta dye

layer, which is where green is recorded.

It's actually bleeding into the

cyan layers as well, a little bit.

So we developed matrices and things like that which

can emulate that process on a technical level.
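A toy version of that crosstalk idea can be written as a 3x3 matrix, where each output channel is mostly its own input plus a small bleed from its neighbours. The coefficients below are invented purely for illustration and are not FilmLight's values:

```python
# Hypothetical sketch of channel crosstalk as a 3x3 matrix: each output
# channel is mostly its own input plus a small bleed from the others,
# loosely mimicking how exposing one film dye layer leaks into its
# neighbours. All coefficients here are made up for illustration.

CROSSTALK = [
    [0.94, 0.04, 0.02],  # R out = mostly R in, plus a little G and B
    [0.03, 0.92, 0.05],  # G out
    [0.02, 0.06, 0.92],  # B out
]

def apply_crosstalk(rgb, matrix=CROSSTALK):
    """Mix the input channels through the crosstalk matrix."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in matrix)

# A pure green input picks up a little red and blue after the mix:
print(apply_crosstalk((0.0, 1.0, 0.0)))  # -> (0.04, 0.92, 0.06)
```

Raising or lowering the off-diagonal terms is what would make the "film-like" bleed stronger or weaker, which is the kind of dial a look tool can expose artistically.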

But artistically, that wasn't something there was really a tool in the color corrector that could handle.

So we specifically wanted to make sure, OK, we can adjust those kinds of crosstalk effects between the color channels.

But then also just kind of where

do you put the bumpers on your look?

So like if you want to be able to know that the

look for this show has a certain red quality to it.

Like you always want to skew towards a certain red.

And that could be for a commercial where it's like

a particular brand color you

want to make sure you're staying on.

Or it could be just when you're developing a

feature film or TV show or something like that.

Like there's certain kind of

signature colors you want that look to have.

And that's something you can do shot by shot with

Hue Shift and tools like that to kind of

consistently push things towards the same

flavor of red or towards the same flavor of blue.

But Chromogen lets you do that as part of your basic look development, to say, OK, I want all my reds to look like this.

And I want my dark reds to be this way and my

brighter reds to be this way.

Or I don't want to allow my reds to be too

saturated, but I do want my

blues and greens to be saturated.

So you kind of can define the color

space on which you're going to grade.

And then once you've set that look for the show,

you're just kind of grading, doing your adjustment

shot to shot underneath that.

But the broad strokes of the look are then set.

But yeah, it did end up becoming almost a color

corrector within a color corrector because it's got

a whole bunch of tools for how you set up that

color space that you're going to work within or

that color palette really that

you're defining for your show.

But the idea is that these

are the stages, aren't they?

The stages that you have with all of the different, sort of, would you call them tools within the stages, or options?

Well, I mean, we could just as easily call them layers, but we didn't want to confuse them with the layers that are already within your main grading stack in Baselight.

So that's why we opted to call them stages.

And each one of those operators, I guess you could say, is just kind of a slightly different way of tweaking the color, and you can combine them in

different ways to kind of cover the

whole color space you want to cover.

Yeah, so I suppose, you know, where you might otherwise use a look or perhaps a LUT or, you know, create some curves that you're going to put across everything, you would use Chromogen instead.

And using those stages, insert different tools that are meaningful to the project you're on, like color crosstalk or contrast or...

There's a few that I'd never heard of before.

Like, can you talk me through, do you remember off

the top of your head what the stages are?

Because, and I'm putting you on the spot here, there were some that I was like, I wouldn't know what that was if I hadn't seen it here.

Well, yeah, some of them are named just because we had to come up with the terminology, which isn't something that regularly gets talked about.

So in some ways we're choosing the terms.

They're not, you know, industry standard terms.

I thought I'd missed a memo.

No, but yeah, a lot of it came out of, you know, for a long time in Baselight, we've had this look tool which allows you to apply preset looks.

So if you want a Kodak film look or a Fujifilm look

or a reversal as negative look or a bleach bypass

look, you could apply those.

But they were like lookup tables.

You know, you could turn them on or off.

You could vary the strength a little bit.

But that was it.

Like, if you really liked the film look, but you just didn't like the way the greens came out, you wanted the greens to be, you know, more yellow or less yellow or whatever.

It was kind of hard to tweak that look because

everything is all tied together in this one LUT,

basically that gets applied or not.

So with Chromogen, we're able to reproduce those same kinds of looks.

We can produce different film looks.

We can produce different photochemical processes or just, you know, other creative looks.

But now they're broken down into steps so you can

tweak just the part you want.

So you can say, yeah, I love the Kodak film look,

but I don't want my highlights to be warmer.

I want them to stay neutral.

Or I do want my greens to be a little more green

and a little less yellow.

So you can tweak just those parts of the look that you want, rather than it being kind of an all-or-nothing thing that you would get with a LUT.

And are they going to be presets?

Because I feel like if it was a preset, I'd be

reaching for it all the time.

That's actually the way I recommend most people introduce themselves to Chromogen: yeah, we have a set of presets.

Some of them match the existing looks in Baselight.

Some of them are totally new.

But yeah, I think that's certainly the way I learn and am able to wrap my head around things: often just by looking at examples of what works and then picking them apart and seeing how they got there.

So yeah, I definitely recommend people look at the presets and then just turn things on and off, adjust the knobs, see what each step is doing.

And then you start to

really wrap your head around it.

I mean, that's another one

that's well worth having a play with.

I know it's still in beta, but there are various

places that are running the beta that you may be

able to get your hands on it or, you know,

of course, there's a lot of material out there on the FilmLight website, you know, showing you what it's going to do.

And who made that one?

Whose baby was that?

It's Daniele Siragusano, who's one of our image pipeline engineers, based out of Germany.

He was really the chief architect on that working

closely with Richard Kirk, who's another one of our

color scientists in London.

So, yeah.

Wow. Okay. So I imagine that he's got a range of

resources that you could look at.

I seem to have seen a couple of things flow into my inbox; he's doing talks about it.

Yes. Yeah. So there's a number of videos already.

And certainly once version six launches, there'll be a bunch more resources out there to kind of explain what all those different steps in the tool do.

And you mentioned Richard Kirk as well.

And just a side note, if anyone is interested in

color science at all, which you probably are, if

you're listening to this interview, his book, Colour: Sense and Measurement, is well worth a read.

And I believe you can

actually get a PDF of it online.

So, yeah, there's information

on that on the FilmLight website.

And it's definitely a great read because, yeah, as I said early on, there are very few resources for color in our industry.

You know, a lot of them are focused on pre-press

and photography and stuff like that.

So it's great that he wrote a book that's very

focused on color

management for film and television.

And it's actually really readable.

Like I was surprised. I thought, oh, my goodness,

what am I getting myself into?

You know, I'm going to be hit with a bunch of

equations and my eyes are

going to glaze over real quick.

But it was actually a page turner.

So he's got a really good

kind of personality behind it.

I think it's not dry at all. So it's well worth checking out.

And I saw that recently you did some talking about

cloud grading, grading in the cloud.

Is that something that's new for version six?

Are there going to be new tools for that?

Because I haven't actually watched it, so I'd be

really interested to have a recap on that one.

Yeah, no, it's not new for version six.

So, yeah, we've been able to run Baselight in the

cloud for a little while now.

Initially, it was just the render engine.

So if you just wanted to render or transcode a

bunch of media in the cloud,

that's been available for a long time.

The new thing, which we partnered with Amazon on almost a year and a half ago now, was being able to run a fully interactive Baselight session in the cloud.

So all the storage, all the image processing,

everything was done in the cloud and you're just

getting the output streamed down to you.

And of course, the big challenge for color

correction is that it

needs to be super interactive.

If you're adjusting a trackball or a knob, you need

to see the change to the image right away.

If you're spinning a dial and then a second or two

later, the image changes, you're going to

constantly be like overshooting or undershooting or

not knowing what you're doing.

So the biggest thing we really had to overcome was getting that latency down.

And we've basically done it.

So we now have high quality image streaming with

very low latency out of the cloud.

Color grading in the cloud on its

own, I think, doesn't make much sense.

Like, you wouldn't do a production the traditional way and then push everything up to the cloud just to color correct it.

The economics and the time involved to do that don't make any sense.

But as studios are looking at doing more and more

of their work in the cloud already, there is a lot

of VFX work done in the cloud.

That's pretty standard.

But if you're also doing your editing and your

sound mixing and everything else, then it makes

sense to keep it in the cloud for color correction

because it's possible now.

And ultimately to the colorist, it

shouldn't really feel any different.

They shouldn't know the difference between whether they're working on a Baselight that is in the machine room down the hall or in some cloud data center hundreds of miles away.

And so that's what we worked on: just making it possible that you could work either way.

And that's pretty much all working.

So there's definitely, again, some economic things

you have to consider with how you deploy the

resources in the cloud and

the time to set it all up.

But, yeah, the technology is there and ready.

That's amazing. The idea that you'd just use a virtual machine to color grade seems like another world.

I mean, one thing which is unique to Baselight is that it's a turnkey system and you've got certain

configurations that you support and

you know that that is going to work.

So those are the ones that you sell. Is it the same in the cloud?

You found the configuration now

that works and that's the one.

Yeah. So it's still kind of preset configurations.

And a big reason for that is really just for the

support so that we know if there's something that's

not working right, it's on us.

Whether it's a hardware issue, a software issue, a

config issue, you know, you have one number to call

and we'll sort it out.

So I mean, that support is pretty incredible.

While I was in the office with you that week, I was

seeing the phone ringing all the time and you were

plugged into the community

and helping people in real time.

And the experience of using a Baselight is having that plug into this community of people where, you know, you're not talking to somebody who's selling it.

You're talking to someone who's actually made it

and who has a stake in it.

You know, like I remember actually an anecdote of

one of my friends saying, oh, I was complaining to

Bob about something to do with the control surface.

And he's like, oh, I actually did some of the electronics; now, what is it that's going on?

You know, can you talk to me at all about that

support aspect and just being there for people?

Yeah, I mean, that's a huge part of what we do and

a huge part of why I like working for Filmlight is

we are, you know, a relatively small company who's

focused on color correction

and focused on this industry.

And it is a very innovative industry.

So as we said, there's new cameras, new display

devices coming out all the time.

But also filmmakers are looking to push things in

different directions and different ways.

So we're constantly involved in new projects and

new things and new ways of doing things.

And yeah, so it's great that we can be a resource

there to filmmakers or, oh, hey, someone just

walked in the door with this camera.

I've never seen it before.

Is there anything I need to know, any kind of

caveats to what we could tell them?

Well, here's our experience

here that we've discovered.

And, you know, if you learn

anything new, let us know, too.

So sometimes we are acting as kind of a little bit of a middleman, to be able to get firsthand experience with new cameras and new display devices and then share that with other people who are working on them later down the line.

And also, like, you've got all of these backwards

compatible tools, like you've got

all of these legacy tools there.

So, you know, I imagine that it's not just the new

technology that you're supporting.

Like, if I was to unarchive something from 10 years

ago now and try to get my head around it,

especially if it was somebody

else's job, I think I'd go crazy.

So, you know, is that also something that you're

supporting people unarchiving old jobs as well?

Yeah, that's kind of one of the founding principles of the way Baselight was built: you can go back to any Baselight scene, no matter how many years ago, and upgrade it.

And it'll still work and produce the same image in the current release of Baselight.

So, yeah, I was actually just talking to someone a

few days ago who had an archival project from I

think it's 12 or 13 years ago, they wanted to bring

back online and they were able to.

So they were able to recover that job, see exactly what was done to the image, and then tweak it from there for the new pass that they wanted to do.

Wow. And is it true that they would be able to use the database's saved undos to go back and undo everything in that 12-year-old job?

Yeah, exactly. Again, that's just the way Baselight stores its project files.

It's basically a history of every key press and

every change you've done.

So you can literally hit undo and see every step

that was made to get to where you were.
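Conceptually, that's an event log rather than a snapshot: if you persist every change in order, any earlier state can be rebuilt by replaying fewer events. A minimal sketch of the idea (my own assumed illustration, not Baselight's actual project format):

```python
# Sketch of persisting the full edit history rather than just the
# final state, so any past state can be recovered by replaying the
# log up to some point. (Assumed illustration only.)

class GradeHistory:
    def __init__(self):
        self.events = []          # every change ever made, in order

    def apply(self, param, value):
        self.events.append((param, value))

    def state_at(self, step=None):
        """Replay the log up to `step` events (None = all of them)."""
        state = {}
        for param, value in self.events[:step]:
            state[param] = value  # later events overwrite earlier ones
        return state

h = GradeHistory()
h.apply("exposure", 0.5)
h.apply("saturation", 1.2)
h.apply("exposure", 0.8)
print(h.state_at())    # -> {'exposure': 0.8, 'saturation': 1.2}
print(h.state_at(2))   # "undo" one step -> {'exposure': 0.5, 'saturation': 1.2}
```

Because the log is append-only, "undo" is just replaying one fewer event, which is also why support can step backward through exactly what a user did.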

And that can be great for us in support too, because if someone kind of paints themselves into a corner, we can actually undo and see every step they took to get there.

So we can actually pick it

apart and see what was done.

So what does your typical day look like? Are you

fielding support calls a lot of the time?

Are you working on special projects?

Yeah, all of the above. I mean, I think I'm pretty

lucky in that I don't have a typical day.

Sometimes I'm doing training with colorists.

Sometimes I'm just kind of jumping in on support

calls, picking up the phone with

my background in color science.

Like I tend to focus more on the support calls that

are much more kind of color workflow oriented or

color calibration oriented,

where we have other people on the support team who

will focus more on technical issues integrating

with storage or things like that.

And then sometimes it's

just completely new projects.

I actually just got back; I was in Las Vegas last weekend for the premiere of Darren Aronofsky's Postcard from Earth at the Sphere in Las Vegas, which is this giant LED wall.

I forget how many stories it is, but it's like a 16,000-seat venue with a giant LED wall that completely surrounds your vision.

So there were some unique challenges in getting

that project over the finish line for color

calibration and also just dealing with the geometry

of the space and the screen and all that.

So yeah. Wow. Oh, can you tell me more about that?

What was your involvement in the sphere?

Because everyone's talking

about it with U2 playing there.

Yeah. Yeah. So some of it is under NDA, therefore I can't go into too much detail.

Of course. Of course.

But yeah, so the colorist for the first kind of traditional film that was produced for the Sphere was Tim Stipan, and Andre Rivas was the assistant colorist on that.

So we were supporting those guys and Sphere Studios, because they built a studio specifically to produce content for this venue.

Sphere Studios built their own camera, called the Big Sky camera, which had a fisheye lens on it to be able to capture an 18K by 18K source image to put into the Sphere.

So, yeah, it was making sure that the color from

that camera was coming into Baselight correctly,

and then developing the color profiles for the

venue so that when the image plays back on the

screen, it has all the right colors.

Ultimately, the playback in that venue is a 16K

by 16K image at 60 frames per second.

So it's a huge amount of data

being thrown up on that screen.

Obviously, you can't color

correct that in real time.
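To see why real-time correction is off the table, here is a quick back-of-envelope on the data rate. The bit depth and packing are my assumptions for illustration; the venue's actual pipeline is not public:

```python
def uncompressed_rate_gb_per_s(width, height, channels=3,
                               bytes_per_sample=2, fps=60):
    """Rough uncompressed video data rate in GB/s.

    Assumes 3 colour channels stored as 16-bit (2-byte) samples;
    the Sphere's real bit depth and compression are not public.
    """
    bytes_per_frame = width * height * channels * bytes_per_sample
    return bytes_per_frame * fps / 1e9

# A 16K x 16K, 60 fps stream under these assumptions works out to
# roughly 97 GB/s uncompressed, far beyond any real-time grading
# pipeline, which is why the team worked from proxies instead.
rate = uncompressed_rate_gb_per_s(16384, 16384)
```

A 4K proxy at the same frame rate carries one sixteenth of the pixels, which is what makes interactive grading feasible.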

So most of the time, Tim and the team were working

off of proxies just on a

standard Sony HX310 monitor.

So again, the color calibration, making sure that

that monitor matched the color of the venue, and

then also dealing with the spherical mapping,

to be able to look at the image in different ways

on that flat screen to get an idea of what it would

look like when you're

surrounded by it in the Sphere.
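The spherical-mapping problem he describes, previewing dome content on a flat monitor, comes down to mapping each flat pixel to a viewing direction. Here is a minimal sketch assuming an equirectangular (lat/long) source layout; the Sphere's actual projection format is not public, and the function name is illustrative:

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coords (u, v in [0, 1]) to a
    unit 3D viewing direction. Illustrative only: the real venue
    mapping (fisheye capture onto a dome surface) is more involved.
    """
    lon = (u - 0.5) * 2.0 * math.pi   # longitude: -pi .. pi
    lat = (0.5 - v) * math.pi         # latitude:  pi/2 .. -pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The centre of the flat frame looks straight ahead (+z); a preview
# tool would resample the source along such directions to simulate
# different views from inside the dome.
```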

So, yeah, a lot of fun and

fun new challenges in that.

Wow, I can't imagine.

I just had this image in my mind of them

grading and looking at the Sphere, just

standing out there in, you know, Las Vegas on the

Strip with a little desk in front of them.

Yeah, we hope to get to that

point somewhere further down the road.

But also, you know, of course, the Sphere

is now very much in demand.

There are U2 concerts in there.

They're playing the film regularly there.

So unfortunately it's not a space you can just kind

of take over for a few days and do what you want.

It's a live venue.

So, yeah, it's just about

building the workflow around that.

It seems like something that belongs in Dubai.

It's just such a bizarre, such a bizarre thing.

Was it interesting to see in real life?

What did it look like as a viewer?

Yeah, it's definitely a very immersive

environment, very unique.

You know, in some ways, it's an evolution of

previous dome projections like OmniMax.

IMAX still actually has OmniMax, which is

basically IMAX film projected on a dome, and, you

know, some planetariums will have

digital shows that they can project.

So in some ways, it's an evolution of that.

But because it's an LED wall, you know, there are no

optical artifacts or anything

you have to deal with in projection.

It's just the light directly coming at you.

It's much higher dynamic range, so it can get much

deeper blacks, much brighter whites.

And of course, the resolution is pretty insane.

So, yeah, when you're sitting in that venue and

it's completely filling your peripheral vision,

like any time the camera

moves, you feel like you're moving.

Yeah, it's just totally immersive that way.

And, yeah, Postcard from Earth

has shots from all over the world and all

these different environments.

So, yeah, when you're in the underwater

sequences and the waves are over your head and the

fish are down in front of you,

it really is, you know, much more immersive than a

traditional theatrical experience.

So it's really unique.

I'll definitely be heading to

check it out next time I'm over.

That's for sure. And you guys have a system that

can deal with files that large.

Yeah. So, I mean, that

actually was nothing new for Baselight.

We had always architected Baselight to be

resolution independent, so whatever

resolution image you throw

at it, you know, we can't

promise how fast it will be.

But it will at least, you

know, produce an image and work.

So, again, a lot of the real time

adjustments had to be done off proxies.

But, you know, those are 4K proxies.

So 4K is the proxy, and 16K was

ultimately what was rendered.

Wow. I bet that took a

while to render, but it's OK.

You don't have to tell me how long.

Well, look, I reckon I've probably covered most of

the questions that I had for you.

But I just find it super interesting what you've

been able to do in your career, and how much

everyday stuff that we take

for granted, you know, has actually been the result

of committees and companies

that you've worked in.

And I think really it's, you know, the people who

are doing all of this work in the background, making

sure that things really

look the best that they can,

who give us colorists the

ability to do what we do.

So I can't thank you enough for your contribution

to the craft, and just for being available and

having a chat and, you know,

being at the other end of the phone

for support. It's awesome.

So thanks a lot. Well, thanks.

Yeah, that's really nice to hear. And yeah, like I

said, if, you know, the creative process is

just able to go smoothly and you

don't even think about the technology,

then, you know, we've won.

That's really our goal.

So yeah, it feels a bit like that in coloring, too.

It's like if no one notices what

we've done, then we've done a good job.

So we're very self-effacing in this field.

Yeah. Well, thank you so much.

Peter Postma, I'm Kali Bateman from Mixing

Light, and I'll see you next time.