Cinema vs Blu-ray masters

As I am not doing any DI work for cinema releases, I have a question about the theory behind converting a cinema master for Blu-ray/streaming.

Let's say the cinema master is done on a 2.6-gamma projector with DCI white point and P3 primaries at 48 nits.

To my understanding of perception and gamma, the only thing needed to convert this to a "home" master would be to convert the white point and primaries, leaving gamma untouched, but from what I am reading this is not the case?
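Roughly what I imagined, as a hedged sketch (the function names are mine, and the identity matrix is a placeholder for a real Bradford CAT plus P3-to-709 primary matrix, which I'm leaving out):

```python
import numpy as np

def eotf(v, g):            # signal -> linear display light
    return np.clip(v, 0.0, 1.0) ** g

def inverse_eotf(l, g):    # linear display light -> signal
    return np.clip(l, 0.0, 1.0) ** (1.0 / g)

# Placeholder only: a real pipeline would put the chromatic adaptation
# and primary conversion matrix here.
M_P3DCI_TO_709D65 = np.eye(3)

def naive_cinema_to_home(rgb_dci):
    linear = eotf(rgb_dci, 2.6)              # decode the 2.6-gamma DCI signal
    linear_709 = linear @ M_P3DCI_TO_709D65.T  # white point / primaries only
    return inverse_eotf(linear_709, 2.6)     # gamma left untouched
```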

General wisdom here is:

Dim-surround-mastered content at gamma 2.4 "looks the same" on a 2.2-gamma monitor in a "living room" surround.

Which looks the same on a gamma 2.0 monitor in a bright-surround environment.

Which then translates to 2.6 gamma for dark-surround (cinema) content.

So the OOTF is relative to the adapted environment.

So what is transformed here, gamma/contrast-wise, when going from cinema to home or vice versa? And why is this done?

I've heard many are "changing ACES ODTs" from 2.6 to 2.4, but wouldn't that negate the point of surround luminance effects? The sRGB vs Rec.709 ODTs have the same thing going on, which always confuses me.

This is something we do all the time.

The answer to this is simple: we do a theatrical grade and a separate home-ent grade. The home-ent grade uses the theatrical grade as its starting point, with a colour transform into the colour space of the deliverable, but is then tweaked/adjusted. When using ACES or RCM, the change in Output Transform usually gets you to a decent starting place. You wouldn't just apply a transform and then export, though, as the look of a projected image vs a display is different enough that you need a colourist to deal with it. It's close, but not close enough.

Most home-ent deliverable packages we do are built around a P3-D65 PQ HDR master, with the SDR derived from it using Dolby Vision.

Some people are now grading HDR first, then using Dolby Vision to derive both the theatrical & SDR. I haven't been impressed by the theatrical analysis in the tests we've done, but HDR to SDR is amazing and by far the best way to do it.


For EOTF, I'd definitely do a gamma 2.6 to 2.4 conversion if doing SDR, but as mentioned we almost always do an HDR grade before the SDR, so it is gamma 2.6 to ST 2084.

The white point will not need to adjust if graded with a D65 white point, but obviously if you've graded with a DCI or D60 white point on a projector then the white point needs to adjust as well.

ACES really is excellent at this. RCM does a decent job too, to be honest.
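Just the transfer-function maths of that, as a rough sketch (not the trim pass, and the 48-nit PQ mapping is the simplistic direct version):

```python
import numpy as np

def st2084_inverse_eotf(nits):
    """Linear light in cd/m^2 -> PQ code value (SMPTE ST 2084)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

signal_26 = np.array([0.1, 0.5, 0.9])      # theatrical 2.6-gamma code values
linear = signal_26 ** 2.6                  # relative display light, 0-1

sdr_24 = linear ** (1 / 2.4)               # plain 2.6 -> 2.4 re-encode for SDR
hdr_pq = st2084_inverse_eotf(linear * 48)  # 48-nit cinema peak into absolute PQ
```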


I think that's my main question: why are we changing the EOTF? That seems wrong from what I am reading about OOTF and surround luminance perception.

I understand the practical use of switching ACES ODTs or using the new Dolby trims, but I want to understand it in a more basic way.

In a dark room my eyes adapt and see "less contrasty", thus the projector applies a 2.6 gamma.

In a dim room my eyes see "more contrasty", thus we use a 2.4 gamma.

If I convert the signal from 2.6 to 2.4, making the image darker/more contrasty, wouldn't that totally defeat the purpose of the whole system?

It all goes back to the Rec.709 EBU papers and what Charles Poynton writes about this stuff. It's impossible to A/B test it, of course, but I have always been sceptical about the accepted understanding of the relationship between surround luminance/EOTF and display luminance.

If in general a change from 2.6 to 2.4 in the signal, or vice versa, is used to transfer the look of something from dim to dark surround, that would suggest that surround luminance does not have the impact it's generally believed to have.

Which in turn would invalidate Apple's claim that an OOTF of 1 (gamma 2.0) is correct for bright-surround viewing.

You’re going down a rabbit hole here, my friend.

This is why there are calibration specifications. BT.1886 exists so they can specify both the black and white points as well as the viewing environment. When all these settings are correct, the curve essentially matches gamma 2.4. If the environment changes, though, the gamma curve will measure differently. You can't account for all the different ways people will be viewing: on their phone in the middle of the day, or on a 65” TV in a lounge room at night with a warm white lamp in the corner. So you have to work to a specific viewing environment when deciding what specifications you are calibrating to. If you've ever seen a home theatre tech forum, there are a lot of enthusiasts trying to get the lighting conditions, viewing angles, distances, etc. correct to try to get their viewing environment as close to spec as possible.
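To be clear on why the black and white points matter, here's a sketch of the BT.1886 reference EOTF formula (the 100-nit / 0.01-nit defaults below are just common calibration targets, not mandated values):

```python
def bt1886_eotf(v, lw=100.0, lb=0.01):
    """Video signal V in [0,1] -> luminance in cd/m^2 (ITU-R BT.1886)."""
    g = 2.4
    n = lw ** (1 / g) - lb ** (1 / g)
    a = n ** g                      # gain set by the white level
    b = lb ** (1 / g) / n           # lift set by the black level
    return a * max(v + b, 0.0) ** g

# With a zero black level this reduces exactly to a * v**2.4; a non-zero
# black level lifts the shadows, so the measured "gamma" changes.
```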

Having both a calibrated theatre and finishing suites available: gamma 2.6 looks better projected, whilst gamma 2.4 looks better on displays. I understand your thought process, as it seems counter-intuitive, but due to the difference in how each technology turns your data into an image, a projected image at 2.4 loses a lot of definition in the blacks due to light bleed. Gamma 2.6 looks best for projection. Gamma 2.4, on displays that can produce brighter images, makes better use of the range.

So your theory is not wrong, but since you can't control the environment, or even the technology (the EOTF on an OLED behaves differently to gamma on LED/LCD, even), they set a specification that takes as much into account as possible and picked the best EOTF for that viewing environment.


“They” being SMPTE of course.

Just to make sure we are talking about the same thing:

EOTF would be the inherent transfer function of the display or projector.

What we are changing when we do a 2.4-to-2.6 conversion on the image data, with ACES for example, would be an OETF change, or basically changing the "virtual camera".

So what I am saying is:

The EOTF is dictated by the display/projector: 2.4 for dim surround, 2.6 for dark surround, 2.2 for "office" surround and 2.0 for bright surround.

The signal in all cases should be encoded using the same OETF.

Let's say you take a Rec.709 camera. The camera would have an OETF exponent of roughly 0.5; a 2.4-gamma monitor would thus create a gamma shift, or system gamma, of 1.2 (1/0.5 = 2, and 2.4/2 = 1.2), as the OETF is not the inverse of the EOTF.
If we introduce an additional gamma shift into the signal by grading/conversions, we change the OETF and thus the OOTF.
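Putting my exponent arithmetic in one place (ignoring the Rec.709 linear toe segment):

```python
OETF_709 = 1 / 1.961   # approximate end-to-end Rec.709 camera exponent

for eotf in (2.0, 2.2, 2.4, 2.6):
    ootf = OETF_709 * eotf   # scene light -> display light exponent
    print(f"display gamma {eotf}: system gamma (OOTF) ~ {ootf:.2f}")

# Display gamma 2.4 gives ~1.22, the classic dim-surround boost;
# 2.6 gives ~1.33; 2.0 gives ~1.02, i.e. nearly OOTF = 1.
# Re-encoding the signal changes OETF_709 and therefore the OOTF.
```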

A little visual example of what exactly baffles me …

This image shown on a 2.4 gamma display

Would look exactly the same as this image on a gamma 2.6 display

Would look exactly the same as this image on a gamma 2.2 screen

If all displays are side-by-side in the same environment

But according to SMPTE and all of those guys this would be incorrect, as there SHOULD be a gamma shift due to the different environments. That makes a conversion between 2.4 and 2.6 inherently wrong: it should only be the EOTF of the display that actually changes the light output.

I am totally ignoring black points and such, and yes, there is absolutely more to it, and there is a huge visual difference between a 200-inch projector screen and a 24-inch LCD, no doubt!

Just to show some more visual examples: Dolby does the same thing, making the image "brighter" for cinema projection.

Dolby 100-nit trim (from PQ master)

Dolby 48-nit P3/D65 trim

This is exactly the point, though. You are changing your image to be correct for the EOTF of whatever standard the display/projector is set to. Projectors are gamma 2.6 as that is the standard set by SMPTE for a specified viewing environment. BT.1886 is a standard set for a particular viewing environment. ST 2084 is a standard set for a particular viewing environment. The point of standardisation is that material displayed in a differing viewing environment will have discrepancies, but all material viewed in that environment will have the same discrepancies.

Basically, a group of engineers has walked into a room, decided how the room should be set up, decided what the best viewing setup is for the type of display in that environment, and set a standard. Those engineers have decided that the EOTF for Rec.709 should be BT.1886. A bunch of Dolby engineers have sat in a room and developed the PQ EOTF as they believe that is best. A bunch of BBC engineers have sat in a room and decided upon HLG as the best EOTF for HDR. So you adhere to those standards because that's what everyone else is doing, so that in the same conditions everyone's material should look how it is supposed to look.

You cannot correct for different viewing environments, what mode their TV or projector is set to, or whether they turn the saturation or contrast way up. So a standard needs to be set and adhered to. It is as simple as that.

I’d have to say that I think SMPTE have got it right in regards to 2.4 for TV. We have a Dolby monitor calibrated for both gamma 2.6 & 2.4, and 2.4 looks better for displays. I have also seen a projector calibrated for both 2.4 & 2.6 for cinema; 2.6 looks better.

So you can argue against a standard, I guess. I think 4,000 nits for HDR is overkill, for example, but they are standards, so you work to them.

So when an individual projector or TV is set to an EOTF that differs from the standard, this should only be to compensate for how the viewing environment is affecting the gamma measurement.

If you left the EOTF the same for the different calibrations, then the images would look more different between the displays. Remember, changing the EOTF is counter-intuitive: it works opposite to what happens on the display/screen.

A good way to think of it is when you see HDR on an SDR-calibrated screen: it looks flat. Or when you view SDR on an HDR-calibrated screen: it looks super saturated.

Changing the gamma is counter intuitive between projectors and displays in this exact same way.

Well, you shouldn’t do it in a vacuum, and you also can’t compare them side-by-side on the same display.

The theory is that in the entire pipeline, at each interface, there's an agreement about what you're handing off and what the other side is expecting.

The projector is calibrated based on the expectation to receive a 2.6 gamma file / signal.
The reference monitor in the color suite for SDR is calibrated to receive a 2.4 gamma file / signal.
The desktop monitor in a bright office space should be set based on the expectation of a 2.2 gamma file, or in the case of Apple sometimes even a 1.8 gamma file.

You would have to watch the file in each of these environments and then mentally compare your perceptions of these images. As it would be hard to compare directly, you would want to think more about ‘Did the image look flat or rich in contrast and color?’

If everything was done correctly, you should have the same general recollection about how rich in color and contrast the image was regardless of environment. Of course it’s very difficult to fully separate what you saw from the surroundings.

Going back to the gamma: it's whatever the distribution channel specifies, based on how their end of the pipeline is set up and what gamma they expect in order to reproduce it properly.

You can take a look at the IMAGO photon path diagram as an attempt to capture this properly: Photon Path

Daniele Siragusano from FilmLight was the main author of this; he's Baselight's workflow and image engineer.


@allklier Thank you for saying what I was trying to communicate in a much more concise way!


See, this is where stuff gets weird for me, and I think we are talking about different things here.

If I have a Rec.709 camera signal, that is encoding scene light by definition with roughly 0.5 gamma (1/1.961),

and if I view this on a gamma 2.4 monitor (SDI from the camera into the monitor), I get the correct response for a dim (5-10 lux) surround viewing environment; that's what Rec.709/BT.1886 is based on.

As 2.4 is NOT the inverse of 1/1.961, we have a mismatch: this is the OOTF or system gamma (~1.2).

(I'll definitely read through the Photon Path, as that's exactly what matters here: the relationship between scene light and display light.)

So now the idea is to adjust this OOTF for different viewing environments and display luminances.

The brighter the surround, the more "contrasty" our perception, so we need to adjust the display gamma (or the encoding gamma, either/or) to give us a lower total OOTF/system gamma.

So when we move content from a dim-surround mastering environment to an "office" surround, the usual approach of NOT messing with the signal in any way is generally accepted as OK, as long as we change the monitor EOTF from 2.4 to 2.2 gamma, the inherently different OOTF then helping with surround compensation. (There are more advanced ways to deal with this, but we don't have that stuff yet.)
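A toy single-channel example of the two competing philosophies (my own illustration, black point ignored):

```python
v = 0.5   # a code value mastered for a 2.4 display in a dim surround

# (a) What I'm describing: leave the signal alone; the office display
#     is simply calibrated to gamma 2.2.
light_a = v ** 2.2

# (b) "Encode for the target": re-encode the signal so the 2.2 display
#     reproduces the SAME light the 2.4 display showed.
v_reencoded = (v ** 2.4) ** (1 / 2.2)
light_b = v_reencoded ** 2.2          # == v ** 2.4 exactly

# (a) comes out brighter and flatter: that IS the surround compensation.
# (b) preserves the light exactly, so no compensation happens at all.
print(light_a, light_b)
```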

Leaving the signal alone like this directly contradicts the idea that we need the "right encoding" for what a "display expects": HD video has to be encoded as 1/1.961.

Which brings me to cinema: projectors have a gamma of 2.6, which in turn makes my Rec.709 1/1.961-encoded camera signal look "even more contrasty", thus adjusting for the difference in surround luminance between a dark cinema and a dim grading suite…

Apple, for example, uses an OOTF of 1 for bright-surround viewing of Rec.709 content by default; they assume a Mac user is in a bright room, confirming what I am rambling about.

My whole point is that the response of different-gamma monitors should NOT be the same in the same room, as that's inherently not the point of the video system. But these DRTs and colour management systems like ACES and Dolby are doing exactly that, which I find weird, as it contradicts all these things that the BBC and ITU have written for years now.

Also, the more I read the old specs for sRGB and 709, the more they confirm what I am saying: an inherent 1.2 system gamma for dim-surround viewing.

If we reproduce exactly the same light output in a grading suite vs an office environment vs a dark cinema, would that not be MORE wrong? And isn't that exactly what "encoding to different targets" does?

So I wonder which is right? What works best? I've been going back and forth between my 2.6 @ 48-nit projector and my grading suite, and honestly the shift is so minor it's hard to say what works better; the difference between OLED and projector is so great it probably doesn't matter and this is purely philosophical :joy:

Also adding this, as this whole HLG OOTF paper is what led me here.

It definitely still is the same thing. This sounds a bit like the argument between HLG and PQ.

PQ will have the same result (or as close as a particular display can replicate) in the same room under the same conditions. HLG does not, as the black and white points are optimised to the limits of the display and the gamma curve changes to respect this. That's why Filmmaker Mode works with a system such as Dolby Vision but doesn't with an HLG gamma system. Two fundamentally different approaches. You are in the HLG camp's way of thinking, whilst BT.1886 & ST 2084 are a more modern way of thinking about things. For many years, Rec.709 didn't even have a specified gamma setting.

I understand your thinking, and I read a white paper some years ago that explains your thinking and why they moved to standards for gamma as well as gamut. I believe it was a SMPTE white paper actually; I have had a quick search to no avail. Once again, the answer all comes down to standards. You may not necessarily agree with a standard, but essentially a bunch of SMPTE engineers have gotten together in a room, decided what they believe are the optimal settings for a system and viewing environment, and documented those, turning them into a standard.

It all still boils down to what the monitor expects to be receiving, and all displays have been designed, and have settings, to reflect this. Any display-referred colour space has specifications that you need to adhere to. You set your output transform to match your display's calibrated monitor space for this same reason. You seem to be thinking the behaviour of scene-referred vs display-referred colour science is the same when it is not. I know you already understand this, but you can't treat them the same way. From what I am reading, you are trying to apply the principles of a scene-referred colour space to those of a display-referred colour space.

But this would in turn mean that any display/TV set to gamma 2.2 would inherently show a wrong image when playing back a Blu-ray, not taking into account the difference in surround luminance? What I am saying is: it's the correct image for an office surround but the wrong image for a dark surround.

I would love to see that white paper, as I fail to see any relation to scene- vs display-referred here. What I am saying is that an OOTF of 1.2 from scene light to display light is correct for dim-surround viewing, no matter what is being done with the signal in between. Removing camera gamma into linear and then applying a display gamma without an OOTF is what Apple does by default, and everyone loves that…

So yes, please give me sources so I can see the light and embrace all the color transforms as I did before I was taught about this OOTF stuff, which makes me question everything.

Also, both PQ and HLG have concepts of this system gamma/OOTF where the encoding mismatches the display EOTF.

I believe for PQ the OOTF is also ~1.2, while for HLG it's variable as per the paper above. Not 100% sure what Dolby IQ is doing, but it's probably doing just that.
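As I read BT.2100, the HLG reference system gamma is explicitly a function of the display's nominal peak luminance (sketch of the published formula; it's stated for peaks of roughly 400-2000 cd/m², with an extended form outside that range):

```python
import math

def hlg_system_gamma(lw_nits):
    """ITU-R BT.2100 HLG reference OOTF gamma for nominal peak Lw."""
    return 1.2 + 0.42 * math.log10(lw_nits / 1000.0)

print(hlg_system_gamma(1000))   # 1.2 at the 1000-nit reference condition
print(hlg_system_gamma(500))    # ~1.07 on a dimmer display
```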

And I understand the "standards" that are made for a particular display in a particular room, but none of them say I should encode my images with the inverse of the display EOTF; they all say I should NOT do that, hence why I am so confused. (OK, actually sRGB has an OOTF of 1, but only because its reference environment is 64 lux, which is insane?!)

If I use the ACES sRGB ODT on an sRGB display

vs the ACES Rec.709 ODT on a Rec.709 display,

they will look the same, thus giving both images an OOTF of 1.2.

But they should not: sRGB expects an OOTF of 1.09 from scene light and Rec.709 an OOTF of 1.2, so they should render differently :sweat_smile:
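A toy sketch of why they look the same: if both ODTs encode the same rendered display-linear image through each display's inverse EOTF, each display just decodes back to identical light, so any OOTF difference the two specs imply is gone (single channel, my own illustration):

```python
def srgb_encode(l):   # IEC 61966-2-1 inverse EOTF
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def srgb_decode(v):   # IEC 61966-2-1 EOTF
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

rendered = 0.18   # display-linear value out of the shared ODT tone curve

on_srgb_display = srgb_decode(srgb_encode(rendered))
on_709_display = (rendered ** (1 / 2.4)) ** 2.4    # inverse 2.4, then display

print(on_srgb_display, on_709_display)   # both ~0.18: identical light out
```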

arghhh.

Let me quote someone much smarter than me:

" There is no boost required for viewing the natively bright surround rec.709 content (it is bright surround which is why it requires the 1.22 boost for dim surround viewing, consider the environment outside of Philo Farnsworth’s barn) so we are providing the correct response for this environment. Further use of a reference display response in the bright environment is actually wrong."