Cinema vs Blu-ray masters

The way light is produced by a bulb is different from how it is produced on an LED, which is different again from how it is produced on an OLED.

The standards are trying to come up with the best approach for the medium.

The Judd offset exists because of the way OLEDs deal with white point. It is a standard for the tech (even though most people I know hate the Judd offset).


To be fair, I assumed the gamma used on a projector is directly related to the dark environment it is used in.

So you are saying an LED wall instead of a projector in a cinema would NOT use gamma 2.6?

I don't really see how a projected mid-grey colour chart would look any different from a different emitter?

from the sRGB specs:

Viewing Gamma

The reason that a viewing gamma of 1.125 is used instead of 1.0 is to compensate for the viewing environment conditions, including ambient illumination and flare. Historically, viewing gammas of 1.5 have been used for viewing projected slides in a dark room and viewing gammas of 1.25 have been used for viewing monitors in a very dim room. This very dim room value of 1.25 has been used extensively in television systems and assumes an ambient luminance level of approximately 15 lux.
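The figures in that excerpt fall out of treating the pipeline as two stacked power laws: viewing gamma is just the product of the encoding exponent and the display exponent. A minimal sketch; the split into 0.45 (camera) × 2.5 (CRT) is my assumption, only the products appear in the spec text:

```python
# Sketch: end-to-end ("viewing") gamma for a simple power-law pipeline is
# the product of the encoding exponent and the display EOTF exponent.
# The 0.45 / 2.5 factorisation is an illustrative assumption.

def viewing_gamma(encoding_exponent: float, display_exponent: float) -> float:
    """Overall scene-to-screen gamma for stacked power laws."""
    return encoding_exponent * display_exponent

print(viewing_gamma(0.45, 2.5))  # 1.125 -> sRGB's stated viewing gamma
print(viewing_gamma(0.5, 2.5))   # 1.25  -> the "very dim room" TV value
```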


Just saying, I find this highly interesting, as it's a big debate right now: how to design ODTs, which ODTs to use for what, and so on.

I find it interesting that most people say changing ODTs to the appropriate display EOTF works well, giving matching OOTFs across displays in their respective reference environments.

There are many people saying the same with 2.4 and 2.2 gamma monitors.

And so I believe the missing piece is the massive difference in display luminance that the old standards have not had to deal with in practice.

Under “viewing conditions” they talk about how their DRTs have compensation built in versus a plain colorimetric approach; this seems a lot smarter than just changing ACES ODTs.

Probably not. It all depends on the colourspace of what is being used on the LED wall. What I would say though is an LED wall in P3D65 gamma 2.6 would look different to a projected P3D65 gamma 2.6 in the same environment, even with blacks and whitepoint calibrated identically.

And I am saying that potentially an LED wall calibrated to P3D65 gamma 2.4, being fed a P3D65 gamma 2.4 signal, may look closer to the projection than a P3D65 gamma 2.6 source being fed to a gamma 2.6 calibration on the LED wall.

That I find highly interesting and something I would love to test!

Oh, and one thing I got wrong:

ACES ODTs from 2.6 to 2.4 DO have a surround compensation built in.

But 2.4 to sRGB does not (as they assume the same surround, at least in ACES).
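For reference, that dark-to-dim step in the ACES 1.x ODTs is tiny: luminance is raised to a power of roughly 0.9811 in XYZ, with X and Z scaled by the same ratio so chromaticity is preserved. A minimal sketch; the constant is assumed from the ACES ODT common CTL (`DIM_SURROUND_GAMMA`), and the function name is mine:

```python
# Sketch of the ACES 1.x dark-to-dim surround compensation: Y is raised
# to ~0.9811 and X/Z are scaled by the same ratio, preserving chromaticity.
# The constant is an assumption taken from the ACES ODT common CTL.
DIM_SURROUND_GAMMA = 0.9811

def dark_to_dim(X: float, Y: float, Z: float):
    """Return XYZ rescaled so luminance becomes Y ** DIM_SURROUND_GAMMA."""
    if Y <= 0.0:
        return X, Y, Z
    scale = Y ** (DIM_SURROUND_GAMMA - 1.0)  # ratio of new to old luminance
    return X * scale, Y * scale, Z * scale

# Mid-grey gets a slight lift when going from a dark to a dim surround:
print(dark_to_dim(0.18, 0.18, 0.18)[1])  # ~0.186
```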

But a “stupid” change from 2.6 to 2.4 gamma in Resolve when doing the same would probably not yield good results?

Throw a display into the same room as your projector and try to calibrate it as close as possible to the same spec. It won’t look the same.

That's fair, but will a display set to 2.6 or 2.4 gamma look closer to my projector? :sweat_smile:

If the display is being fed a gamma 2.4 signal and calibrated to gamma 2.4 then it will potentially look closer than if it was calibrated and fed 2.6.

I mean, it depends on how that signal is generated and to what target.

Which boils back down to “how are we changing the signal to get the same visual impression across those different monitoring environments?”

Baselight and ACES both introduce a surround compensation into the different target display rendering transforms, which is basically what not touching the gamma does, to an extent (it is more advanced, but it follows the same principle of having a different OOTF for each display + environment).

So a 2.6 DRT made for a dark surround vs a 2.4 DRT made for a dim surround would result in different visual impressions on the respective displays if they are in the same environment, which is what I think is right.

Just doing a colorimetric conversion, i.e. using 2.4 instead of 2.6 as the DRT with no further surround compensation, would yield the same visual impression on the respective displays if they are in the same environment, which I think is wrong.
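The colorimetric case is easy to check numerically: re-encoding a 2.6 signal for a 2.4 display preserves the emitted light exactly, so side by side in the same room the two displays match, while feeding the signal unchanged does not. A sketch under simple power-law EOTFs; the helper name is mine:

```python
# Sketch: a purely colorimetric 2.6 -> 2.4 re-encode preserves displayed
# linear light, so both displays emit the same image and any difference
# in impression can only come from the surround.

def reencode(code_value: float, src_gamma: float, dst_gamma: float) -> float:
    """Re-encode a code value so the target EOTF emits the same light."""
    light = code_value ** src_gamma        # light the source display emits
    return light ** (1.0 / dst_gamma)      # code value for the target EOTF

v26 = 0.5                                  # some code value in the 2.6 master
v24 = reencode(v26, 2.6, 2.4)

print(v26 ** 2.6)  # light from the 2.6 display
print(v24 ** 2.4)  # identical light from the 2.4 display
print(v26 ** 2.4)  # same signal fed unchanged: noticeably brighter
```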

That we can, but the point of difference is that I can appreciate the reasoning behind the approach of ACES, RCM & SMPTE towards colour science: that you can’t correct for a viewing environment, so you need to specify one as a standard, and these adhere to that standard. I think your bone of contention is that the variety of viewing conditions needs to be taken into account, which the standards do not (apart from HLG, which is somewhere in between the two).

So within the constraints of there being a specification that everyone is working towards, I disagree with what you are saying (the maths works), but I can also appreciate that you are trying to correct for different viewing conditions, which I do not think is so straightforward.

I actually do not want to correct for viewing conditions outside of reference; that is misunderstood.

For me it's all about the logic behind transferring images between the few standards that we have, always combining monitor + surround, as they belong together.

Broadcast says:

Don't change the camera signal; an sRGB monitor will render the same source values in its reference environment visually similar to a 2.4 monitor in a dim surround and to a projector with a 2.6 gamma in a dark surround. The whole point of different display gammas IS the surround compensation.

What I assumed cinema was saying:

We change the camera signal to get an absolute colorimetric match between all the different displays, disregarding surround luminance, so if all displays are in the same room they all look the same.

what I figured out:

Good rendering transforms that are used have a surround compensation and do MORE than just a colorimetric conversion; it's just not inherently obvious if you don't work in Baselight.
Which is an advanced way of doing what broadcasters are doing, but it still follows the same principle of surround compensation.
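The broadcast principle above can be sketched numerically: feed one unchanged camera signal to displays with different EOTF exponents, and the resulting OOTF stiffens as the display gamma (and the darkness of its intended surround) increases. The 1/2.2 power-law camera encode is an illustrative assumption:

```python
# Sketch of the broadcast principle: one unchanged signal, three display
# gammas. Mid-grey renders progressively darker as the display gamma
# rises -- that OOTF difference is itself the surround compensation.
# The simple 1/2.2 power-law camera encode is an assumption.

signal = 0.18 ** (1.0 / 2.2)  # encoded 18% grey, ~0.459

for display_gamma, surround in [(2.2, "bright-ish (sRGB)"),
                                (2.4, "dim (BT.1886)"),
                                (2.6, "dark (cinema)")]:
    light = signal ** display_gamma
    print(f"gamma {display_gamma}: mid-grey -> {light:.3f}  ({surround})")
```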