sRGB Encoding vs Gamma 2.2

I had thought sRGB was gamma 2.2, but you get very different results when building a custom colour transform with apply_sRGB_encoding versus apply_2.2_gamma.

Also, and maybe this was just me being naive, but this is what I thought would happen: if I take a Rec.709 timeline and export a QuickTime tagged 1-1-1 (no ColorSync button pressed), and then take that same timeline and export it with a custom transform of remove_2.40_gamma followed by apply_2.2_gamma, tagged sRGB Display so that it comes out of Flame as a 1-13-1 QuickTime, shouldn't they look the same? Isn't that the whole point of QuickTime tagging?

EDIT: when I compare front and result of the colour transform before exporting, they are identical, but only when it's set to apply_sRGB_encoding. However, the two QuickTimes do not visually match regardless of whether I use apply_2.2_gamma or apply_sRGB_encoding.

I looked into this around the time I was looking into NCLC colourspace tagging for QuickTime.

The first thing I found is that the gamma for sRGB and Rec.709 isn't consistent. It gets simplified/summarised as 2.2 or 2.4, but it is subtly different. Don't ask me. Ask the colour scientists.

What I read: the overall gamma of sRGB is approximately 2.2, whereas Rec.709 uses the BT.1886 gamma curve, approximately 2.4.
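For anyone curious where that subtle difference actually lives, here is a minimal Python sketch (the function names are mine, not Flame's) comparing the piecewise sRGB encoding from IEC 61966-2-1 against a pure 2.2 power law. They track closely through the mids and highlights but diverge in the shadows, where sRGB switches to a linear segment:

```python
def srgb_encode(c: float) -> float:
    """Piecewise sRGB OETF per IEC 61966-2-1 (linear input in 0..1)."""
    if c <= 0.0031308:
        return 12.92 * c  # linear toe for near-black values
    return 1.055 * c ** (1 / 2.4) - 0.055

def gamma22_encode(c: float) -> float:
    """Pure 2.2 power-law encode (linear input in 0..1)."""
    return c ** (1 / 2.2)

if __name__ == "__main__":
    # Compare the two curves at a few linear-light values.
    for c in (0.001, 0.01, 0.18, 0.5, 0.9):
        print(f"{c:>5}: sRGB={srgb_encode(c):.4f}  pure2.2={gamma22_encode(c):.4f}")
```

Near black the gap is large: a linear 0.001 encodes to about 0.013 in sRGB but about 0.043 with a pure 2.2 power law. That shadow mismatch is consistent with apply_sRGB_encoding and apply_2.2_gamma giving visibly different results.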

The other thing I did, before the ColorSync option appeared in Flame, was to build a LUT that took my Rec.709 media, removed the gamma and made an sRGB-gamma version. I wanted to see if this would be a better way of presenting to clients.

This conversion made the media look washed out on my Flame, but then I am working with Rec.709 footage and my monitor is not shifting from Rec.709 to sRGB to compensate, regardless of how I tag it.

The real test is how my QuickTime looks on a Mac. I used to export my client WIPs without ColorSync compatibility, but now that we have it I can compare three exports on a Mac using QuickTime Player.

The one on the left (1-1-1) is not ColorSync compatible and is the traditional way I send my WIPs to clients. I know it is wrong in terms of gamma, but stay with me.

Middle is my newly created sRGB version. I have actually changed the look of my pixels, not just tagged them differently, and on export it has been tagged 1-13-1. It looked different on my Flame, my Rec.709 monitor being unable to correct for the gamma shift.

The one on the right is my Rec.709 WIP being given the correct ColorSync treatment (1-2-1), getting a gamma of 2.4 injected into the NCLC tagging.

When I view them in QuickTime I get this:

Top is my regular export with no ColorSync: (1-1-1) Rec.709 being displayed incorrectly using Apple's interpretation of the Rec.709 display gamma (~1.95).

Middle is my conversion to sRGB (1-13-1).

These two are close but no cigar.

Bottom is my ColorSync-compatible export (1-2-1), with a gamma of 2.4 injected into the NCLC tagging.

It looks wildly different to both other versions. I would have expected it to resemble the sRGB version but it doesn’t.

Now, is this a better representation of the Rec.709 version I see in my Flame suite? Maybe. It depends on how we look at it, in what environment, and how bright my client has their monitor.

At this point my enthusiasm melted and I decided to keep doing it the way I have always done it and to keep answering the same questions I always get about it looking different on their monitors.


It’s non-trivial and a minefield.

NCLC tagging has become more available and common, which has helped. It doesn't actually transform your colors; it just tells the player precisely what encoding was used, so the player can pick the right transform between what it reads and what it knows (or assumes) the display to be set to. This all came about because, in its infinite wisdom, Apple/QuickTime made some quirky assumptions in the absence of better information.
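For reference, the three NCLC numbers are the colour primaries, transfer characteristic, and matrix coefficients code points from ITU-T H.273. A small Python sketch (the helper is my own, not an Apple or Flame API) decoding the triplets discussed in this thread:

```python
# Subset of ITU-T H.273 code points relevant to this thread.
PRIMARIES = {1: "BT.709", 9: "BT.2020", 12: "P3 D65"}
TRANSFER = {1: "BT.709", 2: "unspecified", 4: "gamma 2.2",
            13: "sRGB (IEC 61966-2-1)", 16: "PQ", 18: "HLG"}
MATRIX = {1: "BT.709", 9: "BT.2020 non-constant"}

def describe_nclc(tag: str) -> str:
    """Turn an NCLC triplet like '1-13-1' into a readable description."""
    p, t, m = (int(x) for x in tag.split("-"))
    return (f"primaries={PRIMARIES.get(p, p)}, "
            f"transfer={TRANSFER.get(t, t)}, "
            f"matrix={MATRIX.get(m, m)}")

print(describe_nclc("1-1-1"))   # BT.709 primaries, transfer, and matrix
print(describe_nclc("1-13-1"))  # BT.709 primaries with the sRGB transfer
```

Note that transfer code 2 means "unspecified" in H.273, which may be why the (1-2-1) export described earlier carries its 2.4 gamma as separately injected metadata rather than in the transfer code itself.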

One of the biggest issues, though, is the QuickTime player. I never use it, though you may find that many of your clients do, as it's the default on any MacBook. And there are other factors that go into it: even if the file is properly tagged, laptops and desktops very often aren't set to useful color profiles and are often way too bright (Rec709/BT1886 is supposed to be 100 nits, yet most desktops are set to 120-160 by default), and none of that considers the viewing environment: dark office vs. bright coffee shop vs. backyard table.

Also, what you are comparing is just the gamma curve. A color space also includes the color primaries. Now, sRGB and Rec709 share the same primaries, but when you build custom transforms you need to make sure you're maintaining any primary transforms (though that would show up as a color shift, not a contrast change). The third leg of the color-space stool is the white point, but it's the same (D65) for everything we use here except theatrical DCI-P3.
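To make the "same primaries" point concrete, here are the published CIE 1931 xy chromaticities in a small Python sketch. sRGB and Rec709 are identical in both primaries and white point; Display P3 moves red and green but keeps the D65 white:

```python
# CIE 1931 xy chromaticities (standard published spec values).
REC709 = {"R": (0.640, 0.330), "G": (0.300, 0.600),
          "B": (0.150, 0.060), "W": (0.3127, 0.3290)}
SRGB = {"R": (0.640, 0.330), "G": (0.300, 0.600),
        "B": (0.150, 0.060), "W": (0.3127, 0.3290)}
P3_D65 = {"R": (0.680, 0.320), "G": (0.265, 0.690),
          "B": (0.150, 0.060), "W": (0.3127, 0.3290)}

# sRGB adopted the BT.709 primaries and D65 white point verbatim.
assert SRGB == REC709
# Display P3 keeps the D65 white point but moves red and green.
assert P3_D65["W"] == REC709["W"]
assert P3_D65["R"] != REC709["R"] and P3_D65["G"] != REC709["G"]
```

This is why sRGB-vs-Rec709 mismatches show up as contrast/gamma shifts rather than hue shifts: the primaries cancel out, and only the transfer curves differ.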

Adding to the pain: when you look at ACES transforms there are two flavors of Rec709, the device (i.e. camera) flavor and the display flavor. One is intended for camera footage, while the other is intended for material that has already gone through a color pipeline and tone mapping for a display. So if you import Rec709 footage from a Sony camera, use the input transform for Rec709. If you import something rendered from After Effects, use the display transform for Rec709.

I'm not answering your question, because there are so many variables that it's hard to tell what's driving your specific discrepancy.

All that said, I would evaluate colors not in QuickTime but in a proper QC player (I use Telestream Switch, which has many other functions, including DeckLink output and a swipe overlay for comparing before/after of two renders), and I would also upload it to Frame.io and see how it looks there.

Also, even if you don't have a reference monitor, a separate computer monitor calibrated to Rec709 with an X-Rite and driven via the broadcast output of Flame (can be HDMI, but through a video I/O interface, so it's not subject to macOS color management) is worthwhile. That way you at least know your file looks the way you want it to. That solves half of the problem when your client complains, and you can focus on their playback scenario rather than having two unknowns.


matthew broderick professor falken GIF


Yeah, this client is really big on QuickTime Player. Love it. They're not going to use anything else; in fact, I can guarantee that every one of them will be viewing in QuickTime Player on some sort of 100% Apple device. So I cannot get away from this.

So we know what it is supposed to look like, but apparently, I guess, maybe to get around some of the uncertainty of "What is Rec709 really? Is it a color space? Is it a gamma curve? Is it a dessert topping?", their in-house CG department has made a habit of also viewing stuff in properly tagged sRGB QuickTimes, and so they would like one of those as well. So obviously I'd really love for it to look as close to the Rec709 one as possible.

Some light reading (forgive the pun)…
Previous recommendations

That becomes a Pandora’s box.

If you (a) properly export an sRGB color-space render and a Rec709/BT1886 render,
and (b) view them on a dumb player that doesn't apply the correct color-space transforms to match the display, they should look different, because the code values in the file will be different according to the specs.
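To quantify "different code values": the same stored code value also produces different light depending on which display decode is applied. A quick Python sketch (using a simplified pure-power BT.1886, assuming zero black level and ignoring black lift):

```python
def srgb_decode(v: float) -> float:
    """Inverse of the piecewise sRGB encoding (IEC 61966-2-1)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def bt1886_decode(v: float) -> float:
    """Simplified BT.1886 EOTF: pure 2.4 power, zero black level assumed."""
    return v ** 2.4

code = 0.5  # the same stored code value...
print(f"sRGB display:    {srgb_decode(code):.4f}")   # ~0.214 linear light
print(f"BT.1886 display: {bt1886_decode(code):.4f}") # ~0.189 linear light
```

For a mid-grey code value, the 2.4 display puts out roughly 12% less light than the sRGB one, which is exactly the kind of contrast/brightness shift you see when both files land on the same unmanaged screen.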

If you do this on a proper color space aware player and the file has NCLC tags, both should look identical, because the player would know what to do with it.

Of course you could watch the sRGB render on a computer monitor in the office, and the Rec709 render on a reference monitor in the suite and to the eye they should look identical.

Without color management, sRGB and Rec709 were meant for different screens. It gets wonky when people put them on the same screen side by side and then act surprised.

These are apples and oranges being sent to a pie and cake factory.

So in the end you cannot win, because the client is uneducated and there's nothing you can do about it.

Best thing: bring them into your suite with a proper setup (calibrated reference monitor, proper 6500K-whitepoint neutral lighting at the proper level, bias lights). Show it to them there, have them sign off, and give them a file. They'll take the file, watch it in their office, and call and yell at you that it doesn't look the same. And you respond: you've been doing this for a long time, the file is correct. Please sign the check.

But to give you a more practical suggestion: encode the file in Rec709. Tell them the only device they should watch it on is a recent-generation iPad Pro with True Tone and auto-brightness disabled, set to 50% brightness. These are known to be 95%+ color accurate and are used by colorists for this all the time (though they usually stream rather than hand over a file).

Betsy is correct.

2.2 is correct on a web player.

2.4 is correct on tv.

However: clients view both in a QuickTime Player. And they see different gammas. And get confused.

I don’t have the solution yet. I wish I did.


Yes, and no.

It may be perceptually correct in some circumstances.

But when we encode files we don't know what environment they will be viewed in, and we can't ship one version for the evening living room and another for the office party.

The file should always be encoded to agreed industry norms (which generally is Rec709/BT1886 for any video delivery). If any adjustment is needed to compensate for the viewing environment it has to happen as part of the display, because it knows or can be set appropriately.

And that is why computer monitors default to 2.2 and TVs (intended for evening living room) default to 2.4.

But these days we mix and match: TVs in bright rooms, and everyone and their dog and cat watching everything on computer monitors.


Well: you need to speak to MPEG, because the last time I talked to them they told me they do not specify the decode part of the codec. Hence all the bollocks we all have to deal with, i.e. it changes with each player.


We're saying the same thing, though: we don't specify the viewing environment. So we encode to our standard and store metadata that says how we encoded it, and then it's up to the player/decoder to do the right thing.

As such, the whole question of what we did wrong when it looks weird to them is not our problem, as long as we encode according to the standard. If their player is shit, not our problem.


We aren't quite saying the same thing. My opinion is that the Motion Picture Experts Group needs to specify the decode part of the codec as well. I think they are a bunch of wankers for leaving this part open to interpretation.

Clients want solutions not uncertainties. That’s normal.

I can appreciate that, and it is normal.

We can provide a solution if we control enough of the environment, or if the environment has the right ingredients for us. But there are circumstances where the environment is too unpredictable or unsuitable for the solution they want. What are we supposed to do then?

I mean, it's the same as if the client says he wants a 5.1 mix because that is what the network asked for. So we send him a 5.1 mix, but he only has a stereo system to listen on. So he complains that he can't hear it in surround.

So we can either send him a stereo mix that he can listen to, but that is not the same file that goes to the network (or theater); or we can send him the file that goes to the network, but he can't listen to it properly. There is no magic middle ground. He either has to change his environment or trust that we're doing the right thing.

Clients want solutions. But some are also unwilling to provide a way of doing things properly. That equation has no solve. And it's gotten a lot worse since Covid, with a bigger hodgepodge of tech and no adjusted expectations.


ColorSync is just like ACES: there are two things you can control that change the display rendering:

  1. Input metadata: those are your NCLC tags, or whatever other system you use; every single pixel in macOS is managed this way.

  2. The macOS display colorspace setting: "Color LCD" can be a P3 profile or whatever your display is; on XDR the whole system works differently.

iPadOS does not work the same way, so macOS and iPad/iPhone will render things differently, on purpose, for whatever reason Apple thinks.

Just do 1-1-1 Rec709; it's correct, and if someone connects a Rec709 display over HDMI they will get a reference response.

Also, 1-1-1 is the only tagging that consistently renders the same across most apps.

Or, if they use any of the reference modes on XDR MacBooks or iPads, they will need to have the metadata set to 1-1-1 to get a reference response.

Only 1-1-1 is correct for SDR video; anything else is "off-spec".

Also, 1.961 is actually the correct "gamma" for Rec709 as per the specs. Encoding gamma ≠ display gamma; that mismatch produces a shift that you want for dim-surround viewing.

2.4 / 1.961 ≈ 1.22: this is your OOTF, the optical-optical transfer function.

Now, this concept stems from old-school straight-up Rec709 cameras: taking linear light, running it through a 1/1.961 camera curve and then displaying it on a 1.961 display would have given you too little contrast in dim surrounds, so they made the display "more contrasty" to compensate.

Modern cameras still follow that curve, however, so the SDI output of an Alexa is still pretty much 1/1.961-encoded up to about mid grey.
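The arithmetic above can be sketched directly. A quick Python check of the end-to-end system gamma (OOTF) you get from a ~1/1.961 camera encode followed by a 2.4 display (exponents as quoted in this thread; the real BT.709 OETF is piecewise, so 1.961 is itself an approximation):

```python
CAMERA_ENCODE = 1 / 1.961  # approx. power-law fit of the BT.709 camera OETF
DISPLAY_GAMMA = 2.4        # BT.1886-style display decode

def system_gamma(encode_exp: float, display_exp: float) -> float:
    """End-to-end exponent: scene linear -> encode -> display -> light."""
    return encode_exp * display_exp

ootf = system_gamma(CAMERA_ENCODE, DISPLAY_GAMMA)
print(f"OOTF exponent ≈ {ootf:.2f}")  # ≈ 1.22, the dim-surround contrast boost
```

An exponent greater than 1.0 means the displayed image is more contrasty than the scene, which is the intentional compensation for dim-surround viewing described above.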


I'd argue that this whole idea is completely broken in practice and that we need a major revision of these concepts.

Especially with phones and TVs now defaulting to 2.2, and then vivid mode and whatnot… ugh.

New display tech and consumer behaviour have changed this so, so much.

If Apple decides to make stuff over-complicated and the client wants solutions, the solution is to use a device that works out of the box and isn't dependent on this or that box being ticked.

Or: use an iPad…

Yes, today’s devices are very mobile and used in environments that change by the minute. Even sitting in the yard on a partially cloudy day, I frequently have to adjust my display brightness on my Macbook.

But it comes back to two things still:

1: Encode the file with a known standard. Then make the device smart enough to understand what it’s getting and transform that into what is most appropriate for the current (and changing) viewing environment.

As content publishers, we don’t know your conditions, so the only thing we can do is create a single well known and standard conforming output.

2: There is a difference between the end user / consumer and the experience they get, and the clients that are approving the work product. We obviously want to make sure that as many consumers as possible have a positive experience with our content and see it to the extent possible in the intended look.

The standards and expectations for our clients should be different. They should evaluate the content in a professional and appropriate setting. Doesn’t have to be a suite with a $20K reference monitor. But it’s fair to expect that they view it on a proper monitor in an appropriate office (can be home office) setting, with suitable software (can be Frame.IO in a browser). Sadly, that isn’t always the case.

So we have to live with this unprofessional behavior and make it work.

Apple devices do have automatic brightness adjustment. Does that only change the exposure, or does it actually do something smarter, such as a gamma-curve adjustment? I suspect the former, even if the latter would be more appropriate.

I want fresh pancakes every morning for breakfast. And I want them to be my buttermilk recipe. I want someone else to make them for me, even if I'm at a hotel traveling. In Cleveland. Oh, I also want them to be free. And healthy.

I want things too. But just because I want them doesn’t mean I can undo not only the laws of physics but the behavior and tech debt of dozens if not hundreds of other individuals, groups, consortiums, tech players, and maybe even a few government agencies thrown in for good luck.

Does it matter that we as individuals know this stuff? Yes. Does it help us be better craftspeople? Yes. Does it matter in the real world? Absolutely not.

We'd have a far better time, doing more work for less money in far less time, if we were honest about this situation with ourselves and our clients and just said…

let it go GIF


It does feel like the Golgafrinchams at times.


Dude totally! It is totally like Golgafrinchams at times. It 100% is!

What the hell is Golgafrinchams?


Also…I am from Cleveland.