I am wondering how many nits your GUI monitors are running at.
I somehow got so used to 250+ nit GUI monitors that I find setting it to the correct 100-120 nits really disturbing and dull, but I am trying it again now as it's better for my eyes, I reckon. And I even sit in a dark basement with proper bias lighting and everything, so I should be fine, but it's still so dull and… I don't know.
Also, calibrating the GUI to pure gamma 2.2 instead of the weird sRGB curve is not making me too happy, but that's a different topic.
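For anyone wondering what the difference actually is: the sRGB transfer function is piecewise (a small linear segment near black, then a 2.4-exponent power segment), while pure gamma 2.2 is a single power law. A quick sketch of the two decoding curves:

```python
# Display-referred code values 0..1, decoded to linear light.

def srgb_to_linear(v):
    # IEC 61966-2-1 piecewise decoding: linear toe near black,
    # 2.4-exponent power segment above the breakpoint.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    # Pure power-law decoding, no linear toe.
    return v ** 2.2

# The two curves agree closely in the mids but split in the shadows,
# which is why a display calibrated to one looks different near black.
for code in (0.02, 0.1, 0.5, 0.9):
    print(f"{code:.2f}  sRGB={srgb_to_linear(code):.5f}  2.2={gamma22_to_linear(code):.5f}")
```

The divergence is almost entirely in the shadows, which lines up with calibration to one or the other mainly changing how the blacks look.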
I set my GUI to 110-120 nits in a light-controlled environment. At first you may think it's dull, but that goes away once you're used to it. Notebooks used in non-light-controlled environments are another question; it's usually fine if the user changes the screen brightness for comfortable viewing in that case.
PS: my preferred gamma for GUI is BT.1886; the white point depends on the environment ))))
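For reference, BT.1886 is not a fixed power law: its EOTF is parameterised by the display's measured white and black luminance, so the effective curve adapts to the actual panel. A minimal sketch, assuming a 100 nit / 0.1 nit display as an example:

```python
# ITU-R BT.1886 EOTF sketch. Lw/Lb are the display's measured white
# and black luminance in cd/m^2 (nits); v is the normalised signal 0..1.

def bt1886_eotf(v, lw=100.0, lb=0.1):
    gamma = 2.4
    n = 1.0 / gamma
    a = (lw ** n - lb ** n) ** gamma   # user gain
    b = lb ** n / (lw ** n - lb ** n)  # black-level lift
    return a * max(v + b, 0.0) ** gamma

# Full white maps to the measured peak, full black to the measured black:
print(bt1886_eotf(1.0))  # ~100 nits
print(bt1886_eotf(0.0))  # ~0.1 nits
```

With a true zero black level the formula collapses to a plain 2.4 power law; the higher the black floor, the more the effective gamma softens near black.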
somehow even after a few months of running 100-120 nit GUIs it feels dull. Going back to high brightness is like putting on glasses; it's probably YEARS of looking at high-contrast screens, my brain just can't handle this dullness.
MacBooks are the worst: with the glossy screens you need to run 500 nits just to see ANYTHING through the reflections.
I use a super dull 80 NIT setup in an almost completely dark environment for the GUI.
wow, that's dull. I am just now trying to sort out the calibration, but DisplayCAL keeps adding a weird black lift that I don't get…
it's never easy, but keep 'em coming; it's really interesting to hear how bright people set their monitors.
do you set black point correction to 0%? By default it's set to 100%, if I'm not mistaken.
I match the brightness of my LG 4K GUI and iMac Pro screens to my 100 nit Sony TRIMASTER EL broadcast monitor in a dark room. It's about four notches down from the brightest setting in System Preferences.
yeah, it's funky. I got myself a ColourSpace license to go check on that.
I wouldn't know how to measure it, or even where to start caring. Care to enlighten us?
According to the Rec.709/BT.1886 standard, your broadcast monitor should be set to 100 nits peak brightness, so when you put up a full white image it should display at 100 nits (100 candela per square metre).
Some professional displays have a nit setting; others just have a brightness or “backlight” slider.
Now, you usually want your GUI display to match the luminance of the broadcast monitor.
Another reason for the 100 nits is that a normal IPS LCD only has a contrast ratio of about 1000:1, so if you run your monitor at 300 nits peak you are also raising the shadows with it (300/1000 = 0.3 nits), just as an example.
Everything above 1000:1 and 100 nits peak (or 120, depending on who you ask) would be considered “out of SDR spec”.
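The shadow-raising arithmetic above can be tabulated quickly; the 1000:1 figure is the typical native IPS contrast assumed in the example:

```python
# Black level on a panel with a fixed native contrast ratio:
# raising peak luminance raises the black floor proportionally.

CONTRAST = 1000  # typical native contrast of an IPS LCD

for peak_nits in (100, 120, 300):
    black_nits = peak_nits / CONTRAST  # black = peak / contrast
    print(f"peak {peak_nits:>3} nits -> black {black_nits:.2f} nits")
```

At 300 nits peak the black floor sits at 0.3 nits, three times higher than at the 100 nit spec point.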
As lots of people find different display luminances comfortable for working, it is very interesting to see where people put their screen luminance.
It also heavily depends on the environment, obviously; for a brighter room you want brighter monitors, etc. It's a very complex psycho-visual topic.
Oh, and you can measure this with a simple display calibration probe thingy; asking you for a luminance target is the first step in pretty much every calibration package.
Now that I am looking at HDR images more often than SDR, and can be compositing in ACES or a camera-native log colourspace as well as display-referred, I tend not to care so much about my GUI luminance any more. I certainly wouldn't want to be looking at an HDR GUI unless it was sitting below 250 nits, and that obviously isn't to any spec. If/when I was only working in SDR Rec.709, it made sense to work at 100 nits, but not in these days of working in many different gamuts & gammas.
Yeah, I see your point. I do run HDR GUIs when I can (when the software supports it).
What can I say, I like it. But I usually soft-clip everything to around 350 nits tops and then have reference set to ~500 nits, again with a soft tonemap; this is mostly to review VFX, though.
For SDR I just can't get used to 100; it's so dark.
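The post doesn't say which soft-clip curve is used, so this is purely a hypothetical illustration of the idea: pass values through unchanged up to a knee, then roll off the excess so the output approaches a ceiling (~350 nits here) instead of hard-clipping. The knee value is an assumption, not from the post.

```python
# Hypothetical soft-clip sketch: linear up to a knee, then a
# Reinhard-style shoulder that asymptotically approaches the ceiling.

def soft_clip(nits, knee=250.0, ceiling=350.0):
    if nits <= knee:
        return nits  # untouched below the knee
    # Excess above the knee is compressed into the remaining headroom,
    # so the output never reaches the ceiling.
    excess = nits - knee
    headroom = ceiling - knee
    return knee + headroom * excess / (excess + headroom)

print(soft_clip(200.0))   # below the knee: passes through
print(soft_clip(1000.0))  # well over: rolled off under the ceiling
```

The shoulder is continuous in value and slope at the knee, which keeps the transition invisible on ramps.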