Vimeo Supports “Dolby Vision” - but how do you make it?

So Vimeo now supports “Dolby Vision”, or rather Apple’s special-sauce variant of it, which works with relative luminance levels and is HLG based but carries dynamic metadata. Now, I am not opposed to HLG; I actually prefer relative luminance mapping for many use cases (like mobile viewing). Absolute luminance mapping, like what PQ is doing, is great in theory but only really works in proper home cinemas. You will not watch a Dolby PQ movie in the middle of the day in your living room unless your TV adjusts the luminance, and then what’s the point?

But back to the Dolby thing: how do I make an HLG/Dolby master file with proper metadata? Is this just with FCPX? Is it just an auto button? If so, why can’t the player just analyze MaxFALL and MaxCLL itself and then apply “Dolby curves” during playback? Does anyone have more insight beyond Vimeo’s tech docs?
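For context, MaxCLL and MaxFALL are simple statistics that a player or encoder could in principle compute on its own, which is what the question above is getting at. A minimal sketch, assuming you already have per-frame pixel luminance maps in nits (the function name and toy data are mine, not from any Dolby or Vimeo tool):

```python
import numpy as np

def content_light_levels(frames):
    """Compute (MaxCLL, MaxFALL) from per-pixel light levels in nits.

    MaxCLL is the brightest single pixel anywhere in the content;
    MaxFALL is the highest frame-average light level.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))   # brightest pixel so far
        max_fall = max(max_fall, float(frame.mean()))  # brightest frame average so far
    return max_cll, max_fall

# Two toy 2x2 "frames" in nits
frames = [np.array([[100.0, 400.0], [50.0, 250.0]]),
          np.array([[1000.0, 80.0], [120.0, 200.0]])]
print(content_light_levels(frames))  # (1000.0, 350.0)
```

Note these are static, whole-program values; dynamic metadata like Dolby’s goes further by describing each frame or shot individually.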

We further support the upload of encoded Dolby Vision files. Dolby Vision for Vimeo uses an HLG base layer with Rec. 2020 primaries and Dolby Vision dynamic metadata. Using Dolby Vision for Vimeo, the metadata is automatically included inside the file. There’s no additional metadata required for upload. The HLG transfer function enables backward compatibility with a range of non-Dolby Vision devices that support HLG playback.

Here is the answer.

It’s HLG with special metadata that can only be generated by FCPX or Compressor.
You cannot change or author that metadata, create trim passes, or anything else.

So it’s just pre-analyzing the HLG and then creating the appropriate tone-mapping curves for each frame of content and each display condition.
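To illustrate what such a pre-analysis could feed into, here is a hypothetical per-frame tone map whose shoulder is set from the frame’s analyzed peak, so bright frames compress harder than dark ones. This is a generic extended-Reinhard roll-off of my own choosing, not Dolby’s actual curve:

```python
def tonemap_frame(luma, frame_peak, display_peak):
    """Map scene luminance (nits) to display luminance (nits).

    Illustrative per-frame curve: an extended-Reinhard roll-off that
    maps the frame's analyzed peak exactly onto the display's peak.
    NOT Dolby's algorithm - just a sketch of metadata-driven mapping.
    """
    if frame_peak <= display_peak:
        return luma  # frame already fits the display, pass through
    n = luma / display_peak        # normalize so display peak = 1.0
    w = frame_peak / display_peak  # frame peak in display-relative units
    mapped = n * (1.0 + n / (w * w)) / (1.0 + n)
    return mapped * display_peak

# A 1000-nit frame peak lands exactly on a 100-nit display's peak:
print(tonemap_frame(1000.0, 1000.0, 100.0))  # 100.0
```

The point of per-frame (dynamic) metadata is visible here: `frame_peak` changes the curve shot by shot, instead of one static MaxCLL-derived curve for the whole program.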

I guess it’s better than nothing, or than having displays decide that themselves (many TVs create dynamic metadata on the fly). It’s also easier to handle, as it’s one file and not ProRes plus a sidecar.

I have a suspicion that this will become the de facto HDR standard for web delivery going forward: Vimeo now, and then Instagram etc. will follow…

Hey Finn,

I’m not quite sure how dynamic metadata combined with what could be described as a dynamic gamma curve would work when framed in the context of a trim pass.

I guess that’s why Dolby didn’t support HLG at first.

It seems to just be an analysis pass so decoders don’t have to analyze on the fly, but yeah, without any control I doubt it’s very useful. Then again, maybe it’s just a standard for how to tone-map HLG, versus all players doing their own thing.
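For context on “a standard of how to tonemap HLG”: HLG already has one standardized piece of display adaptation in ITU-R BT.2100, where the OOTF system gamma varies with the display’s peak luminance (this is how HLG adapts to the display even without any metadata). A minimal sketch of that relationship:

```python
import math

def hlg_system_gamma(display_peak_nits):
    """BT.2100 HLG OOTF system gamma as a function of display peak.

    Gamma is 1.2 at the 1000-nit reference display and rises or falls
    with log10 of the peak, so brighter displays render with more
    contrast and dimmer ones with less - all without metadata.
    """
    return 1.2 + 0.42 * math.log10(display_peak_nits / 1000.0)

print(round(hlg_system_gamma(1000.0), 2))  # 1.2
print(round(hlg_system_gamma(400.0), 2))   # 1.03
```

So the Dolby layer on top of HLG is adding per-frame shaping beyond this single global gamma adjustment.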