After much experimentation with HDR, I wonder what the approach to dynamic metadata is. I always read that it supports "per-frame metadata", but practically it seems I can only set metadata per shot.
I was sort of expecting that I would be able to fade between analysis data in case I have a shot that goes from really dark to really bright, so I could use the metadata as a creative tool for the trim passes.
And funnily enough, I can do a dissolve fade between two HDR metadata segments in Flame, and it does look like what I expected in the 100-nit trim. I can't see the resulting per-frame metadata (there is no read-out), but visually it does work.
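Conceptually, what I expected the dissolve to produce is something like a per-frame linear interpolation between the two metadata sets. A minimal sketch of that idea (the field names `lift`, `gamma`, `gain` here are just illustrative placeholders for trim values, not the actual Dolby Vision XML schema):

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b at position t in [0, 1]."""
    return a + (b - a) * t

def fade_metadata(meta_a: dict, meta_b: dict, n_frames: int) -> list[dict]:
    """Interpolate every field between two metadata dicts over n_frames,
    producing one metadata dict per frame (frame 0 == meta_a, last == meta_b)."""
    frames = []
    for i in range(n_frames):
        t = i / (n_frames - 1) if n_frames > 1 else 0.0
        frames.append({k: lerp(meta_a[k], meta_b[k], t) for k in meta_a})
    return frames

# Hypothetical trim values for the dark start and bright end of the shot
dark = {"lift": 0.02, "gamma": 1.10, "gain": 0.85}
bright = {"lift": -0.01, "gamma": 0.95, "gain": 1.00}
per_frame = fade_metadata(dark, bright, 5)
```

That is roughly what the dissolve seems to be doing internally, even though there is no way to read the resulting values back out.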
But when I export a Dolby Vision XML, it crashes with an error saying there is a fade.
Am I missing something? It's not like I can grade the PQ source for the trim in Dolby Vision, as it's just one master file plus metadata telling the TV how to do the trim.