8K may be coming faster than we think

Hair and makeup and VFX artists around the world clench their collective sphincters.


Denoise the 4K plate, resize to 8K, do a fine sharpen, and renoise — I’ve got your 8K right here. I’ve done this from HD to 4K more times than I should admit.
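For anyone curious, that recipe is roughly this — a minimal sketch in numpy/scipy, where a Gaussian blur stands in for a real denoiser and the parameters are illustrative, not anyone's actual pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def fake_upres(plate, scale=2.0, denoise_sigma=1.0,
               sharpen_amount=0.6, grain_sigma=0.01):
    """Denoise -> resize -> fine sharpen -> renoise, on a float [0,1] HxWx3 image."""
    # 1. Denoise (a Gaussian blur stands in for a proper denoiser)
    clean = gaussian_filter(plate, sigma=(denoise_sigma, denoise_sigma, 0))
    # 2. Resize with spline interpolation (order=3, roughly bicubic)
    big = zoom(clean, (scale, scale, 1), order=3)
    # 3. Fine sharpen via unsharp mask
    blurred = gaussian_filter(big, sigma=(0.7, 0.7, 0))
    sharp = np.clip(big + sharpen_amount * (big - blurred), 0.0, 1.0)
    # 4. Renoise with fresh grain so the result doesn't look plasticky
    rng = np.random.default_rng(0)
    return np.clip(sharp + rng.normal(0.0, grain_sigma, sharp.shape), 0.0, 1.0)

# Tiny stand-in plate at HD-ish aspect; doubling it stands in for HD -> 4K
plate = np.random.default_rng(1).random((108, 192, 3))
out = fake_upres(plate)
print(out.shape)  # (216, 384, 3)
```

The order matters: sharpening before renoising means you sharpen detail, not grain, and the fresh grain hides the interpolation softness.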


Totally pointless exercise. Unless you have a 65"-plus TV you’ll never notice the difference, and even then only if you’re inches away from the screen.



Given that the only country I’ve lived in that cared about F1 racing was delivering their ads exclusively in SD only a decade ago, I’m not sure that the stars will align on this one.

Maybe if (American) football went to 8k.

But I honestly can’t see ad agencies accepting the higher cost for resolution they cannot actually see.

Also, going to 8K is dumb for F1. They should use that bandwidth to go to 96fps. It’ll look sharper than 8K and be impossibly smooth.
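The back-of-envelope arithmetic supports this — assuming a typical 50fps broadcast baseline for F1 and ignoring chroma subsampling and compression:

```python
# Uncompressed pixel rates: 8K at broadcast frame rate vs UHD at high frame rate
uhd = 3840 * 2160       # UHD "4K" frame
eightk = 7680 * 4320    # 8K frame

rate_8k_50 = eightk * 50   # 8K at a typical 50 fps broadcast rate
rate_uhd_96 = uhd * 96     # UHD bumped to 96 fps instead

print(rate_8k_50)                # 1658880000 px/s
print(rate_uhd_96)               # 796262400 px/s
print(rate_uhd_96 / rate_8k_50)  # 0.48
```

UHD at 96fps moves under half the raw pixels of 8K at 50fps, so the bandwidth argument works out with room to spare.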

Stuff 8K or HFR. Wouldn’t it be great to see 10-bit digital broadcast/streaming? Better pixels vs. more pixels.


Would 10-bit make much visual difference? At present, as soon as any grain enters the picture, I can’t tell the difference between 8-bit and 10-bit images.


It would make a big difference in my mind. I set my GUI to 10-bit instead of 8 for that very reason. It only makes sense to try to get the maximum fidelity possible in broadcast and streaming.

You’re only getting a quarter of the code values of 10-bit when viewing 8-bit — especially noticeable when viewing things in HDR!


What changes visually? I know on a technical level what’s going on, but in terms of what the eye sees, what’s improved?

I wasn’t aware that there is a discernible visual difference between 8-bit and 10-bit.

I had to rebuild an entire spot in 10-bit that someone else did at 8-bit that was full of soft gradients. And a slow fade to black in 8-bit? Forget about it…


I am sure you all already understand this, but I’ll write it anyway. The biggest difference is on graduated colour, where 8-bit can introduce banding that you won’t see in 10-bit. SDR isn’t quite as dramatic, but on HDR there is a big difference. On HDR you are also compressing the SDR range into a much smaller slice of the signal, so you get much more benefit in the darker areas of HDR in 10-bit over 8-bit, because there are four times as many code values to fit the information into.

When it comes to graphics and fades, 8-bit looks a lot worse than 10-bit.

I’m not saying 8-bit is awful by any means — it definitely works — but as a viewer I’d much prefer to watch UHD in 10-bit over 8K in 8-bit.
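You can put numbers on the banding claim with a toy example — a slow fade quantized at both bit depths (the 2000-pixel width and 20% grey endpoint are arbitrary choices for illustration):

```python
import numpy as np

# A slow horizontal fade from black to 20% grey across 2000 pixels,
# quantized to 8-bit and 10-bit code values.
ramp = np.linspace(0.0, 0.2, 2000)

steps_8 = np.unique(np.round(ramp * 255)).size    # distinct 8-bit levels
steps_10 = np.unique(np.round(ramp * 1023)).size  # distinct 10-bit levels

print(steps_8)   # 52  -> bands ~38 px wide: visible stepping
print(steps_10)  # 206 -> bands ~10 px wide: far harder to see
```

Same fade, roughly four times as many steps in 10-bit, so each band is a quarter the width — which is exactly why slow fades and soft gradients are where 8-bit falls apart first.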


Yeah, that’s fair. I honestly think I could not pass that particular Pepsi Challenge. Haha.



I get the difference between 8 and 10 bit in the background work that we do of course, but for displays, for the consumer, with the same feed, would anyone ever really know the difference between 8 and 10 visually? I doubt it. Not a lot of gain there for the cost of implementation.

In addition to all the great things Adam said, compression algorithms like H.264/H.265 can actually compress more efficiently in 10-bit than 8-bit. It’s counterintuitive, but it’s true: the extra precision means less rounding noise inside the encoder, so there’s less junk to spend bits on. For us in film land, we see a huge difference between 8- and 10-bit, especially when we screen stuff in our theater.


Go try to do 8-bit in HDR 🙂


Some of us just make snapchat ads, ok?

I am right now doing a UHD-HDR workshop at a very large German broadcaster, one that is on the worldwide cutting edge of actually creating HDR graphics for HDR broadcast.

I would not worry about 8K finishing anytime soon.

But just saying, I’ve got experience finishing 8K HDR, so ask me anything (LOL).

Shot on 12K, finished in 12K, downscaled to 8K.

Why? It was part of a university project; they wanted something “new and cutting edge”, so we threw every buzzword of the month at the wall and did it all.

It’s dumb. It doesn’t make anything better, sharper, or nicer… it’s just more useless data… great!