Fun with RTMP

(These posts and replies were originally in @digitalbanshee's topic “Anyone using Evercast?” but we got a little off topic and wanted to make sure she gets relevant replies in her topic and you guys have the space to explore general RTMP stuff… thanks, @randy)

@cno

I'm out of the loop with FilmLight… that's a thing? RTMP out?

Yep. BaseLight remote is a whole cluster fuck of strange ideas aimed at allowing FilmLight customers to grade in sequester while at the same time being as restrictive as could possibly be imagined.

That being said, one of the better things to come out of all this craziness is a high-quality RTMP out directly from the BaseLight… currently in beta, but I hear good stuff.

So there you go.

2 Likes

Assimilate Scratch can also stream video out without any additional hardware

https://www.youtube.com/watch?v=a85UqMstlkE

1 Like

So the ball's in Autodesk's court… that's exactly how you need it to work.

1 Like

So it sounds like the answer is no, nobody is using Evercast :stuck_out_tongue:

2 Likes

The Scratch implementation actually is not color accurate at all. You CANNOT get color accuracy with H.264.

Thinking about it, I'm not sure if the Nvidia implementation of H.264 supports 10-bit 4:4:4, but I know for sure that x264 does. The Nvidia encoder only does 10-bit YUV 4:2:0 or 8-bit RGB 4:4:4. Neither is color accurate or appropriate for proper grading.
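
For a rough sense of what each compromise costs, here's a small NumPy sketch, purely illustrative: it quantizes a synthetic 10-bit frame to 8 bits and, separately, box-averages its BT.709 chroma into a crude 4:2:0 model, then measures the round-trip error against the 10-bit 4:4:4 original.

```python
# Rough, self-contained illustration (not from the thread): compare a synthetic
# 10-bit 4:4:4 frame against (a) the same frame truncated to 8 bits and
# (b) the same frame with its BT.709 chroma box-averaged into a crude 4:2:0 model.
import numpy as np

rng = np.random.default_rng(0)
rgb10 = rng.integers(0, 1024, size=(64, 64, 3)).astype(np.float64)  # 10-bit test frame

# (a) 4:4:4 kept, but quantized to 8 bits and scaled back up.
rgb8 = np.round(rgb10 / 4.0) * 4.0
print("8-bit 4:4:4 max error:", np.abs(rgb8 - rgb10).max(), "(10-bit code values)")

# (b) 10 bits kept, but chroma averaged over 2x2 blocks (crude 4:2:0, BT.709 matrix).
M = np.array([[ 0.2126,  0.7152,  0.0722],   # Y
              [-0.1146, -0.3854,  0.5000],   # Cb
              [ 0.5000, -0.4542, -0.0458]])  # Cr
ycc = rgb10 @ M.T

def subsample(c):
    # Average each 2x2 block, then repeat back to full resolution.
    return c.reshape(32, 2, 32, 2).mean(axis=(1, 3)).repeat(2, 0).repeat(2, 1)

ycc420 = np.stack([ycc[..., 0], subsample(ycc[..., 1]), subsample(ycc[..., 2])], axis=-1)
rgb420 = ycc420 @ np.linalg.inv(M).T
print("10-bit 4:2:0 max error:", np.abs(rgb420 - rgb10).max(), "(10-bit code values)")
```

The absolute numbers depend entirely on the content (random noise is a worst case for chroma), but both paths measurably diverge from a 10-bit 4:4:4 master.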

1 Like

IMO the main problem with a YouTube Live preview is that it will be shown on an inaccurate device (an iPad at best); 4:2:0 or 8-bit is much less of a problem, especially to untrained viewers most of the time (your situation may vary).

On the other hand, it works and it's easy to set up…

In a perfect world it would be useless, of course.

1 Like

Indeed, but the question is where to set the bar and for whom. If you've got 20 creatives working on a social deliverable, then a YouTube stream doesn't feel so far off the mark. If you're interested in mission-critical evaluation and color accuracy, then there are other techs and deployments that set that bar higher.

1 Like

Yeah. We ended up developing our own in-house streaming infrastructure that can support 12-bit RGB 4:4:4. Since we work on features, we need to support P3 and HDR color spaces accurately. We also need to be able to output 2K via SDI to multiple partners at the same time.

3 Likes

Yeah @ALan, your Logik Live was rad. Love how you figured out your system and your one-touch client box thingy. A fiver says most of our needs are on the other end of the spectrum, looking for the best off-the-shelf and relatively cheap solution for the “good enough for commercials” world.

I saw the early test you showed on Logik, really great stuff.

1 Like

Speaking of which, you can throw at least two Kona 3s comfortably in an old Z800… pretty good use for all that old kit we've got sitting around.

1 Like

Thank you so much again for posting that. We're well down the UltraGrid route. We ended up using spare kit around the office for the encoder and Intel NUCs to send to clients. They're not capable of handling CineForm though.

Which codec have you ended up using? I've been working with the guys at Fastvideo on a GPU JPEG 2000 solution. We're thinking of using it for remote grading between offices.

1 Like

Never heard of FastVideo before. Their website is a mess. Some of their product description seems strangely similar to UltraGrid; UltraGrid also has its own GPU-based JPEG 2000 implementation. What resolution and color space do you grade in? I've spent almost no time playing with CineForm as the data load is just too much. H.264/H.265 are the only ways to make this work reasonably.
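
Some quick arithmetic makes the point about the data load. Assuming 1080p at 25 fps and a rough 8:1 ratio for an intra/mezzanine codec (both my assumptions, not figures from the thread):

```python
# Back-of-the-envelope data rates, assuming 1080p at 25 fps (illustration only).
def mbit_per_s(width, height, fps, bits_per_sample, samples_per_pixel):
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e6

print("uncompressed 10-bit RGB 4:4:4:", mbit_per_s(1920, 1080, 25, 10, 3.0), "Mbit/s")  # ~1555
print("uncompressed  8-bit YUV 4:2:0:", mbit_per_s(1920, 1080, 25, 8, 1.5), "Mbit/s")   # ~622
# Even an intra-only mezzanine codec at roughly 8:1 still wants ~200 Mbit/s,
# while a decent H.264/H.265 stream at 1080p typically sits in the tens of Mbit/s.
```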

Yeah, their site is horrible. The product description you're referring to is something they've put together since we started working together, as they're wrapping their codec into UltraGrid. The Comprimato implementation requires the SDK, which, when I asked, they wanted $18k/yr for.

I'm trying to see how much I can push it to match what Sohonet Clearview / Streambox offers… but on the cheap. We monitor our grades at 1080, Rec.709 4:4:4, 95% of the time. The objective with this route is to get as close to the final image quality as possible.

The little client box is H.264 however. H.265 always seemed to be a struggle, as I could never manage to get a signal that didn't crap out on me at some point (i.e. lose a required I-frame or something like that, so it goes all grey and then catches up).
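
One knob that limits how long that grey-out lasts is the keyframe interval: the shorter the GOP, the sooner the decoder can resync after a lost I-frame, at some cost in bitrate efficiency. The pipeline in this thread is UltraGrid, not ffmpeg, but here's a hedged ffmpeg sketch of the same idea with NVENC HEVC; the test source, bitrate and destination address are made up for illustration.

```python
# Illustrative only: a short-GOP NVENC HEVC stream over UDP/MPEG-TS.
# Requires an ffmpeg build with NVENC support and an NVIDIA GPU.
import subprocess

FPS = 25           # assumed frame rate
KEYINT = FPS       # one keyframe per second -> at most ~1 s of broken picture on loss

cmd = [
    "ffmpeg",
    "-f", "lavfi", "-i", f"testsrc2=size=1920x1080:rate={FPS}",  # synthetic test source
    "-t", "30",                      # stream 30 seconds for the test
    "-c:v", "hevc_nvenc",            # GPU HEVC encode, as used in the thread
    "-pix_fmt", "p010le",            # 10-bit 4:2:0
    "-b:v", "40M",                   # assumed bitrate, tune to the link
    "-g", str(KEYINT),               # short GOP -> faster recovery after packet loss
    "-f", "mpegts", "udp://192.0.2.10:5004",  # example destination (TEST-NET address)
]
subprocess.run(cmd, check=True)
```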

How are you handling the network connection to the client boxes? Port Forwarding? Are you doing FEC? Regarding FastVideo, if you are going to do GPU for J2k, you might as well use GPU for NVENC as that is already ready to go.

They connect via a VPN to an isolated network; it's something our engineer has set up. Networking isn't my strong suit, so I'm unsure about the specifics. We've been using NVENC for encoding either H.264 or H.265 streams. As for FEC, for both those codecs I've been using the setting you originally gave me, rs:200:250.
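
As a quick aside on that FEC setting: assuming UltraGrid's rs:<k>:<n> means k media packets are expanded to n packets per block (my reading of the option, not something stated here), rs:200:250 works out as follows.

```python
# Quick look at what rs:200:250 buys, assuming rs:<k>:<n> = k media packets
# expanded to n packets per Reed-Solomon FEC block (my assumption).
k, n = 200, 250
parity = n - k
print("bandwidth overhead:        ", f"{parity / k:.0%}")   # 25%
print("recoverable losses / block:", parity)                # up to 50 packets
print("worst-case tolerated loss: ", f"{parity / n:.0%}")   # 20% of a block
```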

I think our little NUC (NUC8i5BEK) isn’t quite capable of handling it either, so that might not be helping things.

Can I ask what your H.265 settings are?

Just default settings for H.265, but I'm encoding on a 32-thread Ryzen 9.
We decode 2K 12-bit on an i5-9400T without issue.
What OS are your decoders running?

Lubuntu 20.04. Are you encoding using libx265? I only tried hevc_nvenc so maybe that’s the issue as well. I’ll give that a look later this week.

Try Ubuntu 19.10 on your decoders. There is something in 20.04 that makes x265 decoding go to shit.