UltraGrid via NDI 6

We currently have an UltraGrid-based appliance installed at a colorist’s house in Georgia. We are in LA, and the ping is about 45ms.

We encapsulate the output of UG into SRT, and that has a 3x multiplier, so the inherent latency will be at least 135ms, which is around 3 frames @24fps. This does not take into account encode/decode latency. Full real-world latency is about 14-16 frames of DCI 2K 12-bit RGB 444 P3 color. The colorist says it’s the least latent system he has worked with. He says some of the other systems he has worked with are in the multiple-seconds range, even as high as 10.
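For anyone who wants to sanity-check that math, here’s a quick back-of-envelope calc (a sketch, assuming the 45ms ping and the roughly 3x SRT latency multiplier above; encode/decode and capture/display time are not included):

awk 'BEGIN {
  rtt = 45               # measured ping in ms
  srt = 3 * rtt          # SRT latency budget, roughly 3x RTT = 135 ms
  frame = 1000 / 24      # one frame at 24fps is about 41.7 ms
  printf "SRT floor: %d ms = %.1f frames @24fps\n", srt, srt / frame
}'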

We do send H.265-compressed video, though. Using a non-temporal codec seems to significantly reduce latency at the expense of a vastly higher data load. Also, it is possible that NDI is inherently less latent than SDI → BMD capture → encode → decode → BMD SDI out. Also, in most of the examples I see of people using NDI, they are running UG on the same machine as the Flame itself, which would further reduce latency.

Anyway, if you ever want to test or chat about more complex UG configurations, I’ll be around.


So can you use a Tailscale 100.X.X.X IP address in the Destination of the player? For some reason I can get connections via local IP addresses fine, but testing via Tailscale isn’t working.

Any hints?

Okay, ID10T error. I got Tailscale working.

Now, the ultimately dumb question. How do I get it from this little window into a big window? Full screen?

If you just mean a fullscreen window, you can use a Vulkan window in fullscreen with:

uv -t ndi -d vulkan_sdl2:fs --control-port 0

That’ll pick up the first NDI signal the box detects on the network (starting with localhost) and then draw a fullscreen window on the main display. Pressing “f” on the keyboard will toggle it between bordered and fullscreen.

If you’re sending a signal from another box (it just dawned on me that you must be), then

uv -d vulkan_sdl2:fs -r coreaudio --control-port 0

is probably what you’re after. It uses the Mac’s default Core Audio output for audio and draws the fullscreen window.


Helpful UltraGrid commands. This is an editable wiki; anyone can edit.

For Linux versions, just replace:

/Applications/uv-qt.app/Contents/MacOS/uv

with (assuming your shell is currently in your UltraGrid installation directory)

./uv

Mac Sender
NDI, H.264 422 10-bit, auto bitrate, Opus audio

/Applications/uv-qt.app/Contents/MacOS/uv -t ndi:url=<source ip address>:5961 -c libavcodec:codec=H.264:subsampling=422 -s embedded --audio-codec Opus --control-port 0 <destination IP address>

Mac Sender
NDI, H.264 422 10-bit, 25Mb/s, PCM audio

/Applications/uv-qt.app/Contents/MacOS/uv -t ndi:url=<source ip address>:5961 -c libavcodec:codec=H.264:bitrate=25M:subsampling=422 -s embedded --audio-codec pcm --control-port 0 <destination IP address>

Mac Receiver
Vulkan player, full screen, core audio output

/Applications/uv-qt.app/Contents/MacOS/uv -d vulkan_sdl2:fs -r coreaudio --control-port 0

Mac Receiver
Vulkan player, small player, core audio output

/Applications/uv-qt.app/Contents/MacOS/uv -d vulkan_sdl2 -r coreaudio --control-port 0
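For what it’s worth, here is the fullscreen receiver from above rewritten for Linux per the note at the top (a sketch, assuming ALSA for audio playback; the right audio device can differ per distro):

./uv -d vulkan_sdl2:fs -r alsa --control-port 0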

@cnoellert Are there any advantages to using -d vulkan_sdl2 over -d gl?

IIRC Vulkan can do 10-bit and GL can’t? Vulkan is that hot new shizz.


correct.

In fact, just this weekend I tested, on the scopes, the HDMI output of an Intel NUC12 running Ubuntu with UG set to 10-bit Vulkan out, and it is slightly more color-accurate than the HDMI out of a BMD UltraStudio 4K Thunderbolt, which is supposed to be pro-level accurate.

BMD keeps audio sync better though, and has SDI out too, which some of our installations need.


I’ve uploaded my UG script to the portal, but I’m putting it here too.

ultragrid.zip (46.3 KB)

It’s really nice to be able to convert my side Mac into a “broadcast monitor” in a few seconds.

If you want to check it out, you can open the UltraGrid Setup window and enter your SSH info and commands. I’ve only tested it on Linux in Flame 2025, but it should work for 2023.2 and above. It won’t currently work on Mac because of the way it opens up new terminal windows, and because JumpDesk is pretty good for our Mac stuff.


I am planning to get a bit more elaborate now that we keep adding remote users who work on on-prem workstations.

Meaning I want to build a dedicated encoder box that can handle up to 8 incoming NDI signals and re-encode them to whatever (high-bitrate intraframe H.264 or something, I guess) using UltraGrid. I also want to wrap it in a nice web GUI for management purposes.
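Software-wise, one rough sketch of that is simply one uv process per stream fanned out from the same box. The NDI source names and receiver hosts below are placeholders, and I’m assuming the NDI capture module accepts name= for picking a source:

SOURCES=("FLAME-NDI-1" "FLAME-NDI-2" "FLAME-NDI-3")
RECEIVERS=("rx1.example.com" "rx2.example.com" "rx3.example.com")
for i in "${!SOURCES[@]}"; do
  # one encode per source, each sent to its own remote receiver
  ./uv -t "ndi:name=${SOURCES[$i]}" \
       -c libavcodec:codec=H.264:bitrate=25M:subsampling=422 \
       --control-port 0 "${RECEIVERS[$i]}" &
done
wait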

I wonder what kind of hardware I need to get it to run smoothly. Software encode on a big Threadripper probably scales better than trying to make a bunch of NVENC Quadros work? Dealing with multiple GPUs sounds like a nightmare.

The Mac Studios in my rack don’t have BMD output cards; they are just headless little nodes, so NDI sounds like the ticket.

This would also include one suite in a different city, where I would like to send something high quality like JPEG 2000 at a high bitrate.

Remember that video I sent you a while back showing the ASRock Rack 1U server with the rigged Quadra I put in there? That would do the job.


Also, there’s no J2K in UG except via the paid-for Comprimato plugin.


Also, anything NVENC is going to be limited to YUV, if that matters. Not sure their implementation of H.264 even supports 10-bit 444, but likely.

Software x265 supports native 12-bit RGB 444. What resolutions do you anticipate?
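For reference, a sender variant for software HEVC 444 might look like the line below (a sketch only; it assumes this UG build’s libavcodec module accepts codec=H.265 and subsampling=444, and that bit depth follows the captured signal):

/Applications/uv-qt.app/Contents/MacOS/uv -t ndi:url=<source ip address>:5961 -c libavcodec:codec=H.265:subsampling=444 -s embedded --audio-codec Opus --control-port 0 <destination IP address>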


https://www.reddit.com/r/unRAID/s/sGD0ygGwkp

And on workstation cards, this is unlimited.


Sorry, one more thing too. I’ve never been able to decode NVENC output without using NVDEC; an NVENC stream is all fucked up when decoded in software or with Intel hardware acceleration. So your receiver would also have to have an NVIDIA GPU, which is incompatible with our distribution model.


Super good info, thanks. I will probably play it safe and go with CPU encode. This will mostly be UHD 10-bit 422 @25fps.

Also, FEC is kind of sketchy within UG with H.264/265. They made some improvements recently, but I just skip all that and pass the UDP output of UG into SRT and use that to transmit the signal. Adds a tad of latency, but the stability is great.
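If it helps to picture the wrap, here’s a minimal one-port sketch using srt-live-transmit from the SRT tools (assumptions: UG video is on UDP port 5004, receiver.example.com is a placeholder, the 135ms latency value echoes the 3x RTT figure from earlier in the thread, and the audio stream and exact ports depend on how uv is launched):

# sending side: UG points at 127.0.0.1, the relay calls out over SRT
srt-live-transmit "udp://:5004" "srt://receiver.example.com:9998?latency=135"

# receiving side: the SRT listener unwraps back to local UDP for the UG receiver
srt-live-transmit "srt://:9998?latency=135" "udp://127.0.0.1:5004"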

Do your testing with simulated packet loss.

I’ve never tried to encode NDI. I think it is YUV inherently. H.264 should be no problem on CPU. For UHD 12-bit RGB 444, CPU is hard but can be done. Lots of things for you to test and play with.

BTW, you can probably do all this with that Russian streaming software you use. I forget the name.


Too much latency with Nimble, sadly; can’t really get it below 1s end to end.

Already using SRT with UG; works great, just want to scale it.
