Fake a Broadcast Monitor for Streaming

Is there any way to have flame “think” you’ve got a broadcast monitor hooked up so you can pipe that into the video stream versus the whole UI?

When you say “broadcast monitor” are you talking about an AJA/BMD SDI out, or a 2nd monitor that you choose to have picture on?

Welcome to Frustrationville. To keep it as simple as possible, I’d recommend a Blackmagic Mini Monitor and either an Atem Mini or a Blackmagic Web Presenter.

I’m not even sure, since I’ve never bothered to hook one up on my home rig.

But the idea is a second monitor that only shows the picture coming out of flame, with none of the UI.

The thing that’s controlled by the “broadcast monitor” pane.

I’ve been using NDI and OBS for client sessions for a few months now. You need a piece of hardware that converts HDMI to NDI. I use a second machine (Mac or Windows) to run it through and send it out to the internet. Not sure if you can do it all on your flame machine.

With NDI you can monitor the video feed anywhere on your network or send it out to the wider web.

My setup:

  • Second monitor out of flame going to Magewell Pro (HDMI to NDI converter)

  • Windows Desktop with NDI Tools and OBS connecting to Zoom.

  • OBS Scenes are set up to show my face (webcam) or flame broadcast out or both. OBS also controls audio (whether I want clients to hear my mic, program audio, desktop audio, or any combination of those)

  • On Windows you need a free plugin for OBS called OBS-VirtualCam that makes OBS output a webcam (for Zoom to see)

  • Zoom launches and sees OBS as a web camera. Ready to go for Zoom meetings.

  • Video and audio playback maintain 24 fps. Clients have commented on how smooth it is compared to other Zoom meetings they’ve been on.

  • We are not doing absolute color critical work but the color has been good for client review.

  • Make sure in Zoom video settings you are not sending out a mirrored picture.

  • This is not ‘screen-sharing’ on Zoom, so you have to tell the clients to pin your picture in order to get it full screen on their computer.

  • Also - bonus - I’m using my Nikon DSLR as my webcam. There is a free driver for this called “Nikon Stream Camera”. Clients can’t believe my beautiful depth of field on a webcam.

It seems like we’ll be doing this work-from-home thing for now, with no plans to rush back into the office.

You can do a dual monitor setup on the flame, and then with RGS you’ll get two monitors at home. Then inside flame you can make one of those your video playback. Audio will run through RGS.

Don’t know what you’re using for remote, though. Otherwise you can always use UltraGrid on a separate box and stream it to your home as a high-quality/low-latency option; audio and video will both run through UV, which can then be thrown to a home TV or second desktop monitor.
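
(For anyone curious, a rough, untested sketch of what that UltraGrid setup can look like; the hostname, capture device, and codec options below are just placeholders and will depend on your build and hardware.)

```
# Sender (facility box): capture from a DeckLink card and compress to H.264.
# "home-machine.example.com" and the DeckLink capture are placeholders.
uv -t decklink -c libavcodec:codec=H.264 home-machine.example.com

# Receiver (home box): decode and display via OpenGL on the attached monitor/TV.
uv -d gl
```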

Depends on what you want to do I guess and what hardware you have at your disposal.

Just read your post again; sounds like you just need to hook up a second monitor to your home rig and enable it for broadcast monitoring. When you’re not using it for that you can throw scopes on it or the library. Handy if you’re into that.

This is what I was doing before buying @randy’s UltraStudio Mini box. Worked fine-ish. It also functions as a screen you can share to Zoom, though framerate, color, and resolution are no bueno for doing remote client sessions. (Which I learned the hard way.)

(He’s on a box at home, no PCoIP/RGS.)

You’re pretty much talking about my home rig setup for streaming. My second monitor feeds into another PC that I use to stream to YouTube for sessions.

Yeah man, I don’t know how I read that post so wrong :joy: whatever… I totally got it into my mind he needed broadcast to his house.

It’s funny how contextual these things are sometimes, isn’t it?

The irony being that the bit I thought irrelevant and edited out was really the most important.

For our broadcast streaming I actually took an old z800 and stuck a handful of old Decklink capture cards in it. I capture and transcode using bmdtools piped into ffmpeg to generate an RTMP stream per incoming SDI signal. Each RTMP stream is pushed to its own password-protected webpage using Red5 Pro and an NGINX backend.
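
In rough terms, each SDI input gets a pipeline along these lines (an untested sketch; the card index, capture mode, encoder settings, and RTMP URL are placeholders, and the exact bmdcapture flags can vary by version):

```
# Capture SDI input 0 with bmdcapture (bmdtools) and pipe the NUT stream into
# ffmpeg, which encodes to H.264/AAC and pushes an RTMP stream.
# Card index, mode number, bitrates, and the RTMP URL are placeholders.
bmdcapture -C 0 -m 11 -A 2 -V 4 -F nut -f pipe:1 | \
  ffmpeg -i pipe:0 \
         -c:v libx264 -preset veryfast -tune zerolatency -b:v 6M \
         -c:a aac -b:a 192k \
         -f flv rtmp://localhost/live/sdi0
```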

Took me a little time to work it all out, but it’s fairly bulletproof, stupid low latency, and will play on anything, including an Apple TV over AirPlay.

Nice! I battle with @andy_dill’s situation DAILY. It’s perhaps the most frustrating part of my world. It’s tough to be Zoom/Slack bombed by producers when you need to show them stuff and have a decent 1920x1080 stream with decent audio and decent color. So far the best I’ve cobbled together is an Atem Mini solution…but it gets complicated real quick… Congrats! Your nice USB microphone is obsolete, you shoulda bought an XLR. Congrats! You need a few SDI to HDMIs and HDMIs to SDIs, and congrats! Now you can’t monitor your Atem Mini output, so here’s an HDMI extractor.

SHOUTING LOUDLY TO MY SON: “YOU NEED TO KNOW ABOUT COLOR CORRECTION SON!!!” :smiley:

Why not capture directly via FFMPEG? What codec are you using for compression?

It’s been a while now, but initially on 18.04 I was seeing more robust behavior piping through bmdcapture when capturing from more than one card than capturing directly in ffmpeg. Admittedly I haven’t bothered to investigate further since I got what I needed. I’m using libx264.
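
(For comparison, direct capture would use ffmpeg’s decklink input device, roughly like the untested sketch below; it needs an ffmpeg build with --enable-decklink, and the device name, encoder settings, and output URL are placeholders.)

```
# Capture straight from a DeckLink input in ffmpeg, no bmdcapture in between.
# Device name, encoder settings, and the RTMP URL are placeholders.
ffmpeg -f decklink -i 'DeckLink Mini Recorder' \
       -c:v libx264 -preset veryfast -tune zerolatency -b:v 6M \
       -c:a aac -b:a 192k \
       -f flv rtmp://localhost/live/sdi0
```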

Somewhat related, but have you gotten FFmpeg to compile against Blackmagic 11.5 or higher, @ALan?
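
(For context, I mean the usual recipe of pointing ffmpeg’s configure at the DeckLink SDK headers, something like the untested sketch below; the SDK path is just a placeholder.)

```
# Untested sketch: configure ffmpeg with DeckLink support by pointing it at a
# local copy of the Blackmagic DeckLink SDK headers (path is a placeholder).
./configure --enable-decklink \
            --extra-cflags="-I$HOME/BlackmagicSDK/Linux/include"
make -j"$(nproc)"
```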

I don’t use FFMPEG.

My bad. Sorry for the noise man