Flame Linux with RTX 4080 Super 16GB?

Is anyone successfully running Flame with an RTX 4080 Super 16GB GPU?
Does that work? Any issues?
Intel i9 chip.

This is a somewhat loaded question. Many people will say they have had success and experienced zero issues running non-supported graphics cards. To be clear, RTX gaming cards like the 3070/3080/3090 or 4070/4080/4090 may work, but they are absolutely not supported. As reported publicly, and reiterated privately to us in person at NAB 2024 (which just wrapped yesterday), these gaming cards are problematic and you will experience issues. Some may YOLO it and run them anyway, like @finnjaeger, but many will not.

So, the decision is up to you, but if it were me, I’d stick to officially supported cards on the Autodesk Flame System Requirements page, linked here.


I’m with Randy on this. I have much respect for the folks who’ve gotten gamer card machines to run—I even tried to do it years ago—but you need to be both smart and tolerant. It was too much Linux-knowing for me.

Given that the machine is key to being able to charge money, it’s not the place to try and save a few grand, but if you wanna build one for fun, go for it.


To put a fine point on it… ADSK support won't touch it with a 10-foot pole.

Related… check out this tank of a PC: AMD Threadripper + 2x RTX 3090.
If you could get it working… having a whole other 3090 to run Background Reactor on would be siiiiiick.

Could you elaborate on the issues people will experience?

Thank you for the feedback.

I’ve run gaming cards on a couple of Flame home setups.

It definitely crashes a lot more, and certain things such as the ML toolset can take longer and be a bit buggy. Apart from that it worked well enough for me. I didn't have too much trouble getting it up and running either.

That being said, I wouldn’t spec anything but a pro card (NVIDIA have dropped the Quadro moniker I believe) for our workstations at our facility. My home setups have mainly been used to build setups that I would then render back at work.

I do not have first-hand experience because I've got far too much going on to run a science club experiment in this department. When this exact discussion came up at our NAB event a few days ago, specifics weren't mentioned, but it was clear that this is a dangerous place.

I had the same takeaway from that discussion. No specifics, though it sounded more like image quality problems, not just occasional reliability problems. So they may be harder to catch or work around.

Generally just not worth it.

The real question is why?
The cheapest 4090 on B&H is $1700.

A workstation-class card of the same generation with equivalent VRAM (RTX 4500 Ada) is $2,999.

Let's say you amortize the cost over a 3-year useful lifespan. That comes out to $1.55 (4090) and $2.74 (RTX 4500) per day of ownership. It just isn't worth it to try and save roughly $1.19 per day to have ANY amount of hassle. My time and productivity are exponentially more valuable. If you think you can't afford the RTX 4500, skip Starbucks once a week.
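For anyone who wants to sanity-check the math, here's the amortization as a few lines of Python. Only the prices quoted above and the 3-year lifespan go in; nothing else is assumed:

```python
# Per-day cost of each card, amortized over a 3-year useful lifespan.
DAYS = 3 * 365  # 1095 days of ownership

cards = {"RTX 4090": 1700, "RTX 4500 Ada": 2999}
for name, price in cards.items():
    print(f"{name}: ${price / DAYS:.2f}/day")

# The daily delta between the two cards:
print(f"delta: ${(cards['RTX 4500 Ada'] - cards['RTX 4090']) / DAYS:.2f}/day")
```

Which works out to about $1.55/day vs $2.74/day, a delta of roughly $1.19/day.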


Understood but I don’t feel the equation is that simple.

I haven't tested an RTX 4500 Ada personally, but extrapolating from my experience with my previous-gen A5000, the 3090 was significantly faster. In fact, a 3080 was faster than the A5000 (not that I would recommend a 3080 for real work, because of its VRAM). Surely this should factor into the equation for time, productivity and quality of work.

Availability of cards is also somewhat of an issue for people in smaller parts of the world, like me. If a 4090 goes down for whatever reason, I can very easily get a replacement. Within 40 minutes I can buy a 4090 from the shop and install it; 15 minutes if it's already on prem. For me specifically, this also factors into the time and productivity equation.

Anecdotally, for what it's worth: my Threadripper Pro system, with an A5000, has had serious issues over the past 3 weeks. This is not with Flame, but I have lost many hours with a fully certified system. So it's not a magic bullet for avoiding downtime or problems generally.

While resolving these issues, I have been temporarily working on my desktop Ryzen system with a 4090 inside it. This is mostly an Unreal Engine box, but I have used it on and off for Flame when needed. I have not encountered any issues or anomalous behavior regarding rendering, image quality or stability. It also works out of the box these days; there is no additional setup required, nor any use of non-DKU drivers.

So I find it very difficult to make an assessment. In my mind it's a similar risk/reward proposition to buying hardware off of eBay.

I'm not advocating that people go out and buy RTX over an A-class card.
All I advocate for is transparency regarding the specific risks of using gamer vs non-gamer cards, so that we can make informed decisions.


I replaced an A6000 with a 3090, as the A6000 kept crashing when rendering heavy scenes in Blender…

You can get a bad chip no matter what; NVIDIA support was garbage, blaming the rest of the system (HP Z840)…

I trust them when they say there are issues, sure.

But I like my frankenmachines; they run well, and I just have completely spare machines…

What I don't understand is how the cards are actually different, or what causes what exactly. Personally I haven't seen weird stuff, but then I mostly use a Mac nowadays, and for some reason a 5K iMac GPU was fine but an RTX 4090 isn't.


Not a direct example because it's from a previous generation: 2080 vs Quadro RTX 6000. Same-generation cards, one gaming and one workstation, same chips as far as I know. But the Quadro card had two copy engines enabled, while the 2080 only had one turned on. So there can be meaningful differences for the software.

And then there is the driver.

What I have learned in the past is that GeForce and AMD (Mac) pixels are slightly different when compared with one another, but as good as each other, while Quadro has better-looking pixels than both. These differences are usually noticeable in Paint when dragging pixels. What I mean by better-looking pixels is hard to explain, but "cleaner" is one way to describe it. If you want to test this yourself, simply create a batch setup with a Paint node dragging on a live-action frame, then reload it on your different graphics hardware setups and compare the renders.

Recently I compared renders from a GeForce RTX 4090 laptop to a Quadro RTX 6000, and they looked identical, but the AMD Radeon Pro in my Intel Mac differed; not as clean. Flame on the 4090 laptop was not stable enough to drive, though, and kept crashing.

So yes I agree Quadro is the best way to go.


When you do these comparisons, do you have a way of doing a blind A/B? It’s sometimes hard to remove our mind from the comparison.

On the audio side of things, Nugen built an A/B test tool where you can feed it two tracks and swap between A and B, but it doesn't tell you which is which; the assignment gets randomized. After you're done, it tells you that you picked A 75% of the time, and which track A was.

Not sure if there’s a picture equivalent tool.

difference matte
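For the record, a difference matte is just the absolute per-pixel difference between the two renders, usually gained up so deviations far below the visible threshold show on a monitor. A minimal NumPy sketch (the tiny arrays here are stand-ins for the two GPU renders):

```python
import numpy as np

def difference_matte(a: np.ndarray, b: np.ndarray, gain: float = 50.0) -> np.ndarray:
    """Absolute per-pixel difference between two renders, amplified by `gain`
    so sub-visible deviations become obvious."""
    diff = np.abs(a.astype(np.float64) - b.astype(np.float64))
    return np.clip(diff * gain, 0.0, 1.0)

# Two synthetic "renders": identical except one pixel off by 0.001,
# a deviation the eye would never catch unamplified.
a = np.zeros((4, 4))
b = a.copy()
b[0, 0] = 0.001

matte = difference_matte(a, b)
print(matte.max())           # nonzero after gain, so the renders differ
print(np.array_equal(a, b))  # False
```

A max of exactly 0.0 means the two renders are bit-identical; anything else means they diverge somewhere.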

Of course. But that just shows whether they're identical or not. If not identical, how did you decide which one is "better looking" without imparting confirmation bias?


My trained eye tells me that the better one is cleaner and the other looks muddier and darker… Quadro is always cleaner… you would know when you see better-quality pixel drag… well, I definitely do. How are you conducting your test? What are you comparing?

Yes. The trained eye is reliable if we’re comparing something where the difference is in the 95th percentile (i.e. plain obvious). But at some point, when you get into the 99th percentile, they look almost the same, but aren’t the same. That’s where confirmation bias can come in, where you expect the Quadro version to look better than the gamer version. And so you might subconsciously find that you see a positive difference on the Quadro version.

Using a blind A/B test is meant to remove the confirmation bias, by you not knowing which version you're looking at when you decide which one you like more. You pick a version 10 times from a randomized sample (meaning the tool will randomly show one or the other), look at the distribution of your picks, and then reveal which one is actually the better one, not the one you expect to be the better one.

Now in the test you did, it may have been plain obvious, so then none of this applies.

Here’s the audio plugin I was referencing: https://youtu.be/lhb0zU5HBZc?si=HrNA8JV5E9iRWHJf&t=67

The same approach could be used for a comparison of two renders of the same shot.
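The protocol itself is only a few lines. This is a sketch, not a real tool: `pick_fn` stands in for a human viewer looking at two unlabeled images, and the render names are hypothetical:

```python
import random

def blind_ab_test(item_x: str, item_y: str, pick_fn, trials: int = 10, seed=None):
    """Blind A/B: each trial randomly assigns the two items to the labels
    'A' and 'B', asks pick_fn which label is preferred, and tallies how
    often each underlying item was actually chosen."""
    rng = random.Random(seed)
    wins = {item_x: 0, item_y: 0}
    for _ in range(trials):
        # Shuffle which item appears as "A" this trial.
        a, b = (item_x, item_y) if rng.random() < 0.5 else (item_y, item_x)
        choice = pick_fn(a, b)  # the viewer only ever sees the labels A/B
        wins[a if choice == "A" else b] += 1
    return wins

# Hypothetical viewer who genuinely prefers one render every time;
# in practice this is a person clicking through unlabeled images.
prefers_quadro = lambda a, b: "A" if a == "quadro_render" else "B"

print(blind_ab_test("quadro_render", "geforce_render", prefers_quadro, trials=10))
# → {'quadro_render': 10, 'geforce_render': 0}
```

A real preference shows up as a lopsided tally like the one above; picks hovering around 50/50 mean the viewer can't actually distinguish the two renders.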

I think you're overcomplicating it… with my 30 years of experience as a Flame artist I can competently tell the difference and tell which is better, and I'm sure, reading other posts from the likes of Alan and Randy, they would be able to see it as well… sorry, I don't have time to take this any further with you, thanks. Believe what you want.