Denoise not rendering

Denoising a clip in Flame 2023:
rendering in batch is just black. It just doesn’t want to do it.
Switching to Background Reactor worked.

thoughts?

I haven’t seen that, but my main thought is: spend $150 on Neat Video and you’ll get far superior results, and faster renders…


Well, duh!

This was just a quickie thing, and then I noticed the black frames, which was the bigger surprise.

I guess it cleaned it too well. 🙂

Even when I rebooted I got the same results, and on multiple clips.

Throwing a Mux node before it might help too. Sometimes that helps when Paint craps out.


The Mux node trick has led me to always put a Mux before/after a Paint node. I’ve had a lot fewer issues since getting into this habit. Might just be coincidence, though.
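If you ever want to script the habit instead of wiring it by hand, here’s a minimal sketch using Flame’s Python batch API. The 'Mux' node type and the 'Default' socket names are assumptions from my own setups — check them against your version in the Python console.

```python
# Minimal sketch, assuming Flame's Python batch API.
# Node type ('Mux') and socket names ('Default') may differ per Flame version.
import flame

def buffer_with_mux(paint_node):
    """Create a Mux node and wire it in upstream of the given Paint node."""
    mux = flame.batch.create_node('Mux')
    # Park the Mux just left of the Paint node in the schematic.
    mux.pos_x = paint_node.pos_x - 300
    mux.pos_y = paint_node.pos_y
    # Socket names are an assumption -- adjust to your node's actual sockets.
    flame.batch.connect_nodes(mux, 'Default', paint_node, 'Default')
    return mux
```

The same idea applies for hanging a Mux off the output side.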


Every time without fail, brother!


The problem is not like the Paint node issue.
In fact, since this post I’ve done a clean install of Flame, as I have migrated to Rocky Linux and am now on 2023.0.1 instead of 2023.
I started a new project and a new user with a fresh clip, and the same problem occurred. It will analyze fine and spot-check fine, but if I go to render the sequence, it blacks out after the first frame.
I just don’t know where the problem originates.

Oh, and I have also tried various Nvidia drivers.

I have gone ahead and ordered an RTX A5000 to see if that just flat-out squashes the problem, and then I’ll gift that card to my son… stand by.

Hmm. What hardware are you running, and how old is it? I was under the impression that Denoise uses CPU resources, so the likelihood that this is a GPU problem is… umm… unlikely. I think. Maybe.

Ryzen 5950X

Gigabyte X570S Aero G motherboard

128 GB RAM

I’ll keep ya posted.

Hmm. Has this system always had this Denoise problem, or is it a new problem?

I wish I knew. Unfortunately, I can’t recall. The system is fairly new and I haven’t played around with it much; I use it just for tinkering.

I get the new card Wednesday. I’ll run some tests and then go from there.

If it’s the processor, I don’t know how to fully test that, as it seems to pass stress tests and works quite well when I run CPU benchmarks on it.
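For what it’s worth, a quick way to load every core from Python — purely a hand-rolled sanity check, not a substitute for stress-ng or Prime95:

```python
# Hand-rolled all-core burn loop -- a sanity check, not a real stress test.
import multiprocessing as mp
import time

def burn(seconds):
    """Spin on floating-point math until the deadline passes."""
    deadline = time.time() + seconds
    x = 1.0001
    while time.time() < deadline:
        x = x * x % 1e9 + 1.0001  # meaningless math, just keeps the core busy
    return x

if __name__ == '__main__':
    workers = mp.cpu_count()  # 32 threads on a 5950X
    with mp.Pool(workers) as pool:
        start = time.time()
        pool.map(burn, [60] * workers)  # one worker per thread, 60 s each
    print(f'{workers} workers finished in {time.time() - start:.1f} s')
```

Watch temps and clocks while it runs; if that passes but Denoise still blacks out, the CPU probably isn’t the culprit.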

Anyway, I’ll keep you updated.

Thanks!

So…
New card.
Problem gone. BUT I agree that it shouldn’t necessarily be the case.
Maybe I just tricked myself into buying a new card, and one that is certified and plays nicely with the DKU and such.

I ran a few Denoise tests on several clips and they all worked as expected, but it’s really hard to say why.


Wait… what GPU was in there? Autodesk is pretty clear that Nvidia cards are required.

It was still Nvidia.

It was an ASUS RTX 3080 with 12 GB of memory.

Ah. I should have been much more precise. They really want the Quadro series, or the RTX 6000/8000, or the RTX A4000, A4500, A5000, A5500, or A6000.

Yes, totally, which is why I just bit the bullet and did it.

But so many seem to work just fine with gaming cards. At the end of the day, though, it wasn’t worth the headache.

There is something nice about it just working.

That being said, one thing I did notice — and maybe I’ll start a new thread for it:

I ran the Blender GPU benchmark on it in Linux vs. Windows, and in Windows the card had a 2x higher score.

I also ran the same GPU test on Linux on an RTX 6000, and it performed about the same as my A5000 did in Linux.

I’m not sure why that is.
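One thing worth ruling out before blaming the driver outright: compare what power limit and clocks the card reports under each OS. A quick sketch — it assumes nvidia-smi is on the PATH on both systems; the query fields are standard nvidia-smi ones:

```python
# Dump driver version, power limit, and clocks for a Linux-vs-Windows diff.
# Assumes nvidia-smi is on the PATH.
import subprocess

fields = 'name,driver_version,persistence_mode,power.limit,clocks.sm,clocks.mem'
result = subprocess.run(
    ['nvidia-smi', f'--query-gpu={fields}', '--format=csv'],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # run on both OSes and compare the numbers
```

If Windows reports a noticeably higher power limit or boost clock, that alone could explain a chunk of the gap.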

I’ll post the benchmark link, and maybe some others can chime in.

Agreed!!!

Based on the benchmarks I’ve seen, an RTX 6000 is about the same as an RTX A5000 in what we typically do. Of course, the original RTX 6000 MSRP was what… $6,300? And the A5000 was what, $3,500-ish?