2023.1 Linux micro freezes — still?

I was just told this is still happening. Have others found this to be the case? Getting rid of it was the one reason I was willing to inconvenience myself and stop down for a day to upgrade everything.

Please, please don’t let this still be the case. I consider it a showstopper bug.

Personally, I think it is worth the upgrade for being on Rocky Linux 8.5 alone. It makes the user/admin experience so much better than being on CentOS 7, but that is beside the point in regards to your question…

I have only been using Rocky Linux for compositing, as our finishing systems are still on CentOS, so I can’t fully comment on the micro-freezes at this point. So far I haven’t noticed any, but that doesn’t mean they are not happening.

I still see micro-freezes with 2023.1 on CentOS. I just installed it on a Rocky system and I don’t see the micro-freezes there, but I haven’t used it much yet.

Another note that might be a factor… my CentOS system is an HP Z8 with a P6000 and AJA hardware. The Rocky system is a “gaming” PC with a 2080 graphics card and no I/O hardware.

Also, the Rocky system is not on a network with other Flame machines… The CentOS system is, and I suspect some of the micro-freezes are caused by pings between the Flame systems.

There are a few different factors leading us towards moving to Apple M1 systems. The micro-freezes on Linux are the number one issue.

Thanks for the info, Bryan. Curious if you used 2022.3.1 on either OS, and if you saw the freezes?

I don’t remember the last version that didn’t have some bit of micro-freeze. I think it was 2022.3 where they addressed something that helped, but it didn’t eliminate it completely for me.

Hello group!

There was a fix added to 2022.3.1 last March for one of the issues reported as a micro-freeze. You should all upgrade to this small update if you are still on the 2022.x stream.

As many have written, moving to Rocky Linux 8.5 and the latest Flame Family 2023.1 is not a bad idea.

I’ve been on 2022.3.1 and it still freezes.

Like I wrote, there might be multiple issues related to the micro-freezes, and from what we have seen, the fix added to 2022.3.1 was taking care of most of them. We will need to continue investigating, but for that we need to know there are still issues, so make sure to open Support tickets so we can dig deeper into the root cause.

For me, the bigger my setups get, the worse it gets.

Yep. Same here.

Same here on 23.0.1 on Rocky, but not just with setups: as the number of sequences increases, the time in between the freezes decreases.

Same. I see it most in Batch, but also in reels (2022.3.1, CentOS).

I’ve given up and now accept that this is the new reality.

Something got broken between 2022 and 2022.1 and they still haven’t fixed it all the way. Even on 2022.3.1 I’m still getting 6-10 second pauses once every few minutes. It’s maddening.

That is why I am skeptical that the OS is the main source of the problem. I didn’t change the OS between the non-freezing Flame version and the suddenly freezing Flame version.

I agree 100%. The freezes weren’t there in 2021.x.x. My uninformed, idiot artist-level superstition suspects the switch of all the services from /etc/init.d over to systemctl (because that happened around then), but like I said: idiot artist-level.

Just curious: is everybody who’s experiencing this also reporting it? Even with screen recording et al., it’s really hard to nail down, as it seems to be quite random. I haven’t had it this week, for instance, but I didn’t do big batches this week (2023.1). On an earlier project I suffered quite a few… really weird.
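Since it’s so hard to catch on demand, one cheap way to at least timestamp the freezes is to leave a little stall detector running next to Flame. This is just a sketch of the idea in Python (the interval and threshold values are arbitrary choices of mine, nothing official from Autodesk): it sleeps in a tight loop and logs whenever the wakeup overshoots badly.

```python
#!/usr/bin/env python3
"""Rough stall detector: logs whenever a short sleep overshoots badly.

Run alongside Flame. Interval/threshold values are arbitrary; tune to taste.
"""
import time
from datetime import datetime

INTERVAL = 0.1    # seconds between samples
THRESHOLD = 0.5   # log any wakeup that overshoots by more than this

while True:
    start = time.monotonic()
    time.sleep(INTERVAL)
    overshoot = time.monotonic() - start - INTERVAL
    if overshoot > THRESHOLD:
        print(f"{datetime.now().isoformat()} stalled ~{overshoot:.2f}s")
```

If a freeze is system-wide, this process gets delayed too and you get a timestamp to line up with your screen recording or put in a support ticket; if the log stays quiet while the UI hangs, that suggests the stall is confined to the Flame process itself, which is useful information on its own.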

I reported it initially, but quite frankly, life is too short to keep reporting the same bug.
There’s enough of us saying it is an ongoing (and serious, IMO) problem that AD should put this at the top of the list.

I reported it and was unable to repro it with Beau looking over my shoulder via TeamViewer, but then it came right back on the next project.

I agree that this should be the absolute number one priority for ADSK resources right now.
