Flame Machine Learning Timewarp, now on Linux and Mac

Oh silly me!

I’ll fix that ASAP, thanks for pointing it out!
Planning to install Rocky and the latest version of Flame tomorrow, so there might be some fixes for that as well.

1 Like

Hi guys, there’s a v0.4.4 in releases - it has been tested on Rocky Linux with Flame 2023.0.1 and an RTX 3060

Due to the increased size of the PyTorch library it won’t be practical to bundle it with the file, so Python dependencies will be downloaded during install. It does require an internet connection. It should still be possible to download them on another machine and install manually from the file system, but I haven’t had time to check it.
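
If you need to do it on an offline machine, something along these lines should work (an untested sketch - it assumes pip is available and that the Python dependencies are listed in a requirements.txt, so adjust the names to whatever the release actually ships):

# on a machine with internet access, fetch the wheels listed in requirements.txt
python3 -m pip download -r requirements.txt -d ./twml_wheels
# copy the twml_wheels folder to the offline workstation, then install from it
python3 -m pip install --no-index --find-links ./twml_wheels -r requirements.txt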

7 Likes

Just for the record, the current 0.4.3 works on my Rocky system.

1 Like

Hi Talosh,

Thanks so much for this. Are the centralized install steps still the same as before?

Thanks,
Alan

TWML doesn’t like super whites/sub blacks and can sometimes give weird artifacts. Convert to log, run TWML, convert back to ACEScg. See if that fixes it.

Ah bummer.

Not that this solves it, but an interesting case study would be to downres (and convert to log) and see if TWML gives different results. I’m thinking the filtering might kill that pattern, but if it remains, it points more towards TWML introducing something.

I’ve also seen this kind of pattern get introduced in ProRes 4444 in certain cases. It’s faint and you have to zoom in to see it. Running that through TWML might magnify it.

Perhaps try to enhance the pattern in Batch to see how much is picture vs. viewport related. Crunching the contrast, exposure, etc. might show it more. Perhaps try edge detect or frequency separation. If these don’t make it any more apparent, it might be viewport related, but if they do enhance it, it points towards the data being burned into the picture.

Hey all - I’m having a lot of trouble using ML timewarp on my Mac.

Mac Pro (2019) Catalina
16-Core 3.2GHz Intel Xeon W
240GB RAM (Apple 4x8GB, OWC 2x8GB, OWC 6x32GB)
AMD Radeon Pro Vega II

I tried this a few months ago and couldn’t get it working. I’ve since upgraded to 2022.3.1 and I’m having the same issue: I never get a rendered clip (or am I just not seeing it?).

I did a test exporting colorbars to my desktop - this is what the shell says:

[flameTimewarpML] creating folders: /Users/wren/Desktop/SMPTE_100_TWML2_2022JUL15_1700_394/source
[flameTimewarpML] Importing result from: /Users/wren/Desktop/SMPTE_100_TWML2_2022JUL15_1700_394
[flameTimewarpML] Cleaning up temporary files used: ['/Users/wren/Desktop/SMPTE_100_TWML2_2022JUL15_1700_394/source']
[flameTimewarpML] Executing command: rm -f "/Users/wren/Desktop/SMPTE_100_TWML2_2022JUL15_1700_394/source/"*

There is an empty folder on my desktop with the above name, and no rendered clip anywhere I can find.
@randy @talosh @Jeff

I’m speeding up my clip. The option I’m selecting from the pulldown is “timewarp from Flame’s TW Effect (beta)” with Analyze Full Res and Model v2.4.

I installed flameTimewarpML v0.4.4

Thanks,
Renee

Hmm. First things first. Can you manually export color bars to the same path? Rule out a permissions problem first.
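
If you want to double-check the folder from a terminal as well, something like this (with a hypothetical test file name) should succeed silently if the Desktop is writable - though bear in mind Flame can still be blocked by macOS privacy permissions even if the shell test passes:

# create and remove a throwaway file to confirm write access
touch "/Users/wren/Desktop/twml_write_test" && rm "/Users/wren/Desktop/twml_write_test"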

Hi! Yes - I thought permissions too but it’s making folders. I can see it exporting the frames too (the popup window shows that progress bar).

@hildebrandtbernd (in Discord) said “Looks like it’s skipping most of the script. So it could also be the permissions for the script itself, not the export location.”

Oh wait. That version you just installed, 0.4.4, isn’t for Mac. Go back one version to 0.4.3.

2 Likes

I cleaned up the old version in Python and my home directory, downloaded 0.4.3 for Mac, and launched Flame. I clicked the dialog box to let it download into my home directory, and a secondary terminal window popped up with this:

wren@bigmaclargefries ~ % tail -f /var/tmp/flameTimewarpML_install.log; exit

[flameTimewarpML] bundle_id: ea47ff7a566cc2c2dbbf10c48594a29effe9e648 size 621Mb

[flameTimewarpML] creating new bundle folder: /Users/wren/Documents/bundle

[flameTimewarpML] unpacking payload: /Users/wren/Documents/flameTimewarpML.v0.4.3.bundle.tar

[flameTimewarpML] Executing command: tar xf "/Users/wren/Documents/flameTimewarpML.v0.4.3.bundle.tar" -C "/Users/wren/Documents/"

[flameTimewarpML] exit status 0

[flameTimewarpML] bundle extracted to /Users/wren/Documents/bundle

[flameTimewarpML] extracting bundle took 3.4 sec

[flameTimewarpML] installing Miniconda3…

[flameTimewarpML] installing into /Users/wren/Documents/miniconda3

[flameTimewarpML] Executing command: /bin/sh "/Users/wren/Documents/bundle/miniconda.package/Miniconda3-latest-MacOSX-x86_64.sh" -b -p "/Users/wren/Documents/miniconda3" 2>&1 | tee > /Users/wren/Documents/miniconda_install.log

No activity. This looks like it thinks it installed, but I have no option for ML Timewarp in my pulldown. It’s been 45 minutes.

And thanks to @Brooks I made sure all the security things were checked before installing.

@talosh @Brooks This is just not finishing the install - here are the log files:

PREFIX=/Users/wren/Documents/flameTimewarpML/miniconda3
Unpacking payload …

0%| | 0/34 [00:00<?, ?it/s]
Extracting : libffi-3.3-hb1e8313_2.conda: 0%| | 0/34 [00:04<?, ?it/s]
Extracting : libffi-3.3-hb1e8313_2.conda: 3%|▎ | 1/34 [00:04<02:17, 4.17s/it]
Extracting : chardet-3.0.4-py38hecd8cb5_1003.conda: 3%|▎ | 1/34 [00:04<02:17, 4.17s/it]
Extracting : chardet-3.0.4-py38hecd8cb5_1003.conda: 6%|▌ | 2/34 [00:04<01:36, 3.00s/it]
Extracting : ruamel_yaml-0.15.87-py38haf1e3a3_1.conda: 6%|▌ | 2/34 [00:04<01:36, 3.00s/it]
Extracting : libcxx-10.0.0-1.conda: 9%|▉ | 3/34 [00:04<01:33, 3.00s/it]
Extracting : libcxx-10.0.0-1.conda: 12%|█▏ | 4/34 [00:04<01:04, 2.14s/it]
Extracting : pysocks-1.7.1-py38_1.conda: 12%|█▏ | 4/34 [00:04<01:04, 2.14s/it]
Extracting : setuptools-50.3.1-py38hecd8cb5_1.conda: 15%|█▍ | 5/34 [00:04<01:01, 2.14s/it]
Extracting : conda-package-handling-1.7.2-py38h22f3db7_0.conda: 18%|█▊ | 6/34 [00:04<00:59, 2.14s/it]
Extracting : openssl-1.1.1h-haf1e3a3_0.conda: 21%|██ | 7/34 [00:04<00:57, 2.14s/it]
Extracting : urllib3-1.25.11-py_0.conda: 24%|██▎ | 8/34 [00:04<00:55, 2.14s/it]
Extracting : ca-certificates-2020.10.14-0.conda: 26%|██▋ | 9/34 [00:04<00:53, 2.14s/it]
Extracting : six-1.15.0-py38hecd8cb5_0.conda: 29%|██▉ | 10/34 [00:04<00:51, 2.14s/it]
Extracting : zlib-1.2.11-h1de35cc_3.conda: 32%|███▏ | 11/34 [00:04<00:49, 2.14s/it]
Extracting : pycosat-0.6.3-py38h1de35cc_1.conda: 35%|███▌ | 12/34 [00:04<00:46, 2.14s/it]
Extracting : conda-4.9.2-py38hecd8cb5_0.conda: 38%|███▊ | 13/34 [00:04<00:44, 2.14s/it]
Extracting : conda-4.9.2-py38hecd8cb5_0.conda: 41%|████ | 14/34 [00:04<00:30, 1.50s/it]
Extracting : libedit-3.1.20191231-h1de35cc_1.conda: 41%|████ | 14/34 [00:04<00:30, 1.50s/it]
Extracting : readline-8.0-h1de35cc_0.conda: 44%|████▍ | 15/34 [00:04<00:28, 1.50s/it]
Extracting : cryptography-3.2.1-py38hbcfaee0_1.conda: 47%|████▋ | 16/34 [00:04<00:27, 1.50s/it]
Extracting : sqlite-3.33.0-hffcf06c_0.conda: 50%|█████ | 17/34 [00:04<00:25, 1.50s/it]
Extracting : requests-2.24.0-py_0.conda: 53%|█████▎ | 18/34 [00:04<00:24, 1.50s/it]
Extracting : brotlipy-0.7.0-py38h9ed2024_1003.conda: 56%|█████▌ | 19/34 [00:04<00:22, 1.50s/it]
Extracting : pyopenssl-19.1.0-pyhd3eb1b0_1.conda: 59%|█████▉ | 20/34 [00:04<00:21, 1.50s/it]
Extracting : pycparser-2.20-py_2.conda: 62%|██████▏ | 21/34 [00:04<00:19, 1.50s/it]
Extracting : cffi-1.14.3-py38h2125817_2.conda: 65%|██████▍ | 22/34 [00:04<00:18, 1.50s/it]
Extracting : python.app-2-py38_10.conda: 68%|██████▊ | 23/34 [00:04<00:16, 1.50s/it]
Extracting : pip-20.2.4-py38hecd8cb5_0.conda: 71%|███████ | 24/34 [00:04<00:15, 1.50s/it]
Extracting : certifi-2020.6.20-pyhd3eb1b0_3.conda: 74%|███████▎ | 25/34 [00:04<00:13, 1.50s/it]
Extracting : tk-8.6.10-hb0a8c7a_0.conda: 76%|███████▋ | 26/34 [00:04<00:12, 1.50s/it]
Extracting : wheel-0.35.1-pyhd3eb1b0_0.conda: 79%|███████▉ | 27/34 [00:04<00:10, 1.50s/it]
Extracting : idna-2.10-py_0.conda: 82%|████████▏ | 28/34 [00:04<00:09, 1.50s/it]
Extracting : tqdm-4.51.0-pyhd3eb1b0_0.conda: 85%|████████▌ | 29/34 [00:04<00:07, 1.50s/it]
Extracting : ncurses-6.2-h0a44026_1.conda: 88%|████████▊ | 30/34 [00:05<00:06, 1.50s/it]
Extracting : ncurses-6.2-h0a44026_1.conda: 91%|█████████ | 31/34 [00:05<00:03, 1.06s/it]
Extracting : xz-5.2.5-h1de35cc_0.conda: 91%|█████████ | 31/34 [00:05<00:03, 1.06s/it]
Extracting : python-3.8.5-h26836e1_1.conda: 94%|█████████▍| 32/34 [00:05<00:02, 1.06s/it]
Extracting : yaml-0.2.5-haf1e3a3_0.conda: 97%|█████████▋| 33/34 [00:05<00:01, 1.06s/it]

OK, after much fussing, I got this installed manually.

The easy install method did not unpack the entire file; the logs show it just stopped partway through for no apparent reason.

Ran into two gotchas - I had to open the permissions on the lock subdirectory, then had to set an environment variable for an OpenEXR issue (security vulnerability with OpenEXR: “OpenEXR vulnerabilities” · Issue #21326 · opencv/opencv · GitHub).
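
For anyone hitting the same thing, the workaround (assuming it is OpenCV’s EXR gating described in that issue) is to set the variable before launching Flame, e.g.:

# assumption: OpenCV ships with its OpenEXR reader disabled by default since the CVE; this re-enables it
export OPENCV_IO_ENABLE_OPENEXR=1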

This is AHH MAZING!!

However, it is filling up my drive with the exported EXR files - shouldn’t it be deleting those when they’re done rendering and importing to Flame?

Also, it is only running in single-threaded mode; I would love to figure out how to use multi-threading.

Thanks all!!
-Renee

1 Like

You’ll have to delete the EXR exports manually. You could have a cron job remove them after 14 days or something. Just cache them on import.

Or, you can always specify where you want to export them in the first place, somewhere more convenient/permanent.
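
A rough sketch of the cron approach, assuming the exports land in one fixed folder (the path here is hypothetical - point it at wherever you export to):

# crontab entry: every night at 3am, delete TWML EXRs older than 14 days
0 3 * * * find /Volumes/jobs/twml_exports -type f -name "*.exr" -mtime +14 -delete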

If your Mac Pro’s performance is in the 30-45 seconds per frame range for a roughly 3K or 4K clip, then that’s the best you’ll get on the Mac side, I’m afraid, as the open source tools were built and optimized for Nvidia GPUs, which Macs don’t have, so it’s limited to CPU only, not GPU.

2 Likes

It certainly is magical! The fluid morph is beyond magical as well!

Nope. There is a token you can set for it to auto-clean itself.