Assuming I’m doing the math correctly here, this is how the bits fall:
If the sensor provides 18-bit linear data, that's 262,144 possible code values for each channel.
Compressing this into 13-bit log, you use the log curve to throw away precision where it matters least for perceived image quality. Typically that means maintaining detail in the shadows, and sacrificing detail in the midtones and extreme highlight nit values. The difference between 1,400 and 1,800 nit is less critical than the one between 20 and 30 nit.
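To put rough numbers on that claim (a back-of-the-envelope sketch in Python, assuming a pure log encoding that hands out a fixed number of code values per stop, not ARRI's actual LogC formula, and spreading the 13-bit codes over roughly the camera's 17 stops):

```python
import math

# Assumptions for the sketch (not ARRI's LogC math): a pure log encoding
# with a fixed number of code values per stop, and the 8,192 codes of
# 13-bit log spread over roughly 17 stops of dynamic range.
codes_per_stop = 8192 / 17

def stops(lo_nit, hi_nit):
    """Stops (doublings of luminance) between two levels."""
    return math.log2(hi_nit / lo_nit)

for lo, hi in [(20, 30), (1400, 1800)]:
    s = stops(lo, hi)
    print(f"{lo} -> {hi} nit: {s:.2f} stops, ~{s * codes_per_stop:.0f} code values")

# 20 -> 30 nit:     0.58 stops, ~282 code values
# 1400 -> 1800 nit: 0.36 stops, ~175 code values
```

Log spacing cares about ratios, not absolute differences, so the 10 nit gap down in the shadows gets more codes than the 400 nit gap up in the highlights.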
13-bit log is 8,192 individual values.
Now that is one bit more than fits into ProRes 444 (12-bit), so you drop the last bit, every step gets twice as coarse, and you're down to 4,096 possible values.
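And a quick sanity check on the counts themselves (just the integer math, nothing camera-specific):

```python
sensor_bits = 18   # linear data off the sensor
log_bits    = 13   # after the log encode
prores_bits = 12   # what ProRes 444 can carry

print(2 ** sensor_bits, 2 ** log_bits, 2 ** prores_bits)   # 262144 8192 4096

# Dropping the least significant bit: every pair of neighbouring 13-bit
# codes collapses into one 12-bit code, so each step gets twice as coarse.
twelve_bit_codes = {code >> 1 for code in range(2 ** log_bits)}
print(len(twelve_bit_codes))   # 4096
```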
Now to visualize - when was the last time a clear visual difference smacked you in the head between a ProRes HQ (10-bit) and a ProRes 444 (12-bit) file? Not totally fair, since one does 4:2:2 chroma subsampling and the other 4:4:4, but if you didn't notice, even with two degradations stacked against it (chroma subsampling and two bits less channel precision), well, you know they're reaching for the last crumbs.
OK, not totally fair. That only applies to delivery codecs. While you're still in the pipeline, the extra precision can make a difference for keyers and other algorithms that have better 'eyes' than we do. And stacked operations benefit from the extra margin to avoid image degradation.
Back to the numbers - if you use 16fp instead, you get a 10-bit mantissa, so 1,024 code values within each exponent range, plus a 5-bit exponent that can shift that window up and down across 32 ranges. Scaled into a normalized range, that covers the full dynamic range of those 262,144 original values, but with massive voids between representable values towards the bright end.
At 32fp, you have a 24-bit significand, so 16,777,216 possible code values, and 256 different exponent shifts. Your original data fits into this nicely, linear or log, without loss. But everything in your pipeline needs to be able to handle that, and with Flame I don't think that's the case. There are things that don't run 32fp.
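If you want to see both behaviours without trusting my arithmetic, here's a NumPy sketch. It assumes the 18-bit linear codes get normalized to 0-1 before being stored as floats; the exact scaling is my assumption, but any float pipeline does something along these lines:

```python
import numpy as np

codes = np.arange(2 ** 18)        # 0 .. 262143, the 18-bit linear code values
normalized = codes / 2 ** 18      # assumption: scale into 0..1 before storing as float

# 16fp: only a 10-bit mantissa, so towards the bright end neighbouring
# codes collapse onto the same half-float value.
as_f16 = normalized.astype(np.float16)
print(len(np.unique(as_f16)))     # on the order of 9k distinct values left, out of 262,144

# Size of one half-float step just below the top of the range,
# expressed in original 18-bit code values:
top_gap = np.diff(np.unique(as_f16))[-1]
print(float(top_gap) * 2 ** 18)   # 128.0, one step swallows 128 original codes

# 32fp: a 24-bit significand, so every 18-bit code survives the
# round trip exactly.
as_f32 = normalized.astype(np.float32)
print(np.array_equal(as_f32 * 2 ** 18, codes))   # True
```

So half float covers the whole range once it's scaled, it just can't hold every one of the original steps; single float can.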
Having said all this, I think you're safe to convert to LogC in ProRes 444 and keep working as usual. You're throwing away 1 bit that was nice to have. They'll still have the RAW files in case things change and in 10 years they want to re-process everything.
Also, all that extra data is helpful if you have a DP who isn't the best at lighting, or something else was rushed and you have to push the footage around to make it work. If they can afford one more Alexa35, presumably the footage is well captured and doesn't have to go through the wringer to look nice, and it won't live or die by that 13th bit.
And tongue in cheek - there are many buildings that don't have a 13th floor, and there never was a Resolve 13. They went from 12 to 14. That 13th bit may be cursed, and you don't want it anyway.
And in the end, most content is watched on 8-bit screens, with just 256 values remaining at the end of the pipeline.