Compression/Decompression speeds for file formats?


#1

Just curious if anyone can comment with their experiences with compression/decompression times of common bitmap formats.

In a recent 4000-frame animation, I was able to shave hours off my multi-pass 1080p render by changing from LZW TIFFs to non-compressed TIFFs. (I had plenty of disk space, but not a lot of time.) The few seconds it takes to compress each pass of each frame really adds up.

PNG also seems incredibly slow to compress. All flavors of OpenEXR seem very fast, except that Zip16 decompression seems noticeably slower than the others.

If you have a fast drive/network and plenty of space, is it always better to use uncompressed EXR or TIFF?


#2

For me it all comes down to identifying the bottleneck. Sure, you may have a lot of space, but it takes a certain amount of time to read/write to a single hard drive, and that can be a major bottleneck. I like to stick to EXRs because they are a decent size with great quality, and the read/write speed of my hard drives tends to be the big bottleneck for me.

It all depends on your hardware and what you are doing, though. If it takes 5 minutes to render a single frame, who cares about the 0.1 seconds to compress it into a file? If you are reading/writing to a single internal or external hard drive, compressing might actually save you time; if you have a great big RAID array, then it isn’t that big of a deal.

I don’t think there are good guidelines, it’s probably time for you to experiment and figure out what works best for you.
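To put numbers on that experiment, here is a rough benchmark sketch. It times writing the same synthetic frame to disk raw and zlib-compressed (zlib is only a stand-in for codecs like LZW or EXR ZIP; the frame data is synthetic, so swap in real frames and your actual target drive for a meaningful result).

```python
import os
import tempfile
import time
import zlib

# Synthetic "frame": 8 MB of semi-compressible data. (Hypothetical stand-in
# for a real render pass; repeat a random block so the codec has something
# to find.)
frame = (os.urandom(1024) * 8192)[:8 * 1024 * 1024]

def time_write(data, label):
    # Write to a temp file and fsync so we measure the disk, not the cache.
    with tempfile.NamedTemporaryFile(delete=False) as f:
        t0 = time.perf_counter()
        f.write(data)
        f.flush()
        os.fsync(f.fileno())
        elapsed = time.perf_counter() - t0
    os.unlink(f.name)
    print(f"{label}: {len(data) / 1e6:.1f} MB written in {elapsed * 1000:.1f} ms")
    return elapsed

time_write(frame, "uncompressed")

t0 = time.perf_counter()
packed = zlib.compress(frame, level=1)
print(f"zlib level 1: {len(frame) / 1e6:.1f} MB -> {len(packed) / 1e6:.1f} MB "
      f"in {(time.perf_counter() - t0) * 1000:.1f} ms")
time_write(packed, "compressed")
```

If compression time plus the compressed write beats the raw write, compression is winning on that hardware; if not, uncompressed is the faster choice.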


#3

PNG isn’t inherently that slow to compress. If you have it cranked down to “level 1” compression, it’s actually reasonably quick. OTOH, if you run it through something like pngcrush cranked up to “level 9” with absolutely every bell and whistle turned on to shave off every last byte, it can take several minutes per frame. The key to writing PNGs is knowing what you want your software to do, and keeping it under control so that it doesn’t waste time on work you don’t care about. It basically never makes sense to render directly to a highly compressed format, but sometimes it makes sense to do a post-render compression step. That way you can complete a render and do review/QA in parallel while it gets compressed, for example so that somebody at a remote location can download it quickly.

For EXRs, the biggest thing to watch out for is data layout. Nuke loves to take in scanline EXRs, but most 3D renderers think in terms of tiles. A texture for a renderer will probably be quickest as a tiled EXR. In many cases it can make sense to batch-convert rendered frames to scanline layout to prepare them for comp.

With EXRs there is also the issue that you should carry only the channels you are actually going to use. Multichannel data is interleaved in an EXR, so if you render 20 passes but only use the beauty pass, you incur a significant IO penalty seeking over all the wasted data. (More of an issue on magnetic storage, where seeks are expensive; less of a performance issue on SSDs, where seeks are cheap. OTOH, SSDs themselves are expensive, so you probably wouldn’t want to waste the space on one even if performance is adequate.)
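A back-of-envelope sketch of how much dead weight those unused passes are. All the numbers are assumptions (1080p, half-float RGBA per pass, 20 passes rendered, only the beauty read), not measurements:

```python
# Per-frame cost of carrying channels you never read (assumed numbers).
width, height = 1920, 1080
bytes_per_channel = 2        # 16-bit half float
channels_per_pass = 4        # RGBA
passes_rendered = 20
passes_used = 1              # beauty only

per_pass = width * height * bytes_per_channel * channels_per_pass
total = per_pass * passes_rendered
used = per_pass * passes_used
print(f"frame size: {total / 1e6:.0f} MB, "
      f"of which {100 * (total - used) / total:.0f}% is never read")
```

Even before compression, that is hundreds of megabytes per frame of data the reader has to skip past.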

I never use TIFF. I’ve never really studied the performance implications of it, but I always think of it as a DTP format more than a VFX format. I have no idea if there is any actual technical justification, but it always felt somewhat unclean for such things.

I wouldn’t say it “always makes sense” to use uncompressed images when you have the space and speed to spare. It boils down to whether the CPU time spent decompressing is more or less than the difference between reading compressed and uncompressed frames from disk. OTOH, actually having the space and speed to spare is unlikely once you look at how much bandwidth you would need for 20-layer EXRs of ten different elements in stereo, at 4K, etc., to play back in real time. It’s fairly easy to create composites that would require many GB/sec of bandwidth to play back in real time, and compressing the data can help alleviate that bottleneck.
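A back-of-envelope version of that bandwidth math. Every number here is an assumption chosen to match the scenario above (DCI 4K, half-float RGBA layers, 24 fps), not a measurement:

```python
# Uncompressed playback bandwidth for the worst-case composite described
# above (all figures assumed for illustration).
width, height = 4096, 2160   # DCI 4K
bytes_per_channel = 2        # 16-bit half float
channels_per_layer = 4       # RGBA
layers = 20
elements = 10
eyes = 2                     # stereo
fps = 24

bytes_per_frame = (width * height * bytes_per_channel *
                   channels_per_layer * layers * elements * eyes)
gb_per_sec = bytes_per_frame * fps / 1e9
print(f"{bytes_per_frame / 1e9:.1f} GB per frame, "
      f"{gb_per_sec:.0f} GB/s uncompressed at {fps} fps")
```

That lands in the hundreds of GB/s, far beyond any single drive or array, which is exactly why compression can be the thing that makes real-time playback possible rather than the thing slowing it down.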

