Losslessly compressing similar images?

The problem:

I need to reduce the size of my photo library so I naturally want to compress them. Many of them are not quite identical, but still very similar (subsequent shots of the same scene). Is there any compression algorithm that takes advantage of this fact to effectively compress these images? 7zip (LZMA) is useless.

How to solve it:


Method 1

You might try Paq 8 (fp8_v2.zip). I just tried it myself on 1440 similar PNG images and then again on 111 similar JPG images. Here are the results.

  • 1440 PNG Files, 28,631,615 bytes => 2,058,653 bytes compressed
  • 111 JPG Files, 15,003,820 bytes => 489,096 bytes compressed

Compression of the PNG files took about 8 minutes and 550 MB of memory when using:

fp8_v2.exe -7 images *.png

Compression of the JPG files took about 5 minutes and 125 MB of memory when using:

fp8_v2.exe -5 images image12*.jpg

See also: jpg lossless image compression test

Method 2

I would imagine that the Burrows-Wheeler transform with an arithmetic coder would be ideal for this, given a large enough window. What happens if you configure bzip2 to use a block size covering a small run of photos? It'll be slower and take more memory, but the compression ratio should skyrocket. And have you tried LZMA with a larger dictionary size yet?
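To illustrate the window-size point, here is a minimal synthetic sketch using Python's built-in `bz2` and `lzma` modules. The random bytes stand in for photos and the sizes are arbitrary, so this is not a real benchmark; the point is only that bzip2's block is capped at 900 KB, while LZMA's dictionary can be raised to span many files.

```python
import bz2
import lzma
import os

# Synthetic stand-in for a photo run: several "shots" that are
# near-copies of each other, each larger than bzip2's 900 KB block.
shot = os.urandom(1_000_000)                        # one ~1 MB "photo"
shots = [shot[:-1] + bytes([i]) for i in range(8)]  # 8 near-identical shots
stream = b"".join(shots)                            # ~8 MB total

# bzip2 never sees two shots inside one 900 KB block, so the redundancy
# between shots is invisible to it.
bz = bz2.compress(stream, 9)

# LZMA with a dictionary larger than the whole stream can match every
# later shot against the first one.
filters = [{"id": lzma.FILTER_LZMA2, "preset": 6,
            "dict_size": 64 * 1024 * 1024}]
xz = lzma.compress(stream, format=lzma.FORMAT_XZ, filters=filters)

print(f"raw {len(stream)}  bzip2 {len(bz)}  xz large-dict {len(xz)}")
```

On data like this, the bzip2 output stays close to the raw size while the large-dictionary LZMA output shrinks to roughly one shot. On the command line, `xz --lzma2=preset=9,dict=192MiB` does the equivalent dictionary override.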

Method 3

Here’s a simple solution which doesn’t work for photos but may work if one has several images with large pixel-by-pixel identical areas: save the images in an uncompressed format like BMP (not PNG or GIF), then TAR them and compress with a decent compressor like XZ, e.g. on Linux with something like

tar -c myDirectory | xz -9 >myDirectory.tar.xz

Instead of TAR and XZ, one may use 7-Zip with the “solid archive” option to get roughly the same performance. This way I could compress 16 similar screenshots, which took about 900 KB each when saved as separate PNG files, into a 2 MB archive. The benefit of this solution is that it uses common file formats, so it works without installing new software. (Unfortunately, the older and even more common programs gzip and bzip2 didn’t do a good job for me, perhaps because bzip2’s block size cannot be configured to be larger than 900 KB and gzip’s match window is only 32 KB.)
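The reason this works can be shown with a small sketch, again with synthetic data: random bytes stand in for uncompressed BMP pixel data, and eight "screenshots" are identical except for a small region. Compressing each file on its own (PNG-style) cannot exploit what the files share; putting the raw frames in one stream and compressing once (solid-archive-style) can.

```python
import lzma
import os
import zlib

# Eight synthetic "screenshots": identical pixel buffers except for a
# small per-frame difference (sizes are arbitrary, for illustration).
base = os.urandom(500_000)
frames = []
for i in range(8):
    frame = bytearray(base)
    frame[1000:1016] = os.urandom(16)   # small per-frame change
    frames.append(bytes(frame))

# "Separate PNGs": each frame deflate-compressed on its own, sizes summed.
separate = sum(len(zlib.compress(f, 9)) for f in frames)

# "Solid archive": all raw frames in one stream, compressed once, so the
# compressor can reuse the first frame as a dictionary for the rest.
solid = len(lzma.compress(b"".join(frames), preset=6))

print(f"separate {separate}  solid {solid}")
```

Here the solid stream compresses to roughly the size of a single frame, while the per-file sizes simply add up, which is exactly the gap between separate PNGs and the TAR+XZ (or 7-Zip solid) archive above.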

Method 4

Not that I’ve seen. Probably the closest thing would be taking several similar JPEGs and putting them into an MJPEG movie. You could also use APNG or animated GIFs for a similar purpose.

I’m not sure how well that would work though, and it sounds like you’re already talking about movie screencaps, so repacking them into a movie file sounds… counterproductive.

Maybe a better way, if you still have the clips that the screens came from, would be to simply find a command line tool that can extract the exact frame for you, copy that unique identifier into a text file someplace, and then you can always easily re-extract the frame when you need it.
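That index idea can be sketched in a few lines. This is hypothetical: the clip paths and frame numbers below are made up, and the command it builds uses ffmpeg's `select` filter to pull out one frame by number.

```python
# Hypothetical index mapping each screenshot to the clip and frame
# number it came from (all names and numbers are illustrative).
index = {
    "shot_001.png": ("clips/episode01.mkv", 1234),
    "shot_002.png": ("clips/episode01.mkv", 1260),
}

def extract_command(name: str) -> str:
    """Build the ffmpeg command that re-extracts a screenshot on demand."""
    clip, frame = index[name]
    # select exactly frame number `frame`, write a single output image
    return (f'ffmpeg -i "{clip}" -vf "select=eq(n\\,{frame})" '
            f'-frames:v 1 "{name}"')

print(extract_command("shot_001.png"))
```

The text file then costs a few bytes per screenshot instead of hundreds of kilobytes, at the price of keeping the source clips around.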


All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, or CC BY-SA 4.0.
