More Compression?

Pierre said on 12/12/2017

Hi,

Is it possible to get more compression with Pingo, like -s9, -s10, etc.? Or, even better, could we control the iteration count like in Zopflipng?

Thank you!

Cédric Louvrier said on 12/14/2017

it should be possible, but it would slow pingo down for just small gains. for PNG (web usage), it could be better to find transformations that make the data more compressible instead. the more efficient these reductions are, the fewer iterations you need to add
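
for illustration, a rough sketch of both approaches (the iteration count below is arbitrary; zopflipng writes to a separate output file, while pingo works in place):

:: brute force: spend more iterations on the same data
zopflipng --iterations=500 file.png out.png

:: reductions first: transform the data, then compress
pingo file.png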

Daniel said on 12/14/2017

I support this.

It would be really great to have a higher compression option, since many of us are ready to trade time for compression, even if it only saves a few bytes. :)

橘琥珀 said on 12/15/2017

So far I have found that pingo + ECT should be enough for extreme compression. Or give limitPNG a try (it actually does the same work).
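
For example, one way to chain them (a minimal sketch; the -s9 and -9 flags match the ones used elsewhere in this thread, and both tools rewrite the file in place):

:: example — pingo's transformations first, then ECT's deflate on the result
pingo -s9 file.png
ect -9 file.png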

Daniel said on 12/15/2017

Is there an English version of limitPNG?

Tetsuo said on 12/17/2017

I would definitely welcome extra flags that would push compression levels to the maximum at the cost of extra time.

It would also be great if guetzli could be added as an optional flag for the JPG conversion (while keeping the same quality metrics you already have in place).

Simon said on 12/18/2017

+1 here! About Guetzli: I don't know how you managed to get more compression than Guetzli here: https://css-ig.net/articles/guetzli#why

Can you provide a Windows executable of eguetzli?

Cédric Louvrier said on 01/29/2018

eguetzli (v0.05) — 64-bit — windows (zip)

however, it should not compress more than the original — it could be smaller only because it uses progressive — [edit 03/29/2019] fyi, this uses an old version of butteraugli
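
to see the progressive effect on its own, a baseline JPEG can be repacked losslessly as progressive, e.g. with jpegtran (unrelated to eguetzli, just a common way to test this):

:: lossless repack of a baseline JPEG as progressive (no pixel changes)
jpegtran -progressive -optimize -copy none -outfile out.jpg in.jpg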

Cédric Louvrier said on 03/29/2019

pingo already does brute force with its implementation, but it is very lazy. you could do more trials with this tester. with the right options, it could eventually find a better transformation than pingo — but not always. higher levels do not always mean better results

bee (v0.42) — 64-bit — windows (zip)

bee v0.41 — options

option       range   default  what it does
-zN          1-6     (yes)    profiles adaptive to file specs, faster than brute-force; could lead to better or worse compression (do not combine with other options)
-xN          1-6     (no)     profiles adaptive to file specs, faster than brute-force; could lead to better or worse compression (do not combine with other options)
-pass=N      0-3     0        number of trials: 0 = do few estimations; 1 = test only non-filtered strategies; 2 = test targeted strategies; 3 = test all strategies
-in=N        0-9     2        compression level for the internal trials that pick the output settings; a higher value is not always better and has a huge impact on processing time
-out=N       0-12    4        deflate level for the output; 9-12 should offer more compression most of the time, but 8 or less could be better on small images; 0 = no compression
-alpha=N     0-255   0        set RGB values to N where alpha is 0 (tRNS); any value could have an impact on re-ordering (0, 128, 255, etc.)
-rgbalpha=N  0-3     3        RGB transformation type where alpha is 0 (RGBA, grayscale+alpha): 0 = keep, 1 = partial, 2 = 0,0,0, 3 = retransform
-reuse=N     0-2     0        keep image data: 1 = skip the color re-ordering trials but still brute-force filters on bytes, 2 = just recompress
-noalpha     (none)  (no)     keep the RGB values where alpha is 0
-force       (none)  (no)     always overwrite the output, even if bigger
:: example
bee file.png -pass=2 -in=2 -out=8

alternative

:: example
pingo -s9 -nocompression file.png
bee file.png -out=9 -reuse=2

instead of letting bee do all those trials itself, you could combine both tools in this order, so bee just recompresses pingo's result

notes

  • i quickly modified it for public release; it is barely tested, so use it at your own risk
  • provided as-is; find out for yourself what works best for your files

v0.42 — 04/17/2019
- colortype trials

v0.41 — 04/04/2019
- speed optimizations (for real)

v0.4 — 04/04/2019
- new profiles
- new default
- speed optimizations

v0.31 > v0.35 — 03/31/2019
- profiles (-x1 to -x6)
- added an option to keep RGB data unmodified

v0.3 — 03/31/2019
* complete rewrite:
- pre-optimization
- multiple decode states
- fixes

v0.2 — 03/30/2019
- bit depth test
- fixes

v0.1 — 03/29/2019
- out level could go to 12
- better progression during trials/encoding
- other options
- fixes

Aaron said on 03/29/2019

Thanks for this.

When you say higher levels do not always mean better results, does this mean that higher levels are not a super-set of lower levels? Or just that they could yield the same results?

Cédric Louvrier said on 03/30/2019

higher levels are not a super-set of lower levels

[edit] to be more precise, it means that a higher value could lead to worse results. for example:

  • -in=4 could sometimes choose a different transformation than -in=2, which could lead to worse compression
  • -out=12 sets more iterations but can produce a slightly larger file than -out=9 with the same settings
  • brute-forcing with high settings may just lead to the same result as lower ones (see the sketch below)
  • see also the profiles results
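
since levels are not supersets, a practical workaround is to run several settings on the same file. a minimal sketch, assuming bee without -force only keeps a new result when it is smaller (per the -force description above):

:: example — several trials on the same file; without -force, a bigger
:: result should not replace a smaller one, so the best trial wins
bee file.png -z2
bee file.png -z6
bee file.png -pass=3 -in=9 -out=12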

Cédric Louvrier said on 03/31/2019

profiles

[edit: 04/04/2019] — these profiles could need further optimization, but they should offer interesting results. note that they could override any other setting except -force

tweet-coffee.png
tool          options                             time      output size
ECT 7584241   -9 -s --allfilters-b --mt-deflate   336.11s   55 389 bytes
bee 0.35      -in=9 -out=12                       30.84s    55 516 bytes
bee 0.35      -x1                                 1.81s     55 379 bytes
bee 0.41      -z2                                 1.60s     55 379 bytes

nut.png
tool          options                             time      output size
ECT 7584241   -9 -s --allfilters-b --mt-deflate   118.07s   40 752 bytes
bee 0.36      -in=9 -out=12                       21.59s    40 745 bytes
bee 0.36      -x2                                 2.75s     40 741 bytes
bee 0.41      (default)                           2.60s     40 741 bytes
bee 0.35      -x6                                 15.716s   40 745 bytes
bee 0.41      -z6                                 14.565s   40 696 bytes

palg.png
tool          options                             time      output size
bee 0.41      -pass=3 -in=9 -out=12               11.442s   23 995 bytes
bee 0.41      -x6                                 5.802s    23 995 bytes
bee 0.41      (default)                           0.702s    23 960 bytes
bee 0.41      -z6                                 3.955s    23 943 bytes
