The original implementation of removeDuplicateGradient performs an
O(n²) search over all gradients to find duplicates. In images with many
gradients, such as [MediaWiki_logo_1.svg] (which contains over 900
duplicated gradients), this becomes a significant overhead.
We solve this by creating a key for each gradient based on the
attributes used for duplicate detection. The key is constructed so
that two gradients are duplicates (for our purposes) exactly when
their keys are equal; if the keys differ, the gradients are guaranteed
to differ as well. With such a key, we can rely on a dict to handle
the duplicate detection (which it does very well).
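The approach can be sketched as follows (illustrative Python only, not
Scour's actual code; gradient_key, the attribute handling, and the
exclusion of 'id' are assumptions, with element access in the style of
xml.dom.minidom):

# Illustrative sketch of key-based duplicate detection; not Scour's
# actual implementation. Element access mimics xml.dom.minidom.

def gradient_key(grad):
    # Build a hashable key so that two gradients are duplicates
    # (for our purposes) exactly when their keys compare equal.
    # The 'id' attribute is excluded, since duplicates differ in it.
    attrs = tuple(sorted(
        (name, grad.getAttribute(name))
        for name in grad.attributes.keys()
        if name != 'id'
    ))
    stops = tuple(
        tuple(sorted((name, stop.getAttribute(name))
                     for name in stop.attributes.keys()))
        for stop in grad.getElementsByTagName('stop')
    )
    return (grad.tagName, attrs, stops)

def find_duplicate_gradients(gradients):
    seen = {}        # key -> first gradient seen with that key
    duplicates = []  # (duplicate, original) pairs
    for grad in gradients:
        key = gradient_key(grad)
        if key in seen:
            duplicates.append((grad, seen[key]))
        else:
            seen[key] = grad
    return duplicates

Each gradient is visited once and dict lookups are amortized O(1), so
the overall cost drops from O(n²) pairwise comparisons to O(n).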
This change reduces the runtime on [MediaWiki_logo_1.svg] to roughly a
quarter of the original (8m51s -> 1m56s over 5 runs).
Original:
$ time for I in $(seq 1 5) ; do \
PYTHONPATH=. python3 -m scour.scour MediaWiki_logo_1.svg out.svg ; \
done
Scour processed file "heavy.svg" in 105042 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 103412 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 105334 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 107902 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 108161 ms: 1582746/4989544 bytes new/orig -> 31.7%
real 8m51.855s
...
Optimized:
$ time for I in $(seq 1 5) ; do \
PYTHONPATH=. python3 -m scour.scour MediaWiki_logo_1.svg out.svg ; \
done
Scour processed file "heavy.svg" in 21559 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 21936 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 21540 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 21518 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 21664 ms: 1582746/4989544 bytes new/orig -> 31.7%
real 1m56.400s
...
[MediaWiki_logo_1.svg]: https://upload.wikimedia.org/wikipedia/commons/archive/5/54/20120822053933%21MediaWiki_logo_1.svg
Signed-off-by: Niels Thykier <niels@thykier.net>
Scour
Scour is an SVG optimizer/cleaner written in Python that reduces the size of scalable vector graphics files by optimizing their structure and removing unnecessary data.
It can be used to create streamlined vector graphics suitable for web deployment, publishing/sharing or further processing.
The goal of Scour is to output a file that renders identically at a fraction of the size by removing much of the redundant information created by most SVG editors. Optimization options are typically lossless but can be tweaked for more aggressive cleaning.
Scour is open-source and licensed under Apache License 2.0.
Scour was originally developed by Jeff "codedread" Schiller and Louis Simard in 2010. The project moved to GitHub in 2013 and is now maintained by Tobias "oberstet" Oberstein and Eduard "Ede_123" Braun.
Installation
Scour requires Python 2.7 or 3.3+. Installation via pip is recommended.
To install the latest release of Scour from PyPI:
pip install scour
To install the latest trunk version (which might be broken!) from GitHub:
pip install https://github.com/codedread/scour/archive/master.zip
Usage
Standard:
scour -i input.svg -o output.svg
Better (for older versions of Internet Explorer):
scour -i input.svg -o output.svg --enable-viewboxing
Maximum scrubbing:
scour -i input.svg -o output.svg --enable-viewboxing --enable-id-stripping \
--enable-comment-stripping --shorten-ids --indent=none
Maximum scrubbing and a compressed SVGZ file:
scour -i input.svg -o output.svgz --enable-viewboxing --enable-id-stripping \
--enable-comment-stripping --shorten-ids --indent=none
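Scour can also be invoked from Python code. A minimal sketch, assuming the scour.scour.scourString entry point with default options (check the API of the version you have installed):

# Minimal sketch of using Scour as a library; assumes the
# scour.scour.scourString entry point (verify against your version).
from scour.scour import scourString

with open('input.svg') as f:
    svg_in = f.read()

# scourString takes the SVG source as a string and returns the
# optimized SVG text; omitting the options argument uses defaults.
svg_out = scourString(svg_in)

with open('output.svg', 'w') as f:
    f.write(svg_out)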