The original implementation of removeDuplicateGradient performs an
O(n²) search over all gradients to remove duplicates. In images with
many gradients (such as [MediaWiki_logo_1.svg]), this becomes a
significant overhead.
This patch optimizes for the average case by splitting gradients into
smaller lists (called "buckets" in the code). The splitting is done
by selecting some attributes to generate a key with the following
properties (see the sketch after the list):
* If multiple gradients have the same key, then a subset of those
gradients /might/ be duplicates of each other.
* If their keys are not identical, then they cannot be duplicates.
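As a rough illustration, the bucketing looks something like this (a
minimal sketch, not the actual scour code; the gradient model and the
keyed attributes are assumptions made for the example):

from collections import defaultdict

def bucket_key(gradient):
    # Cheap key built from a few attributes. Duplicates always produce
    # equal keys, so gradients in different buckets can never be
    # duplicates of each other.
    return (gradient.get('tag'),
            gradient.get('x1'),
            gradient.get('x2'),
            len(gradient.get('stops', ())))

def find_duplicates(gradients):
    buckets = defaultdict(list)
    for gradient in gradients:
        buckets[bucket_key(gradient)].append(gradient)
    # The O(n²) pairwise comparison now only runs within each bucket.
    for candidates in buckets.values():
        for i, a in enumerate(candidates):
            for b in candidates[i + 1:]:
                if a == b:  # stand-in for the full duplicate check
                    yield a, b

Only gradients that share a key are ever compared pairwise, which is
what gives the average-case speedup.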
Note that in the worst case we still hit O(n²), and it is easy to
construct SVG files that deliberately trigger the quadratic runtime
(for example, many gradients that share a key without actually being
duplicates, so they all land in the same bucket).
With that caveat aside, this improves the runtime performance on
[MediaWiki_logo_1.svg] by about 25% (8m51s -> 6m40s on 5 runs).
Original:
$ time for I in $(seq 1 5) ; do \
python3 -m scour.scour MediaWiki_logo_1.svg out.svg ; \
done
Scour processed file "heavy.svg" in 105042 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 103412 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 105334 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 107902 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 108161 ms: 1582746/4989544 bytes new/orig -> 31.7%
8m51.855s
...
Optimized:
Scour processed file "heavy.svg" in 78162 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 81202 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 81554 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 80067 ms: 1582746/4989544 bytes new/orig -> 31.7%
Scour processed file "heavy.svg" in 77267 ms: 1582746/4989544 bytes new/orig -> 31.7%
[MediaWiki_logo_1.svg]: https://upload.wikimedia.org/wikipedia/commons/archive/5/54/20120822053933%21MediaWiki_logo_1.svg
Signed-off-by: Niels Thykier <niels@thykier.net>
Scour
Scour is an SVG optimizer/cleaner written in Python that reduces the size of scalable vector graphics by optimizing structure and removing unnecessary data.
It can be used to create streamlined vector graphics suitable for web deployment, publishing/sharing or further processing.
The goal of Scour is to output a file that renders identically at a fraction of the size by removing the redundant information created by most SVG editors. Optimization options are typically lossless but can be tweaked for more aggressive cleaning.
Scour is open-source and licensed under Apache License 2.0.
Scour was originally developed by Jeff "codedread" Schiller and Louis Simard in 2010. The project moved to GitHub in 2013 and is now maintained by Tobias "oberstet" Oberstein and Eduard "Ede_123" Braun.
Installation
Scour requires Python 2.7 or 3.3+ and is installed via pip.
To install the latest release of Scour from PyPI:
pip install scour
To install the latest trunk version (which might be broken!) from GitHub:
pip install https://github.com/codedread/scour/archive/master.zip
Usage
Standard:
scour -i input.svg -o output.svg
Better (for older versions of Internet Explorer):
scour -i input.svg -o output.svg --enable-viewboxing
Maximum scrubbing:
scour -i input.svg -o output.svg --enable-viewboxing --enable-id-stripping \
--enable-comment-stripping --shorten-ids --indent=none
Maximum scrubbing and a compressed SVGZ file:
scour -i input.svg -o output.svgz --enable-viewboxing --enable-id-stripping \
--enable-comment-stripping --shorten-ids --indent=none
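Scour can also be called from Python. A minimal sketch, assuming the
scourString and sanitizeOptions helpers from scour.scour (the option
attribute name is an assumption mirroring the CLI flag):

from scour.scour import scourString, sanitizeOptions

# sanitizeOptions() fills in the default option set; enable_viewboxing
# is assumed to mirror the --enable-viewboxing CLI flag.
options = sanitizeOptions()
options.enable_viewboxing = True

with open('input.svg') as fh:
    in_svg = fh.read()

out_svg = scourString(in_svg, options)  # optimized SVG as a string

with open('output.svg', 'w') as fh:
    fh.write(out_svg)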