tl;dr: consider re-compressing archives with Zopfli to shrink package file sizes by ~5%.
npm packages are distributed as compressed archives; specifically, gzipped tarballs. Improving the compression of these archives would reduce the bandwidth and storage they consume.
`npm publish` (and `npm pack`) compress packages with zlib. Switching to a better compression algorithm, like zstd, would improve compression, but nobody could use these packages because zstd is not compatible with gzip.
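You can confirm the current format yourself; this is a rough check, assuming a package directory and the common `file` utility:

```sh
# Build the tarball npm would publish, then inspect it.
npm pack
file *.tgz   # reports "gzip compressed data"
```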
Zopfli is a library that can compress data in a gzip-compatible format. In other words, data gzipped with Zopfli can be decompressed "as normal". It usually improves compression over zlib but takes longer on the compression side. It is just as fast on the decompression side, so only package authors would have to wait.
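As a quick sanity check that Zopfli's output is plain gzip, something like the following should work (it assumes the `zopfli` CLI is installed, e.g. via Homebrew or apt):

```sh
echo 'hello, npm' > example.txt
zopfli example.txt        # writes example.txt.gz; the input file is kept
gunzip -t example.txt.gz  # stock gunzip validates it, i.e. gzip-compatible
```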
I re-compressed the latest version of several popular packages with Zopfli; the size savings were typically around 5%. I do this for one of my packages and it works well. You can also try this browser-based proof-of-concept for other packages.
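If you want to check the numbers yourself, a rough reproduction looks like this (the package name is only an example; it assumes GNU gzip's `-k` flag and the `zopfli` CLI):

```sh
npm pack lodash        # downloads the published tarball, e.g. lodash-4.17.21.tgz
gunzip -k lodash-*.tgz # decompress to lodash-<version>.tar, keeping the original
zopfli lodash-*.tar    # recompress to lodash-<version>.tar.gz, still gzip-compatible
ls -l lodash-*         # compare the new .tar.gz against the original .tgz
```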
np could compress packages with Zopfli.

Advantages:

- Published packages would be smaller, resulting in faster download times and less bandwidth.
- This is backwards compatible.
Disadvantages:
- Integrating Zopfli is more complex and would likely mean adding a new dependency.
- Publishing would be slower because Zopfli takes longer to compress things. For example, I tried recompressing the latest version of the typescript package: GNU tar was able to completely compress the archive in about 1.2 seconds on my machine, while Zopfli, with just 1 iteration, took 2.5 minutes (see the timing sketch after this list).
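Here is a rough way to reproduce that timing gap; exact numbers will vary, and the `--i1` flag limits Zopfli to a single iteration:

```sh
npm pack typescript && gunzip -k typescript-*.tgz
time gzip -9 -c typescript-*.tar > ts-gzip.tar.gz       # zlib at its best setting: seconds
time zopfli -c --i1 typescript-*.tar > ts-zopfli.tar.gz # one Zopfli iteration: minutes
ls -l ts-*.tar.gz                                       # Zopfli's output is smaller
```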
Possible implementation
Instead of running `npm publish` directly, np would effectively run:
```sh
# Create a .tgz
npm pack
# Decompress it
gunzip my-package.tgz
# Recompress it
zopfli my-package.tar
# Publish that file
npm publish my-package.tar.gz
```
Alternatives

This could be done in npm itself. I proposed this to npm a year ago but it was (rightly, I think) rejected.

I don't think this is a good fit for np. I'm not interested in adding such a dependency or maintaining this kind of code. And the time cost for the user would be too high (Zopfli is extremely slow). The correct way to do this is for npm to recompress server-side, but I don't think they are willing to do that.