Almighty gods of GOG,

I can't think of a reason for not using the better LZMA2 compression of the xz format in the Linux tarballs you're offering. Has this even been considered?

In case you were wondering, it's already a widely supported format, and has been for some time now.

In my experience, compared with your classic .tar.gz you can get up to 30% better compression on average for binary files. Think of all the extra happiness (and space) our external storage drives will get should you convert!

[Ends prayer, makes ritual know-how offering: just use the "J" parameter with tar to get that xz magic working]
Post edited March 11, 2015 by WinterSnowfall
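To make the prayer's know-how offering concrete, here is a minimal sketch of the -z vs. -J switches (file names are made up for illustration; assumes GNU tar built with xz support):

```shell
# Create some sample data to pack (hypothetical names, purely illustrative)
mkdir -p demo && printf 'hello\n' > demo/readme.txt

# What GOG ships today: -z selects gzip
tar -czf demo.tar.gz demo

# The proposed alternative: capital -J selects xz (LZMA2);
# lowercase -j would give you bzip2 instead
tar -cJf demo.tar.xz demo

# Unpacking is symmetrical; recent GNU tar also auto-detects with plain -xf
tar -xJf demo.tar.xz -C /tmp
```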
This is unfortunately pretty hard to do, for the same reason they don't switch to 7-Zip on the Windows side. Granted, the typical Linux user is a little savvier, but you never know what weirdness people will come up with.
I would love this and it would make a big difference with the larger titles.
How would it be any more difficult than .tar.gz? I don't get it.
WinterSnowfall: Almighty gods of GOG,

I can't think of a reason for not using the better LZMA2 compression of the xz format in the Linux tarballs you're offering. Has this even been considered?

In case you were wondering, it's already a widely supported format, and has been for some time now.

In my experience, compared with your classic .tar.gz you can get up to 30% better compression on average for binary files. Think of all the extra happiness (and space) our external storage drives will get should you convert!

[Ends prayer, makes ritual know-how offering: just use the "J" parameter with tar to get that xz magic working]
No reason to do that, imho, because most game resource files are compressed already.
WinterSnowfall: I can't think of a reason for not using the better LZMA2 compression of the xz format in the Linux tarballs you're offering.
Can anyone else think of one?
Maybe they simply want to save time: [url=http://catchchallenger.first-world.info//wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO#Compression_time]http://catchchallenger.first-world.info//wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO#Compression_time[/url]

No, seriously, I think they simply opted for maximum compatibility and haven't changed ever since.
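The time cost in that benchmark is easy to reproduce locally; a rough sketch (timings vary a lot by machine and preset):

```shell
# Generate a few megabytes of compressible text
seq 1 500000 > bench.txt

# gzip is fast...
time gzip -9 -c bench.txt > bench.txt.gz

# ...xz is noticeably slower to compress, though decompression stays quick
time xz -6 -c bench.txt > bench.txt.xz
time xz -dc bench.txt.xz > /dev/null
```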
Wishlist: Use better compression for Linux tarballs. For example tar.xz. Comments contain additional info.
Redfern: No reason to do that, imho, because most game resource files are compressed already.
In general, you'd be surprised: not everyone compresses, but yes, it really depends a lot on how the game is packaged.

If it already uses zip compression internally, then yes, it won't matter much whether it's .tar.gz or .tar.xz, but in other cases (your average mix of uncompressed binaries, scripts, ASCII text etc.) it can matter a great deal.

30% doesn't sound like much for a 200 MB tarball, but for the larger 10 GB ones it makes a huge difference.
Gydion: Wishlist: Use better compression for Linux tarballs. For example tar.xz. Comments contain additional info.
Darn, it doesn't have that many backers. Oh well, +1.
Post edited March 11, 2015 by WinterSnowfall
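A quick sketch of how much the content type matters, comparing gzip and xz on structured text versus random bytes (a stand-in for already-compressed assets):

```shell
# Compressible input: structured text
seq 1 200000 > text.dat
# Incompressible input: random bytes, much like pre-compressed game assets
head -c 1000000 /dev/urandom > random.dat

gzip -k text.dat   && xz -k text.dat     # xz should win clearly here
gzip -k random.dat && xz -k random.dat   # here both hover around the input size

ls -l text.dat.gz text.dat.xz random.dat.gz random.dat.xz
```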
Redfern: No reason to do that, imho, because most game resource files are compressed already.
WinterSnowfall: In general, you'd be surprised: not everyone compresses, but yes, it really depends a lot on how the game is packaged.

If it already uses zip compression internally, then yes, it won't matter much whether it's .tar.gz or .tar.xz, but in other cases (your average mix of uncompressed binaries, scripts, ASCII text etc.) it can matter a great deal.

30% doesn't sound like much for a 200 MB tarball, but for the larger 10 GB ones it makes a huge difference.
Gydion: Wishlist: Use better compression for Linux tarballs. For example tar.xz. Comments contain additional info.
WinterSnowfall: Darn, it doesn't have that many backers. Oh well, +1.
In case it's not compressed - yes. But recompressing already compressed files makes it a pain both for the one who prepares the archive and the one who uses it later. For example, you can sometimes encounter manga (Japanese comics, JPG/PNG files) stored in max-compression solid RAR archives (no partial extraction). HATE HATE HATE.
dr.schliemann: Maybe they simply want to save time: [url=http://catchchallenger.first-world.info//wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO#Compression_time]http://catchchallenger.first-world.info//wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO#Compression_time[/url]

No, seriously, I think they simply opted for maximum compatibility and haven't changed ever since.
Well yes, gz is the mother and father of tar compression, but xz itself is widely supported in this day and age, so unless they plan on adding support for Red Hat 5, I don't see the problem.

Besides, think of all the saved bandwidth costs... ok, maybe it won't be that much since we're talking about Linux tarballs here, but it does save some storage and bandwidth for both GOG and us.

Besides the extra time spent on packaging, I don't see any downsides.
WinterSnowfall: In general, you'd be surprised: not everyone compresses, but yes, it really depends a lot on how the game is packaged.

If it already uses zip compression internally, then yes, it won't matter much whether it's .tar.gz or .tar.xz, but in other cases (your average mix of uncompressed binaries, scripts, ASCII text etc.) it can matter a great deal.

30% doesn't sound like much for a 200 MB tarball, but for the larger 10 GB ones it makes a huge difference.

Darn, it doesn't have that many backers. Oh well, +1.
Redfern: In case it's not compressed - yes. But recompressing already compressed files makes it a pain both for the one who prepares the archive and the one who uses it later. For example, you can sometimes encounter manga (Japanese comics, JPG/PNG files) stored in max-compression solid RAR archives (no partial extraction). HATE HATE HATE.
Got your point, but in this case they're already compressing the compressed files you're referring to with gzip :)... and in most cases these already compressed files use zip/gzip compression themselves.

It can't get any worse really.
Post edited March 11, 2015 by WinterSnowfall
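A sketch of the "can't get any worse" point: running gzip over data that is already incompressible gains nothing and just adds overhead:

```shell
# Random bytes stand in for data that is already compressed
head -c 1000000 /dev/urandom > blob.bin

gzip -k blob.bin                        # first pass: barely shrinks, if at all
gzip -c blob.bin.gz > blob.bin.gz.gz    # second pass: pure overhead

ls -l blob.bin blob.bin.gz blob.bin.gz.gz
```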
Rixasha: How would it be any more difficult than .tar.gz? I don't get it.
All distros guarantee gzip and bzip2, and more recently xz/7zip have become pretty standard. They aren't hard to find either.

The only possible good reason is the memory required to decompress the data. At ultra compression, decompression can take 500 MB or more, while simpler, less optimal streams can need 7 MB or less.

If memory isn't the issue, and the worry is that you can't unpack it, they really should include the tools as a separate package, which would include BusyBox, SDL, Vorbis tools, PNGlib, SVGAlib, etc. 50 MB of tools and libraries to ensure all the games work is a small price to pay, and most likely you already have most of them...
Redfern: For example, you can sometimes encounter manga (Japanese comics, JPG/PNG files) stored in max-compression solid RAR archives (no partial extraction). HATE HATE HATE.
Been there... although I find the JPEGs are often really badly compressed to begin with, and the jpegoptim tool does a decent job of shrinking the files by 30% or so.

But locking it into a solid format, doesn't that guarantee you have to decompress the entire thing in order to get one file? Hmmm... In those cases I tend to decompress to a RAM drive, then delete the contents when I'm done with them.
Post edited March 12, 2015 by rtcvb32
rtcvb32: The only possible good reason is the memory required to decompress the data. At ultra compression, decompression can take 500 MB or more, while simpler, less optimal streams can need 7 MB or less.
I think they use gzip's default compression level for the current tarballs, so even switching to the default compression level of xz would have a noticeable impact.

If you ask me, Ultra is not worth it due to diminishing returns. Maximum, yes.

I don't think it's unreasonable to expect at least 1 GB of RAM even on low-end PCs; it has been the norm for some years now.
Post edited March 12, 2015 by WinterSnowfall
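The diminishing returns between presets are easy to see by compressing the same input at several levels (a sketch; exact sizes depend on the data):

```shell
seq 1 500000 > data.txt

# xz presets 1, 6 (the default) and 9: each step costs more time
# and memory but buys progressively less size
for level in 1 6 9; do
  xz -"$level" -c data.txt > "data.$level.xz"
done

ls -l data.1.xz data.6.xz data.9.xz
```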
Rixasha: How would it be any more difficult than .tar.gz? I don't get it.
rtcvb32: All distros guarantee gzip and bzip2, and more recently xz/7zip have become pretty standard. They aren't hard to find either.

The only possible good reason is the memory required to decompress the data. At ultra compression, decompression can take 500 MB or more, while simpler, less optimal streams can need 7 MB or less.

If memory isn't the issue, and the worry is that you can't unpack it, they really should include the tools as a separate package, which would include BusyBox, SDL, Vorbis tools, PNGlib, SVGAlib, etc. 50 MB of tools and libraries to ensure all the games work is a small price to pay, and most likely you already have most of them...
Redfern: For example, you can sometimes encounter manga (Japanese comics, JPG/PNG files) stored in max-compression solid RAR archives (no partial extraction). HATE HATE HATE.
rtcvb32: Been there... although I find the JPEGs are often really badly compressed to begin with, and the jpegoptim tool does a decent job of shrinking the files by 30% or so.

But locking it into a solid format, doesn't that guarantee you have to decompress the entire thing in order to get one file? Hmmm... In those cases I tend to decompress to a RAM drive, then delete the contents when I'm done with them.
In most cases PNG is more than enough for black-and-white manga pages :) But this is not an otaku thread...
And yeah, solid RAR compresses all files as a single stream, which allows slightly better compression, but you can only decompress the whole archive AND any integrity error will render the whole archive useless too.
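For contrast with solid RAR, a tarball does let you pull out a single member; a sketch with GNU tar (the compressed stream still has to be scanned up to the member, but nothing else is written out):

```shell
mkdir -p pages out
printf 'page one\n' > pages/p1.txt
printf 'page two\n' > pages/p2.txt
tar -cJf pages.tar.xz pages

# Extract just one member into out/ -- no other files get materialized
tar -xJf pages.tar.xz -C out pages/p2.txt
```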
WinterSnowfall: If you ask me, Ultra is not worth it due to diminishing returns. Maximum, yes.
I remember reading an article on whether compression was worth it, i.e. whether computing the smaller data was efficient, and the short answer was yes. The energy it takes to send a single bit was something like a million times the cost of compressing the data, even if compression takes a while. Plus, the sheer space and bandwidth saved add up drastically, for them and for everyone else.

Ultra is worth it; I usually don't compress data at less than ultra. Of course, I don't go to extremes either: LZMA usually says it takes 700 MB of memory to compress something and 300 MB to decompress it, while with LZMA2 at 8+ threads that grows to 3 GB while still only requiring 300 MB to decompress. Not to mention I have a lot of cycles that would otherwise be lost, so I put a lot of my stuff on 'background' compression and just come back to it when it's done, be it ten minutes or an hour later.

Zopfli, however, isn't worth it for the regular user. It might be for file servers and companies trying to minimize bandwidth, especially for HTML, text and PNG files, but probably not otherwise.
Redfern: And yeah, solid RAR compresses all files as a single stream, which allows slightly better compression, but you can only decompress the whole archive AND any integrity error will render the whole archive useless too.
Ouch, sounds like a big problem... I personally don't like the RAR archive format.
Post edited March 12, 2015 by rtcvb32
rtcvb32: Ultra is worth it; I usually don't compress data at less than ultra.
It all comes down to entropy (information entropy, in the Shannon sense).

Data can only be compressed so far before you start losing computational efficiency rather drastically. With extreme settings, you'll reach a point where there's just nothing more you can squeeze out of the data and you're only wasting processing time and power.

Perhaps it's not the best example, but think of it as a battery charging cycle: starting from 0%, you'll get to 80% fairly quickly, but the remaining 20% will usually take as long as the rest did. You could just unplug your charger at 80% and save yourself half the charging time.
Post edited March 12, 2015 by WinterSnowfall
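The entropy ceiling shows up immediately in practice: near-zero-entropy input collapses to almost nothing, while random (maximum-entropy) input refuses to shrink at any preset. A sketch:

```shell
head -c 1000000 /dev/zero    > low_entropy.bin    # essentially zero entropy
head -c 1000000 /dev/urandom > high_entropy.bin   # near-maximum entropy

xz -9 -c low_entropy.bin  | wc -c   # a few hundred bytes at most
xz -9 -c high_entropy.bin | wc -c   # roughly the full megabyte, plus overhead
```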