IZarc is indeed a good all-around free decompression utility to have around, as it recognizes so many different compression formats. But as far as I can tell, it doesn't seem to be very good for creating huge multipart archives (spanning/spanned archives, I guess they are called).

Is there some Windows freeware utility that would be good for that? Something like WinRAR would be my choice, I guess, but its price is pretty steep given how rarely I'd need it. I think the command prompt version of RAR might still be free, but I'm unsure whether it has all the functionality of the graphical version, and I'd prefer the latter anyway.

I think IZarc does support creating spanned zip archives, apparently by first creating one big zip file and then telling the utility to divide it into smaller parts. But if I try to create a multipart archive of a very big directory that way (say, 150GB or more), it fails (no error message, but I presume it crashes). I'm unsure if e.g. WinRAR or something similar can handle compressed archives that big, but here's hoping. I didn't find any spanning options for the other compression formats in IZarc.
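The "split it afterwards" approach IZarc uses for zip spanning is essentially just byte-chunking the finished archive. A minimal Python sketch of that idea (hypothetical helper names; a real tool would stream in smaller buffers instead of reading a whole part at once):

```python
def split_file(path, part_size):
    """Split one big archive into numbered parts, as post-hoc zip spanning does."""
    parts = []
    with open(path, "rb") as src:
        index = 1
        while True:
            chunk = src.read(part_size)
            if not chunk:
                break
            part_path = f"{path}.{index:03d}"
            with open(part_path, "wb") as dst:
                dst.write(chunk)
            parts.append(part_path)
            index += 1
    return parts

def join_parts(parts, out_path):
    """Reassemble the parts, in order, before extraction."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

The parts have to be concatenated back in order before any extractor can read the archive again.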
Post edited September 06, 2012 by timppu
This question / problem has been solved by Dzsono
7z?
You could also keep using WinRar, I think it's nagware, so it won't stop working after 30 days.
I use 7-zip, but haven't had the need to try out its multipart / size limit abilities.
JMich: 7z?
You could also keep using WinRar, I think it's nagware, so it won't stop working after 30 days.
I think IZarc supports that too, but either it didn't support creating multipart archives with that algorithm, or it failed the same way as spanned zip. But maybe I'll check again...

I don't recall if WinRAR was indeed just nagware. I was pretty sure it would stop working after some time, or maybe I was thinking of WinZip. I might even pay for WinRAR, if it works. (Yeah, I don't want to pirate it, and I'd actually prefer freeware so that I can freely install it on any number of computers.)
timppu: I think IZarc supports that too, but either it didn't support creating multipart archives with that algorithm, or it failed the same way as spanned zip. But maybe I'll check again...
By 7z I actually meant 7-zip, but it has been too long since I needed to use it, so I couldn't recall the name at once. Also, you could always go way old school with arj.exe, since that also supported spanned archives: it would move on to the next volume once the target size was reached, rather than making one big archive first and then splitting it.
JMich: 7z?
You could also keep using WinRar, I think it's nagware, so it won't stop working after 30 days.
This.

timppu, 7zip is the only tool you need.

If you are just going to compress this folder (that is, you are not going to move it anywhere), you could consider using NTFS's built-in compression feature via compressed folders. The compression ratio is not too good, but you don't need any additional software - just one click on any folder in Windows.
Post edited September 06, 2012 by tburger
I think I'll give 7-zip a try; it might be the one, if only it can also handle humongous archives.
Yup, 7-zip is the way to go. Add to archive -> split to volumes -> you're done.

But make sure that you're using LZMA2 with ultra compression, and bump the CPU thread count up as high as your system can handle.
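For reference, those GUI steps map onto 7-Zip's command-line tool roughly like this (a sketch, not a tested recipe; it assumes the `7z` binary from 7-Zip/p7zip is on the PATH, and `bigdir/` is a hypothetical folder to pack):

```shell
# Multi-volume 7z archive in 700 MB parts, LZMA2 at ultra compression,
# using 4 CPU threads (adjust -mmt to your core count):
7z a -t7z -m0=lzma2 -mx=9 -mmt=4 -v700m backup.7z bigdir/
```

Extraction only needs the first volume name (`7z x backup.7z.001`); 7-Zip finds the rest itself.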
Post edited September 06, 2012 by Elenarie
I would also suggest you look into some of the free backup software programs. Not recommending these but offering them as a place to start.

With "zip" utilities you may be bumping up against their size limitations. As backups are important I would look into programs made for doing backups. Also, Windows has a backup utility built into the OS IIRC.
Stuff: Also, Windows has a backup utility built into the OS IIRC.
"AES Encryption, split or spanned archives, and Unicode entry encoding are not known to be readable or writable by the Compressed Folders feature in Windows XP or Windows Vista."

From Wikipedia. Not confirmed, and nothing is mentioned about 7 / 8, but still, he'll get a better compression ratio with 7z files.
WinRAR is nagware. I've never had it stop working.
Elenarie: But make sure that you're using lzma2 with ultra compression, and bump the CPU threads number up to as much as your system can handle.
Unless one is low on space I would never choose anything above balanced or normal; I don't think you save much with ultra compression. It just takes an immense amount of time because of the CPU power needed.

EDIT: I just did a quick test because I was curious about the compression ratio.

I took a few *.tga pictures, in-game screenshots from Vampire: The Masquerade; the original size of the set is 100MB. Here's some info:

This is with the latest 64-bit 7-zip on 64-bit Windows, with a stock Q9550 @ 2.8GHz, using the LZMA2 compression method.

Quickest: 3 seconds and 37.9MB
Quick: 5 seconds and 38.1MB
Normal: 23 seconds and 30.4MB
Maximal: 43 seconds and 30.1MB
Ultra: 47 seconds and 30.1MB

By those numbers, both maximal and ultra are useless: if you can afford a CPU powerful enough to make up for the time lost, you might as well invest in additional hard drives.

I know pictures may not be the best example, but considering their (high-quality) file size, it should suffice as a general illustration.
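This kind of preset-versus-time trade-off is easy to reproduce with Python's built-in LZMA bindings. A rough sketch (the sample data is a hypothetical stand-in for the screenshot set, so the absolute numbers won't match the table above):

```python
import lzma
import time

# Hypothetical stand-in for the screenshot set: ~1 MB of compressible bytes.
data = b"a fairly repetitive sample line standing in for .tga pixel data\n" * 16000

for label, preset in [("fast", 1), ("normal", 6), ("ultra", 9 | lzma.PRESET_EXTREME)]:
    start = time.perf_counter()
    packed = lzma.compress(data, preset=preset)
    elapsed = time.perf_counter() - start
    print(f"{label:7s} {len(packed):8d} bytes in {elapsed:.2f}s")
```

On highly repetitive data the presets land close together; the gap in both time and size grows with harder-to-compress input, which is the trade-off being argued about here.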
Post edited September 06, 2012 by Nirth_90
Nirth_90: ...
It depends on the data in question. Personally, I'm perfectly fine with waiting a long time for stuff to be compressed... with the exception of PAQ, no way I'd wait 5 minutes for a 500 KB file to be compressed (just saying, not exact numbers).

Some examples: Legend of Grimrock from GOG...

GOG uses LZMA with Normal compression; I used LZMA2 with Ultra (yes, a different algorithm, but still). The difference is around 55-60MB (not to mention I included the whole DirectX redistributable package; I think GOG only includes a few files from it).

Spectrasonics Omnisphere's latest patches and sound sources:

Both using LZMA2, one with Normal and the other with Ultra. The difference is around 2-3 GB (total uncompressed size is around 44 GB).
I use WinRar :D I'm so evil that I look at this "30 days evaluation period is over" and laugh!
Elenarie: It depends on the data in question. Personally, I'm perfectly fine with waiting a long time for stuff to be compressed... with the exception of PAQ, no way I'd wait 5 minutes for a 500 KB file to be compressed (just saying, not exact numbers).

Some examples: Legend of Grimrock from GOG...

GOG uses LZMA with Normal compression; I used LZMA2 with Ultra (yes, a different algorithm, but still). The difference is around 55-60MB (not to mention I included the whole DirectX redistributable package; I think GOG only includes a few files from it).

Spectrasonics Omnisphere's latest patches and sound sources:

Both using LZMA2, one with Normal and the other with Ultra. The difference is around 2-3 GB (total uncompressed size is around 44 GB).
Yes, it depends on the file type, but that wasn't my point. I meant that, put in perspective, you lose a lot of time for just a small difference.

For starters, how long did it take you to compress 44GB of files with LZMA2 Ultra? Unless you have a cluster or supercomputer on hand, it must have taken days. How is that worth it?

keeveek: I use WinRar :D I'm so evil that I look at this "30 days evaluation period is over" and laugh!
Me too. For some reason, extracting files with 7-zip quite occasionally gives me an error, no matter the size or file type (I've tried different drives, reinstalled, etc.).