Best ZIP compressor?

A friend of mine, who still does some work on an old DOS box, compressed an image with XTree Gold’s built-in compressor and squashed that sucker down about half a meg tighter than I’ve been able to manage on my Win98 system.

I cannot get XTree to work on my system (I suspect it’s got to do with the 32-bit file structure, but I’m not sure). If I do successfully get it to load, it doesn’t see any files, just directory headers. And the only way I can get it to load at all is from Safe Mode, Command Prompt Only.

Anyway, now my buddy’s being smug that his system has a better compression utility than mine, and I’m not puttin’ up with that! (and you thought nerds were uncompetitive!):smiley:

Anyone have a suggestion for a really, really good zip program? I don’t care about ease of use, I’m perfectly comfortable with dos shells. All I care about is output file size. Every search I’ve done has come up with “very friendly interface” or “wonderfully user-friendly”. I don’t care if it’s user-hostile: I want compression, dammit!

Any suggestions?


I think this is what you’re looking for:

I just put my hard drive on an anvil and hit it with a hammer.

I think my hammer may be infected with a computer virus, though… my hard drive never works well after I do this.

When you say zip compressor, I assume you mean any compression utility, not just one that is pkzip compatible.
To really win the compression game, it helps to have a utility optimised for the data you are compressing. What is it, wave files? Use MPEG Audio Layer 3 (MP3).
32-bit bitmaps? Try JPEG.
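The point about matching the compressor to the data is easy to see even with a generic compressor: structured data shrinks dramatically, while high-entropy data (already-compressed media, random bytes) barely shrinks at all. A minimal sketch using Python’s `zlib` (the same deflate algorithm zip and gzip use); the sample inputs are made up for illustration:

```python
import os
import zlib

# Repetitive, structured input -- the kind of thing deflate loves.
text = b"<p>hello world</p>" * 1000

# High-entropy input of the same length -- essentially incompressible.
noise = os.urandom(len(text))

text_out = zlib.compress(text, 9)    # level 9 = maximum compression
noise_out = zlib.compress(noise, 9)

print(len(text), len(text_out))      # text shrinks to a tiny fraction
print(len(noise), len(noise_out))    # noise stays roughly the same size
```

That’s why a format-aware (and usually lossy) codec like MP3 or JPEG beats any general-purpose archiver on audio or images: it knows which structure to exploit and which detail it’s allowed to throw away.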

Otherwise, I’m partial to bzip2. It has significantly better compression ratios than pkzip and variants, or so I’ve found.

Results of compressing (sizes in bytes):

12971 index.html (original)
 6854 compress.html.Z
 4579 gzip.html.gz (gzip at maximum compression, level 9)
 4505 bzip2.html.bz2
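A comparison along the same lines is easy to reproduce from Python’s standard library: `zlib` is the deflate algorithm behind zip/gzip, and `bz2` is bzip2. The HTML-ish sample data here is made up; real pages will give different numbers, but the ordering is usually similar:

```python
import bz2
import zlib

# A blob of repetitive HTML-like text standing in for index.html.
data = (b"<html><body><p>The quick brown fox jumps over the lazy dog."
        b"</p></body></html>\n") * 500

deflated = zlib.compress(data, 9)   # deflate at level 9 (zip/gzip family)
bzipped = bz2.compress(data, 9)     # bzip2 (block-sorting, BWT-based)

print(len(data), len(deflated), len(bzipped))
```

bzip2’s edge tends to show up on larger, less repetitive inputs, where its block-sorting transform finds redundancy that deflate’s 32 KB window misses.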

I happen to be very fond of WinACE as of late. Perhaps it is because of the cool graphics, but it seems to work well, too!

Show me the ratio. :slight_smile:
How does WinACE do on my sample text file?

BTW, slightly off-topic, but I’ve often wondered about increasing HTML efficiency. Is there any reason the server could not send a query to the client asking whether it has a compressor optimized for HTML tags (say, where each tag gets a two-byte code, with additional bytes for options if necessary, plus the usual text compression), then carry out all further file transfers using a compressed version of the file? The server could keep compressed and uncompressed copies side by side in directories, so it wouldn’t have any additional CPU load.
As for me, I tend to strip extraneous white space out of pages I won’t be editing much, but that’s hardly as big a saver as reducing web page sizes by a factor of 5 or 6 (just ordinary compression, as above, obviously manages a factor of 3 easily).
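For what it’s worth, the negotiation described above exists in HTTP/1.1: the client advertises `Accept-Encoding: gzip` and the server may answer with `Content-Encoding: gzip` and a compressed body. A minimal sketch of both tricks (whitespace stripping vs. gzip, the encoding such a server would send), on made-up HTML:

```python
import gzip
import re

# A small, whitespace-heavy page repeated to simulate a real document.
html = (b"<html>\n  <body>\n    <h1>Compression test</h1>\n"
        b"    <p>Some   text   with   lots    of   spaces.</p>\n"
        b"  </body>\n</html>\n") * 200

# Crude whitespace squeeze, like hand-stripping a page before upload.
stripped = re.sub(rb"\s+", b" ", html)

# What a Content-Encoding: gzip response body would contain.
zipped = gzip.compress(html)

print(len(html), len(stripped), len(zipped))
```

The whitespace pass saves a little; generic compression of the whole page saves far more, since the tag names themselves are exactly the kind of repeated strings deflate encodes in a couple of bits each.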