InkBlot's Mess

Archive/Compression Study - Part 1

19 Dec 2012 7:37 pm

I have split my analysis into three groups: compressors and compressor/archivers released in 1991 or earlier, compressor/archivers released in 1992 or later, and compressors released in 1992 or later. This initial study does not include background or space-saving compressors like DiskDoubler or SpaceSaver. Such an analysis will likely occur as a separate project.

The test machine was a 266 MHz Power Macintosh G3 with 416 MB RAM, running Mac OS 8.5.1 on a 7200 RPM IDE drive. Sadly, the drive was not defragmented before testing. However, each piece of software dealt with the same files, regardless of their fragmentation state. The first group, programs released in 1991 or earlier, was run on the same G3 within Mini vMac 3.2.3, emulating a Macintosh Plus with System 6.0.8 at the default 8x speed setting. Obviously, depending on your base machine and the files you compress, your results may be slightly different.

The table above lists the compressors and compressor/archivers tested in the first group. I found the interfaces of these programs quite interesting. Most of the programs conform at least partially to Apple's Human Interface Guidelines. However, ArcMac and MacZOO have a sort of hybrid interface, combining the GUI of the Macintosh System Software with a command-line component. While this is certainly an interesting way to incorporate the power and flexibility of the command line, it must have confused users who had never used anything but a Macintosh computer.

Each application was put through three tests: compressing a single text file, compressing a single application, and compressing a mixed group of files containing text, binary, and application files. The smaller the numbers in the graph, the better.

Hopefully the graphs are self-explanatory, but there are a few notes to make. For the single-application test, ArcMac and MacZOO were given the same application, but encoded with MacBinary and BinHex 4.0, respectively, since neither program supports resource forks, and MacZOO only supports compressing text files.
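For readers unfamiliar with why this encoding step is needed: MacBinary flattens a two-fork Macintosh file into a single byte stream by prefixing a 128-byte header that records the filename, type/creator codes, and the lengths of the two forks, then appending each fork padded to a 128-byte boundary. A simplified MacBinary I-style encoder in Python might look like the sketch below (field offsets follow the published format; dates, Finder flags, and the MacBinary II CRC are left zero, and the sample filename, codes, and fork contents are made up for illustration):

```python
import struct

def macbinary_encode(name: str, ftype: bytes, creator: bytes,
                     data_fork: bytes, rsrc_fork: bytes) -> bytes:
    """Pack both forks of a Mac file into one simplified MacBinary I stream."""
    def pad(b: bytes) -> bytes:
        # Each fork is padded out to a multiple of 128 bytes.
        return b + b"\x00" * (-len(b) % 128)

    header = bytearray(128)                    # byte 0 stays 0 (version)
    header[1] = len(name)                      # filename length (max 63)
    header[2:2 + len(name)] = name.encode("mac_roman")
    header[65:69] = ftype                      # four-character file type
    header[69:73] = creator                    # four-character creator code
    header[83:87] = struct.pack(">I", len(data_fork))   # data fork length
    header[87:91] = struct.pack(">I", len(rsrc_fork))   # resource fork length
    return bytes(header) + pad(data_fork) + pad(rsrc_fork)

# Hypothetical file: an application with both forks populated.
stream = macbinary_encode("Demo", b"APPL", b"MZOO",
                          b"data fork bytes", b"resource fork bytes")
```

Once flattened this way, the resource fork survives a trip through a single-fork tool like ArcMac, at the cost of a manual decode step on extraction.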

In compressing the set of mixed files, MacCompress was given a tar file, since it is only a compressor. But, since MacTar does not support resource forks, I had to go through all the files by hand, applying MacBinary encoding to those with resource forks. Suffice it to say that when all the time for the extra file preparation is added up, MacCompress would not be a viable choice when working with Macintosh files.
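MacCompress's two-step workflow mirrors the classic Unix pairing of tar (archiving) with compress (LZW compression). The pipeline can be sketched in Python with the standard library, using gzip as a stand-in for compress since Python ships no LZW .Z codec; the file names and contents here are illustrative:

```python
import gzip
import io
import tarfile

# Stand-ins for a mixed file set; on a real system these would live on
# disk, and MacTar/MacCompress would perform each step as separate tools.
files = {"readme.txt": b"plain text, no resource fork\n",
         "app.bin": b"\x00\x01 MacBinary-encoded application \x02\x03"}

# Step one: gather everything into a single tar archive.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for name, payload in files.items():
        info = tarfile.TarInfo(name=name)
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))

# Step two: compress the whole archive as one stream.
packed = gzip.compress(buf.getvalue())
print(f"tar: {len(buf.getvalue())} bytes -> compressed: {len(packed)} bytes")
```

The archive step is exactly where the resource-fork problem bites: tar only sees one byte stream per file, which is why every forked file had to be MacBinary-encoded by hand first.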

Also, it should be noted that ArcMac, MacZOO, and PackIt III do not support adding directories to an archive. For this reason, I did not put these programs through the mixed files compression test.

There were a few programs I did not test, such as MacLHA 2.00 (a newer version is tested later on) and PackIt II. I casually tested MacLHA, and its compression was actually superior to that of almost all the other programs. However, it is also significantly slower than the rest. PackIt II was not tested because I was unable to locate a copy. For those curious, PackIt, the first incarnation, is only an archiver, not a compressor.

My personal choice for working with files under System 6 or earlier is Compact Pro, the successor to Compactor. For some reason, Compact Pro is actually capable of running under an older System Software version than Compactor. In any case, it compresses better than the majority of the other programs, and does so faster than most. It also supports resource forks, archive encryption, and self-extracting archive creation. There are also utilities available for Mac OS X that can extract files from Compactor/Compact Pro archives. Unless a cross-platform format is needed, there is little reason to use ArcMac, MacCompress, or MacZOO. And, from someone who had heard of it but never used it prior to these tests, PackIt III was a disappointment.


Dog Cow

Joined: 11 Dec 2004 5:20 pm

Location: USA

Posted: 19 Dec 2012 8:32 pm

StuffIt classic reminds me of bzip2, my favorite algorithm. It takes longer to compress files, but it gets them down to a smaller size.

When you're constantly compressing hundreds of gigabytes worth of data (Usenet articles, mostly), like I am, the extra time is worth the wait for the smaller files.
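That time-versus-size trade-off is easy to demonstrate with Python's standard library: on typical text, bz2 usually produces smaller output than zlib (the gzip/Deflate family) but takes longer to run. A small sketch, with synthetic stand-in data for the Usenet articles:

```python
import bz2
import time
import zlib

# Synthetic, highly textual data standing in for Usenet articles.
data = (b"Path: news.example.com!not-for-mail\n"
        b"From: poster@example.com\nSubject: test article\n") * 5000

for name, compress in (("zlib (faster)", lambda d: zlib.compress(d, 9)),
                       ("bz2  (smaller)", lambda d: bz2.compress(d, 9))):
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(out):7d} bytes in {elapsed:.3f}s")
```

Whether the extra time pays off depends, as the comment above notes, on how often the data is compressed and how long it is stored.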
