Optimizer Organizes Data for Quicker Access

Lawrence J. Magid is chairman of Know How, a San Francisco-based microcomputer education company.

Several months ago, I replaced the 20-megabyte hard disk drive on my IBM PC with a newer model that also stores 20 megabytes (or 20 million characters) of data. After copying all of my programs and data files to the new drive, I began using it and noticed that programs seemed to load faster than they did with the previous drive.

Though I never performed stop-watch tests to compare the two drives, it seemed that I was also able to save files a bit faster. I attributed the apparent improved performance to the fact that the replacement model was newer and, presumably, more efficient than the previous drive.

Although I hadn’t given it much thought, I’ve been noticing recently that the new drive seems to be slowing down. Could it be that a disk drive, like the engine of a car, starts to lose its punch as it ages? Not likely. Old electronic components may die, but they rarely go through a convalescent period.

The deterioration in performance remained a mystery until I heard about Disk Optimizer, a $49 software package that “tunes up” tired, old hard disks. The program, according to its manufacturer, speeds up the rate at which old hard disks store and retrieve information.

The software doesn’t lubricate the bearings or speed up the motor. Instead, it rearranges the files stored on the disk so that they can be accessed more quickly. According to the Disk Optimizer manual, the positioning of files on the disk, rather than physical deterioration of the mechanism, is what causes older hard disks to operate more slowly than those that are brand new.

To understand both the problem and the solution requires a brief lesson in the way the MS-DOS disk operating system used by IBM PCs and compatibles saves information on disks. When you save a file on a new disk, the information contained in that file is written to the disk in contiguous clusters of about 4K each. In such an ideal situation, all data in the file is stored in one location. However, as the disk fills up, DOS begins to get more creative as to where it places information. If DOS isn’t able to find enough contiguous space to store the entire file, it stores pieces of the file in various locations throughout the disk. DOS’s directory, or “file allocation table,” keeps track of where it places the data so that you, the user, don’t have to worry about where it has been stored. So, while your work is kept to a minimum, DOS and the disk drive are kept quite busy locating and accessing that scattered information.
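The allocation scheme described above can be sketched in a few lines of modern code. This is a toy model, not DOS itself: a “disk” of 12 clusters, an allocator that simply grabs the first free clusters it finds, and a demonstration of how a file becomes fragmented once deletions leave holes in the middle of the disk.

```python
DISK_CLUSTERS = 12  # a toy disk of 12 clusters, about 4K each

def allocate(free_map, clusters_needed):
    """Grab the first free clusters, wherever they happen to be."""
    placed = []
    for i, free in enumerate(free_map):
        if free:
            free_map[i] = False
            placed.append(i)
            if len(placed) == clusters_needed:
                break
    return placed  # a chain of cluster numbers, as a FAT would record it

free_map = [True] * DISK_CLUSTERS
file_a = allocate(free_map, 3)   # [0, 1, 2] -- contiguous on a fresh disk
file_b = allocate(free_map, 3)   # [3, 4, 5] -- still contiguous
for c in file_a:                 # delete file_a, leaving a hole
    free_map[c] = True
file_c = allocate(free_map, 5)   # [0, 1, 2, 6, 7] -- fragmented!
```

The last file ends up split across two regions of the disk: the hole left by the deleted file plus fresh space farther out. That is exactly the scattering that slows the drive down.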

You’ve probably employed the same technique if you’ve ever tried to store a lot of items in a relatively small space. Imagine storing a case of canned goods in a crowded cabinet. There might not be room for the entire case, but by removing the cans from the box, you could “make room” by sticking cans in various nooks and crannies, amid all the other items. That’s an efficient use of space, but it takes a lot more time both when you load the cabinet (i.e., save a file) and when you try to retrieve (i.e., read) all those cans.

For a hard disk to find data that is scattered about, the mechanism’s “read/write head” must travel to each location on the disk. Positioning the heads, according to the Disk Optimizer manual, is the most time-consuming part of storing and retrieving data. Therein lies the solution.

Erases Original Copy

By consolidating each file’s data into one contiguous location, Disk Optimizer improves performance because it reduces the amount of work that DOS and the drive need to do each time a file is accessed. It does this by reading each file on the disk and copying the information to a new “contiguous” location.

Once consolidated, it erases the original copy of the file.

Because it copies files prior to erasure, the program, according to its manufacturer, cannot cause you to lose data, even if you “pull the plug on the system while it is optimizing.” The program can take up to an hour to optimize a very fragmented disk. The manual recommends that it be used once or twice a month or “when the system starts to slow down again.”
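The consolidation process described above amounts to assigning each file a fresh, unbroken run of clusters. Here is a minimal sketch of that idea (hypothetical code, not Disk Optimizer’s own method): each file’s cluster chain, however scattered, is mapped to the next contiguous run, and only after the copy would the old clusters be freed.

```python
def defragment(files):
    """Map each file's scattered cluster chain to one contiguous run.

    files: {name: [cluster numbers]} as currently laid out on disk.
    Returns the new, fully contiguous layout. In a real optimizer
    the data is copied to the new clusters *before* the originals
    are freed, which is why a power failure mid-way loses nothing.
    """
    next_free = 0
    new_layout = {}
    for name, chain in files.items():
        n = len(chain)
        new_layout[name] = list(range(next_free, next_free + n))
        next_free += n
    return new_layout

fragmented = {"REPORT.DOC": [0, 1, 2, 6, 7], "BUDGET.WKS": [3, 4, 5]}
print(defragment(fragmented))
# {'REPORT.DOC': [0, 1, 2, 3, 4], 'BUDGET.WKS': [5, 6, 7]}
```

After the pass, every file occupies one unbroken run, so the read/write head no longer has to hop around the disk to read it.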

The program disk also comes with an “analyze” program that examines each file on your disk and reports whether its data is in one location or scattered. It also reports the “total percentage of optimization” for all files. A 100% rating would indicate that all files are contiguous and the disk fully optimized.
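A report like the analyze program’s could be computed along these lines. This is a guess at the arithmetic, counting whole files (the real program may weight by clusters instead), with made-up file names:

```python
def is_contiguous(chain):
    """True if the cluster numbers form one unbroken run."""
    return chain == list(range(chain[0], chain[0] + len(chain)))

def percent_optimized(files):
    """Share of files stored contiguously, as a whole percent."""
    contiguous = sum(1 for chain in files.values() if is_contiguous(chain))
    return round(100 * contiguous / len(files))

layout = {
    "LETTER.DOC": [0, 1, 2],       # contiguous
    "NOTES.TXT":  [3, 4, 5],       # contiguous
    "BIG.DBF":    [6, 7, 10, 11],  # scattered
}
print(percent_optimized(layout))   # 67
```

A disk where every chain passes the `is_contiguous` test would score 100%, matching the manual’s description of a fully optimized disk.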

When I analyzed my hard disk, prior to running the optimize program, I found it was 70% optimized, meaning that about 30% of my data was scattered in non-contiguous files.

Although a company ad says that the improvement can be “dramatic,” neither the ad nor the manual makes specific promises as to the extent of the improvement. The improvement, according to the manual, “depends on the particulars of your system.” Programs that make frequent access to the disk (such as database programs and some accounting systems) have the most to gain.

I tested my system by writing a BASIC program that accessed the disk more than 100 times. The test program took 64 seconds prior to optimization and 53 seconds after using Disk Optimizer, roughly a 17% improvement. In another test, I used dBase III (version 1.1) to sort a 1,000-name database.

With this test there was no change after running Disk Optimizer. The different results from the two tests are consistent with the manufacturer’s explanation. Improvement, if any, depends on the actual condition of the files in question. The dBase files, for some reason, were already contiguous and, therefore, did not benefit from the program.
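A timing test of the kind described is easy to reproduce. The original was a BASIC program; this modern sketch (file name and pass count are illustrative) wall-clocks a burst of small writes, opening and closing the file each pass to force repeated disk access:

```python
import os
import tempfile
import time

def timed_disk_test(path, passes=100):
    """Time a burst of small disk writes, in seconds."""
    start = time.perf_counter()
    for i in range(passes):
        with open(path, "a") as f:    # open/write/close each pass,
            f.write(f"record {i}\n")  # forcing repeated disk access
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    elapsed = timed_disk_test(os.path.join(d, "TEST.DAT"))
print(f"{elapsed:.3f} seconds for 100 accesses")
```

Running the same test before and after optimizing gives a before-and-after comparison like the one above; as with the dBase sort, files that are already contiguous will show little or no change.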

When I tested the program on a colleague’s 3-year-old IBM XT, there was a considerable improvement in overall disk access. I think it’s fair to say that, in the long run, the program will improve overall performance. The publisher, Manchester, N.H.-based SoftLogic Solutions, can be reached at 800-272-9900.

If you’re looking for an immediate and dramatic increase in disk speed, try Lightning from Dallas-based Personal Computer Support Group (not to be confused with Borland International’s Turbo Lightning). This program uses part of the computer’s random access memory as a “disk buffer” to speed up disk access.

Once Lightning is engaged, any program or data you access remains in the computer’s memory, even after the program or data is no longer in use. Subsequent accesses can be considerably faster, since DOS can fetch the information from memory, rather than from the disk. The program works with both floppy and hard disks.

Lightning is a RAM resident program. Once it is run, it remains operational until the computer is turned off. The program can be configured to work with any amount of RAM, including memory (in excess of DOS’s normal 640K limit) on one of the new “expanded” memory boards, such as Intel AboveBoard and AST Rampage.

Performance Improved

When you increase the amount of memory assigned to the disk buffer, you allow Lightning to store more information, improving overall performance.

I put Lightning through the same tests as the Disk Optimizer. My BASIC program ran three times faster, while my 1,000-record dBase sort showed a 30% improvement. The program’s main contribution comes when reading files that have been accessed previously.

Like all RAM-based programs, Lightning takes memory away from other programs you are running, but it’s memory well used. Once you experience the increased speed, you’ll make it a permanent part of your computing routine. The program, which sells for $89.95, can be ordered by calling 214-351-0564.

Lightning and Disk Optimizer are not mutually exclusive. They perform different functions and, happily, are compatible with each other. Together, they add new pizazz to your old hardware.
