User's manual

CYDAS UDR Library User’s Guide How to Use the "Streamer" File Functions
As a rough guideline of attainable speeds: on a 33 MHz 80386 machine with a fast hard disk, it should
be possible to collect a megabyte of data to a disk file at a 200 kHz sampling rate, and a megabyte of data
to a RAM disk at 330 kHz. In general, the maximum sustainable speed for
cbFilePretrig() / FilePretrig() will be somewhat less than for cbFileAInScan() / FileAInScan().
Another characteristic of these "streaming" modes is that the more data you collect, the lower the maximum
speed will be. On any machine, with any speed of disk, you can collect 32,000 samples to a disk file at the
maximum A/D speed of 330 kHz. If you are pushing the upper limits of speed, you will find that you can
collect 100K samples at a faster rate than you can collect 500K samples, and so on.
How to determine the maximum sampling speed
The only way to determine the maximum safe speed is to run a test repeatedly. A given speed may work the
first time but fail the next; the only way to be sure that you can reliably run at a particular speed is to try it
numerous times. It also helps to increase the speed to the point where it begins to fail every time, so that you
get some sense of how close you are to your computer's limit.
To test it, write a program that calls cbFileAInScan() or cbFilePretrig() (or FileAInScan() or
FilePretrig(), depending on whether you need pre-trigger data). Check the returned error code. If you get
an OVERRUN error (error code 29), the sampling rate is too high. Whenever you get an OVERRUN error, some
data was collected, but not all of it. It is often useful to check how much data was collected to find out
whether the rate was almost fast enough or not even close.
Speeding up disk files (defragmenting)
Because of the way that disks work, the time it takes to write to them can vary tremendously. A large disk
file is made up of many small pieces that are written individually to the disk. If the file is contiguous (each
piece is side by side), writing is very fast. If the file is fragmented (the pieces are in different places on the
disk), writing is much slower. If you create a large disk file, it will most likely be fragmented to some degree,
and the maximum sampling speed will be much lower than it would be for an unfragmented file.
To get around this problem, run a disk optimizer or defragmenter program immediately before
creating the streamer file with MAKESTRM. After you create the streamer file, it will remain unfragmented
as long as you do not erase and recreate it. The disk optimizer program included with Norton Utilities™ is
called Speed Disk, or SD. To run it, type:
SD /Q
This executes the "Quick" optimize, which works as well as the full optimization.
After de-fragmenting the disk, create a streamer file large enough to hold as much data as you plan to
collect with cbFileAInScan() or cbFilePretrig(). To create the disk file, run the standalone
MAKESTRM.EXE program. This will create a streamer file of the required size.
After the file is created, run FRAGTEST.EXE to see whether or not the file is fragmented. The file may be
fragmented even though you just de-fragmented the disk, because the disk may contain some bad sectors
that could not be moved when the disk was optimized. When the new file hits one of these bad sectors, it
has to skip over it, which fragments the file.
If FRAGTEST reports that the file is fragmented, create a second file and test that with FRAGTEST. Repeat
this until FRAGTEST reports that the file is OK. Once you have an unfragmented disk file, you can try using
it with cbFileAInScan() or cbFilePretrig() / FileAInScan() or FilePretrig() to collect data. If the
maximum sampling speed is still too slow, you should probably switch to a RAM disk.