ALGORITHM DEVELOPED FOR MINIMIZING CUMULATIVE TIME-BASE QUANTIZATION ERRORS.
In particular, the research examined the measurement errors that result from nonrandom quantization in an instrument's time-base circuit. Simulations of a sampling voltmeter showed that the errors in the measured rms amplitude have a non-normal probability distribution, such that large errors are far more probable than the usual quantization-noise model predicts. A time-base compensation method was then proposed that makes the measured rms errors normally distributed and significantly reduces their standard deviation. This method is referred to as the cumulative-sum-limited (CSL) quantization scheme.
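The cited paper gives the actual details of the scheme; as a rough illustration of the underlying idea only, the sketch below shows one plausible reading of a cumulative-sum-limited quantizer in Python: each time stamp is rounded up or down to the time-base grid, choosing whichever direction keeps the running sum of quantization errors bounded, so the errors cannot accumulate coherently over a record. The function name, the 1 ns grid, and the 97.3 ns sample interval are illustrative assumptions, not values taken from the NIST work.

```python
import numpy as np

def csl_quantize(times, step):
    """Quantize time stamps to a grid of width `step`, limiting the
    cumulative (running-sum) quantization error.

    Hypothetical sketch of a CSL-style quantizer: each sample is
    rounded to the grid point above or below, picking the direction
    that keeps the accumulated error closest to zero.
    """
    quantized = np.empty(len(times), dtype=float)
    cum_err = 0.0
    for i, t in enumerate(times):
        lo = np.floor(t / step) * step   # grid point just below t
        hi = lo + step                   # grid point just above t
        # Choose the grid point that minimizes the magnitude of the
        # cumulative quantization error.
        if abs(cum_err + (lo - t)) <= abs(cum_err + (hi - t)):
            q = lo
        else:
            q = hi
        cum_err += q - t
        quantized[i] = q
    return quantized

# Example: ideal sample times on a 1 ns time-base grid (assumed values)
ideal_times = np.arange(1024) * 97.3e-9
csl_times = csl_quantize(ideal_times, step=1e-9)
naive_times = np.round(ideal_times / 1e-9) * 1e-9
print("max |cumulative error|, CSL:  ",
      np.max(np.abs(np.cumsum(csl_times - ideal_times))))
print("max |cumulative error|, naive:",
      np.max(np.abs(np.cumsum(naive_times - ideal_times))))
```

With simple round-to-nearest, the 0.3 ns residue of each sample interval accumulates linearly across the record, whereas the cumulative-sum-limited rounding keeps the running error within about half a quantization step.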
As a vehicle for implementing the CSL algorithm, a NIST scientist applied it within the data acquisition software he had developed for the NIST Wideband Sampling Voltmeter (WSV). In that implementation, the scheme reduced the time-base quantization error by a factor of 25. A paper describing this research was presented at the IEEE Instrumentation and Measurement Technology Conference (IMTC) 2000 in Baltimore, MD, and will appear in the August 2001 Special Issue of the IEEE Transactions on Instrumentation and Measurement.
Publication: Journal of Research of the National Institute of Standards and Technology
Date: May 1, 2001