
Choose the right DAQ board.

Data acquisition (DAQ) hardware serves as the interface between a computer and signals from the outside world. There are hundreds of different types of DAQ hardware with a range of sampling rates, resolutions, signal conditioning options, and form factors. Use these five questions as a guide to help you choose the right DAQ hardware for your application.

1. What types of signals do I need to measure or generate?

Different types of signals need to be measured or generated in different ways. DAQ devices can include several types of input and output (I/O) functions for measuring and generating different signal types:

* Analog inputs measure analog signals.

* Analog outputs generate analog signals.

* Digital inputs/outputs measure and generate digital signals.

* Counter/timers measure digital events or generate digital pulses.

Multifunction DAQ devices offer the best value and performance by combining analog inputs, analog outputs, digital inputs/outputs, and counters in a single package. Since the number of I/O channels per device is fixed, you should consider choosing a device with more channels than you currently need if you plan on expanding functionality in the future.

Modular DAQ systems consist of a chassis and interchangeable I/O modules. Chassis can hold anywhere from one to 18 modules. Modules are available for a variety of I/O types, including specialized sensor measurements and high voltages. While modular systems are typically more expensive than multifunction devices, they give you the flexibility to configure the system to your exact requirements.

2. Do I need signal conditioning?

Some sensors generate signals that are too difficult or dangerous to measure directly with a standard 10-V DAQ device. These sensors require signal conditioning before a DAQ device can measure them effectively and accurately. For example, thermocouples output small signals in the mV range and require amplification, low-pass filtering, and cold-junction compensation.

As an alternative to designing your own signal conditioning circuitry or using an external signal-conditioning box, many DAQ devices offer built-in signal conditioning. This helps simplify the overall setup and improves measurement accuracy.

3. How fast do I need to acquire or generate samples of the signal?

The sampling rate is the speed at which a DAQ device's analog-to-digital converter (ADC) takes samples of a signal. It is controlled by software or by a clock built into the DAQ hardware that can perform high-speed sampling up to millions of times per second.

When choosing a sampling rate, you should consider the Nyquist Theorem, which states that to accurately reconstruct a signal, you must sample at 2x the highest frequency component of interest. In practice, you should sample at least 10 times the maximum frequency in order to represent the shape of your signal.

For example, suppose you are measuring a sine wave with a frequency of 1 kHz. Figure 1 compares the 1 kHz signal measured at 2 kHz and 10 kHz. As you can see, sampling at 2 kHz accurately represents the frequency component according to the Nyquist Theorem, but sampling at 10 kHz better represents the shape of the original signal.
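The comparison above can be sketched numerically. This is a minimal NumPy illustration (the function name and the 5-ms capture window are my own choices, not from the article) that samples the same 1 kHz sine at 2 kHz and at 10 kHz; note that sampling at exactly 2x can even land on zero crossings for some phases, which is one reason the 10x rule of thumb preserves the waveform's shape much better:

```python
import numpy as np

SIGNAL_FREQ = 1_000.0  # the 1 kHz sine wave from the example above

def sample_sine(sample_rate_hz, duration_s=0.005):
    """Sample a 1 kHz sine at the given rate over a 5 ms window."""
    n = int(round(sample_rate_hz * duration_s))
    t = np.arange(n) / sample_rate_hz          # avoids float end-point issues
    return t, np.sin(2 * np.pi * SIGNAL_FREQ * t)

t_slow, y_slow = sample_sine(2_000.0)    # 2 samples per cycle (Nyquist minimum)
t_fast, y_fast = sample_sine(10_000.0)   # 10 samples per cycle (10x rule of thumb)

print(len(t_slow), len(t_fast))  # 10 and 50 samples over the same 5 ms
# With this phase, the 2 kHz samples all fall on zero crossings,
# while the 10 kHz samples trace out the sine's shape.
```

Plotting `t_fast` against `y_fast` would reproduce the kind of comparison shown in Figure 1.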

4. What is the smallest change in the signal that I need to detect?

Resolution and input range are common specifications of a DAQ device that determine the smallest change in the signal that it can detect. Resolution refers to the number of binary levels an ADC can use to represent a signal. For example, a 3-bit ADC can represent eight (2³) discrete voltage levels, while a 16-bit ADC can represent 65,536 (2¹⁶).



These voltage levels are evenly distributed across the DAQ device's input range. For example, a DAQ device with a ±10-V input range (a 20-V span) and 12 bits of resolution (2¹², or 4,096, evenly distributed levels) can detect a change of about 5 mV (20 V ÷ 4,096 ≈ 4.9 mV).
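This "smallest detectable change," often called the code width, is simply the full-scale span divided by the number of ADC levels. A short sketch (assuming a 20-V span, i.e. a ±10-V range, to match the ~5-mV figure above):

```python
def code_width_volts(full_scale_span_v, resolution_bits):
    """Smallest detectable voltage step: span divided by 2^bits levels."""
    return full_scale_span_v / (2 ** resolution_bits)

# A +/-10 V input range spans 20 V; with 12 bits the step is ~4.88 mV.
print(code_width_volts(20.0, 12))   # ~0.00488 V
# The same span with a 16-bit ADC resolves steps 16x smaller:
print(code_width_volts(20.0, 16))   # ~0.000305 V
```

The same arithmetic shows why narrowing the input range (when the hardware supports it) improves resolution: the same number of levels is spread over a smaller span.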

5. How much measurement error does my application allow?

Accuracy is a measure of an instrument's ability to faithfully indicate the value of the measured signal. It is distinct from resolution, although accuracy can never be better than the instrument's resolution.

In an ideal world, an instrument always measures the true value with 100 percent certainty. In reality, instruments report a value with an uncertainty specified by the manufacturer. The uncertainty can depend on many factors, such as system noise, gain error, offset error, and nonlinearity. A common specification for a manufacturer's uncertainty is absolute accuracy. This specification provides the worst-case error of a DAQ device at a specific range.
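A typical vendor formulation of absolute accuracy combines a gain-error term that scales with the reading, an offset-error term that scales with the range, and a noise-uncertainty term. The sketch below follows that general pattern; the specific numbers are hypothetical and for illustration only, not taken from any real device's data sheet:

```python
def absolute_accuracy_v(reading_v, range_v,
                        gain_error_ppm, offset_error_ppm,
                        noise_uncertainty_v):
    """Worst-case error: gain error scales with the reading,
    offset error scales with the range, noise adds directly."""
    return (abs(reading_v) * gain_error_ppm * 1e-6
            + range_v * offset_error_ppm * 1e-6
            + noise_uncertainty_v)

# Hypothetical spec-sheet numbers, for illustration only:
err = absolute_accuracy_v(reading_v=5.0, range_v=10.0,
                          gain_error_ppm=75, offset_error_ppm=20,
                          noise_uncertainty_v=229e-6)
print(f"{err * 1e6:.0f} uV")  # worst-case error at a 5 V reading (~804 uV here)
```

Comparing this worst-case figure against your application's allowable measurement error tells you whether a given device's accuracy specification is adequate.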

To view the expanded online version of this article, visit http://

By Chris Delvizis, National Instruments
COPYRIGHT 2012 Advantage Business Media

Article Details
Title Annotation: Test and Measurement
Author: Delvizis, Chris
Publication: ECN-Electronic Component News
Date: May 15, 2012
