Specifications

2.4.4 Level monitor function
Monitoring the test signal voltage or current applied to the DUT is important for maintaining accurate test conditions, especially when the DUT has a test signal level dependency. The level monitor function measures the actual signal level across the DUT. As shown in Figure 2-10, the test signal voltage is monitored at the High terminal, and the test signal current is calculated from the value of the range resistor (Rr) and the voltage across it.
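As an illustration of the monitoring arithmetic, the sketch below shows how a test current and DUT impedance magnitude follow from the two monitored voltages. The resistor and voltage values are hypothetical examples, and real signals are complex phasors; only magnitudes are used here:

```python
# Illustrative sketch (not instrument firmware) of the quantities in
# Figure 2-10. Edut is the voltage monitored at the High terminal; Err is
# the voltage across the range resistor Rr. All values are example numbers.
Rr = 10e3          # range resistor value in ohms (assumed example)
Edut = 0.495       # monitored voltage across the DUT, in volts (example)
Err = 0.050        # monitored voltage across Rr, in volts (example)

I_test = Err / Rr          # test signal current through Rr (and the DUT)
Z_dut = Edut / I_test      # DUT impedance magnitude follows directly

print(f"Test current: {I_test * 1e6:.1f} uA")
print(f"|Z|: {Z_dut:.0f} ohm")
```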
Instruments equipped with an auto level control (ALC) function can automatically maintain a constant test signal level. By comparing the monitored signal level with the test signal level setting value, the ALC adjusts the oscillator output until the monitored level meets the setting value. There are two ALC methods: analog and digital. The analog type has the advantage of a fast ALC response, whereas the digital type has the advantage of a stable ALC response over a wide range of DUT impedance (capacitance and inductance).
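The digital ALC behavior described above can be sketched as a simple feedback loop. This is a hypothetical illustration, not an instrument's actual firmware: the `measure_level` and `set_output` callables and the proportional correction rule are assumptions made for the example:

```python
# Minimal sketch of a digital ALC loop (hypothetical). The oscillator
# output is adjusted until the monitored level matches the setting value.
def alc_adjust(setpoint, measure_level, set_output, initial_output,
               tolerance=0.001, max_iterations=50):
    """Scale the oscillator output by setpoint/monitored each iteration
    until the monitored level is within tolerance of the setpoint."""
    output = initial_output
    for _ in range(max_iterations):
        set_output(output)
        monitored = measure_level()
        if abs(monitored - setpoint) <= tolerance:
            return output
        output *= setpoint / monitored   # proportional correction
    return output

# Toy model for demonstration: the monitored level is 80% of the
# oscillator output (e.g., a purely resistive divider).
state = {"out": 0.0}
result = alc_adjust(
    setpoint=1.0,
    measure_level=lambda: state["out"] * 0.8,
    set_output=lambda v: state.__setitem__("out", v),
    initial_output=1.0,
)
print(f"Oscillator output settled at {result:.3f} V")
```

With the linear toy model the loop converges in one correction step; a reactive DUT would make the loop dynamics, and the analog-versus-digital tradeoff noted above, more interesting.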
Figure 2-10. Test signal level monitor and ALC function
2.4.5 Measurement time and averaging
Achieving optimum measurement results depends upon measurement time, which may vary according to the control settings of the instrument (frequency, IF bandwidth, etc.). When selecting a measurement time mode, some tradeoffs must be taken into consideration: speeding up measurement normally conflicts with the accuracy, resolution, and stability of measurement results.
The measurement time is mainly determined by the operating time (acquisition time) of the A-D converter in the vector ratio detector. To meet the desired measurement speed, modern impedance measurement instruments use a high-speed sampling A-D converter in place of the previous technique, which used a phase detector and a dual-slope A-D converter. Measurement time is proportional to the number of sampling points taken to convert the analog signal (Edut or Err) into digital data for each measurement cycle. Selecting a longer measurement time results in a greater number of sampling points and more digital data, thus improving measurement precision.
Theoretically, random noise (the scatter of repeated measured values) decreases in inverse proportion to the square root of the A-D converter operating time, that is, to the square root of the number of sampling points averaged.
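The square-root relationship can be checked numerically. The simulation below is a generic illustration, not instrument data: it averages synthetic Gaussian-noise readings and compares the spread of results for 1 versus 100 sampling points:

```python
# Numerical check of the 1/sqrt(N) noise relationship: averaging N noisy
# samples reduces the scatter of the result by about a factor of sqrt(N).
import random
import statistics

random.seed(0)
true_value = 1.0

def measure(n_samples):
    """Average n_samples readings, each with additive Gaussian noise."""
    return statistics.fmean(true_value + random.gauss(0, 0.1)
                            for _ in range(n_samples))

# Spread of 2000 repeated measurements for two sampling-point counts.
sd_1 = statistics.stdev(measure(1) for _ in range(2000))
sd_100 = statistics.stdev(measure(100) for _ in range(2000))

print(f"stdev with   1 sample : {sd_1:.4f}")
print(f"stdev with 100 samples: {sd_100:.4f}")
print(f"ratio: {sd_1 / sd_100:.1f}  (theory predicts about 10)")
```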