|The accuracy of a digital tester is defined as the difference between the reading and the true value of a quantity measured under reference conditions. Accuracy is specified in the format: (±xx% rdg ±xx dgt)
The first portion is a percentage error relative to the reading, so it is proportional to the input. The second portion is an error, in digits, that is constant regardless of the input.
"Rdg" is short for reading and "dgt" for digits. Dgt denotes counts on the last significant digit of the digital display and is typically used to express the error factor of a digital tester.
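The two-part spec above can be turned into a worst-case error band around a reading. A minimal sketch, assuming an illustrative spec of (±1.0% rdg ±3 dgt) and a 0.1 V resolution; none of these figures come from a particular tester:

```python
def error_band(reading, pct_rdg, dgt, resolution):
    """Return (low, high) worst-case bounds for a reading.

    pct_rdg    -- percentage-of-reading error (e.g. 1.0 for +/-1.0% rdg)
    dgt        -- count error on the last significant digit (e.g. 3 for +/-3 dgt)
    resolution -- value of one count on the current range (assumed 0.1 V here)
    """
    err = reading * pct_rdg / 100 + dgt * resolution
    return reading - err, reading + err

# Example: a 100.0 V reading with (+/-1.0% rdg +/-3 dgt) at 0.1 V/count
low, high = error_band(100.0, 1.0, 3, 0.1)
print(low, high)  # 98.7 101.3
```

Note that the digit term dominates near the bottom of a range, which is why readings taken low on a range are relatively less accurate.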
|A function that, immediately after an insulation test, automatically releases the charge stored in the circuit under test during measurement.
The voltage remaining in the circuit under test can be monitored on the scale reading during the auto-discharge process.
|A function by which a tester automatically selects the appropriate measuring range based on the input signal.
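The range-selection logic can be sketched as choosing the lowest full-scale range that still contains the input, and indicating over-range otherwise. The range values below are illustrative assumptions, not those of any specific tester:

```python
# Assumed full-scale range values in volts (illustrative only)
RANGES = [0.6, 6.0, 60.0, 600.0]

def select_range(value, ranges=RANGES):
    """Return the smallest full-scale range that can display |value|."""
    for full_scale in ranges:
        if abs(value) <= full_scale:
            return full_scale
    # All ranges exceeded: a real tester would show an over-range ("OL") indication
    raise OverflowError("input exceeds highest range")

print(select_range(4.2))    # 6.0
print(select_range(123.0))  # 600.0
```

Picking the lowest usable range maximizes the number of significant digits shown, which is the point of auto-ranging.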
|The average of an AC waveform's instantaneous values taken over a half cycle. Ordinary testers respond to the average value.
For a sinusoidal wave:
Average value = Maximum value × 2/π = Maximum value × 0.637
When the true RMS value is 100 V (so the maximum value is 100 × √2 ≈ 141 V):
Average value = Maximum value × 2/π = 141 × 0.637 ≈ 90 (V)
The reading of ordinary testers is calibrated in terms of the effective (RMS) value of a sinusoidal wave even though they respond to the average value. They are called average-responding, RMS-calibrated testers. In contrast, true-RMS testers respond to and display the true RMS value.
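The difference between the two tester types can be demonstrated numerically. A sketch, assuming the average-responding meter rectifies the input, averages it, and scales by the sine-wave form factor π/(2√2) ≈ 1.111; the waveforms are synthesized samples, not real measurements:

```python
import math

def readings(samples):
    """Return (average-responding reading, true RMS) for a list of samples."""
    rectified_avg = sum(abs(s) for s in samples) / len(samples)
    true_rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # RMS-calibrated scale: multiply the rectified average by the
    # sine-wave form factor pi / (2 * sqrt(2))
    avg_responding = rectified_avg * math.pi / (2 * math.sqrt(2))
    return avg_responding, true_rms

n = 100_000
# Sine wave with 141 V peak (about 100 V RMS) and a 100 V square wave
sine = [141 * math.sin(2 * math.pi * i / n) for i in range(n)]
square = [100 if math.sin(2 * math.pi * i / n) >= 0 else -100 for i in range(n)]

print(readings(sine))    # both readings are close to 100 V
print(readings(square))  # average-responding reads high (~111 V); true RMS is 100 V
```

For the sine wave the two readings agree, which is exactly what the calibration guarantees. For the square wave the form factor is 1 rather than 1.111, so the average-responding meter over-reads by about 11%; this is why true-RMS testers are needed for distorted or non-sinusoidal waveforms.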