What is the Difference Between Resolution & Accuracy?

Posted by Grady Keeton on Jan 10, 2017 10:35:41 AM

Two terms that often get bandied about when describing automated test systems are resolution and accuracy. To get the best results from your power supplies, it is important to understand the difference between these two specifications and how they affect your system.

What is Resolution?

The New Oxford American Dictionary defines resolution as, “the smallest interval measurable by a scientific (especially optical) instrument.” When applied to a voltage source, we can take that definition to mean “the smallest amount by which the output voltage of the source can be changed.”

Now, let's take a look at what this means in practice. The DC, AC, and AC+DC voltage resolution of AMETEK Programmable Power's Asterion Series is 0.02 VDC. This means that you can change the output value in 20 mV steps. This fine resolution is more than sufficient for the vast majority of tests that require you to ramp up or ramp down the output voltage.
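One consequence of a finite programming resolution is that a requested setpoint gets quantized to the nearest programmable step. As a rough sketch (the 0.02 V step is the Asterion figure quoted above; round-to-nearest is an assumption about how a given supply's firmware behaves):

```python
RESOLUTION_V = 0.02  # Asterion Series DC/AC/AC+DC voltage resolution (20 mV)

def quantize_setpoint(requested_v: float, step: float = RESOLUTION_V) -> float:
    """Return the nearest programmable voltage for a requested setpoint."""
    return round(round(requested_v / step) * step, 3)

print(quantize_setpoint(115.007))  # -> 115.0 (nearest 20 mV step)
print(quantize_setpoint(115.013))  # -> 115.02
```

For most ramp tests the 20 mV quantization error is negligible next to the accuracy terms discussed below.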

What is Accuracy? 

Accuracy is another story, however. The New Oxford American Dictionary gives the technical definition of accuracy as “the degree to which the result of a measurement, calculation, or specification conforms to the correct value or a standard.” A power supply's accuracy is a measure of how close the actual output will be to the value to which it is programmed.

The DC accuracy of the Asterion Series is ± (0.1% of actual + 0.2% of full-scale). So, for example, if the output voltage is set to 100 VDC, the actual output voltage could be off by as much as 0.6 VDC (0.1% x 100 VDC + 0.2% x 250 VDC = 0.6 VDC). That means the output voltage could be as low as 99.4 VDC and as high as 100.6 VDC.
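The worked example above can be sketched as a small calculation. The 0.1%-of-setting and 0.2%-of-full-scale terms are the Asterion spec quoted in the text; the 250 V full-scale range comes from the example:

```python
FULL_SCALE_V = 250.0  # full-scale range used in the example above

def dc_error_bound(setpoint_v: float, full_scale_v: float = FULL_SCALE_V) -> float:
    """Worst-case DC output error: +/-(0.1% of setting + 0.2% of full scale)."""
    return 0.001 * setpoint_v + 0.002 * full_scale_v

err = dc_error_bound(100.0)
print(f"setpoint 100.0 V: +/-{err:.2f} V "
      f"({100.0 - err:.1f} V to {100.0 + err:.1f} V)")
# -> setpoint 100.0 V: +/-0.60 V (99.4 V to 100.6 V)
```

Note that the full-scale term dominates at low setpoints, so accuracy expressed as a percentage of the setting is worst near the bottom of the range.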

In AC and AC+DC modes, other factors also contribute to the accuracy of the output voltage. When the output frequency of the supply is below 1 kHz, the AC accuracy is the same as the DC accuracy. When the frequency of the output voltage is above 1 kHz, however, you must add ±0.2% of full-scale/kHz. When the supply is in AC+DC mode, the output voltage may be off by an additional ±0.1% of full scale.
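The AC and AC+DC terms stack on top of the DC calculation. A sketch, with one stated assumption: exactly how the “/kHz” term scales between whole kilohertz is not spelled out above, so the code below assumes it grows linearly with the output frequency in kHz once the frequency exceeds 1 kHz:

```python
FULL_SCALE_V = 250.0  # full-scale range from the DC example

def ac_error_bound(setpoint_v: float, freq_hz: float,
                   ac_plus_dc: bool = False,
                   full_scale_v: float = FULL_SCALE_V) -> float:
    """Worst-case output error (V) in AC or AC+DC mode."""
    err = 0.001 * setpoint_v + 0.002 * full_scale_v  # same terms as DC
    if freq_hz > 1000.0:
        # Assumption: +/-0.2% of full-scale per kHz, scaling with frequency
        err += 0.002 * full_scale_v * (freq_hz / 1000.0)
    if ac_plus_dc:
        err += 0.001 * full_scale_v  # extra +/-0.1% of full scale
    return err

print(f"{ac_error_bound(100.0, 400.0):.2f}")   # 400 Hz: same as DC, 0.60
print(f"{ac_error_bound(100.0, 400.0, ac_plus_dc=True):.2f}")  # 0.85
```

Check your supply's datasheet for how the frequency-dependent term is actually specified before relying on a calculation like this.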

Knowing the relationship between resolution and accuracy will help you achieve better results with your test system. For more information on power source accuracy and resolution, contact AMETEK Programmable Power. You can send e-mail to sales.ppd@ametek.com or phone 800-733-5427.
