Making Electrical Measurements, Part 1

The Fundamentals
In electronics, the fundamental physical property is charge. Charged particles, such as electrons and protons, interact with each other over time and distance by exchanging discrete bundles of energy called photons. It is the ebb and flow of photons that gives rise to electro-magnetic phenomena such as light and radio waves. It is our ability to control and use electro-magnetism that lets us build all our wonderful gadgets.

Physicists have worked out elegant mathematical structures to describe electro-magnetic fields in time and space. For the most part, we do not deal directly with such fields in the equipment and circuits we work on every day. Instead, we deal with them one step removed by working with voltages and currents. Roughly speaking, you can think of voltage as corresponding to the electric field and current as corresponding to the magnetic field.

Making Measurements
To measure a quantity such as voltage or current, we must make that quantity interact with an instrument so that the instrument changes in some way we can sense.

For example, Figure 1 shows an old-fashioned meter-movement for measuring current. The current flows through the coil of the meter and creates a magnetic field proportional to the current.

[Figure 1: An old-fashioned meter movement, with coil, spring, pointer, and dial]

The magnetic field attracts an iron pointer which is held back by a spring. The more current, the more magnetic "pull", and the more the pointer moves.

Modern digital meters work on a completely different principle, but they still work by letting the voltage or current you are measuring act on the meter to produce a change.

The Limitation of Measurements
The fact that the quantity being measured must interact with the instrument making the measurement implies that we change the value of the thing we are measuring by the very act of measuring it. In other words, there is always a limitation on how accurately we can measure voltage or current. There will always be some error, or uncertainty, in the numbers we get from our instruments and meters.

For most of the measurements we make every day, a small error is not important. For example, if the 5-volt power supply is actually 5.001 volts, it will not make a difference to our computer. However, it is good to keep in mind that there are limits to accuracy. Think of it as "noise".

Accuracy as a Percentage
The accuracy of an instrument is often stated as a percentage. For example, a voltmeter may be specified as 1% accurate. An important question is: 1% of what? Is it 1% of the reading or 1% of the "full-scale"?

Suppose you have a meter which reads voltages in the range of 0 to 100 volts. Then the full-scale value is 100 volts. Now suppose you use that meter to measure an unknown voltage, Vx, and it reads 50 volts.

If the accuracy of your meter is ±2% of the reading, then the actual voltage is somewhere between 49 volts and 51 volts since 2% of 50 volts is 1 volt.

On the other hand, if the accuracy is ±2% of full-scale (or f.s.) then the actual voltage is somewhere between 48 volts and 52 volts since 2% of 100 volts is 2 volts.
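
If you like to check such figures in code, here is a minimal Python sketch of the two error bands (the function names are ours, just for illustration):

    def bounds_pct_of_reading(reading, pct):
        """Error band when accuracy is a percentage of the reading."""
        err = reading * pct / 100.0
        return (reading - err, reading + err)

    def bounds_pct_of_full_scale(reading, pct, full_scale):
        """Error band when accuracy is a percentage of full-scale."""
        err = full_scale * pct / 100.0
        return (reading - err, reading + err)

    print(bounds_pct_of_reading(50.0, 2.0))            # (49.0, 51.0)
    print(bounds_pct_of_full_scale(50.0, 2.0, 100.0))  # (48.0, 52.0)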

Accuracy is often given as a percentage of full-scale, which means you should use the lowest scale you can to make the measurement. Suppose a 5% voltmeter has two ranges, 0 to 10 volts and 0 to 20 volts. If you want to measure a 9-volt battery, you should use the 10-volt scale, since 5% of 10 volts is 0.5 volts while 5% of 20 volts is 1.0 volts.
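
The range-picking rule is easy to sketch in Python; this helper (our own, for illustration) chooses the lowest full-scale range that still covers the expected reading:

    def best_range(expected, ranges):
        """Pick the lowest full-scale value that still covers the reading."""
        usable = [fs for fs in ranges if fs >= expected]
        return min(usable) if usable else max(ranges)

    # A 5% f.s. voltmeter with 10 V and 20 V ranges, measuring a 9 V battery:
    fs = best_range(9.0, [10.0, 20.0])
    print(fs, "volt range; worst-case error =", 0.05 * fs, "volts")
    # -> 10.0 volt range; worst-case error = 0.5 volts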

Digital Meters: That Last Digit
Digital meters are often compared by the number of digits they can display. For example, a 2-digit meter can display values from 00 to 99 while a 3-digit meter can display values from 000 to 999.

Suppose you have a 2-digit voltmeter that reads 0 to 99 volts. Effectively it has a full-scale capability of 100 volts. Suppose you use it to measure a voltage with a value of 50.5 volts. What will the meter read? The only choices are 50 or 51, so either way there will be an error. That fact about digital meters is expressed by saying that all readings are plus or minus a count of one.
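
Here is a quick Python sketch of that quantization (it rounds half up; what a real meter does exactly at the halfway point depends on its design):

    def display_2_digit(volts):
        """Quantize to whole-volt counts, as a 0-99 V 2-digit meter would."""
        count = int(volts + 0.5)        # round half up to the nearest count
        return max(0, min(99, count))   # clamp to the display range

    print(display_2_digit(50.5))  # 51 -- off by 0.5 V; a reading of 50 would
                                  # be equally wrong, so the error is 1 count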

Digital Meters: Accuracy vs. Resolution
The fact that the reading on a digital meter is always uncertain by a count of 1, either up or down, defines the resolution of the meter. Resolution is the smallest change an instrument can measure (or "resolve"), so in a digital instrument it is the last digit: ±1 count.

Like accuracy, resolution can be expressed as a percentage. A 2-digit meter has 1% resolution (1 count out of 99) while a 3-digit meter has 0.1% resolution (1 count out of 999). However, resolution is not accuracy. A 3-digit meter has 0.1% resolution but may only have 0.5% f.s. accuracy. Read the specifications of the meter carefully: it is usually the case that the resolution is better than the accuracy.
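
In Python, the resolution of an n-digit meter as a percentage is a one-line sketch:

    def resolution_pct(digits):
        """One count out of the largest n-digit reading, as a percentage."""
        return 100.0 / (10 ** digits - 1)   # e.g. 99 counts for 2 digits

    print(resolution_pct(2))   # 1.0101...  (about 1%)
    print(resolution_pct(3))   # 0.1001...  (about 0.1%)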

Extra Resolution
In a digital meter, it is relatively easy to increase resolution by adding another digit. It is more difficult to make that extra digit accurate. Even if the last digit on a meter is not accurate, there are times when that extra resolution is useful.

For example, in radio circuits you sometimes have to "tune-for-a-dip", meaning you adjust the frequency until you hit resonance, as indicated by the amplifier current going to a minimum (not zero) value. The exact value is not as important as the fact that it is a minimum. Look at the table below, which compares actual values to measured values.

Freq.    Actual Current (mA)    Measured Current (mA)
-----------------------------------------------------
f1             16.99                   16.88
f2             16.96                   16.85
f3             16.93                   16.82
f4             16.91                   16.80
f5             16.94                   16.83
f6             16.97                   16.86

A 3-digit meter, even if it were accurate, would not let you see the minimum at f4, since it would read 16.8 for f3, f4, and f5.
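
To see why, here is a short sketch using the measured readings from the table, working in integer counts of 0.01 mA to avoid floating-point fuzz; rounding to 0.1 mA counts simulates the 3-digit display:

    # Measured readings f1..f6 from the table, in counts of 0.01 mA
    counts_4dig = [1688, 1685, 1682, 1680, 1683, 1686]
    # Round to counts of 0.1 mA, as a 3-digit meter would display
    counts_3dig = [(c + 5) // 10 for c in counts_4dig]

    print(counts_3dig)                          # [169, 169, 168, 168, 168, 169]
    print(counts_4dig.index(min(counts_4dig)))  # 3 -> the dip at f4 is visible
    print(counts_3dig.count(min(counts_3dig)))  # 3 -> f3, f4, f5 all tie at 168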

For the extra resolution to be useful, as in the above example, it is necessary that it "do the right thing". That is, as the actual value increases, the measured value increases and as the actual value decreases, the measured value decreases. Such "doing the right thing" is referred to as being monotonic. If you do not have monotonicity, the extra resolution is useless.
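
A monotonicity check is simple to sketch in Python; this compares the direction of each step in the actual and measured sequences (again, the names are ours):

    def is_monotonic_response(actual, measured):
        """True if measured values rise and fall in step with actual values."""
        def sign(a, b):
            return (b > a) - (b < a)   # +1 rising, -1 falling, 0 flat
        return all(sign(a0, a1) == sign(m0, m1)
                   for a0, a1, m0, m1 in zip(actual, actual[1:],
                                             measured, measured[1:]))

    actual   = [16.99, 16.96, 16.93, 16.91, 16.94, 16.97]
    measured = [16.88, 16.85, 16.82, 16.80, 16.83, 16.86]
    print(is_monotonic_response(actual, measured))  # True -- the dip is preserved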