Say you came across a sexy panel meter and wanted to make it work (maybe you found it in your junk box or perhaps your garden - doesn't matter - but let's pretend you don't know anything about this mystery device).

This one has a little offset (doesn't quite rest at zero) but for now that doesn't matter. The range seems to go to 20 milliamps.

If we look at the back there are two threaded posts, one of which is labeled with a plus sign (so we know the polarity at least). Since the needle rests near zero we'll assume that supplying a positive voltage to the post marked plus will make it go up.

We need a little source of current to test our sexy mystery panel meter, and a good source of smallish, predictable currents at low voltages (3 to 8 volts DC - battery, bench supply, whatever) is an LED and a resistor.

It doesn't matter very much what resistor you use (within reason for whatever LED you have), but set up something that makes a reasonable amount of light and it should be fine.

Quick little measurements: about 5 volts of supply, about 1k of resistance, and about 2 volts dropped across the LED (remember the LED eats some of the voltage before the resistor gets its share) - our little circuit pulls 3.27 milliamps. If you check the math against the measurements it comes out close -> remember V=IR -> so I=V/R -> so I=(5.04 - 1.98)/983.3 ≈ 0.00311 amps - a whisker under the measured value, which is down to ordinary meter and component tolerances.
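That Ohm's-law check is easy to script if you'd rather not push buttons on a calculator. A minimal sketch in Python, plugging in the measured values from above:

```python
# Sanity-check the LED circuit current with Ohm's law: I = V / R.
# These are the bench measurements from the text - yours will differ.
supply_v = 5.04        # measured supply voltage
led_drop_v = 1.98      # measured forward drop across the LED
resistor_ohms = 983.3  # measured value of the "1k" resistor

# The resistor sees the supply minus the LED drop.
current_a = (supply_v - led_drop_v) / resistor_ohms
print(f"{current_a * 1000:.2f} mA")  # a hair over 3.1 mA with these figures
```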

But... if we put the panel meter in the circuit the needle slams full scale instantly (oh no 3.27mA should be less than 20mA - what's wrong?)

The reason is of course that the meter doesn't really read milliamps at the numbers printed on the scale... In this case the full-scale deflection is 1mA (so to make the needle match the scale we need to send only 1/20th of the measured current through the meter itself - the rest has to bypass it somehow).

Our little meter doesn't say what its internal resistance is (we could measure it, but that means we have to do a bit of math (very simple math), and today I don't want to do math for this).

Instead let's just try putting a 10 ohm resistor in parallel with the meter terminals. If we do this and try again the meter gives us a reading... yay.

But the reading isn't right... we get around 8mA on the scale and we know we only have 3.27mA. It would have been nice to know the meter resistance - but rather than measuring, let's use a potentiometer in a way we aren't supposed to. Normally the dial sets the wiper to a potential between the high and low terminals, but I'm just going to use the low terminal and the wiper as a simple variable resistor... yup, you shouldn't do this in a design, but on the bench it's too easy a solution to pass up.
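That "around 8mA" reading actually checks out once you know the meter's internals - using the resistances measured later in the post (about 73.9 ohms for the meter coil and 9.93 ohms for the "10 ohm" shunt), the shunt and the meter form a simple current divider. A quick sketch, assuming the circuit current stays near 3.27 mA:

```python
# Current divider: the fraction of the total current that flows through
# the meter branch is r_shunt / (r_shunt + r_meter).
# Resistance values are the ones measured later in the post.
total_ma = 3.27    # circuit current from the LED + resistor
r_meter = 73.9     # meter coil resistance, ohms
r_shunt = 9.93     # measured value of the "10 ohm" shunt, ohms
scale_per_ma = 20  # the scale reads 0-20 but full scale is really 1 mA

meter_ma = total_ma * r_shunt / (r_shunt + r_meter)
reading = meter_ma * scale_per_ma
print(f"meter sees {meter_ma:.2f} mA, scale reads about {reading:.1f}")
```

With these numbers the scale lands near 7.7 - close enough to the "around 8" we saw on the needle.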

By putting the potentiometer in series with the meter (and that combination in parallel with the 10 ohm resistor) we can adjust the pot until we get the right reading on the scale.

When we do this it comes to about 117 ohms (and 115 ohms - a 100 ohm and a 15 ohm resistor in series - is close enough).

So now the scale reads the actual current correctly (yay): a 100 ohm and a 15 ohm resistor in series with the meter, and a 10 ohm resistor in parallel with the whole set. (All that's left is fixing the little offset at zero.)

Just for the heck of it, the math for figuring out the extra resistance is easy: the meter measures 73.9 ohms, the shunt resistor we want in parallel measures 9.93 ohms, and we want a 20-fold change in the scale... so (73.9 + x + 9.93)/9.93 = 20 -> 83.83 + x = 20(9.93) -> x = 198.6 - 83.83 -> x ≈ 114.8 ohms (nicely close to the 117 we dialed in on the pot).
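The same arithmetic as a snippet, solving that equation for the series resistor:

```python
# Solve (r_meter + x + r_shunt) / r_shunt = scale_factor for x,
# i.e. make the meter branch carry 1/20th of the total current.
r_meter = 73.9    # measured meter coil resistance, ohms
r_shunt = 9.93    # measured shunt resistance, ohms
scale_factor = 20  # the printed scale reads 20x the true full-scale current

x = scale_factor * r_shunt - (r_meter + r_shunt)
print(f"series resistor: {x:.1f} ohms")
```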

So all this is kind of silly - it's a very simple thing to do but it's fun sometimes to do things in a different way than usual. Now everyone seems to like digital displays - they are great, but there's something about the old style panel meters that I have a fondness for...