
Photo by Alex O’Donnell, 2014
For years I have been using whatever DMM was lying around the lab at work. At home I have had a little analog hobbyist meter with failing leads and limited range, but thankfully a mirror behind the needle to reduce parallax error. With the extended work from home, I decided it was time to get a real meter. As the title suggests, I have questions, but right now I don’t have answers.
When I think back over the years, considering when and what I’ve used a meter for, I realise my usage has been very basic, typically one of the following cases.
- Opens and shorts – basic checking of cables when hooking up to a DUT, or a sanity check on a new or reworked board: checking power rails, or looking for reset shorted to ground.
- Power present and reset – again, this is board bring-up or test bench validation: making sure I have a reasonable hook-up, and confirming I have the right power and that reset is released.
- Power consumption – how else will I know when my processor is going to deep sleep? Am I within the power budget for battery consumption, and am I really browning out when I turn on the radio? Power consumption over time is best viewed with a logging meter or a connected meter (see the logging sketch after this list).
- Battery health – looking at battery voltage over time, matching voltage to state of charge.
- ADC calibration – how many counts am I seeing for a given input voltage, looking at the level before and after any amplifiers or dividers (a counts sanity-check sketch also follows this list).
- Temperature – primarily, looking at how battery or board temperature is perceived by my code after the ADC, or by a temperature module on the board. I am also now working on systems which are effectively thermometers or thermostats, so temperature tracking is important.
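For the runtime current logging case above, here is a minimal sketch of what a connected-meter setup can look like, assuming a meter that speaks SCPI-style commands over a serial link. The port name, baud rate, poll interval, and even the `MEAS:CURR:DC?` query are assumptions; a real meter (the Keysight handhelds talk through an IR-to-USB cable, for instance) will have its own programming guide.

```python
# Hypothetical current logger: polls a serial-attached, SCPI-style meter
# and appends readings to a CSV. All port and command details are placeholders.
import csv
import time

import serial  # pyserial

PORT = "/dev/ttyUSB0"  # placeholder: wherever the meter's cable shows up
POLL_SECONDS = 1.0     # handheld-friendly rate; too slow for brown-out spikes

with serial.Serial(PORT, 9600, timeout=2) as meter, \
        open("current_log.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["elapsed_s", "current_A"])
    start = time.time()
    while True:
        meter.write(b"MEAS:CURR:DC?\n")               # standard SCPI query
        reading = meter.readline().decode().strip()   # e.g. "+1.234567E-03"
        if reading:
            log.writerow([round(time.time() - start, 3), float(reading)])
        time.sleep(POLL_SECONDS)
```

Note the caveat baked into the poll rate: a meter sampled once a second will happily show average sleep current, but it will never catch the brief radio-on spike that causes a brown-out; for that you need a faster instrument, or a scope across a shunt.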
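And for the ADC calibration case, the arithmetic I am really doing with the meter is just this. A quick sketch, assuming a typical 12-bit converter on a 3v3 part with a resistor divider in front; all three numbers are placeholders for whatever your board actually has:

```python
# Back-of-envelope ADC check: what count should firmware report for the
# voltage the DMM shows? V_REF, bit depth, and divider are assumptions.
V_REF = 3.3      # ADC reference, volts
ADC_BITS = 12    # 12-bit converter -> 4095 counts full scale
DIVIDER = 2.0    # e.g. two equal resistors halving the input

def expected_counts(v_before_divider: float) -> int:
    """Counts the ADC should read for a voltage measured before the divider."""
    v_at_pin = v_before_divider / DIVIDER
    return round(v_at_pin / V_REF * (2**ADC_BITS - 1))

print(expected_counts(3.0))  # ~1861 counts for 3.0 V into the /2 divider
```

If the meter reading before the divider and the count in the debugger disagree with this, then the divider, the reference, or the code is lying.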
The next question is how am I using these meters? Well, I write embedded SW, I’m not an EE, so I use my DMM to see if things look right or wrong. Do I need the accuracy? Do I need data logging? What ranges do I need? The tools in the company’s lab are appropriate for what we do there, but as a consultant, or for a personal meter, it has to be more flexible. The question is how flexible?
The nature of what I do keeps me mostly below 12v; in fact, lately everything seems to be 5v from VUSB, but the circuitry is at 3v3 levels after the regulators. USB PD 2.0 can go up to 20v. Let’s be generous and say I might see mains voltage, so I won’t need to go above 240 VAC.
How much current am I looking at? Well, I used to work on USB-powered devices, so 1A at 5V would be the maximum allowed. Then in the camera industry I saw it start drifting up to 1.5A. I can’t imagine ever seeing more than 10A on the circuits I’ll be looking at, though some of the LED light controllers I worked on did get up there a bit.
So what is this about half digits? It seems 3.5 digit displays are very common, and 7.5 digit are top end. The half digit is the leading digit, a 2-segment display instead of a full 7-segment digit, so it can only show a 1 or stay blank. The more digits, the more resolution you can see. The question is, how much detail do you need to see on a live display?
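To make that concrete, the usual convention is that the half digit doubles the full-scale count. Here is a quick sketch of the textbook numbers; real meters often use 6000- or 50000-count scales instead, so treat this as illustration, not a spec sheet:

```python
# Textbook N.5-digit convention: full-scale count is 2 * 10^N - 1,
# so a 3.5-digit meter reads to 1999, a 4.5-digit to 19999, and so on.
def max_count(full_digits: int) -> int:
    """Full-scale count for a display with full_digits plus a half digit."""
    return 2 * 10**full_digits - 1

for d in (3, 4, 7):
    counts = max_count(d)
    step_mv = 2.0 / (counts + 1) * 1000  # smallest step on a 2 V range
    print(f"{d}.5 digits: {counts:>8} counts, {step_mv:.4f} mV steps on a 2 V range")
```

On a 2 V range that works out to 1 mV steps for a 3.5 digit display and 0.1 mV for a 4.5, which is plenty for eyeballing a 3v3 rail.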
How often will I use this tool? I must admit, in the last 12 months I’ve reached for my meter about 4 times. In the year before that, never, and the year before that it was used almost every week. It really depends on my current project. USB supplies are so prevalent that I have not used a bench supply since 2017, so I no longer rely on a good meter to check that a cheap supply is delivering what I expect or need.
The only thing that makes me want to step away from a cheap, basic DMM is logging current during run time. It looks like the $300 Keysight U1233A will give me all the awesomeness I would like, including readiness for temperature probes and easy PC hook-up. However, my normal usage leads me to believe the $150 Fluke 115 is the way to go. If I really need temperature, then I can buy add-on hardware and cost that to the project that needs it.
Whatever I pick out for a meter will not be what you need; I just thought I’d share what questions I asked myself while trying to pick the tool I need.