Hello,

I have a simple circuit composed of a resistor in series with a DC voltage source. If I put an ohmmeter across the resistor terminals with the voltage source off, I get the resistor's resistance. However, I'm curious how the source affects the measurement when it is turned on.

I did an initial Thevenin model and found that the resistor will be in parallel with a short, so the meter should measure 0 ohms. However, when I tried it in the lab, a 1 VDC source in series with a 500 ohm resistor gives a 2 megaohm reading when I put the ohmmeter probes on the resistor.

Any model or explanation? Thanks!
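For reference, here is the Thevenin arithmetic I did, as a quick Python sketch (the `parallel` helper is just my own; values are from my lab setup):

```python
def parallel(r1, r2):
    """Parallel combination of two resistances; a 0-ohm branch shorts everything."""
    if r1 == 0 or r2 == 0:
        return 0.0
    return (r1 * r2) / (r1 + r2)

R = 500.0           # series resistor, ohms
R_SOURCE_OFF = 0.0  # ideal DC source replaced by a short in the Thevenin model

# Resistance the ohmmeter should see across the resistor terminals:
print(parallel(R, R_SOURCE_OFF))  # 0.0 -> predicts the meter reads 0 ohms
```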

You have made a small error in your analysis by considering the DC voltage source to be a short when off. It would only be a short when doing an AC analysis. For DC, the DC source (i.e., battery) "appears" as a large capacitor in series with a small source resistance, which is, to DC, an open circuit. You can verify this by adding a switch in series and using it to "un-power" the circuit before measuring the resistance with your meter. R is R.
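To make the two models concrete, here is a minimal Python sketch (purely illustrative) comparing the "source as short" model from your analysis with the "source as open circuit" model described above:

```python
import math

def parallel(r1, r2):
    """Parallel combination of two resistances; math.inf models an open branch."""
    if math.isinf(r1):
        return r2
    if math.isinf(r2):
        return r1
    if r1 == 0 or r2 == 0:
        return 0.0
    return (r1 * r2) / (r1 + r2)

R = 500.0  # series resistor, ohms

# AC-style model: DC source replaced by a short
print(parallel(R, 0.0))       # 0.0 ohms

# DC model: battery branch blocks DC, i.e. an open circuit
print(parallel(R, math.inf))  # 500.0 ohms -- "R is R"
```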

From a practical point of view, trying to measure a live circuit with an ohmmeter will more than likely fry the meter, as many others have already pointed out. The measuring system of a meter contains its own source, which is not designed to source or sink current from an external source.

Hope this helps.