servozoom said:

I can't seem to find an answer to this question anywhere, so I've come to the experts. I need to power a device that requires 500 mA at 12 V DC, but it has to be 200 feet away from the transformer. Will a 1 amp power supply provide adequate current over 14 gauge wire? Is there a formula to figure this out? Thanks in advance.

Oh boy, where do we even begin:

This question is impossible to answer as asked because you don't provide enough information. First there's the question of your power source, and then there's the question of your load.

You say you are 200 feet away from the transformer, but then talk about a 1 amp power supply. So do you mean a 1 amp wallwart? You can't put DC into a transformer, so you'd have to rectify the output of the transformer, and it would also be a good idea to regulate it. You also don't state what voltage is acceptable at the load: 12 V +/- what? Bear in mind that wallwart voltage tends to go all over the place depending on the load, because they are typically unregulated.

14 gauge wire has a resistance of about .00297 ohms per foot (assuming copper wire and room temperature; resistance goes up with temperature). But remember the current has to travel out to the load and back, so the circuit sees 400 feet of conductor, not 200: 400 ft × .00297 Ω/ft ≈ 1.2 Ω, and 1.2 Ω × 500 mA ≈ 0.6 V of drop. That means the voltage at the output of the transformer must be at least 0.6 V above what you need at the load.
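If you want to play with the numbers yourself, here's a quick sketch of that calculation (the ohms-per-foot figure is the one used above; check a wire table for your actual wire, since stranded vs. solid and temperature both matter):

```python
# Estimate DC voltage drop over a two-conductor run.
# Assumes 14 AWG copper at room temperature (~.00297 ohm/ft,
# the figure used in this thread -- verify against a wire table).

OHMS_PER_FOOT = 0.00297   # 14 AWG copper, approximate
one_way_feet = 200        # distance from supply to load
load_amps = 0.5           # 500 mA load

# Current flows out and back, so the loop is twice the one-way run.
loop_resistance = 2 * one_way_feet * OHMS_PER_FOOT
drop = loop_resistance * load_amps  # Ohm's law: V = I * R

print(f"loop resistance: {loop_resistance:.3f} ohm")
print(f"voltage drop:    {drop:.2f} V")
```

Run it with your own distance, gauge, and load current to see how much headroom your supply needs above 12 V.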