Phenomenon whereby the voltage delivered at the far end of a wire is less than the voltage supplied to it, due to the wire's resistance per foot, the length of the wire, the current drawn through the wire, the presence of current harmonics, and the ambient temperature. Particularly relevant when running long lengths of wire. The most common solutions are to oversize the wire to compensate, or to split the loads across more wires (or cables, in portable systems) to reduce the current drawn through each wire.
There are many online calculators, of varying accuracy; the more information a calculator asks for, the better its results. See also MikeHolt.com - Calculations Voltage Drop.
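The basic arithmetic behind those calculators can be sketched as below. This is a simplified, resistance-only estimate for a single-phase two-wire run (it ignores reactance, harmonics, and temperature, which the better calculators account for); the 1.98 ohms-per-1000-feet figure is an assumed value for 12 AWG stranded copper, not taken from this article.

```python
def voltage_drop(amps, ohms_per_kft, one_way_feet):
    """Estimate round-trip voltage drop on a two-wire circuit.

    Resistance-only model: current travels out and back, so the
    effective conductor length is twice the one-way run.
    """
    round_trip_feet = 2 * one_way_feet
    return amps * (ohms_per_kft / 1000.0) * round_trip_feet

# Example: 16 A load, 100 ft run of 12 AWG copper (~1.98 ohm/kft, assumed)
drop = voltage_drop(16, 1.98, 100)
percent = 100 * drop / 120  # percent drop on a 120 V circuit
```

Doubling the one-way length doubles the drop, and halving the current (by splitting the load across two circuits) halves it, which is why the two remedies above work.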
A common misconception:
A 1000W, 120V lamp will draw 8.3A when supplied with 120V. If powered by only 110V, many will apply W=VA and determine, erroneously, that the lamp will draw 9.1A. Applying 60V (50%), the lamp should then draw 16.7A. Taken to the extreme, at 1V, the 1000W lamp should draw 1000A!
However, incandescent lamps are not ohmic, and the relationship of watts and volts is better described by the formula:
watts/WATTS = (volts/VOLTS)^1.6, where
watts represents actual (calculated) wattage,
WATTS represents rated wattage,
volts represents actual (applied) voltage,
VOLTS represents rated voltage.
W=VA still applies, but only to the actual values: first calculate the actual wattage from the formula above, then divide by the applied voltage to get the current. As voltage decreases, wattage and amperage also decrease; as voltage increases, wattage and amperage also increase.
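The two-step calculation above (rerate the wattage with the 1.6 exponent, then apply W=VA at the actual voltage) can be sketched as a small function; the function name and the 1.6 default are illustrative, and the exponent itself is only an approximation for incandescent lamps.

```python
def lamp_draw(rated_watts, rated_volts, applied_volts, exponent=1.6):
    """Approximate actual wattage and current of an incandescent lamp.

    Uses the non-ohmic rerating rule watts/WATTS = (volts/VOLTS)^1.6,
    then W = V * A at the applied voltage to get the current.
    """
    watts = rated_watts * (applied_volts / rated_volts) ** exponent
    amps = watts / applied_volts
    return watts, amps

# The 1000W, 120V lamp from the example:
w110, a110 = lamp_draw(1000, 120, 110)  # about 870 W, 7.9 A
w60, a60 = lamp_draw(1000, 120, 60)     # about 330 W, 5.5 A
```

Note that at 110V the lamp draws roughly 7.9A, less than the 8.3A it draws at full voltage, and at 60V it draws about 5.5A, not the 16.7A the naive W=VA calculation predicts.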
This subject often arises when discussing 115V HPL and similar lamps; see Another current question and Is it acceptable to put 4x S4s on a dimmer?.