115v vs 120v lamps

hobbsies

Active Member
Hi guys,

This question has been nagging at me for a while. What's the difference between 115v and 120v lamps, and how should I figure out which to buy?

I've always been under the impression it didn't matter much and have always bought 115v if given the option.

Thanks all!
 
Test your circuits at their positions (not coming straight out of the dimmers) and then buy whichever type of lamp fits. They make both because you'll often find that while your dimmers might produce 120V, after the long trip through copper to the lamp there has been some voltage drop. I'm pretty sure on most installs these days they actually go through and regulate the voltage at the dimmer so that the final voltage at the lamp is 115V.

The reason you want to use the correct voltage lamp for your application is simply that that's what it was made for. If you have 115V applied to a 120V lamp, the lamp is not going to perform as spec'd: it will be dimmer and you will get a color temperature drop. If it's the opposite, 120V applied to a 115V lamp, you will have a lamp burning hotter than it is supposed to, possibly leading to fixture damage and definitely leading to shorter lamp life.

I'm sure one of the smarter people on here will be along to clear up any of my mistakes and give you a much better/specific answer, but that is the gist.

-Tim
 
This question has been nagging at me for a while. What's the difference between 115v and 120v lamps, and how should I figure out which to buy? ...

I take care of the stuff at a church. I typically buy 130 V lamps instead for the chandeliers and the sconces in the sanctuary, because it is a PITA to get a ladder in there. So a 60 W 130 V lamp will burn at roughly 53 W. It will also last quite a bit longer. Similar sort of thing with HPL bulbs. A 120 V lamp on a 115 V circuit will put out less overall energy and burn a little cooler. It doesn't hurt to have 120 V lamps. YMMV.
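For anyone curious where those numbers come from, here's a quick Python sketch using the rerating exponents quoted later in this thread. The 1000 h rated life is an assumed figure for a typical household lamp, and the exponents are approximations that vary by lamp type, so treat the results as ballpark:

```python
# Rough rerating of a 60 W / 130 V lamp run at 120 V, using commonly
# cited incandescent exponents (ballpark only; exact values vary by lamp).
rated_volts, applied_volts = 130.0, 120.0
rated_watts, rated_life = 60.0, 1000.0   # rated life is an assumed example figure

ratio = applied_volts / rated_volts
watts = rated_watts * ratio ** 1.6        # power drawn at the lower voltage
life = rated_life * (1 / ratio) ** 13     # life multiplies dramatically
lumens_pct = 100 * ratio ** 3.4           # light output shrinks faster than watts

print(f"~{watts:.0f} W, ~{life:.0f} h life, ~{lumens_pct:.0f}% of rated lumens")
# -> ~53 W, ~2831 h life, ~76% of rated lumens
```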
 
Someone double check my math and correct me if I'm wrong:

a 575w/120v lamp will draw 4.79amps at 120v (W=VA)
a 575w/115v lamp will draw 5 amps at 115v (also W=VA)

These lamps are designed for a particular resistance at the stated voltage, namely 25.05 Ohms and 23 Ohms respectively (R=V/A)

So, if you run a 120v lamp at 115v, the 25.05 Ohms of resistance would cut the output of a 575w lamp to about 528w (W=V^2/R)
Likewise if you run a 115v lamp at 120v, the 23 Ohms of resistance would boost the output to 626w.

Which is why I don't like using four S4 units on a circuit if they have 115v lamps installed... (4x626=2504w)

Obviously I'm not factoring in any other outside conditions such as voltage drop or regulation...
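For anyone who wants to check the arithmetic, here's a minimal sketch of that same fixed-resistance calculation. It carries the same simplifying assumption as the post above (constant filament resistance), which a later reply corrects:

```python
# Fixed-resistance estimate of lamp power at off-rated voltage.
# (Assumes the filament resistance doesn't change with temperature,
# which it does; see the ^1.6 correction discussed below.)
def power_at(volts_applied, watts_rated, volts_rated):
    amps = watts_rated / volts_rated    # I = W / V at rated voltage
    ohms = volts_rated / amps           # R = V / I, the "hot" resistance
    return volts_applied ** 2 / ohms    # W = V^2 / R

print(power_at(115, 575, 120))      # 120v lamp on 115v -> ~528 W
print(power_at(120, 575, 115))      # 115v lamp on 120v -> ~626 W
print(4 * power_at(120, 575, 115))  # four S4s on one circuit -> ~2504 W
```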
 
Someone double check my math and correct me if I'm wrong: ...
Your math is correct, as far as it goes. Some other formulas that come into play (lowercase = actual/applied values, UPPERCASE = rated values):
(From Mathematical Formulas for Lighting - ControlBooth )
lumens/LUMENS = (volts/VOLTS)^3.4
life/LIFE = (VOLTS/volts)^13 (I.e., reduce the volts to 90% and the life increases to about 393% of rated!)
efficiency/EFFICIENCY = (volts/VOLTS)^1.9
watts/WATTS = (volts/VOLTS)^1.6 (not 'squared' as you would get with a fixed resistance)
coltemp/COLTEMP = (volts/VOLTS)^0.42

... So, if you run a 120v lamp at 115v, the 25.05 Ohms of resistance would cut the output of a 575w lamp to about 528w (W=V^2/R)
Likewise if you run a 115v lamp at 120v, the 23 Ohms of resistance would boost the output to 626w. ...
See the note to "watts/WATTS = (volts/VOLTS)^1.6": "(not 'squared' as you would get with a fixed resistance)". Further, lumens are a better indicator of output than watts.
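To make the difference concrete, here's a quick sketch plugging the 115V-lamp-on-120V case into those formulas. The 300 h figure is the nominal rating for a standard HPL 575/115; all outputs are approximations:

```python
# Rerating a 575 W / 115 V lamp run at 120 V, using the exponents above.
rated_v, applied_v = 115.0, 120.0
r = applied_v / rated_v

watts_16 = 575 * r ** 1.6       # ~615 W with the temperature-aware exponent
watts_sq = 575 * r ** 2         # ~626 W if the resistance were fixed
lumens   = 100 * r ** 3.4       # ~116% of rated lumens
life     = 300 * (1 / r) ** 13  # ~172 h, roughly half the rated life

print(f"{watts_16:.0f} W vs {watts_sq:.0f} W, {lumens:.0f}% lumens, {life:.0f} h")
```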

... Which is why I don't like using four S4 units on a circuit if they have 115v lamps installed... (4x626=2504w) ...
See http://www.controlbooth.com/forums/question-day/9282-acceptable-put-4x-s4s-dimmer.html .
 
Guys, what you are missing here is that a dimmer rated for 120V nominal does not deliver 120V at the outlet. You have the drop across the SCR, the choke, and the cable run to the outlet. That is precisely why a 575W HPL lamp is rated at 115V--so that you can get the expected lumen output and color temperature despite those voltage drops.

So, 4x575W 115V lamps on a 120V 20A dimmer rated for continuous full loading is just fine--as long as you don't exceed 115V at the outlet. And that would be hard to do without a line voltage higher than 120V.
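To get a feel for where those volts go, here's a back-of-the-envelope sketch. The dimmer drop, run length, and wire gauge are all assumed figures for illustration, not measurements from any particular rack:

```python
# Back-of-the-envelope voltage at the lamp after dimmer and cable losses.
# Assumed numbers for illustration: ~2 V across the SCRs and choke,
# a 100 ft run of 12 AWG copper at ~1.6 ohms per 1000 ft per conductor.
line_volts  = 120.0
dimmer_drop = 2.0             # assumed SCR + choke drop at full
run_ft      = 100.0           # one-way cable length, dimmer to fixture
ohms_per_ft = 1.6 / 1000.0    # 12 AWG copper, approximate
amps        = 575.0 / 115.0   # a 575 W / 115 V lamp draws ~5 A

cable_drop = 2 * run_ft * ohms_per_ft * amps   # out and back
volts_at_lamp = line_volts - dimmer_drop - cable_drop
print(f"~{volts_at_lamp:.1f} V at the lamp")   # -> ~116.4 V
```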
 
STEVETERRY says it's okay, therefore it's okay. ;)

hobbsies, with the dimmer at 100%, using a true-RMS DMM, measure the voltage at the luminaire. The easiest way is to add a two-fer and measure at the empty female connector. Measure several locations around your venue and take an average. One should expect the location farthest from the dimmer rack to have the lowest voltage. (Although the variance is likely less than a few volts, and no one wants a situation where you use 115V lamps at FOH and 120V lamps in the floor pockets, assuming the dimmers are in the basement.) Then you can base your lamp decision accordingly. Your dimmers may or may not allow regulation, but be aware they can only lower, not boost, voltage.

Something I think worth noting: When speaking of HPL and some other lamp varieties, both the 115V and 120V come in Standard and Long-Life versions. Some think that all 120V ARE the long-life version, but that's not true, although historically it may have been that way before all four versions were introduced. See http://www.controlbooth.com/forums/...what-kind-hpl-lamp-do-you-use.html#post249789 for more discussions of standard vs. long-life.
 
Millamber has a great point.

Another way to look at it is this: if your building feed is getting 120V, and your dimmer racks are putting out 119V (for example), at the end of the cable runs where the fixtures plug in you may have 117V. Then your decision is Output vs. Life. A 120V lamp will be underpowered, and therefore last far longer (but be dimmer). A 115V lamp will be slightly over-powered, and brighter, but have less life.

Think of it this way: most people (amusement parks excepted) dim their lamps frequently rather than running them at 100% all the time. So a lamp rated for 300 hrs. (tested 60 min. on at full, 10 min. off) will actually last far longer in real-world use, where it gets dimmed and run at less than full.

115V also gets you a brighter, whiter color temp, as the lamp is running hotter than the 120V one. Some people choose the short-life 120V lamps as a way of bridging the gap between life and output: instead of a 1500hr. 115V lamp, they go with a 300hr. 120V lamp.
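Plugging that 117V figure into the rerating formulas quoted earlier makes the tradeoff concrete. This assumes 300 h standard lamps and the approximate exponents from above, so the numbers are ballpark:

```python
# Output vs. Life: a 115 V and a 120 V lamp, both seeing 117 V at the fixture.
def rerate(rated_v, applied_v, rated_life=300.0):
    r = applied_v / rated_v
    return 100 * r ** 3.4, rated_life * (1 / r) ** 13   # % lumens, hours

for rated_v in (115.0, 120.0):
    lumens_pct, life = rerate(rated_v, 117.0)
    print(f"{rated_v:.0f} V lamp at 117 V: ~{lumens_pct:.0f}% lumens, ~{life:.0f} h")
# 115 V lamp: ~106% lumens, ~240 h   (brighter, shorter life)
# 120 V lamp:  ~92% lumens, ~417 h   (dimmer, longer life)
```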
 
So, If "long lamp life to save money" is the primary issue, I'm ok using 120v no matter the circuit's voltage, right?

 
So long as you're staying roughly at or under 120V, you'll be fine. Even at 122V you probably won't see a perceptible difference. You'd be hard-pressed to find a power source you could plug into that'll run your lamps too high.
 
So, If "long lamp life to save money" is the primary issue, I'm ok using ...
Depends on how you define "ok." Using a 120V long-life lamp on a circuit that only supplies 115V will certainly last longer than a 115V long-life on a 115V circuit, but may not be "ok" to designers, except those who like their output dim and yellow.


Measure your voltage, and plug some numbers into this Excel sheet: http://www.derekleffew.com/referencedocumentsandwebsites/LampFormulas2.xls?attredirects=0 .
 
Depends on how you define "ok." Using a 120V long-life lamp on a circuit that only supplies 115V will certainly last longer than a 115V long-life on a 115V circuit, but may not be "ok" to designers, except those who like their output dim and yellow.

:) From what I read here... Sure, but wouldn't a designer prefer ever-so-slightly dimmer lamps to ones that aren't working at all? OK for me means working fixtures. Can most designers really tell if a dimmer is at 96% instead of full? Would Mom notice the difference watching her kid on stage for the first time? I'd guess poor alignment would affect things more than a 4% voltage difference, and 5-year-old faded Roscolux would cause more of a color shift than an ever-so-slight color temperature difference? No?
(I'll measure out of curiosity, and would be interested in comparing the two side by side.) Your data on lamp life differences is great, thanks.
 
:) From what I read here... Sure, but wouldn't a designer prefer ever-so-slightly dimmer lamps to ones that aren't working at all? OK for me means working fixtures. Can most designers really tell if a dimmer is at 96% instead of full? ...

you'd be surprised
 
Your dimmers may or may not allow regulation, but be aware they can only lower, not boost, voltage.

I don't think this is completely correct. I believe that in the '70s, Kliegl had a dimmer for TV studios with a transformer on the output to assure that the output of the dimmer at full was a true 120 volts. The intent was to make sure that the color temperature from each lamp was consistent.

I never saw one. I never sold one. But the sales support folks told me they existed.
 
I'm running 220v in my system but we've always used 115v lamps, never had a problem though. Is voltage even a real issue? I would think one should be more concerned about how many amps one is running as opposed to voltage
 
I don't think you actually have 220 coming out of your stage pin plugs, or you'd have seen super bright lamps and more than one explosion already.
 
I'm running 220v in my system but we've always used 115v lamps, never had a problem though. Is voltage even a real issue? I would think one should be more concerned about how many amps one is running as opposed to voltage

I am guessing you mean that your system is 120v and you use 115v lamps. This is pretty common, as you can eke out a few extra lumens. I suppose you could have a 120/240v split-phase system, but the voltage out of the dimmers is going to be 120v (at least in almost all US theatres).

However, voltage is definitely a real issue. Have you ever tried to connect a significantly lower-voltage lamp, like a 12v RV lamp, to a 120v socket? They go POP really fast. If you plugged a 220v lamp into a fixture connected to 120v, it would come on dim and orange, at a small fraction of its rated output. For an incandescent lamp, voltage will affect the brightness or output of the lamp.

We do also need to be concerned with the current draw (amperage) of a fixture, but it is not constant. Current will vary with the intensity of an incandescent lamp. With things like MLs and LED fixtures, it will vary based on how many motors are working, the age of the lamp, how many LEDs are illuminated, which fans are running, etc.
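As a sanity check on why badly mismatched voltages behave so differently, here's the same rerating math pushed well outside its comfort zone. The exponents are only calibrated for small deviations from rated voltage, so take the extreme cases as directional, not precise:

```python
# Directional illustration of badly mismatched lamp voltages.
# The rerating exponents are only meant for small deviations from
# rated voltage, so the extreme numbers are qualitative at best.
def relative_output(rated_v, applied_v):
    return 100 * (applied_v / rated_v) ** 3.4   # % of rated lumens

print(relative_output(220, 120))  # 220v lamp on 120v -> ~13% of rated output
print(relative_output(12, 120))   # 12v lamp on 120v -> absurdly high number;
                                  # in reality the filament just goes POP
```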
 
:) From what I read here... Sure, but wouldn't a designer prefer ever-so-slightly dimmer lamps to ones that aren't working at all? OK for me means working fixtures. ...

Nope. We want brighter and whiter. Remember: getting more life out of the lamp is EASY; you just profile your dimmers so that full is 99% (or 98%). Getting more LIGHT out of the lamp, however, is harder. You can always get less light; only Spinal Tap was able to "take it up to 11". As designers we want the most options available, which means the brightest source possible, and if we want less light, we dial down the intensity. At least that's my take on it.
 
I'm running 220v in my system but we've always used 115v lamps, never had a problem though. Is voltage even a real issue? I would think one should be more concerned about how many amps one is running as opposed to voltage

Now, this is worthy of one of my old stories from the past! :angryoldman:
Once, back in the '80s, I did a spotlight rental for a band from Europe that ACTUALLY was running 120 volt lamps and using the 208/240 volt mains on the dimmers! (The packs were running hot-to-hot.) Seems the company had done some custom work on their analog dimmer ramp generators so they would top off at the equivalent of 120 volts. The reasoning behind this was that they toured in some of the old Soviet states, where "220" could be as low as 160 volts (so the "120" legs would have been 80 volts, even if they had such a thing over there, which I don't think they did). The packs were also modified to be switchable between 50 and 60 Hz. Due to timing, I didn't get much chance for an in-depth discussion, but the system did actually work, with no "4th of July" results. His claim was that as long as he got better than 160 volts, he was happy. Oh, the crazy '80s!
 