Preheating Conventionals?

The resistance increases with temperature, by a factor of around 10 in a lamp. The only exception, if my recollections from school are correct, is carbon, which has a negative temperature coefficient.
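
To put rough numbers on that 10:1 figure, here's a quick Python sketch of what it means for inrush current. The 575 W / 115 V lamp and the exact ratio are illustrative assumptions, not datasheet values:

```python
# Rough inrush estimate, assuming the ~10:1 hot-to-cold resistance
# ratio mentioned above. Lamp values are illustrative, not from a spec.
V_RATED = 115.0     # volts
P_RATED = 575.0     # watts at full
HOT_TO_COLD = 10.0  # hot resistance is ~10x the cold resistance

r_hot = V_RATED ** 2 / P_RATED  # R = V^2 / P, ~23 ohms at operating temp
r_cold = r_hot / HOT_TO_COLD    # ~2.3 ohms at room temperature

print(f"steady state: {V_RATED / r_hot:.1f} A")   # ~5 A
print(f"cold bump:   ~{V_RATED / r_cold:.0f} A")  # ~50 A momentary inrush
```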
 
So, that still brings us back to the preheating question. It isn't a question of whether preheating will make your lamp last longer than the rated lamp life; it is a question of whether preheating will make the lamp last longer than not preheating the lamp.
Probably the ideal test would be to take two identical fixtures and lamps from the same lot, preheat one at some low level for some time, and then turn both fixtures on to full in a 0 count. Let the fixtures burn for about 4 hours a day and repeat each day until one dies. I picked a 4 hour burn time because most theatres probably don't run their lamps for more than 4 hours a day. Running the lamps at full gives us optimum operating conditions, so we should get close to the rated lamp life.

This is not a valid replication of the theatre situation, which is:
pre-heat lights
turn lights off for 30-45 minutes {while audience enters}
turn lights up to full for 4 hours {cue 1 go}
Even if pre-heating did work, the break of at least half an hour between pre-heat and show completely negates any dubious benefit.
 
allthingstheatre said:
This is not a valid replication of the theatre situation, which is:
pre-heat lights
turn lights off for 30-45 minutes {while audience enters}
turn lights up to full for 4 hours {cue 1 go}
Even if pre-heating did work, the break of at least half an hour between pre-heat and show completely negates any dubious benefit.

And that isn't a valid situation either. What you forget is a proper dimmer check.
 
Sorry, but that's totally, absolutely wrong: power = heat = wattage. How you arrive at that wattage is irrelevant; a 100 watt 240v lamp produces the same amount of heat as a 12v 100 watt lamp, namely 100 watts.
Power=heat=wattage is not true. Gaff is right: not all 100w lamps produce the same amount of heat. Power=heat=wattage would imply that all devices that draw the same wattage produce the same amount of heat, which also isn't true. I would happily run around with my IR thermometer to prove that; I just have to collect enough lamps of the same wattage. It's pretty easy to test, though, and I will happily admit being wrong if I am, but I was taught that it is resistance that produces heat.

Also think about it this way: if there were no resistance there would be no heat, and if there were no resistance there would be no light.
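
To put numbers on that 240v-vs-12v example, here's a small Python sketch treating each filament as a plain resistor at operating temperature. Same 100 watts, wildly different resistance and current:

```python
# Two 100 W lamps at different voltages: identical power dissipated,
# very different resistance and current (P = V^2/R = I^2 * R).
def filament(volts: float, watts: float):
    amps = watts / volts   # I = P / V
    ohms = volts / amps    # R = V / I
    return amps, ohms

for v in (240.0, 12.0):
    amps, ohms = filament(v, 100.0)
    print(f"{v:5.0f} V lamp: {amps:5.2f} A through {ohms:6.1f} ohm -> 100 W")
```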


allthingstheatre said:
This is not a valid replication of the theatre situation, which is:
pre-heat lights
turn lights off for 30-45 minutes {while audience enters}
turn lights up to full for 4 hours {cue 1 go}
Even if pre-heating did work, the break of at least half an hour between pre-heat and show completely negates any dubious benefit.
Hate to say it, but for most shows that I have worked on, many lights are on as the audience comes in. Even so, the first cue of most shows (in theatre) is usually not a bump to full. In fact, in most shows it is only at very few points that lights make it to full. It is a fundamental of design: if you start at full you have nowhere to go from there; you can't get more exciting.

The test scenario that was suggested is based on ideal operating conditions for the fixtures. Very few shows would have a light on at full for the entire show, but that is how lamps are tested. In theory it should give as close to the rated lamp life as you can get when turning the lamps on and off. This is because lamp life is determined by running the lamps at full until they die.

In theory the halogen cycle will start as soon as the filament is hot enough to start burning off tungsten. At this point it should be hot enough to break the tungsten-halide molecules and have the tungsten redeposited. You can think about it this way too: if a 12v lamp says that for best results you shouldn't run it at less than 80% on a dimmer, then, since 20% of US line voltage is 24v, running that 120v lamp at 20% should be enough to get the tungsten cycle going. This may not have the cycle at its most efficient, but it will be working nonetheless.
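
Here's that 20%-of-line arithmetic as a sketch, assuming a dimmer level maps linearly to RMS voltage (real phase-control dimmers only approximate this, so treat the numbers as ballpark):

```python
# Ballpark: RMS volts at a given dimmer level, assuming a linear
# level-to-voltage curve (phase-control dimmers only approximate this).
LINE_V = 120.0  # US line voltage

def rms_volts(level_pct: float) -> float:
    return LINE_V * level_pct / 100.0

for pct in (10, 20, 80, 100):
    print(f"{pct:3d}% -> {rms_volts(pct):5.1f} V")
# 20% -> 24.0 V, the figure used in the argument above
```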


Keep in mind that this is all in theory, and I would be happy to be proven wrong, but just saying I am wrong because what you say is right doesn't make sense. I have offered plenty of reasoning for my thoughts here, and I would happily call some of my old professors, who would gladly help answer the questions here.
 
Some scientific principles apply here: Input → Output, and Input = Output. Basically, whatever you put into something, you are going to get back as output in some form. In this case, our input is electricity, and we will measure that in watts (volts times amps). Our known outputs are heat, light, magnetism, and maybe a little of something else, who knows! So, the question is: how much of each? 98% heat, 1.5% light, 0.5% magnetism? If so, our bulb has a light efficiency of 1.5%.

It is putting out other stuff, but we only really want light. If all lamps were the same efficiency, and everything else were constant, then all bulbs of the same wattage would produce the same amount of heat. But they aren't. If a lamp is 2% efficient, it will produce that much less heat. Mind you, the numbers I picked are arbitrary, but I do know that efficiency is a variable. If efficiency is variable, then the heat output from lights of the same wattage is variable. If not, then we had better dig up Einstein and ask him what went wrong ;)
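
Here's that bookkeeping in a few lines of Python, using the same arbitrary efficiency figures as above (they're placeholders, not measurements):

```python
# Input = output: whatever isn't light leaves as heat (ignoring the
# small magnetism / "something else" terms). Efficiencies are the
# same arbitrary placeholders used above, not measured values.
WATTS = 100.0

for efficiency in (0.015, 0.02):  # 1.5% vs 2% light
    light = WATTS * efficiency
    heat = WATTS - light
    print(f"{efficiency:.1%} efficient: {light:.1f} W light, {heat:.1f} W heat")
```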
 
Alex has inspired me to conduct an experiment with my own IR thermometer. I measured the surface temp of the envelope of a 500 watt EHD lamp at various voltages. Results are attached below. I have no idea what, if anything, it proves.

Until it's proved or explained to me that what I'm doing is actually detrimental to my lamps, I will continue to do as I've always done as Master Elect/Board Op. on long running shows:

Come in for show call. All dimmers to 10-25% over at least 5 minutes to: a) gently warm the filaments, and b) allow me onstage to check for burnouts and melted color media. [We ALL agree that bringing a lamp up slowly is less likely to cause a failure than bumping it to full, correct?] Then, either auto, or RFU, or another person at the board (although I prefer doing it auto once I know the focus; I find it more efficient), bring each dimmer to Full (manually: <ch1 @ FL *>, or auto (effect): "on a 2 second fade").

If a lamp has failed and gone unnoticed during the previous show, I'll see it during the preheat period. If a lamp fails during the dimmer/focus check, I'll see it.

I have discovered, through experience since 1981 when I first started running memory consoles, that if I preheat lamps at the beginning of the day, I change fewer lamps than if I come in and start with <Dimmer 1 @ FL *>. Note I only do this once we enter the "show run" phase, not during hangs, focuses, or techs. We always seem to lose more lamps then, possibly from fixtures being moved about, possibly as a way of "weeding out" lamps near the end of their life.

An HPL575 is what, about $20 US, and has a rated life of 300 hours. That's 6.7¢/hour, worst case scenario, and we know that running the lamp at 90% extends life something like 400% (not exact, an old "rule of thumb"). [We ALL agree that we generally get more than the rated life out of our lamps in the theatre, correct?] It's not the cost of the lamp or its life or the cost of power I'm worried about; it's the labor cost to replace it and the detrimental effect it will have if it blows during a show.
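
For what it's worth, here's that math in Python, using the classic incandescent rule of thumb that life scales roughly as voltage to the -12th power. That exponent is an approximation, like the "400%" figure itself:

```python
# Cost per hour, plus the old life-vs-voltage rule of thumb:
# life ~ (V / V_rated) ** -12. An approximation, not a spec.
LAMP_COST = 20.00    # USD, approximate HPL575 street price
RATED_LIFE = 300.0   # hours at full voltage

print(f"at full: {100 * LAMP_COST / RATED_LIFE:.1f} cents/hour")

life_90 = RATED_LIFE * 0.90 ** -12   # ~1060 hours, i.e. ~350% of rated
print(f"at 90%: ~{life_90:.0f} hours "
      f"({100 * LAMP_COST / life_90:.1f} cents/hour)")
```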

This has been one of the best discussions (arguments) you've ever posed, Gafftaper. Time to start another, as I feel this horse is dead. (Though its cause of death is still undetermined, and likely to be so for a long while.)

(JD/steveterry/BillESC/Logos--You may remember the "urban myth" that circulated in the 1970s upon the introduction of Q/I (T/H) lamps. It was said they wouldn't hold up if bumped to full, and therefore were inappropriate for rock&roll. I don't think we need Mythbusters to disprove that.)
 
Derek, what is that lamp base from?
 
Uh oh Derek, you're setting a bad example for the younger ones.
That cable coming out of your switchbox looks like SJTW, and the strain relief is a simple clamp.
 
Thank you, Philip, for noticing. The cable IS in fact SJTW and it IS a 19¢ "Romex Clamp." It was built before you were born, and I have used it as a "Booth Running Light Dimmer" many, many times. I dare say IT has run more shows than you have.

"Do as I say, not as I do." Or, if you prefer, "Because I'm your mother, that's why."
 
Power = heat = wattage is absolutely true, unless you have discovered a new branch of physics. Newtonian physics says that energy cannot be destroyed, only turned from one form to another, e.g. electricity → heat → light.
If one has to invent a new branch of physics to justify pre-heating, then please feel free.
 
Power = heat = wattage is absolutely true, unless you have discovered a new branch of physics. Newtonian physics says that energy cannot be destroyed, only turned from one form to another, e.g. electricity → heat → light. If one has to invent a new branch of physics to justify pre-heating, then please feel free.

WHAT?

You are 100% correct about Newton... the electricity is being converted to light AND heat. The design of the lamp dramatically alters how much becomes heat and how much becomes light. A very efficient design will produce more light and less heat. If I go to the hardware store and buy two 100 watt lamps with different lumen ratings, clearly they must run at different temperatures, because, as you said, the energy is either being converted to heat or light. If they have different light outputs then they MUST also have different heat outputs to compensate.

If power=heat=watts then explain this to me:
A 1000 watt FFN VNS PAR64 lamp puts out 11,000 lumens
A 1000 watt BTR lamp puts out 20,500 lumens

The BTR is far more efficient, converting more of its energy to light than the FFN. They both consume 1000 watts of power. Where does the rest of the energy go? It turns to heat, but at different rates due to the different efficiencies of the two lamps. Furthermore, the temperatures of the lamps will be vastly different due to the difference in envelope size available to dissipate that heat.
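
Dividing the published lumen figures by wattage makes the efficiency gap explicit (a trivial calculation, but it's the crux of the argument):

```python
# Luminous efficacy (lumens per watt) from the published figures above.
lamps = {"FFN PAR64": (1000, 11_000), "BTR": (1000, 20_500)}

for name, (watts, lumens) in lamps.items():
    print(f"{name}: {lumens / watts:.1f} lm/W")
# FFN: 11.0 lm/W; BTR: 20.5 lm/W -- nearly double the light per watt
```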
 
Gafftapegreenia has either given up or is trying to cheat by emailing me.;) Either way, his answer is below.

I think I said before that I "liberated" this fixture because the students kept hanging it upside down. The label on one side of the lens tube was affixed upside down at the factory. So much for quality control in 1979!

Until the SourceFour™ in 1994, this was my favorite line of ERSs. I still prefer its lamp alignment over all others: completely tool-less, every lamp change requires a peak-cosine adjustment, and the joystick is the simplest I've ever used. Strange that the Photometrics Handbook doesn't have the entire line of these--I wish I had the original cut sheets to send to Mr. Mumm.
 
It is so simple, I can't understand the problem.
A 1000 watt lamp consumes that amount of power, which it converts to heat; this heat is converted to light.
There is a trade-off between all the design factors in a lamp, such as robustness, life, output levels, colour temperature, etc.
So you choose the optimum lamp for your application, e.g. very bright for very few hours, or very dim for a hundred years, or some point in between.
Whichever end of this range you choose, you are still using 1000 watts, and the efficiency is your choice, to suit your needs.
Because the light conversion is very low, practically both lamps will produce the same amount of heat.

It is simply a question of how much heat is generating light, which is the problem in this thread.
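
To put a rough number on "very low," here's a sketch converting the quoted lumen figures back into watts of visible light. It uses the 683 lm/W theoretical maximum (monochromatic 555 nm), so the visible-watt values are understated; the point is the order of magnitude, not the exact split:

```python
# How much of 1000 W leaves as visible light? 683 lm/W is the
# theoretical ceiling (monochromatic 555 nm), so these visible-watt
# figures are lower bounds -- the order of magnitude is the point.
MAX_EFFICACY = 683.0  # lm/W

for name, lumens in (("FFN", 11_000), ("BTR", 20_500)):
    visible_w = lumens / MAX_EFFICACY
    print(f"{name}: >= {visible_w:.0f} W visible, "
          f"<= {1000 - visible_w:.0f} W heat and IR")
```

On those numbers both sides have a point: the two lamps' non-light outputs do differ, but only by a couple of percent of the total.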
 
Gah, Strand-Century. I'm not too good with all the ERSs from that vintage; I don't come into contact with Strands much.
 
It is so simple, I can't understand the problem.
A 1000 watt lamp consumes that amount of power, which it converts to heat; this heat is converted to light.
There is a trade-off between all the design factors in a lamp, such as robustness, life, output levels, colour temperature, etc.
So you choose the optimum lamp for your application, e.g. very bright for very few hours, or very dim for a hundred years, or some point in between.
Whichever end of this range you choose, you are still using 1000 watts, and the efficiency is your choice, to suit your needs.
Because the light conversion is very low, practically both lamps will produce the same amount of heat.
It is simply a question of how much heat is generating light, which is the problem in this thread.

Heat cannot be converted to anything, let alone light. That is the law of entropy and chaos theory. Heat, in and of itself, cannot do any work. The only thing that heat can do is escape and eventually grow cool. Heat is a product of work. To get heat you need to use energy; heat itself has no potential energy. Heat can be transferred between bodies, but only until a thermal equilibrium is reached.

On the other hand, light is capable of doing work. For one thing, light can be converted to heat. Light can also be converted to electricity using photovoltaic cells.

When you turn on a light, the resistance of the filament causes it to glow, emitting light. The resistance of the filament also creates heat. The heat does not create the light, the resistance of the filament creates the light.

Also think on the black body radiator model: it says that to have a warmer color temp, the actual heat produced is lower. Therefore an HPL750 should be physically hotter than an HPL750X, which has a warmer color temp.
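
As a black-body sketch of that claim, here's Wien's displacement law applied to the two lamps' nominal color temperatures (roughly 3250 K for the HPL750 and 3050 K for the long-life HPL750X; treat both figures as approximate):

```python
# Wien's displacement law: a hotter filament radiates with its peak
# at a shorter wavelength. Color temps are nominal/approximate.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

for name, kelvin in (("HPL750", 3250), ("HPL750X", 3050)):
    peak_nm = WIEN_B / kelvin * 1e9
    print(f"{name}: {kelvin} K -> radiation peak at {peak_nm:.0f} nm (infrared)")
```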

You still have not backed up anything you have said with either scientific data or real-world experimental data. So, if I have to, I will go in to work tomorrow and shoot the temperatures of the two different 300w lamps that we use for house lights, and I will shoot the temps of an HX-755 (a 750w long life lamp) and an HPL750. I don't think I have any low voltage lamps that I can test. I still have to work out a test procedure, though, because I don't think my IR thermometer can measure high enough. I have a digital temperature probe that has a higher max, but I don't know if I can do what I need to with it.

As I have said, I am happy to be proven wrong, but so far no one has offered any proof.

Try looking at it this way: there are many people who teach preheating, there are many who preach it, and there are many who do it. If there was no good reason to do it, why would so many people think that it is a good thing? Every stereotype is based on fact; the fact may not be a fact any more, but it was at some point.
 
Ah, them Lekos... nice, want one. Yep, grew up with them also - back in high school and to some extent in college.

Still, one thing about pre-heating that has not been mentioned is the pre-heat voltage many types of dimmers and presets apply to the lamps, as a benefit both in avoiding cold starts and in quicker lamp-to-full times. Most dimmers I have worked on over the years do have a trim setting whereby you can adjust the minimum voltage applied to the lamps - normally something like 13v - or adjust the top end down so full voltage is never applied, extending lamp life. You can adjust the dimmers for this setting, and it's a common problem in setting the dimmer trim. On more modern dimmers, I realize, this means something like a 0-85% actual dimmer ratio as the base of what is set, between warming current for the lamps at the bottom and what is applied to them at full. See the sketch below.
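
As a sketch of what that trim does, assuming a 13v floor, an 85% ceiling, and a linear curve in between (all three are simplifications for illustration):

```python
# Hypothetical dimmer-trim mapping: a preheat floor plus a top-end
# ceiling. The 13 V floor and 85% ceiling echo the figures above;
# the linear curve between them is a simplification.
LINE_V = 120.0
FLOOR_V = 13.0    # warming voltage applied even at level 0
CEILING = 0.85    # never apply more than 85% of line voltage

def lamp_volts(level: float) -> float:
    """level is 0.0-1.0 from the console; returns volts at the lamp."""
    top_v = LINE_V * CEILING
    return FLOOR_V + (top_v - FLOOR_V) * level

for lvl in (0.0, 0.25, 0.5, 1.0):
    print(f"level {lvl:.2f} -> {lamp_volts(lvl):5.1f} V")
```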

A few styles of pre-show lamp check have been mentioned - either lamp by lamp, or a low dimmer setting with all on and searching for the one that is out. Both work adequately when brought up slow from a cold start, or fine when brought up after a short period of time with the dimmers simply on.

It is often noted that, short of pre-warming the lamps, bringing them up slow is necessary - this especially with low voltage lamps, I would add.

Valid point, the question of whether doing the pre-show warming of the lamps - given that warming current, and the time to cool down before the show starts - does any good at all. Perhaps such a concept is left over from the days before the warming currents I am familiar with, which will zap you to some extent if you're not aware of such a thing. Then again, if a lamp is only warmed before the show, perhaps it is not such a bad thing to induce a bit more to it so as to ready / refresh that lamp before its important use. This by way of readying the lamp for use, or simply attempting to blow it out, if it is going to go, before the show rather than during the show.

Still, I would think, tradition or not, doing what is common to do is not a bad thing, overall benefit or not. If nothing else, there's an extra chance that a lamp that is ready to go will go before rather than during the show.

Good discussion; all sides to some extent have valid points, and it's good to learn from them. Much further discussion and study on both sides will no doubt end in a much better understanding of what to do and how it all works. Not a bad thing.
 
Icewolf said:
Heat cannot be converted to anything, let alone light. That is the law of entropy and chaos theory. Heat, in and of itself, cannot do any work. The only thing that heat can do is escape and eventually grow cool. Heat is a product of work. To get heat you need to use energy; heat itself has no potential energy. Heat can be transferred between bodies, but only until a thermal equilibrium is reached.

On the other hand, light is capable of doing work. For one thing, light can be converted to heat. Light can also be converted to electricity using photovoltaic cells.

When you turn on a light, the resistance of the filament causes it to glow, emitting light. The resistance of the filament also creates heat. The heat does not create the light, the resistance of the filament creates the light.

ABSOLUTELY WRONG. Of course heat can be converted to light or electricity or work. The resistance of the filament does not cause it to glow; I have boxes of resistors, none of which are glowing. It is only when a filament is hot that it produces light.
If some poor physics student comes across Icewolf's theory and uses it in class, he or she will be seriously embarrassed.
 
