David Ashton
Well-Known Member
The resistance increases with temperature, by a factor of around 10 in a lamp filament. The only exception, if my recollections from school are correct, is carbon, which has a negative temperature coefficient.
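That factor-of-10 resistance change is why the switch-on inrush current is so large. A minimal sketch, assuming a hypothetical 1000 W, 120 V lamp and taking the cold resistance as 1/10 of the hot value (the factor quoted above):

```python
# Sketch (illustrative values, not measurements): inrush current
# from a cold tungsten filament.
V = 120.0          # supply voltage, volts
P_rated = 1000.0   # rated power, watts

R_hot = V**2 / P_rated   # filament resistance at operating temperature, from P = V^2 / R
R_cold = R_hot / 10      # approximate cold resistance (assumed factor of 10)

I_hot = V / R_hot        # steady-state current
I_cold = V / R_cold      # inrush current at switch-on

print(f"R_hot  = {R_hot:.1f} ohms, I_hot  = {I_hot:.2f} A")
print(f"R_cold = {R_cold:.2f} ohms, I_cold = {I_cold:.2f} A (~10x inrush)")
```

That roughly tenfold current surge into the cold filament is the stress that preheating is supposed to reduce.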
So, that still brings us back to the preheating question. It isn't a question of whether preheating will make your lamp last longer than the rated lamp life; it's a question of whether preheating will make the lamp last longer than not preheating it.
Probably the ideal test would be to take two identical fixtures and lamps from the same lot, preheat one at some low level for some time, and then turn both fixtures on to full in a 0-count. Let the fixtures burn for about 4 hours a day and repeat each day until one dies. I picked a 4-hour burn time because most theatres probably don't run their lamps for more than 4 hours a day. Running the lamps at full gives us optimum operating conditions, so we should get close to the rated lamp life.
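For a sense of how long that A/B test would run, here is a quick sketch, assuming a hypothetical rated life of 300 hours (a common order of magnitude for theatrical tungsten-halogen lamps; the actual figure varies widely by lamp type) and the 4-hours-a-day schedule described above:

```python
# Sketch: duration of the proposed burn test under assumed values.
rated_life_hours = 300   # assumed rated lamp life; check the datasheet for a real lamp
hours_per_day = 4        # burn time per day, from the proposed test

days_to_rated_life = rated_life_hours / hours_per_day
print(f"{days_to_rated_life:.0f} days to reach rated life")
```

So even in the best case, the test would take a couple of months per lamp pair before either fixture reached its rated life.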
This is not a valid replication of the theatre situation, which is:
pre-heat lights
turn lights off for 30-45 minutes {while audience enters}
turn lights up to full for 4 hours {cue 1 go}
Even if pre-heating did work, the break of at least half an hour between pre-heat and show completely negates any dubious benefit.
Power = heat = wattage is not true. Gaff is right: not all 100 W lamps produce the same amount of heat. Power = heat = wattage would imply that all devices drawing the same wattage produce the same amount of heat, which also isn't true. I would happily run around with my IR thermometer to prove that; I just have to collect enough lamps of the same wattage. It's pretty easy to test, though, and I will happily admit being wrong if I am, but I was taught that it is resistance that produces heat.
allthingstheatre said:
This is not a valid replication of the theatre situation, which is:
pre-heat lights
turn lights off for 30-45 minutes {while audience enters}
turn lights up to full for 4 hours {cue 1 go}
Even if pre-heating did work, the break of at least half an hour between pre-heat and show completely negates any dubious benefit.

Hate to say it, but for most shows that I have worked on, many lights are on as the audience comes in. Even so, the first cue of most shows (in theatre) is usually not a bump to full. In fact, in most shows it is only at a very few points that the lights make it to full. It is a fundamental of design: if you start at full you have nowhere to go from there; you can't get more exciting.
Derek, what is that lamp base from?

You're the next generation of "Theatre Lighting Historian", you tell me! The fixture IS listed in the Photometrics Handbook, many times. Nobody else help him, okay? And it's a "Lamp Cap" (complete assembly). You can't see the lamp base (or socket) in the picture.
"Do as I say, not as I do." Or, if you prefer, "Because I'm your mother, that's why."

I prefer the former.
Power = heat = wattage is absolutely true, unless you have discovered a new branch of physics. Newtonian physics says that energy cannot be destroyed, only turned from one form to another, e.g. electricity to heat to light. If one has to invent a new branch of physics to justify pre-heating, then please feel free.
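For a purely resistive load such as a filament, the power drawn and the heat dissipated are the same quantity, which a quick check of the algebra confirms. A minimal sketch, using an assumed 14.4-ohm hot filament resistance on a 120 V supply:

```python
# Sketch: for a resistive load, electrical power drawn and Joule heating
# are the same number: P = V * I = I^2 * R. Illustrative values only.
V = 120.0   # volts
R = 14.4    # ohms (assumed hot filament resistance)

I = V / R             # Ohm's law
P_electrical = V * I  # power drawn from the supply
P_joule = I**2 * R    # heat dissipated in the filament

print(P_electrical, P_joule)  # identical for a resistive load
```

Note this identity holds per device: it says nothing about where on the fixture that heat ends up, which is where IR-thermometer readings of different lamps can legitimately differ.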
It is so simple I can't understand the problem. A 1000 watt lamp consumes that amount of power, which it converts to heat; some of this heat is converted to light.
There is a trade-off between all the design factors in a lamp, such as robustness, life, output level, colour temperature, etc.
So you choose the optimum lamp for your application, e.g. very bright for very few hours, or very dim for a hundred years, or some point in between.
Whichever end of this range you choose, you are still using 1000 watts, and the efficiency is your choice, to suit your needs.
Because the light-conversion efficiency is very low, in practice both lamps will produce nearly the same amount of heat. It is simply a question of how much of that power is generating light, which is the question in this thread.
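The energy balance above can be sketched numerically. Assuming (hypothetically, but in line with typical incandescent figures) that only about 5% of the input power leaves a tungsten lamp as visible light:

```python
# Sketch: energy balance for a 1000 W tungsten lamp, with an assumed
# ~5% visible-light conversion fraction (illustrative, not a datasheet value).
P_in = 1000.0            # watts consumed
visible_fraction = 0.05  # assumed fraction emitted as visible light

P_light = P_in * visible_fraction
P_heat = P_in - P_light

print(f"light: {P_light:.0f} W, heat: {P_heat:.0f} W")
```

Whether the true fraction is 3% or 8%, the point stands: two 1000 W lamps of different efficiencies still dump roughly the same ~950 W of heat into the room.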