Heat vs wattage vs light output

JChenault

Well-Known Member
I have a question for the hive mind.

Would one expect that a lower wattage lamp at full would generate more or less heat than a higher wattage lamp dimmed to produce the same light output?

Certainly the lower wattage lamp at full would have a higher color temperature (in kelvin), but I don’t see how that would matter.

The question that brought this up: I have a fixture that can take either a 20 watt MR11 (running at full) or a 35 watt MR11. If I run the 35 watt unit down on a dimmer so it's the same brightness as the 20 watt unit, which will be hotter?
 
Assuming you're talking about incandescent lamps, rather than LEDs or some other sort of lamp...

I believe the 35 watt light will be generating more heat, as it's less efficient at converting electricity into visible light when dimmed than the 20 watt one at full power. Basically, all the electricity that doesn't become light (which is the great majority of it) becomes heat, so a pretty good estimate of the ratio of heat generated could be found by measuring the power consumption of the two bulbs at their given operating conditions.

The lower color temperature is actually also a clue. Since the light bulb is basically a black body emitter, the lower color temperature means that relatively more radiation is at lower frequencies (in the infrared range and lower), and less in the part of the spectrum we can see, so to get the same amount of visible light there's more infrared, etc. emitted, and the lamp is operating less efficiently.
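To put a rough number on the black-body point (a quick illustrative sketch of my own, not measured lamp data; tungsten is only approximately a black body, and the two filament temperatures below are assumed, not taken from any MR11 spec):

```python
# Hedged sketch: fraction of an ideal black body's radiated power that falls in
# the visible band (~380-780 nm) at two assumed filament temperatures.
# A cooler (dimmed) filament puts a smaller share of its output into visible light.
import numpy as np

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, temp):
    """Black-body spectral radiance at wavelength lam (m) and temperature temp (K)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * K * temp))

def integrate(y, x):
    """Plain trapezoidal integration, to avoid depending on a particular NumPy version."""
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2)

def visible_fraction(temp):
    lam_all = np.linspace(100e-9, 100e-6, 200_000)  # essentially all of the thermal emission
    lam_vis = np.linspace(380e-9, 780e-9, 2_000)    # visible band
    return integrate(planck(lam_vis, temp), lam_vis) / integrate(planck(lam_all, temp), lam_all)

# Assumed filament temperatures: roughly 3100 K near full output, 2600 K when dimmed.
for temp in (3100, 2600):
    print(f"{temp} K: ~{100 * visible_fraction(temp):.1f}% of radiated power is in the visible band")
```

The exact percentages depend on the real filament temperatures, but the cooler, dimmed filament always puts a smaller share of its radiation into the visible band, which is the efficiency loss described above.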

For an LED lamp, it would depend on the LED driver's efficiency and design, but in general I would not expect a vast difference in the power consumption or heat generation for this sort of situation. I also would not generally expect to see a 35 watt (actual power, not "equivalent") MR11 LED lamp, for that matter, nor would I particularly care to have such a beast shining in my eyes at moderately close range.
 
Without doing a bunch of research on lamp specifications and firing up my HP15, I would say that a 35W incandescent lamp dimmed to the same light output as a 20W incandescent lamp running at design voltage would convert a higher percentage of its electrical input into heat than the 20W lamp does. (But both of 'em are pretty small; I usually think in terms of deuces, fives and 10s.)
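As a back-of-the-envelope check (my own sketch, not from any datasheet), the usual incandescent dimming rules of thumb say light output varies roughly with voltage to the 3.4 power and power draw with voltage to about the 1.55 power; the MR11 lumen ratings below are placeholder assumptions:

```python
# Hedged sketch: estimate what a 35 W lamp draws when dimmed to match a 20 W lamp,
# using common incandescent rules of thumb (valid only near rated voltage):
#   lumens ~ (V / V_rated) ** 3.4
#   watts  ~ (V / V_rated) ** 1.55
# The lumen figures are made-up placeholders, not real MR11 ratings.
LUMEN_EXP, WATT_EXP = 3.4, 1.55

lm_20_full = 300.0   # assumed output of the 20 W lamp at full (lm)
lm_35_full = 600.0   # assumed output of the 35 W lamp at full (lm)

# Voltage fraction at which the 35 W lamp matches the 20 W lamp's light output
v_frac = (lm_20_full / lm_35_full) ** (1 / LUMEN_EXP)

# Power the dimmed 35 W lamp draws at that voltage fraction
p_dimmed = 35.0 * v_frac ** WATT_EXP

print(f"Dim the 35 W lamp to ~{v_frac:.0%} of rated voltage")
print(f"It then draws roughly {p_dimmed:.0f} W for the same light the 20 W lamp gives at 20 W")
```

With those placeholder numbers the dimmed 35 W lamp still draws about 25-26 W for the same light, so it dumps more total heat into the fixture than the 20 W lamp at full.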
 
I didn't quite read the real question before I posted above. The answer is: the heat may be the same if the larger lamp takes the same 20W to create the same illumination level as the 20W lamp. If it's more efficient than the 20W lamp, then it could produce less.
 
This reframes the question: does a higher rated lamp (say a 35 watt lamp, or a 1000 watt lamp) consuming 20 watts on a dimmer produce more or fewer lumens than a 20 watt lamp consuming 20 watts?

My suspicion is the higher wattage lamp would need more power for the same number of lumens, but I am not sure.
 
Yes, your suspicion is correct for incandescent lamps, assuming otherwise similar lamps.

In the extreme case, if you take your 1000 watt lamp and operate it at 20W, it probably will produce no visible light (or, at least, none for all practical purposes)--you're operating it at some sort of a preheat level. A 20 watt lamp, on the other hand, might even produce enough light to serve as a ghost light for a (small) stage. Of course, the other side of the coin is that the underpowered lamp will last a whole lot longer, all other things being equal; a 1000 watt lamp producing no visible light at 20W could probably continue to do so without burning out for many decades.

This is also basically the difference between a 300 hour lamp and a 2000 hour lamp of the same general type; the 2000 hour lamp is essentially a higher-voltage, higher-wattage lamp being underdriven. I guess you could instead say that the 300 hour lamp is being overdriven, depending on which you take to be the normal lamp.
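To put rough numbers on that trade-off (my own sketch using commonly quoted incandescent rules of thumb, not manufacturer data): life varies extremely steeply with voltage, something like the 13th power, while light output varies with roughly the 3.4 power.

```python
# Hedged sketch: how underdriving an incandescent lamp trades light output for life.
# Commonly quoted rules of thumb (approximate, and only meaningful near rated voltage):
#   life   ~ (V_rated / V) ** ~13
#   lumens ~ (V / V_rated) ** ~3.4
LIFE_EXP, LUMEN_EXP = 13.0, 3.4

def underdrive(v_frac, rated_life_h=300.0):
    """Return (estimated life in hours, fraction of rated lumens) at v_frac of rated voltage."""
    life = rated_life_h * (1.0 / v_frac) ** LIFE_EXP
    lumens = v_frac ** LUMEN_EXP
    return life, lumens

for v_frac in (1.00, 0.95, 0.90):
    life, lumens = underdrive(v_frac)
    print(f"{v_frac:.0%} of rated voltage: ~{life:,.0f} h life, ~{lumens:.0%} of rated lumens")
```

By these rough rules, running a nominal 300 hour lamp at about 90% of rated voltage already pushes it past 1000 hours while giving up roughly 30% of the light, which is essentially the long-life-lamp trade-off described above.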
 
Heat output of LED or incandescent is a standard 3.41 BTU per hour per watt (+/- 10%) . . .
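For reference, that figure is just the standard watt-to-BTU conversion (1 W dissipated continuously is about 3.412 BTU per hour); a trivial sketch applying it to the two lamps in the original question, assuming all of the input power ends up as heat in the space:

```python
# 1 watt of dissipated power is about 3.412 BTU per hour (a fixed unit conversion).
BTU_PER_HOUR_PER_WATT = 3.412

for watts in (20, 35):
    print(f"{watts} W -> ~{watts * BTU_PER_HOUR_PER_WATT:.0f} BTU/h if all of it ends up as heat")
```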

Granted: the electricity turns into "radiant" energy at that rate, but the issue here is how much of that energy is given off as "heat" and how much as "light". At what frequency shall we say that "above this" is light and "below this" is heat?
 
What makes me certain is the expertise of lamp designers.
One can usually only improve one factor by giving up another. Life vs. output/efficiency by changing voltage is a well-known trade-off for incandescents. Drop the voltage and lose efficiency!
 
