The question is: is that just the physics of the LEDs? Is there an input threshold at which they go from being off to being on, rather than the smooth fade-up one gets as an incandescent element heats up? Fade-outs haven't been a problem. If I were just bashing lights up and down for rock and roll or dance competitions it wouldn't be a problem, but as a designer I value the potential for subtlety more than flash.
The "pop on" problem you're talking about isn't with the
LED's in the
fixture, it's with the electronics that control them. Most
LED fixtures use
PWM to control the brightness of the
LED's. As was stated,
PWM is a scheme where the
LED's are turned "on" and "off" fast enough that you can't really tell they're turning on and off at all (there are some good reasons this scheme is used instead of simply controlling the
current in the
LED's.. but that's another subject). Again as was stated, for this to be "fast enough to not see", the
LED's need to make (at least) 200 "on/off" cycles
in one second (5msec per "on/off" cycle). If the "on" time and the "off" time are equal, the
LED's are being driven at half
power.. make the "on" time longer than the "off" time and the
LED's look brighter.. make the "on" time shorter than the "off" time and they look
dimmer.
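
If it helps to see that scheme as code, here's a rough sketch of software PWM in C. It's purely illustrative (real fixtures do this with hardware timers), and led_on(), led_off(), and sleep_us() are made-up stand-ins that just print what the hardware would do:

    #include <stdio.h>
    #include <stdint.h>

    #define PERIOD_US 5000  /* 5 msec period = 200 on/off cycles per second */

    /* Stand-ins for the real driver hardware: these just print their action. */
    static void led_on(void)  { printf("LED on\n"); }
    static void led_off(void) { printf("LED off\n"); }
    static void sleep_us(uint32_t us) { printf("  ...hold for %u usec\n", (unsigned)us); }

    /* Run one PWM cycle at the given duty cycle (0-100%). */
    static void pwm_cycle(uint8_t duty_percent)
    {
        uint32_t on_us  = PERIOD_US * duty_percent / 100;  /* "on" time  */
        uint32_t off_us = PERIOD_US - on_us;               /* "off" time */

        led_on();
        sleep_us(on_us);
        led_off();
        sleep_us(off_us);
    }

    int main(void)
    {
        pwm_cycle(50);  /* half power: 2500 usec on, 2500 usec off */
        pwm_cycle(75);  /* brighter:   3750 usec on, 1250 usec off */
        pwm_cycle(25);  /* dimmer:     1250 usec on, 3750 usec off */
        return 0;
    }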
Here's where it gets a little technical, but it's important to understand the terms "period" and "resolution" as they relate to PWM. The 5 msec on/off cycle time of the LEDs is called the "period" of the PWM implementation. The amount of on time compared to off time is called the "duty cycle," and is normally expressed as a percentage ratio of the on time to the period (on time ÷ period × 100). So, for example, 50% duty cycle is when on time and off time are the same (on time is half the period)... 100% duty cycle is when the LED is on for the complete cycle (never turns off).
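
If you want to check that arithmetic yourself, here's the formula in C (the 2.5 msec on time is just an example value):

    #include <stdio.h>

    int main(void)
    {
        double period_ms = 5.0;   /* the 5 msec PWM period */
        double on_ms     = 2.5;   /* example: on for half the period */

        double duty_pct = on_ms / period_ms * 100.0;  /* on time / period x 100 */
        printf("duty cycle = %.0f%%\n", duty_pct);    /* prints: duty cycle = 50% */
        return 0;
    }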
The "resolution" refers to how finely the 5msec "period" can be sliced up by the electronics to do the on and off timing. Microprocessors are typically used for these circuits, and as such, "resolution" is determined by the "number of bits" dedicated by the microprocessors internal hardware for this task. 8bits and 10bits of "resolution" are fairly common, and represent 256 slices and 1024 slices respectively. Here's the important part... The dimmest "on" setting for the
LED, is when one(1) of the "resolution" slices turns the
LED on, and the rest turn the
LED off... so with 8bits of "resolution" and 5msec of "period", that's 1/256 of 5msec, which is about 0.02msec "on" and 4.98msec "off". With high powered
LED's, that's more than enough "on time" for the
LED to cast considerable light. A shorthand way to think about this, is to divide the maximum
lumen output of the
fixture by the "resolution", and the result is the lumens generated when the
fixture is at it's lowest "on" setting. So if your starting out at 3000 lumens, with 8bits of resolution, the minimum "on" setting is something like 10-12 lumens (3000/256).. that's a pretty considerable "pop on". It's better at 10bits of resolution.. where the minimum setting is in the range of 3 lumens, but even that will be seen to "pop on", especially if there are multiple fixtures being used. It turns out that to get smooth dimming at the bottom and a minimum amount of "pop on", you really want 12bits (4096 slices) of resolution or better. It also turns out that you won't typically find this spec on a mfgr's data sheet.
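
If you want to play with the numbers, here's that shorthand worked out in C for 8, 10, and 12 bits, using the same 5 msec period and a made-up 3000 lumen fixture (and assuming light output scales linearly with duty cycle):

    #include <stdio.h>

    int main(void)
    {
        double period_ms  = 5.0;     /* the 5 msec PWM period from above */
        double max_lumens = 3000.0;  /* hypothetical fixture's full output */
        int bits[] = { 8, 10, 12 };  /* common PWM resolutions */

        for (int i = 0; i < 3; i++) {
            long slices = 1L << bits[i];  /* 2^bits slices of the period */
            /* Minimum "on" step: one slice of the period, one slice of the lumens. */
            printf("%2d bits: %4ld slices, min on time %.4f msec, min output ~%.2f lumens\n",
                   bits[i], slices, period_ms / slices, max_lumens / slices);
        }
        return 0;
    }

That prints roughly 11.7, 2.9, and 0.7 lumens for the lowest "on" step, which is why the "pop on" doesn't really disappear until you get to 12 bits.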