Control/Dimming LED frequency: does it affect lifespan?

jonares

Member
Howdy... to the hive mind: I had a photographer complaining about flicker on our ColorSource engines with LED cyc adapters. I helped him with better shutter-speed options, but I looked to see if I could change the PWM frequency of the ColorSource. Sure 'nuff, it's do-able... I've never been able to get RDM to work in my spaces, but I was able to change it on the cycs using my DMXCat. Anywho... the ColorSource defaults to 1200 Hz, but has the option to go to 25 kHz. So I fixed 'em. But... why does it default to 1200 Hz? Do higher frequencies shorten the life of the electronics? Can I just leave (or convert the rest of my ColorSources to) 25 kHz and not worry about 'em?
 
As a general rule, dimming performance will be better at lower frequencies. 25 kHz is "flicker-free," but the low-end dimming will suffer a bit.

I've never heard of any longevity implications.

Luke
 

Jay Ashworth

Well-Known Member
I would have expected possible reduced output, but not a reduction in "dimming performance" (which seems like it could mean several disparate things); can someone expand a sentence or two more on that?
 

aberry

Member
I would have expected possible reduced output, but not a reduction in "dimming performance" (which seems like it could mean several disparate things); can someone expand a sentence or two more on that?

Sure, but might be more than a sentence or two :p

Increasing the PWM frequency requires either increasing the frequency of the clock that drives the PWM counter or reducing the resolution of the counter. Since the maximum clock frequency is limited by the hardware generating the PWM signal, at a certain point you can't increase the clock frequency any further and have to reduce the resolution. That means that instead of, say, 16-bit resolution (65,536 possible dimming steps) you might only get 8-bit resolution (256 possible dimming steps). The steps are bigger, so you're more likely to see them as little bumps at the lowest end of the dimming curve (since the human eye is most sensitive to brightness changes at low light levels). You can partly compensate for this by dithering, where the duty cycle wobbles between adjacent values from one PWM cycle to the next, effectively averaging the two, but that only helps so much.
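To make that tradeoff concrete, here's a quick numeric sketch. The 80 MHz counter clock is an assumption picked for illustration, not a figure from any actual ColorSource hardware:

```python
# Sketch: PWM resolution vs. frequency for a fixed counter clock.
# CLOCK is a hypothetical value for illustration only.

def pwm_steps(clock_hz: float, pwm_hz: float) -> int:
    """Number of distinct duty-cycle steps the counter can resolve."""
    return int(clock_hz // pwm_hz)

CLOCK = 80e6  # hypothetical 80 MHz counter clock

low = pwm_steps(CLOCK, 1_200)    # 66,666 steps (~16 bits)
high = pwm_steps(CLOCK, 25_000)  # 3,200 steps (~11.6 bits)
print(low, high)

# Dithering: alternate between two adjacent counter values on
# successive PWM cycles so the *average* duty cycle lands between
# two hardware steps.
def dithered_avg(step_a: int, step_b: int, steps: int) -> float:
    """Effective duty cycle when alternating between two step values."""
    return (step_a + step_b) / 2 / steps

print(dithered_avg(10, 11, high))  # effective duty between steps 10 and 11
```

Same clock, ~20x fewer dimming steps at 25 kHz: that's where the coarser low-end behavior comes from.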

In addition to the limitations of the PWM counter, there are electrical limitations. The efficiency of direct PWM dimming comes from the fact that the power transistor switching the load is always fully on or fully off. When the transistor is off it's effectively an open circuit, and when it's on it's effectively a short circuit--either way it dissipates very little power. But going from off to on and back requires passing through a region where the transistor has finite, non-zero resistance and DOES dissipate power, and crossing that region takes a certain amount of time. So if the switching time is fixed, a higher switching frequency means the transistor spends a larger fraction of each cycle in that transition region and, overall, dissipates more power. And if the on and off transitions get too close together (very low or very high duty cycle at a higher frequency), the transistor won't switch fully, and the relationship between PWM duty cycle and LED output gets a little funny because the pulse is no longer rectangular. You can reduce the switching time by driving the transistor harder, but that requires a higher-power gate drive, which itself consumes more power, and harder switching makes for harder EMC challenges.
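You can see the frequency dependence of those transition losses with the common first-order approximation P_sw ≈ 0.5 · V · I · (t_rise + t_fall) · f_sw. The voltage, current, and transition times below are illustrative guesses, not measurements from any real fixture:

```python
# Sketch: transistor switching loss grows linearly with PWM frequency.
# All electrical values are hypothetical, for illustration only.

def switching_loss(v: float, i: float, t_rise: float, t_fall: float,
                   f_sw: float) -> float:
    """Approximate power (watts) dissipated during on/off transitions."""
    return 0.5 * v * i * (t_rise + t_fall) * f_sw

V, I = 48.0, 1.0      # hypothetical LED string voltage and current
T_R = T_F = 100e-9    # assumed 100 ns rise/fall times

print(switching_loss(V, I, T_R, T_F, 1_200))   # ~6 mW at 1.2 kHz
print(switching_loss(V, I, T_R, T_F, 25_000))  # ~120 mW at 25 kHz
```

Twenty times the frequency, twenty times the switching loss--small in absolute terms here, but it scales with current and becomes real heat in a high-power engine.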

So fundamentally it comes down to a limitation on how finely you can divide the PWM cycle, so when you increase the frequency you have fewer divisions available for dimming and thus fewer dimming steps.
 

STEVETERRY

Well-Known Member
Increasing the PWM frequency requires either increasing the frequency of the clock that drives the PWM counter or reducing the resolution of the counter. [...]
Excellent explanation!

ST
 

aberry

Member
Oh. There *is* a D/A conversion in there, right? I assumed the PWM was done on the analog side.
Well, everything gets converted to analog when it hits the real world!

For any sort of luminaire that isn't LED tape, there's typically going to be a switch-mode constant current driver for each string of LEDs. The best way to dim those is to put a transistor in parallel with the LED string--turn the transistor on and it shorts the LEDs out, turning them off. That seems counterintuitive, but since the driver is going to very efficiently hold the output current constant, and the transistor has very low resistance, the output voltage drops to near zero and very little power is dissipated this way. This is a common enough method that a lot of LED driver ICs include provisions for shunt dimming--sometimes they include a driver for the shunt transistor, or at least a PWM sense input so that the internal control loop can switch operating modes between the on and off times for better efficiency/response time. You could also PWM the whole driver on and off, but it takes time for the driver to go from zero to steady state operation at the desired setpoint, so in practice the PWM frequency has to be pretty low. With shunt dimming the drive current continues to circulate through the transistor when the LEDs are off, so as soon as the transistor turns off the LEDs go right back to running at their setpoint, and the ultimate rise/fall times are much shorter, allowing for much faster PWM. Neither of those requires a D-A conversion, so they can provide very good precision with minimal cost.
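Here's a numeric sketch of why shunt dimming wastes so little power. The 700 mA setpoint and 20 mΩ on-resistance are hypothetical values for illustration:

```python
# Shunt PWM dimming sketch: the constant-current driver always sources
# i_set; the shunt transistor steers it away from the LEDs for part of
# each PWM period. Component values are illustrative assumptions.

def led_avg_current(i_set: float, duty: float) -> float:
    """Average LED current; duty = fraction of the PWM period the shunt
    transistor is OFF (i.e., the LEDs are conducting)."""
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty must be in [0, 1]")
    return i_set * duty

def shunt_loss(i_set: float, r_ds_on: float, duty: float) -> float:
    """Power dissipated in the shunt transistor while it carries the
    full drive current (LEDs shorted out)."""
    return i_set ** 2 * r_ds_on * (1.0 - duty)

I_SET = 0.7      # hypothetical 700 mA driver setpoint
R_DS_ON = 0.020  # hypothetical 20 milliohm shunt transistor

print(led_avg_current(I_SET, 0.25))      # 0.175 A at 25% duty
print(shunt_loss(I_SET, R_DS_ON, 0.25))  # ~7 mW burned in the shunt
```

Even at a deep dim, the shunt transistor only dissipates a few milliwatts, because the voltage across it is tiny while it carries the drive current.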

On the other hand, you could implement dimming by adjusting the current setpoint of the driver, which would require some sort of analog input to the driver. That analog signal could be provided by filtering a digital PWM signal to derive an analog DC voltage, but there's a big tradeoff between effective resolution and response time with that method. Even with a proper DAC, though, analog dimming generally doesn't provide nearly as much dimming range or precision. Adjusting the current setpoint is better reserved for trimming the LED strings for color/brightness matching independent of the PWM dimming.
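The resolution-vs-response tradeoff of the filtered-PWM approach falls out of first-order RC filter math. The cutoff and PWM frequencies below are illustrative choices, not values from any particular driver:

```python
import math

# Sketch: deriving an analog control voltage by low-pass filtering a
# PWM signal. A lower cutoff means less ripple (better effective
# resolution) but a slower response. Values are illustrative.

def rc_for_cutoff(fc_hz: float) -> float:
    """RC time constant (seconds) for a first-order cutoff at fc."""
    return 1.0 / (2.0 * math.pi * fc_hz)

def ripple_attenuation(f_pwm: float, fc_hz: float) -> float:
    """First-order filter gain at the PWM fundamental (linear scale)."""
    return 1.0 / math.sqrt(1.0 + (f_pwm / fc_hz) ** 2)

def settle_time(rc: float) -> float:
    """Approximate time to settle within ~1% (five time constants)."""
    return 5.0 * rc

rc = rc_for_cutoff(10.0)                 # 10 Hz cutoff for low ripple
print(ripple_attenuation(25_000, 10.0))  # ~0.0004: ripple well suppressed
print(settle_time(rc))                   # ~0.08 s: visibly slow for fades
```

To get the ripple small enough not to show up as a brightness error, the filter ends up slow enough to lag a fast fade--which is the tradeoff described above.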
 

Jay Ashworth

Well-Known Member
Well, everything gets converted to analog when it hits the real world! [...]
That was an even better explanation, not-at-all-accidentally (I'm sure) addressing the exact point at hand.

Thanks.
 
