As many people here know, in the United States a continuous load can't exceed 80% of the circuit breaker rating (the NEC's 125% rule for continuous loads). Sometimes this is straightforward, but with LED strips there are several plausible ways to calculate the load. I'm wondering which is correct under most city codes, and whether there's a way to read/interpret the building code to clarify which method to use.
For example, suppose there are a bunch of light-up boxes, and each box has a Meanwell HLG-240H-24A power supply (240 W max output) with 300 W of RGB LED strip connected, but the program only ever displays primary colors (100 W) or secondary colors (200 W).
Here's the Meanwell datasheet:
http://www.meanwellusa.com/webapp/product/search.aspx?prod=HLG-240H
which gives the following info:
- Page 2 OUTPUT: 240 W maximum output power. The output is 24 V, current-limited to 10 A; if the load draws 10 A before the output reaches 24 V, the voltage drops, as seen on page 3.
- Page 2 INPUT: AC CURRENT (Typ.): 4 A at 115 VAC
- Page 6 chart: power factor is around 0.99 in CC mode. (100% load is where the CC curve meets the CV curve.)
- Page 6 chart: efficiency is around 90% at 100% load.
Now, I don't see how it could possibly draw 4 A at 115 V when the maximum DC output is only 240 W. Either the supply would dissipate a lot of heat or the power factor would be terrible, and both contradict the datasheet charts. Yet this is what the official datasheet lists.
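To put numbers on that (assuming the 4 A figure is meant as a steady-state RMS draw): 4 A × 115 V ≈ 460 VA, and at a power factor of 0.99 that's roughly 455 W of real input power for at most 240 W out, i.e. about 53% efficiency, nowhere near the ~90% shown in the chart.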
Alternatively, the line current can be estimated from the output power:
AC current = DC power / efficiency / power factor / AC voltage
Calculating using maximum power output:
240 W / 0.90 / 0.99 / 120 V = 2.24 A
Calculating with secondary colors:
200 W / 0.90 / 0.99 / 120 V = 1.87 A
Or the fictional all-white 100% scenario (more than the supply can actually deliver):
300 W / 0.90 / 0.99 / 120 V = 2.81 A
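For comparison, here's a minimal Python sketch of the three estimates (the 0.90 efficiency and 0.99 power factor are values I read off the datasheet charts, so treat them as approximations):

```python
# Estimate AC line current for one box under three load assumptions.
# Efficiency and power factor are approximate values read from the
# HLG-240H-24A datasheet charts at 100% load.

EFFICIENCY = 0.90    # ~90% at full load (datasheet page 6)
POWER_FACTOR = 0.99  # ~0.99 in CC mode (datasheet page 6)
LINE_VOLTAGE = 120   # nominal US line voltage

def line_current(dc_watts: float) -> float:
    """AC line current (A) implied by a given DC output power."""
    return dc_watts / EFFICIENCY / POWER_FACTOR / LINE_VOLTAGE

scenarios = {
    "supply maximum (240 W)": 240,
    "secondary colors (200 W)": 200,
    "all-white strip rating (300 W, exceeds supply limit)": 300,
}

for name, watts in scenarios.items():
    print(f"{name}: {line_current(watts):.2f} A")
```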
Sounds silly, but I've seen somebody use the full wattage of the connected LED strips as the max load, even though the power supply output is current-limited.
Any thoughts on which method to use? With a large number of boxes, the calculation method can greatly affect the number of breakers required to meet code.
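To make the stakes concrete with the numbers above, assuming (hypothetically) 20 A branch circuits: the 80% rule allows 16 A per breaker, so the datasheet's 4 A figure permits only 4 boxes per circuit, the 240 W calculation permits 7 (16 / 2.24 ≈ 7.1), and the 200 W calculation permits 8 (16 / 1.87 ≈ 8.6).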
It looks like these calculations usually use a nominal voltage of 120 V, although I thought 115 V was nominal and the range was 110 to 125 V.