generators?? what?

Hello, I'm new to the site and have a few questions as I'm studying for the ETCP LX test. I know very little about portable generators, and I'm confused about how they are rated and how kVA and kW ratings relate to each other. Can anyone point me in the right direction?
 
You just answered your own question :mrgreen: If you click on the terms you're trying to understand, it will take you to the wiki that describes the terms and how they are used.
 
Yes, the auto-generated links will take you right to the definitions. Here's the summary:

If every load were purely resistive, kW would be the only number you would need to deal with. Unfortunately, resistive loads (power factor 1) are becoming the minority. kVA may sound like the same thing, since volts times amps equals watts, but that's not what we are looking at. Any device with a power factor of less than 1 (HID lamp ballasts, dimmers, computer power supplies, etc.) draws its current in a non-linear way. Because of this, it draws a disproportionate amount of current (amps) for the amount of voltage on the line. Let's say we had a 1000 watt load with a power factor of less than 1. It might be drawing 10 amps on a 120 volt line. Even though its kW = 1, its kVA would be 1.2.
Even though it is only consuming 1000 watts, it is drawing that current out of phase with the voltage waveform. Since the generator windings produce heat based on amperage, we would need a generator sized to the kVA rating rather than the kW rating.
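The arithmetic above can be sketched in a few lines of Python (the 120 V / 10 A figures are the hypothetical numbers from the example, not a real nameplate):

```python
def apparent_power_kva(volts, amps):
    """Apparent power in kVA: volts x amps / 1000."""
    return volts * amps / 1000.0

def power_factor(real_kw, kva):
    """Power factor = real power / apparent power (a number 0..1)."""
    return real_kw / kva

# The 1000 W load from the example, drawing 10 A on a 120 V line:
kva = apparent_power_kva(120, 10)  # 1.2 kVA
pf = power_factor(1.0, kva)        # about 0.83
print(kva, round(pf, 2))
```

A generator feeding this load heats its windings as if it were carrying the full 1.2 kVA, even though only 1 kW of work is done.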
 


Thank you, this explanation is exactly what I needed. I had a vague understanding before, but now I get it!
 

Are you sure you don't have this backwards? kVA does not account for power factor; kW does. V × A / 1000 = kVA. V × A × 1.73 (for 3-phase) × PF / 1000 = kW. Most commercial generators are rated in kW, not kVA, and most are rated at 0.8 PF.
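The formulas in this post can be written out as a small sketch; the 208 V / 100 A / 0.8 PF figures below are made up purely for illustration:

```python
import math

def kva_3ph(line_volts, amps):
    """Three-phase apparent power: V x A x sqrt(3) / 1000."""
    return line_volts * amps * math.sqrt(3) / 1000.0

def kw_3ph(line_volts, amps, pf):
    """Three-phase real power: V x A x sqrt(3) x PF / 1000."""
    return kva_3ph(line_volts, amps) * pf

print(round(kva_3ph(208, 100), 1))      # ~36.0 kVA
print(round(kw_3ph(208, 100, 0.8), 1))  # ~28.8 kW
```

Note how the kW figure is always the kVA figure scaled down by the power factor.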
 

kVA is the kW load scaled up by the power factor (kVA = kW / PF). Remember, we are talking about the LOAD, not the SOURCE, so the output of the equation is viewed from the other side.

Since power factor is a number between 0 and 1 (sometimes written as a percentage, so a PF of 47 means 0.47), the kVA will always be equal to or higher than the kW.
kW is the actual work done; it is what the electric company charges you for. kVA is the "apparent" load. The numbers can get pretty wild. Take the following hypothetical:

Let's say we have a 1000 watt, 50 volt lamp. We know that at 50 volts it draws 20 amps. Now let's say we want to design a transformerless circuit to drive the lamp from 120 volts, so we decide to use an IGBT. We put the IGBT in series with the lamp and design a drive circuit that switches it on so that the leading and lagging portions of the waveform pass, but around the peak the IGBT is switched off. The result is that the lamp is being switched on and off 240 times a second, but the equivalent RMS voltage passed is 50 volts, so the lamp runs fine. Power flows from our source, through the IGBT, through the lamp, and back to our source. The lamp runs at 1000 watts (1 kW).
BUT look at what's happening with the current: it is a series circuit, and it is running at 20 amps for a 1 kW load! In other words, our "apparent" draw, or kVA, is 120 volts times 20 amps, or 2.4 kVA! Our generator would be feeling the heat of a 20 amp load even though only 1 kW of work was being done. This is far in excess of the 8.33 amps we would expect with a power factor 1 load. Since the peak of the waveform is being ignored, it becomes not so much a question of how much work is being done, but of when during the waveform the work is being done.
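The numbers in this hypothetical can be checked directly:

```python
real_kw = 1.0                          # the lamp dissipates 1000 W
line_v, line_a = 120, 20               # series circuit carries the full 20 A
kva = line_v * line_a / 1000.0         # apparent power: 2.4 kVA
pf = real_kw / kva                     # implied power factor: ~0.42
pf1_amps = real_kw * 1000.0 / line_v   # current if PF were 1: ~8.33 A
print(kva, round(pf, 2), round(pf1_amps, 2))
```

The generator must be built for 20 amps of winding heat even though it is only doing 8.33 amps' worth of billable work.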

Power companies hate low-PF applications because they require over-design to handle the kVA load instead of the kW they are billing for. Low-PF loads include almost all newer equipment, from computers and LED/CFL lamps to dimmers and electric motors: pretty much everything that draws current out of phase with the voltage. We are moving into a world where kVA is more important than kW.
 
Since Power Factor is a number between 0 and 1,
While it is a number between 0 and 1, it is also expressed as a leading or a lagging PF.

We are moving into a world where kVA is more important than kW.
kW is still used as the standard in the generator world. For example, I had a customer with a 100 kW generator powering an elevator (among other things). The elevator had a 0.5 PF on startup, which matches your example above and is very extreme. If you were designing the circuit above, you would correct the PF, because why pull 20 amps when you can pull 8? To date, the worst I have seen was 0.47. This was throwing the generator into all kinds of issues; mainly, the transfer switch would not do a closed transition. The solution was to put a 300 kW back end on it to give it better inrush handling and change the nameplate. It became a 100 kW / 300 kVA generator, as opposed to 100 kW / 120 kVA.

Also, kW relates more to engine BHP in generators. With my example above, the generator was not suddenly capable of more than 100 kW. Now imagine the customer sold that generator, and the buyer went by your idea of kVA trumping kW. They would think they could pull 300 kVA, and the only way that could be true is if they were running a ridiculous power factor. It does not mean they could run 3000 100 W bulbs: they would get to 100 kW worth of load and then hit a UFRO (under-frequency roll-off) situation.

So, instead of picking nits, I would say: size your transformer by kVA and your generator by kW.
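That rule of thumb can be expressed as a hypothetical sizing check (the function name and the 200 kVA load figure are mine, not from the thread):

```python
def genset_can_carry(load_kw, load_kva, rated_kw, rated_kva):
    """True only if the set can carry both the real (engine-limited)
    and the apparent (winding-limited) parts of the load."""
    return load_kw <= rated_kw and load_kva <= rated_kva

# The rewound 100 kW / 300 kVA elevator set from the anecdote:
print(genset_can_carry(100, 200, 100, 300))  # True: within both ratings
print(genset_can_carry(150, 200, 100, 300))  # False: engine limited to 100 kW
```

The engine caps the kW side no matter how heavy the windings are, which is why the bigger back end did not make it a 300 kW machine.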
 
KW is still used as a standard in the generator world.

I'm not sure I entirely agree with that, from what I have seen in this part of the world. It seems to me that smaller gensets are rated in terms of their apparent power (kVA), while when we start talking about large generators, such as those found in power stations (I'm a power engineering student), we discuss them in terms of their real power rating (MW), although you of course have to pay attention to their reactive power capabilities, especially as they are often under- or over-excited to provide reactive power compensation to the network.

In essence, what I am saying is that the two ratings are not the same thing, but both can be very important in calculating the parameters of a system.
 
Also Kw is related more to engine BHP in generators. With my above example, the generator was not suddenly capable of more than 100Kw. And imagine the customer sold that generator and based on your example of KVA trumping Kw.

And that's the exact paradox. kW relates more to the work being done, so a low PF does not require a bigger engine, just heavier windings in the genset to handle the higher current. Most low-PF systems can achieve some correction. The best example is the HID ballast, where a capacitor can bring the current phase closer to the voltage phase. Other low-PF systems are harder to correct. Computer power supplies pull all their current near the peak and draw little on the lead or lag. In that case the center of the current waveform is in phase with the voltage waveform, but the current comes in narrow pulses, since current only flows while the supply's diodes are in forward conduction, i.e. while the line voltage is higher than the supply capacitor's present charge.

In my 50 volt bulb example, obviously going with a transformer would bring it close to 1, and adding a cap would help move the demand closer to the voltage phase, but the concept was a worst-case scenario. Dimmers are also a big problem, since the phase varies with the dimmer setting, so a fixed-value correction would not work. Dimmers are also where you can end up with PFs well below 0.5 when the setting is below 10%. Conversely, at 100% they almost achieve a PF of 1.
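For displacement (phase-shift) loads like the HID ballast mentioned above, the correction capacitor can be sized with the standard reactive-power formula. The 0.70-to-0.95 target values below are assumed for illustration, and, as the post notes, this fixed-value approach does not help distortion loads like dimmers:

```python
import math

def correction_kvar(kw, pf_from, pf_to):
    """Reactive power (kVAR) a capacitor must supply to raise the
    displacement power factor of a kw load from pf_from to pf_to."""
    return kw * (math.tan(math.acos(pf_from)) - math.tan(math.acos(pf_to)))

def cap_uF(kvar, volts, hz=60):
    """Capacitance (microfarads) that supplies kvar at volts / hz."""
    return kvar * 1000.0 / (2 * math.pi * hz * volts ** 2) * 1e6

q = correction_kvar(1.0, 0.70, 0.95)          # ~0.69 kVAR for a 1 kW load
print(round(q, 2), round(cap_uF(q, 120), 0))  # ~127 uF at 120 V / 60 Hz
```

This is why a single capacitor works for a motor or ballast running at a fixed operating point, but not for a dimmer whose effective phase moves with the fader.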
 

Rating in kVA is a trick for the low-end generator manufacturers. Go to a Northern Tool or Harbor Freight and that is what you will see, because a consumer doesn't know the difference: if brand A advertises 20 kVA and brand B advertises 20 kW for $400 more, the consumer will most likely pick A. And in my experience, almost anything over 25-30 kW will be rated in kW, at least when made by the big boys of the industry. They only start talking MW above about 1.5 MW; I don't know if "one meg" doesn't sound appealing, but "a meg and a half" sounds big.


I guess we are both stating the same point from different ends of the table. :)
 
Thanks, guys. I'm just trying to understand why a manufacturer would make a generator with a kVA rating instead of a kW rating, or why they would list both ratings in the specs. The kVA is something you figure out when calculating your total load for a show, including power factor losses, for example? So when you are looking for a generator of the appropriate specs, you should find the total kVA for the show you designed and then pick a generator with a kW rating that is slightly larger. Is this correct? So a manufacturer that puts a kVA rating on a generator doesn't really make sense?
 
Other industries use a lot of inductive loads, like motors and transformers, which may run at only 500 kW but during starting require upwards of 900 kVA.
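As a rough illustration of what that starting surge means for the windings (the 480 V three-phase service voltage here is an assumption, not from the post):

```python
import math

start_kva = 900   # apparent power during motor starting
run_kw = 500      # real power once the motor is up to speed
volts = 480       # assumed three-phase service voltage

# Line current the generator windings must survive during the start:
start_amps = start_kva * 1000.0 / (math.sqrt(3) * volts)
print(round(start_amps))  # ~1083 A
```

The engine only ever has to produce about 500 kW of work, but the windings and regulator must briefly survive nearly twice that in apparent load.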
 
Using a kVA rating allows them to sell a generator set with a less powerful engine. The load in kW most accurately reflects what is imposed on the engine.
 
