Ok, it has been a week so here's what is going on.
The only time VA = Watts is when you have a load with a power factor of 1 (or 100%, depending on which convention you use). This would be the case with a purely resistive load, where the voltage waveform exactly overlays the current waveform. This is not the case with switching power supplies, induction motors, HID lamp ballasts, etc.
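If it helps to see how those two waveforms turn into numbers, here is a rough sketch in Python (with made-up waveforms and a made-up 60-degree shift, so treat it as illustrative only) of how real power, apparent power, and power factor relate:

    import math

    # One full cycle of a 120 V RMS sine wave, sampled 1200 times.
    samples = 1200
    v = [120 * math.sqrt(2) * math.sin(2 * math.pi * n / samples) for n in range(samples)]

    def power_numbers(i):
        """Return (watts, VA, power factor) for a current waveform i."""
        watts = sum(vn * i_n for vn, i_n in zip(v, i)) / samples   # average of v*i = real power
        v_rms = math.sqrt(sum(vn * vn for vn in v) / samples)
        i_rms = math.sqrt(sum(i_n * i_n for i_n in i) / samples)
        va = v_rms * i_rms                                         # apparent power
        return watts, va, watts / va

    # Purely resistive load: current exactly overlays the voltage, so PF = 1 and VA = Watts.
    i_resistive = [vn / 144 for vn in v]        # 144 ohm load, about 100 W at 120 V

    # Same RMS current shifted 60 degrees (a crude stand-in for a reactive load): PF < 1, VA > Watts.
    shift = samples // 6
    i_shifted = [i_resistive[(n - shift) % samples] for n in range(samples)]

    print(power_numbers(i_resistive))   # roughly (100, 100, 1.0)
    print(power_numbers(i_shifted))     # roughly (50, 100, 0.5)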
Using the example of the HID ballast: uncorrected, the greatest current draw does not occur at the peak of the line voltage. In fact, if you look at a 100 watt HID, the average current draw may be higher than 2 amps at 120 volts. It is actually using only 100 watts of power, since it draws most of its power at a lower voltage in the waveform, but its VA is 240 (or more!). It is important to note that this is a separate issue from the startup current, which also must be factored in.
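To put numbers on that ballast example (this is just my arithmetic on the figures above, nothing measured):

    volts = 120.0
    amps = 2.0       # the average current draw from the example above
    watts = 100.0    # the power the lamp is actually using

    va = volts * amps              # 240 VA of apparent power
    power_factor = watts / va      # about 0.42, i.e. roughly 42%
    print(va, power_factor)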
Because this current draw is very real, the supporting wiring and distribution must be sized for this higher number. Ten such fixtures would continuously draw 20 amps, even though the wattage math would indicate the draw should be 8.33 amps. You can see why this becomes very important when working off of a generator.
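The sizing math from that paragraph, with the same assumed figures:

    fixtures = 10
    watts_each = 100.0
    va_each = 240.0
    volts = 120.0

    amps_from_wattage = fixtures * watts_each / volts   # 8.33 A -- what the nameplate wattage suggests
    amps_actual = fixtures * va_each / volts            # 20 A  -- what the wiring and generator actually see
    print(amps_from_wattage, amps_actual)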
Power factor can be corrected in most cases. The ideal is to get it as close to 100% as is practical. When it comes to sources of power, such as a generator, the rating may be given in Watts, in VA, or both. The difference between the two is an indicator that the generator is designed to handle the excess current. Since that extra current does not affect the actual work being done by the engine, the VA will be the larger number.
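As a sketch of how you might read a dual rating (these generator numbers are hypothetical, not taken from any real model):

    gen_watts = 5000.0   # hypothetical real-power (engine) limit
    gen_va = 6250.0      # hypothetical apparent-power (current) limit

    min_pf_handled = gen_watts / gen_va    # 0.8 -- below this power factor, current becomes the limit

    def max_load_watts(load_pf):
        """Largest real load in watts this generator could carry at a given power factor."""
        return min(gen_watts, gen_va * load_pf)

    print(max_load_watts(1.0))    # 5000 W for a purely resistive load
    print(max_load_watts(0.42))   # about 2625 W for the uncorrected ballasts above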
Some supplies, such as battery backup units for computers, have no expectation of running anything but low power factor devices (computers), so they only give the VA rating of the device.
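If you ever need a watt figure out of a VA-only rating, you have to assume a power factor yourself. For example (the 0.65 below is just an assumed value for typical computer loads, not anything from a nameplate):

    ups_va = 1000.0
    assumed_pf = 0.65                      # assumption, not a spec
    usable_watts = ups_va * assumed_pf     # about 650 W
    print(usable_watts)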