First, if I have a 120V input and a 12VAC output that is rated for 500 mA, then it's good for 0.5 amps at 12V, or 6 watts at 12V?

Thus an 850 mA transformer is good for 850 mA, so it's good for 10.2 watts at 12V, correct???

A 1000 mA transformer at 12V is thus 12 watts. Or is something wrong with my calculations?

Pretty much correct. For the majority of transformers, the current rating is specified for the output. However, power isn't always I x E when you're talking about AC. For a purely resistive load, for example a lamp filament, you can pretty much treat AC and DC the same, in which case, yes, it would be 6, 10.2 and 12 watts.
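To put that arithmetic in one place, here's a quick Python sketch (the 12 V secondary and the mA ratings are from the question; remember this equals watts only for a purely resistive load):

```python
# Apparent power available from a transformer secondary:
# VA = secondary volts x rated secondary amps.
# For a purely resistive load (power factor 1), VA = watts.
def transformer_va(volts, amps):
    return volts * amps

for ma in (500, 850, 1000):
    watts = transformer_va(12, ma / 1000)
    print(f"{ma} mA at 12 V -> {watts:.1f} VA")
```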

Next is Volt Amps. I realize it's Volts x Amps, or at least I think I do. If I have a single-phase international step-down transformer rated for 200VA, then the amperage of the fixture able to be run at 110VAC is 1.8 amps, or 198 watts, correct?

On the other hand, if I have a power-boosting transformer starting with 110V for the inlet, and it's stepping up to 120V, the kVA rating is how many thousands of Volt Amps the transformer is rated for, right??? In other words, if given the above, I have a 0.5kVA transformer, then it's worth 500 watts at 120V. Right?

The problem is that many loads are not purely resistive. A __reactive__ load, for example an electric motor, causes something strange to happen - the current gets a little out of phase with the voltage. When that happens, the Volt-Amps stays the same, but the actual power delivered is less. That's why transformers are often rated for Volt-Amps instead of watts. In the simplest sense, Volt-Amps is the power the transformer sees, Watts is the power the load sees. If the load is purely resistive, the Volt-Amps and the Watts are the same.
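As a sketch of that distinction, the rated current you can draw follows directly from the VA rating (numbers taken from the 200 VA and 0.5 kVA cases in the question; the watts a real load actually gets depend on its power factor):

```python
# Rated current from an apparent-power (VA) rating: I = VA / V.
def max_amps(va_rating, volts):
    return va_rating / volts

print(f"{max_amps(200, 110):.2f} A")  # the 200 VA / 110 V case
print(f"{max_amps(500, 120):.2f} A")  # the 0.5 kVA / 120 V case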

Power Factor is the ratio between true power (Watts) and apparent power (Volt-Amps) = W/VA. A resistive load has a power factor of 1. Any reactive load will have a power factor less than one. A typical electric motor might have a power factor of 0.85 - in other words, if it's an 85-watt motor, it needs a 100 volt-amp transformer to drive it without burning up.
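That 85 W / 0.85 power-factor example works out like this (a minimal sketch of VA = W / PF):

```python
# Apparent power a transformer must supply for a given true power
# and power factor: VA = W / PF (PF = 1 for a resistive load).
def required_va(watts, power_factor):
    return watts / power_factor

print(f"{required_va(85, 0.85):.0f} VA")  # the 85 W, 0.85 PF motor
```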

There's also the little matter of the direction of the phase shift - an inductive load causes the current (amps) to lag behind the voltage. A capacitive load causes the current to get ahead of the voltage. The power companies use this to make their transmission systems more efficient.

The idea is that electric motors make up a large part of electricity usage, shifting the overall current to lag behind the voltage. They intentionally put large capacitors across the power lines at most of their substations to shift the current the other way. The right amount of capacitance can shift it right back to where it's in phase with the voltage again, reducing the stress on their transformers, because when the total phase shift is zero, true power = apparent power.
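For the curious, that capacitor sizing can be sketched with the standard formulas Q = P * tan(arccos(PF)) and C = Q / (2 * pi * f * V^2). The 1 kW / 0.85 PF / 120 V / 60 Hz numbers below are made up purely for illustration:

```python
import math

# Reactive power to cancel: Q = P * tan(acos(PF))  [VAR]
# Capacitance that supplies it: C = Q / (2 * pi * f * V^2)  [farads]
def correction_capacitance(watts, pf, volts, freq_hz):
    q_var = watts * math.tan(math.acos(pf))
    return q_var / (2 * math.pi * freq_hz * volts ** 2)

c = correction_capacitance(1000, 0.85, 120, 60)
print(f"{c * 1e6:.0f} uF")
```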

A three-phase transformer transferring 208V to 240V at 9.5kVA is good for how many watts per leg of power?

3.166 kVA. The actual watts, again, depend on the power factor - if it's strictly lamp filaments, 3.166 kW per leg. If there are cooling fans in some of the fixtures, or scrollers or movers, I'd limit the total to about 2.5 kW per leg, just to play it safe.
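The per-leg arithmetic is just the total divided by three, assuming a balanced load across the legs:

```python
# Per-leg capacity of a balanced three-phase transformer.
def kva_per_leg(total_kva):
    return total_kva / 3

print(f"{kva_per_leg(9.5):.3f} kVA per leg")
```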

John