OK, since I'm not an electronics person, it's time for me to verify some stuff.

First, if I have a transformer with a 120 V input and a 12 VAC output rated for 500 mA, then it's good for 0.5 A at 12 V, or 6 W at 12 V, correct? By the same logic, an 850 mA transformer is good for 850 mA, so 10.2 W at 12 V, and a 1000 mA transformer at 12 V is thus 12 W. Or is something wrong with my calculations? Am I supposed to be taking the mA reading off the 120 V input side, which would allow for a much higher wattage?

Next is volt-amps. I realize it's volts × amps, or at least I think I do. If I have a single-phase international step-down transformer rated for 200 VA, then the fixture it can run at 110 VAC draws 1.8 A, or about 198 W, correct?

On the other hand, if I have a power-boosting transformer with a 110 V inlet stepping up to 120 V, the kVA rating is how many thousands of volt-amps the transformer is rated for, right??? In other words, given the above, if I have a 0.5 kVA transformer, then it's worth 500 W at 120 V. Right? Such things I just don't deal with enough to remember.

And now for the fun part: a three-phase transformer stepping 208 V up to 240 V at 9.5 kVA is good for how many watts per leg of power?
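To sanity-check the arithmetic in the questions above, here's a minimal Python sketch. It assumes a power factor of 1 (so volt-amps and watts are interchangeable), which is the simplification the post implies; with a reactive load, the real-watt figure would be lower than the VA rating. The function names are just illustrative labels, not anything standard.

```python
# Sanity check of the transformer arithmetic in the post.
# Assumption: power factor = 1, so VA and W are treated as equal.

def secondary_watts(volts, amps):
    """Apparent power on the secondary side: V x A."""
    return volts * amps

def amps_from_va(va, volts):
    """Current available from a VA rating at a given voltage."""
    return va / volts

# 12 VAC secondaries: the mA rating applies to the OUTPUT side.
print(secondary_watts(12, 0.5))       # 500 mA  -> 6 W
print(secondary_watts(12, 0.85))      # 850 mA  -> 10.2 W
print(secondary_watts(12, 1.0))       # 1000 mA -> 12 W

# 200 VA step-down transformer feeding a 110 VAC fixture:
print(amps_from_va(200, 110))         # about 1.82 A (110 V x 1.82 A = 200 VA)

# 0.5 kVA boost transformer, 120 V output:
print(amps_from_va(0.5 * 1000, 120))  # about 4.17 A available at 120 V

# Three-phase 9.5 kVA: each of the three legs carries a third of the rating.
per_leg_va = 9.5 * 1000 / 3
print(per_leg_va)                     # about 3166.7 VA per leg
```

Note the one place the sketch disagrees with the post: a 0.5 kVA transformer is rated for 500 VA of apparent power, which only equals 500 W if the load's power factor is 1.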