Voltage: US, Europe

Shawncfer

Active Member
Random question of the day...

On our typical house outlets, they put out 120v and can handle 15 amps. Correct me if I'm wrong.

And when we go to Europe, their outlets put out 240v. And since our appliances or whatever else can typically only handle 120v, we buy a converter so they can handle the 240 volts, plus an adapter to change our pins to their size.

Also, in Europe, their typical wall outlet is rated for up to 2.5 amps, while ours are typically rated for 15. So when they come over here, do they need a converter, besides the adapter that changes their pins to ours? Because their appliances can handle 240v, and ours put out 120. Their appliances need under 2.5 amps, and ours can handle 15. So do they need one?
 
Well, that depends - some things use a converter, some things use an adapter. Some things, like PC power supplies, simply need an adapter to get the right pin configuration. The power supply from the PC/laptop takes whatever mains voltage it's given (120v or 220v) and drops it down via a step-down transformer to whatever is correct for that laptop/PC. However, some things do require a "step-up" transformer to go from the US 120v to the 220v standard - though these devices aren't typically portable (some motors).

Similarly, when you take US equipment to the UK/Europe, you use an adapter (if the piece of equipment auto-senses the input voltage, like a laptop) or a converter with a step-down transformer to go from 220v to 120v.

For Europe, each outlet is rated individually at 2.5a @ 220v, whereas here in the US/Canada/Central America a chain of outlets is rated for 15 to 20 amps (not to exceed 15a on any one outlet) @ 120v.
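To put rough numbers on that, here's a minimal sketch in Python using the nominal figures above (real installs derate continuous loads, so treat these as ceilings, not targets):

```python
# Power (W) = voltage (V) x current (A), using nominal figures only.
def available_watts(volts: float, amps: float) -> float:
    return volts * amps

print(available_watts(120, 15))   # US 15a outlet @ 120v:   1800 W
print(available_watts(120, 20))   # US 20a circuit @ 120v:  2400 W
print(available_watts(220, 2.5))  # Europe 2.5a @ 220v:      550 W
```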
 
I was just asking because I think about it like this.
I look at a plug and see how many amps and volts it says, and I know the wall cannot put out more volts than this and has to supply at least this amount of amps.
Which is why when we go to Europe, we look at whatever we have (excluding laptops and computers) and say, "Oh, this blahblahblah can only handle 120v, and this wall is putting out 240. That's a problem. And this blahblahblah requires 10 amps and the wall only puts out 2.5. So there's a problem."
But when you look at the people who come to the US, they look at their blahblahblahs and think, "Oh, my blahblahblah is rated for 240v and this outlet puts out 120v, so I'm good!" And then, "Oh, thankfully in Europe, most plugs only put out 2.5 amps, therefore the majority of appliances draw less than or equal to 2.5 amps. And here in the US their outlets put out up to 15 amps for my blahblahblah."

I hope I made sense there! :lol:
 
Two important notes:

People coming into 120v territory from 200-240v regions don't really have to worry. A lower voltage is far less likely to damage electronics than a higher one. Running a 240v lamp at 50% is conceptually like running it at 120v. Generally, 120v will still work for what they need. It'll still charge cell phone and laptop batteries, and there's not much else that people take with them to foreign countries as far as electronics are concerned. Also, many computers come by default with power supplies that automatically adjust to the input voltage.

(except technically dimmer curves and such don't operate on a 1:1 linear ratio, so it wouldn't actually be 120v at 50%, but for the purposes of this conversation the true value is irrelevant)
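In fact, for a simple resistive lamp the relationship is squared, not linear. A minimal sketch, assuming an idealized constant-resistance filament (real filaments change resistance with temperature):

```python
# P = V^2 / R, so delivered power scales with the square of the applied voltage.
def lamp_power(rated_watts: float, rated_volts: float, applied_volts: float) -> float:
    resistance = rated_volts ** 2 / rated_watts  # idealized constant-R filament
    return applied_volts ** 2 / resistance

# A 1000 W, 240v lamp fed 120v delivers only about a quarter of its rated power:
print(lamp_power(1000, 240, 120))  # 250.0 W
```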

People going into 200-240v territory from 120v territory will likely see any/all of the things they connect directly to the outlets either catch fire or cease to operate properly. Running electronics at a higher voltage than they were designed for produces more power and subsequently a lot more heat.

As a random fact:
The benefit of 240v over 120v is that smaller conductors can be installed. Far less aluminum and copper go into electrical materials because they can operate at a higher voltage, producing the same amount of power while carrying only half the current. This is why it's actually not that unusual to find fluorescent lighting in a commercial building running at 277v: it saves a lot of money in the initial install of electrical wiring and infrastructure in the building. However, you won't find 277v in a standard outlet, because if a child stuck some keys into the outlet, it's unlikely they'd live to have their parents tell them about it years later.
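A minimal sketch of that conductor-sizing argument, using an illustrative 12 kW lighting load:

```python
# Same power at a higher voltage needs less current: I = P / V.
def current_draw(watts: float, volts: float) -> float:
    return watts / volts

for v in (120, 240, 277):
    print(f"{v}v: {current_draw(12000, v):.1f} A")
# 120v: 100.0 A
# 240v: 50.0 A
# 277v: 43.3 A
```

Less current for the same power means the wire gauge, and therefore the copper cost, can drop.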
 
So would it be fair to say that the voltage listed on my plug is the maximum voltage it can handle, and so the voltage the outlet puts out has to be under that? HOWEVER, it also has to be around that range?

For example, Europeans' blahblahblahs can handle 240v, so when they come to the US their blahblahblahs can handle our outlets. But the voltage the wall is putting out needs to be close to the amount of voltage it can handle if you want the blahblahblah to work normally?
 
For example, Europeans' blahblahblahs can handle 240v, so when they come to the US their blahblahblahs can handle our outlets. But the voltage the wall is putting out needs to be close to the amount of voltage it can handle if you want the blahblahblah to work normally?

This is pretty hard to follow, but I'll give it a shot.

A wall outlet can be rated to 600v. It won't make a bit of difference, though, because your service panel is only sending 120v to that outlet. The rating of the receptacle or connector doesn't mean much as long as it's above whatever the present voltage is and the current draw of the equipment connected to it (running your striplights at a full 2400w through those 15 amp plugs, for instance, would be a 20 amp draw at 120v - over the rating).

I've seen many porcelain sockets rated for 600v. This doesn't change anything for most people, though. It would act the same if it were rated for 200v or 1200v, as long as you weren't exceeding any ratings.

To sum this up, the rated voltage of an outlet or connector doesn't have to be 'close' to the voltage being supplied. It just has to be above the voltage being supplied.
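A minimal sketch of that rule - both ratings are ceilings, not targets (the function name here is just illustrative):

```python
def connector_is_ok(rated_volts: float, rated_amps: float,
                    supply_volts: float, load_amps: float) -> bool:
    # Both ratings are maximums; the supply and load just have to stay under them.
    return supply_volts <= rated_volts and load_amps <= rated_amps

print(connector_is_ok(600, 15, 120, 12))          # True: 600v-rated socket on 120v is fine
print(connector_is_ok(250, 15, 120, 2400 / 120))  # False: a 2400w striplight draws 20 A
```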
 
To sum this up, the rated voltage of an outlet or connector doesn't have to be 'close' to the voltage being supplied. It just has to be above the voltage being supplied.

That's exactly what I thought at the beginning of this. But MNicolai said

Running a 240v lamp at 50% is conceptually like running it at 120v.

Which totally threw me for a loop. Because it's not receiving less current than it needs, just less force pushing the current. So as long as the plug can handle the force that pushes the current, it's okay and will work fine.

Am I correct?
 
The connectors are listed for given voltage ranges for a variety of reasons. One of those is the quality of the insulation between conductors. If a voltage is too high for a connector, it could arc from one conductive path to another and short out. Another factor is the breaking capacity of a connector design. When under a full 15A load, a 5-15 (aka Edison) connector should break the load with minimal arcing. When disconnected at 30A or a higher voltage, a larger arc could be produced, creating a fire hazard or at least potentially damaging the connector. Note that Powercon connectors work quite well at their rated voltages and currents, but are designed in a manner that does not give them a breaking capacity. Repeatedly broken under load, these connectors will arc, melt, and degrade.

One of the primary reasons for rated voltages and currents on connectors is to distinguish one from another. For many of the different common voltages and currents for circuits, there are unique connectors. This is so that you can readily determine the rated voltage and current for a connector in the field without having to trace down the circuit breaker or try to read a rubbed-off label.

Also, while it is a good rule of thumb to use the appropriate connectors for their appropriate voltages, you can do what you like so long as it doesn't violate code. For many years, and even still today, you'll find twist-lock connectors used for speaker cables. Electrically, it works. You'd better just not have a matching twist-lock receptacle around, or someone may connect your speaker's audio input to a wall outlet. Because many connectors have their own purposes, you want to make certain you use one that will avoid potential conflicts in your space and will differentiate one application from another.
 
So there's a few things that seem to be misunderstood here.

A connector will have a maximum voltage rating, which is determined mostly by its insulation properties. It will also have a maximum continuous current rating, which has to do with its ability to cope with the thermal effects of that sustained current amongst other things.

These are MAXIMUM ratings. If I wanted to, I could feasibly use a 600V 125A connector to connect my telephone. It would be unwise for a variety of reasons, as well as complete overkill, but the connector would not care. All of our mains fittings are rated to 250V, but in theory we have a 230V supply. When we use 120V PAR64s, the connectors that feed them are rated at 250V, so you can't assume that the voltage marked on a connector is necessarily in the same ballpark as what it is being supplied with.

As to interoperability, it depends on the individual appliance you want to plug in whether you need just an adapter or a step-up/down transformer as well. Take the power adapter for my laptop as an example. On the back it is marked with an input voltage range, in this case 100-240V. So all I need is an adapter to use it in the US, because the power supply is happy with a 115V supply. Conversely, my TV is rated for 220-240V input, so it would need a step-up transformer to be used in the US.
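A minimal sketch of that decision, based on the input-voltage range printed on the device's label (the function and names are illustrative, not from any standard):

```python
def what_do_i_need(label_min_v: float, label_max_v: float, local_mains_v: float) -> str:
    # Compare the label's rated input range against the local supply.
    if label_min_v <= local_mains_v <= label_max_v:
        return "plug adapter only"
    elif local_mains_v < label_min_v:
        return "step-up transformer (plus adapter)"
    else:
        return "step-down transformer (plus adapter)"

print(what_do_i_need(100, 240, 120))  # laptop brick in the US: plug adapter only
print(what_do_i_need(220, 240, 120))  # 220-240V TV in the US: step-up transformer
```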

Europe has multiple standards of connector. The Europlug to which you refer is only rated to 2.5A and only suitable for connection to double-insulated equipment. Something like a microwave or kettle draws considerably more than 2.5A, and needs an earth connection, so it is more likely to be connected via a Schuko plug, which carries a 16A rating.

(Others have posted while I've been writing this, so I know there is some double coverage of areas...)
 
Conversely, my TV is rated for 220-240V input, so it would need a step-up transformer to be used in the US.

Hey Chris, thanks for simplifying these shenanigans. But I have one question for you. Your TV is rated for 220-240v, so if you plugged it into an outlet in the US, it wouldn't be able to function properly. Now, are all European/Australian appliances like that? Do they all say between something-something volts? Or do some of them just say 240v? And in which case, are those able to work with anything under 240v?

For example, my TV just says 120v. Now I know there's not a standard US outlet below 120v, but let's pretend there is. It would be able to function on any outlet under 120v, because it doesn't say it's between a certain amount, it just gives you a max voltage, right?

And yes, after I asked this question, I thought about it and realized the only reason it doesn't say between a certain amount of volts is because there isn't a lower voltage for it to be between. But still, let's pretend here.
 
The 120V is not the maximum. It's the voltage the TV is specified to use. If you plugged it into (again, in magically chaotic electricity-ville here) a 30V outlet, the TV would need a transformer to step up the voltage to make the TV's electronics function correctly.

I think what you're doing here is looking at the overall watts and wondering, if the voltage is changed, will the electronics automatically just pull more current to correct for it. The short answer is no. The lower the voltage, the less current will flow through a circuit (IIRC).
 
Hey Chris, thanks for simplifying these shenanigans. But I have one question for you. Your TV is rated for 220-240v, so if you plugged it into an outlet in the US, it wouldn't be able to function properly. Now, are all European/Australian appliances like that? Do they all say between something-something volts? Or do some of them just say 240v? And in which case, are those able to work with anything under 240v?

For example, my TV just says 120v. Now I know there's not a standard US outlet below 120v, but let's pretend there is. It would be able to function on any outlet under 120v, because it doesn't say it's between a certain amount, it just gives you a max voltage, right?

And yes, after I asked this question, I thought about it and realized the only reason it doesn't say between a certain amount of volts is because there isn't a lower voltage for it to be between. But still, let's pretend here.

The reason the TV is 220-240 is an economic one. It allows the same product to be retailed in both the European markets, which have traditionally operated on 220v, and the Australian/other markets that have traditionally run on 240v. In the last decade or so, there has been a worldwide push to harmonise supply voltages, so countries operating in the 110-120 sorts of numbers are theoretically moving to 115v, and those of us in the 220-240 realm are going to 230.

Now it's important to remember that the 230v is a nominal voltage. The supply authority is obligated to supply me at 230v +10%/-6%, which means they can give me anything between 216 and 253 volts and still be completely within their contract. So any piece of equipment has to be able to handle voltage variations like that.
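A quick check of those numbers:

```python
# 230v nominal, +10% / -6% tolerance band.
nominal = 230.0
low, high = nominal * (1 - 0.06), nominal * (1 + 0.10)
print(f"{low:.1f} V to {high:.1f} V")  # 216.2 V to 253.0 V
```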

When a piece of equipment has a marked operational voltage, that's not a maximum; it's the voltage that the equipment was designed to run on. Undervoltage can cause problems quite easily. Particular offenders are things with heavy start-up currents. Undervolt, say, a discharge lamp and it won't strike. (Which is why, if you have a number of movers on a long supply line, you should strike the furthest one first.) Overvolting causes problems like fire and burnout.
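That moving-light tip comes down to voltage drop along the cable run. A minimal sketch, with an illustrative resistance-per-metre figure (real cable varies by gauge):

```python
# V_drop = I * R_cable; current flows out and back, so double the one-way resistance.
def voltage_at_load(supply_v: float, load_amps: float,
                    run_metres: float, ohms_per_metre: float = 0.008) -> float:
    drop = load_amps * (2 * run_metres * ohms_per_metre)
    return supply_v - drop

print(voltage_at_load(230, 10, 50))   # 50 m run at 10 A:  222.0 V at the load
print(voltage_at_load(230, 10, 100))  # 100 m run:         214.0 V - low enough to trouble a strike
```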

phil000 said:
I think what you're doing here is looking at the overall watts and wondering, if the voltage is changed, will the electronics automatically just pull more current to correct for it. The short answer is no. The lower the voltage, the less current will flow through a circuit (IIRC).

The short answer is that your theory holds for linear loads. It DOES NOT hold for switchmode power supplies, which power most devices these days... They will increase the current as the voltage decreases...
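A minimal sketch of the difference, assuming an idealized constant-power switchmode supply (illustrative 48-ohm and 300 W figures):

```python
# A resistive (linear) load draws less current as voltage falls: I = V / R.
# An idealized switchmode supply holds power constant, so current rises: I = P / V.
def resistive_current(volts: float, ohms: float) -> float:
    return volts / ohms

def switchmode_current(volts: float, watts: float) -> float:
    return watts / volts

for v in (240, 120):
    print(f"{v}v: resistive {resistive_current(v, 48):.2f} A, "
          f"switchmode {switchmode_current(v, 300):.2f} A")
# 240v: resistive 5.00 A, switchmode 1.25 A
# 120v: resistive 2.50 A, switchmode 2.50 A
```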
 
Okay so let me see if I got this right here

The volts on the appliance or whatever else that I'm using have to match the volts coming out of the socket it's plugged into?

The amps required to operate the same appliance or whatever else cannot exceed the amps of the outlet.

Am I right?

So then let me ask: if I have an outlet that puts out 120v, but I have a plug that needs 125v, I know it's okay to use. But why is it okay to use? The plug doesn't say 110-130 (not that it would) or 115-125 (once again, not that it would). But since 125 is not in the normal range, why is it okay to use?
 
Okay so let me see if I got this right here

The volts on the appliance or whatever else that I'm using have to match the volts coming out of the socket it's plugged into?

The amps required to operate the same appliance or whatever else cannot exceed the amps of the outlet.

Am I right?

So far so good...

So then let me ask: if I have an outlet that puts out 120v, but I have a plug that needs 125v, I know it's okay to use. But why is it okay to use? The plug doesn't say 110-130 (not that it would) or 115-125 (once again, not that it would). But since 125 is not in the normal range, why is it okay to use?

Basically because a plug is a connector, not an appliance. It's about whatever is drawing the load at the other end...
 
For example, my TV just says 120v. Now I know there's not a standard US outlet below 120v, but let's pretend there is. It would be able to function on any outlet under 120v, because it doesn't say it's between a certain amount, it just gives you a max voltage, right?

And yes, after I asked this question, I thought about it and realized the only reason it doesn't say between a certain amount of volts is because there isn't a lower voltage for it to be between. But still, let's pretend here.

There is a minimum voltage threshold below which your TV would fail to operate. It's probably somewhere around 100v. It wouldn't be a good idea to operate it at 100v constantly, though, because under-volting can overheat some electronics. It all depends on how the appliance was designed. There is no set standard for this.

My keyboard (the musical kind) is a Kurzweil K2600X, and it can handle anything from 90-240v. You wouldn't want to keep it at 90, but it will compensate for short periods of time, because the designers at Kurzweil knew that the instrument could find itself in use in an area where there are major power fluctuations.
 
