In the United States, the standard household electrical supply is 120V and 240V at a frequency of 60 Hertz (Hz). Industrial and commercial supplies often use 3-phase power at different voltage standards, typically 208V, 240V, or 480V.

If you’re considering upgrading circuits for your home or garage workshop, the important questions are probably 120V vs 240V, and what amperage is best for these outlets.

People who have traveled internationally are often curious why the power supply in the US is not the same as in Britain, Europe, and most other regions. Japan has the most complicated setup, with two different frequencies in use – 50Hz in the east and 60Hz in the west. This article will give you a solid understanding of how it all works.

Having spent most of my life dealing with electricity supply, I’m fascinated by the complexities involved in optimizing current and the importance of maintaining a steady voltage and frequency. This guide to understanding electricity is aimed primarily at helping people understand the power in their homes and shops. I will, however, provide many other details and interesting facts along the way.

If you live in the US and want to gain a basic understanding of your electrical panel, outlets, and the best way to go about using 120V and 240V, this is where I’ll start. Following this, I’ll go into more technical detail on how electricity works and the fascinating history that led to the electric standards used around the world.

Common Electrical Outlets

NEMA electrical outlets

Electrical outlets in North America (and some regions in South America) use the National Electrical Manufacturers Association (NEMA) standard.

There are many types of 120V, 240V, and 120V/240V outlets used in the US. I’m going to talk about the most commonly used receptacles and how they work. Modern electrical regulations require that outlets be grounded. So I won’t be dealing with the 2-prong variants (NEMA 1 & 2) used in older electrical installations.

Since most of us are dealing with 120V and 240V power, these are the receptacles you’ll encounter most often.

NEMA 120V – 240V Receptacles

All NEMA electrical receptacles have the letter “R” at the end. Yup, you’ve guessed it, this stands for receptacle. Locking outlets keep the plug locked in position so it stays connected if the cord is pulled. These sockets have the letter “L” at the beginning.

The first number in the sequence indicates the type of outlet. This tells you the configuration – 120V only, 240V only, or 120V/240V – as well as how the ground connector is arranged. The second number is the rated amperage.
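To make the naming scheme concrete, here’s a short Python sketch – purely my own illustration, not an official NEMA tool – that breaks a designation such as 5-15R or L14-30R into its parts (a matching plug uses “P” instead of “R”):

```python
import re

# Illustrative only: splits a NEMA designation like "5-15R" or "L14-30R"
# into its parts, following the naming scheme described above.
def describe_nema(designation: str) -> str:
    match = re.fullmatch(r"(L?)(\d+)-(\d+)(R|P)", designation.upper())
    if not match:
        return f"{designation}: not a designation this sketch understands"
    locking, config, amps, ending = match.groups()
    parts = [
        "locking" if locking else "straight-blade",
        f"configuration {config} (voltage and grounding layout)",
        f"rated {amps}A",
        "receptacle" if ending == "R" else "plug",
    ]
    return f"{designation}: " + ", ".join(parts)

print(describe_nema("5-15R"))    # the common 120V household outlet
print(describe_nema("L14-30R"))  # a locking 120V/240V, 30A receptacle
print(describe_nema("6-50P"))    # the plug found on many 240V welders
```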

  • NEMA 5 is the most common household electrical outlet. These are used to supply 120V (15 – 30A), with a ground prong in the center – either at the top or bottom of the socket, depending on how it is installed. The two vertical blades supply hot and neutral.
  • NEMA 6 designates 208V and 240V outlets (15 – 50A). The layout looks similar to NEMA 5, but the blades are arranged differently. These have two hot connectors and a ground, with no neutral.
  • NEMA 7 is not used for domestic power and is rated for 277V.
  • NEMA 8 – 13: with the exception of NEMA 10, these outlets are used for 3-phase circuits, in various configurations and voltages.
  • NEMA 14 outlets have four prongs, for hot 1, hot 2, neutral, and ground. These receptacles supply both 120V and 240V, at ratings from 15A up to 60A. They have replaced the ungrounded NEMA 10 outlets used in older installations.
  • TT-30R will be recognized by RV owners. The letters TT stand for travel trailer. This is a 120V, 30A outlet that is the standard for campsite power, shore power for boats, and the receptacle on many RV generators.

Receptacle Wiring

Connecting electric outlets

Receptacles can utilize one or two hot wires (black or red), a neutral (white) wire, and a ground (green or bare) wire. The above diagram shows how these wires are connected. The letters X and Y represent the hot connectors, W is neutral, and G is ground.

How to connect a receptacle

Every outlet has a lug behind each prong, with a screw that secures the wire. You push the correct wire into the lug and tighten the screw.

What gauge wire should you use?

The conductor, usually copper wire, used for an electric circuit has to be rated for the maximum current that the circuit can supply. This is determined by the circuit breaker amperage. The current, in turn, depends on the relationship between the voltage and the power requirement. Let’s see how all the pieces of this puzzle fit together.

Power, measured in watts (W), is the rate at which an appliance consumes energy.

Potential difference, measured in volts (V), is the electric potential between two points. It is also called electric pressure and is similar to the water pressure in a pipe. Think of it as the amount of electric force available.

Current, measured in amperes (A), commonly called amps, tells us how much electric charge is flowing through a conductor each second. Kind of like the water flow rate through a pipe.

Power = Volts X Amps

A higher voltage requires fewer amps to deliver the same power, as these two simple equations will illustrate:

1000W = 120V X 8.3A OR 1000W = 240V X 4.17A.

If we supply double the voltage, for the same watts, we will be drawing half the amperage.
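Here is the same arithmetic as a tiny Python sketch – just an illustration of the formula above – which you can adapt to check the current draw of any appliance:

```python
# Rearranging Power = Volts x Amps to find the current an appliance draws.
def amps_drawn(watts: float, volts: float) -> float:
    return watts / volts

for volts in (120, 240):
    print(f"A 1000W load on {volts}V draws {amps_drawn(1000, volts):.2f}A")
# A 1000W load on 120V draws 8.33A
# A 1000W load on 240V draws 4.17A
```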

As the amps increase, the wire needs to be thicker in order to handle the increased current – in the same way that a thicker water pipe is needed to handle a higher flow rate.

The US unit for determining wire thickness is the American Wire Gauge (AWG). This uses a numeric scale to indicate the thickness of a conductor. The thicker the wire, the lower the AWG rating.

Amp ratings for AWG wire thickness:

Wire Gauge     Diameter (inches)     Max Current (amps)
#14            0.064”                15A
#12            0.08”                 20A
#10            0.1”                  30A
#8             0.13”                 40A
#6             0.16”                 55A
#4             0.2”                  70A
#3             0.23”                 85A
#2             0.26”                 95A
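As a rough illustration of how you might match a wire size to a circuit breaker, here is a short Python sketch based on the figures in the table above. These are simplified values for copper wire only; actual ampacity depends on insulation type, temperature, and how the cable is run, and the National Electrical Code and your local regulations always have the final say.

```python
# Simplified ampacity figures for copper wire, taken from the table above.
# Ordered from thinnest to thickest; a lower AWG number means a thicker wire.
AMPACITY_BY_GAUGE = [
    (14, 15), (12, 20), (10, 30), (8, 40),
    (6, 55), (4, 70), (3, 85), (2, 95),
]

def minimum_gauge(breaker_amps: int):
    """Return the thinnest gauge from the table that can carry
    the full rating of the breaker protecting the circuit."""
    for gauge, max_amps in AMPACITY_BY_GAUGE:
        if max_amps >= breaker_amps:
            return gauge
    return None  # heavier than anything in the table - get professional advice

print(minimum_gauge(20))  # 12 (a typical 120V kitchen circuit)
print(minimum_gauge(50))  # 6  (a 240V range or welder circuit)
```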

Understanding your Electric Panel

Breaker box

Your utility company supplies electricity to your home using a 2-pole transformer. Each pole supplies 120V. This is a split-phase setup, meaning that a single-phase transformer winding is split at a center tap to provide 2 X 120V hot wires. This allows for both 120V and 240V circuits. There are also ground and neutral connections, which provide the return path for the current. The ground and neutral wires are connected to the same point of the transformer – the center tap.

When a load is connected across the two hot wires, and not the neutral, both halves of the transformer winding supply it, giving us 240V. When a load is connected between one hot wire and the neutral, it gets 120V.
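If it seems odd that two 120V wires can give you 240V, the following Python sketch – my own illustration, using assumed sine waves rather than anything measured – models the two hot legs as 120V waveforms that are 180° out of phase and shows that the voltage between them works out to 240V:

```python
import math

# Model the two 120V split-phase legs as sine waves that are 180 degrees
# out of phase, then check the voltage measured between the two hot wires.
FREQ = 60                      # supply frequency in Hz
PEAK = 120 * math.sqrt(2)      # peak value of a 120V RMS sine wave
SAMPLES = 1000                 # points across one full cycle

def rms(values):
    return math.sqrt(sum(v * v for v in values) / len(values))

times = [i / SAMPLES / FREQ for i in range(SAMPLES)]
leg_a = [PEAK * math.sin(2 * math.pi * FREQ * t) for t in times]            # hot 1 to neutral
leg_b = [PEAK * math.sin(2 * math.pi * FREQ * t + math.pi) for t in times]  # hot 2 to neutral
hot_to_hot = [a - b for a, b in zip(leg_a, leg_b)]                          # hot 1 to hot 2

print(f"Leg A to neutral: {rms(leg_a):.0f}V RMS")       # ~120V
print(f"Leg B to neutral: {rms(leg_b):.0f}V RMS")       # ~120V
print(f"Hot to hot:       {rms(hot_to_hot):.0f}V RMS")  # ~240V
```

Each leg measures 120V to neutral on its own, but because the two legs peak in opposite directions, the difference between them swings twice as far.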

Your electric panel distributes this power safely to your home by supplying various circuits for outlets, built-in appliances, and lights. These circuits are protected by circuit breakers which will trip if the circuit is overloaded or a short circuit occurs. I’ll explain this in more detail as we go.

Electric panel wiring

The power enters the electric panel at the top, through the main circuit breaker. This will be a large switch rated for the total amps that can be supplied throughout the home. The two hot wires from the transformer are connected through this switch.

Two copper bus bars run through the center of the panel. These are the two main hot connections; they are fed by the main breaker and supply all of the branch circuit breakers. The main neutral is connected to a neutral bar, and all the neutral wires in the home are connected to this bar.

The ground connections work in basically the same way as the neutrals: they are all common to a single bar. A second, thicker wire connects the ground bar to a ground rod (spike) driven into the earth outside the home, providing a path into the ground for fault currents and surges and keeping the system voltage stable.

Ground Fault Interrupter

If the breaker box is fitted with a Ground Fault Circuit Interrupter (GFCI), it will detect a difference between the current flowing out on the hot wire and the current returning on the neutral, breaking the circuit when an imbalance occurs. In many cases, the GFCI will be at the outlet instead. Either way, it serves the same purpose: to detect when current is flowing directly to ground. If you touch a hot wire while you are in contact with the ground, current flows through your body into the earth. This is what happens when someone is electrocuted. The GFCI stops the current immediately to protect you from this.

The GFCI can be reset once the danger is eliminated.
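The decision a GFCI makes is simple enough to show in a few lines of Python. This is only a sketch of the logic; real devices sense the imbalance with a small current transformer and trip within milliseconds, and the 5mA threshold below is the typical rating for US household GFCIs:

```python
# A simplified sketch of the comparison a GFCI performs (illustration only).
TRIP_THRESHOLD_AMPS = 0.005  # 5 milliamps, the typical household trip level

def gfci_should_trip(hot_current_amps: float, neutral_current_amps: float) -> bool:
    """Trip when the outgoing and returning currents no longer match,
    meaning some current is leaking to ground (possibly through a person)."""
    leakage = abs(hot_current_amps - neutral_current_amps)
    return leakage > TRIP_THRESHOLD_AMPS

print(gfci_should_trip(8.000, 8.000))  # False - all current returns on the neutral
print(gfci_should_trip(8.000, 7.990))  # True  - 10mA is leaking to ground
```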

Circuit Breakers

Circuit breakers and fuses are used to prevent an electrical fire. The breaker will trip when an overload or short circuit occurs. Once the situation has been rectified, the breaker can be reset by switching it back to the on position.

When a conductor carries more current than it can handle, excessive heat is produced. A 14-gauge wire can only handle a maximum of 15A. So a circuit using #14 wire will be connected to a 15A circuit breaker (or fuse in older installations). If the current exceeds 15A, the breaker will trip by moving automatically to the off position.

Lightning may cause a current surge which will trip the breaker. It can be switched back on afterward, once the amperage is normal.

A short circuit happens when electricity flows directly from hot to neutral without passing through any load resistance. This is usually the result of damaged insulation. A short circuit causes an electric arc which can burn through metal. It will also cause the breaker to trip, which can be reset once the short circuit is corrected.

There are usually two rows of circuit breakers connected to the two hot bus bars in the electric panel. Each breaker is rated at the amperage of the wire connected to it.

120V single-pole breakers connect to one of the hot bus bars; 240V double-pole breakers connect to both hot bus bars.

120V vs 240V

Many people have asked me: why use both 120V and 240V? In most other countries, only a single voltage is used. Is 240V better than 120V?

Determining the ideal voltage for household electricity comes down to a trade-off between efficiency and electrical safety. A lower voltage is safer, as an electric shock will not be as severe. However, a higher voltage is more efficient: the conductors need not be as thick, and voltage loss is lower over long distances. The greatest problem with low-voltage transmission is the loss that occurs as you move further from the point of distribution.

So how did the voltage standards come about?

When Thomas Edison perfected the incandescent light bulb in 1879, the world was changed forever. Before this time, early electric generators were used only for industrial installations and municipal street lighting. Edison’s invention brought electric power into the homes of Americans, and pretty soon the rest of the world followed.

It was the Edison light bulb that determined the initial voltage for domestic power distribution. He decided to use 110V to supply his light bulbs: high enough to produce usable light, without being dangerously high. While 110V isn’t exactly safe, a higher voltage is more dangerous, increasing the risk of fatal electrocution. At this time, the Ground Fault Circuit Interrupter had not been invented, so the risk of electrocution was of greater concern. In Edison’s opinion, 110V was the perfect middle ground between relative safety and efficient lighting.

Using 110V was fine at first. Electric power grids were not as big as they are today, and consumption was relatively low. As new electric appliances became available and the grid expanded to rural areas, the problem of voltage loss became evident.

High electricity consumption, over a greater area, caused the voltage to drop by more than 10% below the nominal value. By this, I mean if the standard is 110V, the actual voltage in the home could be anything from around 100V up to more than 120V, depending on where you were on the line. Electrical equipment can only function normally within about 10% of its nominal voltage – roughly 99V to 121V on a 110V supply.

The Move to Increase Utility Supply Voltage

The first electric utility to see the advantage of using an increased voltage was in the city of Berlin. In 1899, Berlin changed from a 110V to a 220V AC electric supply. Engineers calculated that the savings from using less copper for the wires that supplied the city with electricity would be enough to cover the cost of replacing 110V equipment with 220V alternatives.

Most of the world continued to use 110V for the time being. In North America, this was slowly increased to 120V to improve supply reliability; similarly, Germany later increased its nominal voltage from 220V to 230V.

Through the course of the 1920s and 1930s, Britain and other European countries decided to change to higher voltages in the 220V – 240V range. This improved efficiency and reduced transmission losses. At this time, homes in the region were not highly dependent on electrical appliances, so the cost of replacing appliances to use a higher voltage was offset by the savings in efficiency and lighter conductors.

The US faced a different conundrum. Despite the devastating effects of the Great Depression (starting in 1929), the US power grid grew astronomically during this time, and people were buying new appliances just as enthusiastically. American homes were filling up with appliances like refrigerators, stoves, and washing machines; by the end of the 1930s, around 44% of US households owned a refrigerator.

The cost of replacing all these appliances in every American home was simply too great. Instead, US electric utilities decided to gradually introduce 2-pole transformers for domestic power supply. 120V would be used for low-watt equipment, like lighting and smaller appliances. High-watt equipment, used mostly for heating, would use the more efficient 240V supply.

The Japanese Dilemma

While there is a perfectly logical explanation for the difference in voltage standards between the US and Europe, Japan is an entirely different case. Japan has two electric grids, split between the east and west of the country.

When the first power utilities were established in Japan, they did not cooperate in setting a universal standard for the whole country. The utility in Eastern Japan contracted AEG (a German company) to install its generators. This meant that the eastern half of the country is supplied with 50Hz electricity, at Japan’s 100V standard.

The electricity utility in the west of Japan decided to install General Electric generators, which used the US frequency of 60Hz. As a result, Japan has two electric grids, which presents a lot of problems. Frequency-sensitive equipment cannot simply be used across the whole country without conversion, and, more importantly, the split has a great effect on the reliability of the electricity supply.

In most countries, it is relatively easy to transmit electricity from one grid to another. Most of Europe is connected through a common grid that allows countries to send electricity to any country that needs it. This makes for a very balanced power supply. If one area is experiencing a low demand for electricity, the excess power can be redirected to an area that is experiencing higher demand.

Since Japan uses a different frequency on either side of the country, it is not as feasible to share power between the two grids. While it is possible to convert the frequency of electricity, the equipment is complicated and costly. As a result, each half of Japan is largely responsible for generating its own electricity, and power cannot easily be transferred between Eastern and Western Japan as demand changes.
