Explained: Electrical Current and Voltage
As software engineers, we end up spending a lot of time far away from the physics that makes our equipment work. The point of this article is to explain the relationship between voltage, amperage, and resistance in electrical circuits. If you had physics in college, this is not new. If you didn't, or you don't recall the details… read on!
- Voltage (AKA electrical potential), measured in Volts, is the difference in electrical potential between two points. This difference in potential can be compared to the pressure in a hydraulic system: the higher the pressure, the greater the flow of current.
- Current, measured in Amps, is the rate at which electrons are flowing from point A to point B.
- Resistance, measured in Ohms, is the degree to which a material resists the flow of electrons. The symbol for ohms is 'Ω'.
- Power, measured in Watts, is simply Current multiplied by Voltage.
Ohm's Law - The Easy Part
Due to some details of our electrical infrastructure, outlets and appliances carry labels that make Amps and Watts confusing. I will address those in a minute. First let me explain the simple relationship between Current, Voltage, and Resistance:
German physicist Georg Ohm discovered that:
Current = Voltage / Resistance
What this means to you is that at a particular voltage, the resistance of the circuit determines the Current that flows. ALWAYS. You cannot force more current through a circuit without increasing the voltage of the circuit or reducing its resistance.
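A minimal sketch of Ohm's law in Python (the function name and variables are mine, chosen for clarity):

```python
# Ohm's law: Current (Amps) = Voltage (Volts) / Resistance (Ohms)
def current_amps(voltage_volts: float, resistance_ohms: float) -> float:
    return voltage_volts / resistance_ohms

# The only ways to get more current are a higher voltage
# or a lower resistance:
print(current_amps(120, 240))  # 0.5
print(current_amps(240, 240))  # 1.0 (doubled voltage, doubled current)
print(current_amps(120, 120))  # 1.0 (halved resistance, doubled current)
```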
So, for example, let's take a 60 watt light bulb that runs on standard US voltage.
Watts = Volts * Amps
So, at 120 volts standard US household power:
60 Watts = 120 Volts * x Amps
x = 60 / 120
x = 0.5
So that light bulb will draw 0.5 amps of current at 120 volts. We can determine its resistance as well:
0.5 A = 120 V / r Ω
r = 120 V / 0.5 A
r = 240
So a 60 Watt light bulb has a resistance of about 240 ohms. This resistance limits the current it draws at 120 volts to 0.5 Amps.
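The worked example above can be condensed into a few lines of Python (a sketch; the variable names are mine):

```python
# A 60 W light bulb on 120 V household power.
watts = 60.0
volts = 120.0

# Watts = Volts * Amps, solved for Amps:
amps = watts / volts

# Ohm's law (Amps = Volts / Ohms), solved for resistance:
ohms = volts / amps

print(amps)  # 0.5
print(ohms)  # 240.0
```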
Current Ratings: The Confusing Part
So the confusing part is that most power outlets, power supplies, and even many batteries are labeled with Amps as well as Volts. So you may see an outlet, extension cord, etc. that says something like "120 V, 20 A".
So if the Current (Amps) ratings don't match, you end up with a nagging fear that you are doing something wrong. If you plug your 1 amp, 120 volt appliance into a 20 amp, 120 volt circuit, what will happen? The answer: it will work as normal. The ratings are the MAXIMUM available current. Unless something is wrong with your appliance, it will never draw more current than it needs, as long as the voltage is correct.
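In code form, the compatibility check is just a comparison (a hypothetical sketch; `is_safe` is my own helper, not a real API):

```python
# An appliance only draws the current it needs, so it is safe on a
# circuit whose maximum rated current covers that draw (assuming the
# voltages match).
def is_safe(appliance_amps: float, circuit_max_amps: float) -> bool:
    return appliance_amps <= circuit_max_amps

print(is_safe(1.0, 20.0))   # True: a 1 A appliance on a 20 A circuit is fine
print(is_safe(25.0, 20.0))  # False: this would trip the breaker
```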
Here's the explanation for this labeling:
Since the current in a circuit is limited only by the total resistance of the circuit, a very low resistance can cause massive current to flow through the entire circuit! A simple example of this is plugging both ends of a copper wire into an outlet. You would end up with a circuit that has very low resistance, and the amount of current flowing through the wire would superheat the wire, as well as the wires in your house. It's the same with a battery: if you connect the terminals on a battery with a copper wire, you will get a superheated copper wire and discharge the entire battery in a matter of seconds. This kind of discharge is destructive and can cause a fire or explosion.
To stop this from happening, we use fuses and circuit breakers to interrupt the circuit if the total current goes over a preset safe threshold. So a socket that is labeled “120 V, 20 A” means that it delivers 120V of electrical potential, and that it has a maximum safe current of 20A. If a battery is rated at 500 mA (0.5 A), it means the battery can safely deliver a maximum of 0.5 amps. Unlike household current, the battery itself generally does not have a fuse or a circuit breaker, so it's potentially more dangerous.
In general, if your appliance (or combination of appliances) requires more total power than your circuit breaker allows, you will trip the breaker or blow a fuse. This circuit breaker strategy is used at multiple levels of power distribution. Not only do we have breakers for individual areas of a house, but also for (a) the house as a whole and (b) power substations in each neighborhood and for each district. Without this precaution, the current drawn on any particular circuit could exceed the capacity of the wires carrying it and cause them to overheat.
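As a sketch of why breakers trip, we can sum the draw of several appliances on one circuit (the wattages and the 15 A breaker rating here are illustrative assumptions, not measured values):

```python
# A breaker trips when the total draw on its circuit exceeds its rating.
BREAKER_AMPS = 15.0
VOLTS = 120.0

# Hypothetical appliances sharing one circuit, in Watts:
appliance_watts = [1500, 600, 100]  # e.g. space heater, microwave, lamp

# Amps = Watts / Volts, summed across the circuit:
total_amps = sum(w / VOLTS for w in appliance_watts)

print(round(total_amps, 1))        # 18.3
print(total_amps > BREAKER_AMPS)   # True -> the breaker trips
```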
The current limitation also makes sense because we all essentially share power. Imagine this crazy worst case scenario: Let's say that we used superconducting wires everywhere so we didn't have to worry about overheating. And now imagine that some joker decided to plug one of those wires into his home electrical socket. With zero resistance, 100% of all the available power would flow through this loop, essentially shutting out all other users from access to power.
Summary and Disclaimer
This is not an attempt to explain everything you should know about electricity, just the basics of current, voltage, resistance, and power ratings. Alternating current and direct current are not the same, but for purposes of Ohm's law they behave similarly enough that I haven't gone into any details.