Why Does Current Decrease When Voltage Increases? Explained!


Any electric circuit consists of three main parts: a power source, a conductor, and a resistance (load). The current passing through the circuit can be affected by several factors, each of which can either increase or decrease it. One of those factors is the voltage: depending on the circumstances, the voltage can either decrease or increase the current.

Many applications make use of this relationship between voltage and current, most notably transformers, which are used across many industries. They can either lower the voltage or step it up.

Why Does Current Decrease When Voltage Increases?

Because, according to the power formula I = P/V, the current is inversely proportional to the voltage: when the power is constant and the voltage increases, the current decreases. For clarification, if the power is 10 watts and the current is 2 amperes, the voltage is 5 volts, because the power formula states that P = I × V.

Therefore, if you want to keep the power constant at 10 watts and at the same time increase the voltage, for example to 10 volts, the current will have to decrease to 1 ampere to keep the power at 10 watts (1 × 10 = 10 watts). This trade-off always holds when you keep the power constant and change either the current or the voltage.
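
To make this concrete, here is a minimal Python sketch of the constant-power calculation (the function name and the 10-watt example values are illustrative assumptions, not from any standard library):

```python
def current_at_constant_power(power_w, voltage_v):
    """Current drawn when the power is held constant: I = P / V."""
    return power_w / voltage_v

# 10 W at 5 V draws 2 A; raising the voltage to 10 V halves the current.
print(current_at_constant_power(10, 5))   # 2.0 (amperes)
print(current_at_constant_power(10, 10))  # 1.0 (amperes)
```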

Can Current Increase When Voltage Increases?

According to Ohm’s law V = IR, the current is directly proportional to the voltage, which means increasing the voltage will increase the current. This can cause some confusion about how the current can both increase and decrease as the voltage increases. The answer is that the factor controlling the relationship between the current and the voltage is the power.

When the power is constant, you have to use the power formula P = I × V; in the power formula, the current is inversely proportional to the voltage, which means that when you increase the voltage, the current decreases. This is not the case in Ohm’s law V = IR, because the power is not held constant there.

With a fixed resistance, the current decreases only if the voltage decreases (or the resistance increases); increasing the voltage increases the current, and with it the power, since the power depends on both voltage and current according to the power formula P = I × V.
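
For contrast, here is a small Python sketch of the Ohm’s-law case with a fixed resistance, where raising the voltage raises both the current and the power; the 5-ohm resistor is an assumed example value:

```python
def ohms_law(voltage_v, resistance_ohm):
    """Fixed resistance: I = V / R, so P = V * I grows with the voltage."""
    current_a = voltage_v / resistance_ohm
    power_w = voltage_v * current_a
    return current_a, power_w

# Doubling the voltage across a fixed 5-ohm resistor doubles the current
# and quadruples the power (P = V^2 / R); the power is NOT constant here.
print(ohms_law(10, 5))  # (2.0, 20.0) -> 2 A, 20 W
print(ohms_law(20, 5))  # (4.0, 80.0) -> 4 A, 80 W
```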

What Affects the Current in an Electric Circuit?

The current can be affected by several things; the two most important factors are resistance and voltage. The resistance in an electric circuit always affects the current: if the resistance increases, the current decreases, and if the resistance decreases, the current increases.

You can imagine the relationship between the current and the resistance as if the circuit were a river and the water running through it the current; the resistance is the rocks in the river. When the rocks are bigger and more numerous, less water makes it through the river to the other side.

If you remove the rocks, more water passes to the other side. The voltage, by contrast, does not have a single fixed relationship with the current: it decreases the current only when the power is constant, in which case increasing the voltage decreases the current and vice versa.
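
The river picture can be sketched in a few lines of Python, holding the supply voltage fixed and varying the resistance (the 12-volt supply and the resistor values are assumed for illustration):

```python
# Fixed 12 V supply: more resistance ("more rocks") means less current.
VOLTAGE_V = 12.0

for resistance_ohm in (2.0, 6.0, 12.0):
    current_a = VOLTAGE_V / resistance_ohm  # Ohm's law: I = V / R
    print(f"{resistance_ohm:4.1f} ohm -> {current_a:3.1f} A")
# 2.0 ohm -> 6.0 A
# 6.0 ohm -> 2.0 A
# 12.0 ohm -> 1.0 A
```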

What Is the Difference Between Ohm’s Law and the Power Formula?

The difference between the power formula and Ohm’s law, as far as the current and its relationship with the voltage are concerned, is the constancy of the power. The constancy of the power is the differentiating factor between the two relations: in Ohm’s law the power does not have to be constant and can vary as either the voltage or the current increases or decreases.

Meanwhile, the power formula assumes that the power is constant; therefore, increasing or decreasing either the current or the voltage affects the other in an inversely proportional way. This can be seen in step-up and step-down transformers: when using a step-up transformer, the current decreases as the voltage increases.

A step-down transformer does the opposite; it increases the current and decreases the voltage. Transformers depend on the power formula, where the power is (ideally) constant.
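
A rough sketch of this constant-power behavior, assuming an ideal (lossless) transformer where Vp × Ip = Vs × Is; the example numbers are illustrative:

```python
def secondary_current(v_primary, i_primary, v_secondary):
    """Ideal transformer: power in = power out, so Is = Vp * Ip / Vs."""
    return v_primary * i_primary / v_secondary

# Step-up example: 220 V at 10 A in, 2200 V out -> the current drops
# to 1 A while the power stays at 2200 W on both sides.
print(secondary_current(220, 10, 2200))  # 1.0 (amperes)
```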



How to Decrease Current and Increase Voltage?

To decrease the current and increase the voltage, the right solution is a transformer; transformers are widely used in electronics and electrical devices. A transformer is a static electromagnetic device based on Michael Faraday’s principle of electromagnetic induction. Transformers come in different types, such as step-up and step-down transformers.

Step-up Transformer

A step-up transformer is used to increase the voltage of a circuit while decreasing the current; this happens because of the ratio between the primary and secondary windings. The number of turns in the primary winding is lower than the number of turns in the secondary winding, which results in a higher voltage than the one that entered the transformer.

Step-down Transformer

A step-down transformer is the exact opposite of the step-up transformer: it is used to lower the voltage and increase the current, again because of the ratio between the primary and secondary windings. The number of turns in the primary winding is higher than the number of turns in the secondary winding, which results in a lower voltage than the one that entered the transformer.
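
Both cases follow from the turns ratio of an ideal transformer: Vs = Vp × Ns/Np and Is = Ip × Np/Ns, so the power Vs × Is stays equal to Vp × Ip. A minimal Python sketch, with assumed winding counts for illustration:

```python
def transformer_output(v_primary, i_primary, n_primary, n_secondary):
    """Ideal transformer: Vs = Vp * Ns/Np, Is = Ip * Np/Ns (power unchanged)."""
    v_secondary = v_primary * n_secondary / n_primary
    i_secondary = i_primary * n_primary / n_secondary
    return v_secondary, i_secondary

# Step-up (more secondary turns): voltage rises, current falls.
print(transformer_output(220, 10, 100, 1000))  # (2200.0, 1.0)
# Step-down (fewer secondary turns): voltage falls, current rises.
print(transformer_output(220, 10, 1000, 100))  # (22.0, 100.0)
```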

Can the Inverse Relationship Between Current and Voltage Be Useful?

Decreasing the current to increase the voltage, or the opposite, is very useful in industry and the electrical field; for example, power stations depend on this inverse relationship between the current and the voltage. Power is sent from the power stations to distribution systems through transmission lines.

A step-up transformer is used at the power station to raise the voltage for transmission, and step-down transformers near the consumers lower it again for distribution; the purpose of transmitting at high voltage is to decrease the loss in the transmission lines. For example, consider a load that draws 10 amperes at 220 volts, fed once directly and once through a high-voltage line.

In the first system, the transmission voltage is 220 volts and the load draws a current of 10 amperes; therefore the power, according to the power formula, is 10 × 220 = 2200 watts. To calculate the loss, we need to assume a line resistance, say 0.5 ohm, which makes the loss I²R = 0.5 × 10² = 50 watts.

In the second system, where a 220 V/10 kV step-up transformer is used, the primary current drawn by the load is 10 amperes; this makes the secondary (line) current Is = Vp/Vs × Ip = 220/10,000 × 10 = 0.22 amperes. The loss is then 0.5 × (0.22)² ≈ 0.0242 watts. As you can tell, the transformer is very useful and saves a lot of power.
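
The same comparison in a short Python sketch, using the numbers above (a 0.5-ohm line, a 220 V / 10 A load, and an assumed 10 kV transmission voltage):

```python
LINE_RESISTANCE_OHM = 0.5
LOAD_VOLTAGE_V = 220.0
LOAD_CURRENT_A = 10.0

def line_loss_w(current_a):
    """Power dissipated in the transmission line: I^2 * R."""
    return current_a ** 2 * LINE_RESISTANCE_OHM

# System 1: transmit directly at 220 V, so the line carries the full 10 A.
print(line_loss_w(LOAD_CURRENT_A))  # 50.0 W

# System 2: step up to 10 kV; the same 2200 W flows at only 0.22 A.
line_current_a = LOAD_VOLTAGE_V / 10_000 * LOAD_CURRENT_A  # 0.22 A
print(line_loss_w(line_current_a))  # ~0.0242 W
```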

Conclusion

To sum up, the current decreases when the voltage increases only if the power is constant: according to the power formula I = P/V, the current is inversely proportional to the voltage, so when the power is constant and the voltage increases, the current decreases. Therefore, if you want to increase the voltage and decrease the current simultaneously, you need to keep the power the same.

A varying power means that the current and the voltage no longer constrain each other in this inverse way: you can increase or decrease the voltage without the current having to move in the opposite direction, because the power simply changes along with the voltage.

