Why do you need a current limiting resistor to drive an LED if the LED draws only as much current as it requires?

The current through the LED is given by I = (Vin - Vf) / Rs, where Vin is the applied supply voltage, Vf is the forward voltage drop of the LED, and Rs is the series resistance.
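As a quick sanity check, the formula above can be expressed as a small function (the specific voltage and resistance values below are just illustrative assumptions, not from the answer):

```python
def led_current(vin, vf, rs):
    """Current through the LED in amps: I = (Vin - Vf) / Rs."""
    return (vin - vf) / rs

# e.g. a 5 V supply, a 2 V forward drop, and a 150 ohm series resistor:
print(led_current(5.0, 2.0, 150))  # -> 0.02 A, i.e. 20 mA
```

Note how the current depends only on the voltage left over after the forward drop, divided by the series resistance.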

If the series resistance is too low, the current through the LED will exceed the operating current specified in the datasheet, and the excess heat will eventually destroy the LED. If the series resistance is too high, the LED will shine dimly.

The series resistor protects the LED by limiting the current drawn from the input supply as long as the supply is a voltage source.

An LED that is conducting has an internal resistance of only about 10 to 20 ohms. Current equals voltage divided by resistance: with 10 ohms and 3 volts, the current is 3/10 = 0.3 amps, which corresponds to 300 milliamps (mA).

But an LED only requires about 20 mA, so 300 mA is more than ten times too much and would burn out the LED. That is why you need a current-limiting resistor: resistance = voltage divided by current, or 3 / 0.02, which is 150 ohms.
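The resistor calculation above can be sketched the same way. The answer's simplified arithmetic treats the full 3 V as appearing across the resistor (i.e. it ignores Vf); a more complete version would subtract the forward drop first, as shown in the second call below (the 5 V / 2 V values there are illustrative assumptions):

```python
def series_resistor(vin, vf, i_led):
    """Series resistance needed to limit the LED to a target current."""
    return (vin - vf) / i_led

# The answer's simplified example, with Vf treated as zero:
print(series_resistor(3.0, 0.0, 0.02))  # -> 150.0 ohms

# Accounting for a forward drop, e.g. a 5 V supply and a 2 V Vf:
print(series_resistor(5.0, 2.0, 0.02))  # -> 150.0 ohms
```

In practice you would round the result to the nearest standard resistor value at or above the calculated one.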
