THE GIST

If you want to put an LED in a DC circuit and need to know how to keep from smoking it, read on. I will explain how to calculate the forward bias (current limiting) resistor here.

Folks... biasing a Light Emitting Diode (LED) is simple. I am not saying this simply because I understand how to do it. It truly is easy if a few steps are followed.

To begin, you will need to know the LED's forward bias voltage and its current rating. Often these are around 1.5 to 2.1 VDC and 10 to 20 milliamps (.010 to .020 Amps). To be sure, check the LED's datasheet for the details on your device. Let's try it...

If you need to use an LED in a circuit, the first thing to know is its forward bias voltage. Most common LEDs are around 1.5 VDC, but check the technical information for the device you have; if you don't have it, start with 1.5 VDC. Once you have the forward bias voltage you're set. Now, subtract the bias voltage from the source voltage, then divide the answer by the LED's current rating.
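If you'd rather let code do the arithmetic, here is a minimal Python sketch of that same formula. The function name and parameter names are my own for illustration; only the math comes from the steps above.

```python
def led_resistor(supply_v, led_forward_v, led_current_a):
    """Current-limiting resistor (in ohms) for an LED in a DC circuit.

    supply_v      -- source voltage in volts
    led_forward_v -- LED forward bias voltage in volts (check the datasheet)
    led_current_a -- LED current rating in amps (e.g. .020 for 20 milliamps)
    """
    # Subtract the LED's forward voltage from the supply voltage,
    # then divide the remainder by the desired current (Ohm's law).
    return (supply_v - led_forward_v) / led_current_a
```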

For example: if the LED's bias is 1.5 VDC, its current is 20 milliamps, and the supply voltage is 13.8 VDC, do this:

13.8 - 1.5 = 12.3 VDC.

Now divide the answer you just found (12.3) by the current the device needs. Since our LED needs 20 milliamps, we divide 12.3 by .02. So... 12.3 ÷ .02 = 615. This is the amount of resistance needed to protect the LED you're using in the 13.8 volt circuit: a 615 ohm resistor. One more check: the resistor dissipates 12.3 volts × .02 amps = .246 watts, so a standard resistor around 620 ohms rated at 1/2 watt or more would properly protect your LED.
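Plugging the example numbers into the sketch above confirms the hand calculation (the power line is just volts times amps across the resistor):

```python
print(led_resistor(13.8, 1.5, .020))  # 615.0 ohms
# Power the resistor must dissipate: P = V x I
print(12.3 * .020)  # 0.246 watts -- use at least a 1/2 watt resistor
```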

So... connect your LED in series with the 620 ohm resistor between the battery + and - leads and the LED should light. If it doesn't, flip the LED around... LEDs are polarized.

BTW: This same technique applies to forward biasing transistors too. If the transistor's base needs .01 amps and it is a common silicon type (.7 VDC base-emitter drop) or a germanium type (.3 VDC), subtract the base voltage from the source voltage and divide by the amps... tada... bias resistance... awesome!
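The same sketch covers this case if you treat the base-emitter drop like the LED's forward voltage. A hypothetical example, assuming a silicon transistor and the same 13.8 VDC supply:

```python
# Base resistor for a silicon transistor (.7 VDC drop) needing .01 amps
# of base current from a 13.8 VDC supply.
print(led_resistor(13.8, 0.7, .010))  # 1310.0 ohms
```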

