Hello,

I have found myself without a spec sheet but with a supply of working 940nm IR LEDs. I have been able to identify the forward voltage (1.25V) and for now have chosen a low-value resistor (270 ohms). From this I can determine the current through the LED. I would like to know the LED's maximum forward current, though, rather than experimenting with the LEDs and destroying a few. The second question is what percentage of that maximum forward current should be used for everyday operation, e.g. 70%?

Thank you.

This might work.

1. The voltage drop across an LED, just like most diodes, is junction temperature dependent, decreasing by ~2mV per degree C.

2. Maximum rated LED junction operating temperature is generally around 100C.

3. Junction temperature rise is (approximately) proportional to current.

So...

You could apply a moderate test current (e.g., 20mA) to one of your LEDs and observe the voltage change as it warms up. After giving it a minute or so for the temperature to stabilize, divide the observed voltage change from cold to hot by 0.002V/C to get the approximate temperature rise for the test current. For example, if the voltage dropped from 1.25V (cold) to 1.2V (hot), the voltage change would be 1.25V - 1.2V = 0.05V, and the corresponding temperature rise 0.05V / 0.002V/C = 25C.
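That first step can be sketched in a few lines of Python. The 0.002V/C tempco is the rough figure from point 1 above, not a datasheet value, so treat the result as an estimate:

```python
# Estimate junction temperature rise from the forward-voltage droop,
# assuming the ~2mV per degree C tempco mentioned above (approximate;
# the real value varies from part to part).
TEMPCO_V_PER_C = 0.002

def temp_rise_c(v_cold, v_hot, tempco=TEMPCO_V_PER_C):
    """Approximate junction temperature rise (C) at the test current."""
    return (v_cold - v_hot) / tempco

# Worked example from the text: 1.25V cold, 1.2V hot
print(round(temp_rise_c(1.25, 1.20), 1))  # -> 25.0
```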

Then divide the test current by the calculated temperature rise to get Amps/degree. For the example, that would be 0.02A / 25C = 0.0008A/C

Then choose a maximum junction temperature rise, e.g. 75C for 100C junction temperature in a 25C ambient, to compute max operating current. In this case 0.0008A/C x 75C = 0.06A = 60mA.
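Putting all three steps together, here is a hedged sketch of the whole calculation. The 100C junction limit and 0.002V/C tempco are the approximations from the points above, not datasheet figures, so the result is a rough ceiling rather than a rating:

```python
# Rough maximum-current estimate for an unknown LED:
# test current + cold/hot forward voltages -> temperature rise
# -> amps per degree C -> max current for a chosen junction limit.

def max_led_current(i_test, v_cold, v_hot,
                    t_ambient=25.0, t_junction_max=100.0,
                    tempco=0.002):
    rise = (v_cold - v_hot) / tempco       # C of rise at the test current
    amps_per_deg = i_test / rise           # assumes rise is proportional to I
    allowed_rise = t_junction_max - t_ambient
    return amps_per_deg * allowed_rise

# Worked example from the text: 20mA test, 1.25V cold, 1.2V hot
print(max_led_current(0.020, 1.25, 1.20))  # -> ~0.06 (i.e. 60mA)
```

Note the proportionality assumption in point 3 is only approximate, so it would be sensible to derate the result before settling on an everyday operating current.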