I’m a complete beginner working on a simple experiment to measure the I–V characteristics of LEDs. For most visible LEDs, my results look as expected: the current increases exponentially once the forward voltage exceeds the built-in potential, and then the curve stays smooth.
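By "exponentially" I mean the ideal Shockley diode law, which is what I’m comparing my curves against:

$$ I = I_S \left( e^{V / (n V_T)} - 1 \right) $$

where $I_S$ is the saturation current, $n$ the ideality factor, and $V_T = kT/q \approx 26\ \mathrm{mV}$ at room temperature.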
However, with an infrared LED I noticed something strange. Above the threshold voltage (around 1 V), the current rises exponentially at first, but then the slope changes abruptly and the curve becomes almost linear, much flatter than the exponential trend before it.
This is the setup:
- Infrared LED, forward-biased
- Series resistor: 500 Ω
- Current measured with an ammeter, voltage measured across the LED with a voltmeter
- Current limited to about 20 mA
- The same setup used with other LEDs (red, green, blue) gives the expected smooth exponential curves
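For reference, here’s a minimal sketch of the ideal curve I’m comparing against; the values of `I_S` and `n` are illustrative guesses, not fitted to my data:

```python
import numpy as np

# Ideal Shockley diode model: I = I_S * (exp(V / (n * V_T)) - 1)
# I_S and n are illustrative guesses, not fitted to my measurements.
I_S = 1e-12    # saturation current [A] (assumed)
n = 2.0        # ideality factor (assumed; roughly typical for LEDs)
V_T = 0.0259   # thermal voltage kT/q at ~300 K [V]

for V in np.linspace(0.8, 1.2, 9):
    I = I_S * (np.exp(V / (n * V_T)) - 1.0)
    print(f"V = {V:.2f} V -> I = {I * 1e3:8.4f} mA")
```

With these numbers the current only becomes appreciable around 1 V, which matches the threshold I see, but the model stays exponential and never flattens the way my IR data does.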
What could cause this sudden change of slope above the threshold voltage in an infrared LED?
P.S. The LED wasn’t warm to the touch.
