I'm using the OPA551PA amplifier to charge a capacitor to various voltages. After thermal calculations, the worst case scenario is depicted here: TA = 40 °C, Vin = 36 V, Vout = 18 V, Iload = 50 mA. Under these parameters the chip should reach a junction temperature of 125 °C, which is obviously too hot.

There's a caveat, however: this is a transient condition. We are charging a capacitor, so the charging current decreases dramatically over time. My graph plots the capacitor's charging current and the resulting thermal heating vs. time. You can see that we are only in worrisome territory for some ~200 ms; within the next 500 ms it settles back to the resting junction temperature calculated at the quiescent current. This transient also only happens once every 5 seconds. And again, this is the worst case! In the typical condition the calculated junction temperature exceeds 80 °C for only 50 ms and returns to 40 °C within 200 ms, with this cycle repeating once every 4 seconds.
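For reference, here's the quick sanity check behind those numbers. The θJA of ~94 °C/W is my assumption, back-calculated from the figures above; the datasheet value for the PDIP package should be in that ballpark:

```python
# Back-of-envelope check of the steady-state worst case.
# THETA_JA is an assumption consistent with the numbers quoted above;
# check the OPA551PA datasheet for the actual PDIP-8 value.

T_A = 40.0          # ambient, deg C
V_SUPPLY = 36.0     # supply voltage, V
V_OUT = 18.0        # output voltage, V
I_LOAD = 0.050      # peak charging current, A
THETA_JA = 94.0     # junction-to-ambient thermal resistance, deg C/W (assumed)

P_D = (V_SUPPLY - V_OUT) * I_LOAD   # power dissipated in the amplifier
T_J = T_A + P_D * THETA_JA          # steady-state junction temperature

print(f"P_D = {P_D:.2f} W, T_J(steady) = {T_J:.1f} C")
# -> P_D = 0.90 W, T_J(steady) = 124.6 C
```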
My question is: how long does it actually take a chip to heat up to the calculated junction temperature? If this were a sustained load, obviously this would be a problem. But does the chip in real life actually reach those temperatures within a <150 ms window?
It's like putting a pan on the stovetop. Say your stove is at 100 °C: the pan's temperature will not immediately jump to 100 °C; it rises gradually until the two match.
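To put rough numbers on that intuition, here's a minimal single-pole thermal model of the transient. I'm assuming θJA for the thermal resistance, a guessed lumped thermal capacitance, and an exponentially decaying charging current; real packages have several time constants (the die heats far faster than the plastic), so treat this as a sketch, not a prediction. The transient thermal impedance curve in a datasheet, where one is provided, is the real reference:

```python
import math

# Single-pole thermal model: C_th * dT/dt = P(t) - (T - T_A) / R_th.
# R_TH is the assumed theta_JA; C_TH and TAU_I are guesses for
# illustration only.
T_A = 40.0     # ambient, deg C
R_TH = 94.0    # deg C/W, assumed theta_JA (as above)
C_TH = 0.05    # J/deg C, guessed lumped thermal capacitance
TAU_I = 0.1    # s, assumed decay time of the charging current
P_PEAK = 0.9   # W, worst-case dissipation at the 50 mA peak

def power(t):
    # Dissipation falls off with the capacitor charging current.
    return P_PEAK * math.exp(-t / TAU_I)

dt = 1e-3
T = T_A
for step in range(1000):                  # simulate 1 second, Euler steps
    t = step * dt
    T += (power(t) - (T - T_A) / R_TH) / C_TH * dt
    if step % 100 == 0:
        print(f"t = {t:.1f} s  T_J = {T:.1f} C")
```

With these guesses the thermal time constant (R_th × C_th ≈ 4.7 s) dwarfs the ~200 ms pulse, so the junction only climbs a couple of degrees before the pulse ends, which is exactly the pan-on-the-stove behaviour. A real die's fast internal time constant means it heats more than this lumped model suggests, though.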
I'm planning on using a much more powerful chip anyway, so I can keep junction temperatures close to ambient, but I'm still extremely curious about this.
