Interesting.
What temp would hurt a typical cpu then?
You looking for the academic response (Arrhenius and such) or the layman's (sh!t don't burn till it burns) type response?
The thermally activated stuff (reflow of the solder and so on) is all Fick's second law driven: temperature and time. It is happening at all temperatures above absolute zero, but Arrhenius gives us a rule of thumb that the rate roughly doubles for every 10C increase in temperature.
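For the curious, that rule of thumb falls out of the Arrhenius acceleration factor; a minimal sketch, assuming a single dominant activation energy around 0.8 eV (a typical ballpark for silicon wear-out mechanisms, not a spec number):

```latex
% Arrhenius acceleration factor between a baseline junction temperature T_1
% and a hotter temperature T_2 (both in kelvin); expected lifetime scales as 1/AF.
AF = \exp\!\left[\frac{E_a}{k}\left(\frac{1}{T_1} - \frac{1}{T_2}\right)\right]
% Example (assumed E_a \approx 0.8\,\mathrm{eV},\ k = 8.617\times 10^{-5}\,\mathrm{eV/K}):
% T_1 = 371\,\mathrm{K}\ (98\,\mathrm{C}),\ T_2 = 381\,\mathrm{K}\ (108\,\mathrm{C})
% \Rightarrow AF \approx e^{0.66} \approx 1.9, i.e. roughly 2x faster degradation per 10C step.
```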
Like the destabilization of metastable diamonds, given enough time at room temperature the CPU will be "hurt". But the academic response is probably not very satisfying.
At 130C and full volts an IC might last a few months at continuous load, perhaps a year. They are generally built to survive 10 yrs at the max temp in the spec; for the 2600K that is 98C.
Increase your temp to 108C and that 10 yrs becomes 5 yrs (the Arrhenius equation governing the kinetics of the rate-limited degradation mechanisms), increase it to 118C and it becomes 2.5 yrs, and 128C becomes 1.25 yrs.
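If you want to play with the numbers yourself, here is a rough sketch of that halving rule in Python. The 10 yr / 98C baseline comes from the post above; the exact "halve per 10C" step is the rule of thumb, not a datasheet figure:

```python
# Rough lifespan estimate from the "rate doubles every 10C" rule of thumb.
# Baseline assumption: ~10 years of continuous operation at the 98C spec limit (2600K).
BASE_LIFE_YEARS = 10.0
BASE_TEMP_C = 98.0
DOUBLING_STEP_C = 10.0  # Arrhenius rule of thumb, not an exact constant

def estimated_life_years(temp_c: float) -> float:
    """Halve the expected lifespan for every 10C above the baseline temperature."""
    return BASE_LIFE_YEARS / 2 ** ((temp_c - BASE_TEMP_C) / DOUBLING_STEP_C)

if __name__ == "__main__":
    for t in (98, 108, 118, 128):
        print(f"{t}C -> ~{estimated_life_years(t):.2f} years")
    # 98C -> ~10.00, 108C -> ~5.00, 118C -> ~2.50, 128C -> ~1.25
```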
Go higher than that and the thermal degradation mechanisms keep cutting into the lifespan, but another, more serious issue comes into play: solder reflow and the shear stress from the mismatch in coefficients of thermal expansion.
This happens outside the CPU die itself but within the CPU package, and it kills the CPU for all practical purposes. It's what killed Nvidia's GPUs in the infamous "bumpgate" debacle, and it is what will kill your CPU if you take it much above 130C (say the 140-150C territory).
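To put a rough number on that mismatch (the material values and dimensions below are ballpark illustrations I'm assuming, not figures from the post): silicon expands at roughly 2.6 ppm/C while an organic package substrate sits in the mid-teens, so over a big temperature swing the die and the substrate want to move by noticeably different amounts, and the solder joints in between have to absorb the difference as shear.

```python
# Ballpark illustration of the CTE-mismatch displacement the solder joints must absorb.
# All values are assumed typical figures for illustration, not from any datasheet.
CTE_SILICON = 2.6e-6     # 1/C, silicon die
CTE_SUBSTRATE = 17e-6    # 1/C, organic package substrate (assumed)
HALF_DIE_MM = 10.0       # mm, distance from package center to die edge (assumed)
DELTA_T = 100.0          # C, temperature swing

# Differential expansion at the die edge, converted from mm to micrometres
mismatch_um = (CTE_SUBSTRATE - CTE_SILICON) * DELTA_T * HALF_DIE_MM * 1000
print(f"~{mismatch_um:.0f} um of shear displacement across the outermost joints")
# Joints are only tens of micrometres tall, so that is a large shear strain,
# which is why extreme or repeated thermal excursions crack them.
```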