So basically, you have no idea, hence the attempt to push the onus back on me to prove a negative.
A properly functioning processor is a deterministic machine. For any given set of inputs, it will always produce the same outputs. A malfunctioning processor (due to faulty hardware or overclocking) becomes non-deterministic. You expect a + b to equal c, but instead the processor hands you back z as a result. Pretty much by definition a processor with a "stable overclock" functions identically to a non-overclocked processor - i.e., correctly. The software doesn't change at all when the processor is overclocked.
If there's a statistically significant difference in the number of "bugs" you see when running a program on an OC'd CPU, then you can thank your unstable overclock.
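To make the determinism point concrete, here's a minimal sketch of the consistency check that stress testers boil down to: run the same deterministic workload repeatedly and compare results. The function names and constants here are mine for illustration, not from any real tool. On correctly functioning hardware (stock or stable OC), every run matches; a mismatch means the CPU computed a + b and handed back z.

```python
def workload(seed: int) -> int:
    """A deterministic integer workload: same input always yields the same output."""
    x = seed
    for _ in range(100_000):
        # 64-bit linear congruential step; purely arithmetic, no I/O or randomness
        x = (x * 6364136223846793005 + 1442695040888963407) & (2**64 - 1)
    return x

def consistency_check(runs: int = 5, seed: int = 42) -> bool:
    """True if every repeat of the workload matches the first result."""
    reference = workload(seed)
    return all(workload(seed) == reference for _ in range(runs))

print(consistency_check())  # True on deterministic hardware
```

Of course, a toy loop like this only exercises one tiny corner of the chip, which is exactly the limitation being argued about below: passing one workload says nothing about workloads that hit different execution units.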
You're putting words in my mouth and assuming I don't know. You are the one who stated it was ludicrous, so it is on you to explain your reasoning. I had already explained mine.
I never said software changes because of overclocks in any way, just like I never said an unstable overclock guarantees an issue. Again, people w/o reading comprehension. What does happen is that software has expectations, and those expectations are either met or not met. I asked you what you use to determine a stable overclock for a reason. Software, right? Are you going to determine it any other way? As long as the software gets what it determines is correct in terms of processes/interactions, it will function (as will the CPU and everything else). Stress tests serve as a good method for weeding out instability; however, they are not absolute. What works for one application may not work for another. There are hundreds of documented cases of this happening since overclocking became a 'thing'. I still say that most of this would be covered under the case of "bugs", but again, if out of 100 apps only 1 has an issue and removing/lowering the OC fixes it, the overclock is unstable. Just because you have only used 99 of those apps and haven't seen an issue doesn't mean the possibility does not exist.
There is no way to 100% guarantee an overclock, period. Real-world application is the true test of an overclock, not stress tests. As stated earlier, they will provide very good feedback on stability, and in no way am I belittling the need for them, but in no way are they 100% all the time.
I do agree with your last statement "IF" removing that overclock solves your bug issue. Once again, your system could have run everything you ever threw at it flawlessly for years, but your system is only as stable as the last time it failed. I've never said an OC is a guarantee of failure, simply that the possibility will always exist, regardless of how stable you think your system is. Some applications are just touchy (or insert whatever you want here, since we want to over-analyze it).
All in all, this discussion went off the rails because people around here have poor reading comprehension. As of yet, no one has explained why they think OCs are stable 100% of the time. Hardware alone and software alone do not make for stability. Out of spec is out of spec; that's why it's called out of spec. It may run awesome. Or it may not.
I do agree with cmdr that taking this to overclock.net could be interesting. I have no problem having an intelligent discussion about this. Unfortunately, that isn't what happens, and you get responses like what we've seen here today. Also, it's hard to have these types of discussions w/o getting really wordy.