First of all, whether laptops are meant for gaming or not is a moot point and not the heart of the issue. Also, modifying the chipset is not something you should have to do, nor is it something you should do, because it voids the warranty. That's exactly what the OEMs want: once you've modified it yourself, they're off the hook when the chipset fails.
Better quality control and better testing in real-world environments is what should have happened, both by the chipset maker (nVidia) and by the OEM builders (Dell, HP, Compaq, etc.). Both segments of the market dropped the ball completely and let an inferior component, one prone to early failure, slip into their products, causing countless hardships and costing consumers possibly millions of dollars in data loss and inconvenience.
For example, if Dell makes a desktop machine and the PSU lacks proper safety features, catches fire, and burns your house down, no one is going to say, "Hey, swap out the PSU for an Antec to solve that problem." It's ridiculous to suggest that the average user, who knows nothing about the guts of their laptop or desktop, modify it in any way for better cooling, or to claim that everything is fine as long as you don't do any real gaming on a machine using this chipset. Besides, gaming would only raise the GPU's temperature by a few degrees Celsius, which shouldn't matter. Some people bought a system with this chipset precisely because it boasted better gaming performance than the alternatives and they wanted a cheap, light gaming system, whether laptop or desktop.
I love nVidia, and they make solid products that usually outperform AMD/ATI while producing less heat and using less power, but they definitely dropped the ball on this one, and the OEMs didn't research the problem or test the chipsets enough before deciding to use them.