WelshBloke,
I believe the change from the old (working) generations to the new (busted) dual-screen behavior happened because GPUs before the ATI Radeon HD 2xxx series used dedicated 2D hardware, whereas everything is done in shaders now. I may be wrong about the exact generation, but that would be my conjecture for why the extra screen causes such a big jump in temperature. [This is also partly why 2D acceleration took a step backward with the move to general-purpose hardware, iirc.] If I had to guess, using two identical monitors avoids the issue because resources can be shared: the card only needs to generate one clock signal, the framebuffers can probably share the same memory layout, and so on.
mnewsham, if you upgraded your drivers to something recent, you'd probably hit the same issue. I remember (vaguely) the hubbub about this a couple of months back: the fix came via drivers that raise the clock speed whenever a second monitor is attached. *Some* people saw flickering with dual monitors, because the 5xxx series underclocked so aggressively on the desktop, and their complaints pushed AMD to ship this blunt fix.
This is all driver-related, so grab a tuning utility and experiment with the 2D clocks until you find something that suits your needs. Hardly ideal, but it's the best I can offer. It reminds me of the 'fix' for the bumpgate fiasco, where new drivers just ran the fan harder to minimize temperature fluctuations.
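If you happen to be on Linux with Catalyst, the Overdrive interface in `aticonfig` lets you read and set the clocks directly, so you can experiment without a third-party utility. The clock values below are just placeholders to illustrate the syntax; pick numbers sensible for your card, since setting clocks too low can cause the flickering mentioned above:

```shell
# Unlock the Overdrive clock controls (one-time step).
aticonfig --od-enable

# Show the current and allowed core/memory clock ranges.
aticonfig --od-getclocks

# Set core and memory clocks (MHz). 400/900 is an example,
# NOT a recommendation -- check your card's defaults first.
aticonfig --od-setclocks=400,900
```

On Windows, utilities like MSI Afterburner expose the same underlying controls through a GUI, so the same trial-and-error approach applies there.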
Obviously I'm going off memory here (which has been deteriorating lately), so take all of this as 'generally correct, but probably wrong on the details'.