nullpointerus
I thought they already covered this?

Originally posted by: SickBeast

Originally posted by: nRollo
It takes multiple high-end GPUs to run current games at high res/high detail now - what will change in this future era? Will devs be going back to Quake2-level graphics so we don't need discrete anymore? Or will GPU design be so advanced that it somehow does the same work in a package that will fit on the same die with the CPU?

That's not even true. My 8800GTS 320MB, a midrange card, runs every game out there at high res and high settings, aside from Crysis (which nothing can run at those settings).

The one point that both you and Ben have failed to address remains: What will happen to nVidia when their low-end GPU sales are siphoned off by these Fusion-type processors? Most people are casual gamers and are happy if their PC can run Solitaire. I'm thinking this is bound to affect them financially.
In case you haven't been following the CPU scene, AMD is having enough trouble competing (profitably) with Intel on non-Fusion processors. Any "free die space" that AMD gains from a process shrink or other manufacturing advances (where AMD is perpetually behind, having only a fraction of Intel's budget) ought to go toward additional logic or cache to make the chip run faster, cooler, or whatever. Or AMD could just accept the reduced cost and sell the chips cheaper at the same profit margins.
Phenom is...well...looking bleak. B3 isn't shaping up to be stellar either; it's more like damage control. AMD lacks momentum with the enthusiast community, and they're low on cash. They can never match Intel's budget or manufacturing capabilities, even if the K10/K10.5 design is significantly better. And that's assuming AMD doesn't run into another slew of B2-style problems like the TLB bug, the slow third core, or the crippled MMU/cache... Sheesh.
Intel isn't exactly sitting still.
Throwing stripped-down GPU cores into the mix...well, it doesn't make any sense to me. That just increases complexity, which means higher cost and more chances to make mistakes. When you want to grab market share, you leverage your strengths; you don't drown out your competitors with a constant stream of hype about how one day you'll grind them under your feet with the crushing weight of your own weaknesses.
Aren't integrated GPU profits chump change compared to CPU profits?
Fusion strikes me as marketing mumbo-jumbo: smoke and mirrors to make the execs look like they know what they're doing, or to make the company attractive to potential investors/buyers. That's just a first impression, but the reason I say it is that the people I see talking about Fusion on the forums don't seem to have any specific idea what it's supposed to do, or how it's supposed to do whatever it is that it's supposed to do. The only thing mentioned is some vague notion that tying the CPU and GPU into one package will result in...something financially good?...for AMD.