will never
No one said "will never", just not this year.
I think it would cost too much for a consumer card to have 16GB.
I did read that Vega will be 8GB and the Pro Vega will be 16GB, but I can't find it now.
Is there any proof of the above?
Either you are trolling, or you are not thinking through the things you post. In any case, please don't post any more clueless quotes.
Quote: "Tech Report got their hands on one. They said performance was between a 1070 and 1080."
Why are people thinking that 8GB of VRAM is a bad thing? We don't need 16GB on a consumer card; 4GB on the Fury X already does great at 4K.
Is $999 your speculation, or are there news/rumours to that effect?
That AMD card will cost $600, just like Zen. For the price of a Titan, people will have a whole high end rig. They will take their savings and go have a latte.
What they forget to take into account is that the Vega run was with V-sync fixed at 60Hz.
It would be the same for a Titan X, a full GP102, or Volta: with V-sync enabled on a 60Hz monitor, you won't see above 60fps.
This should be very clear to anyone on a technical hardware forum.
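To spell that point out, here is a minimal sketch of the arithmetic (the 11 ms render time is a made-up number purely for illustration, not a measured Vega figure):

# Why a 60Hz V-sync cap hides GPU headroom: with V-sync on, presentation waits
# for the next 16.67 ms refresh interval, so anything faster still shows 60fps.
REFRESH_INTERVAL_MS = 1000 / 60            # 60Hz monitor
render_time_ms = 11.0                      # assumed raw GPU render time per frame
effective_frame_time_ms = max(render_time_ms, REFRESH_INTERVAL_MS)
print(f"Uncapped: {1000 / render_time_ms:.0f} fps")           # ~91 fps
print(f"V-synced: {1000 / effective_frame_time_ms:.0f} fps")  # 60 fps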
Lastly, I would like to add that the demo was run on alpha drivers, as you have been told many times already.
Surprisingly, the demo attendant let me turn on Doom's revealing "nightmare" performance metrics, and I saw a maximum frame time of about 24.8 ms after large explosions.
Seems like all hands on deck lately
An 8GB framebuffer is still more than acceptable. When Fury came out, its 4GB limitation was already borderline. Sure, more would be better, but it's not even close to the same situation.
And obviously a ~525mm2-ish Vega is going to compete with GP102, not GP104. May or may not beat it (keeping in mind that Titan XP is still a cut down version), but get real people. Use your heads.
Clueless? You feel you have to insult me because you don't like what Tech Report said? You know, the site whose owner works for AMD?
Citation needed.
I'm going to assume that you didn't even bother to read the link, so let me make another clueless, trolling quote for you...
They didn't measure FPS; they measured frame times. The frame times landed between a 1070 and a 1080. I'm sure that, being on a hardware enthusiast site, you know the importance of frame times.
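For anyone who wants the quick conversion: instantaneous frame rate is just the inverse of frame time, so the 24.8 ms worst-case figure quoted above works out to roughly a 40fps dip (a rough sketch, nothing more):

def frame_time_to_fps(frame_time_ms: float) -> float:
    # Instantaneous frame rate is the inverse of the frame time.
    return 1000.0 / frame_time_ms

print(frame_time_to_fps(24.8))   # ~40.3 fps worst-case dip after big explosions
print(frame_time_to_fps(16.7))   # ~60 fps, i.e. steady 60Hz pacing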
FTFY.
I'm assuming in your transparent attempt to call me an Nvidia fan you forgot I use AMD video cards.
I'm sorry, I had you mixed up with ShintaiDK and ended up quoting you instead by mistake (both of you have very similar posts).
Quote: "Clueless? You feel you have to insult me because you don't like what Tech Report said? You know, the site whose owner works for AMD?"
Techreport estimates performance to be between a 1070 and a 1080. Vega is a GP104 competitor from the looks of it.
He was trying to guesstimate the performance to be below a GTX 1080 from the Battlefront demo, which is clearly a no-go for the reason stated above.
Quote: "Tech Report got their hands on one. They said performance was between a 1070 and 1080."
Citation needed.
And it still stands.
Quote: "It would be the same for a Titan X, a full GP102, or Volta: with V-sync enabled on a 60Hz monitor, you won't see above 60fps. This should be very clear to anyone on a technical hardware forum. Lastly, I would like to add that the demo was run on alpha drivers."
Regarding the Techreport Doom "article", which is predicting a 500+mm² die to lose out compared to a 314mm² die: is it a coincidence that this part is left out of the quotes again and again? (Twice already in the last pages, not to mention the alpha drivers again.)
Techreport said: "Though that performance might not sound so impressive, it's worth noting that all of the demo system's vents (including the graphics card's exhaust) were taped up, and it's quite likely the chip was sweating to death in its own waste heat."
That's actually quite funny, as I'm running on an Nvidia GPU as we speak.
Quote: "I'm assuming in your transparent attempt to call me an Nvidia fan you forgot I use AMD video cards."
I don't see how Big Vega can release at $1000. Unless you're saying a GTX 1080 Ti will release at $1000, AMD can't price above a 1080 Ti, because then I will HAVE to buy one. It won't work with my monitor, but I can't ignore the top-end competitor GPU at a higher perf/dollar... that'd be insane. And yes, the 1080 Ti is the top end when it comes out.
Quote: "Unless NV release a GTX 1080 Ti at $600, I really don't see why a behemoth 500-600mm² die GPU with 8GB HBM2 will be sold at $600."
The chip that Raja held in his hand is less than 500mm².
https://www.forum-3dcenter.org/vbulletin/showpost.php?p=11256866&postcount=1398
My speculation, based on current trends and die size/performance estimates.
Yeah, but Vega is more of a compute GPU than a gaming GPU. GP102 is 100% a gaming GPU. Vega is like Kepler... (a compute/gaming hybrid).
Quote: "An 8GB framebuffer is still more than acceptable. When Fury came out, its 4GB limitation was already borderline. Sure, more would be better, but it's not even close to the same situation. And obviously a ~525mm2-ish Vega is going to compete with GP102, not GP104. May or may not beat it (keeping in mind that Titan XP is still a cut down version), but get real people. Use your heads."
The new programmable geometry pipeline on Vega will offer up to 2x the peak throughput per clock compared to previous generations by utilizing a new “primitive shader.” This new shader combines the functions of vertex and geometry shader and, as AMD told it to me, “with the right knowledge” you can discard game based primitives at an incredible rate. This right knowledge though is the crucial component – it is something that has to be coded for directly and isn’t something that AMD or Vega will be able to do behind the scenes.
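To make it concrete what "discarding primitives" buys you, here is a conceptual sketch of primitive culling in plain Python. This is only an illustration of the general idea (rejecting triangles, e.g. back-facing ones, before they reach the rest of the geometry pipeline); it is not AMD's actual primitive shader API, which has not been published.

def is_back_facing(v0, v1, v2):
    # Signed area in screen space; with this winding convention, a
    # non-positive area means the triangle faces away from the viewer.
    area = (v1[0] - v0[0]) * (v2[1] - v0[1]) - (v2[0] - v0[0]) * (v1[1] - v0[1])
    return area <= 0.0

def cull_primitives(triangles):
    # Keep only triangles that could still produce visible pixels.
    return [t for t in triangles if not is_back_facing(*t)]

# Hypothetical screen-space triangles: one front-facing, one back-facing.
tris = [((0, 0), (10, 0), (0, 10)),
        ((0, 0), (0, 10), (10, 0))]
print(len(cull_primitives(tris)))  # 1 triangle survives

The point of doing this "with the right knowledge" in a primitive shader is that culled triangles never cost vertex/geometry work further down the pipe, which is where the claimed throughput gain comes from.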