But they are advertising RX as such, and it uses the same uarch. Besides, the Titan started the prosumer GPU category and is its market leader, and the Titan is judged on a mix of gaming and semi-pro workloads, where it excels at gaming. Vega FE should be judged accordingly.
An analogy would be Vauxhall bringing out a new hatchback and telling everyone it is targeted at carrying the shopping and the kids to school, so it shouldn't be judged on its driving experience, when the market leader, the Ford Fiesta, is judged on exactly those things and excels at them. Every hatchback is judged on certain parameters; that's the standard in its class (jack of all trades).
In the GPU world the prosumer cards are the hatchbacks, except they are expensive premium products unlike small cars.
Consumer = gaming.
Prosumer = uber gaming/GPGPU/semi-professional mix (uber price).
Professional = uber GPGPU compute, stable certified drivers (crazy price).
What you and other people are suggesting is that AMD purposely turned off great gaming performance for that card, i.e. artificially segmenting it against market expectations, which is daft.
You'd better believe that if Vega FE had uber, Titan-like gaming capabilities, they would enable them and market it as such. They are not, because frankly it can't; hence this rubbish about it being just for game development.
The alternative is what I mentioned in an earlier post: that AMD is pulling off some masterful sandbagging for an early Christmas present. Not impossible, but extremely unlikely IMO.
I'm dismissing entirely the notion that AMD is doing neither of those and is instead working on an unready driver that is going to increase performance by 35% in one month of work. They have had 8 months to work on drivers; most of the gains would already be in the code.
Edit: Just to restate, I would love some crazy driver action, as I would probably go buy one. I also still think the bandwidth-limitation rumours are fishy as hell. If true, I'd expect just fixing that to bring 20% or so, but it would also hamper compute, so they would be shooting themselves in the foot with Vega FE either way.
"As far as I can see there is no such thing as a prosumer market - it seems to have been invented during all this Vega hype."

Titan doesn't define "prosumer", because it was a fake: an elaborate ruse to raise the price of gaming cards. It was strong in DP, and people who would never in a lifetime take advantage of that used it as a reason to buy it.
What are the odds the next incarnation of Vega runs on GDDR5/X/6 memory types?
I'm starting to feel HBM isn't the future GPU memory most people anticipated (at least not yet).
HBM is much more complex & expensive than GDDR5X.
So, sure, on paper it seems like a winner, but so far the products using it have been more expensive and use more voltage than expected.
Unlikely, as AMD wants to eventually use HBM for APUs, which means continuing to invest in it and working to get costs down.
HBM is fine; AMD's implementation in Vega is questionable. 3-4 stacks would have been a better choice: at a lower voltage and clock speed they could put out 640-768 GB/s of bandwidth.
Not sure how anyone can question HBM. It's only expensive for now because it's new, but even HBM2 can slam-dunk GDDR6's projected bandwidth with a 4-stack at full speed (1000 MHz), IIRC: something like 1024 GB/s with a 4-stack at 1000 MHz, and at lower power consumption than GDDR5X. Price is the only issue for now, but I'm sure that'll change, considering Samsung is starting production too.
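The bandwidth figures being thrown around here are easy to sanity-check. A quick sketch, assuming the usual HBM2 parameters (1024-bit bus per stack, double data rate, so two transfers per clock); the clock speeds below are the thread's examples, not confirmed product specs:

```python
# Back-of-the-envelope check of the HBM2 bandwidth claims in this thread.
# Assumed parameters: 1024-bit interface per stack, DDR (2 transfers/clock).

def hbm2_bandwidth_gbs(stacks: int, clock_mhz: float) -> float:
    """Aggregate bandwidth in GB/s for `stacks` HBM2 stacks at `clock_mhz`."""
    bus_bits_per_stack = 1024        # HBM2 interface width per stack
    transfers_per_clock = 2          # double data rate
    bits_per_second = stacks * bus_bits_per_stack * transfers_per_clock * clock_mhz * 1e6
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB/s

print(hbm2_bandwidth_gbs(4, 1000))  # 4-stack at full speed -> 1024.0 GB/s
print(hbm2_bandwidth_gbs(2, 945))   # 2 stacks at ~945 MHz -> ~484 GB/s
print(hbm2_bandwidth_gbs(4, 750))   # 4 stacks downclocked -> 768.0 GB/s
print(hbm2_bandwidth_gbs(3, 834))   # 3 stacks downclocked -> ~640 GB/s
```

So the 1024 GB/s 4-stack figure checks out, and the 640-768 GB/s range quoted above corresponds to 3-4 stacks running in the roughly 750-834 MHz range.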
OK, we know they showed an RX Vega system against NVIDIA - who says this was a heavyweight fight with the 1080? The Polaris-based RX 580 is positioned against the 1060; they could have demoed small Vega as a 1070 competitor, or as a 1080 competitor with a 1070 price tag. We'll see.
"What are the odds the next incarnation of Vega runs on GDDR5/X/6 memory types? I'm starting to feel HBM isn't the future GPU memory most people anticipated (at least not yet)."

If true, then that whole excursion into an HBCC world is wasted and will have to be abandoned.
The saving grace for AMD is that Vega will be used in the Xbox One X.
"As far as I can see there is no such thing as a prosumer market - it seems to have been invented during all this Vega hype."

You must not have been around for the original Titan, then.
"HBM is much more complex & expensive than GDDR5X. So, sure, on paper it seems like a winner, but so far the products using it have been more expensive and use more voltage than expected."

You can't use GDDR5X/6 on APUs. It's not all about dGPUs; they are working on making those extinct.
It's been over two years since AMD launched a high-end chip; do you think they're playing games like this?
It makes no sense.
"I think they ran into serious issues with Vega and needed time to fix it (a re-spin or a more radical change). The now-useless-for-gaming cards are sold as 'prosumer' and potentially as lower-end models."

If they are really selling basically broken chips at a massive price premium, they are really risking it. I don't believe it.
Sounds like wishful thinking, but I find it hard to believe that a company like AMD is not able to deliver an at least competitive model when they had almost a year longer to develop it compared to the 1080 (Ti)...
"HBM is fine; AMD's implementation in Vega is questionable. 3-4 stacks would have been a better choice: at a lower voltage and clock speed they could put out 640-768 GB/s of bandwidth."

Yeah, I still don't know whose idea it was that overclocking the GPU by 50% while cutting memory bandwidth by 6% was a good thing. That person should be fired.
It isn't. Indications are that the XOX GPU is a modified Polaris. It may have some Vega features, but it lacks, for example, packed-math ability.
The PS4 Pro's GPU does have packed math, but also isn't Vega.
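For anyone wondering what "packed math" means here: it's double-rate FP16, where the hardware treats each 32-bit register as two 16-bit lanes and operates on both per instruction. A rough sketch of the idea in Python; the helper names and packing layout are illustrative only, not any real GPU API:

```python
import struct

def pack2(lo: float, hi: float) -> int:
    """Round two values to FP16 and pack them into one 32-bit word,
    the way a packed-math register holds a pair of half floats."""
    return struct.unpack('<I', struct.pack('<2e', lo, hi))[0]

def unpack2(word: int) -> tuple:
    """Split a 32-bit word back into its two FP16 lanes."""
    return struct.unpack('<2e', struct.pack('<I', word))

def packed_add(x: int, y: int) -> int:
    """One simulated packed-math instruction: add both FP16 lanes at once.
    Each lane's result is rounded back to FP16, as the hardware would do."""
    xl, xh = unpack2(x)
    yl, yh = unpack2(y)
    return pack2(xl + yl, xh + yh)

a = pack2(1.5, 2.25)
b = pack2(0.5, 0.75)
print(unpack2(packed_add(a, b)))  # (2.0, 3.0)
```

The point of the feature is throughput: where precision allows FP16, one 32-bit ALU pass does two operations instead of one, which is why its absence in the XOX GPU is worth noting.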
Hopefully Vega has the ID buffer from the PS4 Pro. If so, we could see some pretty sweet (and cheap) upscaling, anti-aliasing, and motion based effects.
Got to stop with the rather vague and misleading statements. AMD has shown RX Vega's performance a couple of times. Is it quantifiable? No, but it's still performance they've demonstrated multiple times, and they've made the comparison themselves. When someone over at AMD said something along the lines of "it looks good against Titan Xp", people took that as an example of the performance it is capable of (and actually ran with it).
We've seen RX Vega performance. We just need the hard numbers to line it up to AMD's own current live demonstrations.
The fact that what we've seen of RX Vega's performance is lackluster is on AMD themselves, since they're the ones that trotted it out and said "look."