Kitguru : Nvidia to release three GeForce GTX 800 graphics cards this October


FatherMurphy

Senior member
Mar 27, 2014
229
18
81
I think they calculated the cores as 20% stronger, but I could be mistaken.

The 35% number is straight from Nvidia. See the diagram here: http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/3

Edit: Re-reading that slide, though, I notice that it makes a point of saying "Maxwell: 1st Generation" is 35% more performance per core. I wonder if there is any room for improvement per core in 2nd generation Maxwell, presuming GM204 is second generation (the 2 in GM204 suggests that it is).
 
Last edited:

CrazyElf

Member
May 28, 2013
88
21
81
The real questions at this point are:

1. Will Nvidia release a big Maxwell?

2. How much faster will it perform? That will depend on its clockspeed too.

3. What will it cost?

There are other issues like if the bus is limiting the card at higher resolutions. I get the feeling that until we get more reliable information, we're really at a loss for now.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
No offense, but why would you ask that? Questions without answers, yeah? We're just playing the guessing game at this point about the GM204; GM200 is something else entirely. I'd imagine that it's an HPC-first part, so it would be quite high performing, and will require the 20nm node to be fully realized. Just my theory.

Anyway, this information isn't released to the public until launch is basically imminent. Within 3-4 weeks of launch is when you'll see NV or AMD do a press-invite sort of thing. Asking these questions before that is a waste of time, and even then, they may not be answered until the cards hit the shelves.

I've never seen a GPU launch deviate from that schedule, and if those facts came to light before any official announcement, it would probably mean someone losing a job for leaking info and/or some websites getting blacklisted for breaking NDA. Although from what I've read elsewhere, websites now stay in the dark just about as long as consumers do, precisely because of past NDA breaches. Aside from that, there are people close to the supply chain in China/Taiwan who have basically leaked information for years with no repercussions. All GPUs roll off assembly lines over there, so you'll see leaks come to light (as we already have for GM204). Whether they're all true is another story, but some of them are. I remember the Chiphell (Chinese PC DIY website/magazine) leaks for the 780 and 7970 were about 50% accurate, judged against the final release silicon. But they won't have leaks for GM200, IMHO. GM200 is probably reserved for the 20nm node, because you simply can't cram many more transistors onto 28nm beyond what the 780 Ti already has.
 
Last edited:

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
No offense, but why would you ask that? Kinda silly to ask questions without answers, yeah? We're just playing the guessing game at this point about the GM204; GM200 is something else entirely. I'd imagine that it's an HPC-first part, so it would be quite high performing, and will require the 20nm node to be fully realized. Just my theory.

/speculation

Yeah, that makes sense to me. NV would likely be able to put 'Big M' into production for a 1H15 delivery (with initial Apple deliveries out of the way) for the HPC market. It will need the power savings and transistor-count increase from 20nm to be competitive with 14nm Xeon Phi. If that's the case, it'll be a monster - I'm sure NV will be happy to sell a consumer version for ~$1K (or more) to offset the costs of Big M. Then they will probably move on to a 20nm refresh of their mid-range cards in time for the holiday season - ideally, anyway. Who knows how much faster it'll be - I'd imagine NV might like to get the die size back down around that of the GK104.
 

CrazyElf

Member
May 28, 2013
88
21
81
No offense, but why would you ask that? Questions without answers, yeah? We're just playing the guessing game at this point about the GM204; GM200 is something else entirely. I'd imagine that it's an HPC-first part, so it would be quite high performing, and will require the 20nm node to be fully realized. Just my theory.

Considering we are 3-4 weeks away from September (which some rumors suggest is when Maxwell comes out), and perhaps 8 weeks from October, if these rumors are true (and huge "if" here), then we're at the point where more reliable rumors start coming out.
 

tollingalong

Member
Jun 26, 2014
101
0
0
The current Maxwells aren't all that much faster but are far more power efficient. I'd be surprised if the performance jump was massive. Either way I'm getting one for Linux and probably a R295X or w/e for Mantle.
 

mv2devnull

Golden Member
Apr 13, 2010
1,503
145
106
The real questions at this point are:
This thread is full of "real" questions.

We have one fact: a rumour states that Nvidia will release something during October.

Therefore, there is only one real question: Is the rumour true?
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
I'd open the floor to speculation on the following question: will these "2nd Generation" Maxwell chips have better performance per core than the "1st Generation" Maxwell chip -- GM107? Nvidia made such a big deal about demarcating GM107 as "1st Generation." See http://www.anandtech.com/show/7764/t...view-maxwell/3

I assume, perhaps wrongly, that the "2nd Generation" will offer tangible benefits and/or changes. This could mean better performance per core. Or better power efficiency. Or simply added features: DirectX 11.2 support, ARM Cores, etc. Given the tremendous IPC and power efficiency improvements in the 1st Generation Maxwell chip, I doubt that much more could be done on 28nm, so I would lean toward the 2nd Gen Maxwell chips having added features, though it's anyone's guess what those might be.

Maybe Nvidia is finally adding that secret dedicated Physx logic onto GM204 that Charlie said would give GK104 the edge :sneaky:
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The current Maxwells aren't all that much faster but are far more power efficient. I'd be surprised if the performance jump was massive. Either way I'm getting one for Linux and probably a R295X or w/e for Mantle.

"Aren't all that much faster" - this is not accurate. The GK107 configuration that matches the GM107 used in the 750 Ti is the GT 640; neither card requires a power plug. GT 640 = 65W TDP, GTX 750 Ti = 60W TDP, and the 750 Ti is three times faster at the lower TDP.

The GTX 650 is a scaled-up version of the GK107 with a 110W TDP and a power-plug requirement on the reference baseline model (the 750 Ti has neither). Even against that beefed-up GK107, at nearly double the TDP, the 750 Ti is still two times faster. So that isn't an apples-to-apples comparison either, and the GM107 wins it anyway.

If your definition of "not faster" is actually three times faster, well, I guess I'll throw that out there. Performance per watt doubled from Kepler to Maxwell, and that will certainly manifest as big performance gains at the same TDP levels: a 225W TDP Maxwell will outperform a 225W TDP Kepler. How much so remains to be seen; if I had to guess, I'd guesstimate 15-30% faster, depending. Really, there's no point to an 800 series if it were the same speed as the prior generation. At least that's my line of thinking.
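To put that in perf-per-watt terms, here's a quick back-of-envelope sketch. The relative-performance multiples are the ones claimed in the post (GT 640 as the 1x baseline), not measured benchmark data:

```python
# Rough perf/watt comparison of GM107 (750 Ti) vs. its Kepler counterparts.
# Performance values are the relative speeds claimed above, TDPs in watts.
cards = {
    # name: (relative performance, TDP in watts)
    "GT 640 (GK107)":     (1.0, 65),
    "GTX 650 (GK107+)":   (1.5, 110),  # beefed-up GK107; 750 Ti is ~2x faster
    "GTX 750 Ti (GM107)": (3.0, 60),
}

# Normalize everything against the GT 640's perf/watt.
baseline = cards["GT 640 (GK107)"][0] / cards["GT 640 (GK107)"][1]
for name, (perf, tdp) in cards.items():
    ratio = (perf / tdp) / baseline
    print(f"{name}: {ratio:.2f}x the perf/watt of the GT 640")
```

On these numbers the 750 Ti comes out at over 3x the GT 640's perf/watt, which is where the "doubled perf/watt" headline claim looks, if anything, conservative for this particular matchup.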
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
GK104 had better perf/mm^2 than GK107, while maintaining the same efficiency as GK107. Since Maxwell's caching system adds a huge unknown factor to how much cache is needed with larger dies, it remains to be seen if the perf/mm^2 scaling characteristics are retained. I do believe that regardless, GM204 will be every bit as efficient as GM107.
 

Mand

Senior member
Jan 13, 2014
664
0
0
GK104 had better perf/mm^2 than GK107, while maintaining the same efficiency as GK107. Since Maxwell's caching system adds a huge unknown factor to how much cache is needed with larger dies, it remains to be seen if the perf/mm^2 scaling characteristics are retained. I do believe that regardless, GM204 will be every bit as efficient as GM107.

Reviews of GM107 said that Maxwell cores were both more power efficient and more space efficient than Kepler cores.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I'm ready to gobble up another 780 after the 880 drops. Already have a water block for it. Unless the 880 destroys 4K, I'll likely stick to the 780 until their next big die launches. I bought a 680 at launch and quickly replaced it.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I bet $10 that the GTX 880 will handle 1600p+ resolutions just fine, despite the 256-bit doom-saying from jealous GTX 780 Ti/R9 290X owners.

Just face it: your card will meet a new king in September/October. You will no longer have the fastest card
 

shaynoa

Member
Feb 14, 2010
193
0
0
Why does Nvidia stick with 256-bit instead of 384-bit or even 512-bit? With each video card being more powerful than the previous one, wouldn't that imply a bottleneck at the narrower bus width? Why would any company put a bad selling point on a new product?

The 700-series cards have gone beyond 256-bit, but not many others from Nvidia have over the years.

This alone is why I have used AMD cards. If Nvidia puts a 384-bit or 512-bit bus on the 800 series with at least 4GB of memory onboard, I will buy a few of them.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
Why does Nvidia stick with 256-bit instead of 384-bit or even 512-bit? With each video card being more powerful than the previous one, wouldn't that imply a bottleneck at the narrower bus width? Why would any company put a bad selling point on a new product?

The 700-series cards have gone beyond 256-bit, but not many others from Nvidia have over the years.

This alone is why I have used AMD cards. If Nvidia puts a 384-bit or 512-bit bus on the 800 series with at least 4GB of memory onboard, I will buy a few of them.

$PROFITS$. It makes the GPU easier to design and manufacture (better initial yields) and reduces the cost of the graphics board (fewer traces means easier design and manufacturing).
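It's also worth noting that bus width alone doesn't determine bandwidth; a narrower bus with faster memory can close much of the gap. A quick sketch of the peak-bandwidth arithmetic (the data rates below are illustrative GDDR5 speeds, not confirmed GTX 880 specs):

```python
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_bits / 8 * data_rate_gbps

# Hypothetical 256-bit card with 7 Gbps GDDR5:
print(bandwidth_gb_s(256, 7.0))  # 224.0 GB/s

# 384-bit at 7 Gbps (a 780 Ti-class configuration):
print(bandwidth_gb_s(384, 7.0))  # 336.0 GB/s
```

So a 256-bit GM204 would give up raw bandwidth against a 384-bit part at equal memory clocks, and whether that matters depends on how much Maxwell's larger cache reduces the traffic that actually hits the memory bus.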
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I bet $10 that the GTX 880 will deal with 1600p+ res just perfect despite the 256bit doom from jealous GTX 780 Ti/R9 290X owners.

Just face it, your card will meet a new king in September/October. You no longer will have the fastest card

What? 780Ti/R9 290X owners are the type who drop $550-700 on a new GPU and tend to be early adopters of newest/latest tech. These are the users who could upgrade to 880 by simply reselling their last gen cards. These owners are not jealous, in fact the opposite, they are concerned/upset that the performance increase may only be 15-20%, not enough to bother upgrading since 15-20% will do little at 1600p to change playability from their existing OC cards. Some people have $700-1,400 set aside for new GPUs but it doesn't imply it's a great idea to 'waste' $ for a mere 15-25% increase in performance either. A lot of these users will be waiting for something much faster (i.e., GM200, etc.).

If you look at benches of demanding games at 1600P and beyond, you really need 40-100% increase in performance to make a dent in playability over 780Ti. That's because in the games where 780Ti struggles getting 30-40 fps, a 20% increase is only a mathematical measurement that hardly impacts smoothness these users are looking for (i.e., at least a move to 50-60 fps from 30-40 in those titles).
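The fps arithmetic behind that claim can be spelled out with a quick sketch (the 30-40 fps starting range is the one quoted above; the uplift percentages are illustrative):

```python
# What a given % performance uplift does to frame rates in games where a
# 780 Ti currently manages 30-40 fps. Back-of-envelope only.
base_fps = [30, 40]                # where the 780 Ti "struggles"
for uplift in (0.20, 0.40, 1.00):  # 20%, 40%, 100% faster
    scaled = [round(f * (1 + uplift)) for f in base_fps]
    print(f"+{uplift:.0%}: {scaled[0]}-{scaled[1]} fps")
```

A 20% uplift only moves 30-40 fps to 36-48 fps, still short of a solid 50-60, whereas it takes something in the 40-100% range to get there, which matches the argument above.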




 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
What? 780Ti/R9 290X owners are the type who drop $550-700 on a new GPU and tend to be early adopters of newest/latest tech. These are the users who could upgrade to 880 by simply reselling their last gen cards. These owners are not jealous, in fact the opposite, they are concerned/upset that the performance increase may only be 15-20%, not enough to bother upgrading since 15-20% will do little at 1600p to change playability from their existing OC cards. Some people have $700-1,400 set aside for new GPUs but it doesn't imply it's a great idea to 'waste' $ for a mere 15-25% increase in performance either. A lot of these users will be waiting for something much faster (i.e., GM200, etc.).

If you look at benches of demanding games at 1600P and beyond, you really need 40-100% increase in performance to make a dent in playability over 780Ti. That's because in the games where 780Ti struggles getting 30-40 fps, a 20% increase is only a mathematical measurement that hardly impacts smoothness these users are looking for (i.e., at least a move to 50-60 fps from 30-40 in those titles).

I was going to respond to him, but you already spoke my point. Jealous? Nah. I have no problem dropping money on something impressive, but, as I said, I believe this will be more like the 680, which was not impressive. I'll wait for the real big-die card if this is just a 15-25% improvement.

FWIW, my purchasing decisions have nothing to do with "king of cards" or "biggest peen." I buy for the hobby and enjoyment.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
I was going to respond to him, but you already spoke my point. Jealous? Nah. I have no problem dropping money on something impressive, but, as I said, I believe this will be more like the 680, which was not impressive. I'll wait for the real big-die card if this is just a 15-25% improvement.

FWIW, my purchasing decisions have nothing to do with "king of cards" or "biggest peen." I buy for the hobby and enjoyment.

Makes sense. For me it's '20nm or bust'. If I wanted another 28nm card I'd just go SLI. Of course, this assumes there will be some game of interest to me that bogs down at high quality at 19x12. Maybe UT4 won't suck and will have massive eye candy :thumbsup:
 

jpiniero

Lifer
Oct 1, 2010
14,841
5,456
136
Thing is, there is no guarantee there will be any 20 nm products from nVidia now. You might be waiting until 16FF Pascal in 2016.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
Thing is, there is no guarantee there will be any 20 nm products from nVidia now. You might be waiting until 16FF Pascal in 2016.

I just don't think Nvidia can afford to wait that long for its HPC and professional line of compute/GFX AIBs. I think they will have to go to 20nm and then 16FF+ to stay competitive. NV can also lead out with less complex but still highly profitable mobile dGPUs as pipe cleaners for these new nodes.

The problems they, and AMD, face are having to delay something like a year for yields to become high enough, plus lower profit margins due to wafer costs and the need for increased design resources.

As a result, I think only one of the major dGPU companies will survive (one company will need all the volume to be profitable). My bet is that NV will come out the winner because of AMD's poor financial condition.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Reviews of GM107 said that Maxwell cores were both more power efficient and more space efficient than Kepler cores.

What I was getting at is that GK104 scaled to be exactly 3x faster than GK107, despite only having a 2.5x larger die.

GM107 is 148mm^2. If Maxwell scales up in perf/mm^2 among its different cores similarly to Kepler, then GM204 should only need to be 370mm^2. This isn't an apples-to-apples comparison, because Maxwell is inherently different and has much more die space devoted to cache, but it's an encouraging sign to hear that GM204 is above 400mm^2.
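The scaling arithmetic above, spelled out (this is purely a projection from the Kepler ratio, not a known GM204 spec):

```python
# Kepler precedent: GK104 delivered ~3x GK107 performance from a
# ~2.5x larger die. If 2nd-gen Maxwell scaled the same way, a GM204
# targeting 3x GM107 performance would need roughly:
gm107_die_mm2 = 148
kepler_area_factor = 2.5  # die-area multiple that bought 3x perf on Kepler

est_gm204_mm2 = gm107_die_mm2 * kepler_area_factor
print(est_gm204_mm2)  # 370.0 mm^2
```

A rumored GM204 above 400mm^2 would therefore leave headroom beyond a straight Kepler-style scale-up, even after accounting for Maxwell's larger caches.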
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
How about this:

The GTX 880 won't be the replacement for the 780 Ti, but rather something between the 780 non-Ti and the 770 (in terms of positioning, not performance!).

A future GeForce Titan (Maxwell v3?) would be the fastest card, and even a possible 880 Ti would stay behind it. The Titan would also get special coolers to keep its "luxury" position.

This would shift prices toward the lower end, and in this scenario a GTX 880 for $450 makes total sense!

Source? My creative mind
 