Geforce GTX 780 Ti unveiled


krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
It's just that they have almost exhausted their reserve, and the last remaining drop is 7% more processing power on paper at the same frequencies.

Nah, if I know NV correctly, the profit driver is the non-reference 780. Who cares about Ti or non-Ti once the smoke clears? It's all branding. They will select a very few chips for the Ti with excellent perf/watt, add that to the 7%, fit reference coolers that cost more than low-end graphics cards themselves, and the like. Then do a paper-like launch and sell the OC non-reference 780. All praise and glory to the silent reference 780 Ti.

Typical good marketing strategy: give the test drivers the car with the big engine and loads of equipment, and market the price of the lesser models. AMD is very good at doing the opposite: sending out slow-binned cards with lousy reference coolers.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
The safe bet is that it's a Titan with 3GB of VRAM clocked higher, much like the GTX 780 currently is versus Titan, and without the useless DP performance on a gaming card.

My guess is they know Titan loses to the R9 290X, or this card wouldn't be needed. So this will allow them to hold fast with a small lead in enough games to still claim supremacy.

Titan doesn't need to lose in performance to justify a 780 Ti, just in practicality (which we can already argue it does). Titan is simply not a consumer-oriented card, with its 6GB and focus on compute. The 780 was the consumer GK110, and it seems clear that it will be slower than a 290X, so all the justification for a 780 Ti is there.

Release a 780 Ti to compete with the 290X at ~$650 MSRP, bump the 780 down to wherever the 290 lands, and they could even keep Titan where it is.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It's just that they have almost exhausted their reserve, and the last remaining drop is 7% more processing power on paper at the same frequencies.

Funny that you mention it: the K6000 is as fast as Titan at its max clock (around 1000MHz) while using less than 225W.

On the other hand, AMD's latest card, the 290X, needed 8 months to beat Titan by 4.x% in Fire Strike Extreme*.

It would be hilarious if nVidia manages to beat the 290X by a much bigger margin in less than one month. :D

* http://wccftech.com/amd-radeon-r9-290x-hawaii-xt-uber-mode-crossfirex-performance-leaked/
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Titan doesn't need to lose in performance to justify a 780 Ti, just in practicality (which we can already argue it does). Titan is simply not a consumer-oriented card, with its 6GB and focus on compute. The 780 was the consumer GK110, and it seems clear that it will be slower than a 290X, so all the justification for a 780 Ti is there.
First, wrong: Titan should not be used to justify anything.
Release a 780 Ti to compete with the 290X at ~$650 MSRP, bump the 780 down to wherever the 290 lands, and they could even keep Titan where it is.
Second, wrong: $650 is a whole different segment than the 290X. If you bump the 780 down to $400-$450 (!!!), what will be in between? That's a $200 gap.

Funny that you mention it: the K6000 is as fast as Titan at its max clock (around 1000MHz) while using less than 225W.
Thank you for your proof. Much appreciated.
 

Abwx

Lifer
Apr 2, 2011
11,543
4,327
136
Nah, if I know NV correctly, the profit driver is the non-reference 780. Who cares about Ti or non-Ti once the smoke clears? It's all branding. They will select a very few chips for the Ti with excellent perf/watt, add that to the 7%, fit reference coolers that cost more than low-end graphics cards themselves, and the like. Then do a paper-like launch and sell the OC non-reference 780. All praise and glory to the silent reference 780 Ti.

Typical good marketing strategy: give the test drivers the car with the big engine and loads of equipment, and market the price of the lesser models. AMD is very good at doing the opposite: sending out slow-binned cards with lousy reference coolers.

For this strategy to work, it requires a product that is close to the competition in all benches from the start. They'll have trouble branding their alleged 780 Ti as the best-performing card when we have seen the regular 780 get stomped by 35% in 4K benches by the stock 290X, and a full GK110 die is not going to turn those tables.
 

Abwx

Lifer
Apr 2, 2011
11,543
4,327
136
Funny that you mention it: the K6000 is as fast as Titan at its max clock (around 1000MHz) while using less than 225W.

I don't buy such oddities; even Nvidia put 245W in brackets next to the 225W rating... A card that has more execution units will consume more at the same frequencies, whatever Nvidia claims about being above the laws of physics.



On the other hand, AMD's latest card, the 290X, needed 8 months to beat Titan by 4.x% in Fire Strike Extreme*.

It would be hilarious if nVidia manages to beat the 290X by a much bigger margin in less than one month. :D

* http://wccftech.com/amd-radeon-r9-290x-hawaii-xt-uber-mode-crossfirex-performance-leaked/

In 4K, which Nvidia heavily viral-marketed as being lousy on AMD gear, you'll see which is the real high-end card; and don't expect Nvidia to have anything other than marketing tricks à la Origin to counter their competitor.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Yep. It's proof that there is so much room for a 10-20% faster card. :thumbsup:
That is not how proving works. Or is he a stream of proof, spilling proofs from his mouth? A geyser of facts?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
There is no proof of the opposite, so I guess you must live with my sarcasm.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I don't buy such oddities; even Nvidia put 245W in brackets next to the 225W rating... A card that has more execution units will consume more at the same frequencies, whatever Nvidia claims about being above the laws of physics.

I don't think they expect you to buy a $5000 workstation card. :sneaky:
And where you see a 245W number here* is not clear to me...
*http://www.nvidia.com/object/quadro-desktop-gpus-specs.html
 

Abwx

Lifer
Apr 2, 2011
11,543
4,327
136
Nobody here will see in 4K.

It doesn't matter whether people see it or not. What matters is that for a card to claim the performance crown, it also has to be the best at the higher resolutions, because high-end card users are not meant to be attracted by 800x600; perhaps they prefer to look at the opposite end of the picture.


And where you see a 245W number here* is not clear to me...
*http://www.nvidia.com/object/quadro-desktop-gpus-specs.html

They say 225W, but their GPU has no frequency specs, which leads me to wonder how you could possibly make assumptions about what it would consume at necessarily arbitrary frequencies...
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't understand why you insist on carrying on arguing over what may or may not happen when nobody here knows for sure. We'll all have the answer in due time. Why continue arguing? We already know your stance; why do you feel you need to convince others at this point?

This is the internet; you aren't going to change minds without facts, which you simply do not have.
 

Abwx

Lifer
Apr 2, 2011
11,543
4,327
136
I don't understand why you insist on carrying on arguing over what may or may not happen when nobody here knows for sure. We'll all have the answer in due time. Why continue arguing? We already know your stance; why do you feel you need to convince others at this point?

This is the internet; you aren't going to change minds without facts, which you simply do not have.

Actually, I was answering arguments about what was or wasn't going to happen, that is, the very subject of this very thread about an alleged "new" Nvidia card, so I found your post not only irrelevant but wrongly directed at me. Go complain to the OP if discussions about what will eventually happen don't suit you; actually, I wonder what you're doing here if this is of no interest to you...
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I don't buy such oddities; even Nvidia put 245W in brackets next to the 225W rating... A card that has more execution units will consume more at the same frequencies, whatever Nvidia claims about being above the laws of physics.

Hello, history calling.
GTX 480: 480 cores, GF100
GTX 580: 512 cores, GF110

Both cards share the same memory bus and the same memory. It's something called improving silicon. You might want to research a bit before claiming that a company doesn't follow the laws of physics.

GTX 780, GK110: 2688 cores
GTX 780 Ti, GK180: 2880 cores (?)
Same memory bus here as well.

It's perfectly plausible to use the same or less power while adding more cores if the process has matured. The GTX 780 Ti will still use 8+6-pin power like the GTX 780. Remove most of the FP64 cores, and you need even less power than Titan in FP64 mode. So if it's anything like the GTX 480 and 580, that is more than enough power to drive a future 2880-core GPU.
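The core-count math in the posts above is easy to check; here is a quick sketch (note the 2880-core figure for the rumored 780 Ti / GK180 is still just a rumor at this point):

```python
# Percentage gain in shader cores for the two GPU pairs discussed above.
# The 2880-core GTX 780 Ti figure is an assumption based on leaks.
cards = {
    ("GTX 480", "GTX 580"): (480, 512),
    ("GTX 780", "GTX 780 Ti (rumored)"): (2688, 2880),
}

for (old, new), (old_cores, new_cores) in cards.items():
    gain = (new_cores / old_cores - 1) * 100
    print(f"{old} -> {new}: {old_cores} -> {new_cores} cores (+{gain:.1f}%)")
```

The 2688-to-2880 step works out to about +7.1%, matching the "7% more processing power on paper" figure quoted earlier in the thread; the GF100-to-GF110 step was a comparable +6.7%, and the GTX 580 still managed to draw less power than the GTX 480.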


 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Actually, I was answering arguments about what was or wasn't going to happen, that is, the very subject of this very thread about an alleged "new" Nvidia card, so I found your post not only irrelevant but wrongly directed at me. Go complain to the OP if discussions about what will eventually happen don't suit you; actually, I wonder what you're doing here if this is of no interest to you...


We know :thumbsup:
 

Abwx

Lifer
Apr 2, 2011
11,543
4,327
136
Hello, history calling.

GTX 480: 480 cores, GF100
GTX 580: 512 cores, GF110

Both cards share the same memory bus and the same memory. It's something called improving silicon.

History doesn't systematically repeat itself...

This time they got the silicon right from the start; there's likely not much, if anything at all, left to improve with the current process. As pointed out ad nauseam, all they can do is use a full GK110, gaining 7% theoretical raw power, plus a very aggressive turbo to grab a few % at the beginning of a benchmark before the thermal constraints bring it all back to ground zero.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
History doesn't systematically repeat itself...

This time they got the silicon right from the start; there's likely not much, if anything at all, left to improve with the current process. As pointed out ad nauseam, all they can do is use a full GK110, gaining 7% theoretical raw power, plus a very aggressive turbo to grab a few % at the beginning of a benchmark before the thermal constraints bring it all back to ground zero.


If it's a different chip, or an improved one with better silicon, and if they remove the FP64 performance you find on Titan (because, face it, this will most likely be a gamers' card, not a jack of all trades like Titan), they can add more cores and even have those cores running faster than Titan's, meaning it would be quite a bit faster than +7% over Titan.
There is a reason they skipped GPGPU on GK104, aka the GTX 680.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
History doesn't systematically repeat itself...

This time they got the silicon right from the start; there's likely not much, if anything at all, left to improve with the current process. As pointed out ad nauseam, all they can do is use a full GK110, gaining 7% theoretical raw power, plus a very aggressive turbo to grab a few % at the beginning of a benchmark before the thermal constraints bring it all back to ground zero.
Why are you writing like that? It's annoying, to say the least. Please write in a normal format, thanks.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I think the GTX 780 Ti will be like this, but without the 12GB of GDDR5 (lol, talk about overkill):
http://www.hardwarepal.com/nvidia-announces-gtx780ti-12gb-gddr5-4k-4k-surround-gaming/#_

Since AMD is pushing out the 290X and marketing it for 4K, where it performs really well thanks to all the cores and the huge memory bandwidth, I don't think Nvidia has much choice but to go all in as well.

How long is it going to be until the next architecture? Six months? Eight? I don't think Nvidia will just sit there all that time with an overpriced Titan and an underperforming GTX 780, watching AMD cash in with their 290X beast. Because let's face it, the 290X is looking like a true champion.
 