jonpeddie: nvidia keeps losing market share


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Dark Shroud said:
That article is beyond stupid.

Nvidia has backed themselves into a corner. I don't hate Nvidia and I do not want them to go under. Yet between their corporate attitude and the way they treat consumers, they could use a bit of humbling. Not to mention the hardware divides they have created in the PC gaming world.




If you read what Richard Huddy actually states, you'll see why I offered Repi's quote -- he did get the overall context Richard was trying to offer.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Please note that the author of the above-referenced article claims that the GTX 580 has 512 stream processors... this kind of makes me question anything else that is stated in the article...

Forget about the author; I read Richard Huddy's exact wording -- not sensationalism about how AMD desires DirectX to go away. Read Repi's quote -- Repi is a developer for Frostbite.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The percentages we are seeing now are repercussions of changes in the consumer market. The key is whether these companies strategized, 3-6 quarters ago, for what the consumer wants NOW. IMO, AMD has not: they have a whole bunch of netbook-candidate CPU/GPU chips, and demand for netbooks and low-power laptops is now the territory of smartphones and tablets. It's why AMD has no CEO right now. The late move to develop revenue-making products in this area is going to hurt and drag down AMD's stock price/confidence level for at least the next 12 months.
Nvidia took their lumps after Fermi was late, but falling GPU share in certain sectors is no surprise at all.

For those that don't remember: about two months ago, Dirk Meyer, the Chief Executive Officer of Advanced Micro Devices, announced he was resigning.

If anything, this was a very sudden move, one that certain officials, like co-founder Jerry Sanders, didn't agree with; later reports said Meyer had been removed because of a presumably unsatisfactory plan for tablet PCs.
 
Last edited:

pcm81

Senior member
Mar 11, 2011
584
9
81
Forget about the author; I read Richard Huddy's exact wording -- not sensationalism about how AMD desires DirectX to go away. Read Repi's quote -- Repi is a developer for Frostbite.
I have written code before, including in assembly language. The bottom line is code portability plus ease of coding vs. overhead.

If you take something as general as C's fprintf, it will have HUGE overhead writing to a file vs., say, fwrite. The problem with any library is abstraction: the higher the level of abstraction, the less efficient the code.

This has nothing to do with AMD or Nvidia and their hardware. Yes, AMD can release an optimized set of C functions for their cards, and it would seem like an improvement on DirectX; the problem is that this is already being done -- it's called the driver.

The best code written in assembly will always outperform code compiled from C, and C will always outperform Fortran or other higher-abstraction languages. The goal here is not to go back to the 1980s and have everyone code in assembly, but to design a programming environment that has the flexibility and speed of a low-level language while offering the portability and rapid-development tools of higher-abstraction languages.
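To make the overhead gap concrete, here is a minimal C++ sketch (the file name and element count are arbitrary, not from the thread): the fprintf path re-parses the format string and converts every value to decimal text, while fwrite dumps the raw bytes in one buffered call.

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> samples(1000000, 3.14159);

    std::FILE* f = std::fopen("out.bin", "wb");  // hypothetical output file
    if (!f) return 1;

    // High-abstraction path: fprintf pays format-string parsing and
    // binary-to-decimal conversion costs once per element.
    //   for (double s : samples) std::fprintf(f, "%f\n", s);

    // Low-abstraction path: fwrite writes the raw bytes in a single
    // buffered call, with no formatting work at all.
    std::fwrite(samples.data(), sizeof(double), samples.size(), f);

    std::fclose(f);
    return 0;
}
```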
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The percentages we are seeing now are repercussions of changes in the consumer market. The key is whether these companies strategized, 3-6 quarters ago, for what the consumer wants NOW. IMO, AMD has not: they have a whole bunch of netbook-candidate CPU/GPU chips, and demand for netbooks and low-power laptops is now the territory of smartphones and tablets. It's why AMD has no CEO right now. The late move to develop revenue-making products in this area is going to hurt and drag down AMD's stock price/confidence level for at least the next 12 months.
Nvidia took their lumps after Fermi was late, but falling GPU share in certain sectors is no surprise at all.

What does the article you've linked to have to do with nVidia's decline in GPU market penetration or the JPR article?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I have written code before, including in assembly language. The bottom line is code portability plus ease of coding vs. overhead.

If you take something as general as C's fprintf, it will have HUGE overhead writing to a file vs., say, fwrite. The problem with any library is abstraction: the higher the level of abstraction, the less efficient the code.

This has nothing to do with AMD or Nvidia and their hardware. Yes, AMD can release an optimized set of C functions for their cards, and it would seem like an improvement on DirectX; the problem is that this is already being done -- it's called the driver.

The best code written in assembly will always outperform code compiled from C, and C will always outperform Fortran or other higher-abstraction languages. The goal here is not to go back to the 1980s and have everyone code in assembly, but to design a programming environment that has the flexibility and speed of a low-level language while offering the portability and rapid-development tools of higher-abstraction languages.

But having that flexibility doesn't have to translate into chaos or into forcing everyone to code in assembly. Why would some prominent developer houses desire this ability and ask for it? What would the benefits be for the developer, their franchises, and the PC?
 

pw38

Senior member
Apr 21, 2010
294
0
0
The %#'s we are seeing now are repercussions to changes in the consumer market. The key is have the companies strategized 3-6 quarters ago to what the consumer wants NOW. IMO, AMD has not, they have a whole bunch of netbook candidate cpu/cpu/gpu chips and demand for netbooks and low power laptops is now the territory of smart phones and tablets. Its why AMD has no CEO right now. The late move to develop -revenue making products in this area are going to hurt and drag down AMD stock price/confidence level for at least the next 12 months.
Nvidia took their lumps, after Fermi being late, but falling gpu share in certain sectors is no surprise at all.

No offense, man, but I don't believe you meant that when you posted your link. It just seemed like a shot at AMD. I understand you're an Nvidia fan and all, it's cool, but crap is crap even with some perfume. No need to defend it; just move on.

I personally don't want to see Nvidia go under, because the competition between them and AMD over the past 2 years has been awesome for value selection. They're a smart group; they'll figure out what to do to stay alive.
 

pcm81

Senior member
Mar 11, 2011
584
9
81
But having that flexibility doesn't have to translate into chaos or into forcing everyone to code in assembly. Why would some prominent developer houses desire this ability and ask for it? What would the benefits be for the developer, their franchises, and the PC?

The ideal infrastructure (programming language) would have the following features:
1. Ability to be compiled on multiple hardware platforms (portability).
2. Ability to rapidly produce code that is not bound by variable types (an example is the printf function in C: the same function accepts multiple types).
3. Ability to access low-level mechanics (pointer math or inline assembly), while allowing all features of the language to be usable without knowledge of low-level mechanics.

To fully meet the above requirements, one must create a smarter compiler than any in existence today.

Let me give you an example. Let's say I have two structures. Each structure has a pointer to an array, a variable-type identifier (float, int, etc.), the number of dimensions, and a pointer to an array listing the size of each dimension. These two arrays can have mathematical operations done on them -- say, adding them to form a third array. But the ADD function needs to do real-time checking to know which chunk of code to execute based on the arrays' variable types; this is huge overhead. Alternatively, a smart compiler will do this checking during compilation and generate static code, removing the overhead from execution time.

What we have with any currently existing programming environment (DX as an example) is the first-case scenario, where we do get overhead from the runtime abstraction. What we need is to increase the compiler's functionality but remove the runtime overhead. Microsoft is pretty bad at this. For example, their Visual Basic or any .NET code is not true standalone compiled code; rather, it's a programming environment that makes function calls into the VB.dll or the .NET framework. If DX is written in a stupid way -- I don't know if it is -- then removing it will result in significant speed boosts at the cost of portability/rapid development. If DX is written in a smart way, removing it will lead to a small boost in performance at the same cost of portability/rapid development.
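A minimal C++ sketch of the two cases described above (the type names and functions are hypothetical illustrations, not any real DX or driver API): add_runtime must branch on a type tag every time it is called, while the add_static template lets the compiler resolve the type once, at instantiation, and emit a straight loop.

```cpp
#include <cstddef>
#include <vector>

enum class ElemType { F32, I32 };

// Case 1's data layout: a runtime type tag travels with the array.
struct TaggedArray {
    ElemType    type;  // the "variable type identifier"
    void*       data;  // pointer to the raw array
    std::size_t n;     // element count (dimension lists omitted for brevity)
};

// Case 1: runtime dispatch -- the ADD function re-checks the tag on
// every call. This per-call branch is the overhead described above.
void add_runtime(const TaggedArray& a, const TaggedArray& b, TaggedArray& out) {
    switch (a.type) {
    case ElemType::F32:
        for (std::size_t i = 0; i < a.n; ++i)
            static_cast<float*>(out.data)[i] =
                static_cast<const float*>(a.data)[i] +
                static_cast<const float*>(b.data)[i];
        break;
    case ElemType::I32:
        for (std::size_t i = 0; i < a.n; ++i)
            static_cast<int*>(out.data)[i] =
                static_cast<const int*>(a.data)[i] +
                static_cast<const int*>(b.data)[i];
        break;
    }
}

// Case 2: compile-time dispatch -- the compiler instantiates one copy per
// element type, so the emitted code is a plain loop with no tag check
// at execution time.
template <typename T>
void add_static(const std::vector<T>& a, const std::vector<T>& b,
                std::vector<T>& out) {
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = a[i] + b[i];
}
```

In the template version the type checking happens during compilation, which is exactly the "smart compiler" behavior the post asks for.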
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Isn't the quad-core version of the SoC in the iPad going to come at the end of the year too? Pretty sure the Sony NGP will use that.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Isn't the quad-core version of the SoC in the iPad going to come at the end of the year too? Pretty sure the Sony NGP will use that.
Nevermind: it looks like Sony is also making the NGP, sorry.
The two new Sony 'tablets' are gaming-oriented devices that will be sold under the Sony Vaio - PlayStation naming.

http://recombu.com/news/sony-s1-and...-qriocity-and-reader-store-ebooks_M14189.html
Sony has just officially announced that it’s got not one but two Android-based tablets in the works.
Both the S1 and S2 will run Android 3.0 Honeycomb, feature Tegra 2 dual-core processors and will be PlayStation Certified, meaning they’ll be able to download PS1 games on the go, a la the Xperia Play.
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
"If I were the CEO for Qualcomm orTexas Instruments, I’d be worried. Tegra 2 is already faster than OMAP4 (which doesn’t ship until the 2nd half of this year), and Qualcomm’s MSM 8×60 series (also 2nd half of this year). That means that Qualcomm and TI’s products will be too slow and a generation behind by the time that they ship. They’re not slated to have a product that has a chance to compete with Tegra 3 until late 2012 or early 2013…and by then they’ll be going against Tegra 4. Competition is a beautiful thing. These two companies, not to mention Samsung, should start putting their microprocessor R&D into overdrive."

http://anythingbutiphone.com/969
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Next gen of Tegra is due out in August. Quad-core with an improved GPU.

I am interested in seeing if Nvidia will be able to brute force their way into the market with an aggressive product cycle like they did in desktop graphics nearly 15 years ago.

This makes little sense to me because we're reaching the point of diminishing returns when it comes to mobile computing performance.

I just bought an HTC Thunderbolt for the 4G/LTE access, and it's a single core that does fine. Ditto with the Droid Charge.

Imho, a fast single core is enough for web surfing, watching HD video, playing Angry Birds, and such. But I suppose a small pool of people will attempt to play demanding video games on tiny little screens. Dual core and lots more video performance doesn't seem necessary right now... maybe a little more performance to encode 1080p video or something, but beyond that, it's overkill. Dual-core with high-performance video seems like unnecessary expense and energy usage at this point. Quad core seems like an even bigger waste of energy.

Energy efficiency is a big reason why ARM beats INTC in the mobile computing space and why INTC is making such a fuss about its energy-efficient 22nm process... they NEED to be much more energy-efficient if they hope for x86 to trickle down to mobile computing devices.

Give me adequate performance at a lower price with better battery life any day. I don't need to or want to play BF3 on my smartphone, thank you very much, especially if it adds expense and lowers battery life.

Edit: For tablets I see perhaps a bit more need for performance, but even then, beyond stuff like encoding/decoding 1080p video, then what? Does anyone seriously think we will see hordes of hardcore games being developed for tablets (especially given Nintendo/Sony probably not wanting to open up their hand-held console businesses, and Sony probably not wanting to cannibalize its PSP business)? Does anyone even WANT to play demanding games on tablets and sub-10-inch screens? There's a big difference between casual games like Angry Birds and demanding games like Crysis 1. You'd need a keyboard and mouse to really use a tablet for hardcore gaming, at which point you might as well get a full-fledged x86 laptop instead. Or just go desktop or console (Xbox 360 w/ Kinect, PS3, Nintendo Wii). That's not to say that ARM might not dominate lower-end mobile computers (netbooks, tablets, smartphones), but I have a hard time seeing ARM compete with INTC/AMD in the full-fledged laptop space.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Nevermind: it looks like Sony is also making the NGP, sorry.
The two new Sony 'tablets' are gaming-oriented devices that will be sold under the Sony Vaio - PlayStation naming.

http://recombu.com/news/sony-s1-and...-qriocity-and-reader-store-ebooks_M14189.html

I was wondering because the Tegra 2 in the Xoom seems to be on par with or slower than the PowerVR SGX 543MP2, according to AnandTech's iPad review. So if the NGP containing the PowerVR SGX 543MP4 ships by the end of the year, there seems to be some competition for nVidia's Tegra 4, which I am assuming is a quad-core SoC (for now at least).
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So, am I to assume with the current direction of this thread, that nVidia is no longer concerned with the discrete GPU market? Or, is it still safe to assume that they aren't trying to shrink their share in that part of the market?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
So, am I to assume with the current direction of this thread, that nVidia is no longer concerned with the discrete GPU market? Or, is it still safe to assume that they aren't trying to shrink their share in that part of the market?

That's not true. NVDA simply wants to make money, bottom line, and they know that discrete GPUs for PCs are a stagnant market. They want to be competitive there, but the growth opportunities are elsewhere right now.

NVDA is run by smart people who realized long ago that:

1. They needed an x86 license to compete with INTC/AMD, which would continue to eat up the low-end graphics space by integrating graphics with CPUs, thereby cutting NVDA out of the PC market. NVDA mostly got out of the chipset business after INTC locked them out of Core i5/i7, and it's too late to go back, even after the INTC-NVDA settlement.

2. The PC market kind of sucks anyway; PC gaming cards are a low-growth sector, and pro graphics cards are cash cows but low-growth as well.

3. Consoles are cash cows but weren't guaranteed to help much in the future. NVDA burned some bridges, so while they make PS3 video chips, there is a possibility that AMD powers ALL of the next-generation consoles. AMD already locked up Nintendo, and AMD has an existing relationship with MSFT for the XBOX 360 that has had some bumps but looks to continue. From what I've heard, Sony hasn't decided yet on what powers the PS4, meaning that AMD might win THAT contract, too.

4. NVDA realized that supercomputing using their GPUs could be a lucrative market to get in on. GPUs may be re-architected to become more general-purpose and thus suitable for supercomputing. This has applications in finance and certain scientific fields (e.g., modeling nuclear explosions, interpreting seismic data, visualization, simulating protein folding). NVDA pushed GPUs in this direction, starting with the G80, and deserves all the credit (or blame, for those who believe that NVDA has gone too far in that direction at the expense of gaming-oriented performance upgrades; NVDA has admitted that some 15-20% of GF100 goes towards GPGPU rather than gaming-grade graphics). Personally, I think NVDA deserves kudos for this and am happy they are pushing supercomputing. So what if some kids lose a few frames per second in Crysis? Cry me a river; there are more important things in the world.

5. And most importantly, NVDA saw money to be made in mobile computing. Smartphones and tablets/netbooks are growing much faster than laptops and desktops, especially with the mighty Apple marketing machine pushing iPhones and iPads. NVDA wanted to ride the trend by making their own systems-on-chips with ARM-licensed CPUs and NVDA-designed graphics processing, in the hopes that their SoCs would power rivals to the iPhone/iPad. So far NVDA has not made much progress, but they might be turning the corner soon.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I think people underestimate the sheer resilience of the PC parts market. While, yes, mobile/cell phone/SoC tech is on the rise with increased users and revenue, I think the main PC market will always exist. Mobile tech just can't compete when battery life is brought into the equation. Technologies will mix and interface with each other, but neither will really be killed off.

For both Nvidia and AMD, their graphics divisions can be directly tied to three main employments of their graphics technology, outside of general video acceleration to offload the CPU, blah blah blah:

Gaming
Practical graphics work (AutoCAD, Video Editing, Photoshop, etc)
GPGPU computing

I don't know how much money is made for purposes other than gaming, but Nvidia and AMD have both made great strides in these areas, and are completely dominant and necessary for industries that require their products. Honestly, I think for personal computers, AMD's graphics division and Nvidia could both survive (depending on who Apple relied on over the long term) just from having their graphics in Apple computers. Most Apple computer models, including all current iMacs, ship with dedicated graphics. Obviously dedicated higher-end GPUs are still seen as mandatory for Macs, because they are useful for so much other than gaming. I would love to see the numbers of specific GPUs shipped in the past few years of MacBook Pros.

As for moving into new areas, I would be interested in Nvidia taking its ARM license and building chips with ARM cores + CUDA graphics cores, perhaps creating a completely new type of computer for a market that begs for some individuality. A Linux or Android + OpenGL based machine completely geared towards video and photo editing could make a splash, especially if it's cheap and user friendly. It could be completely ready to make use of games and apps reserved for the current mobile market, as well as being completely PC friendly. Four 2 GHz ARM cores, 128 CUDA cores, 4 GB RAM, HDMI out, something like that. It could cause an uproar.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I think people underestimate the sheer resilience of the PC parts market. While, yes, mobile/cell phone/SoC tech is on the rise with increased users and revenue, I think the main PC market will always exist. Mobile tech just can't compete when battery life is brought into the equation. Technologies will mix and interface with each other, but neither will really be killed off.
The PC market isn't dying, it's just that nVidia is being pushed out of it. If nVidia were able to continue making chipsets, and even take advantage of the Atom going to TSMC for an x86 SoC, they might have some hope in the PC market. As it is, it's basically going to help subsidize them getting into other markets, many of which are already saturated, while some others are fledgling niches, making all of them uphill battles for nVidia.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
The PC market isn't dying, it's just that nVidia is being pushed out of it. If nVidia were able to continue making chipsets, and even take advantage of the Atom going to TSMC for an x86 SoC, they might have some hope in the PC market. As it is, it's basically going to help subsidize them getting into other markets, many of which are already saturated, while some others are fledgling niches, making all of them uphill battles for nVidia.

Now that Tegra is seeing some success, and Nvidia has an ARM license, I think they will create some unique SoCs and solutions for new products. I hope they end up doing well, as my experience with Nvidia over the years has always been pretty good, and the same with ATi products. For laptops, I think the Asus Eee Transformer is where things need to be heading as far as mobile computing goes, and I'm happy to see Nvidia get to be the hardware provider for it. I have a friend who is getting one, so I should have some first-hand experience with it soon.

All this talk has made me nostalgic for my "olden days" of PC gaming, when this console generation was still young, XP was the norm, DX10 was still in development, and I was still gaming on laptops with low-end dedicated graphics, lol. I'm even playing through Half-Life 2 currently.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's not true. NVDA simply wants to make money, bottom line, and they know that discrete GPUs for PCs are a stagnant market. They want to be competitive there, but the growth opportunities are elsewhere right now.

My point is that all of the talk of diversity and other growing markets is nothing but a smoke screen. It doesn't explain why nVidia is losing market share in discrete graphics. It doesn't address the subject of this thread; it's off topic and has derailed the thread. While it all might very well be true and speak to the longevity of nVidia as a company, the thread is (well, was) about nVidia losing market share in discrete GPUs. They are selling fewer of them, percentage-wise, and AMD and Intel are selling more. This was Q1 of 2011. The excuse that Fermi was late is also a smoke screen: we aren't even in the same generation anymore. This is the 6000 series vs. the 500 series, not Fermi vs. Evergreen, where Fermi was a no-show for six months. If someone would like to take a shot at explaining why this happened, feel free. It's not because they have signed contracts to sell Tegra to anyone for portable devices, though.

Sorry if I wasn't clear in previous posts. I often tend to be too sarcastic for my own good.
 

argor

Junior Member
Jan 13, 2010
5
0
0
"If I were the CEO for Qualcomm orTexas Instruments, I’d be worried. Tegra 2 is already faster than OMAP4 (which doesn’t ship until the 2nd half of this year)
omap4 is already out it is used in the playbook for example
That means that Qualcomm and TI’s products will be too slow and a generation behind by the time that they ship.
the omap4 is faster than tegra 2 and the new Qualcomm core should be faster than tegra 2 also the Qualcomm MSM8960
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
The current situation began when AMD beat Nvidia to the DX11 punch with Evergreen, since Fermi was late thanks to development issues. AMD made the first move and took market share, and while there are faster Fermi and 500 series parts than AMD's best, they are hot and more expensive. While my product knowledge of the 500 series isn't too good, the 400 series wasn't as strong a line as the Evergreens were across the whole price and performance range. The GTX 480, 470, and 465 were expensive, hot, and sucked a lot of juice. The GTX 465 was the lemon of the series (for the obvious reason of being a heavily cut-down Fermi), with the GTX 460 1 GB the crown jewel, continuing Nvidia's legacy of very awesome upper-midrange products.

AMD had triple-screen connectivity from the get-go and better performance at high screen resolutions, and despite weaker tessellation performance, there isn't much to really make AMD GPUs uncompetitive in this regard. Tessellation just hasn't caught on as quickly as we wanted it to. Northern Islands alleviated the tessellation issue, brought a real competitor to the GTX 460's area of performance and price, and maintained AMD's vast performance-per-watt and per-transistor advantage by moving to 4-way VLIW for the 68xx and above GPUs. The 500 series was late yet again, and is really just a rehash of the 400. It doesn't help that GTX 590s have had power issues. Honestly, I think Nvidia should just abandon dual-GPU boards for the time being, unless they go with something like a GTX 560 x2.
 

argor

Junior Member
Jan 13, 2010
5
0
0
I was wondering because the Tegra 2 in the Xoom seems to be on par with or slower than the PowerVR SGX 543MP2, according to AnandTech's iPad review. So if the NGP containing the PowerVR SGX 543MP4 ships by the end of the year, there seems to be some competition for nVidia's Tegra 4, which I am assuming is a quad-core SoC (for now at least).
For more info on Tegra 4, read this: http://www.beyond3d.com/content/articles/112


But the SoC king of graphics in the Tegra 3-4 generation will not be Tegra; it will be the ST-Ericsson A9600.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Oh, I almost forgot: AMD has been doing very well in mobile dedicated graphics, especially the middle end. The number of computers with 5450s and 5650s is absolutely staggering. Once again, AMD's performance-per-watt advantage allowed them to really bite away at Nvidia. Also, there are not 4 different series of AMD graphics in current computers; there are 2 (the 5 and 6 series), which keeps things a bit more reined in for consumers, helping them understand what is better than what instead of just confusing the hell out of them. AMD is doing a good job in this regard, getting the new series out the door and replacing the older one.
 