nVidia is hard to impossible to merge with anyone due to their company culture. Same reason why Intel would never buy nVidia. Would be like mixing oil and water.
Trying to make JHH fit into Intel's culture would be like trying to fit an ocelot into a biscuit tin.
The conspiracy theorist in me says that Intel/Nvidia is paying these guys to leave so they can replace them with people who will make sure AMD stays uncompetitive. The logical person in me says AMD is a bad company to work for.
And the GPU side is a little better
Sometimes that biscuit tin needs an ocelot.
The GPU side is a lot better, not just a little. I don't keep up with the financials; I'm merely speaking to the quality of the product. On the GPU side they have SKUs that are competitive from top to bottom.
Does AMD ever have good news?
Yes, it's just that the ratio of bad news to good news is too high. Intel has 'bad' news too, but it's not a big deal when the bad news (bugs, delays) occurs 1:10 compared to good news. And Intel marketing seems to do a really good job of pushing that good news to make sure it gets noticed.
The GPU side isn't that good - it's not competitive where it matters. Yes, consumer cards are good, but there is little money in them. The big markups are in workstation and GPU compute, and Nvidia has had those pretty well locked down for years. AMD could compete there, but that would take a big investment in software and drivers, something AMD doesn't seem to want to do.
While the CPU side isn't very competitive overall, it's most competitive in chips for servers and supercomputers, where the markup is much bigger.
What's most worrying is I don't think AMD has a clear vision of where they are going. All I see with AMD is Fusion, which basically boils down to competing with Intel for the budget laptop market - limited profit in that. Beyond that there is no unified vision - they seem to dip their toes in various markets, produce a set of marketing slides, but don't really commit, so don't really get anywhere.
Isn't that where the market is heading? The average person doesn't drop a grand on a high-end desktop. I think a smart move would be to focus more on the low-power and mobile space.
Are they really competitive in the server segment? Doesn't Intel have like 90 or 95 percent of the server market?
Using hyperbole and being misleading is the way to improve their image.
Yeah. They had something like 20% just 4-5 years ago and now have less than 5%. That is a HUGE loss in market share.
Once Intel's Core tech made its way into the server space around 2007, it started really eating into the ground that Opterons had gained from 2003-2007.
I suppose not now, but still more competitive than in most markets - the new BD architecture was built for servers first. While they have lost a lot, they still have some percentage of the market, and you still see some AMD-powered supercomputers, as their interconnect is so good.
How did AMD lose to Nvidia for the current round of MacBook Pros and iMacs with dedicated graphics?
They might not have had the ability to supply the amount and type of SKUs Apple wanted due to production issues at TSMC. (TSMC always has wafer supply issues, not yield issues.)
That's a big loss in terms of cash (5 million+ GPUs across all models per quarter?).
It might be a loss in potential cash flow, but not necessarily a loss for profitability. Apple is famous for sucking every last drop of blood out of their enemies, er, I mean "partners". Selling 5 or 50 million GPUs/APUs for $0.10 profit on each unit won't make AMD much money.
I hope the return of Jim Keller turns things around, but anything he does will take two years minimum before the changes come into effect or see the light of day in production processors.
Best wishes to Jim. We really need a strong AMD, if only to spur Intel.
How did AMD lose to Nvidia for the current round of MacBook Pros and iMacs with dedicated graphics? That's a big loss in terms of cash (5 million+ GPUs across all models per quarter?). Did Nvidia's products beat the rival AMD GPUs in OpenCL performance at set price points? I figured AMD had OpenCL locked down pretty hard - or was it just Tahiti that truly outdid Nvidia?
Could it be related to how well Optimus works compared to AMD's switchable graphics? I know they have had a really rough time for over a year trying to get their switchable graphics working properly. I'd imagine that alone could have been a deal breaker, because without functional switchable graphics, the GPUs consume too much power and significantly affect battery life.