I give up being a CPU enthusiast.


cbn

Lifer
Mar 27, 2009
12,968
221
106
Then there's the whole reason CPU enthusiasts even exist in the first place: the open nature of the x86 architecture. Would we really care much about the CPU, other than it being a bullet point, if we couldn't pick and choose one and then install it into a motherboard? I highly doubt it.

Not sure if this will be the case with ARM systems.

If anything, I would imagine more freedom if some type of desktop Android (for Project Denver, etc.) really does surface. (With an open-source Android OS, it will be cheaper to swap hardware back and forth compared to buying a retail Windows license every time a new version is released.)
 

sandorski

No Lifer
Oct 10, 1999
70,131
5,658
126
Isn't the graphics discrepancy supposed to change with the release of Nvidia Project Denver?

With that being said, I almost wonder if Nvidia is a little too far ahead of the ARM game... too soon.

I'm no expert on these things, so take everything I post with a grain of salt, but feature-wise the gap may change. Performance-wise, I don't think ARM is anywhere close to being competitive.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I'm no expert on these things, so take everything I post with a grain of salt, but feature-wise the gap may change. Performance-wise, I don't think ARM is anywhere close to being competitive.
I think you guys are neglecting the fact that these ARM chips are being used in phones.

If they were to design an ARM CPU for a desktop, it would be an absolute beast. Considering what they are doing now with a 2W TDP, if you multiply that by 60, I have a feeling we would see performance very competitive with the x86 desktop CPUs we're using currently.
 

sandorski

No Lifer
Oct 10, 1999
70,131
5,658
126
I think you guys are neglecting the fact that these ARM chips are being used in phones.

If they were to design an ARM CPU for a desktop, it would be an absolute beast. Considering what they are doing now with a 2W TDP, if you multiply that by 60, I have a feeling we would see performance very competitive with the x86 desktop CPUs we're using currently.

The fact they're used in phones is why I posted what I did. You can't just multiply by 60 and expect a level playing field. For all we know, the ARM architecture simply couldn't scale that high without running into serious issues.
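To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python. Every figure in it (the 1.3x voltage bump for a 3x clock, the 20% serial fraction, even the 2W/120W framing) is an assumption for illustration, not measured data; the point is just that a 60x power budget does not translate into anything close to 60x performance once voltage/frequency scaling and Amdahl's law come into play.

```python
# Back-of-the-envelope sketch (illustrative numbers only, not real chip specs).
# Naive take: a 2 W phone chip times 60 gives a 120 W desktop power budget,
# so "60x the performance". Reality check: dynamic power scales roughly as
# P ~ C * V^2 * f, and higher clocks usually need higher voltage, so
# performance per watt falls as you push frequency; adding cores instead
# scales power roughly linearly, but speedup is capped by the serial
# fraction of the workload (Amdahl's law).

PHONE_TDP_W = 2.0          # assumed phone-class TDP
DESKTOP_BUDGET_W = 120.0   # the naive "times 60" budget

def dynamic_power(base_w: float, f_scale: float, v_scale: float) -> float:
    """Relative dynamic power: P ~ C * V^2 * f, versus a baseline part."""
    return base_w * v_scale ** 2 * f_scale

# Say tripling the clock needs ~1.3x the voltage (assumed, for illustration).
per_core_w = dynamic_power(PHONE_TDP_W, f_scale=3.0, v_scale=1.3)
print(round(per_core_w, 1))          # ~10.1 W for only ~3x single-thread speed

# Spend the whole 120 W budget on such boosted cores: about 11 of them fit,
# but parallel speedup is limited by the serial fraction of the work.
serial_fraction = 0.2                # assumed 20% of the work is serial
cores = int(DESKTOP_BUDGET_W // per_core_w)
speedup = 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)
print(cores, round(speedup, 1))      # 11 cores, ~3.7x parallel speedup at best
```

None of this says a purpose-built desktop ARM core couldn't be fast; it just says the straight multiplication isn't a useful predictor.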
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
I think you guys are neglecting the fact that these ARM chips are being used in phones.

If they were to design an ARM CPU for a desktop, it would be an absolute beast. Considering what they are doing now with a 2W TDP, if you multiply that by 60, I have a feeling we would see performance very competitive with the x86 desktop CPUs we're using currently.

It doesn't work that way.

See PowerPC, Apple.
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
The 2500k is the most bang for buck CPU that we have received in a long time. How is that not exciting?

What about Ivy Bridge and trigate?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
The 2500k is the most bang for buck CPU that we have received in a long time. How is that not exciting?

What about Ivy Bridge and trigate?
The price and the lack of progress since its release are not exciting whatsoever.

Ivy Bridge looks like it will be an incremental upgrade.
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
The price and the lack of progress since its release are not exciting whatsoever.

Ivy Bridge looks like it will be an incremental upgrade.

Ok, what about trigate? It should do wonders for power consumption. AFAIK, Intel's tick-tock cadence hasn't been lagging. I don't know about you, but I don't need to buy a new CPU every 6 months.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,453
10,120
126
I don't know, I think there are still some exciting spots in the CPU landscape. For example, overclocking a GPU-less Llano CPU to something north of 4GHz, on the cheap.

I have no idea of the longevity of Socket FM1, but it still holds some excitement for me.

Personally, I'm waiting for the multiplier-unlocked Llano chips to trickle out. I might just get one for kicks. I've already toyed with an E-350 CPU/mobo.
 

86waterpumper

Senior member
Jan 18, 2010
378
0
0
Don't you see it happening? ARM stuff will become "good enough", and given its market penetration and killer apps in the phones/tablets/whatever, it will push upwards and become the dominant ISA, killing x86-64.

I don't see this happening anytime soon, but so what if it does? Maybe we will all be building custom water-cooled ARM systems before long.

I don't understand how being a "CPU enthusiast" is a hobby in and of itself anyhow. I love to build my own computers as much as the next guy here, but even if one had the disposable income and could afford to build a new computer every month, not many would want the hassle and trouble of doing it. Tinkering around with a machine and swapping parts or upgrading is one thing, but most don't change CPUs that often. If anything is a letdown, it is the supreme ease with which modern-day CPUs overclock. They have taken a lot of the fun out of it. To me, being an enthusiast used to mean swapping some jumpers around and possibly not even getting a boot.

I don't see for the life of me how anyone can complain. For 200 dollars you can buy a 2500K that has an incredible amount of performance even at stock settings and is economical on power usage too.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't see for the life of me how anyone can complain. For 200 dollars you can buy a 2500K that has an incredible amount of performance even at stock settings and is economical on power usage too.

I think the drop in price says it all.

Fact is most people aren't bottlenecked by even the most entry level Intel mainstream CPU.

This makes me wonder what will happen to the Atom line-up when the default configuration for entry-level Ivy Bridge becomes quad core (<----someone correct me if I am wrong on this, I haven't been following tech much over the last 6 months).

Will Atom move upscale (in performance) for the masses?

EDIT: Found the Anandtech Atom update article--->http://www.anandtech.com/show/4829/there-will-be-two-32nm-atom-socs-in-2012-medfield-and-one-other (Good stuff!)
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You know what, RussianSensation, you made a very nice post.

There are tons of incredible things we can buy these days for our computers.

The OP is correct though. The CPU market is stale and has stagnated. The CPU is the most critical component in many ways as well.

I say take your gf / wife / mother / sister / brother out for a nice dinner. There will come a time when you'll have an opportunity to spend $300 on a fast CPU. Again, there are 2 ways to look at this:

1) You can be really upset that your current CPU isn't obsolete and upgrade for "fun";
2) You can be happy that you aren't "forced" to blow $$ unnecessarily every 18-24 months as was the case in the past. Instead, put it aside and just upgrade later! ^_^
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
I could not disagree more with the entire premise of this melodramatic post. It is fun to respond to though.

Just because AMD crapped out on their overhyped Bulldozer the first time doesn't mean it's all over with them. Lost in all the noise is the fact that AMD remained profitable in Q3 and did better than the street estimate. Sure, it wasn't a blockbuster but AMD is far from dead. A BD respin could change a lot of things for them. Cross your fingers.

And ARM is doing a great job, sure, but they are a ways from having a credible desktop and mainstream portable alternative. That might change someday. Also, they are providing increasingly credible competition for Intel. Where's the harm in that?

Also, on the desktop side, things could be better on the competition front, but they've never been this good technologically. Intel's CPU strategy has been stunningly successful. The stuff you can buy for $200 now is simply a marvel. The next upgrade will be even better. I don't want to rebuild my machine every year. Intel's finally ensured that I don't have to.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Just because AMD crapped out on their overhyped Bulldozer the first time doesn't mean it's all over with them. Lost in all the noise is the fact that AMD remained profitable in Q3 and did better than the street estimate. Sure, it wasn't a blockbuster but AMD is far from dead. A BD respin could change a lot of things for them. Cross your fingers.

I figured the Bulldozer launch would remind us of Nvidia Fermi (i.e., a new design on a new process). Surely the Bulldozer respin will clean things up dramatically in the same way it did for Nvidia.

Moving on from the server side of things, I wonder what strategy AMD will take with consumer devices. It seems like the smartphone docked to a lapdock is becoming the new progression for laptops (not the usual "buy a new laptop every two years with a new custom-fitted mainboard inside it"). So what is AMD's plan for this product category?
 
Aug 11, 2008
10,451
642
126
I don't build my own computers, so I guess I would not be a real CPU enthusiast from that point of view. However, I do upgrade my computer as far as RAM, graphics cards, etc., and follow computer-related issues very closely.

So from my point of view, I am not that worried about ARM. I am more concerned about the end product. Considering the progress Intel is making in the x86 field, if ARM can catch up to and surpass that in a few years, I don't see it as a bad thing. I would be more concerned about software compatibility and gaming. Granted, there are a lot of games for ARM, but not really the kind we see now on the DirectX platform. I guess what I am trying to say is that I don't really care so much about whether the architecture is ARM or x86 as about what the final product can do. Honestly, I can see ARM taking over the entertainment/consumer segment, but I can't really see them taking over business/productivity/scientific computing for quite some time. Too many compatibility issues, unless an ARM platform can be made that seamlessly runs current productivity software.
 

toolbag

Member
Dec 25, 2010
69
0
0
Some people have an addiction to buying computer parts. It's the same as women always needing new clothes. What can ya do, you know? Nothing but let them bicker.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I figured the Bulldozer launch would remind us of Nvidia Fermi (i.e., a new design on a new process). Surely the Bulldozer respin will clean things up dramatically in the same way it did for Nvidia.

I keep seeing this but it is nowhere near analogous.

Fermi, when released, was late, hot, and power-hungry. Yes.

What it wasn't was a part that could barely compete with Nvidia's previous generation. Nor was it a part that was trounced by AMD's offering at the time. (They traded some benchmarks, but overall the GTX 480 was top dog in raw performance, at the cost of being more expensive, hotter, and more power-hungry.) It had the performance to make it worth purchasing if you were going for the fastest single-card option, though.

Bulldozer trades benchmark wins with AMD's previous gen, and loses to the Intel part in all but a few cherry-picked scenarios.

Why do people keep making this comparison?
 
Last edited:

86waterpumper

Senior member
Jan 18, 2010
378
0
0
Some people have an addiction to buying computer parts. It's the same as women always needing new clothes. What can ya do, you know? Nothing but let them bicker.

Haha right, and this is why people need to have more than one hobby. If the computer world is stagnant, throw some money at cars, or motorcycles, or whatever else you are into. Before you know it, there will be fancy new computer hardware to buy...

I can see people getting bored, I guess, with no new CPUs coming out, especially since the 'dozer was a letdown. I really haven't heard anyone complaining about their CPU being unable to keep up with their workloads. I know some people who run big render farms and the like can always use more power, but most people could probably still be just fine with a Phenom II. I don't know what the deal is; maybe people want to do Folding@home while they play games at 2560x1600. If any blame is to be placed, it does not need to be placed at the feet of either Intel or AMD. AMD may be behind, but they are still able to produce a quad-core CPU with decent onboard graphics that is capable of doing more than enough for most users. It's obvious that to stress any decent CPU from the last several years, it is going to take different games or more demanding software than what we are using right now.
 
Last edited:

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Fermi, when released, was late, hot, and power-hungry. Yes.


Why do people keep making this comparison?

Because when Fermi was released, the Nvidia apologists downplayed the power usage as if it didn't matter at all. Now that AMD has the part with "excessive" power usage, it's suddenly a huge issue.

Not to mention the fact that a GTX 480 using over 150 watts more than its competitor is really an order of magnitude worse than Bulldozer using an extra 20-50 watts under heavy load.
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Nice that you completely deleted the performance part.

People are willing to excuse power when the part performs. When the part is sub-par in performance *and* power-hungry, it's just junk.

Why exactly did you totally ignore the point of my post and choose to respond only to that specific portion?
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Nice that you completely deleted the performance part.

People are willing to excuse power when the part performs. When the part is sub-par in performance *and* power-hungry, it's just junk.

Why exactly did you totally ignore the point of my post and choose to respond only to that specific portion?

I only chose to respond to the part of your post that was in error and that I disagreed with. I'm not going to do a line-by-line "I agree" with your entire post; "I agree" posts are generally just worthless spam in forums.

And by most measures, Fermi's performance *was* sub-par on release. The 480 barely beat a 5870, yet cost a lot more and was in turn beaten by a 5970 on most benchmarks (which still used less power, even as a dual-GPU card). Its performance was only good if you ignored price and power usage and refused to use a dual-GPU card.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
And by most measures, Fermi's performance *was* sub-par on release. The 480 barely beat a 5870, yet cost a lot more and was in turn beaten by a 5970 on most benchmarks (which still used less power, even as a dual-GPU card).

Wow, it is sub-par because it "barely" beat its competitor. How much does it have to beat the competing card by to not be sub-par in performance, exactly? What would you call the performance of the 5870? It was slower than a "sub-par" card... I always considered the 5870 to be a solid performer, just not quite as fast as the 480, but worth it if you wanted to use less power. And you talk about others being "apologists". I think I understand why you chose to ignore the point of my post now.
 
Last edited:

mosox

Senior member
Oct 22, 2010
434
0
0
Since an old C2Q or a cheap Phenom II X4, or even an i3 or Athlon II X3/X4, can do anything a regular user needs, I agree that the CPU thingy is not so exciting anymore, despite what the enthusiasts post on the Internet or what the hardware sites say. It's like with cars: nobody really needs a 200+ MPH car except for a handful of enthusiasts.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Wow, it is sub-par because it "barely" beat its competitor.

No, you misread. It didn't beat its competitor. It barely beat the 5870, which was priced lower and used less power, aka NOT its direct competitor. Even then, it lost several benchmarks to a card hundreds of dollars cheaper that used less than half the power at load. Very embarrassing. It was also unable to touch the 5970, which was priced a bit higher but used less power despite being a dual-GPU card.

Its only saving grace was that in its little price/performance niche between the 5870 and 5970 it didn't have a direct competitor. But did its performance justify the power usage? Not at all.
 