In what Intel CPU generation will 8 cores be introduced?


Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
This is SB-E at 32nm, an 8-core die, 2.27B transistors at 435 mm².
From this die we have the Core i7 39xx series 6-core CPUs.



And this is SB-E at 22nm, an 8-core die, 2.27B transistors at 220-240 mm².



Just for reference, the SB die is 216 mm² and the quad-core SB-E (Core i7 3820) is at 294 mm².

I don't believe anyone should have any doubt that Intel could release a 6-core IB-E at $300 and an 8-core at $500 and up, and make tons of profit in both the high-end desktop and server markets.

This is not a question of IF they can do it; it is a question of whether they want to do it.

Very good post!

So basically, already with existing 22 nm process technology, Intel could release 6C/8C mainstream CPUs. But that is of course under the assumption that they do not contain any iGPU (at least none of any reasonable size), and currently Intel does not seem to want to sell mainstream CPUs without an iGPU.

But looking forward just one node shrink (Broadwell), Intel could double both the iGPU and the number of CPU cores (and associated caches) compared to IB, and still stay within the same die size as current IB CPUs! Pretty amazing when you think of it.

Looking forward yet one more node shrink (Skymont), they could double that again.
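Just to put some rough numbers on that claim, here is a back-of-the-envelope sketch in C using the 32nm -> 22nm area scaling implied by AtenRa's figures above (435 mm² shrinking to ~230 mm² for the same 2.27B-transistor die). The ~160 mm² figure for a 4C+GT2 Ivy Bridge die is my own assumption, not something quoted in this thread, and real node-to-node density gains are usually less than ideal, so treat the output as illustrative only.

```c
/* Rough sanity check of the "double everything, same die size" idea.
 * shrink_factor comes from the quoted SB-E numbers; the IVB die size
 * is an assumed figure, not from this thread. */
#include <stdio.h>

int main(void)
{
    double shrink_factor = 435.0 / 230.0;   /* ~1.9x area gain, 32nm -> 22nm */
    double ivb_4c_mm2    = 160.0;           /* assumed 4C+GT2 Ivy Bridge die */

    /* Doubling cores, caches and iGPU roughly doubles logic area;
     * one node shrink divides it back down by ~1.9.                         */
    double broadwell_8c  = ivb_4c_mm2 * 2.0 / shrink_factor;
    double next_gen_16c  = broadwell_8c * 2.0 / shrink_factor;

    printf("hypothetical 8C Broadwell-class die:  ~%.0f mm^2\n", broadwell_8c);
    printf("hypothetical 16C die one node later:  ~%.0f mm^2\n", next_gen_16c);
    return 0;   /* both land close to the original ~160 mm^2 */
}
```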

I know there is a problem with just adding more cores, since from a SW perspective it can be hard to utilize them efficiently. That is likely why Intel has instead, over the last 4-5 years, spent the advancements in process technology (i.e. node shrinks) on adding other stuff to the die, such as the iGPU / VRM / integrated memory controller, and lately also on reducing die size to produce chips with lower TDP (important for e.g. Ultrabooks).
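To make the "hard to utilize them efficiently" part concrete, here is a small sketch of Amdahl's law (my framing, not anything from this thread): if only a fraction of a program can run in parallel, extra cores quickly stop helping.

```c
/* Amdahl's law: speedup on n cores = 1 / ((1-p) + p/n),
 * where p is the parallelizable fraction of the program.
 * The 80% figure below is just an example value.          */
#include <stdio.h>

int main(void)
{
    double p = 0.80;                            /* 80% parallelizable code */
    for (int cores = 1; cores <= 16; cores *= 2) {
        double speedup = 1.0 / ((1.0 - p) + p / cores);
        printf("%2d cores -> %.2fx speedup\n", cores, speedup);
    }
    return 0;   /* tops out below 5x no matter how many cores you add */
}
```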

But we are reaching a point where most "stuff" has been integrated on the die, and the iGPU is fast enough for most people. So now the question arises what to do with future advancements in process technology. Here I have to agree with BenchPress that the fact that Intel has decided to add TSX to Haswell is likely a sign of what's to come...

But of course you never know until details about the coming CPU generations start to appear. So based on previous experience, does anyone know when we can expect Intel to present more details on Broadwell (and later CPU generations)? Could it be as early as the next Intel Developer Forum in September? When did details about Haswell start to appear?
 
Last edited:

intangir

Member
Jun 13, 2005
113
0
76
That was 6 years ago. Since we are discussing future Intel CPU generations, looking forward ~4 years, the statement will be 10 years old. Things change and evolve. Once upon a time someone said that nobody would ever need more than 640 kB of RAM, remember?

And by the same token, the more things change, the more things stay the same. Some things remain fundamentally unchanged. RAM capacity has been following Moore's Law, doubling every couple of years, but the number of cores that software uses has not been increasing anything like exponentially.

So no, I don't think it's very valid for Intel's current CPU plans. See the comment by BenchPress, which also points to Intel preparing to release mainstream CPUs with more than 4 cores in the not-too-distant future.

Are we at all sure that TSX wasn't primarily intended for server/HPC software? It could benefit desktop too, but I'm sure it was server and compute uses that actually drove Intel to action now.

So, no, I don't think TSX indicates in any way that Intel will increase the desktop core count; they could more than justify its existence by only increasing their server/workstation core counts.
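For anyone wondering what TSX actually looks like from the software side, here is a minimal sketch using the RTM intrinsics (_xbegin/_xend from immintrin.h). This is my own illustration of the general idea, not code from Intel or from this thread; a production-quality lock elision scheme needs more care (e.g. reading the fallback lock inside the transaction), and it requires a TSX-capable CPU plus something like gcc -mrtm to build.

```c
/* Simplified TSX/RTM sketch: try a hardware transaction first,
 * fall back to a conventional lock if it aborts. */
#include <immintrin.h>
#include <pthread.h>

static pthread_mutex_t fallback_lock = PTHREAD_MUTEX_INITIALIZER;
static long shared_counter = 0;

void increment_counter(void)
{
    unsigned status = _xbegin();              /* try a hardware transaction */
    if (status == _XBEGIN_STARTED) {
        shared_counter++;                     /* executed transactionally   */
        _xend();                              /* commit                     */
    } else {
        /* Transaction aborted (conflict, capacity, interrupt, ...):
         * take the real lock instead.                                      */
        pthread_mutex_lock(&fallback_lock);
        shared_counter++;
        pthread_mutex_unlock(&fallback_lock);
    }
}
```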

AVX and similar do not require that many transistors compared to how much the "transistors per die area" count has grown. See this post, which shows that the number of transistors per core has grown by 200% in the last 6 years, while the transistor count per die area has grown by 800%!

Well, the more cores you add, the higher the proportion of the die that has to be devoted to inter-core communication. Four cores have 6 possible core pairs that need to communicate; eight cores have 28! So doubling the cores more than quadruples the number of wires that must be laid out between them; interconnect scales quadratically with the number of cores. It seems to me that the better investment is in fatter cores rather than more of them. And Intel made sure AVX was expandable to 1024-bit registers, quadrupling the current size. I'm sure four times the execution width would take a significant amount of area!
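A quick sketch of the pair-count arithmetic behind that claim (real CPUs use a ring bus or similar rather than a full point-to-point crossbar, so this only illustrates why the naive approach scales badly):

```c
/* Number of core pairs = C(n,2) = n(n-1)/2, i.e. quadratic growth. */
#include <stdio.h>

int main(void)
{
    for (int cores = 2; cores <= 16; cores *= 2) {
        int pairs = cores * (cores - 1) / 2;   /* C(n,2) core pairs */
        printf("%2d cores -> %3d core pairs\n", cores, pairs);
    }
    return 0;   /* prints 2->1, 4->6, 8->28, 16->120 */
}
```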

The GPU can of course be expanded though, and it can require a lot of transistors. But if all the increased transistor count over the next 4-5 years is spent on the GPU, we will end up with processor chips consisting of roughly 80% GPU cores and 20% CPU cores. Is that realistic to expect?

Right now, GPU compute is probably limited more by memory bandwidth than die space; but if Charlie Demerjian is to be believed, Intel has a solution for that in the form of stacked GPU memory.

http://www.anandtech.com/show/5876/the-rest-of-the-ivy-bridge-die-sizes

Anand also says Haswell will increase the GPU 2.5x from Ivy Bridge, from 16 EUs to 40! I don't expect them to scale the cores nearly that much. And Broadwell will probably continue the same trend, unless I miss my guess.
 

intangir

Member
Jun 13, 2005
113
0
76
But we are reaching a point where most "stuff" has been integrated on the die, and the iGPU is fast enough for most people.

But... isn't the CPU itself fast enough for most people? As long as AMD is still making better GPUs, I think Intel will be devoting die space to outdoing AMD. Hooray competition!

But of course you never know until details about the coming CPU generations start to appear. So based on previous experience, does anyone know when we can expect Intel to present more details on Broadwell (and later CPU generations)? Could it be as early as the next Intel Developer Forum in September? When did details about Haswell start to appear?

I don't know if we'll hear much about Broadwell at the September IDF; it appears to be all about Haswell. It should be just as exciting for me as the September 2010 IDF, when Intel released the slides about the Sandy Bridge microarchitecture. Intel might drop a couple of tidbits about Broadwell, just to remind us it's on the way, but I wouldn't expect much detail until next year, if ever.


https://intel.activeevents.com/sf12/scheduler/catalog.do
SPCS001 Technology Insight: Intel® Next Generation Microarchitecture Code Name Haswell Technology Insight 09/11/12 10:30 am
Abstract: Please join us for an update on Intel’s latest microarchitecture code name Haswell. This innovative new core is built on Intel’s leading 22nm process technology, and will offer significant improvements in performance and energy consumption as well as supporting additional enhancements for graphics and support for powerful new instructions.

ARCS001 Intel® Next Generation Microarchitecture Code Name Haswell: New Processor Innovations Lecture Session 09/11/12 2:00 pm
Abstract: This session will share more information on Intel’s next generation microarchitecture code name Haswell. This new architecture will be built on Intel’s 22nm process technology and includes many architectural features and improvements to boost performance while reducing energy consumption. This session will discuss these features and improvements providing useful insight for both hardware and software developers.
Topics include:
• Haswell’s microarchitecture innovations for performance and power
• Intel® Advanced Vector Extensions 2 (Intel® AVX2) hardware
• Intel® Transactional Synchronization Extensions (Intel® TSX) hardware

GVCS003 Media Innovations in the Next Generation Intel® Microarchitecture, Code Name Haswell Lecture Session 09/12/12 12:45 pm
Abstract: In this session, you will hear about the media innovations (feature, performance and quality) in the new Intel® Microarchitecture code name Haswell. These innovations continue to improve the user experience on client PC platforms and enable exciting new applications.
Topics include:
• Fundamentals and performance of this new media microarchitecture
• Performance and quality improvements for Intel® Quick Sync Video Technology
• Low power media features for Ultrabook™
• New media capabilities through hardware and software co-design
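Since AVX2 and TSX headline the ARCS001 session above, here is a rough taste of what AVX2 adds over AVX: 256-bit-wide integer SIMD. This is a hypothetical sketch of mine, not anything from Intel's session material, and it needs an AVX2-capable CPU plus something like gcc -mavx2 to build and run.

```c
/* AVX2 flavor: add eight 32-bit integers in one instruction. */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    int a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    int b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    int c[8];

    __m256i va = _mm256_loadu_si256((const __m256i *)a);
    __m256i vb = _mm256_loadu_si256((const __m256i *)b);
    __m256i vc = _mm256_add_epi32(va, vb);      /* 8 adds at once */
    _mm256_storeu_si256((__m256i *)c, vc);

    for (int i = 0; i < 8; i++)
        printf("%d ", c[i]);                    /* 11 22 33 ... 88 */
    printf("\n");
    return 0;
}
```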

Ah, looking through past articles, it looks like Intel went into detail about Ivy Bridge at the Sept 2011 IDF. So I'd expect similar levels of detail about Broadwell at the Sept 2013 IDF.

http://www.anandtech.com/show/4830/intels-ivy-bridge-architecture-exposed
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
That was 6 years ago. Since we are discussing future Intel CPU generations, looking forward ~4 years, the statement will be 10 years old. Things change and evolve. Once upon a time someone said that nobody would ever need more than 640 kB of RAM, remember?

Actually "someone" never said that...but the myth is strong.
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
I think Intel has the bigger problem of the "normal" market seeing its integrated GPUs as crap and deciding to go elsewhere.

Those that need more cores will buy them, and those that don't will ignore having more, but everyone needs a GPU that a hardware-accelerated OS can use. With everyone still writing off Intel GPUs, that's where die space will end up going in the future, as those transistors are far more likely to be used at their greatest potential than in a CPU core.

Because what is more massively parallel than rendering a bunch of triangles?
 

denev2004

Member
Dec 3, 2011
105
1
0
I think Intel has the bigger problem of the "normal" market seeing its integrated GPUs as crap and deciding to go elsewhere.

Those that need more cores will buy them, and those that don't will ignore having more, but everyone needs a GPU that a hardware-accelerated OS can use. With everyone still writing off Intel GPUs, that's where die space will end up going in the future, as those transistors are far more likely to be used at their greatest potential than in a CPU core.

Because what is more massively parallel than rendering a bunch of triangles?
That's not a problem on the desktop as long as Intel provides them at a more friendly price, which is definitely not a problem since they are quite good at manufacturing, unlike GF. Also, NVIDIA's GPUs were and will continue to be a good partner for Intel's CPUs in the desktop market until Skylake's release.
 

GrumpyMan

Diamond Member
May 14, 2001
5,778
262
136
So if MS is trying to kill the PC market like everyone says with Metro/W8 blah blah blah, and everyone will start using tablets in a few years, what incentive does Intel have to keep producing six- and eight-core CPUs if they will have no operating system to drive and support them?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So if MS is trying to kill the PC market like everyone says with Metro/W8 blah blah blah, and everyone will start using tablets in a few years, what incentive does Intel have to keep producing six- and eight-core CPUs if they will have no operating system to drive and support them?

Intel is heavily behind Linux btw.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
I think Intel has the bigger problem of the "normal" market seeing its integrated GPUs as crap and deciding to go elsewhere.

Those that need more cores will buy them, and those that don't will ignore having more, but everyone needs a GPU that a hardware-accelerated OS can use. With everyone still writing off Intel GPUs, that's where die space will end up going in the future, as those transistors are far more likely to be used at their greatest potential than in a CPU core.

Because what is more massively parallel than rendering a bunch of triangles?

Some questions arise:

1. If iGPU performance currently is more important than CPU performance, how come we don't have the reverse situation, where AMD (which has better iGPUs but worse CPU cores) holds 80-85% of the desktop processor market?

2. Does the average user really need better iGPU performance, unless they are gaming or doing heavy video editing? If so, for what use? (It can't be for general compute tasks, since you say the CPU is already fast enough for most users.)

3. Assume that in about 4 years (2 node shrinks) we really end up in a situation where most users are satisfied with CPU performance, and where 80% of the transistor budget is spent on iGPU cores and 20% on CPU cores (i.e. no additional CPU cores, with the increased transistor count from 2 node shrinks instead spent on the iGPU). Won't it then make more sense for users to not upgrade their CPU and just buy a new discrete GFX card instead? What will be the key selling point for getting a new Intel CPU for your desktop PC then?
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
And your point is exactly what? That you weren't able to answer any of the questions in my post?

My point is that it will be about 4 years. I'm not interested in iGPU, integrated blah blah garbage. I can't stand the pain of reading words like that. I don't care about integrated GPU garbage so I won't comment on it.
The point of buying a new Intel CPU in 4 years is that it will be faster than what's out now. Get it?
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
My point is that it will be about 4 years. I'm not interested in iGPU, integrated blah blah garbage. I can't stand the pain of reading words like that. I don't care about integrated GPU garbage so I won't comment on it.
The point of buying a new Intel CPU in 4 years is that it will be faster than what's out now. Get it?

No, I don't get it at all. The whole point of my post was that I question whether Intel will really put so much focus on the iGPU that we end up with processors where ~80% of the transistor budget is spent on the iGPU (in 2 node shrinks / 4 years), which some seem to believe.

The reason I question this is that:

* People who really care about GPU performance (like you I assume?) will still buy a discrete GFX card anyway, so 80% of the transistors in the processor will be wasted for them.

* People who don't care about GPU performance don't need an iGPU that consumes 80% of the transistor budget, so 80% (well, not completely, but still) of the transistors in the processor will be wasted for them too.

=> Everyone loses...

In your case (assuming you won't be using the iGPU), wouldn't you rather get a processor where 80% of the transistor budget was not spent on the iGPU? I.e. you'd either get the same processing power but at a lower price, or more processing power but at the same price?
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
No, I don't get it at all. The whole point of my post was that I question whether Intel will really put so much focus on the iGPU that we end up with processors where ~80% of the transistor budget is spent on the iGPU (in 2 node shrinks / 4 years), which some seem to believe.

The reason I question this is that:

* People who really care about GPU performance (like you I assume?) will still buy a discrete GFX card anyway, so 80% of the transistors in the processor will be wasted for them.

* People who don't care about GPU performance don't need an iGPU that consumes 80% of the transistor budget, so 80% (well, not completely, but still) of the transistors in the processor will be wasted for them too.

=> Everyone loses...

In your case (assuming you won't be using the iGPU), wouldn't you rather get a processor where 80% of the transistor budget was not spent on the iGPU? I.e. you'd either get the same processing power but at a lower price, or more processing power but at the same price?

They might use the new die space for crappy GPUs, yes it might happen. It happened with Sandy Bridge and it will happen again. They need to compete in the garbage department with trash king aMd and their close-to-free super cheap iGPU square puke cubes they call CPUs.
The good news is, I bet those cheap iGPUs will have other useful functions for us decent folk who use real GPUs, such as Quick Sync-type stuff. Intel isn't retarded, but they have to compete in the retard department with aMd. Too many people want $200 laptops that feature low power, low cost, low performance, low usefulness, low low low low low

 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
Have you looked at the price of a 6-core processor? You get to a point where only so much data can be shoved through a 32-bit bus. The big question is how you are going to use all these possible extra cores over 4. I think the chipset and computer architecture would have to be completely redesigned to use more cores.

It is all the extra memory registers you have to have to make all this work that increases the cost of a larger processor. On a quad they use memory registers for each individual core, plus there is more memory that is shared.

I could see a motherboard with components mounted on both the front and the back. I think they do this for RAM and video cards now.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,843
5,457
136
Oh, as to what to do with the extra space, I was going to suggest NAND/DRAM/memristors (?).
 

philipma1957

Golden Member
Jan 8, 2012
1,714
0
76
For quite some time the CPU performance increases between CPU generations have been modest and evolutionary. The last significant leap was going from NetBurst (P4) -> Conroe (C2D).

The focus of Ivy Bridge & Haswell seems to be on improving Ultrabook power consumption / battery life / iGPU performance. So when can we expect the next major leap? I guess it would mean Intel providing 8-core mainstream CPUs? Clearly that will not happen with Haswell. So will we have to wait until Broadwell, Skylake, Skymont, or beyond? Has anything been communicated by Intel?

With 4 cores appearing as 8 via Hyper-Threading,

Intel has no need to put out an 8-core appearing as 16 via Hyper-Threading.

My guess is 2020 or 2021.

It is not that they can't do it and sell it. But if you want more than an i7-3700K, go for a 6-core socket 2011 part; if that's not enough, buy a dual-socket mobo.

Your question is easy to answer based on Intel's profit motive.

Delay the 8-core / 16-thread CPU as long as you can. Unless AMD comes up with some really good 8-core / 16-thread CPUs, Intel will stall.

Once again, my guess is based on a whim. I could BS and pretend I have logical reasons like the ones I gave above, but frankly I just like the number 2020.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
Have you looked at the price of a 6-core processor? You get to a point where only so much data can be shoved through a 32-bit bus. The big question is how you are going to use all these possible extra cores over 4. I think the chipset and computer architecture would have to be completely redesigned to use more cores.

It is all the extra memory registers you have to have to make all this work that increases the cost of a larger processor. On a quad they use memory registers for each individual core, plus there is more memory that is shared.

I could see a motherboard with components mounted on both the front and the back. I think they do this for RAM and video cards now.

See this post by AtenRa - and that post discusses current 22 nm technology. One or two node shrinks from now, it should be even less of a problem for Intel to deliver 8-core mainstream CPUs at mid-range prices, if they want to.
 

intangir

Member
Jun 13, 2005
113
0
76
No, I don't get it at all. The whole point of my post was that I question whether Intel will really put so much focus on the iGPU that we end up with processors where ~80% of the transistor budget is spent on the iGPU (in 2 node shrinks / 4 years), which some seem to believe.

You may question it, but Intel's declared intent is that the processor is going to become an "increasingly smaller part" of the die, to make room for graphics and other system functions, as we move towards an SoC market.

http://www.pcworld.com/article/2620..._there_is_a_tablet_or_phonecentric_world.html
PCWorld said:
PCW: Today's Ivy Bridge microprocessors have 1.4 billion transistors. What can a chip with, say, 10 billion transistors do that Ivy Bridge can't?

Otellini: For consumer electronic devices, the trend is moving towards a system on chip. So what the microprocessor becomes in that model is an increasingly smaller part of a system-on-chip die. Right now, the graphics are already much larger than the microprocessor. So you can think about integrating every compute function, and then you start integrating the comm functions over time. So what that gives you is incredibly high-performance small devices that are lower cost because they're single chips and more pervasive.

So in the near term the computers are going to get smarter and smaller and faster, right? What's next is changing the user interface. The big change we'll see is adding voice and gesture and cognitive recognition to computing devices. Those kinds of things are going to revolutionize the way we interface with machines. Changing the user interface is particularly useful with smaller devices that don't have a keyboard.

This also addresses the problem of dark silicon. We're hitting a wall in Dennard scaling in terms of voltage, so per-transistor power consumption is no longer going down with process shrinks. So, even as chip transistor counts keep going up, the number of transistors we can reasonably power is staying the same. Therefore the percentage of a chip that we can actively use within a chip's power budget is dropping. One way to solve this issue is to make portions of the chip more specialized (see this article for example). If the PCW interview is any guide, it sounds like Intel will start integrating more specialized app-specific functions (think QuickSync).
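A toy illustration of that dark-silicon arithmetic (the numbers are made up for illustration; it just assumes the transistor count doubles per shrink while the number of transistors the fixed TDP budget can power stays flat):

```c
/* Dark silicon sketch: with Dennard scaling over, per-transistor power
 * stays flat, so the powerable fraction of the die halves per shrink. */
#include <stdio.h>

int main(void)
{
    double transistors = 1.4e9;   /* Ivy Bridge-ish starting point        */
    double powerable   = 1.4e9;   /* how many fit in today's TDP budget   */

    for (int shrink = 0; shrink <= 3; shrink++) {
        printf("after %d shrinks: %.1f%% of the die can be lit up\n",
               shrink, 100.0 * powerable / transistors);
        transistors *= 2.0;       /* Moore's law: count keeps doubling     */
        /* powerable stays flat: per-transistor power no longer drops      */
    }
    return 0;   /* 100%, 50%, 25%, 12.5% */
}
```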

* People who really care about GPU performance (like you I assume?) will still buy a discrete GFX card anyway, so 80% of the transistors in the processor will be wasted for them.

Well, if HSA takes off, and we're allowed to use those graphics EUs for general-purpose computation, they won't be wasted!
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Most of my favourite games barely utilize 1 core, let alone 4 cores! 6- and 8-core Bulldozers are nothing but a gimmick and a way to have bragging rights with your friends!
 