News Intel GPUs - Falcon Shores cancelled


DrMrLordX

Lifer
Apr 27, 2000
22,369
12,175
136
Sounds like a question for @DAPUNISHER . My guess would be it is more of an IPC than a thread thing but I have no idea.
It stands to reason that the driver processes will be assigned to underutilized cores. Most of these benchmarks are showing the B580 bogging down on older 6-8 core CPUs.
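If anyone wants to actually check that, a minimal sketch (assuming Python with the third-party psutil package; the file name and sampling constants are just made-up examples) would be to log per-core load while a benchmark runs and see whether the driver work really is landing on cores the game already has busy:

```python
# Hypothetical helper: sample per-core CPU utilization while a game/benchmark
# runs, so you can see whether a 6-8 core CPU is saturated when the B580 bogs down.
# Requires the third-party psutil package (pip install psutil).
import csv
import time

import psutil

SAMPLE_SECONDS = 120   # total sampling window; start the benchmark, then run this
INTERVAL = 0.5         # seconds between samples

def log_per_core_load(outfile="core_load.csv"):
    ncores = psutil.cpu_count(logical=True)
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_seconds"] + [f"core{i}" for i in range(ncores)])
        start = time.time()
        while time.time() - start < SAMPLE_SECONDS:
            # cpu_percent() with an interval blocks for that long and returns
            # one utilization figure per logical CPU
            loads = psutil.cpu_percent(interval=INTERVAL, percpu=True)
            writer.writerow([round(time.time() - start, 1)] + loads)

if __name__ == "__main__":
    log_per_core_load()
```

If the 6-core runs show every core pegged while runs on bigger CPUs leave headroom, that would point at threads/overhead rather than pure IPC.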
 

Thunder 57

Diamond Member
Aug 19, 2007
3,283
5,389
136
It stands to reason that the driver processes will be assigned to underutilized cores. Most of these benchmarks are showing the B580 bogging down on older 6-8 core CPUs.

Perhaps. Like I said, it's just my hunch. There's not much data out there yet. Maybe search current benchmarks for a game that doesn't use many threads and see how that performs. Might give some insight.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,448
1,316
136
Do we have any data on how B580 performs with extra cores? For example, 5900X vs 5600X?
I have a 5600 and a 5700X. The 5600 should be overclocked to 4.6-4.7 GHz like the 5600X. The only difference is slightly smoother performance and a few extra fps with 8 cores. I thought 6 cores/12 threads would perform worse, but on Zen 3 it does not make a significant difference. On a new build I would go with an 8-core CPU, but 6-core CPUs are still very capable.

A lot of these game testers chase gimmicks for views without taking a big-picture look. I do not see the benefit of GPU reviews where 1080p is still the backbone of the review in 2024/2025. 1440p and 4K are more important, even if the differences between cards look less impressive at those resolutions.

Gaming at 1440p even on low settings is far superior to 1080p at any level of detail. I also think medium/high settings should be the focal point of reviews, because that represents the real-world settings most gamers use. Ultra settings make sense if you have 12-16 GB+ of VRAM.
 

DrMrLordX

Lifer
Apr 27, 2000
22,369
12,175
136
stop saying that! they are not old!!

Older, not old.

I have a 5600 and a 5700X. The 5600 should be overclocked to 4.6-4.7 GHz like the 5600X. The only difference is slightly smoother performance and a few extra fps with 8 cores. I thought 6 cores/12 threads would perform worse, but on Zen 3 it does not make a significant difference. On a new build I would go with an 8-core CPU, but 6-core CPUs are still very capable.

A lot of these game testers chase gimmicks for views without taking a big-picture look. I do not see the benefit of GPU reviews where 1080p is still the backbone of the review in 2024/2025. 1440p and 4K are more important, even if the differences between cards look less impressive at those resolutions.

Gaming at 1440p even on low settings is far superior to 1080p at any level of detail. I also think medium/high settings should be the focal point of reviews, because that represents the real-world settings most gamers use. Ultra settings make sense if you have 12-16 GB+ of VRAM.
That doesn't really address the problem, though. The B580 is fighting for CPU resources and can't perform well when it can't get them. Adding moar coars should solve the problem, no? For example, bench B580 on 12900k vs 5800X3D or 7800X3D (or similar). X3D CPUs can do quite well against the 12900k with a high-end dGPU, but with B580 are we going to see a different picture?
 

eek2121

Diamond Member
Aug 2, 2005
3,202
4,635
136
Apparently Intel IS dropping a faster part, if the rumor I found is to be believed. It will cost $500-$600 and performance will be where you expect for a part in that range. Unconfirmed source, but given they just announced a launch event, it makes sense.

Wish I had made more effort to follow them. Might’ve been able to dig up juicier stuff.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
30,379
26,927
146
That doesn't really address the problem, though. The B580 is fighting for CPU resources and can't perform well when it can't get them. Adding moar coars should solve the problem, no? For example, bench B580 on 12900k vs 5800X3D or 7800X3D (or similar). X3D CPUs can do quite well against the 12900k with a high-end dGPU, but with B580 are we going to see a different picture?
That's interesting and would make good content. However, it does not resolve the issues the B580 is facing as an entry-level card. PCIe 3, ReBAR, and older CPUs all rob performance disproportionately versus the competition. Upscaling and ray tracing are advantages over AMD that vanish without a beefy enough CPU.

The 12900K is only 3 years old. 10th gen is listed as the minimum requirement; that's what I want to see tested thoroughly. How badly does an i7-10700K on PCIe 3 hold back the B580? I suspect it will be worse than the Ryzen 3600 in games that suffer from PCIe 3, and sit between it and the 5600 in others.
 

mikk

Diamond Member
May 15, 2012
4,286
2,366
136
Apparently Intel IS dropping a faster part, if the rumor I found is to be believed. It will cost $500-$600 and performance will be where you expect for a part in that range. Unconfirmed source, but given they just announced a launch event, it makes sense.

Wish I had made more effort to follow them. Might’ve been able to dig up juicier stuff.

What launch event?
 

511

Golden Member
Jul 12, 2024
1,496
1,333
106
Apparently Intel IS dropping a faster part, if the rumor I found is to be believed. It will cost $500-$600 and performance will be where you expect for a part in that range. Unconfirmed source, but given they just announced a launch event, it makes sense.

Wish I had made more effort to follow them. Might’ve been able to dig up juicier stuff.
Can you share the link for the source?
 

511

Golden Member
Jul 12, 2024
1,496
1,333
106
The 12900K is the most risk-free option, or a 265K if you believe Intel's BIOS update.
 

DrMrLordX

Lifer
Apr 27, 2000
22,369
12,175
136
The 12900K is the most risk-free option, or a 265K if you believe Intel's BIOS update.
If the 265K is cheap enough, that would be another option. Right now it's $360 (approximately), so the 12900K has it beat on price.

Regardless, the idea would be to get some data on whether the B580 can benefit from a CPU with more than 6-8 cores. In fact, just benching a 12900K with E-cores enabled and disabled, with the B580 and some roughly equivalent NV/AMD video cards, would be illuminating.
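Short of swapping CPUs, one rough way to approximate that comparison on a single chip is to restrict the game's CPU affinity. A sketch of the idea (Python with psutil again; the game path and core lists are placeholders, and affinity masking is only an approximation since cache, boost behavior, and E-core vs P-core differences don't go away):

```python
# Hypothetical helper: launch a game pinned to a subset of logical CPUs to
# approximate "fewer cores" (or E-cores off) on one machine.
# Requires the third-party psutil package; path and core lists are placeholders.
import subprocess

import psutil

GAME_EXE = r"C:\Games\SomeGame\game.exe"   # placeholder path

# Example sets for a 12-core/24-thread part; adjust to your actual topology.
AFFINITY_SETS = {
    "6c12t":  list(range(12)),
    "12c24t": list(range(24)),
}

def launch_with_affinity(label):
    proc = subprocess.Popen([GAME_EXE])
    # Child threads inherit the process affinity set here.
    psutil.Process(proc.pid).cpu_affinity(AFFINITY_SETS[label])
    return proc

if __name__ == "__main__":
    launch_with_affinity("6c12t")
```

It's crude, but if average fps and 1% lows don't move at all between the two affinity sets, that would suggest the B580 isn't starving for threads so much as for per-core performance.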
 
Reactions: 511

DavidC1

Golden Member
Dec 29, 2023
1,362
2,222
96
If the 265K is cheap enough, that would be another option. Right now it's $360
Regardless, the idea would be to get some data on whether the B580 can benefit from a CPU with more than 6-8 cores. In fact, just benching a 12900K with E-cores enabled and disabled, with the B580 and some roughly equivalent NV/AMD video cards, would be illuminating.
If the driver is so reliant on the CPU, and especially on per-core (uarch) performance, then Intel solutions with E-cores could potentially be catastrophic, the opposite of what people are expecting. You would be way behind Zen 2 levels.

Instead, I would assume disabling the E-cores would make it perform quite a bit better.

There's no evidence - other than hopium - to suggest that it would perform dramatically better on Intel platforms, and that is going to feel bad for some people.
 

511

Golden Member
Jul 12, 2024
1,496
1,333
106
If the driver is so reliant on the CPU, and especially on per-core (uarch) performance, then Intel solutions with E-cores could potentially be catastrophic, the opposite of what people are expecting. You would be way behind Zen 2 levels.

Instead, I would assume disabling the E-cores would make it perform quite a bit better.

There's no evidence - other than hopium - to suggest that it would perform dramatically better on Intel platforms, and that is going to feel bad for some people.
Yes, I just want to be sure they are not doing shenanigans.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,283
5,389
136
Not a big issue. He can get a boost with a cheap 5700X3D without changing the whole platform.

I've already gone from a 2600X to a 5700X on the same platform. Indeed, I could get a 5700X3D, but I'm not exactly playing the latest games. I'll probably go to Zen 5 or 6 next unless Intel surprises us.
 
Reactions: Tlh97

DrMrLordX

Lifer
Apr 27, 2000
22,369
12,175
136
If the driver is so reliant on the CPU, and especially on per-core (uarch) performance, then Intel solutions with E-cores could potentially be catastrophic, the opposite of what people are expecting. You would be way behind Zen 2 levels.

Instead, I would assume disabling the E-cores would make it perform quite a bit better.

There's no evidence - other than hopium - to suggest that it would perform dramatically better on Intel platforms, and that is going to feel bad for some people.

If that's the case, then going back to one of my earlier comments: bench the B580 on a 5600X vs a 5900X and see what changes.
 

DavidC1

Golden Member
Dec 29, 2023
1,362
2,222
96
I think the main difference is that AMD has been improving consistently over the past 8 years, while Intel was stuck from 2016 until Alder Lake in 2021. Also, many people on AMD can just swap out the CPU, while Intel forces you to upgrade the platform.

So anyone on recent Intel is on Alder Lake or newer, which is Zen 3-level performance, while with AMD you see more varied levels. Zen 2 vs Zen 3 seems to be the dividing line where one is usable and the other isn't.

If it is indeed the case that higher core counts matter, the 3900X should do OK against the 5600X, but I have heard of user experiences where the line is drawn between Zen 2 and Zen 3. And I have heard of far fewer problems on Intel 12th gen (Alder Lake again), even on low-end parts like the 12400.
 
Reactions: Tlh97

DavidC1

Golden Member
Dec 29, 2023
1,362
2,222
96
HWUB retests in a variety of conditions with the 5600 vs the 9800X3D:

Looked at the important parts.

In Cyberpunk, adjusting the crowd settings affects performance differently depending on the CPU. And it seems there's a certain "cap" to the peak frame rate. The 9800X3D shows bottlenecks, he says; no surprise. You could tell simply from 1440p doing a lot better.

What I'm puzzled by is how some are saying Alchemist was better in terms of overhead. Did I understand Bionic_Squash right that the driver branch for BMG is actually 2-3 months behind ACM, and that once it catches up it'll do better?

Some comments on C&C:
Very well written! Hopefully, these insights are allowed past the marketing department, managers, and other bureaucratic mechanisms that "protect" the developers. Hopefully, the driver team has enough bandwidth to address this, compared to working on new features, new hardware, or fixing other bugs.
Some of the issues you can see in the graphs indicate architectural issues, such as batching strategy and DMA chunk sizes. Other things could be hardware choices, such as implementing less of the DMA mastering on the GPU side (i.e., using Windows paging to transfer data to the GPU instead of sending a physical address to the GPU DMA engine).
Some of these architectural decisions may be due to their crazy focus on trying to save on costs/die size and taking shortcuts. Ironically, because their hardware sucks, it ended up being bloated anyways.

The part about DMA points to another symptom of their iGPU mindset.
That ISR/DPC graph is crazy. Spending 1/3 of a CPU-second in ISRs for every 1 second of wall clock time running a single threaded DX11 application is insane. Even if they moved that into a DPC, that's still going to be disruptive to other latency-sensitive applications on the system. I'd be curious to know how evenly distributed they are across CPUs and whether it's a lot of short ISRs or a few long ones that add up.

It's unlikely that it's contributing significantly to the poor 3DMark API Overhead test results at these kind of frame rates, but it's certainly interesting that Intel can't seem to do Independent Flip on Vulkan swapchains. Having to context switch to DWM to present each frame isn't free and almost certainly is costing them significant performance in very high frame rate Vulkan applications. Nvidia's driver can also directly present Vulkan swapchains like you saw on the AMD test.
The above comment implies the overall driver code is terrible.
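For anyone wanting to sanity-check the ISR/DPC figure quoted above on their own system, Windows exposes interrupt and DPC time through the same CPU-time counters psutil reads; a rough sketch (same caveats as the earlier snippets, and the fields simply won't exist on non-Windows platforms):

```python
# Hypothetical check: how much CPU time goes to interrupts (ISRs) and DPCs
# during a benchmark window. On Windows, psutil.cpu_times() includes
# "interrupt" and "dpc" fields; getattr() guards other platforms.
import time

import psutil

def measure_isr_dpc(duration_s=60):
    before = psutil.cpu_times()
    wall_start = time.time()
    time.sleep(duration_s)          # run the game/benchmark during this window
    after = psutil.cpu_times()
    wall = time.time() - wall_start

    isr = getattr(after, "interrupt", 0.0) - getattr(before, "interrupt", 0.0)
    dpc = getattr(after, "dpc", 0.0) - getattr(before, "dpc", 0.0)
    print(f"ISR time: {isr:.2f}s, DPC time: {dpc:.2f}s over {wall:.1f}s wall clock")
    print(f"Per wall-clock second: ISR {isr / wall:.2f} CPU-s, DPC {dpc / wall:.2f} CPU-s")

if __name__ == "__main__":
    measure_isr_dpc()
```

Numbers anywhere near the 1/3 of a CPU-second per wall-clock second mentioned in that comment would be easy to spot this way.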

I think Graphics Lisa is also at fault here. She's been there for many, many years now. And I don't think they are a mere 1-2 generations behind. I think they are 3-4 generations behind.
 
Last edited:
Reactions: 511 and Tlh97