News Intel GPUs - Battlemage officially announced, evidently not cancelled


Mopetar

Diamond Member
Jan 31, 2011
8,084
6,695
136
Ryan Shrout

Yeah, I believe.

I wonder if we're all actually living in Ryan Shrout's personal hell. Ever since he started working at Intel he's had to market products that don't really stack up against the competition in the same way they did historically.

Trying to market ARC to gamers is probably a bit like pushing a boulder up a hill for all eternity. Just when you think you've almost got it to the top a new game comes out and the drivers don't work and it all goes tumbling down.
 

Tup3x

Golden Member
Dec 31, 2016
1,072
1,064
136
I wonder if we're all actually living in Ryan Shrout's personal hell. Ever since he started working at Intel he's had to market products that don't really stack up against the competition in the same way they did historically.

Trying to market ARC to gamers is probably a bit like pushing a boulder up a hill for all eternity. Just when you think you've almost got it to the top a new game comes out and the drivers don't work and it all goes tumbling down.
New games are likely not going to be a problem. As long as Intel makes sure developers start optimising for their GPUs, things should not be that bad (especially if they release game-ready drivers on time). The problem is the zillion old games that are not optimised for Arc and have no driver optimisations.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,477
24,200
146
All joking aside, Intel might be picking the right battles. Better to capture young gamers spending most of their time on current F2P games. Like McDonald's, you want that name recognition with them early in life. Look how many brand-loyal members the other two have. It's also smart to follow the others in having your GPUs associated with new AAA titles: work with the game developers and throw them money to optimize for your products.

I am looking forward to seeing how it all plays out. I have systems for older games. So I would not hesitate to buy an Intel ARC for new ones, if they are the best bang for buck.
 

gdansk

Platinum Member
Feb 8, 2011
2,839
4,221
136
People will accept subpar performance on old games if
1. the price is lower (hard to beat used RTX 3k and RX 6k series now)
2. new games work well (we'll see)

Some people, kids usually, don't have a bunch of old games they play. I'm the opposite as I mainly play old games.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
That probably means many, many years later, if their GPUs are still around by that time. I guess they are so fatigued by the whole effort that they just want to sell something now and worry about compatibility later. So anyone who buys ARC now is buying it for current and (hopefully) future games. Not past games.

Oh well, that means a lot of Steam sale deals will be worthless to them, and they can have smaller game libraries. Just pointing out an advantage of going with ARC.
You don't need many years for optimizations that target general improvements. You can see from some results that it's underperforming relative to Iris Xe: it's faster, but not by as much as the specs suggest. Those fixes will happen quickly.

Also, the graphics architecture is still a distant derivative of the 2006 GMA X3000, so they are not starting entirely from the ground up.

You cannot keep putting things off and stay in theory forever if you want to really improve. Sometimes you have to get out there and see what you can do. That's why they need to ship the dGPUs even if they're rough. Even Tom Petersen says they need feedback.

The performance issues they are having are likely in some ways hardware as well. I'm not suggesting it's inherently weak, just that their experience is all in integrated graphics. So with Battlemage you'll see not only the driver side improve but the architecture get more fine-tuned.

From a startup's point of view, what they are going through is nothing.

I wonder if we're all actually living in Ryan Shrout's personal hell. Ever since he started working at Intel he's had to market products that don't really stack up against the competition in the same way they did historically.

Trying to market ARC to gamers is probably a bit like pushing a boulder up a hill for all eternity. Just when you think you've almost got it to the top a new game comes out and the drivers don't work and it all goes tumbling down.

Ryan Shrout is kept in the background in that interview, so they seem to realize it's best not to use him as a frontman all the time, even though he's technically high up in marketing.
 
Feb 4, 2009
35,200
16,657
136
All joking aside, Intel might be picking the right battles. Better to capture young gamers spending most of their time on current F2P games. Like McDonald's, you want that name recognition with them early in life. Look how many brand-loyal members the other two have. It's also smart to follow the others in having your GPUs associated with new AAA titles: work with the game developers and throw them money to optimize for your products.

I am looking forward to seeing how it all plays out. I have systems for older games. So I would not hesitate to buy an Intel ARC for new ones, if they are the best bang for buck.
While I haven't educated myself about the sub-par older-game performance, I agree with what you said, and want to add: is it really relevant when your CPU and RAM are magnitudes faster? Does anyone really need 189 fps in Team Fortress? Can any human being detect fps that high? Does it matter, if you have good-enough performance in older games?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
While I haven't educated myself about the sub-par older-game performance, I agree with what you said, and want to add: is it really relevant when your CPU and RAM are magnitudes faster? Does anyone really need 189 fps in Team Fortress? Can any human being detect fps that high? Does it matter, if you have good-enough performance in older games?

Yes, you can, because our senses are relative, not absolute. It's been observed with things such as screen flickering on laptops at low brightness: some individuals experience mild sickness at flicker frequencies approaching a kHz. Besides that, you get used to it after being on it for months or years.

Also, for competitive matches and players, you want to minimize any fault in the tool. Even games like LoL are quite fast-paced and twitchy. You also want the worst-case performance to be very high (see the sketch below).

These things also have to take competition into account. If one card gets 200 fps and the other gets 300 for the same power use, form factor, noise and cost, you'd get the 300 fps device even if it doesn't matter.
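
Side note, since "worst-case performance" is what reviewers' 1% low figures try to capture: here is a minimal sketch of how average vs. 1% low fps fall out of raw frame times (Python; the sample frame times are invented purely for illustration).

```python
# Derive average fps and "1% low" fps from a list of frame times in ms.
# The sample data is made up purely to illustrate the arithmetic.
frame_times_ms = [4.2, 4.5, 4.1, 9.8, 4.3, 4.4, 12.1, 4.2, 4.6, 4.3]

# Average fps: total frames divided by total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": average fps over the slowest 1% of frames (at least one frame,
# since this sample is tiny). Two cards can share the same average while
# differing wildly here, which is why worst-case figures matter.
slowest = sorted(frame_times_ms, reverse=True)
tail = slowest[: max(1, len(slowest) // 100)]
low_fps = 1000 * len(tail) / sum(tail)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_fps:.0f} fps")
```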
 

Frenetic Pony

Senior member
May 1, 2012
218
179
116
While I haven't educated myself about the sub-par older-game performance, I agree with what you said, and want to add: is it really relevant when your CPU and RAM are magnitudes faster? Does anyone really need 189 fps in Team Fortress? Can any human being detect fps that high? Does it matter, if you have good-enough performance in older games?

Human flicker detection goes up to 500fps: https://www.nature.com/articles/srep07861
 

moinmoin

Diamond Member
Jun 1, 2017
5,063
8,025
136
Human flicker detection goes up to 500fps: https://www.nature.com/articles/srep07861
The finding is intuitive as well: the higher the contrast, the higher the frequency has to be for flicker not to be detected. Humans can easily see lightning even though a flash lasts about 1000-100 µs, down to 5 µs for single strokes (though it's debatable whether one can actively see those). That corresponds to theoretical frame rates of 1,000-10,000 fps, and 200,000 fps respectively. That's obviously the worst case of the highest possible contrast in nature. Screens are comparably dim next to lightning and the sun, so they achieve much lower absolute contrast and don't need such high frequencies to avoid flicker. It may seem ironic, but the lower the screen brightness, the lower the susceptibility to flicker.
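
The arithmetic behind those numbers is just the reciprocal of the flash duration; a trivial sketch (whatever one makes of equating flash duration with frame rate, on which see below):

```python
# Equivalent "frame rate" of a flash is the reciprocal of its duration.
# Durations are the lightning figures quoted above, in microseconds.
for duration_us in (1000, 100, 5):
    fps = 1_000_000 / duration_us  # microseconds -> events per second
    print(f"{duration_us:>5} us flash ~ {fps:>9,.0f} fps equivalent")
```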
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,323
5,433
136
The finding is intuitive as well: the higher the contrast, the higher the frequency has to be for flicker not to be detected. Humans can easily see lightning even though a flash lasts about 1000-100 µs, down to 5 µs for single strokes (though it's debatable whether one can actively see those). That corresponds to theoretical frame rates of 1,000-10,000 fps, and 200,000 fps respectively. That's obviously the worst case of the highest possible contrast in nature. Screens are comparably dim next to lightning and the sun, so they achieve much lower absolute contrast and don't need such high frequencies to avoid flicker. It may seem ironic, but the lower the screen brightness, the lower the susceptibility to flicker.

One flash time is meaningless, and completely unrelated to FPS. Sure, you can see one brief flash, incredibly brief if it's incredibly bright; it's simply about getting enough photons on the sensor to be detected. But that isn't remotely related to FPS.

For there to be FPS, there must be more than one frame.

FPS is more about the interval you can detect between two flashes. In the 50-100 Hz range they will look like one flash, unless there is motion to separate the flashes spatially. This is known as the flicker fusion threshold.

The paper above is looking at an extreme edge-case artifact that you won't see on a normal display. Basically it's citing the rainbow effect on DLP displays: when you combine on-screen motion with the serial color flashing and the motion of your eyes, colors get misplaced and you get color fringing on the edges.

You won't get this artifact on displays that have continuous color (like LCD/OLED/MicroLED).

Artifacts on oddball displays aren't evidence that you need 1000 Hz monitors.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,323
5,433
136
Back to the card. A780 debunked:


Pretty sad for Intel that their top card will only be competing with last generation 6 series cards from AMD/NVidia.
 
Feb 4, 2009
35,200
16,657
136
Human flicker detection goes up to 500fps: https://www.nature.com/articles/srep07861
One flash time is meaningless, and completely unrelated to FPS. Sure, you can see one brief flash, incredibly brief if it's incredibly bright; it's simply about getting enough photons on the sensor to be detected. But that isn't remotely related to FPS.

For there to be FPS, there must be more than one frame.

FPS is more about the interval you can detect between two flashes. In the 50-100 Hz range they will look like one flash, unless there is motion to separate the flashes spatially. This is known as the flicker fusion threshold.

The paper above is looking at an extreme edge-case artifact that you won't see on a normal display. Basically it's citing the rainbow effect on DLP displays: when you combine on-screen motion with the serial color flashing and the motion of your eyes, colors get misplaced and you get color fringing on the edges.

You won't get this artifact on displays that have continuous color (like LCD/OLED/MicroLED).

Artifacts on oddball displays aren't evidence that you need 1000 Hz monitors.

And all this is happening while concentrating on a game?
I am suspicious.
I wish someone would do a large-scale test. I suspect very, very few of us are capable of detecting very high frame rates.
 
Reactions: scannall

Tup3x

Golden Member
Dec 31, 2016
1,072
1,064
136
Back to the card. A780 debunked:


Pretty sad for Intel that their top card will only be competing with last generation 6 series cards from AMD/NVidia.
Well, they don't have to compete at the top, and they never planned to. AMD was nowhere near the top for years and didn't even try to compete there. With Battlemage, Intel may or may not try.
 
Reactions: pcp7
Feb 4, 2009
35,200
16,657
136
Well, they don't have to compete at the top, and they never planned to. AMD was nowhere near the top for years and didn't even try to compete there. With Battlemage, Intel may or may not try.

And that's fine with me. I have never owned a top-tier card (except maybe my Voodoo 3).
I don't really like AMD video cards, yet they have been the majority of what I've owned over the years, because of good-enough performance and excellent prices.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,323
5,433
136
Well, they don't have to compete at the top, and they never planned to. AMD was nowhere near the top for years and didn't even try to compete there. With Battlemage, Intel may or may not try.

But they are FAR from the top, and it looks like they were originally positioned to compete against 7-series cards.
 

Tup3x

Golden Member
Dec 31, 2016
1,072
1,064
136
But they are FAR from the top, and it looks like they were originally positioned to compete against 7-series cards.
You have to start somewhere. The RTX 3060 Ti is really close to the RTX 3070 (so close that I think it's a really weird product), so they are pretty much there. With better drivers it might start trading blows with it.

Also keep in mind that they already showed the A750 can be up to 17% faster than the RTX 3060, so this probably doesn't tell the whole story. If the stars align perfectly it might be faster than an RTX 3070, but their drivers are in a pretty bad state, so results will be all over the place.
 
Jul 27, 2020
19,613
13,477
146
It's statements like this that leave me confused.
I don't see any confusion. Nvidia markets their cards very well, with desirable features like better ray-tracing performance and DLSS. However, a lot of gamers have trouble stomaching their prices, along with the smaller amounts of RAM, so they go with AMD as a compromise. That doesn't mean they would if they had a choice.
 
Reactions: scannall
Feb 4, 2009
35,200
16,657
136
It's statements like this that leave me confused.
As I've documented here:
Nothing exciting about most of my previous AMD cards: all were sort of loud, or put a lot of heat into the room, or both. Never liked any of the various software packages that came with them. Basically nothing exciting about them at all, except offering good-enough performance for $50-100 less.
 

maddie

Diamond Member
Jul 18, 2010
4,878
4,951
136
I don't see any confusion. Nvidia markets their cards very well, with desirable features like better ray-tracing performance and DLSS. However, a lot of gamers have trouble stomaching their prices, along with the smaller amounts of RAM, so they go with AMD as a compromise. That doesn't mean they would if they had a choice.
As I've documented here:
Nothing exciting about most of my previous AMD cards: all were sort of loud, or put a lot of heat into the room, or both. Never liked any of the various software packages that came with them. Basically nothing exciting about them at all, except offering good-enough performance for $50-100 less.
Every decision has compromises. Choosing mainly AMD, to me, means that they offered the best compromise each time.

I'm confused by anyone not realizing that this means it was the best for them, at that time, based on their requirements. Not that it was the "best" overall, whatever that means.

I would be driving a Bugatti if reality didn't intrude, but alas, it does.
 
Feb 4, 2009
35,200
16,657
136
Every decision has compromises. Choosing mainly AMD, to me, means that they offered the best compromise each time.

I'm confused by anyone not realizing that this means it was the best for them, at that time, based on their requirements. Not that it was the "best" overall, whatever that means.

I would be driving a Bugatti if reality didn't intrude, but alas, it does.

I'm confused how you're confused.
To me, AMD cards aren't exciting, unless cost/performance charts are what excites you.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Hmm. From the seller's (AMD's) point of view, they have succeeded, have they not?

Also, you must have thought the pros outweighed the cons compared to Nvidia, because that's how I would have read it.
 

Tup3x

Golden Member
Dec 31, 2016
1,072
1,064
136
NVIDIA has offered what I need. There are a few things that I really like and would find hard to live without.

Forcing anisotropic filtering for DX11 games works, while in AMD's case it doesn't (or at least they officially say it doesn't, and some say it's really buggy in DX9 games too). It's surprising how many games have broken AF, and forcing it through the driver makes a massive difference. Driver FXAA is also much higher quality than the in-game versions, which is handy for legacy games with downsampling. Then there's NULL (NVIDIA Ultra Low Latency), which makes it really simple to ensure you never see tearing on an adaptive-sync display (likewise the v-sync settings in general). The ability to enhance the MSAA level was really useful in games that only offer something like "off, low and high" settings (AMD could do EQAA, i.e. their version of CSAA, but those didn't always work properly). SWTOR was one good example: high was 4x, but through the driver it was possible to enhance that to 8x, and forcing AA in that game just wouldn't work without issues. With NVIDIA Profile Inspector it's possible to tweak things further (especially for legacy games), roughly as sketched below.

NVIDIA should completely remake their control panel... It's prehistoric, and browsing profiles is a real pain.

It remains to be seen what Intel's driver offers... I really hope someone does a proper review of their driver: what it offers and whether it actually works.
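
For anyone who hasn't dug through those menus, the overrides described above look roughly like this as a per-game profile in the NVIDIA Control Panel / Profile Inspector (a sketch only; the option names are from memory and may differ between driver versions):

```python
# Hypothetical per-game override profile of the kind described above.
# Option names approximate the NVIDIA Control Panel UI and may not match
# a given driver version exactly.
swtor_profile = {
    "Anisotropic filtering": "16x",            # force AF where the game's own AF is broken
    "Antialiasing - Mode": "Enhance the application setting",
    "Antialiasing - Setting": "8x",            # enhance SWTOR's in-game 4x MSAA to 8x
    "Antialiasing - FXAA": "On",               # driver FXAA for legacy games
    "Low Latency Mode": "Ultra",               # NULL
    "Vertical sync": "On",                     # with adaptive sync: no tearing
}

for setting, value in swtor_profile.items():
    print(f"{setting}: {value}")
```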
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,477
24,200
146
The brand preference inspires a question. Is Intel joining AMD in the open source movement?

Nvidia R&Ds proprietary tech. They also tried to lock Nvidia users into buying more expensive G-Sync monitors for VRR. These decisions make me pull for the other teams. Before we end up with a "Now all restaurants are Taco Bell" future.
 