Question are video card prices headed down yet?

Page 109 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
After I pick up a 3080 FE or a 40-series equivalent, they can do whatever they want for about 10 years without my permission... LOL The way it's looking, I may get a 3090 easier

A 3080 would not be the best card for a ten-year timeline. With only 10GB of VRAM, which is already not enough for some games, in ten years it will fall wildly short. Even if you are being facetious and it turns out to be 5-7 years, that's still not going to be great unless you are fine with dropping settings because of memory usage rather than because the GPU itself cannot handle it.
 
Reactions: igor_kavinski

mastertech01

Moderator Emeritus Elite Member
Nov 13, 1999
11,875
282
126
A 3080 would not be the best card for a ten-year timeline. With only 10GB of VRAM, which is already not enough for some games, in ten years it will fall wildly short. Even if you are being facetious and it turns out to be 5-7 years, that's still not going to be great unless you are fine with dropping settings because of memory usage rather than because the GPU itself cannot handle it.
True, but in 10 years I will be 77, much slower than the games are..
 

Aapje

Golden Member
Mar 21, 2022
1,467
2,031
106
Haha, I am not saying Nvidia will fail; I am thinking they will continue to pivot to highly profitable AI, HPC, and other non-retail-facing markets. Even professional graphics, with the big $$$ per card.

They will be forced to focus on their highest-margin product lines and projects because that's business. The consumer arm has to be valuable, but I can't believe it's the most profitable segment they hold.

Yeah, in many ways you are already subsidizing cards designed for non-gaming if you buy a consumer card from them. For example, RT cores were added for business use cases and then they went looking for ways to use them for consumers, which is why they came up with DLSS. They later released DLSS 1 for CUDA cores (for Control). If they had no business use case for RT cores, they would never have added them just for DLSS, but would have released DLSS for CUDA cores in the first place.

Once Nvidia switches to chiplets, they may separate out the business stuff as much as they can. Then you only get CUDA on the Quadro cards, where that card has the same gaming chiplet you get as a consumer, but also an extra business chiplet.

A 3080 would not be the best card for a ten-year timeline. With only 10GB of VRAM, which is already not enough for some games, in ten years it will fall wildly short.

Nvidia will probably also underspec the new generation, with a 12 GB 4070 Ti and a 10 GB 4070. In general, they like to leave previous generations behind, as we saw with DLSS and DLSS 3. So AMD may be a better choice if you want longevity. Especially if you can tier up for the same price.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
I am not as worried about running low on vRAM these days despite memory usage hitting the 'limit'. 10GB vs 12GB isn't nearly as much to worry about as 2GB vs 4GB. This does make AMD's cards seem more enticing.

Is it possible to see vRAM usage split into used vs. buffered? On Linux we can see how system memory is being used, but Task Manager will show barely any free. Not exactly comparable to vRAM.
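The Linux-side distinction being drawn here (truly "used" memory vs. cache the kernel will reclaim) can be sketched from the real `/proc/meminfo` fields. A minimal sketch, using a hard-coded sample in place of a live read:

```python
# Sketch of the Linux memory breakdown: separate truly "used" memory from
# buffers/cache the kernel gives back under pressure. Field names are the
# real /proc/meminfo keys; the sample text stands in for a live read of
# the file on an actual system.

SAMPLE_MEMINFO = """\
MemTotal:       16384000 kB
MemFree:         1024000 kB
Buffers:          512000 kB
Cached:          6144000 kB
"""

def memory_breakdown(meminfo_text: str) -> dict:
    """Return total/free/reclaimable/used figures in kB."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        fields[key] = int(rest.split()[0])  # each value reads "<n> kB"
    reclaimable = fields["Buffers"] + fields["Cached"]
    used = fields["MemTotal"] - fields["MemFree"] - reclaimable
    return {
        "total_kb": fields["MemTotal"],
        "free_kb": fields["MemFree"],
        "reclaimable_kb": reclaimable,
        "used_kb": used,
    }

print(memory_breakdown(SAMPLE_MEMINFO))
```

On a live system you would pass in `open("/proc/meminfo").read()` instead of the sample; Task Manager shows no comparable split, which is the point the post makes.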



Launching KSP with a 16k texture pack will use all my vRAM right away. But launching Cyberpunk and having them run simultaneously doesn't show GPU performance degradation. I am not confident treating vRAM usage alone as evidence of a bottleneck.
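One hedged way to put numbers on this: query used vs. total VRAM with `nvidia-smi`. The query flags below are the tool's real ones; the sample CSV line stands in for a machine without an Nvidia GPU.

```python
# Sketch: read used vs. total VRAM from nvidia-smi's CSV query output.
# The flags are real nvidia-smi options; parse_vram is demoed on a sample
# line so the sketch runs without a GPU present.
import subprocess

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_vram(csv_line: str) -> tuple:
    """Return (used_mib, total_mib, percent_used) from one CSV line."""
    used, total = (int(x) for x in csv_line.split(","))
    return used, total, 100.0 * used / total

def live_vram() -> tuple:
    """Query the first GPU; raises if nvidia-smi is not installed."""
    out = subprocess.check_output(QUERY, text=True)
    return parse_vram(out.splitlines()[0])

# Sample line in the format those flags produce (used, total, in MiB):
print(parse_vram("9721, 10240"))
```

Note the caveat this thread raises: "used" here includes buffers a game allocated but may not actually need, so a near-100% reading alone doesn't prove a bottleneck.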
 

blckgrffn

Diamond Member
May 1, 2003
9,197
3,183
136
www.teamjuchems.com
Yeah, in many ways you are already subsidizing cards designed for non-gaming if you buy a consumer card from them. For example, RT cores were added for business use cases and then they went looking for ways to use them for consumers, which is why they came up with DLSS. They later released DLSS 1 for CUDA cores (for Control). If they had no business use case for RT cores, they would never have added them just for DLSS, but would have released DLSS for CUDA cores in the first place.

Once Nvidia switches to chiplets, they may separate out the business stuff as much as they can. Then you only get CUDA on the Quadro cards, where that card has the same gaming chiplet you get as a consumer, but also an extra business chiplet.

Exactly. I've felt they had a solution looking for a problem (adding professional/HPC silicon like CUDA to all their consumer parts in order to increase adoption of their professional solutions) and did a fantastic job marketing a problem they discovered. Backing into it like this also explains why they had such a "lead" in this area for so long, as it was tangential to the typical approach.

It is my feeling that if AMD had been the one to include this type of hardware and then "pioneer" this solution, Nvidia would have done their best to rip it apart and shout to the heavens how impure this approach was.

Regardless, we have upscaling as an accepted tech now, and ironically I think it only accelerates iGPU relevancy, for it's not big $1600 cards that really benefit from this tech so much as those just barely hitting FPS targets. An iGPU running 720p scaled to 1080p might look a little iffy, but if it runs well? Plenty of Fortnite/Apex/Siege machines out there in the hands of kids who just want their games to work and are getting their wishes granted.

A weird thing over at Slickdeals: the bemoaning of 1660 Supers and AMD 5500 GPUs in PCs with 5700G/5600G CPUs. "Why bother when the integrated GPU is so good?" smh, I definitely don't agree, but that says something about how they are starting to be perceived.
 

adamge

Member
Aug 15, 2022
56
135
66
Yeah, in many ways you are already subsidizing cards designed for non-gaming if you buy a consumer card from them. For example, RT cores were added for business use cases and then they went looking for ways to use them for consumers, which is why they came up with DLSS. They later released DLSS 1 for CUDA cores (for Control). If they had no business use case for RT cores, they would never have added them just for DLSS, but would have released DLSS for CUDA cores in the first place.

Once Nvidia switches to chiplets, they may separate out the business stuff as much as they can. Then you only get CUDA on the Quadro cards, where that card has the same gaming chiplet you get as a consumer, but also an extra business chiplet.

Intel has already shown the capability of fusing off certain CPU features, or gating them behind a subscription license key. Why hasn't Nvidia done this on their consumer GPUs for the parts of them you consider professional?

I guess my point is, you say Nvidia can only do this once they go to chiplets. But it seems to the casual observer that they could do this with existing technology.
 
Jul 27, 2020
17,853
11,645
116
Launching KSP with a 16k texture pack will use all my vRAM right away. But launching Cyberpunk and having them run simultaneously doesn't show GPU performance degradation.
It's possible now to have two applications use the GPU simultaneously??? When did this happen? :-O
 

blckgrffn

Diamond Member
May 1, 2003
9,197
3,183
136
www.teamjuchems.com
Intel has already shown the capability of fusing off certain CPU features, or gating them behind a subscription license key. Why hasn't Nvidia done this on their consumer GPUs for the parts of them you consider professional?

I guess my point is, you say Nvidia can only do this once they go to chiplets. But it seems to the casual observer that they could do this with existing technology.

They already limit things in the driver. NVENC is an easy example: it is much more "capable" on Quadros that have the same silicon as non-Quadros. Certain types of math are also more accelerated on Quadro cards compared to the consumer lines, sort of like AVX-512 on Xeon but not on consumer chips, for example.

It's already being done, but it could be done even more extensively, letting Nvidia differentiate their product lines more cost-effectively; right now their Quadros are in many ways the same chips but with capabilities enabled. Chiplets can help take it to the next level.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
It's possible now to have two applications use the GPU simultaneously??? When did this happen? :-O

Since both those games lack DRM, you can run the executables directly and keep Steam from complaining that another game is running. But this has been possible for a while.
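Bypassing the launcher really is just starting the executables yourself. A minimal sketch, with two trivial processes standing in for the game binaries (any real install paths are hypothetical):

```python
# Sketch: launch multiple DRM-free games at once by starting their
# executables directly instead of through Steam. The demo commands are
# trivial stand-ins; in practice each would be a game's install path,
# e.g. something like [r"C:\Games\KSP\KSP_x64.exe"] (hypothetical).
import subprocess
import sys

def launch_all(commands):
    """Start every command concurrently and return the process handles."""
    return [subprocess.Popen(cmd) for cmd in commands]

procs = launch_all([
    [sys.executable, "-c", "print('game one running')"],
    [sys.executable, "-c", "print('game two running')"],
])
exit_codes = [p.wait() for p in procs]  # both run concurrently until here
print(exit_codes)
```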
 
Reactions: igor_kavinski

jpiniero

Lifer
Oct 1, 2010
14,831
5,444
136
Pretty much. In the next 5-10 years, the unwashed masses of PC gaming will likely have excellent options that require no dGPU at all. It's a slow crawl, but we'll get there. The sub-$100 GPU market has already disappeared, taking so many great (lol) slow cards with it. The sub-$200 market is next, imo.

The nodes getting more and more expensive is going to affect AMD's CPU products too though. Chiplets are a reasonable way to mitigate this but as long as AMD uses expensive nodes for the chiplets it will only go so far. To keep costs down I see AMD keeping the IGP size down to the point where there will continue to be a wide gap between it and the entry level dGPU, even if that dGPU is $500. And once current gen games show up, I assume that APUs slower than the Series S are not going to be playable. That might take another 2-3 years the way things are going though.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,873
3,226
126
I'm here to inject many negative sentiments.

You can't, cuz right now scalpers are getting burned hardcore trying to scalp the 4080s.
The only thing you can complain about is why Nvidia still thinks they can get miner prices on their GPUs, when AMD was smart enough to realize they wouldn't be able to sell theirs at miner prices, and will pretty much dominate the lower, mid, upper-low, and upper-mid market.

Intel has already shown the capability of fusing off certain CPU features, or gating them behind a subscription license key.

So they're going to make us pay a sub fee for ray tracing... or a sub fee to use compute on them and not games?
Intel GPUs are garbage so far... no one wants them at the current time... if they charge subs, they've officially nailed the coffin shut for gamers who would actually buy them due to the low price.
But it's not even that low if you look at how much AMD has slashed the prices on their GPUs.
 
Reactions: moonbogg

amenx

Diamond Member
Dec 17, 2004
4,005
2,275
136
I have a strong feeling the 4080 will drop to $999 pretty soon and scalpers will be back to buying it. It will start flying off the shelves at that price. So will the 7900 XTX when it releases; it will probably sell out pretty quick. I doubt I will be able to pick up any of these cards at a reasonable price any time soon, so I'm looking at perhaps a few months ahead to when I will be able to.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,873
3,226
126
I have a strong feeling the 4080 will drop to $999 pretty soon and scalpers will be back to buying it. It will start flying off the shelves at that price.

I don't think so... not with a 9.1% inflation rate and an almost zero return on investment on GPUs after the Eth bubble got wrecked.
Also, there is no GPU-minable coin that is profitable.
You need to remember most of the GPUs went to miners, not gamers.
Miners paid whatever they could for them because it would return interest in a few months.
Gamers got the short end of the stick.
Now there is no return on investment, so no miners will buy GPUs unless they're speculating, and they are mostly moving to ASICs anyway.
And again, with a 9.1% inflation index and no EDD Covid payouts, few people, unless you're retired with an ample 401k fund to burn through, will buy a $1000 GPU without wondering whether they can make ends meet for the month.

If Nvidia wants to sell the 4080... try $699-799, the original MSRP on the 3080; otherwise it's gonna be a brick taking up storage.
Most of the 4090s were bought by scalpers, and most went to reviewers, content streamers, or guys like AdamK.
Hell, I couldn't even get one at launch as they were all sold out, and I really do not mind waiting it out for the 4090 Ti, after all the scalpers get burned hardcore on the 4080, learn their lesson that GPUs are not profitable anymore, and go back to scalping PS5s along with Nike shoes.


I hope they all get burned and move on to something else.
Anyone who buys from them is stupid, as you don't get a warranty when you buy a scalped card; it's a third-party sale, and most vendors do not honor warranties on third-party sales.

Woah woah woah, that's the only thing he can complain about? I think that demonstrates a severe lack of imagination on your part


You're right... moonbogg will probably complain about how no AIB makes a pink or purple PCB and how it's all black and white.
Blah... pink PCB for the win!
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I am not as worried about running low on vRAM these days despite memory usage hitting the 'limit'. 10GB vs 12GB isn't nearly as much to worry about as 2GB vs 4GB. This does make AMD's cards seem more enticing.

Is it possible to see vRAM usage split into used vs. buffered? On Linux we can see how system memory is being used, but Task Manager will show barely any free. Not exactly comparable to vRAM.

Launching KSP with a 16k texture pack will use all my vRAM right away. But launching Cyberpunk and having them run simultaneously doesn't show GPU performance degradation. I am not confident treating vRAM usage alone as evidence of a bottleneck.

Benchmarks show that in some cases the 3080 tanks where the 6800XT does not, due entirely to memory.
 

Trefugl

Member
Dec 3, 2013
32
20
81
Don't settle for less than a 12GB frame buffer if you are buying on that timeline

This is why I just finally gave up and bought a used 3090. I'd previously been running an AMD 290 (in CrossFire before that became useless). Now that I'm back into gaming/VR/professional apps, I need CUDA and VRAM, and gave up on getting a good price/fps ratio. At least it was the used market and not the silly inflated retail prices.
 

blckgrffn

Diamond Member
May 1, 2003
9,197
3,183
136
www.teamjuchems.com
This is why I just finally gave up and bought a used 3090. I'd previously been running an AMD 290 (in CrossFire before that became useless). Now that I'm back into gaming/VR/professional apps, I need CUDA and VRAM, and gave up on getting a good price/fps ratio. At least it was the used market and not the silly inflated retail prices.

Nice! You got that PNY from the FS/FT section! That's a huuuuuuuuuuuuge upgrade from your 290 (I was running a 290x for about 6 years, it was a workhorse for sure) and wow, welcome to now in terms of all the things! Congrats, I hope it runs for as long as the 290 did
 
Reactions: igor_kavinski

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
I have a strong feeling the 4080 will drop to $999 pretty soon and scalpers will be back to buying it. It will start flying off the shelves at that price. So will the 7900 XTX when it releases; it will probably sell out pretty quick. I doubt I will be able to pick up any of these cards at a reasonable price any time soon, so I'm looking at perhaps a few months ahead to when I will be able to.

$1000 MSRP means $1200+ AIB pricing anyway. If I'm spending that much, I'm not getting a 4080. It sucks compared to the 4090 and needs to be priced at $700. However, after seeing what happened during the shortage, nothing would surprise me anymore. If they drop the MSRP by $200 and AIB cards only drop by $100, I'd expect some to lose their minds and panic buy the piss out of them.
Also, if the 7900 XTX barely matches the 4080 or even loses to it, then that thing will hardly sell at all. No one is paying $1k for an AMD card that matches or loses to an Nvidia card at the same price.
 
Jul 27, 2020
17,853
11,645
116
I traded away my 3060 Ti FE and PNY 3080 10GB in disgust, in return for the base M1 MacBook Air and $80 cash.

Damn Nvidia and their low VRAM tactics.
 
Reactions: blckgrffn

Ranulf

Platinum Member
Jul 18, 2001
2,407
1,305
136
Pretty much. In the next 5-10 years, the unwashed masses of PC gaming will likely have excellent options that require no dGPU at all. It's a slow crawl, but we'll get there. The sub-$100 GPU market has already disappeared, taking so many great (lol) slow cards with it. The sub-$200 market is next, imo.

I became a believer when I realized that an only slightly tuned 3400G provided a very similar gaming experience to a GTX 950 when I used them both. The add-in card was better, but hundreds of dollars better? And plenty of folks will play at 1080p/30fps/settings the game picked when it launched, because they are there to game, not play "software configuration simulator".

Eh, people have been saying that for 5 years now. I'm still skeptical. It reminds me of the network folks who have for even longer been saying 10Gb consumer networking gear will hit the SOHO market any day now at reasonable prices. We've got $100 2.5Gb switches now, at least.
 

Trefugl

Member
Dec 3, 2013
32
20
81
Nice! You got that PNY from the FS/FT section! That's a huuuuuuuuuuuuge upgrade from your 290 (I was running a 290x for about 6 years, it was a workhorse for sure) and wow, welcome to now in terms of all the things! Congrats, I hope it runs for as long as the 290 did
That's the one! And yeah, it'll be a huge upgrade - about 4-5x the fps and 6x the VRAM by my estimates. The 290 was definitely a workhorse. About the time I decided to upgrade a few years back is when the market went insane. I've been biding my time since, but figured I'll never really get a good deal on Nvidia and lots of VRAM anyway. The 4090 is the only alternative/upgrade for what I need, and waiting for that to drop below $1000 will be a long, long time.
 