VirtualLarry
No Lifer
- Aug 25, 2001
I hate when that happens.
It was a SKU shift. The real successor to the 5850 was the 6950 (both $300, as I recall).
Imagine if the 4060 Ti was $230 instead of $400; absolutely no one would be complaining about its performance.
See that's the thing. With the massive increase in wafer prices, they wouldn't do that.
HWUB seems to have a new take on the 4060 Ti. It isn't good. How do they expect people to buy 16GB versions of these things?
Yeah, I posted that one on the previous page.
I have no idea how they did something this bad, and thought it would go over well.
I wonder how many GPU buyers don't read any reviews at all, or how many unscrupulous salesmen will just unload these on the unsuspecting.
And the AI crowd in particular, I predict, will absolutely get this card, since the 4080 is the only other option with 16+GB, and it's much more expensive. NV just carved out an AI niche at a 'lower' price point that is distant from the expensive 4080 and 4090 in both price and performance, but the VRAM is badly needed for AI, so these people will definitely buy it.
I bought the 580 8GB because it was a little faster than the 1060 6GB for $70 less ($229 vs $299). AMD threw 8GB of RAM at the 580 because they knew that without it they'd get their asses kicked by NV even harder than they did. Some folks definitely picked it up because it was offering more RAM with no consideration of its actual real-world performance.
I might add that I don't like this either, but with regards to AI, which I read a lot about, most people are only considering NV. And the 4060Ti 16GB looks like the only budget option as older cards dry up.
Interesting take on the 4060 and how current reviews are done: a 4060 vs 7600 at 1440p running on a 5600G CPU. I think his numbers disprove his belief that 1440p is the new entry-level standard and that 1080p is on the way out.
Seems like he is being wrong deliberately. And he knows it.
Actually Nvidia can do it, because that's what their customers taught them.
It's a bad argument that's made over and over. I don't want GPU benchmarks to end up being CPU benchmarks, and vice versa, by virtue of using low-end parts. I want to know how powerful the GPU is on an absolute scale, to get an idea of how it's going to age when I'm in the market to buy.
HUB Steve responded.
I do miss that review site where they tested max playable settings in games, just can't quite remember its name...
HardOCP??
Yup, that's the one. Their benchmarks were good even if their editorials were sometimes questionable.
The methodology on all these reviews sucks. I reckon Anandtech should start doing GPU reviews and show them how it's done.
What's missing from a HWUB GPU review?
If time weren't a factor, there would be a lot of settings to test for each game, plus a visual comparison of those settings. Obviously too much work, and often pointless to include in a round-up. But each time a new game comes out, it is interesting to see.