What tier of GPU is necessary to power a 24" LCD, for modern games with high detail?

Page 2 - AnandTech community forums

videogames101

Diamond Member
Aug 24, 2005
6,777
19
81
"but atm I feel Nvidia has the edge in drivers/power."

That is the statement that was originally questioned, and it is a statement that cannot be refuted, because as worded it's based completely on opinion derived from experience. I've owned many Nvidia and ATI cards and have seen how the trends go. Obviously, if you want to prove a point one way or the other, it's very easy to pick out a particular card or driver set, point at it, and say how awesome or horrible it is. Both Nvidia and ATI have had dogs in the race. However, in the single-GPU world, where most of us reside, ATI hasn't been able to compete with the GTX 200 series in power, the same way AMD hasn't been able to beat Intel's i7 line. Does that mean ATI sucks? Absolutely not. They make great gear and have very recently come out with some killer cards. In general, ATI cards are a much better value for the money.

I'm judging their performance over the past 24 months, and in my opinion Nvidia has maintained a stronger product line, especially since they integrated the PhysX driver. Benchmarks can support that. ATI has nothing to compete with it at the moment, and until they do, ATI users are going to get shafted in games like Batman: Arkham Asylum. I hope that changes soon. It's time ATI took its turn at the top of the hill.

I know they have a new line coming out that is supposed to be amazing, so we'll see how that goes. I guess we have both ATI and Nvidia fanboys out there, so no matter what I type some people will disagree, and that's fine. Just remember, those are just opinions too.

How is trying to force the entire dev community to use a proprietary physics engine a plus?
 

MStele

Senior member
Sep 14, 2009
410
0
0
How is trying to force the entire dev community to use a proprietary physics engine a plus?

It's not a plus, but it's not a conspiracy theory either. Nvidia has integrated physics (PhysX), a feature that developers use to improve their products. ATI does not offer a comparable alternative at the moment. That's the long and short of it. Basically, there are two sides to this: those who want developers to include physics support, and those who believe it should be left out so that ATI isn't left in the cold. Personally, I believe developers should use whatever tools they want. They aren't being forced to use the physics engine; they use it because it saves them CPU cycles that can be used for something else. If that means Nvidia gets supported before ATI, then that's just the way it will be until ATI steps up and develops something to compete. Nvidia isn't evil for promoting their tech; it's just good business. They were smart to see its application early on and buy the company.

Developers can choose to either support or not support Nvidia's chip. I have a feeling it's all going to be for naught, because eventually MS will add a physics module to DirectX, which will solve all the problems.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
"but atm I feel Nvidia has the edge in drivers/power."

That is the statement that was originally questioned, and it is a statement that cannot be refuted, because as worded it's based completely on opinion derived from experience. [...] Just remember, those are just opinions too.

OK, here's a better one: your opinion sucks. Drivers are good from both companies atm.
 

terentenet

Senior member
Nov 8, 2005
387
0
0
How is trying to force the entire dev community to use a proprietary physics engine a plus?
I do not support proprietary features. I'm all for OpenCL physics, but until then, Nvidia can play however they want with PhysX. Remember, they paid for Ageia, so it's Nvidia's right to ask for money from whoever wants to use PhysX; I don't see anything wrong with that.
Also, devs are not "forced" to use PhysX, but if they want a GPU-accelerated physics module, PhysX is the only one available ATM.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,882
3,230
126
Any GeForce GTX 260 or better should be able to do it no problem. I have a 260 Core 216 and I pretty much max everything at 1920x1200. On occasion I might have to lower AA/AF to 4x/8x, but everything is fully playable, and that's with V-sync. IMO, I would go with a GeForce GTX 285, which would cover your bases nicely. ATI makes some nice stuff too, but atm I feel Nvidia has the edge in drivers/power.

No, you're wrong...


^ he is correct.

MAX settings: impossible. Medium settings: piece of cake.

I'm sorry, my single HD 4870 X2 couldn't pull that off in some games, which is why I stepped up to CrossFire with 4 GPUs.

Max settings on a 24-inch monitor is expensive.

OP... the answer really depends on the game. I don't see you running into any problems with a Valve game like TF2 or L4D.

However, go on a Crytek game, or even Borderlands with everything maxed out in a junkyard-type map with max shadows and everything, and you will have FPS lag.
 