Originally posted by: AmberClad
The OP has spent a lot of time working on this
- AmberClad
Originally posted by: djayjp
I DO suspect, however, that the increase/decrease in performance from one setting to another is likely to be rather close to 2x/one half.
Crysis Warhead
DirectX 10 ENTHUSIAST 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 13.03 Max: 24.78 Avg: 20.02 ]
DirectX 10 GAMER 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 22.65 Max: 44.25 Avg: 33.96 ]
DirectX 10 MAINSTREAM 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 33.72 Max: 70.88 Avg: 51.25 ]
DirectX 10 PERFORMANCE 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 58.18 Max: 137.67 Avg: 98.44 ]
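The Avg figures above let the "close to 2x" suspicion be checked directly; a quick sketch (the numbers are copied from the benchmark runs above):

```python
# Average fps per Crysis Warhead quality setting, from the runs above
avgs = [("Enthusiast", 20.02), ("Gamer", 33.96),
        ("Mainstream", 51.25), ("Performance", 98.44)]

# Ratio between each setting and the next one down
for (lo_name, lo_fps), (hi_name, hi_fps) in zip(avgs, avgs[1:]):
    print(f"{lo_name} -> {hi_name}: {hi_fps / lo_fps:.2f}x")
```

The step-to-step ratios come out between roughly 1.5x and 1.9x, so close to, though not exactly, the suspected 2x/one half.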
Originally posted by: djayjp
Yes, Garrity, I took the shortcut once I realized that the 4670 offered great performance on that game. I've already done the overall performance/price comparison, so I just stuck with the 4670; it should be about 90% accurate between most games.
Originally posted by: AmberClad
LOUISSSSS (and others) - The OP has spent a lot of time working on this, so even if you disagree with the analysis, you can stand to be a lot less abrasive/provocative in your responses.
djayjp - Mudslinging is not OK, especially when someone is just offering constructive criticism. I've been pretty lenient so far because you're a new member, but this is the last time I'll ask you to stop the name-calling and yelling.
Originally posted by: Kelvrick
This guide is horrible. I don't even want to spend the time saying how horrible it is.
Then why are you wasting time by thread crapping??
Originally posted by: edplayer
I have only read a small part of this thread
Seems like you pissed a lot of people off. Guess some people get emotional when you point out their silly obsession over a few fps.
I was waiting for the 4670 to come out a while back, since it seemed like it would be the "bang for the buck" card to get, but it kept getting delayed. I ended up buying a 9600GT for $54 after rebate (before taxes). Very happy with the card. The 4670 then launched at around $80, so I was also happy with my choice for that reason.
Originally posted by: djayjp
Very nearly 8800gt performance at 1/4 the cost from what one would have spent a year ago!
Originally posted by: LOUISSSSS
Originally posted by: djayjp
Very nearly 8800gt performance at 1/4 the cost from what one would have spent a year ago!
why do you keep saying that? nobody cares about last year's performance. we care about PRESENT DAY.
8800gts 320mb - $99
is this card a good value? it offers DOUBLE x1900xt performance.
nobody will be using your guide or even understand it because you're not giving any REAL LIFE examples. you keep saying $1 = 5fps, and we're like HUH? use real life cards and real life people that play a few different games.
if you're not going to use the perfect real-life example i gave a few posts above, please make up a good REAL LIFE example of how to use your guide.
Originally posted by: LOUISSSSS
isn't the 9600gt the same gpu as the 8800gt? a year later, shouldn't it be cheaper AND faster?
8800gts 320mb - $99
is this card a good value? it offers DOUBLE x1900xt performance.
Seems like you pissed a lot of people off.
I believe I am due some modest appreciation regarding my rather insightful and innovative observation that a 1:1 performance/price ratio is poor value from a strictly performance/price standpoint; this means that even if higher end cards could increase their performance by the same ratio that their price increases (which they fall FAR short of), a 100% increase in performance is NOT (from a strictly P/P perspective) worth a 100% increase in price.
So, if card A gives 50fps and card B gives 75fps (a 50% increase in performance) but costs 50% more money, then it's NOT worth the added cost; in fact, I argue that you are actually paying MORE for LESS (even though both yield the same fps/$ figure and card B performs 50% better). Thanks to progress/Moore's Law, we should get MORE performance for the SAME amount of money (well, adjusting for inflation).

In terms of Moore's Law, the performance/cost ratio between generations should always be about 2:1, going by theoretical performance. In other words, if card A gives 50fps, card B is worth it when it delivers roughly 70-90fps or more of real-world performance without costing significantly more. Anything less performance-wise, or anything more cost-wise, is probably not worth it (unless of course you need it for a specific application).

So, a 1% increase in price should improve --theoretical-- performance by 2% to be worth it; 'theoretical' meaning an increase in, say, stream processors or clock speed, rather than real-world performance measured by framerate, which will always be lower. 70-90% of theoretical specs is a pretty good range to expect for real-world performance. Assuming an 85% efficiency (which might be a little optimistic) on a 2x theoretical performance upgrade, one can expect a real-world boost of about 1.7x. This is the figure I chose to go with because I put a focus on value, but a more conservative estimate might be in the 1.4x-1.6x range. In fact, based on a fair bit of statistical analysis between generations (averages of several benchmarks across the 7800 GTX, 8800 GTX, and GTX 280), a 1.5x average increase in real-world performance seems to occur.

To go into more detail, one way to determine the price/performance ratio is to divide the framerate by the price; a higher relative number is better. To compare the price/performance ratio between two cards, always compare within the same exact benchmark!
Say card A is rated 0.5 (fps divided by price; e.g., 50fps at $100) and card B is rated the same (0.5; e.g., 100fps at $200). Even though card B has a higher actual framerate and the same performance/price ratio, it is NOT the better buy, because it costs more: although its performance is 100% higher, its price is also 100% higher. Card B has a 1:1 ratio of increase vs. card A.

Remember, we should be getting MORE performance for the SAME money, OR the SAME performance for LESS money (not the same for the same, which is a poor 1:1 ratio). Imagine if every time you walked into a computer store wanting a higher-performing CPU, it simply cost proportionally more (not counting inflation): at one point a 2GHz CPU cost $200, and then the next generation came out (with twice the performance, say achieved with twice the clock speed) and the new 4GHz CPU cost $400...! That is a 1:1 ratio of increase. Soon, no one could (or would want to) buy anything!

So, a truly good buy/deal/upgrade (of course depending on your applications/needs) compared to card A in this example would have a real-world fps/price rating of about 0.85 or higher (vs. 0.5, i.e., 1.7x higher) for the same price, or a theoretical performance/price rating of 1.0 or better (if you're too lazy to look at benchmarks). So card A, with a lower actual framerate, has a better price/performance ratio than card B even though both might be rated 0.5 (fps divided by price).

This kind of thinking, that double the performance is worth double the money, has led to the huge and hugely expensive monster cards of today, and accordingly to the crappy price/performance ratios of these ultra high-end cards (*cough* GTX 280 *cough*, especially at its launch price, yikes!). Interestingly, you can relate this thinking to power consumption trends as well (like performance/watt).
Certainly, you can find poor examples of price/performance at the low end too (or nearly anywhere in the spectrum of cards/prices). Take, for example, the current 9500 GT: with at most 1/3 the performance of the 4670 for about the same price, that's terrible value! To make a very vague, general statement, the ideal price/performance typically falls in the $75-200 range.
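The worked example above (card A: 50fps at $100, card B: 100fps at $200, required rating 0.85) can be sketched in a few lines of Python; the 1.7x factor and the card numbers are the ones used in the example, not measurements:

```python
def pp_rating(fps, price):
    """Raw price/performance rating: frames per second per dollar."""
    return fps / price

card_a = pp_rating(50, 100)   # 0.5 fps/$
card_b = pp_rating(100, 200)  # also 0.5 fps/$, but at twice the price

# Under the argument above, a worthwhile buy at card A's price point
# should carry roughly 1.7x card A's rating:
required = card_a * 1.7       # 0.85 fps/$
print(card_b >= required)     # prints False: card B falls short of the bar
```

The point the code makes is the same as the text's: identical fps/$ ratings do not make the pricier card an equally good deal.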
Re: the 9600gt quote, you still don't get it... it's RELATIVE P/P value. It's timeless.
Every performance/price comparison I have seen online merely takes the fps divided by the price; my guide/tool goes further, giving you a more accurate assessment of a card's relative performance/price value by factoring in the absolutely factual Moore's Law/generational progress (the variable for which is estimated using statistical analysis of a variety of benchmarks across the 7800 GTX, 8800 GTX, and GTX 280).
Originally posted by: garritynet
Seems like you pissed a lot of people off.
It's not that he pissed us off; it's that he is so completely enamored with his supposed 'tool'. There is no tool, just bad math, guesses, and assumptions. When critiqued, he just lashes out in nerd rage to defend his pet 'tool'.
I'm curious, how would YOU accurately define the Moore's law/generational progress rate, then?
I believe I am due some modest appreciation regarding my rather insightful and innovative observation that a 1:1 performance/price ratio is poor value from a strictly performance/price standpoint; this means that even if higher end cards could increase their performance by the same ratio that their price increases (which they fall FAR short of), a 100% increase in performance is NOT (from a strictly P/P perspective) worth a 100% increase in price.
Modest, are we?
How about the least bit of appreciation, then?
To say this (which might be his tool, I dunno):
IF Current fps * 1.4-1.7 < Expected FPS THEN upgrade.
Compare Fps/$$$ for all cards that exceed 30fps AND meet the upgrade requirement. The card with the highest Frames per $ is the ideal Pr/Per upgrade.
However even this is unnecessary. All you need is
IF Current Card does not do what you want THEN buy cheapest card that does.
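garritynet's pseudocode above could be made runnable roughly as follows; the thresholds (1.5x, 30fps) and the example cards are placeholders, not the thread's data:

```python
def best_upgrade(current_fps, candidates, factor=1.5, min_fps=30.0):
    """candidates: list of (name, fps, price) tuples.
    Keep cards that clear min_fps AND beat the current card by `factor`,
    then pick the one with the most frames per dollar."""
    qualifying = [c for c in candidates
                  if c[1] >= min_fps and c[1] >= current_fps * factor]
    if not qualifying:
        return None  # no card meets the upgrade requirement
    return max(qualifying, key=lambda c: c[1] / c[2])  # highest fps/$

# Hypothetical cards: (name, avg fps, price in $)
cards = [("card A", 40.0, 80.0), ("card B", 75.0, 150.0), ("card C", 90.0, 250.0)]
print(best_upgrade(45.0, cards))  # card B: both B and C qualify, B has more fps/$
```

At 45fps current, the upgrade bar is 67.5fps; cards B and C both clear it, and B wins on frames per dollar.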
No, it's not frames per $. Try again. OK, one more time: you take the difference in performance of two cards expressed as a percentage, and the difference in price expressed as a percentage; if the ratio of each percent increase in performance vs. each percent increase in price isn't about 1.5x or greater for the more expensive/higher-performing card, then it's not worth it. E.g., very simply, if card B costs 100% more, it should perform at least about 150% better; not too complicated, is it?
UPDATE: sorry, I meant that if one card costs 100% MORE it should perform at least 50% BETTER / have 150% of the performance.
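Taken with the correction, the rule reads: for every 100% of extra cost, demand at least 50% of extra performance. A minimal sketch (the card numbers are illustrative, not from the thread):

```python
def worth_it(fps_a, price_a, fps_b, price_b):
    """True if pricier card B justifies its premium under the corrected
    rule: performance gain >= half the price increase."""
    price_gain = price_b / price_a - 1.0  # 1.0 means "costs 100% more"
    perf_gain = fps_b / fps_a - 1.0
    return perf_gain >= 0.5 * price_gain

# Card B costs 100% more but is only 30% faster: not worth it.
print(worth_it(50, 100, 65, 200))   # False
# At 60% faster for 100% more money, it clears the 50% bar.
print(worth_it(50, 100, 80, 200))   # True
```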
If you enjoy limiting the information at hand (and therefore most likely making a less optimal decision), then sure, that's all you need.
Timeless to you. Real-world Pr/Per is decided by the other cards that occupy the same market segment, not some ratio that exists in a vacuum.
What I was obviously trying to say was that while the 8800gt was a pretty good value THEN at its original price relative to its competition, the 9600gt at that price relative to its competition is an even better value NOW, relatively-speaking, i.e., time accounted for. Are you trying to argue that a 9600gt for $54 ISN'T a proportionately/relatively better deal (P/P wise)?
Originally posted by: nitromullet
Prices are from newegg, searching for card and sorting by lowest price. (Rebates not factored in)
4870 X2 557.60fps/$499.99 = 1.12 fps/$
4870 595.80fps/$224.99 = 2.65 fps/$
4850 536.50fps/$159.99 = 3.35 fps/$
4830 (Tom doesn't have this card yet, apparently)
4670 388.60fps/$75.99 = 5.11 fps/$
Looks like I was dead on (well, I might be wrong about the 4830), all without a formula. How did I know that? It boils down to the simple fact that newer tech generally costs more, and has a worse P/P ratio. The highest end cards tend to have the newest tech, should perform the best, but offer a worse return on investment dollar for dollar...
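The fps/$ column above can be reproduced mechanically; a minimal sketch using the totals and prices quoted in the post (the 4830 is omitted, as there):

```python
# Total benchmark fps and Newegg price, copied from the post above
cards = {
    "4870 X2": (557.60, 499.99),
    "4870":    (595.80, 224.99),
    "4850":    (536.50, 159.99),
    "4670":    (388.60,  75.99),
}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price:.2f} fps/$")
```

This recovers the same 1.12 / 2.65 / 3.35 / 5.11 fps/$ figures listed above.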
Well, actually, there is a bit of a formula there. Ah ha... but if you're a less-knowledgeable consumer, you might think the 9500gt has a great P/P. It doesn't; it's a turd. In fact, it would seem to have worse fps/$ than the 4870, at 2.044 fps/$ (about 2.5x worse than the 4670). The 4830 outperforms the 4850 in terms of P/P. Oh yeah, and why the heck does the 4870x2 have a lower absolute framerate than the 4870? Maybe the fps is too high? Seems like a terrible benchmark (for the 4870x2 at least). The problem with your method here is that it is completely ignorant of Moore's Law/generational increases. One of my main points is that just taking the fps divided by $ gives you a crappy picture. In fact, the disparity is 1.5x worse than what it appears above with my method (i.e., going by the increases in performance you get for free between generations; in other words, you're losing at least 50 cents of every dollar more you spend than is optimal).
Originally posted by: garritynet
Better deal P/P wise? Sure, whatever. Does anyone need to do math to figure that out? No. Does P/P have anything to do with whether or not a 9600gt will run COD4 on my monitor? No.
My point through this whole debate is that your tool is a whole lot of nothing. Graphics cards will obey Moore's Law. Insightful. Don't buy cards that aren't a good deal. Innovative.
I'm not going to respond to such comments anymore... you have my answer and my reasoning.
Thanks so much for representing my entire argument as being merely: graphics cards will obey Moore's Law; don't buy cards that aren't a good deal.
pathetic.
In fact, the disparity is 1.5x worse than what it appears above with my method
Originally posted by: garritynet
The 4870 is a good deal priced at $200 AR. That your method fails to recognize that shows that it is faulty. This "free increase in performance" is actually not free at all. Remember: Moore's Law is more of an observation than an actual law. ATI worked their butts off to deliver the 4870, and they charge for all that R&D too. That cards on the lower end of the spectrum offer a better fps/$ has nothing to do with Moore's Law.
So he still got it right... just the disparity is different? So his method still works when it comes to finding the best 'value' for your money.
That my method disagrees with your subjective opinion (which is presented with nothing else to back it up) demonstrates no fault. All my method says is that there is a better deal than such cards as the 4870 (even with that insane rebate).
I have no idea what you're talking about regarding Moore's Law not applying to lower-end cards. If Moore's Law is not your thing, then how about simply a statistical regularity: performance tends to improve at a certain predictable rate. The rate I have observed across GPU generations is 1.5x.
The answer is that this simpler method sort of works, but not really. It sort of works when the differences are large and at low price points; basically it finds a starting point (or bottom) for comparisons with more expensive cards. Using his method, you will inevitably, mistakenly conclude that cards offering 1:1 P/P are worthwhile. In terms of comparing percentage increase in price vs. percentage increase in performance, I disagree, and so does Gordon Moore.
Really though, use your brain for a moment and you'll realize this is not at all a controversial point... everyone agrees with it. Otherwise, the next time you went to buy a graphics card, instead of paying $200 you'd have to pay $300 (200 x 1.5) to get the usual generational performance increase, assuming a usual increase of 1.5x (this is how a 1:1 view sees the world; idiotic, isn't it?). But in what publications have you ever seen this point raised when they do P/P comparisons? They simply take fps/$ or $/fps.
Originally posted by: djayjp
***UPDATE: changed my recommended values from 1.7x to 1.4x-1.6x...
Originally posted by: nitromullet
The point that you keep missing is that your recommended values are in fact your opinion based on your interpretation of Moore's Law.
1) Moore's Law is concerned with the number of transistors, and it is well known that video card performance does not always correlate directly with the number of transistors in the GPU.
2) Your recommended figure changes. Absolute truths generally don't. Pi, for example, has not changed my entire life, while your truth seems to have undergone a metamorphosis in just one weekend:
Originally posted by: djayjp
***UPDATE: changed my recommended values from 1.7x to 1.4x-1.6x...
...it's your opinion. Nothing wrong with posting your opinion in a forum, but accept the fact that some will disagree with you or find less value in it than you do.
Like I said, what makes this guide/tool so useful is the fact that this figure should be timeless (barring some unforeseen massive performance changes unlike any we have yet seen).