Next leap in graphics?

Joined: Feb 19, 2009 · Messages: 10,457 · Reaction score: 10 · Points: 76
No leaps until consoles get a drastic power UP!

Designing games to run on the lowest common (and profitable) denominator doesn't inspire graphics limits to be pushed.
 

blastingcap
Diamond Member
Joined: Sep 16, 2010 · Messages: 6,654 · Reaction score: 5 · Points: 76
Quote:
No leaps until consoles get a drastic power UP!

Designing games to run on the lowest common (and profitable) denominator doesn't inspire graphics limits to be pushed.

This.

And even then, gamedevs would need to invest more in graphics, which I don't think they have the budget for. Hence my suggestion of their pooling resources to build a master texture library. Yeah, it'll never happen, blah blah, but it'd be more efficient and free up more budget for other things like art direction.
 

Jaydip
Diamond Member
Joined: Mar 29, 2010 · Messages: 3,691 · Reaction score: 21 · Points: 81
blastingcap said:
This.

And even then, gamedevs would need to invest more in graphics, which I don't think they have the budget for. Hence my suggestion of their pooling resources to build a master texture library. Yeah, it'll never happen, blah blah, but it'd be more efficient and free up more budget for other things like art direction.

That's a neat idea, but you will still need considerable raw horsepower to decompress it. Remember the megatextures from Rage.
 

blastingcap
Diamond Member
Joined: Sep 16, 2010 · Messages: 6,654 · Reaction score: 5 · Points: 76
Jaydip said:
That's a neat idea, but you will still need considerable raw horsepower to decompress it. Remember the megatextures from Rage.

You can have ultra-high resolution and then downsample from there. Should not be a problem.

Take granite. You can have many types of granite, but for any given type, you could have an extremely high-res texture of it, downsample to whatever level of resolution you wanted, and use that for your game.

As it stands, game companies have limited budgets and don't even really try to have high-res textures, with few exceptions. If gamedevs pooled their resources on an open-source master texture library, that would be so much more efficient.
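
A minimal sketch of that downsample idea, assuming uncompressed RGB8 textures: a single 2x2 box filter, applied repeatedly, derives whatever resolution a project needs from one ultra-high-res master. A real pipeline would filter in linear color space with better kernels; the loader named here is hypothetical.

```cpp
#include <cstdint>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> rgb; // width * height * 3 bytes, row-major
};

// Halve an image with a 2x2 box filter (one mip level down).
Image downsample_half(const Image& src) {
    Image dst;
    dst.width  = src.width / 2;
    dst.height = src.height / 2;
    dst.rgb.resize(static_cast<size_t>(dst.width) * dst.height * 3);
    for (int y = 0; y < dst.height; ++y)
        for (int x = 0; x < dst.width; ++x)
            for (int c = 0; c < 3; ++c) {
                int sum = 0; // average the 2x2 block of source texels
                for (int dy = 0; dy < 2; ++dy)
                    for (int dx = 0; dx < 2; ++dx)
                        sum += src.rgb[((2 * y + dy) * src.width + (2 * x + dx)) * 3 + c];
                dst.rgb[(y * dst.width + x) * 3 + c] = static_cast<uint8_t>(sum / 4);
            }
    return dst;
}

// e.g., reduce a 16384x16384 master granite scan to 2048x2048:
//   Image tex = load_master("granite_01.png"); // hypothetical loader
//   for (int i = 0; i < 3; ++i) tex = downsample_half(tex);
```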
 

Jaydip
Diamond Member
Joined: Mar 29, 2010 · Messages: 3,691 · Reaction score: 21 · Points: 81
blastingcap said:
You can have ultra-high resolution and then downsample from there. Should not be a problem.

Take granite. You can have many types of granite, but for any given type, you could have an extremely high-res texture of it, downsample to whatever level of resolution you wanted, and use that for your game.

As it stands, game companies have limited budgets and don't even really try to have high-res textures, with few exceptions. If gamedevs pooled their resources on an open-source master texture library, that would be so much more efficient.

They will need open-source tools too; the good ol' 3ds Max may not work with them. We don't get high-res textures because games are generally developed for consoles first.
 

aaksheytalwar
Diamond Member
Joined: Feb 17, 2012 · Messages: 3,389 · Reaction score: 0 · Points: 76
blastingcap said:
This.

And even then, gamedevs would need to invest more in graphics, which I don't think they have the budget for. Hence my suggestion of their pooling resources to build a master texture library. Yeah, it'll never happen, blah blah, but it'd be more efficient and free up more budget for other things like art direction.

This.
 

WaitingForNehalem
Platinum Member
Joined: Aug 24, 2008 · Messages: 2,497 · Reaction score: 0 · Points: 71
Battlefield 3 was definitely a huge step forward in graphics. I went back and played the original Crysis maxed out and was surprised at how inferior it looked compared to BF3, especially the lighting.
 

blastingcap
Diamond Member
Joined: Sep 16, 2010 · Messages: 6,654 · Reaction score: 5 · Points: 76
I thought the consensus was that The Witcher 2 has the best graphics without modding right now?
 

Subyman
Moderator, VC&G Forum
Joined: Mar 18, 2005 · Messages: 7,876 · Reaction score: 32 · Points: 86
blastingcap said:
I thought the consensus was that The Witcher 2 has the best graphics without modding right now?

It is subjective. I've beaten The Witcher 2 on PC and it looks great, but it is more saturated and whimsical than something like BF3, which is geared to be realistic. IMO Crysis still looks the most realistic out of anything I have played.
 

sandorski
No Lifer
Joined: Oct 10, 1999 · Messages: 70,231 · Reaction score: 5,807 · Points: 126
I really don't think there will ever be the kind of dramatic leaps we had when 3D became a reality. Things will definitely improve over time, but I think the next big leap will happen on the Dev side, making it far easier to implement features into a Game. That will be more Software related than Hardware related.
 

Grooveriding
Diamond Member
Joined: Dec 25, 2008 · Messages: 9,110 · Reaction score: 1,260 · Points: 126
Crysis for sure, particularly modded. You can't mod Witcher 2 or BF3. Crysis/Warhead shipped in a state that was super impressive and still had more potential available through mods and the hidden options in the engine. CryEngine 3 was a step backwards to my eyes visually.

When you mod Crysis with improved textures and enable 16x AF and greater shadow depth, etc., it just looks incredible. The only improvement I have seen over it in a few games is better lighting.
 

Cerb
Elite Member
Joined: Aug 26, 2000 · Messages: 17,484 · Reaction score: 33 · Points: 86
Quote:
Seems to me that the graphics in games have only had marginal increases in quality for the past year or so. There are some good looking games out there, but when can we expect the next big jump in graphics technology, similar to the time when Far Cry, Half-Life 2 and Doom 3 came out?

After the next console gen comes, probably. Also, after AVX2 becomes mainstream. Only in the last year or two have we finally gotten games with lighting of id Tech 4 or better quality, for instance, that don't use the Doom 3 engine. Not only that, but there have been steps backward, too, even from id themselves. While others have met their quality, added geometry, and improved performance scaling, Doom 3 (id Tech 4) and Crysis are still about as good as anything out there.

Some games are now making good use of extra CPU cores, and physics is really improving after stagnating for a while. We could get some real boosts with DX11 as a baseline, but major improvements will rely on artists who understand the technology, after more engine features are well implemented. Big game companies that treat their content guys like their code monkeys are probably going to keep doing the same old thing.

We are still very much limited by what our CPUs can do, and by what content makers are willing to put effort into (edit: also content creation application creators). Clipping, flickering, bad decal implementations, bugs with baked-in lighting, etc., still permeate games. IMO, the next big jump will include a reduced reliance on 3rd-party content creation applications, especially as concerns models and animation. Make initial high-LOD models outside of it, but then everything else should be done with an engine-specific content creation kit that renders with the game engine itself and has many specific tweaks for building a playable world. Turning something that looks awesome in 3ds Max into something that looks OK for common GPUs is the wrong way to think about it, regardless of whether you use such a tool or not. I think Crytek, for instance, is on the right track, especially in offering an SDK as freeware.

In terms of graphics detail, we are already well into diminishing returns. In terms of immersive game feel, we've only just improved over the early Havok games (i.e., going from an added-on gimmick to an integral feature of the game world).

all the "graphic improvements" you have been seeing are just shortcuts to make games look better.
That's been the case for a long time. Quality lighting and shadow are the only real exceptions, and they are themselves full of little shortcuts. That's why raster can keep on chugging, despite all the ray-tracing fans .

Quote:
If Intel wanted to, they could probably get Havok to do that. But their IGPs are not the best for OpenCL.

But their CPUs are, and Haswell will finally bring some major vector and multithreading improvements, shortly followed by AMD, no doubt.
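
For context, a hedged sketch of the vector improvement being referred to: Haswell's AVX2/FMA3 lets one instruction do a fused multiply-add across eight floats, the kind of throughput a physics integrator (pos += vel * dt) eats up. Compile with -mavx2 -mfma; the function name is illustrative.

```cpp
#include <immintrin.h>

// Integrate particle positions: pos[i] += vel[i] * dt.
void integrate(float* pos, const float* vel, float dt, int n) {
    const __m256 vdt = _mm256_set1_ps(dt);  // broadcast dt to 8 lanes
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 p = _mm256_loadu_ps(pos + i);
        __m256 v = _mm256_loadu_ps(vel + i);
        p = _mm256_fmadd_ps(v, vdt, p);     // one FMA for 8 floats
        _mm256_storeu_ps(pos + i, p);
    }
    for (; i < n; ++i)                      // scalar tail
        pos[i] += vel[i] * dt;
}
```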

Quote:
The next breakthrough in graphics will not happen due to beefier hardware but through the industry finding and embracing ways to make content generation easier, reducing the load on artists. It takes a single person around two weeks to make a single high-quality 3D model for a AAA game, IIRC. I bet we'll see more emphasis on procedural generation.

:thumbsup: We need that to be quicker/easier, and we need to be able to build creatures. Define skeleton, muscle, fat, skin, random filler tissue, etc. (including being able to define custom material properties, for robots, anatomically incorrect critters, physics-defying beings, and whatnot) as part of a model, have the engine's SDK process it, and then work from there.
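
A purely hypothetical sketch of what such a declarative creature definition could look like; none of these types exist in any shipping engine, they only illustrate the data-driven idea:

```cpp
#include <string>
#include <vector>

struct MaterialProps {
    float density   = 1.0f;     // custom properties allow robots,
    float stiffness = 1.0f;     // anatomically incorrect critters,
    bool  obeys_gravity = true; // and physics-defying beings
};

struct TissueLayer {
    std::string kind;           // "skeleton", "muscle", "fat", "skin", "filler"
    MaterialProps props;
};

struct CreatureDef {
    std::string skeleton_rig;        // path to the joint hierarchy
    std::vector<TissueLayer> layers; // built up from the bones outward
};

// The engine's own SDK, not a 3rd-party DCC tool, would bake this into
// game-ready meshes, LODs, and physics proxies:
//   GameAsset a = engine_sdk::bake(creature); // hypothetical call
```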
 

3DVagabond
Lifer
Joined: Aug 10, 2009 · Messages: 11,951 · Reaction score: 204 · Points: 106
Quote:
The next breakthrough in graphics will not happen due to beefier hardware but through the industry finding and embracing ways to make content generation easier, reducing the load on artists. It takes a single person around two weeks to make a single high-quality 3D model for a AAA game, IIRC. I bet we'll see more emphasis on procedural generation.

Nah, these professional modelers can knock off models really fast. If it took them two weeks to make a model, they'd starve. Besides, it's not the models that are the issue.
 

3DVagabond
Lifer
Joined: Aug 10, 2009 · Messages: 11,951 · Reaction score: 204 · Points: 106
blastingcap said:
This.

And even then, gamedevs would need to invest more in graphics, which I don't think they have the budget for. Hence my suggestion of their pooling resources to build a master texture library. Yeah, it'll never happen, blah blah, but it'd be more efficient and free up more budget for other things like art direction.

The modeler has his/her own texture library. So does the modeling software, but the modeler builds a library up themselves.
 

ShintaiDK
Lifer
Joined: Apr 22, 2012 · Messages: 20,378 · Reaction score: 145 · Points: 106
It's easy to blame consoles, but they are really not to blame.

Coders are lazy today and expect people just to buy more and bigger hardware for the brute-force effect.

Look again at the consoles you blame. Though we can agree they are vastly inferior, which keeps somewhat of a restraint on what can be done and so on, look at how far and how much can be done with 256 MB of memory and an X1900/7900-class card, plus 256 MB of memory for the OS and game on extremely weak CPUs.

Graphics today are basically limited by pixels, which is why they hope to promote them. I don't see any real overall visual effect from retina-type screens on the desktop, unless you move even closer to the screen. And besides that, we got useless features like 8x, 16x, 32x, even 64x AA. An utter joke. Or, as some say, first you pay for a graphics card to play the game; then you pay again to turn AA on.
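
For what it's worth, a back-of-envelope sketch of why high AA factors cost so much: a multisampled render target stores color and depth per sample, so memory grows roughly linearly with the sample count (the 8 bytes per sample below assumes RGBA8 color plus 24/8 depth-stencil):

```cpp
#include <cstdio>

int main() {
    const long long w = 1920, h = 1080;
    const long long bytes_per_sample = 4 + 4; // RGBA8 + D24S8
    for (int samples : {1, 2, 4, 8, 16}) {
        long long mb = w * h * bytes_per_sample * samples / (1024 * 1024);
        std::printf("%2dx MSAA: ~%lld MB for one 1080p target\n", samples, mb);
    }
    // 8x alone is ~126 MB -- half of a 256 MB console's total memory,
    // before any textures, geometry, or other buffers.
}
```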
 

aaksheytalwar
Diamond Member
Joined: Feb 17, 2012 · Messages: 3,389 · Reaction score: 0 · Points: 76
ShintaiDK said:
It's easy to blame consoles, but they are really not to blame.

Coders are lazy today and expect people just to buy more and bigger hardware for the brute-force effect.

Look again at the consoles you blame. Though we can agree they are vastly inferior, which keeps somewhat of a restraint on what can be done and so on, look at how far and how much can be done with 256 MB of memory and an X1900/7900-class card, plus 256 MB of memory for the OS and game on extremely weak CPUs.

Graphics today are basically limited by pixels, which is why they hope to promote them. I don't see any real overall visual effect from retina-type screens on the desktop, unless you move even closer to the screen. And besides that, we got useless features like 8x, 16x, 32x, even 64x AA. An utter joke. Or, as some say, first you pay for a graphics card to play the game; then you pay again to turn AA on.

This.
 

KompuKare
Golden Member
Joined: Jul 28, 2009 · Messages: 1,183 · Reaction score: 1,464 · Points: 136
Quote:
I would like it if an APU-accelerated physics API were released.

I think this is a very important idea. ATM, the onboard iGPU is pretty much just sitting there doing nothing for anyone who has a dGPU. In theory, AMD has the Hybrid CrossFire option, but even when running with two identical cards, normal CF and SLI suffer from micro-stuttering (and it seems CF worse than SLI).

So what chance is there, when running two GPUs of totally different speeds, of anyone (AMD, Intel, Lucid, Nvidia) getting those imbalanced GPUs to do anything useful together?

Unless the weaker iGPU gets to do something else: either physics or post-processing (AA, etc.), where the latency and sync issues do not arise. As an added bonus (ATM only for AMD 7000s), the dGPU could sit in ZeroCore with the fan off when not gaming (I'm sure Nvidia will come up with something similar soon too; they finally seem to be taking power usage seriously).
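
A hedged sketch of how an application could already steer compute work onto the otherwise-idle iGPU using the standard OpenCL API: pick the GPU device that reports unified host memory (true for integrated parts) and run the physics kernels there, leaving the dGPU free to render. Error handling omitted for brevity.

```cpp
#include <CL/cl.h>
#include <vector>

// Find a GPU that shares memory with the host -- i.e., an iGPU.
cl_device_id find_integrated_gpu() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);
        for (cl_device_id d : devices) {
            cl_bool unified = CL_FALSE;
            clGetDeviceInfo(d, CL_DEVICE_HOST_UNIFIED_MEMORY,
                            sizeof(unified), &unified, nullptr);
            if (unified) return d; // run physics kernels on this device
        }
    }
    return nullptr; // no iGPU found; fall back to the CPU path
}
```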
 