The only true path of future gaming: Raytracing

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
With all this talk of APIs and GPUs lately, you almost forget that in their current implementation they are a dead end.

The only GPU company with some foresight seems to be Imagination Tech. They might have been defeated once, but why compete on your competition's terms?

A press release and a post on Engadget have dropped with IMG Tech talking up ray tracing on their latest hardware.

[ http://www.engadget.com/2014/03/18/imagination-powervr-raytracing/#comments ][ http://imgtec.com/news/Release/index.asp?NewsID=853 ]

We know ray tracing is the future, and we also know it is very, very computationally intensive. It can literally take server farms hours to produce a production-quality render. So why is this little mobile GPU IP vendor boasting the capability? It is simple: it's called innovation.

Enough ranting. Right now most of you guys with beefy GPUs should be able to do some simple renders using a path tracing technique. The technique differs slightly from ray tracing in that you can render in real time at the cost of visual noise. There are a few demos out there of path tracers implemented in OpenCL:

  1. https://code.google.com/p/sfera/ | https://www.youtube.com/watch?v=Dh9uWYaiP3s
  2. http://www.geeks3d.com/20120502/laguna-real-time-opencl-path-tracer/
  3. http://davibu.interfree.it/opencl/smallptgpu/smallptGPU.html | http://www.youtube.com/watch?v=TAZsC3buDug

There is also Octane Render, which seems to be a beastly path tracer implemented in CUDA:
  1. http://render.otoy.com/ | https://www.youtube.com/user/SuperGastrocnemius/videos

When will AMD or NVIDIA make a proper ray tracing or path tracing card?

[For the technically minded] Is it possible to implement an efficient enough path tracer in OpenCL to achieve playable frame rates?
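For a sense of what such a kernel has to do, the heart of those smallpt-style demos fits in a few dozen lines. Below is a toy CPU sketch in plain Python rather than OpenCL - the scene, constants, and function names are all made up for illustration - showing the basic loop: find the nearest sphere a ray hits, add that surface's emission, recurse along a random diffuse bounce, and average several noisy samples per pixel.

```python
import math
import random

random.seed(1)

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a): return mul(a, 1.0 / math.sqrt(dot(a, a)))

# Toy scene: (center, radius, emission, diffuse albedo)
SPHERES = [
    ((0.0, -1000.5, -3.0), 1000.0, 0.0, 0.8),  # huge sphere as a floor
    ((0.0, 0.0, -3.0), 0.5, 0.0, 0.7),         # grey ball
    ((0.0, 3.0, -3.0), 1.0, 4.0, 0.0),         # emissive "light" sphere
]

def hit(origin, direction):
    """Nearest (t, sphere) intersection; direction must be normalized."""
    best = None
    for s in SPHERES:
        center, radius, _, _ = s
        oc = sub(origin, center)
        b = dot(oc, direction)
        disc = b * b - (dot(oc, oc) - radius * radius)
        if disc > 0.0:
            t = -b - math.sqrt(disc)
            if t > 1e-4 and (best is None or t < best[0]):
                best = (t, s)
    return best

def radiance(origin, direction, depth=0):
    """One path: emission at the hit point plus a random diffuse bounce."""
    h = hit(origin, direction)
    if h is None or depth >= 4:
        return 0.0
    t, (center, _, emission, albedo) = h
    p = add(origin, mul(direction, t))
    n = norm(sub(p, center))
    # Rejection-sample a direction in the hemisphere around the normal.
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(d, d) <= 1.0:
            d = norm(d)
            if dot(d, n) < 0.0:
                d = mul(d, -1.0)
            break
    return emission + albedo * radiance(p, d, depth + 1)

def render(width=8, height=8, samples=4):
    """Average `samples` noisy paths per pixel; more samples = less noise."""
    return [[sum(radiance((0.0, 0.0, 0.0),
                          norm((x / width - 0.5, 0.5 - y / height, -1.0)))
                 for _ in range(samples)) / samples
             for x in range(width)]
            for y in range(height)]

img = render()
```

A GPU version runs exactly this loop massively parallel, one work-item per pixel; the visual noise mentioned above is what the per-pixel average looks like before enough samples accumulate.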
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I still think ray tracing for games is WAY far out. Like a decade or so. The problem is that games keep getting more complex. A current GPU could probably run a game from the year 2000 with ray tracing. But so long as games keep getting more and more complex, I just don't see it happening any time soon.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Is there something in particular that prevents raytracing being done by a dedicated raytracing-specific GPU?

I mean, like is it something about the underlying mathematics, such as not being helped by massively parallel simple operators like GPUs have? Or is it just a simple matter of throwing together the right silicon to handle the straightforward mathematics?

My poor understanding makes me think it's hard math, so you can't make hardware to do it easily, and instead you need massive computational power that is orders of magnitude out of reach of today's processors?
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
I still think ray tracing for games is WAY far out. Like a decade or so. The problem is that games keep getting more complex. A current GPU could probably run a game from the year 2000 with ray tracing. But so long as games keep getting more and more complex, I just don't see it happening any time soon.

That's kinda the subtext of the post: is this a far-out technical issue, or just one that hasn't been focused on yet? Unfortunately I am not technically minded enough to come to any sort of conclusion.
It is just interesting that Imagination - previously Caustic Graphics [ https://www.youtube.com/user/CausticGraphics/videos ] - is pushing this near-future(?) rendering technique and no one else is saying much.

Edit: oh, and there is SiliconArts [ https://www.youtube.com/channel/UC9nLfJZRLg8cnZeekh3K_gQ ]
 

Jodell88

Diamond Member
Jan 29, 2007
9,491
42
91
I suppose this is news for you guys but I've seen this stuff for years already. Blender 3D does it with its Cycles rendering engine, Luxrender has it with SLG.
 

Jodell88

Diamond Member
Jan 29, 2007
9,491
42
91
I already posted that. It beats the hell outta my then-6850 and now Kaveri A10... wonder how it would fare on a Titan or 290X?
Didn't see the link in the OP, sorry.

AMD rules at OpenCL so the 290X would be a monster. A TITAN may offer decent performance as well.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
I suppose this is news for you guys but I've seen this stuff for years already. Blender 3D does it with its Cycles rendering engine, Luxrender has it with SLG.

It's not about whether the tech exists or not [nearly every movie with CG has used ray tracing since the 90s]; this discussion is about whether it is now feasible, why there's no focus from AMD/NV, how far out this tech is, and why Imagination Tech is the only one to focus on it.

Also, I don't think Blender is real-time even for simple scenes.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Wow, looks good. I won't dispute your claim, but the devil is in the details.
Jodell88

Diamond Member
Jan 29, 2007
9,491
42
91
Wow, looks good. I won't dispute your claim, but the devil is in the details.
That's his mistake on recording it.

However, the beauty of Blender is that you can download it and use it right now, free of charge. If you have an Nvidia GPU, Cycles can utilize it. The OpenCL backend is not up to snuff yet.

For OpenCL there's Luxrender and SLG.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
That's his mistake on recording it.

However, the beauty of Blender is that you can download it and use it right now, free of charge. If you have an Nvidia GPU, Cycles can utilize it. The OpenCL backend is not up to snuff yet.

For OpenCL there's Luxrender and SLG.

Do you know if the Blender game engine can use Cycles?
 

antef

Senior member
Dec 29, 2010
337
0
71
Reading AT's news today on Imagination's new tech, I also have the same question as the OP. Even if AMD and NVIDIA aren't as optimistic about real-time ray tracing in games as Imagination might be, there's no doubt ray-tracing algos have already been in use for a long while, in rendering programs, CAD, movie CGI, etc. So why haven't they been working on dedicated hardware to accelerate this for a long time? Seems it would have a ton of immediate applications. So all this time the two leaders in graphics have been leaving all their customers who use ray-tracing routines to run them on traditional CPUs? Doesn't seem to make sense.
 

Jodell88

Diamond Member
Jan 29, 2007
9,491
42
91
Reading AT's news today on Imagination's new tech, I also have the same question as the OP. Even if AMD and NVIDIA aren't as optimistic about real-time ray tracing in games as Imagination might be, there's no doubt ray-tracing algos have already been in use for a long while, in rendering programs, CAD, movie CGI, etc. So why haven't they been working on dedicated hardware to accelerate this for a long time? Seems it would have a ton of immediate applications. So all this time the two leaders in graphics have been leaving all their customers who use ray-tracing routines to run them on traditional CPUs? Doesn't seem to make sense.
Ray tracing is computationally expensive. That's why it hasn't been used in games. Each ray of light has to be computationally traced; when it hits an object and bounces off in different directions, each of those secondary rays has to be traced as well, and the list goes on.

While ray tracing may be the next huge leap in graphics technology, today's hardware isn't fast enough.
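To put rough numbers on that cost, here is a back-of-envelope count of the rays a real-time path tracer would need. All inputs are illustrative assumptions, not measurements:

```python
# Back-of-envelope ray budget for real-time path tracing.
# Every input here is an illustrative assumption, not a measured figure.
width, height = 1920, 1080
fps = 60
samples_per_pixel = 16   # still quite noisy for path tracing
rays_per_sample = 3      # primary ray + two indirect bounce rays

pixels = width * height
rays_per_frame = pixels * samples_per_pixel * rays_per_sample
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame:,} rays/frame")    # 99,532,800
print(f"{rays_per_second:,} rays/second")  # 5,971,968,000
```

Roughly six billion ray-scene intersection queries per second, each a search through the scene's acceleration structure; that is the gap dedicated hardware is trying to close.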
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Regarding the computational load for tracing a ray, what is the nature of that type of computation? Is it easily handled by massively parallel GPUs, or does it need something more, like a bunch of fancy CPUs, or even a dedicated custom chip, to accelerate the computation?

Or is there something about tracing a ray that makes it uniquely unable to be handled by simply throwing more silicon at it?
 

Jodell88

Diamond Member
Jan 29, 2007
9,491
42
91
Regarding the computational load for tracing a ray, what is the nature of that type of computation? Is it easily handled by massively parallel GPUs, or does it need something more, like a bunch of fancy CPUs, or even a dedicated custom chip, to accelerate the computation?

Or is there something about tracing a ray that makes it uniquely unable to be handled by simply throwing more silicon at it?
GPUs are actually great at ray tracing. However, the amount of math that has to be calculated is insane. This is why ray tracing hasn't taken off in games: it is impossible to calculate all that math and produce a relatively noise-free image on the screen at a reasonable FPS.

It is a problem that can be solved by throwing more silicon at it, but it'll need a lot of silicon to have something playable.
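The noise mentioned above is inherent to the Monte Carlo estimate a path tracer computes per pixel: error shrinks only as 1/sqrt(N), so halving the noise costs four times the samples. A toy illustration in Python (estimating a known integral as a stand-in for the per-pixel lighting integral; this is not renderer code):

```python
import random

random.seed(42)

def mc_estimate(n):
    # Monte Carlo estimate of the integral of x^2 over [0, 1]
    # (true value 1/3), standing in for the lighting integral a
    # path tracer evaluates at every pixel.
    return sum(random.random() ** 2 for _ in range(n)) / n

true_value = 1.0 / 3.0
for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} samples -> error {abs(mc_estimate(n) - true_value):.5f}")
```

Each 100x increase in samples buys only about a 10x drop in error, which is why "a lot of silicon" is the honest answer.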
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Ray tracing in games is overrated. There are unbiased rendering methods newer than classic ray tracing, like path tracing, which need fewer tricks to achieve photorealistic effects, lighting, and soft shadows.

You can achieve 95% of the quality of a PPT (progressive path traced) render with biased methods, in 1/10 of the rendering time or less. I would rather bet that implementing GI instead of AO is the next big thing for making games look more realistic.
 

antef

Senior member
Dec 29, 2010
337
0
71
Ray tracing is computationally expensive. That's why it hasn't been used in games. Each ray of light has to be computationally traced; when it hits an object and bounces off in different directions, each of those secondary rays has to be traced as well, and the list goes on.

While ray tracing may be the next huge leap in graphics technology, today's hardware isn't fast enough.

I'm sorry but your post doesn't seem to be a response to what I said. What I said was, regardless of the feasibility of ray tracing in games, ray tracing routines have already been in use for years in a number of non-real-time applications such as CAD and CGI. Given this and the fact that dedicated hardware to accelerate ray tracing such as what Imagination is working on or what the CausticOne does seems to speed it up a great deal, why don't the two leaders in graphics solutions in the industry have their own ray-tracing acceleration hardware already? Instead people who use these applications run the routines on high-end Xeons or something like that (or maybe on normal GPUs via CUDA or OpenCL). But if dedicated silicon helps, you'd think the big players would already have it out there.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
I applaud the idea of doing something 'different' but if this were really plausible right now wouldn't the big players already be looking into it?

I guess with that logic no one new would ever show up... but it's not like we haven't known about RT forever. I assume when hardware gets fast enough, and the limitations of the current methods are being reached, NV and AMD (and Intel) will be more than ready to make RT happen.

It's not like I can succeed by telling the world: hey, rice is really good, you should all eat it now.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
I'm sorry but your post doesn't seem to be a response to what I said. What I said was, regardless of the feasibility of ray tracing in games, ray tracing routines have already been in use for years in a number of non-real-time applications such as CAD and CGI. Given this and the fact that dedicated hardware to accelerate ray tracing such as what Imagination is working on or what the CausticOne does seems to speed it up a great deal, why don't the two leaders in graphics solutions in the industry have their own ray-tracing acceleration hardware already? Instead people who use these applications run the routines on high-end Xeons or something like that (or maybe on normal GPUs via CUDA or OpenCL). But if dedicated silicon helps, you'd think the big players would already have it out there.

It was a valid response. What he is saying is that time is better spent on other activities currently. Graphics cards are not calculating magic; they are calculating numbers. Even hardware acceleration specific to this task does not change the total amount of numbers to be crunched. Currently, GPUs are limited by a finite set of resources: size, heat, materials, etc.

Great lighting would be nice, but other effects are more important for most people. Once the cost of ray tracing is reduced in relation to other factors, you will see it in games. Right now, it would be too much at the current limits of GPUs. Maybe the new node will change that, but I doubt it.
 