Advantages of the path-tracing approach could include native warp-aware rendering and free supersampling, eliminating some of the performance penalties inherent in the current oversized raster + post-warp approach required for high-quality anti-aliased images. Add eye tracking and adaptive-quality rendering, and I reckon it could be done with today's hardware. No doubt by the time the consumer Rift rolls around, this stuff will be looking a lot more practical.
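The eye-tracking idea above is easy to sketch: give each pixel a sample budget that falls off with distance from the gaze point, so the path tracer spends its rays where the viewer is actually looking. A toy illustration (all names and constants here are made up, not from any real engine):

```python
import math

def samples_per_pixel(px, py, gaze, base=16, floor=1, falloff=200.0):
    """Toy foveated sample budget: full quality at the gaze point,
    decaying exponentially with screen-space distance (illustrative numbers)."""
    dist = math.hypot(px - gaze[0], py - gaze[1])
    return max(floor, round(base * math.exp(-dist / falloff)))

# At the gaze point we spend the full budget...
print(samples_per_pixel(640, 360, gaze=(640, 360)))  # 16
# ...and far in the periphery almost nothing.
print(samples_per_pixel(0, 0, gaze=(640, 360)))      # 1
```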
The problem for approaches like this is that the traditional method is getting to the point of per-pixel geometry anyway, so that plus baked lighting (or whatever they cook up for real-time lighting) will probably run better on cards designed to accelerate that method.
If I get per-pixel geometry and textures up to 5 inches away from the model, then there's no real need for improvement beyond that on those criteria.
I thought I had to share this. Although it's only a comment thread, it says something interesting about the future of gaming; the response to 'Mark' was one of the most interesting answers. Quoted from the comments at http://raytracey.blogspot.de/ . Well, obviously not for the PS4, because Sam Lapere just said so, but probably for the cloud, SLI Titan rigs, or next-next-gen consoles, I think.
Sam Lapere said... Hi, sorry for the delay in answering. Great to see so much enthusiasm!
Anonymous, Teemu, colocolo: the session was recorded and should appear on Nvidia's website soon. I will post a link once it's there.
Mark: yes, we've been approached by some of the largest game developers in the industry to use Brigade. Unfortunately I can't say which ones.
MrPapillon: exactly, we still need some work on the tools side of things, but we're getting there.
Reaven: Brigade is actually quite easy to develop game content for. There are almost no corner cases compared to a rasterizer. You don't need to worry about transparent surfaces, hundreds of lights, shadow map resolution, depth of field artifacts, ... The tech is not fully there yet, but we have made enormous progress over the course of one year (a year ago, I never thought we could do the massively dynamic scenes that we can do today) and I'm convinced that we will have a very compelling product very soon.
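The "hundreds of lights" point is a real property of path tracing: a one-light-per-sample estimator stays unbiased no matter how many lights there are, so extra lights don't add special cases, only a bit of variance. A minimal sketch (not Brigade's actual code; the scene values are invented):

```python
import random

def direct_light_estimate(light_intensities, n_samples=100_000, seed=1):
    """Unbiased one-light-per-sample estimator: pick a light uniformly,
    weight by the number of lights. Works the same for 2 lights or 200."""
    rng = random.Random(seed)
    n = len(light_intensities)
    total = 0.0
    for _ in range(n_samples):
        total += n * rng.choice(light_intensities)  # pdf = 1/n, so weight = n
    return total / n_samples

lights = [0.5, 2.0, 0.1, 1.4]                   # hypothetical light contributions
print(sum(lights))                              # exact answer: 4.0
print(round(direct_light_estimate(lights), 1))  # ≈ 4.0
```

The estimator converges to the exact sum regardless of how many lights are in the list, which is why a path tracer doesn't need per-light shadow maps or light culling heuristics.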
Sean: thanks for your comment, couldn't have said it better
Anonymous: yes, Yebis is used. More on that at GDC next week.
Anonymous: yes, we can now edit all the materials via a simple GUI at runtime.
Over the next few days we are going to record and upload some videos of the Brigade demo shown at GTC. The video files are huge and YouTube sometimes tends to cancel the upload for no reason, so please be patient.
This should be perfect for the Rift; the low resolution of the Rift should help boost the fps!
The video runs at a very low FOV and low resolution, in 2D, and still has way too much noise even with the SLI Titans they are using. Running it in stereoscopic 3D, at higher resolution, with >110° FOV, plus interface, AI, textures, physics and so on, is still far, far away.
Especially considering they are also using the ideal setup for a ray tracer: instanced geometry. Things like trees, on the other hand, are killers for ray-tracing performance, as the light keeps reflecting off and through every leaf multiple times, reducing performance to a crawl (or a noise-fest, rather).
So while it will happen sometime in the future, and what they are doing is mighty impressive, don't expect to be running a game with it on the Rift in the near future.
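The grain-versus-samples trade-off above is just Monte Carlo statistics: noise falls as 1/√N, so quadrupling the samples only halves the grain, which is why many-bounce foliage scenes get so expensive. A toy demonstration (the "pixel" here is a stand-in random estimator, not a real renderer):

```python
import random
import statistics

def noisy_pixel(n_samples, rng):
    """Toy path-traced pixel: average of n random bounce outcomes."""
    return sum(rng.random() for _ in range(n_samples)) / n_samples

def noise(n_samples, trials=2000, seed=7):
    """Estimate the pixel's noise (standard deviation) across many renders."""
    rng = random.Random(seed)
    vals = [noisy_pixel(n_samples, rng) for _ in range(trials)]
    return statistics.stdev(vals)

# Monte Carlo noise falls as 1/sqrt(N): 4x the samples, half the grain.
print(noise(16) / noise(64))   # ≈ 2.0
```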
Mon Mar 25, 2013 5:50 am
Hermit
Two Eyed Hopeful
Joined: Thu Mar 07, 2013 3:36 am Posts: 58 Location: Hermit's Cabin
Path tracing is a very promising technique with lots of future potential, but realistically I don't see it being used in actual games (apart from simple proof of concept demos) anytime soon. SVO cone tracing, on the other hand, is a technique that is capable of producing realistic, non-grainy graphics at medium to high resolution and interactive frame rates right now. I wouldn't be surprised to see something similar being used for games in the near future. Here's a nice video demonstrating the possibilities:
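The reason cone tracing avoids grain is that one cone march through prefiltered (mip-mapped) voxel data replaces the many stochastic rays a path tracer would fire. A heavily simplified 1D sketch of the idea (real SVO cone tracing works on a sparse 3D octree; everything here is illustrative):

```python
import math

def build_mips(base):
    """Prefilter a 1D occlusion field: each mip level averages pairs of cells."""
    mips = [list(base)]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev) - 1, 2)])
    return mips

def cone_trace(mips, aperture=0.4, step=1.0):
    """Toy 1D cone march: as the cone widens with distance, sample ever
    coarser mip levels and composite occlusion front-to-back. One march
    replaces the many rays Monte Carlo would need, hence no grain."""
    occ, t = 0.0, step
    size = len(mips[0])
    while t < size and occ < 0.99:
        radius = max(1.0, aperture * t)
        level = min(len(mips) - 1, int(math.log2(radius)))
        sample = mips[level][min(int(t) >> level, len(mips[level]) - 1)]
        occ += (1 - occ) * sample            # front-to-back compositing
        t += step
    return occ

wall = build_mips([0, 0, 0, 0, 1, 1, 1, 1])  # occluder in the far half
print(round(cone_trace(wall), 2))            # high occlusion, no noise
```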
But the engine has a major benefit over rasterization engines: it can handle hundreds of billions of polygons, and theoretically they don't have to be instanced. As long as the models fit into VRAM, it would be no problem. That's insane. This engine has the potential to become the Matrix.
Mon Mar 25, 2013 7:24 am
STRZ
Certif-Eyed!
Joined: Mon Dec 05, 2011 3:02 am Posts: 559 Location: Geekenhausen
How do you know the video is running at low res? If it's 720p, it's still higher res than the Rift. And why does the FOV matter? And normal game physics are good enough. Also, sure, it's under ideal settings, but why not make a Rift demo under ideal settings? I'm fine with a city demo with no people, as opposed to a forest with AI people running around.
I'd want a simple scene that looks super realistic, shown on the Oculus. This seems to get close to that.
Mon Mar 25, 2013 9:53 am
virror
Sharp Eyed Eagle!
Joined: Fri Jan 18, 2013 7:13 am Posts: 427 Location: Gothenburg, Sweden
Interesting stuff, but judging from the video where everything is grainy until it finishes rendering, this isn't going to be useful for the Rift right now. Some good demos, though.
I don't think we're quite there yet, and this may still be a year off, but once the hardware and software get better, I'm sure it will happen sooner or later.
You can run the demo in a browser with WebGL support on a machine with a fairly recent GPU. In the upper right corner you can set the pixel size to 1:1.
The trick behind it is pre-computing a bunch of stuff with 'light probes', which are just white spheres that get illuminated by the ambient lighting. The data is saved, and for each render pass the GPU looks up values that never actually change for the scene. It's a classic time-space trade-off.
Don't know how the scene would deal with animation... yet. I'm stealing everything! ;D
SVO cone tracing, on the other hand, is a technique that is capable of producing realistic, non-grainy graphics at medium to high resolution and interactive frame rates right now. I wouldn't be surprised to see something similar being used for games in the near future.
Unreal Engine 4 is already using it for lighting. The guy whose PhD thesis it's based on has a ton of interesting papers on his site.
nanicoar wrote:
The trick behind it is pre-computing a bunch of stuff with 'light probes', which are just white spheres that get illuminated by the ambient lighting. The data is saved, and for each render pass the GPU looks up values that never actually change for the scene. It's a classic time-space trade-off.
Don't know how the scene would deal with animation... yet. I'm stealing everything! ;D
Animation works fine, search for spherical harmonic lighting and you should find lots of information.
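For reference, this is roughly what a probe bakes and what a moving object evaluates at runtime: a directional lighting function projected into a handful of spherical-harmonic coefficients. A minimal sketch using the standard 9-coefficient (bands 0-2) real SH basis; the "sky" function is an invented example, and real probes store RGB rather than a scalar:

```python
import math
import random

def sh_basis(x, y, z):
    """Real spherical-harmonic basis functions, bands 0-2 (9 values)."""
    return [
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3 * z * z - 1),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ]

def project(f, n=50_000, seed=3):
    """Bake step: project a directional function into 9 SH coefficients
    by Monte Carlo integration over the unit sphere."""
    rng = random.Random(seed)
    coeffs = [0.0] * 9
    for _ in range(n):
        z = 2 * rng.random() - 1
        phi = 2 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        fv = f(x, y, z)
        for k, b in enumerate(sh_basis(x, y, z)):
            coeffs[k] += fv * b
    return [4 * math.pi * c / n for c in coeffs]

def evaluate(coeffs, x, y, z):
    """Runtime lookup: an animated object just dots 9 floats per normal."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(x, y, z)))

sky = project(lambda x, y, z: max(0.0, z))   # light coming from straight up
print(round(evaluate(sky, 0, 0, 1), 2))      # bright facing up
print(round(evaluate(sky, 0, 0, -1), 2))     # dark facing down
```

Because the runtime cost is just evaluating those 9 coefficients per surface normal, the lighting follows a deforming or moving mesh for free, which is why animation is not a problem.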
Because a game is not just a static flat box with a single light source. What they are doing is impressive, but it is not close to being ready for a game yet. Tech demos tend to be impressive, such as this video where a mobile CPU runs ray tracing, a smoke-particle trail and an ocean-wave simulator at the same time. It doesn't mean you'll be seeing ray tracing together with other advanced effects on a mobile phone anytime soon, however.
If you just want a tech demo, nothing is stopping you: the engine is open source, so go download it and try it out yourself.
I'm fully aware of the difference between a tech demo and a game, thanks. Didn't know it was open source, might just have to try it out then, because all I want is a tech demo.
Brigade 3 is getting faster and faster. http://raytracey.blogspot.co.nz/ There is an uncompressed video file at the end of the blog entry (2.4 GB).
Sam Lapere said... colocolo: yes, we're currently only a factor of 10x away from game quality noisefree images in real-time. That means that if we don't do any further algorithmic optimizations, GPU's will have the power to run this at high image quality in 720p in 5 years. But if you take into account that there will be substantial algorithmic and hardware improvements, I think it will be closer to 1.5-2 years from now (for 1080p/30fps).
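Sam's arithmetic checks out if you assume GPU throughput doubles roughly every 1.5 years: closing a 10x gap by hardware alone then takes about five years. A quick back-of-envelope check (the doubling period is the assumption here, not a quoted figure):

```python
import math

def years_to_close(gap, doubling_years):
    """Years until hardware alone closes a performance gap,
    assuming GPU throughput doubles every `doubling_years`."""
    return doubling_years * math.log2(gap)

# "10x away -> ~5 years" implies roughly a 1.5-year doubling period:
print(round(years_to_close(10, 1.5), 1))   # 5.0
```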
But anyway, pixels need to be shaded physically correctly, and ray tracing is the best way to do that. A combination of both technologies would be nice, though. And although Brigade can handle billions of polygons, at some point the RAM gets filled up.
Yeah, don't put much faith in Euclideon. Since they started getting attention a few years ago, they've proven themselves to be a very sketchy company: say one thing, do another; show very little of what their engine can do; make absurd claims (that may very well be true) backed up with little to no demonstrated evidence, often using tech lingo so wrongly that they come across as idiots.
The debate for a while was whether it was a hoax or not. I always thought it was real, and now I think there's little doubt that it is. Although I can't say their plans for a game engine were ever real. I've found it really shady that they garnered so much attention and press from gamers and gaming media, then turned around, did the Geoverse thing, and haven't mentioned gaming since. (I found one reference to the word "game" on their site, but it's more of an advertisement.) They claim their gaming engine is on hold because of the timing of the console-cycle transition. I guess PC doesn't exist in their world?
But even if they were taking a game engine seriously, they still have a lot of work to do and prove, like lighting and animation. Going by the info we have now, Brigade 3 will look better and WILL happen; Euclideon's UD engine just "might happen". About all UD has going for it is potentially better frame rates on a sooner time frame, since it's still not using the GPU.
The Euclideon tech is interesting, but there's still too little known about it.
I think people are too harsh. Both Euclideon and Brigade 3 are ahead of their time. It's tough to fight the mainstream by creating something that will take quite some time to become advanced enough to be really useful. It's true, Brigade has more to show right now and is closer to how we imagine the future of graphics could look. But judging by how Euclideon's working principle is much closer to real life (small particles/builders that build the world), it has potential.
I think both Euclideon and Brigade 3 are probably at least 10 years away from being accepted as a valid alternative to polygons, though.
Isn't path-tracing only for lighting, while the geometry is still polygon-based? If so, could both approaches (voxels+path-tracing) be merged?
Brigade uses polygons, and Unlimited Detail uses a polygon converter. Apparently next-gen graphics, on something like a GTX 780, will always render at higher resolutions than Brigade, like 4 times higher. Though the image quality you can get from a path tracer like Brigade 3 is IMO almost lifelike and not reproducible with next-gen graphics.
Sun Nov 03, 2013 1:07 pm
zerax
Two Eyed Hopeful
Joined: Thu Jan 31, 2013 9:43 am Posts: 54 Location: Norway
What's the point of ray tracing now? Your average GPU, given the right GPU code, can do all this. Ray tracing was for shiny balls at least three GPU generations ago.
Sun Mar 23, 2014 3:12 am
GeraldT
Certif-Eyable!
Joined: Fri Jan 18, 2013 9:10 am Posts: 1057 Location: Germany
Nope: if you want great shadows and reflections you still need ray tracing, and it looks a lot more realistic. They have shown hardware solutions that might get us where we need to be soon enough; these are based on hybrid rendering techniques.
I rarely see games that get that close to real-world lighting; those gradients look perfect. GTA 5 comes close, and some others. I haven't played Crysis 3, but Crysis 2 had many problems with colors and shading.
My assumption about ray tracing has been that it simply does everything you need, automatically, as if it were real light. But that is a complete layman's opinion.
Fully realistic shadowing is not a priority for VR; approximation is just fine. For example, look at the mountain scene in Unigine's Valley benchmark.
What we need as we go from 2K to 4K pixels for VR is lots of beautifully textured polygons and dynamic movement of them.
Full reflection is another thing that is approximated (be it crappily) by in-scene rendered surfaces.
Any hybrid engine has to deliver lots of surfaces, be they polygons or straight pixels. I don't think ray tracing will scale to 2K @ 75 Hz.
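A quick back-of-envelope calculation shows why the scaling worry is reasonable: at 2K and 75 Hz, even modest per-pixel sampling demands tens of billions of rays per second. All the sample and bounce counts below are illustrative, not anyone's published figures:

```python
def rays_per_second(width, height, fps, spp, rays_per_sample):
    """Back-of-envelope ray budget: pixels x frame rate x samples per
    pixel x rays per path sample (bounces + shadow rays)."""
    return width * height * fps * spp * rays_per_sample

# 2560x1440 @ 75 Hz, 16 samples/pixel, ~4 rays per path sample:
budget = rays_per_second(2560, 1440, 75, 16, 4)
print(f"{budget / 1e9:.1f} billion rays/sec")  # 17.7 billion rays/sec
```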
Sun Mar 23, 2014 2:04 pm
GeraldT
Certif-Eyable!
Joined: Fri Jan 18, 2013 9:10 am Posts: 1057 Location: Germany
Sorry GeraldT, VR is going pixels big time, which just demands more GPU grunt. So the best means to do this is polygons with the best texturing possible. Is Brigade 3 even real-time? Nothing flashy like skin subsurface illumination, just your basic pipeline. There is a lot of fancy GPU code that makes fantastic pictures, but not at 75 Hz.
Ray tracing is so amazing... The Honda in the footage looks just gorgeous. I think we will very likely reach a Matrix-like photoreal world within a decade.