Now you know why I'm so pessimistic about games... because this kind of stuff can't even be done in a game yet. I'm used to seeing correct rendering results, not gobos.

I saw that video.
I had no idea it was this expensive to render, but it makes total sense.
It's kinda funny how the fake gobo method looks 99% correct, because you cannot see the surface and have no idea how the caustics should look.
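For anyone wondering what the fake usually is: just a tiling caustics texture (a gobo, effectively) scrolled across the receiving surface in two directions and combined. Here's a rough NumPy sketch of the idea; the sine mix is my stand-in for a real authored texture, so every constant here is made up:

```python
import numpy as np

def caustic_layer(u, v, t, scale, speed):
    """One tiling pseudo-caustic layer. A real game would sample a
    pre-authored caustics texture here; the sine mix just keeps the
    sketch self-contained."""
    x = u * scale + speed * t
    y = v * scale - speed * t
    s = np.sin(x * 6.28318) + np.sin((x + y) * 6.28318) + np.sin(y * 10.7)
    return np.clip(s / 3.0 + 0.5, 0.0, 1.0)

def fake_caustics(u, v, t):
    """The usual cheat: two copies of the pattern scrolled in different
    directions, combined with min() so the bright cells break up and
    'swim' like real caustics."""
    a = caustic_layer(u, v, t, scale=4.0, speed=0.05)
    b = caustic_layer(u, v, t, scale=5.3, speed=-0.07)
    return np.minimum(a, b) ** 2  # squaring sharpens the bright filaments

# One frame evaluated over the receiving surface's UVs
u, v = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
frame = fake_caustics(u, v, t=1.5)
print(frame.shape, float(frame.min()), float(frame.max()))
```

It looks plausible precisely because, as you say, nobody can see the surface that would give it away, and projecting it from the light like any other gobo sells it even harder.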
Play with this a bit! It's awesome. Look how this water correctly distorts perspective and how good the caustics look. And it's an old demo.
edit: Enable gravity with G, raise the ball, and drop it. You will see the caustics properly form from the point of the drop!
Yeah, this demo is at least 10 years old... so what gives?
Literally recreating the physics of photons interacting with object surfaces is one of the final pieces of the lighting puzzle in rendering. Up next: recreating plasma.
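To make "recreating the physics of photons" concrete: the brute-force version is forward photon tracing. Shoot photons down at the surface, bend each one with Snell's law, and histogram where they land on the floor; the bright bins are your caustics. A toy sketch, with an analytic wavy surface standing in for a real mesh (all constants invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def trace_caustic_photons(n_photons=200_000, ior=1.33, bins=128):
    """Shoot photons straight down at a wavy dielectric surface, refract
    each one, and accumulate where they hit the floor plane y = -1."""
    # Random photon entry points over the unit square
    px = rng.uniform(-1, 1, n_photons)
    pz = rng.uniform(-1, 1, n_photons)

    # Analytic wavy surface h(x, z) and its (unnormalised) normals
    h = 0.03 * np.sin(8 * px) * np.sin(8 * pz)
    nx = -0.24 * np.cos(8 * px) * np.sin(8 * pz)   # -dh/dx
    nz = -0.24 * np.sin(8 * px) * np.cos(8 * pz)   # -dh/dz
    ny = np.ones(n_photons)
    inv_len = 1.0 / np.sqrt(nx**2 + ny**2 + nz**2)
    nx, ny, nz = nx * inv_len, ny * inv_len, nz * inv_len

    # Snell refraction of the down direction I = (0,-1,0); eta = n1/n2
    eta = 1.0 / ior
    cos_i = ny                              # cos_i = -dot(I, N) = ny here
    k = np.sqrt(np.maximum(1 - eta**2 * (1 - cos_i**2), 0.0))
    tx = (eta * cos_i - k) * nx
    ty = -eta + (eta * cos_i - k) * ny
    tz = (eta * cos_i - k) * nz

    # March each photon to the floor and bin the hit points
    t = (-1.0 - h) / ty
    counts, _, _ = np.histogram2d(px + t * tx, pz + t * tz,
                                  bins=bins, range=[[-1, 1], [-1, 1]])
    return counts  # bright bins = caustic filaments

print(trace_caustic_photons().max())
```

Even this toy needs hundreds of thousands of photons for a clean 128x128 result, which is a decent intuition for why offline renderers can afford it and games mostly can't.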
Actually, comparing real-time to offline doesn't make sense; the assets don't even serve the same purpose.
LOL, that kind of thing isn't used in production renders either; we fake it as much as possible. Same with gobo shadows, for instance: a lot of the time it's pushed to compositing. Nuke is very powerful for this, using point-position passes and normal passes to project light, shadows, and even caustics onto production renders.
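For anyone who hasn't touched this side of comp: a world-position pass plus a normal pass is enough to compute brand-new lighting per pixel in 2D. This isn't Nuke code, just the math those relighting setups boil down to; a minimal diffuse-only sketch with made-up names and synthetic passes:

```python
import numpy as np

def relight_pass(P, N, light_pos, light_rgb):
    """Comp-style relighting from AOVs.

    P: (H, W, 3) world-space point-position pass
    N: (H, W, 3) world-space normal pass
    Returns an (H, W, 3) diffuse light layer to add/screen over the plate.
    """
    L = light_pos - P                                  # pixel -> light vector
    dist = np.linalg.norm(L, axis=-1, keepdims=True)
    L = L / np.maximum(dist, 1e-6)
    n_dot_l = np.clip(np.sum(N * L, axis=-1, keepdims=True), 0.0, None)
    falloff = 1.0 / np.maximum(dist**2, 1e-6)          # inverse-square falloff
    return n_dot_l * falloff * light_rgb

# Tiny synthetic example: a flat wall at z = 0 facing the camera
H, W = 4, 6
ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
P = np.stack([xs, ys, np.zeros((H, W))], axis=-1).astype(float)
N = np.broadcast_to(np.array([0.0, 0.0, 1.0]), (H, W, 3))
layer = relight_pass(P, N, light_pos=np.array([3.0, 2.0, 4.0]),
                     light_rgb=np.array([10.0, 9.0, 8.0]))
print(layer.shape, float(layer.max()))
```

Plus that layer over the plate and you've relit the shot without going back to 3D, which is exactly why so much of this gets pushed downstream.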
They don't. However, the game world's final goal is to have CG-like games. They are moving there, just not fast enough. I'm sure that when GPUs get powerful enough, the film companies will have first dibs on realtime movies.
It's NEVER going to be fast enough; we can only have real-time recreations of what were milestones of CG development DECADES prior, and it will always be that linear, regardless.
But they already do that; The Mandalorian and its virtual production workflows are one example.
@VFX_Veteran @Sharinghack, what would be the benefit of realtime movies? Are we talking about having the actors on screen in a soundstage and then having the whole scenario rendered without the need for additional VFX, as they'd be added as you capture your footage?

Nah... more that production would be so fast that you could get amazing results without having to wait for render times. The final goal would be hardware that renders fast enough to get you to the comp stage extremely quickly. Lighting wouldn't take nearly as long to get approved by the directors.
As ambitions get higher, so will complexity.
Imagine computing the caustics based on the different wavelengths the photons carry after the initial bounce (as shown by the rainbow effect on the surface in the demo). That's actually NOT free. It requires the shader to no longer treat the light as a single white ray: you have to compute three channels (red, green, blue) separately, each refracting at its own angle.
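Concretely, dispersion means running Snell's law once per channel with a slightly different index of refraction each time, instead of once for "white". A minimal sketch; the per-channel IORs are illustrative glass-like values I picked, not measured data:

```python
import numpy as np

def refract(incident, normal, eta):
    """Snell's-law refraction of unit vectors (same convention as GLSL's
    refract(): eta = n1 / n2, incident points toward the surface)."""
    cos_i = -np.dot(incident, normal)
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None  # total internal reflection
    return eta * incident + (eta * cos_i - np.sqrt(k)) * normal

# Illustrative per-channel indices of refraction for a glass-like
# material: red bends least, blue bends most.
IOR = {"red": 1.510, "green": 1.520, "blue": 1.535}

incident = np.array([0.0, -1.0, 0.0])  # light heading straight down
normal = np.array([0.3, 0.95, 0.0])
normal /= np.linalg.norm(normal)

# One refraction per channel instead of one for the whole ray
for channel, n2 in IOR.items():
    print(channel, refract(incident, normal, 1.0 / n2))
```

Three slightly different exit directions, so three rays to trace onward from every dispersive hit; that's the cost the shader takes on once it stops assuming white light.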
Yes, this is done with spectral rendering, and it's pretty much only used in scientific fields. It's very expensive and would be wasteful to implement in video games (even if it were possible). Even within spectral rendering, there are different methods with varying degrees of accuracy.

Thanks for your input, sir!!
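To make "expensive" concrete: a spectral renderer carries radiance at many sampled wavelengths per path instead of one RGB triple, and only collapses to colour at the very end by integrating against the CIE colour-matching functions. Here's a minimal sketch of that final step; the piecewise-Gaussian curve fits are crude approximations I'm using in place of the real tabulated CIE data:

```python
import numpy as np

def g(x, mu, s1, s2):
    """Piecewise Gaussian: different widths left/right of the peak."""
    s = np.where(x < mu, s1, s2)
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

def cie_xyz_bar(lam):
    """Rough analytic stand-ins for the CIE 1931 colour-matching
    functions; production spectral renderers use tabulated data."""
    x = (1.056 * g(lam, 599.8, 37.9, 31.0) + 0.362 * g(lam, 442.0, 16.0, 26.7)
         - 0.065 * g(lam, 501.1, 20.4, 26.2))
    y = 0.821 * g(lam, 568.8, 46.9, 40.5) + 0.286 * g(lam, 530.9, 16.3, 31.1)
    z = 1.217 * g(lam, 437.0, 11.8, 36.0) + 0.681 * g(lam, 459.0, 26.0, 13.8)
    return x, y, z

def spectrum_to_rgb(spd, lam):
    """Integrate a spectral power distribution against the matching
    functions, then map XYZ to linear sRGB with the standard matrix."""
    xb, yb, zb = cie_xyz_bar(lam)
    X = np.trapz(spd * xb, lam)
    Y = np.trapz(spd * yb, lam)
    Z = np.trapz(spd * zb, lam)
    M = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    # Negative channels just mean the colour is outside the sRGB gamut
    return M @ np.array([X, Y, Z])

# Example: a narrow-band spectrum around 520 nm comes out strongly green
lam = np.linspace(380.0, 780.0, 81)
spd = np.exp(-0.5 * ((lam - 520.0) / 10.0) ** 2)
print(spectrum_to_rgb(spd, lam))
```

Every sample along every path pays that per-wavelength cost before you even get a colour out, which is a big part of why it stays in scientific and high-end offline work.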
Upcoming RTX 4090 is 450W TDP, and I am not sure it will run native 4K with ray tracing at 60 fps.
Keep your hopes down