The Difficulty of Caustics Rendering, via Corridor Crew.

Deleted member 13

Guest
Caustics have always been a huge expense in rendering: you sample rays through media and accumulate the gathered photon energy onto the next surface. Our rendering has always been expensive with caustics no matter which rendering mode you use (they obviously have to be ray-traced). Imagine computing the caustics based on the different wavelengths the photons carry after the initial bounce (as shown by the rainbow effect on the surface in the demo). That's actually NOT free: the shader can no longer assume an equal white across all colors; it has to compute three channels (red, green, blue) separately.
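The per-channel cost described above can be sketched in a few lines. This is a minimal illustration, not code from the demo: the per-channel indices of refraction below are made-up dispersive-glass values, and the geometry is a single hypothetical surface hit.

```python
import math

def refract(incident, normal, eta):
    # Vector form of Snell's law. `incident` points toward the surface,
    # `normal` points back toward the incoming ray; both unit length.
    # Returns None on total internal reflection.
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * i + (eta * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))

# Illustrative per-channel indices of refraction for a dispersive glass:
# the shader can no longer trace one ray for "white" light -- it traces
# (at least) one refracted direction per channel, tripling this step.
IOR = {"red": 1.510, "green": 1.515, "blue": 1.525}

incident = (0.0, -1.0, 0.0)                    # ray heading straight down
normal = (math.sin(0.3), math.cos(0.3), 0.0)   # tilted surface normal

for channel, ior in IOR.items():
    d = refract(incident, normal, 1.0 / ior)   # air -> glass
    print(channel, d)
```

Each channel bends by a slightly different amount, which is exactly what spreads the photons into the rainbow pattern on the receiving surface.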
 

rofif

...owns a 3080...why?
24 Jun 2022
I saw that video.
I had no idea it's this expensive to render but it makes total sense.
It's kinda funny how the faked method looks 99% correct, because you cannot see the surface and you have no idea how the caustics should look.

Play with this a bit! It's awesome. Look how this water correctly distorts perspective and how good caustics look. And it's an old demo

edit: Enable gravity with G, raise the ball and drop it. You will see the caustics properly start from the point of the drop!
 

Deleted member 13

Guest
I saw that video.
I had no idea it's this expensive to render but it makes total sense.
It's kinda funny how the faked method looks 99% correct, because you cannot see the surface and you have no idea how the caustics should look.

Play with this a bit! It's awesome. Look how this water correctly distorts perspective and how good caustics look. And it's an old demo

edit: Enable gravity with G, raise the ball and drop it. You will see the caustics properly start from the point of the drop!
Now you know why I'm so pessimistic about games.. because this kind of stuff can't even be done in a game yet. I'm used to seeing correct rendering results, not gobos. :p
 
OP

yewles1

Active member
21 Jun 2022
Indianapolis, IN, USA
PSN ID
yewyew1
Now you know why I'm so pessimistic about games.. because this kind of stuff can't even be done in a game yet. I'm used to seeing correct rendering results, not gobos. :p
Actually, comparing real-time to offline doesn't make sense; the assets don't even serve the same purpose.
 

Sharinghack

Active member
21 Jun 2022
Lisbon, Portugal
www.pushvfx.com
Now you know why I'm so pessimistic about games.. because this kind of stuff can't even be done in a game yet. I'm used to seeing correct rendering results, not gobos. :p
LOL, that kind of thing isn't used in production renders either; we fake it as much as possible. Same with gobo shadows, for instance: a lot of the time it's pushed to compositing. Nuke is very powerful for this, using point-position passes and normal passes to project light, shadows, and even caustics onto production renders.
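The point-position/normal-pass trick can be sketched per pixel. This is a toy illustration of the idea, not Nuke's actual node API: it assumes two render passes (AOVs) are available — world position P and surface normal N — and recomputes a simple Lambert light term from them, so extra light can be added in comp without re-rendering. The light position and color are made-up values.

```python
def normalize(v):
    # Scale a 3-vector to unit length.
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def relight_pixel(P, N, light_pos, light_rgb):
    # Lambertian contribution rebuilt from the passes alone:
    # direction from the shaded point to the (new) light, clamped N.L.
    L = normalize(tuple(lp - p for lp, p in zip(light_pos, P)))
    ndotl = max(0.0, sum(n * l for n, l in zip(normalize(N), L)))
    return tuple(c * ndotl for c in light_rgb)

# One pixel's AOV samples (hypothetical values):
P = (0.0, 0.0, 0.0)   # world-position pass
N = (0.0, 1.0, 0.0)   # normal pass, surface facing straight up
extra = relight_pixel(P, N, light_pos=(0.0, 5.0, 0.0), light_rgb=(1.0, 0.9, 0.8))
print(extra)  # this term would be added onto the beauty pass in the comp
```

The same per-pixel P and N can drive projected gobo shadows or a fake caustic texture, which is why so much of this work can move downstream into compositing.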
 
Deleted member 13

Guest
Actually, comparing real-time to offline doesn't make sense; the assets don't even serve the same purpose.
They don't. However, the game world's end goal is to have CG-like games. They are moving there, just not fast enough. I'm sure that when GPUs get powerful enough, the film companies will have first dibs on real-time movies.
 
OP

yewles1

Active member
21 Jun 2022
Indianapolis, IN, USA
PSN ID
yewyew1
They don't. However, the game world's end goal is to have CG-like games. They are moving there, just not fast enough. I'm sure that when GPUs get powerful enough, the film companies will have first dibs on real-time movies.
It's NEVER going to be fast enough; we can only have real-time recreations of what were milestones of CG development DECADES prior, and it will always stay that linear, regardless.
 

Sharinghack

Active member
21 Jun 2022
Lisbon, Portugal
www.pushvfx.com
They don't. However, the game world's end goal is to have CG-like games. They are moving there, just not fast enough. I'm sure that when GPUs get powerful enough, the film companies will have first dibs on real-time movies.
But they already do that; The Mandalorian and its virtual production workflow is one example.

Recently, one of the episodes of Love, Death & Robots was entirely rendered in UE5. I tried really hard to spot something, but it was quite uncanny and not that far off from an offline render.

The improvements we are seeing in real-time rendering are very impressive; it's moving so fast that in a few years offline rendering will become obsolete.

EDIT: Here it is, a kind of making-of from the episode.

 
Deleted member 13

Guest
I'll have to check it out and see what I can find.
 

Satoru

Limitless
Founder
20 Jun 2022
@VFX_Veteran @Sharinghack what would be the benefit of real-time movies? Are we talking about having the actors on screen on a soundstage and then having the whole scenario rendered without the need for additional VFX, as the effects would be added as you capture your footage?
 
Deleted member 13

Guest
@VFX_Veteran @Sharinghack what would be the benefit of real-time movies? Are we talking about having the actors on screen on a soundstage and then having the whole scenario rendered without the need for additional VFX, as the effects would be added as you capture your footage?
Nah.. more that production would be so fast that you could get amazing results without having to wait for render times. The final goal would be hardware that renders fast enough that you could get to the comp stage extremely quickly. Lighting wouldn't take nearly as long to get approved by the directors.
 
OP

yewles1

Active member
21 Jun 2022
Indianapolis, IN, USA
PSN ID
yewyew1
Nah.. more that production would be so fast that you could get amazing results without having to wait for render times. The final goal would be hardware that renders fast enough that you could get to the comp stage extremely quickly. Lighting wouldn't take nearly as long to get approved by the directors.
As long as ambitions keep rising, so will complexity.
 

brainchild

Industry Professional (Vetted)
VIP
30 Jun 2022
Imagine computing the caustics based on the different wavelengths the photons carry after the initial bounce (as shown by the rainbow effect on the surface in the demo). That's actually NOT free: the shader can no longer assume an equal white across all colors; it has to compute three channels (red, green, blue) separately.

Yes, this is done with spectral rendering, and it's pretty much only used in scientific fields. It's very expensive and would be wasteful to implement in video games (even if it were possible). Even within spectral rendering, there are different methods with varying degrees of accuracy.
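The core idea behind the spectral approach can be shown with a toy example: instead of three fixed RGB channels, sample many wavelengths and let the index of refraction vary continuously with wavelength (here via Cauchy's empirical equation). The coefficients below are roughly crown-glass-like but purely illustrative, and the 50 nm sampling is one arbitrary accuracy/cost trade-off — finer sampling is more accurate and more expensive.

```python
import math

def cauchy_ior(wavelength_nm, A=1.5046, B=4.2e3):
    # Cauchy's equation: n(lambda) = A + B / lambda^2, wavelength in nm.
    # A and B are illustrative, crown-glass-like coefficients.
    return A + B / (wavelength_nm ** 2)

def refraction_angle(theta_i, n):
    # Snell's law for air -> glass: sin(theta_t) = sin(theta_i) / n.
    return math.asin(math.sin(theta_i) / n)

theta_i = math.radians(45.0)  # 45-degree incident ray

# Sample the visible spectrum in 50 nm bins; each bin refracts slightly
# differently, which is what an RGB renderer collapses into 3 channels.
for wl in range(400, 701, 50):
    n = cauchy_ior(wl)
    theta_t = refraction_angle(theta_i, n)
    print(f"{wl} nm -> n={n:.4f}, theta_t={math.degrees(theta_t):.3f} deg")
```

Shorter (blue) wavelengths see a higher index and bend more than longer (red) ones, and the number of wavelength samples is exactly where the "varying degrees of accuracy" trade-off lives.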
 
Deleted member 13

Guest
Yes, this is done with spectral rendering, and it's pretty much only used in scientific fields. It's very expensive and would be wasteful to implement in video games (even if it were possible). Even within spectral rendering, there are different methods with varying degrees of accuracy.
Thanks for your input, sir!!
 

Killer_Sakoman

Veteran
21 Jun 2022
They don't. However, the game world's end goal is to have CG-like games. They are moving there, just not fast enough. I'm sure that when GPUs get powerful enough, the film companies will have first dibs on real-time movies.
The upcoming RTX 4090 is 450W TDP, and I'm not sure it will run native 4K with ray tracing at 60 fps 😁
Keep your hopes down
 

Satoru

Limitless
Founder
20 Jun 2022
The upcoming RTX 4090 is 450W TDP, and I'm not sure it will run native 4K with ray tracing at 60 fps 😁
Keep your hopes down
