I am very new to game development and I have been trying to understand the difference between Screen Space Reflection and Physically Based Rendering.
I have read about PBR, and from what I understand, it tries to mimic how light reflects in real life: the light usually gets split into two components, specular and diffuse, depending on the type of material.
As for SSR, please correct me if I am wrong, but my understanding is that it is about how reflections look on a surface.
If my understanding of SSR is correct, then aren't they in a way the same? I mean, isn't the way reflections look on a surface dependent on the surface roughness, etc.? That in turn affects how much light is specularly reflected and how much is diffusely reflected. Again, please correct me anywhere I am wrong.
Answer
Physically-Based Rendering
You're on the right track when you say "it tries to mimic how light reflects in real life: the light usually gets split into two components, specular and diffuse, depending on the type of material."
But we've been modelling materials with specular and diffuse components in games & computer graphics for a long time. The catch is that we used to handle these as two completely independent terms - changing the specularity didn't change the diffuse:
This is an example of Phong shading from the Blender wiki. You can see that it exposes two specular parameters, intensity and hardness, and these only change the whitish part of the reflection. The blue diffuse reflection doesn't change at all.
In practice, an artist would be tasked with hand-tuning these values for each material until it "looked right." Because "specular hardness" isn't a real physical property of materials that we can measure precisely, it had to be done by eye.
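To make that concrete, here's a minimal CPU-side sketch of that kind of Phong-style shading (plain Python with numpy rather than actual shader code; the vectors and parameter values are made up for illustration). Note how the specular parameters shape only the highlight, while the diffuse term is computed independently:

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def phong(normal, to_light, to_eye, diffuse_color,
          specular_intensity=0.5, specular_hardness=50):
    # Classic Phong: the two specular parameters only shape the whitish
    # highlight; the diffuse term is computed separately, so nothing forces
    # the two components to share a fixed budget of light.
    n, l, v = normalize(normal), normalize(to_light), normalize(to_eye)
    r = 2.0 * np.dot(n, l) * n - l          # light direction mirrored about the normal

    diffuse = np.asarray(diffuse_color) * max(np.dot(n, l), 0.0)
    specular = specular_intensity * max(np.dot(r, v), 0.0) ** specular_hardness
    return diffuse + specular

# Raising specular_intensity brightens the highlight, but the blue diffuse
# contribution is exactly the same in both results:
print(phong([0, 0, 1], [0, 0.3, 1], [0, -0.3, 1], [0.1, 0.2, 0.8], 0.2, 50))
print(phong([0, 0, 1], [0, 0.3, 1], [0, -0.3, 1], [0.1, 0.2, 0.8], 0.9, 200))
```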
This method is a bit brittle. As you change the illumination (say, a dynamic object moving through different areas, or in an environment with time of day and weather) it can look subtly wrong - too bright or too dark - as the viewing conditions aren't the same as the ones its specular parameters were tuned for.
Enter Physically-Based Rendering, which is an attempt to ground our material descriptions in objective, measurable properties of real surfaces. One of the most apparent consequences is conservation of energy - a rougher surface will scatter light diffusely, and a smoother or more metallic surface will reflect it more directly, but it's the same pool of light they're both drawing from. So, other things being equal, as we make a material shinier, its diffuse component should get darker:
This example is from a Marmoset article explaining PBR, originally shared by Syntac_.
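As a toy illustration of the energy-conservation idea only (a real physically-based BRDF, such as Cook-Torrance with a microfacet distribution and a Fresnel term, is considerably more involved), here is a hypothetical shading function where the specular and diffuse terms draw from one shared budget of light:

```python
import numpy as np

def energy_conserving_shade(n_dot_l, r_dot_v, albedo, shininess_fraction,
                            hardness=50):
    # Whatever fraction of the incoming light we devote to the specular lobe
    # is removed from the diffuse pool, so turning the shininess up
    # automatically darkens the diffuse response.
    specular = shininess_fraction * max(r_dot_v, 0.0) ** hardness
    diffuse = (1.0 - shininess_fraction) * np.asarray(albedo) * max(n_dot_l, 0.0)
    return diffuse + specular

dull = energy_conserving_shade(0.9, 0.99, [0.1, 0.2, 0.8], shininess_fraction=0.1)
shiny = energy_conserving_shade(0.9, 0.99, [0.1, 0.2, 0.8], shininess_fraction=0.8)
print(dull, shiny)   # the shinier version has a visibly darker diffuse term
```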
There's more to physically-based rendering than energy conservation, but this is probably the most telltale sign that you're working with a physically-based system.
By keeping the reflection models similar to how materials work in real life, we reduce the need for fudge factors and artist subjectivity to get a real-world material like wood or concrete or leather to look real under a wide variety of lighting conditions.
Note that another answer described this in terms of indirect illumination from light bouncing off other objects in the scene. While many lighting systems that use physically-based models also include tools to model this, it's usually known by a separate name: Global Illumination. This is the effect that makes one side of the diffuse head in this image appear green, as it is illuminated by light bouncing off the green wall:
Image from this article on global illumination
Screenspace Reflection
While PBR tries to model how the material reflects light, Screenspace Reflection tries to capture what is being reflected - specifically, for a shiny, mirror-like surface, what should I see in the reflection?
Again, this is a relatively recent rendering technique that's probably clearest to understand by contrast with how games did it before:
Flipped Rendering - common for water planes or flat mirrors: we literally render all of the reflected geometry a second time, mirrored across the plane of the reflective surface (see the sketch after this list). This gives high-quality reflections (full detail, and objects in contact with the surface line up with their reflections) but only works correctly for flat surfaces. The more wavy or bumpy a surface is, the less this behaves like real reflections, which should distort or blur in complex ways.
Cube Maps - these let us store the colour that would be seen along any view ray radiating out from a chosen center point. By dynamically rendering cube maps from selected points in the scene, we can estimate what colour should be reflected off any arbitrarily curved surface. The trouble here is that the cube map is only completely correct at its center point - as the point where we're simulating reflection moves around the scene, it should see some parallax, which isn't present in the cube map. This means objects don't tend to line up with their reflections.
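For the flipped-rendering approach, the mirroring is typically done with a single reflection matrix applied to the scene (or to the camera) before the second rendering pass. A minimal sketch, assuming the plane is given by a normal and a point on it:

```python
import numpy as np

def reflection_matrix(plane_normal, plane_point):
    # 4x4 matrix mirroring points across the plane n.x + d = 0
    # (e.g. a water plane), for the second "flipped" rendering pass.
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    d = -np.dot(n, plane_point)
    m = np.identity(4)
    m[:3, :3] -= 2.0 * np.outer(n, n)   # Householder reflection of directions
    m[:3, 3] = -2.0 * d * n             # offset so points on the plane stay fixed
    return m

# Mirror a point one unit above a horizontal water surface at y = 2:
water = reflection_matrix([0, 1, 0], [0, 2, 0])
print(water @ np.array([3.0, 3.0, 5.0, 1.0]))   # -> [3. 1. 5. 1.]
```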
Screenspace Reflection tries to address these limitations by using the rendered scene itself as the source of reflection information. It ray-marches a reflected view ray, using the scene's depth buffer, until it intersects something in the rendered scene.
Here's a slide from an EA DICE presentation about their approach to reflections in the Frostbite engine.
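This is not the DICE implementation, just a bare-bones sketch of the ray-marching loop. It assumes the depth buffer is a 2D array of depths and that a hypothetical `project` helper maps a view-space point to its pixel coordinates plus its depth:

```python
import numpy as np

def ssr_raymarch(depth_buffer, origin_vs, reflect_dir_vs, project,
                 num_steps=64, step_size=0.1, thickness=0.05):
    # Step along the reflected ray in view space; at each step, project the
    # sample onto the screen and compare against the depth buffer. A hit
    # tells us which already-rendered pixel to reuse as the reflection colour.
    h, w = depth_buffer.shape
    p = np.asarray(origin_vs, dtype=float)
    d = np.asarray(reflect_dir_vs, dtype=float)
    d /= np.linalg.norm(d)

    for _ in range(num_steps):
        p = p + step_size * d
        x, y, ray_depth = project(p)          # view space -> pixel coords + depth
        if not (0 <= x < w and 0 <= y < h):
            return None                       # ray left the screen: no data to reflect
        scene_depth = depth_buffer[int(y), int(x)]
        # The ray has gone behind the surface rendered at this pixel; accept it
        # as a hit unless it passed far behind (the thickness test avoids false
        # hits behind thin foreground objects).
        if ray_depth > scene_depth and ray_depth - scene_depth < thickness:
            return int(x), int(y)
    return None                               # miss: fall back to a cube map, etc.
```

Real implementations add hierarchical or adaptive step sizes, binary-search refinement of the hit point and blurring for rough surfaces, but the core of the technique is this repeated depth comparison.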
This means (with some smart algorithmic work) we can get reflections with reasonably raytracing-like accuracy off arbitrary surfaces in games, with correct alignment for objects in contact and plausible distortion and blurring, as long as the reflected part of the scene is visible on-screen (i.e. not offscreen or occluded by something else). Where the reflection can't be accurately determined by the raymarching, it's usually approximated using nearby samples or a fallback cubemap representing the scene beside/behind the camera's view.
You can see in this example of screenspace reflection that the impression can be very convincing, although small errors are noticeable (see the reflection of the undersides of the cubes, which are not visible in the rendered frame and so simply smear & repeat adjacent pixels, or the holes in the right green curtain's reflection beside the flower pot and at the bottom of the screen, where the raymarching failed to find the right reflected pixels). It's common to use this technique for moderately shiny/slightly rough surfaces, where the occasional error is less visible.