I am writing a multi-pass GL rendering app for iOS. The first pass renders to a depth buffer texture. The second pass uses the values in the depth buffer to control the application of a fragment shader. I want to rescale the values in the depth buffer to something useful, but before I can do that I need to know the range of the values it holds. How do I do this?
Answer
The range of the values written to the depth buffer is whatever you want it to be; typically they fall in the 0-to-1 range. The actual value written into the depth buffer is computed during the viewport transformation, based on the Z value of the vertex in NDC space (after the perspective divide by W in clip space).
The NDC depth value (Z, after the perspective divide by W) is scaled by the depth portion of the viewport transformation (the same transformation that brings your X and Y coordinates into a coordinate space you'd associate with pixels in the window), and then scaled by 2^n - 1, where n is the bit precision of the depth buffer. The resulting value is what gets written to the depth buffer.
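As a rough sketch of that stage (the variable names are illustrative, not part of any API, and a 24-bit depth buffer is assumed), the fixed-function math works out to something like this:

```c
/* Sketch of the fixed-function depth write, assuming a 24-bit depth buffer
 * and the values set by glDepthRange (near/far, defaulting to 0.0 and 1.0).
 * Variable names are illustrative only. */
float z_ndc = 0.25f;                        /* Z after the perspective divide, in [-1, 1] */
float range_near = 0.0f, range_far = 1.0f;  /* glDepthRange values */

/* Depth portion of the viewport transform: map [-1, 1] to [near, far] */
float z_window = z_ndc * (range_far - range_near) * 0.5f
               + (range_far + range_near) * 0.5f;

/* Quantize to the integer depth buffer: scale by 2^n - 1 (n = 24 here) */
unsigned int depth_value = (unsigned int)(z_window * ((1u << 24) - 1) + 0.5f);
```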
OpenGL splits the definition of the viewport transformation into the glViewport and glDepthRange calls. glDepthRange is what controls the scale factor responsible for determining the depth range you are asking about. You can call glGetFloatv with the GL_DEPTH_RANGE selector to recover the current range. This allows you to make use of the range without assuming it's 0 to 1 (although in practice, 99.9% of the time, nobody ever changes it).
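For example, a query along these lines (a minimal sketch assuming the OpenGL ES 2 headers on iOS; the function name is illustrative) recovers the current range and remaps a sampled depth value against it:

```c
#include <OpenGLES/ES2/gl.h>

/* Recover the current depth range set by glDepthRange and remap a
 * sampled depth value d into [0, 1]. */
GLfloat normalize_depth(GLfloat d)
{
    GLfloat range[2];                    /* range[0] = near, range[1] = far */
    glGetFloatv(GL_DEPTH_RANGE, range);  /* defaults to 0.0 and 1.0 */
    return (d - range[0]) / (range[1] - range[0]);
}
```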
Further reading is worthwhile if you want some insight into how to reconstruct the math and follow the Z value all the way from eye space to the depth buffer.