I want to composite conventional triangle-based models and particles with a ray-traced scene at a reasonable frame rate.
WebGL does not let you write gl_FragDepth in the fragment shader. You also cannot have multiple render targets, but you can render to an RGBA texture and then use that texture as input to another draw op. So I can render my ray-traced scene to a texture, packing the depth into the color channels, and feed that to the next stage. This is obviously far from ideal, but workable.
I'd love to be wrong about this; it's gathered from trial and error rather than any definitive source, so please correct any flawed assumptions.
How can you efficiently pack and unpack a float (and, ideally, a few flag bits) into an RGBA texture in a GLSL fragment shader?