Multi-Texture Blending Renderer
Create a WebGL renderer that blends multiple textures together using different texture units.
Requirements
Your implementation should:
- Create a headless WebGL rendering context with dimensions 512x512
- Set up a simple quad (two triangles forming a rectangle) that covers the entire viewport
- Create and configure three different textures:
  - Texture 1: a 256x256 red gradient (top-to-bottom)
  - Texture 2: a 256x256 green gradient (left-to-right)
  - Texture 3: a 256x256 blue gradient (diagonal)
- Bind each texture to a different texture unit (e.g. TEXTURE0, TEXTURE1, TEXTURE2)
- Write a fragment shader that samples all three textures and blends them together by averaging their RGB values
- Render the scene and extract the pixel data from the framebuffer
- Save the output as a PNG image file named `output.png`
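The three gradient textures above can be generated as raw RGBA pixel buffers before uploading them with `texImage2D`. A minimal sketch, assuming a helper named `makeGradient` (the name and the `kind` parameter are illustrative, not part of the spec):

```javascript
// Build a 256x256 RGBA buffer for each gradient described above.
// `kind` selects which channel ramps, and along which axis.
function makeGradient(size, kind) {
  const data = new Uint8Array(size * size * 4);
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      const i = (y * size + x) * 4;
      if (kind === "red") data[i] = Math.round((y / (size - 1)) * 255);       // top-to-bottom
      if (kind === "green") data[i + 1] = Math.round((x / (size - 1)) * 255); // left-to-right
      if (kind === "blue") data[i + 2] = Math.round(((x + y) / (2 * (size - 1))) * 255); // diagonal
      data[i + 3] = 255; // opaque alpha
    }
  }
  return data;
}

const red = makeGradient(256, "red");
const green = makeGradient(256, "green");
const blue = makeGradient(256, "blue");
console.log(red[3], red[(255 * 256) * 4]); // 255 255 (opaque alpha; full red on the last row)
```

Each buffer can then be uploaded on its own texture unit with `gl.activeTexture` followed by `gl.texImage2D`.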
Technical Details
- Use appropriate texture parameters (filtering and wrapping modes)
- The fragment shader should receive texture coordinates and sample from all three textures
- Pass the texture unit indices to the shader's sampler uniforms (e.g. via gl.uniform1i)
- The final color should be the average of the three sampled colors
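One way to express the averaging shader described above, with a CPU-side reference for the same blend that is handy when checking expected pixel values. The sampler names (`uTex0`..`uTex2`) and varying name (`vUv`) are illustrative assumptions, not mandated by the spec:

```javascript
// Fragment shader: sample all three textures and average their RGB values.
const fragmentSrc = `
precision mediump float;
varying vec2 vUv;
uniform sampler2D uTex0; // bound to texture unit 0 via gl.uniform1i(loc, 0)
uniform sampler2D uTex1; // texture unit 1
uniform sampler2D uTex2; // texture unit 2
void main() {
  vec4 c0 = texture2D(uTex0, vUv);
  vec4 c1 = texture2D(uTex1, vUv);
  vec4 c2 = texture2D(uTex2, vUv);
  gl_FragColor = vec4((c0.rgb + c1.rgb + c2.rgb) / 3.0, 1.0);
}`;

// CPU-side reference implementation of the same per-channel average.
function average3(a, b, c) {
  return a.map((_, i) => (a[i] + b[i] + c[i]) / 3);
}

console.log(average3([255, 0, 0], [0, 255, 0], [0, 0, 255])); // [ 85, 85, 85 ]
```

Because each gradient lives in a distinct channel, the averaged output carries a visible contribution from every texture at almost every pixel.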
Test Cases
- The output image dimensions should be 512x512 pixels @test
- The center pixel (256, 256) should contain contributions from all three textures @test
- All three textures should be properly bound to separate texture units @test
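Once `gl.readPixels` has filled an RGBA `Uint8Array`, the center-pixel test above reduces to indexing into that buffer. A sketch with a synthetic buffer standing in for the real framebuffer readback (note that `readPixels` returns rows bottom-up, so a flip may be needed before comparing against image coordinates):

```javascript
// Return the [r, g, b, a] values of pixel (x, y) in a tightly packed
// RGBA buffer of the given width.
function pixelAt(buf, width, x, y) {
  const i = (y * width + x) * 4;
  return [buf[i], buf[i + 1], buf[i + 2], buf[i + 3]];
}

// Synthetic 512x512 framebuffer filled with a uniform mid-gray, standing
// in for the actual gl.readPixels output.
const width = 512;
const frame = new Uint8Array(width * width * 4).fill(128);

const center = pixelAt(frame, width, 256, 256);
// For the blended gradients, R, G, and B should each be non-zero here.
console.log(center.every((v) => v > 0)); // true
```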
Implementation
@generates
Dependencies { .dependencies }
gl { .dependency }
Provides headless WebGL rendering context.
pngjs { .dependency }
Provides PNG encoding support for saving the output image.