Description
I have a program that creates and fills `Depth16Unorm` depth textures on the CPU, then reads that data (`textureLoad`) in a shader to render point clouds.
```wgsl
@group(1) @binding(1)
var depth_texture: texture_2d<f32>;

let norm_linear_depth = textureLoad(depth_texture, texcoords, 0).x;
```
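For context, the wgpu-side binding that pairs with this declaration would look roughly like the sketch below. The binding index comes from the snippet above; the shader stage is an assumption. Depth formats can be bound with an unfilterable-float sample type, which is what `texture_2d<f32>` maps to in WGSL.

```rust
/// Sketch: bind group layout entry for reading a depth texture as an
/// unfilterable float texture, matching `texture_2d<f32>` in WGSL.
fn depth_texture_layout_entry() -> wgpu::BindGroupLayoutEntry {
    wgpu::BindGroupLayoutEntry {
        binding: 1, // matches @binding(1) in the shader above
        visibility: wgpu::ShaderStages::FRAGMENT, // assumption: read in the fragment stage
        ty: wgpu::BindingType::Texture {
            sample_type: wgpu::TextureSampleType::Float { filterable: false },
            view_dimension: wgpu::TextureViewDimension::D2,
            multisampled: false,
        },
        count: None,
    }
}
```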
On native, this works without issue, as the spec suggests.
When using the WebGL backend, on the other hand, my shader just doesn't work. I cannot see any relevant logs in the browser console, and forcing a fragment color output does nothing, so I assume the shader crashes right at the `textureLoad`.
Converting my `u16` data to `f32` on the CPU and then uploading it into an `R32Float` texture fixes the issue; no other changes are required.
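For reference, a sketch of what the CPU-side conversion in this workaround could look like (the helper name and the normalization to the 0..=1 range are my assumptions, not the original code):

```rust
/// Hypothetical helper: widen 16-bit unorm depth samples to f32 in the
/// 0.0..=1.0 range so they can be uploaded into an `R32Float` texture.
fn depth_u16_to_f32(depth: &[u16]) -> Vec<f32> {
    depth
        .iter()
        .map(|&d| d as f32 / u16::MAX as f32)
        .collect()
}
```

The texture is then created exactly as before, just with `format: wgpu::TextureFormat::R32Float` and a 4-byte-per-texel upload instead of 2.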
Repro steps
Using the WebGL backend:
- Create a `Depth16Unorm` texture
- Fill it with arbitrary data from the CPU (these two steps are sketched below)
- Read from it in a shader
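To make the first two steps concrete, here is a minimal sketch of how such a texture could be created and filled with wgpu 0.15. The function and variable names are placeholders, and descriptor details such as the `view_formats` field and the `NonZeroU32` row pitch differ between wgpu versions.

```rust
use std::num::NonZeroU32;

/// Sketch of the repro (steps 1 and 2): create a `Depth16Unorm` texture and
/// fill it from the CPU, assuming a wgpu 0.15 `device`/`queue` already exist.
fn create_and_fill_depth16(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    width: u32,
    height: u32,
    data: &[u16], // width * height depth samples generated on the CPU
) -> wgpu::Texture {
    let size = wgpu::Extent3d {
        width,
        height,
        depth_or_array_layers: 1,
    };
    let texture = device.create_texture(&wgpu::TextureDescriptor {
        label: Some("point-cloud depth"),
        size,
        mip_level_count: 1,
        sample_count: 1,
        dimension: wgpu::TextureDimension::D2,
        format: wgpu::TextureFormat::Depth16Unorm,
        usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
        view_formats: &[],
    });

    // Depth16Unorm is 2 bytes per texel; upload the raw u16 data.
    let bytes: Vec<u8> = data.iter().flat_map(|d| d.to_le_bytes()).collect();
    queue.write_texture(
        wgpu::ImageCopyTexture {
            texture: &texture,
            mip_level: 0,
            origin: wgpu::Origin3d::ZERO,
            aspect: wgpu::TextureAspect::All,
        },
        &bytes,
        wgpu::ImageDataLayout {
            offset: 0,
            bytes_per_row: NonZeroU32::new(2 * width),
            rows_per_image: None,
        },
        size,
    );
    texture
}
```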
Expected vs observed behavior
Expected: writing to and reading from a `Depth16Unorm` texture works on both native and web, as the spec suggests.
Observed: it works on native, but on the WebGL backend the shader produces no output and logs nothing as soon as it reads from the texture.
Platform
- wgpu 0.15
- Linux (Arch), also reproduced on macOS
- Latest Firefox & Chromium