
Cannot read from / write to Depth16Unorm texture on WebGL backend #3537

@teh-cmc

Description

I have a program that creates and fills Depth16Unorm depth textures on the CPU, then reads that data back in a shader (via textureLoad) to render point clouds.

@group(1) @binding(1)
var depth_texture: texture_2d<f32>;

let norm_linear_depth = textureLoad(depth_texture, texcoords, 0).x;
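
For reference, the host-side setup looks roughly like the following. This is a minimal sketch against the wgpu 0.15 API rather than the actual program: the texture size, the depth_data slice, and the use of bytemuck for byte casting are illustrative assumptions, and device/queue are assumed to already exist.

use std::num::NonZeroU32;

// Dimensions are placeholders.
let size = wgpu::Extent3d { width: 640, height: 480, depth_or_array_layers: 1 };

// Create the Depth16Unorm texture so it can be filled via write_texture
// and bound as a texture in the shader.
let depth_texture = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("depth16_texture"),
    size,
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    format: wgpu::TextureFormat::Depth16Unorm,
    usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
    view_formats: &[],
});

// depth_data: &[u16], one value per texel, produced on the CPU.
queue.write_texture(
    wgpu::ImageCopyTexture {
        texture: &depth_texture,
        mip_level: 0,
        origin: wgpu::Origin3d::ZERO,
        aspect: wgpu::TextureAspect::All,
    },
    bytemuck::cast_slice(depth_data),
    wgpu::ImageDataLayout {
        offset: 0,
        bytes_per_row: NonZeroU32::new(size.width * 2), // 2 bytes per Depth16Unorm texel
        rows_per_image: NonZeroU32::new(size.height),
    },
    size,
);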

On native, this works without issue, as the spec suggests.

When using the WebGL backend, on the other hand, my shader just doesn't work: there are no relevant logs in the browser console, and hard-coding the fragment color output changes nothing, so I assume the shader fails right at the textureLoad.

Converting my u16 data to f32 on the CPU and then uploading that into an R32Float texture fixes the issue; no other changes are required.
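
The workaround, as a rough sketch (the normalization by u16::MAX and the identifiers are assumptions about the original data; size, device, and queue are as in the sketch above):

// Convert the u16 depth values to normalized f32 on the CPU and upload them
// into an R32Float texture instead. The texture is created exactly as above,
// except format: wgpu::TextureFormat::R32Float and bytes_per_row of 4 * width.
let depth_data_f32: Vec<f32> = depth_data
    .iter()
    .map(|&d| d as f32 / u16::MAX as f32) // renormalize to [0, 1]
    .collect();

The shader-side binding (texture_2d<f32>) and the textureLoad call stay exactly the same.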

Repro steps

Using the WebGL backend:

  1. Create a Depth16Unorm texture
  2. Fill it with arbitrary data
  3. Read from it in a shader (see the bind group layout sketch below)
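
For step 3, a bind group layout entry compatible with the texture_2d<f32> declaration shown above would look roughly like this. This is a sketch: WebGPU allows binding the depth aspect of a depth format with a non-filterable float sample type, and the shader stage below is an assumption.

// Layout entry for @group(1) @binding(1) in the shader above.
wgpu::BindGroupLayoutEntry {
    binding: 1,
    visibility: wgpu::ShaderStages::VERTEX_FRAGMENT, // whichever stage does the textureLoad
    ty: wgpu::BindingType::Texture {
        // A depth-format view bound as texture_2d<f32> (rather than
        // texture_depth_2d) uses an unfilterable float sample type.
        sample_type: wgpu::TextureSampleType::Float { filterable: false },
        view_dimension: wgpu::TextureViewDimension::D2,
        multisampled: false,
    },
    count: None,
}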

Expected vs observed behavior

Expected: writing to and reading from a Depth16Unorm texture works on both native and web, as the spec suggests. Observed: it works on native, but on the WebGL backend the shader silently fails when reading the texture.

Platform

  • wgpu 0.15
  • Linux (Arch), also reproduced on macOS
  • Latest Firefox & Chromium


Labels

backend: gles (Issues with GLES or WebGL)
type: bug (Something isn't working)
