getCurrentTexture().createView() fails on dedicated GPU

Hi everyone,

I'm not sure whether this is the right place to ask for dev support. I did
not find anything helpful when searching for the problem below, so I
thought I'd ask here. My apologies if this is the wrong place!

I'm on Ubuntu 22.04 with the latest Chrome (stable and unstable). My
notebook has a dedicated GeForce RTX 3050 6GB (Laptop) GPU and an
integrated Intel Iris Xe. I tried different driver versions etc. while
troubleshooting the problem, but am currently on nvidia-driver-550.

My WebGPU program runs fine on the integrated GPU (powerPreference:
"low-power"), but not on the dedicated RTX ("high-performance"). Even a
minimal WebGPU initialization triggers the problem. It happens when the
context is asked to provide a view of the current texture
(context.getCurrentTexture().createView()). The error printed in Chrome's
console is the following:

-----------

Requested allocation size (1228800) is smaller than the image requires
(1310720).
    at ImportMemory
(../../third_party/dawn/src/dawn/native/vulkan/external_memory/MemoryServiceImplementationOpaqueFD.cpp:131)

localhost/:1 [Invalid Texture] is invalid.
 - While calling [Invalid Texture].CreateView([TextureViewDescriptor]).

-----------

The canvas resolution in this example is 640x480. The preferred canvas
format is "bgra8unorm", and the same is true for the GPUTexture that comes
back from context.getCurrentTexture(). I fail to understand where the
extra size the image requires comes from. What am I not providing? Or is
it something with my drivers?

Interestingly, if I remove the presentation logic (i.e. no blit pass) from
my program, the NVIDIA GPU happily executes the compute shaders etc.
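To illustrate what I mean by the compute-only path, something roughly
along the lines of the sketch below (the shader and buffer are just
placeholders, not my actual program) runs without any error on the
"high-performance" adapter, as long as no canvas context is involved:

  // Compute-only sketch: no canvas, no getCurrentTexture(), no blit pass.
  async function computeOnly(device)
  {
    const module = device.createShaderModule({ code: `
      @group(0) @binding(0) var<storage, read_write> data : array<f32>;
      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id : vec3<u32>) {
        data[id.x] = f32(id.x);
      }` });

    const pipeline = device.createComputePipeline({
      layout: "auto",
      compute: { module, entryPoint: "main" } });

    const buffer = device.createBuffer({
      size: 64 * 4,
      usage: GPUBufferUsage.STORAGE });

    const bindGroup = device.createBindGroup({
      layout: pipeline.getBindGroupLayout(0),
      entries: [ { binding: 0, resource: { buffer } } ] });

    const encoder = device.createCommandEncoder();
    const pass = encoder.beginComputePass();
    pass.setPipeline(pipeline);
    pass.setBindGroup(0, bindGroup);
    pass.dispatchWorkgroups(1);
    pass.end();
    device.queue.submit([encoder.finish()]);
  }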

I'm somewhat at a loss. Since the program runs fine on the integrated GPU,
I feel something this basic should not go wrong. But I've been using
WebGPU for only about a year, so my experience is pretty limited. Anything
that points me in the right direction is very much appreciated!

Thank you!

Regards,
Markus


Here is a minimal code example that already triggers the above error for me:

  async function main()
  {
    if(!navigator.gpu)
      throw new Error("No WebGPU");

    // "low-power" does work
    const adapter = await navigator.gpu.requestAdapter({
      powerPreference: "high-performance" });

    if(!adapter)
      throw new Error("Failed to request adapter");

    const device = await adapter.requestDevice();
    if(!device)
      throw new Error("Failed to request device");

    const canvas = document.querySelector("canvas");
    canvas.width = 640;
    canvas.height = 480;

    const context = canvas.getContext("webgpu");
    context.configure({ device,
      format: navigator.gpu.getPreferredCanvasFormat(),
      alphaMode: "opaque" });

    // The descriptor is never used; the error already occurs at createView().
    const renderPassDescriptor = {
      colorAttachments: [
        { view: context.getCurrentTexture().createView(), // Fails with the error above
          clearValue: { r: 1, g: 0, b: 0, a: 1.0 },
          loadOp: "clear",
          storeOp: "store" } ]
    };
  }

  main();
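
In case it is useful for reproducing this: the validation error can
presumably also be observed programmatically rather than only in the
DevTools console, e.g. with an error scope around the failing call. This
is just a sketch of standard WebGPU error handling, not something specific
to my program:

  // Sketch: capture the validation error from createView() instead of
  // only seeing it in the console.
  async function tryCreateView(device, context)
  {
    device.pushErrorScope("validation");
    const view = context.getCurrentTexture().createView();
    const error = await device.popErrorScope();
    if(error)
      console.log("createView() failed:", error.message);
    return view;
  }

  // Errors not captured by a scope can also be observed via:
  // device.addEventListener("uncapturederror",
  //   (e) => console.log("Uncaptured:", e.error.message));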
