
I am currently working on a C++ program using OpenXR with OpenGL, and I have managed to render everything I wanted in VR.

I have one framebuffer per eye, plus three more for layers (crosshair/menu) that are shown with different button presses (in VR this is done via xrEndFrame).

The problem is that I want to mirror everything also onto the computer screen.

Most of it was simple: I just used glBlitNamedFramebuffer to mirror one of the eye framebuffers onto the default framebuffer (0) so it gets displayed.
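For reference, the mirror blit looks roughly like this. This is a minimal sketch: eyeFbo, eyeW/eyeH, and winW/winH are placeholder names, and the aspect-ratio helper is a hypothetical addition so the mirror image isn't stretched when the window shape differs from the eye buffer.

```cpp
#include <algorithm>

// Hypothetical helper: destination rectangle inside the window that
// preserves the eye buffer's aspect ratio (letterboxed as needed).
struct Rect { int x, y, w, h; };

Rect mirrorRect(int eyeW, int eyeH, int winW, int winH) {
    float scale = std::min(winW / float(eyeW), winH / float(eyeH));
    int w = int(eyeW * scale);
    int h = int(eyeH * scale);
    return { (winW - w) / 2, (winH - h) / 2, w, h };
}

// Usage with the mirror blit (eyeFbo = one eye's framebuffer object):
//   Rect d = mirrorRect(eyeW, eyeH, winW, winH);
//   glBlitNamedFramebuffer(eyeFbo, 0,            // source FBO -> default FBO
//                          0, 0, eyeW, eyeH,     // source rectangle
//                          d.x, d.y, d.x + d.w, d.y + d.h,
//                          GL_COLOR_BUFFER_BIT, GL_LINEAR);
```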

But I can't use this method for the layers, since a blit is purely 2D on the screen; I need to position each layer in 3D, for example placing the crosshair wherever I am looking in VR.

I can draw a quad at the position I want, but I don't know how to get the contents of the framebuffer onto the quad. I thought glReadPixels might help, but that seemed rather inefficient.

  • I don't understand the statement "But I can't use this method for the layers, as it is just in 2D" — framebuffers are also just 2D. Commented Jun 21, 2024 at 8:43
  • Sorry, I meant that I have to position the content of the framebuffer in the 3D world, so I have a 2D layer visible, but for example it has to be where I am looking in VR, not just in the middle of the screen. Commented Jun 21, 2024 at 8:50
  • Can you make your "layer" framebuffers be backed by a texture instead? Then you can just render that to a quad. Commented Jun 21, 2024 at 8:58
  • I might have a texture. I have an XrSwapchainImageOpenGLKHR, where "image is the OpenGL texture handle associated with this swapchain image". Commented Jun 21, 2024 at 9:44
  • Thanks, I managed to get the texture (it looked wrong at first) and rendered them onto quads. It looks a bit different than before, but I can work with that. Commented Jun 28, 2024 at 9:23
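The texture-backed framebuffer suggested in the comments above can be set up roughly like this. This is only a sketch (layerFbo, layerTex, width, and height are placeholder names); the key point is that layerTex then serves both as the layer's render target and as an ordinary texture to sample when drawing the quad, so no pixel copy is needed.

```cpp
// Sketch: back a layer framebuffer with a texture instead of a renderbuffer.
GLuint layerFbo = 0, layerTex = 0;
glGenFramebuffers(1, &layerFbo);
glGenTextures(1, &layerTex);

// Allocate the texture storage for the layer.
glBindTexture(GL_TEXTURE_2D, layerTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Attach it as the framebuffer's color attachment.
glBindFramebuffer(GL_FRAMEBUFFER, layerFbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, layerTex, 0);

// Render the layer into layerFbo as before; later, bind layerTex and
// draw a textured quad in the mirror view.
```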

1 Answer


So I wasn't able to render one framebuffer into another, but at least for the OpenXR case there is a solution.

I created the layers as usual, but instead of submitting them to OpenXR via xrEndFrame, I used the color texture from the swapchain:

colorTexture = reinterpret_cast<const XrSwapchainImageOpenGLKHR*>(hudLayer.m_swapchainImages[hudLayer.swapchain_][swapchainImageIndex])->image;

and rendered them onto quads within the first framebuffer. (I could use the layers' positions/orientations for the quads as well.)
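The quad placement from the layer pose can be sketched as plain math. The types below are placeholders mirroring XrQuaternionf/XrVector3f, and the function names are hypothetical; the assumption is that, as in OpenXR's convention, -Z is forward, so rotating that axis by the pose's orientation and stepping out by a fixed distance gives a crosshair-style quad center in front of the view.

```cpp
// Placeholder types standing in for XrQuaternionf / XrVector3f.
struct Quat { float x, y, z, w; };
struct Vec3 { float x, y, z; };

// Rotate v by unit quaternion q: v' = v + 2 * cross(q.xyz, cross(q.xyz, v) + q.w * v).
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 u{ q.x, q.y, q.z };
    Vec3 c1{ u.y * v.z - u.z * v.y + q.w * v.x,
             u.z * v.x - u.x * v.z + q.w * v.y,
             u.x * v.y - u.y * v.x + q.w * v.z };
    return { v.x + 2.f * (u.y * c1.z - u.z * c1.y),
             v.y + 2.f * (u.z * c1.x - u.x * c1.z),
             v.z + 2.f * (u.x * c1.y - u.y * c1.x) };
}

// Quad center = pose position + (orientation * forward) * distance,
// with -Z as the forward axis.
Vec3 quadCenter(const Quat& orientation, const Vec3& position, float distance) {
    Vec3 fwd = rotate(orientation, Vec3{ 0.f, 0.f, -1.f });
    return { position.x + fwd.x * distance,
             position.y + fwd.y * distance,
             position.z + fwd.z * distance };
}
```

With the center (and the same orientation for the quad's corners), the swapchain's color texture can then be bound and the quad drawn into the mirror framebuffer.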
