
[DRAFT] OpenXR: Use the XR_FB_foveation_vulkan extension to generate the density map for VRS #99768

Draft · dsnopek wants to merge 2 commits into master from openxr-vulkan-foveated-rendering

Conversation

@dsnopek (Contributor) commented Nov 27, 2024

This builds on top of PR #99551

But instead of manually building the density map, it uses the XR_FB_foveation_vulkan extension to get the density map from the OpenXR runtime.

So, you can use the same foveation settings that are currently used for OpenGL (setting "Foveation Level" or "Foveation Dynamic"), and the OpenXR runtime should use them to build the density map that we're using.

I'm not crazy about the interdependencies this PR creates between OpenXRFBFoveationExtension, OpenXRVulkanExtension and OpenXRSwapChainInfo, but I'm not sure there's a way around it. I'll spend some more time thinking about whether there's a more decoupled way to do this.

This is marked as a DRAFT both because it depends on PR #99551, which isn't merged, and because it's currently just hacked into the renderer. We need some way to use the manually built density map when the XR_FB_foveation_vulkan extension isn't available.

@BastiaanOlij (Contributor)

I'm really on the fence about this one. I was just talking to @DarioSamo about density maps in general and my ideas for improving what we have right now, and if we do this right, our built-in solution will be as good as, if not better than, what's on offer here. And it works on any platform (well, currently any Vulkan-based platform).

So far, the feedback I've gotten from people in the know is that most game engines have already solved this the same way we have, with their own logic for generating density maps, because they ran into the same problems we did with what OpenXR offers. Most vendors therefore don't see a need to support something in the runtime, especially PCVR vendors who are dealing with a wide hardware spectrum. That's not to say we won't see a vendor-neutral extension for this at some point in the future, but that's a big unknown.

So making structural changes to our rendering engine to support an edge case currently only supported by Meta, and potentially never adopted by others, while we already have a viable solution that we can improve upon, is... uhm... meh.

The other potential problem here is that the density map has to be adjusted to the resolution at which we're rendering, and since we allow that to be overridden, we might not get a density map at the right size.
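To illustrate the sizing concern: a density/VRS image typically has one texel per tile of the render target, so its required size depends on the final render resolution. The following is a toy sketch (the function name and the 32×32 tile size are illustrative assumptions, not Godot or Vulkan API), showing how a map built for the runtime's recommended resolution no longer fits once the app overrides the render scale:

```python
from math import ceil

def required_density_map_size(render_w, render_h, tile_w, tile_h):
    """One density texel per tile of the render target, rounded up."""
    return ceil(render_w / tile_w), ceil(render_h / tile_h)

# The runtime built the map for its recommended swapchain size...
runtime_map = required_density_map_size(2064, 2208, 32, 32)   # (65, 69)
# ...but the app overrode the render scale to 1.5x:
needed = required_density_map_size(3096, 3312, 32, 32)        # (97, 104)
print(runtime_map, needed, runtime_map == needed)
```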

@dsnopek (Contributor, Author) commented Nov 29, 2024

I think there's a few advantages to using the density map from OpenXR:

  • The "dynamic" feature of foveated rendering works! I tested it by doing two RenderDoc captures: one where I was looking at a part of the scene that tanked the FPS (down to around 36 FPS), and one where I was looking at a single-color quad (where FPS went back up to 72). I could see that the density map changed: it was really scaling resolution down in the first capture, but was a pure (1.0, 1.0, 0.0) texture in the second (so not scaling resolution down at all)
  • The density map can be geared to the hardware. When looking at RenderDoc with this PR, Dario said that the binning stage was much shorter than with the custom density map from his PR. He thought the GPU had an easier time with the density map from OpenXR because it was "more binary" (i.e. less gradient, more large patches of a single value). But this could be hardware-specific - maybe different hardware would work better with differently generated maps?
  • Because "dynamic" works, I suspect that XR_META_foveation_eye_tracked would work as well, allowing the OpenXR runtime to update the density map based on eye position, with the right "filters" and latency for foveated rendering. Our manual density map generation can use XR_EXT_eye_gaze_interaction, but that really isn't the right extension for foveated rendering: its "filters" and latency are designed for interacting with UIs, which means it prioritizes things like stability (i.e. not bouncing around too quickly) rather than the things that matter for foveated rendering (like low latency)
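The "more binary" point above can be illustrated with a toy sketch (this is not Godot code, just an assumption-laden demonstration): a smooth radial density falloff has many distinct values, while a quantized version snaps every texel to a few allowed steps, producing large uniform patches that a tile-based GPU may bin more cheaply:

```python
def radial_density(w, h, cx, cy, inner_r, outer_r):
    """Density 1.0 inside inner_r, falling off linearly to 0.25 at outer_r."""
    grid = []
    for y in range(h):
        row = []
        for x in range(w):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            t = min(max((d - inner_r) / (outer_r - inner_r), 0.0), 1.0)
            row.append(1.0 - 0.75 * t)
        grid.append(row)
    return grid

def quantize(grid, steps=(1.0, 0.5, 0.25)):
    """Snap every texel to the nearest allowed step -> large uniform patches."""
    return [[min(steps, key=lambda s: abs(s - v)) for v in row] for row in grid]

smooth = radial_density(64, 64, 32, 32, 8, 30)
binary = quantize(smooth)
print(len({v for row in smooth for v in row}),  # many distinct values
      len({v for row in binary for v in row}))  # at most 3
```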

So, I'd really like to be able to have the option to let the OpenXR runtime generate the density map!

I'm not crazy about my implementation so far, because it couples a whole bunch of things that I don't want coupled, but that's hopefully something I can work out before taking this out of draft.

@dsnopek (Contributor, Author) commented Nov 29, 2024

A point I forgot to respond to in my last comment:

So making structural changes to our rendering engine to support an edge case currently only supported by Meta, and potentially never adopted by others, while we already have a viable solution that we can improve upon, is... uhm... meh.

We shouldn't have to make any structural changes to the renderer to do this! All the rendering changes in my commit on this PR are just temporary hacks in order to test this - it's only a draft :-)

In the end, I expect all changes for this to be in the "openxr" module. The renderer is already calling XRInterface::get_vrs_texture() to get the texture - it's just a question of where we get that texture from. If this extension is available and the developer enabled foveated rendering in the project settings, we can return the one from the extension; if not, we generate one manually.
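The selection logic described above could be sketched like this (a hypothetical outline only; the function parameters and return values are illustrative, not the actual Godot/OpenXR module API). The renderer keeps calling get_vrs_texture(); only the source of the texture changes:

```python
def get_vrs_texture(fb_foveation_vulkan_available, foveation_enabled,
                    runtime_density_map, generate_manual_density_map):
    if fb_foveation_vulkan_available and foveation_enabled:
        # Density map supplied by the OpenXR runtime via XR_FB_foveation_vulkan.
        return runtime_density_map
    # Fall back to the manually generated density map.
    return generate_manual_density_map()

tex = get_vrs_texture(True, True, "runtime_map", lambda: "manual_map")
print(tex)  # runtime_map
tex = get_vrs_texture(False, True, "runtime_map", lambda: "manual_map")
print(tex)  # manual_map
```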

@dsnopek dsnopek marked this pull request as draft November 29, 2024 13:22
@BastiaanOlij (Contributor)

@dsnopek the problem is, the solution Dario and I were discussing tonight changes things quite a lot. What we're thinking of is to leave providing a texture for non-XR usage only, and instead provide eye center and radius data if XR_VRS is specified, generating the density map in the shader that is currently in the rendering engine (and eventually moving that into the rendering driver so that we're independent of hardware/GPU architecture). This removes a bunch of overhead and solves a bunch of problems with an externally supplied density map not matching our other rendering settings. It's also not a big step to make the density map adjust based on frame timing.
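A rough sketch of that idea (purely illustrative, with assumed names and a simple linear falloff; the real version would live in a shader): given an eye/foveation center and radius per view, the density for each texel is computed on the fly instead of being read from an externally supplied texture:

```python
def density_at(u, v, center_u, center_v, inner_radius, outer_radius,
               min_density=0.25):
    """Full density inside inner_radius, linear falloff to min_density."""
    d = ((u - center_u) ** 2 + (v - center_v) ** 2) ** 0.5
    if d <= inner_radius:
        return 1.0
    if d >= outer_radius:
        return min_density
    t = (d - inner_radius) / (outer_radius - inner_radius)
    return 1.0 + t * (min_density - 1.0)

print(density_at(0.5, 0.5, 0.5, 0.5, 0.2, 0.5))  # 1.0 at the gaze center
print(density_at(1.0, 0.5, 0.5, 0.5, 0.2, 0.5))  # 0.25 at the edge
```

Because the density is a function of (center, radius) rather than a baked texture, it trivially adapts to whatever resolution the engine is actually rendering at.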

I do hear what you're saying, but many of those issues exist because we need to improve our implementation, and so far it just hasn't gotten the attention it deserves. Solving that properly means we have a good solution for all platforms, and seeing as this extension does not seem to be getting wide adoption, we're going to have to do that anyway.
There is a difficulty in that, right now, those with the know-how aren't sharing, so it's going to take more experimentation to get it right. But it's worth doing.

My only concern is that right now there is no proper extension in OpenXR that gives us usable eye tracking data, so supporting eye-tracked foveated rendering alongside fixed foveated rendering is a problem. We're using the eye gaze extension, which is not really suitable for this.
That is definitely a place where using this extension is a plus.

DarioSamo and others added 2 commits December 5, 2024 12:00
@dsnopek dsnopek force-pushed the openxr-vulkan-foveated-rendering branch from c646106 to 177d119 Compare December 5, 2024 15:30