Proposals for custom shaders/rendering #903

Open
mvilim opened this issue Jun 2, 2021 · 6 comments
Labels: 3d, feature, rendering

Comments

@mvilim

mvilim commented Jun 2, 2021

Hi! Thanks for the wonderful UI library.

I would like to add support for custom rendering of a widget (i.e. not just a custom widget made of existing primitives, but a widget that uses a custom shader to draw itself on the screen). I have experimented with a couple ways to achieve this, but would appreciate some feedback on the potential approaches before I commit to fully implementing any of them, as you may have an opinion on which is the best fit for this project.

The only discussion of custom rendering that I have been able to find is here: #32 (comment) but perhaps I've missed some discussion elsewhere.

I plan to do this custom rendering using wgpu (and therefore iced_wgpu), but as you will see, some of the changes would reach into the lower level crates.

The general approaches I have been able to come up with, in order of most complete/invasive to least are:

1. Genericize iced_graphics::Primitive

Genericize iced_graphics::Primitive with a type parameter (used for a new Custom primitive variant) which can be specified by the downstream application when it creates a custom iced_graphics::Renderer. When no custom rendering is required, this type would be set to the unit type. This would have a very far-reaching impact on iced, as the generic parameter would cascade into pretty much everything in iced_graphics and downstream (as well as probably breaking, in a relatively shallow way, existing custom widgets). It's possible some of the breakage could be minimized by extracting a lower-level type, e.g. GenericPrimitive<CustomType>, and leaving Primitive as is. The internal APIs would then be written in terms of GenericPrimitive<CustomType>, while the external API would use the non-generic Primitive (which would map to GenericPrimitive<()>).

The wgpu::Backend would then have a generic CustomPipeline (which would have a draw function taking a command encoder, output target, staging belt, etc.) which it would invoke for the Custom primitives. Alternatively, a completely separate backend could be implemented that delegates to wgpu::Backend (though that would likely require making more of wgpu::Backend public to allow the custom primitives to be drawn in the correct layers).
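A minimal sketch of what this could look like (all names below are hypothetical; nothing here exists in iced today, only the wgpu types are real):

```rust
// Sketch only: a generic primitive with a `Custom` variant, plus the hook the
// wgpu backend would call for it.
#[derive(Debug, Clone)]
pub enum GenericPrimitive<Custom> {
    // ...the existing variants (Quad, Text, Clip, Group, ...) stay as they are...
    Group {
        primitives: Vec<GenericPrimitive<Custom>>,
    },
    // New variant carrying the application-defined primitive.
    Custom(Custom),
}

// The existing, non-generic API keeps working in terms of the unit type.
pub type Primitive = GenericPrimitive<()>;

// Hypothetical hook the wgpu backend would invoke for `Custom` primitives.
pub trait CustomPipeline<Custom> {
    fn draw(
        &mut self,
        device: &wgpu::Device,
        staging_belt: &mut wgpu::util::StagingBelt,
        encoder: &mut wgpu::CommandEncoder,
        target: &wgpu::TextureView,
        primitive: &Custom,
    );
}
```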

2. Genericize via Renderer::Output

Leave Primitive as it is and do all the genericization via Renderer::Output. Implementing a custom renderer requires implementing the renderer traits for all the different widgets, which could be eased by genericizing most of the default iced_graphics widget renderers to output Renderer::Output instead of (Primitive, mouse::Interaction), via an Into conversion which can be a no-op in the case of the default renderer and must be implemented for a custom renderer.

This gets tricky (or impossible) to genericize for some widget types (e.g. container types like Row, or anything whose widget logic interacts with Renderer::Output concretely), but again some Into magic may let the custom renderer reuse as much of the existing iced_graphics renderers as possible.
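To make the Into idea concrete, a rough sketch (the CustomOutput type is made up and would live in the downstream renderer, not in iced):

```rust
// Sketch only: a downstream renderer's output type that can be produced from
// the default `(Primitive, mouse::Interaction)` tuple, so the existing
// iced_graphics widget renderers can be reused unchanged.
use iced_graphics::Primitive;
use iced_native::mouse;

pub struct CustomOutput {
    pub primitive: Primitive,
    pub interaction: mouse::Interaction,
    // ...plus whatever extra draw data the custom backend needs...
}

impl From<(Primitive, mouse::Interaction)> for CustomOutput {
    fn from((primitive, interaction): (Primitive, mouse::Interaction)) -> Self {
        CustomOutput { primitive, interaction }
    }
}
```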

3. Introduce a non-generic, non-type-safe custom primitive handle

Drop the type safety for the custom renderer by introducing a non-generic variant Primitive::Custom which would hold a struct CustomPrimitiveHandle(u64) and an inner Primitive. The meaning of this handle would be up to the custom widget and its custom renderer. These primitives would be passed to a CustomPipeline (as described in 1) which could then render them as it sees fit. This approach avoids the generics from the custom renderer leaking into the rest of the code, but has the disadvantage that the custom renderer must express all of its primitive behavior via this integer handle (likely by looking up the primitive via its handle somewhere in the custom renderer).
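Roughly (again hypothetical, not existing iced code):

```rust
// Sketch only: a non-generic handle the custom pipeline resolves to real GPU
// resources on its own side (e.g. by looking it up in a HashMap it owns).
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct CustomPrimitiveHandle(pub u64);

#[derive(Debug, Clone)]
pub enum Primitive {
    // ...the existing variants...
    Custom {
        handle: CustomPrimitiveHandle,
        // Inner primitive drawn by the default backend (e.g. a placeholder).
        content: Box<Primitive>,
    },
}
```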

4. Add an introspection API to assist in locating external rendering passes

Don't try to integrate the custom rendering directly into iced's rendering calls, but instead provide an "introspection" API that allows an external application to look up the position of a given widget or its primitives. The application can then use this information to know exactly where to render its custom graphics, given the rest of the UI layout. Of course, this doesn't play well with heavily layered user interfaces (as we can't interleave the custom draw calls with a single iced draw call), but one could still draw some widgets below and some widgets above the custom widgets by issuing multiple iced draws for different sets of widgets.
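For illustration, the shape of such an API might be something like this (the trait and method are made up; nothing like this exists in iced's public API today):

```rust
// Sketch only: a hypothetical introspection method reporting where a widget
// ended up after layout.
pub trait Introspect {
    /// Returns the laid-out bounds of the widget identified by `id`, if any.
    fn widget_bounds(&self, id: u64) -> Option<iced_native::Rectangle>;
}

// Per frame, the application would then:
//   1. draw the widgets below the custom content with one iced draw,
//   2. issue its own wgpu draw calls viewported/scissored to the returned bounds,
//   3. draw the widgets above with a second iced draw.
```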

Solutions 1-3 would also prompt the question of whether the custom renderer should be included in iced::Application and iced::Sandbox (which would complicate their interfaces). It might be better to avoid exposing the custom renderer there and instead require that anyone who wants to use a custom renderer must also implement a custom application (like the integration example).

Are any of these solutions something that you think is worth considering for iced? If so, I can draft a PR to make the proposal more concrete than this wall of text.

@13r0ck
Member

13r0ck commented Jun 2, 2021

Just out of curiosity, what would this allow a developer to do? Or maybe a better way to ask: if you were to create an example for this, what would that example show? Why would someone want this?

It sounds cool, I think I am just missing the point

@mvilim
Author

mvilim commented Jun 3, 2021

It would allow a developer to directly draw GPU rendered content for things that would be inefficient or awkward to express using iced primitives. For complex content, this can be a big performance improvement. In my case, I have custom charting shaders that I would like to render as part of a UI. In the comment by hecrj linked above, he mentions custom 3d graphics as a potential use case.

The example would likely just have a custom shader that draws some simple pattern or figure as the contents of a widget. It would be similar to the integration example, though in that case the shaders for the scene are not part of the UI directly, but rather something that is drawn before the UI without any knowledge of the UI that will be drawn atop it.

@hecrj added the feature label Jun 7, 2021
@akhilman
Contributor

I prefer a special texture widget that can display a GPU buffer, so that we can draw whatever we want to an offscreen buffer and then display it as a simple image.

@mvilim
Author

mvilim commented Sep 16, 2021

@akhilman

I think that is exactly the question this issue is trying to answer: how do you expose a custom image to the library user in such a way that they can efficiently draw on it? There are a couple of ways "custom" widgets are currently exposed in iced, including the Canvas widget for drawing geometric shapes and the Image widget for uploading specific images. The issue with these is that they do not directly expose the GPU: they mediate the drawing through the CPU, via the construction of geometric objects in Canvas and the copying of an image from system memory in Image.

Abstracting over GPU operations is very tricky, as efficiently exposing the details of the GPU operations without depending on the specifics of the driver, hardware, or library is difficult. This issue proposes several ways of wiring through the backend-specific instructions for custom rendering without CPU overhead. It is almost certain that this functionality must be exposed in a backend-specific way (otherwise the feature is equivalent to abstracting over all backends, which iced already does not do; see, for example, the backend-specific code in wgpu/src/image.rs).

In such a scenario the application is likely already wgpu-specific (or specific to some other backend), which simplifies the problem, since we can rely on it to call iced (as in the wgpu_integration example) rather than requiring iced to expose the wgpu types in its API, which would break iced's encapsulation. In #1003 I use this, combined with the behavior of the layout function, to draw a custom widget directly on the frame buffer (without an intermediate texture).

I think the fundamental problem is how to associate each custom texture/buffer with a particular widget without leaking the texture/buffer type (which is backend-specific) into the non-backend portions of the API. If you see a better way to do this than the approaches I suggest above, please let me know. It's possible I've overlooked an approach.

It's possible that we could simply take the approach in #1003 and formalize it a bit, providing at a minimum some helper functions or hooks in the wgpu renderer that allow the implementer of a custom-rendered widget to easily access an appropriately sized texture (along with the necessary state, such as a wgpu::Device reference) which will be rendered into the correct place. There still remains the choice of where to put this connection and how to interleave the calls (perhaps a closure that performs the rendering?).
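For example, such a hook could hand the widget implementer something along these lines (the names are made up; only the wgpu and iced_native types are real):

```rust
// Sketch only: state the wgpu backend could pass to a custom-rendered widget
// so it can draw into an appropriately sized target.
pub struct CustomDrawContext<'a> {
    pub device: &'a wgpu::Device,
    pub encoder: &'a mut wgpu::CommandEncoder,
    // View of a texture (or frame buffer region) sized to the widget's
    // layout bounds.
    pub target: &'a wgpu::TextureView,
    pub bounds: iced_native::Rectangle<u32>,
}

// The interleaving question then becomes where this closure gets invoked
// relative to the backend's own layers.
pub type CustomDraw = Box<dyn FnMut(CustomDrawContext<'_>)>;
```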

@akhilman
Contributor

akhilman commented Oct 16, 2021

Iced may not need to know anything about additional rendering passes.
We could have a backend-specific function that produces some struct with an encapsulated pointer to the GPU buffer.
All iced would have to provide is the interface to show this struct as an image.

After all, the rendering loop may look like:

  • Render the 3D objects to an offscreen buffer
  • Draw the GUI with that buffer as an Image

Update: It could even be an unsafe interface.
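A minimal sketch of that loop, assuming a hypothetical GpuImage widget that can display an already-rendered wgpu texture (no such widget exists in iced today):

```rust
// Sketch only: step 1 of the loop above. The application records its own 3D
// pass into an offscreen texture before the GUI is drawn.
fn render_scene_offscreen(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    offscreen_view: &wgpu::TextureView,
) {
    let mut encoder = device
        .create_command_encoder(&wgpu::CommandEncoderDescriptor { label: Some("3d scene") });

    // Record the application's own render pass(es) targeting `offscreen_view`
    // here (begin_render_pass, set pipeline and bind groups, draw, ...).
    let _ = offscreen_view;

    queue.submit(Some(encoder.finish()));
}

// Step 2: the GUI is then built with something like
// `GpuImage::new(offscreen_texture_handle)` (hypothetical), and the backend
// samples that texture wherever the widget was laid out.
```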

@PolyMeilex
Contributor

PolyMeilex commented Jul 2, 2022

Just here to express my interest in this. I have a lot of custom rendering in my app because I need instanced rendering and some animations. My solution to this is to simply drive the whole app myself and use iced as a side thing that renders the UI on top of my wgpu rendering. I would prefer iced to drive my app, handling pages/scenes, layouts and all of that, so my custom rendering can be easily encapsulated in widgets.

I don't want to reinvent the wheel and write my own scene handling, layouts, async task handling, event handling, etc., if iced can already do all of that with a nice architecture.
