
Editor-Ready UI #254

Open
cart opened this issue Aug 20, 2020 · 133 comments
Labels
A-UI: Graphical user interfaces, styles, layouts, and widgets
C-Feature: A new feature, making something new possible
C-Tracking-Issue: An issue that collects information about a broad development initiative

Comments

@cart
Member

cart commented Aug 20, 2020

This is a Focus Area tracking issue

Before we can start work on the Bevy Editor, we need a solid UI implementation. Bevy UI already has nice "flexbox" layout, and we already have a first stab at buttons and interaction events. But Bevy UI still needs a lot more experimentation if we're going to find the "right" patterns and paradigms. Editor-Ready UI has the following requirements:

  • Embraces the Bevy architecture: Bevy ECS, Bevy Scenes, Bevy Assets, Bevy Events
  • A Canvas-style API for drawing widgets with shapes and anti-aliased curves
  • Define a consistent way to implement widgets
  • A core set of widgets: buttons, inputs, resizable panels, etc
  • Theme-ability
  • "Interaction" and "focus" events
  • Translation-friendly. We can't be anglo-centric here
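
To make the first and last few bullets concrete, here is a minimal pure-Rust sketch (hypothetical names, not the real bevy_ui API) of what "widgets as ECS data plus interaction events" could look like: a button is just a component, and a system turns clicks into app-level events.

```rust
// Hypothetical sketch, not the real bevy_ui API: widgets as plain data
// ("components") processed by a system, with clicks surfaced as events.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Interaction {
    None,
    Hovered,
    Clicked,
}

struct Button {
    label: String,
    interaction: Interaction,
}

// An ECS-style "system": iterate over all button components, turn clicks
// into app-level events, and reset the interaction state.
fn button_system(buttons: &mut [Button], events: &mut Vec<String>) {
    for button in buttons.iter_mut() {
        if button.interaction == Interaction::Clicked {
            events.push(format!("pressed: {}", button.label));
            button.interaction = Interaction::None;
        }
    }
}

fn main() {
    let mut buttons = vec![
        Button { label: "Play".into(), interaction: Interaction::Clicked },
        Button { label: "Quit".into(), interaction: Interaction::Hovered },
    ];
    let mut events = Vec::new();
    button_system(&mut buttons, &mut events);
    println!("{:?}", events); // only the clicked button emits an event
}
```

The point of the sketch is only the shape: widget state lives in components, and behavior lives in ordinary systems, which is what "embraces the Bevy architecture" implies.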

Active Crates / Repos

No active crates or repos. Feel free to make one. Link to it in this issue and I'll add it here!

Sub Issues

No active issues discussing subtopics for this focus area. If you would like to discuss a particular topic, look for a pre-existing issue in this repo. If you can't find one, feel free to make one! Link to it in this issue and I'll add it to the index.

Documents

@Kleptine

Kleptine commented Aug 20, 2020

I wanted to share a few quick thoughts on UI systems in games. I'm a gamedev professionally, and reading this list of requirements gave me a bit of a knee-jerk reaction. These are good features from a basic point of view, but they sort of miss the boat on what I'd consider the harder questions around UI:

  • Structure: Immediate Mode vs. Retained Mode
  • Performance: How fast is a Noop render (where nothing in the UI changes)?
  • Styling: CSS-like, or in-code?
  • Flexibility: How easy is it to build any UI, vs. just 'widgets'
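
The structural difference in the first bullet can be sketched as two API shapes (pure-Rust illustration, not any existing crate's API):

```rust
// Immediate mode: the UI is re-declared from scratch every frame, so even a
// frame where nothing changed does a full rebuild.
fn immediate_frame(score: u32, ui: &mut Vec<String>) {
    ui.clear();
    ui.push(format!("Score: {}", score));
}

// Retained mode: the UI is a persistent structure and only mutated nodes are
// marked dirty, which is what makes a no-op frame cheap.
struct RetainedLabel {
    text: String,
    dirty: bool,
}

impl RetainedLabel {
    fn set_text(&mut self, text: &str) {
        if self.text != text {
            self.text = text.to_string();
            self.dirty = true; // re-render only when something actually changed
        }
    }
}

fn main() {
    let mut ui = Vec::new();
    immediate_frame(7, &mut ui);

    let mut label = RetainedLabel { text: "Score: 7".into(), dirty: false };
    label.set_text("Score: 7"); // no-op: stays clean
    label.set_text("Score: 8"); // real change: marked dirty
    println!("{} dirty={}", label.text, label.dirty);
}
```

The second bullet (cost of a no-op render) falls straight out of this: the immediate-mode version always rebuilds, while the retained-mode version can skip untouched nodes entirely.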

I'm happy to elaborate on any of these, but primarily I'd recommend trying to learn from where Unity is at the moment. They started with an immediate-mode UI engine for both the Editor and the Runtime. Later, because of performance and flexibility, they adopted a retained-mode UI for the Runtime. Now, for the past two years, they've been building a (really fantastic) unified, open-source UI system for both the Editor and Runtime.

This is a great talk on the new design (UIElements/UIToolkit): https://www.youtube.com/watch?v=zeCdVmfGUN0

I'd highly recommend taking as much learning as you can from their path. Frankly, both previous systems were a total mess. They seemed great in the moment, but just collapsed under any sort of complexity.

What makes UI challenging is that it's very hard to build incrementally, you have to plan a lot of these things in from the start. I don't think Bevy UI should be nearly as complicated as the latest Unity UI, but it's really worth learning, and not repeating their mistakes.

@Kleptine

And also, if the intention for Bevy is to start off as a hobbyist engine, incremental is perfectly fine. I would just expect to have to rewrite the UI a few times as you want to attract larger projects.

@thefuntastic

thefuntastic commented Aug 21, 2020

Some early distillation of thoughts is happening here:
https://hackmd.io/lclwOxi3TmCTi8n5WRJWvw?view

At the moment anyone is free to add their particular domain experience. This is a stopgap; the suggestion is that Cart takes ownership of the hackmd, or that it's used as input for a GitHub wiki/markdown page.

@tektrip-biggles

Would it be worth also looking at SwiftUI as an example of prior art?

@erlend-sh

Would it be worth also looking at SwiftUI as an example of prior art?

Add your write-up of it to the doc if you’d like 👍

@stefee
Contributor

stefee commented Aug 22, 2020

Can we add "screen-reader friendly" and "multiple input types" (keyboard, mouse) as hard requirements for any UI framework / UI primitives?

@cart
Member Author

cart commented Aug 22, 2020

At the risk of being very ignorant, is screen-reader friendliness a high priority for a visual editor? I have a feeling visually impaired folks will have difficulty getting use out of the editor. Making a visual scene editor friendly to visually impaired users sounds like an entire research project. Mouse + keyboard navigation is a clear win for accessibility so I think I'd want to focus efforts there, given our limited resources.

I'm certainly not saying we shouldn't make ourselves screen reader friendly, as that will absolutely be useful in other apps, I'm just questioning if it should be an Editor-UI priority. I'm curious what the use cases would be for screen readers + visual editors.

Please tell me if I'm wrong here.

@stefee
Contributor

stefee commented Aug 22, 2020

Yeah no worries, it was a genuine question. I'm not sure how well screen readers are supported in other editors, I am coming from a context of developing on the open web and it is really important in that context. I imagine native software is a lot more complicated.

The extent to which it is important comes down to how visual the editor is. If there are lots of text inputs (e.g. a way to change transform values of an entity numerically rather than just using click + drag) then I think screen reader support would be something to consider. If at least in the short-term we are looking at a more purely graphical interface then it is less important.

@stefee
Contributor

stefee commented Aug 22, 2020

Also for the record, I don't use screen readers so I don't actually know what users would prefer in this case. Maybe folks would rather just edit the RON files directly. Let's not worry about it too much. :-)

@tektrip-biggles

Have been looking at SwiftUI and one of the nice things about the declarative approach is that it does mean your data is well structured for adding things like that in the future.

The general concepts they use are really elegant and I suspect would translate well to Rust, although it may be an "all or nothing" commitment to go that route.

Main principle is defining a tree of very lightweight units composed together to create consistent and predictable results with surprisingly little boilerplate and small number of primitive/leaf components. Data dependencies are all declared up front so the framework knows what to update as & when the underlying data changes and there's an "environment" concept so that data can be passed down the hierarchy without needing to be passed through every hop.

I quite like the bounds calculation process too, where the parent gives its children size "suggestions", they respond with their desired size, and the parent determines the ultimate positioning (but has to respect the child's desired size).

Worth watching the WWDC '19 & '20 talks on it all for ideas.
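
The size-negotiation pass described above (parent proposes, child answers, parent must respect the answer) can be sketched in a few lines of plain Rust. All names here are illustrative, not from SwiftUI or Bevy:

```rust
// Sketch of SwiftUI-style size negotiation: the parent proposes a size,
// each child answers with its desired size, and the parent positions the
// children using the sizes they reported.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Size {
    width: f32,
    height: f32,
}

trait Measurable {
    // The parent's proposal comes in; the child's desired size goes out.
    fn desired_size(&self, proposal: Size) -> Size;
}

struct FixedLabel {
    width: f32,
    height: f32,
}

impl Measurable for FixedLabel {
    fn desired_size(&self, _proposal: Size) -> Size {
        // A rigid child ignores the proposal entirely.
        Size { width: self.width, height: self.height }
    }
}

struct Stretchy;

impl Measurable for Stretchy {
    fn desired_size(&self, proposal: Size) -> Size {
        proposal // a flexible child accepts whatever is offered
    }
}

// The parent proposes an equal share to each child, then places children
// left to right at the sizes they asked for.
fn layout_row(children: &[&dyn Measurable], available: Size) -> Vec<(f32, Size)> {
    let share = Size {
        width: available.width / children.len() as f32,
        height: available.height,
    };
    let mut x = 0.0;
    let mut placed = Vec::new();
    for child in children {
        let size = child.desired_size(share);
        placed.push((x, size));
        x += size.width;
    }
    placed
}

fn main() {
    let fixed = FixedLabel { width: 30.0, height: 10.0 };
    let children: Vec<&dyn Measurable> = vec![&fixed, &Stretchy];
    let placed = layout_row(&children, Size { width: 100.0, height: 20.0 });
    println!("{:?}", placed);
}
```

The key design point is that the protocol is a pure function from proposal to desired size, which keeps layout composable: any container can wrap any child without knowing its concrete type.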

@regexident

@tektrip-biggles: FYI, @raphlinus has done a bunch of great research in this whole topic, as well as in comparison to existing systems like SwiftUI:

https://raphlinus.github.io/rust/druid/2019/10/31/rust-2020.html

@Kleptine

A few thoughts:

Screen Reading: If the goal is for Bevy UI to support in-game UI as well as the editor, that means the UI will need to support gamepad-only and keyboard-only navigation from the beginning. So in a lot of ways, you're already halfway to basic screen-reading support. The bigger question is one of UX design (i.e. does the Editor require the mouse?), rather than implementation.
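
The mouse-free navigation this paragraph calls for can be sketched as a focus ring: widgets live in an explicit tab order, and keyboard or gamepad input just moves an index through the ring. Nothing here is an existing Bevy API; it's a minimal illustration of the idea.

```rust
// Minimal focus-ring sketch: gamepad-only and keyboard-only navigation
// reduce to moving a focus index through an ordered list of widgets.
struct FocusRing {
    widgets: Vec<String>,
    focused: usize,
}

impl FocusRing {
    // Tab / d-pad right: advance and wrap around.
    fn next(&mut self) -> &str {
        self.focused = (self.focused + 1) % self.widgets.len();
        &self.widgets[self.focused]
    }

    // Shift+Tab / d-pad left: step back and wrap around.
    fn prev(&mut self) -> &str {
        self.focused = (self.focused + self.widgets.len() - 1) % self.widgets.len();
        &self.widgets[self.focused]
    }
}

fn main() {
    let mut ring = FocusRing {
        widgets: vec!["Play".into(), "Options".into(), "Quit".into()],
        focused: 0,
    };
    println!("{}", ring.next()); // Options
    println!("{}", ring.prev()); // Play
}
```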

FRP: Modern FRP-based approaches work fantastically for the web. I think they're really strong when you have a flat UI that is structured as a hierarchy. Game UI can be much more complex. In-Game UI often isn't a flat plane, nor a tree. There might be a myriad of text cards, health-bars, holographic displays, etc. Depending on the game, it may be hard to treat this as a single tree.

Additionally, there are entire projects currently figuring out FRP in Rust (Yew, etc.). It's been a massive undertaking, and most require lots of macros, templates, generics, etc. And that's without having to build a renderer. So I worry about complexity and scope here.

What I'd favor is a general, high performance UI rendering engine, integrated with the ECS. An FRP crate could be built on top of it, but wouldn't be explicitly baked into the solution. That would allow UI-heavy and 2D games to use FRP as needed, but not require jamming all other use cases inside of it.

@cart cart pinned this issue Aug 23, 2020
@ShalokShalom

I think Godot can serve as a nice project to look at, while developing the BevyUI

@ncallaway
Contributor

@Kleptine

It's been a massive undertaking, and most require lots of macros, templates, generics, etc. And that's without having to build a renderer. So I worry about complexity, scope here.

What I'd favor is a general, high performance UI rendering engine, integrated with the ECS. An FRP crate could be built on top of it.

I think that's right. Especially the part about complexity and scope. I really really like developing UI in the FRP style and would love if Bevy supported it, but it's a mountain of work that we don't need to take on right now.

I think really nailing a UI rendering engine and ECS would give us a good foundation to build different higher-level UI experiments on. I could see a React-like UI system someday that treats the ECS UI the way React treats the DOM.

@stefee
Contributor

stefee commented Aug 24, 2020

I 100% agree with this @ncallaway and I've had similar thoughts over the last few days. The high-level API is something that can - perhaps should - be developed only once the underlying system has been established and is found to be performant and fit for purpose.

If it means writing very verbose or repetitive code for the time-being, that is a worthwhile trade-off IMO.

@stefee
Contributor

stefee commented Aug 24, 2020

Also agree with @Kleptine that the high-level stuff can be user-land crates for the time being.

@stefee
Contributor

stefee commented Aug 24, 2020

One point, @Kleptine:

  • Styling: CSS-like, or in-code?

Why not start in-code and then we can add some optional CSS-like solution later (I'm thinking something like how CSS-in-JS works, i.e. it mostly compiles to target language at build time so everything is static at runtime).
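
A rough sketch of what "start in-code" could look like, with styles as plain constants and "mixins" as ordinary functions (all field and function names here are made up for illustration, not Bevy UI's actual style type):

```rust
// In-code styling sketch: a style is plain data that can live in a constant,
// and a "mixin" is just a function from Style to Style, so styles compose
// with ordinary function calls instead of a selector language.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Style {
    padding: f32,
    font_size: f32,
}

const BASE: Style = Style { padding: 4.0, font_size: 14.0 };

fn large_text(style: Style) -> Style {
    Style { font_size: 24.0, ..style }
}

fn padded(style: Style) -> Style {
    Style { padding: 12.0, ..style }
}

fn main() {
    // Compose mixins left to right; everything is resolved at compile time.
    let heading = padded(large_text(BASE));
    println!("{:?}", heading);
}
```

A later CSS-like layer could then be a build step that emits exactly this kind of static data, which is the CSS-in-JS analogy above.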

@Kleptine

Kleptine commented Aug 24, 2020

I could see a React-like UI system someday that treats the ECS UI the way React treats the DOM.

Precisely my thoughts as well. The ECS would be a great way to store this information in a flat structure. You should take a look at the way Unity stores their UI components in a raw buffer (in the UIElements design talk). It's fairly similar.

Why not start in-code and then we can add some optional CSS-like solution later.

I think that could work out. Unity used in-code styling for their IMGUI implementation. One of the challenges was just that it was cumbersome to feed all of the different styles through all parts of your program. It might be better, though, if styles could somehow be stored as constants and exported statically.

So I think some more succinct form of styling would be nice. A CSS-like alternative could be added as a crate, although it might make the API a little more challenging to design. But I agree it's a fairly big task, probably too big for now. Personally, I would be averse to anything other than a CSS subset. There's enough UI styling languages out there that there's no need to reinvent the wheel.

Edit: Another downside of code-only styling is that you need to recompile to see changes. Bevy is already fast to recompile, but you might still have to spend a few minutes replaying the game to get back to the UI you were looking at. It'd be ideal if styling were an asset that could be hot-reloaded in-place, just like any other asset.

@stefee
Contributor

stefee commented Aug 24, 2020

It'd be ideal if styling were an asset that could be hot-reloaded in-place, just like any other asset.

Anyone please correct me if I'm wrong, but I think styling as it currently works in Bevy UI can be saved/loaded from scene assets at runtime.

It might be better, though, if styles can be stored as constants, somehow and exported statically.

I was assuming that this would be the case, or at least that UI styling would be built up out of composable functions (i.e. "mixins"). I think this would already be quite easy to do with Bevy UI, but I haven't actually tried it.

@stefee
Contributor

stefee commented Aug 24, 2020

I wrote in Discord about the possibility of adopting the Every Layout (https://every-layout.dev/layouts/) primitives as our "building blocks" for layout. I think we could potentially just copy the CSS from the Every Layout components into Bevy UI as some sort of "style mixins".

I'm a big fan of their composable approach to flexbox layout primitives, and since Bevy UI currently uses the flexbox model anyway this would be a good fit: https://every-layout.dev/rudiments/composition/

P.S. you have to pay for access to the book, but the CSS itself is not patented, so we can use it.

@ndarilek
Contributor

As to screen reader support in the editor/UI, I can speak a bit to that.

I'm a blind game developer working on adding accessibility to Godot. It may not be as relevant here as it is in Godot since I gather that more code-based workflows are first-class here in a way they aren't with Godot, but a few use cases I have in mind:

  • Working alongside sighted developers who would prefer a visual editor. Right now we've got a handful of sighted developers building audio-only experiences for us, but I think the feedback loop could be drastically shortened if they could work alongside blind developers, rather than filtering feedback through slow playtest cycles. Right now hiring me as a Unity developer is probably impossible since Unity itself is inaccessible, but my work makes Godot accessible enough that I'm able to build a game with a hybrid editor/CLI workflow. It'd be less hybrid if I'd started my accessibility work earlier.
  • Some tasks are hard to get right in text. I'm specifically thinking of tilemap editors, and have some PoC code that replaces Godot's tilemap editor with an alternate accessible interface. Screen reader access to the editor would let me do something similar with Bevy.
  • Accessible modding tools. Godot's system for DLCs/mods offers using the editor as one option. If I build a moddable game, being able to ship a cross-platform accessible editor would be nice.

I'm running up against some Godot limits that make accessibility challenging to implement, so tentatively pencil me in as willing to help with Bevy UI accessibility. My big condition for helping out is that it be baked directly into the UI (i.e. separate components are fine, but I'd want it shipped as a UI requirement that someone might choose to disable for whatever reason, rather than as a third-party crate). I'd also like for it to be as integrated with the repo as the UI crate is, such that CI failures breaking accessibility are blockers. IOW, I'm fine with people not launching the screen reader system if they'd rather not, but I'd want UI consumers to automatically have it and be assured that it works with the most recent UI. Hope that's acceptable.

In terms of making my job easier, here are two bits of advice:

  1. Make keyboard navigation first-class. Godot has this problem right now. The tree widget is absolutely broken for keyboard navigation, and fixing it isn't a priority. I'd do it myself, but since I can't see what I'm doing, it's like coding with a strip of cloth between me and the keyboard. Not saying it needs to be exhaustive and platform-specific, but keyboard/gamepad support should be consistent, and fixing breakage should be prioritized.
  2. Send events for just about everything. Focus enters/leaves a widget. Text is added to, or removed from, a TextEdit. Focus/selection moves around an editor. I need to intercept just about everything and provide a speech representation. Godot has some gaps here too, and there's been a bit of resistance to adding signals I need, meaning accessibility becomes more and more hacked-on.
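
The second point ("events for everything") can be sketched as a typed event stream that a screen-reader layer consumes independently of the widgets. The enum and function names below are made up for illustration, not an existing Bevy API:

```rust
// Sketch of "send events for just about everything": every focus and text
// change becomes a typed event that an accessibility layer can translate
// into speech, without the widgets knowing a screen reader exists.
#[derive(Debug, Clone, PartialEq)]
enum UiEvent {
    FocusEntered { widget: String },
    FocusLeft { widget: String },
    TextChanged { widget: String, new_text: String },
}

// A speech backend only needs to consume the stream and render each event
// as a spoken phrase.
fn speak(event: &UiEvent) -> String {
    match event {
        UiEvent::FocusEntered { widget } => format!("{}, focused", widget),
        UiEvent::FocusLeft { widget } => format!("left {}", widget),
        UiEvent::TextChanged { widget, new_text } => format!("{}: {}", widget, new_text),
    }
}

fn main() {
    let events = vec![
        UiEvent::FocusEntered { widget: "Play button".into() },
        UiEvent::TextChanged { widget: "Name field".into(), new_text: "Alice".into() },
    ];
    for event in &events {
        println!("{}", speak(event));
    }
}
```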

Anyhow, hope that helps. Sorry for going long. I'm about to launch an accessible Godot game, but am happy to try Bevy for my next title and work on accessibility. One good aspect of audio-only games is that they don't exactly require lots of resources--a blank screen and good audio are usually enough. :)

@ncallaway
Contributor

ncallaway commented Aug 24, 2020

I'd vote for accessibility / screen-reader friendly support as part of Editor Ready UI too. I think @ndarilek lays out great reasons why accessibility is important in the Editor itself.

The other reasons why I'd vote to tackle it as early as possible (even if it does expand scope somewhat) are:

  • It's much easier to build-in early when you don't need it than bolt-on late when you do (similar to i18n).
  • If accessibility support is baked into the core of the UI, then it's much more likely that games using Bevy will be more accessible by default.

I don't necessarily think we'll be able to get the first version of the editor to be in a place where it integrates with popular screen-readers on all platforms, but I would really like the core of the UI system to at least have all the pieces in place so that if someone wanted to add screen-reader support the UI system doesn't get in the way.

@ndarilek
Contributor

ndarilek commented Aug 25, 2020 via email

@stefee
Contributor

stefee commented Aug 25, 2020

Thank you so much @ndarilek!! This is really insightful.

Ideally, and especially because we have prior art in the form of Godot, we shouldn’t need to rely on blind/partially sighted contributors to implement this. So sign me up as well.

@stefee
Contributor

stefee commented Aug 31, 2020

Copying @ncallaway's comment from Discord.

@ncallaway:
From the todomvc I've been working on, the missing pieces that'd be useful from the UI system:

@stefee
Contributor

stefee commented Sep 1, 2020

Based on discussion on Discord, I think #195 should be a priority right now. I'm keen to get going on some of the styling stuff but I don't want to have to re-do too many bits for DPI scaling.

@cart
Member Author

cart commented Dec 8, 2021

Yeah I'm pretty hesitant to complicate our builds with more c or c++ deps after we've done so much work to remove things like shaderc, bevy_glsl_to_spirv, and spirv_reflect from our tree (in the upcoming Bevy 0.6). Non rust dependencies have a habit of making some of our users' lives very difficult.

@SamPruden

Perhaps Pathfinder is worth looking at. It's a Rust library that implements a subset of the HTML canvas API on GPU, aiming to be very fast for use in Firefox.

Pathfinder 3 is a fast, practical, GPU-based rasterizer for fonts and vector graphics using OpenGL 3.0+, OpenGL ES 3.0+, WebGL 2, and Metal.
Please note that Pathfinder is under heavy development and is incomplete in various areas.

A lot of hard work seems to have been done on getting very nice high performance GPU vector graphics working, but I don't know how well it would play with ECS or how easy it would be to drop into the editor.

I've found @cart commenting on a Pathfinder github issue so I'm sure this is already known about, but I thought it was worth mentioning in this thread.

@cart, are you currently expecting Bevy to implement its own canvas, or to use a library for this?

@cart
Member Author

cart commented Dec 20, 2021

My take is that Bevy should implement its own canvas API and render things using bevy_render. Bevy needs full control over the render stack so we can properly target different platforms, render arbitrary things to textures, and generally have a "bevy renderer ecosystem where things are compatible with each other". We can learn from projects like Pathfinder (and potentially reuse things that work with wgpu directly, provided they can be plugged into a bevy_render context), but most pre-built rendering solutions require at least partial ownership of the render stack.

@SamPruden

That makes a lot of sense, but it sounds like a pretty significant undertaking if the amount of work and expertise that's gone into Pathfinder is any indication. What's the plan for getting this done - are you going to lead that effort yourself?

@Kleptine

It's definitely significant, but not nearly as much work as something like Pathfinder. They have pretty significantly different use cases, and Bevy can take advantage of simplifying assumptions that a more general rasterizer like Pathfinder can't.

That said, it doesn't make a lot of sense to re-implement something like font rendering1, so there's a line to be drawn as to what should be used as a dependency. If Pathfinder is sufficiently flexible, I think it could make sense to use it as a tool, just not as the basis for the entire stack.

Footnotes

  1. At least if you want correct Unicode rendering in multiple languages.

@heavyrain266

heavyrain266 commented Dec 21, 2021

For font rendering/rasterization, you could use fontdue, and bevy's renderer for everything else, instead of a huge library built for rendering in the browser. 🤔

@Nilirad
Contributor

Nilirad commented Dec 21, 2021

I don't have experience with Pathfinder, but I have some with lyon. lyon_tessellation is very useful for producing geometry. It is just a simple renderer-agnostic library.

The only problem I found using lyon was that the tessellator creates polygons in clockwise order, and I think it's not configurable, so the rendering pipeline must be configured properly to accommodate that.
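
The winding issue above can be checked numerically: the shoelace formula gives a positive signed area for counter-clockwise vertex order (in a y-up coordinate system) and a negative one for clockwise, so a pipeline can detect, or flip, a tessellator's output when the winding isn't configurable.

```rust
// Shoelace formula: signed polygon area. Positive means counter-clockwise
// vertex order (y-up convention); negative means clockwise.
fn signed_area(points: &[(f32, f32)]) -> f32 {
    let mut sum = 0.0;
    for i in 0..points.len() {
        let (x1, y1) = points[i];
        let (x2, y2) = points[(i + 1) % points.len()];
        sum += x1 * y2 - x2 * y1;
    }
    sum / 2.0
}

fn is_clockwise(points: &[(f32, f32)]) -> bool {
    // Assumes a y-up coordinate system; a y-down system flips the convention.
    signed_area(points) < 0.0
}

fn main() {
    let triangle = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)];
    println!("area = {}, clockwise = {}", signed_area(&triangle), is_clockwise(&triangle));
}
```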

@SamPruden

SamPruden commented Dec 21, 2021

I'm not experienced in this area, but I believe that tessellation-based solutions don't play particularly nicely with AA and other subtle subpixel effects, or animated transitions. Looking at projects like Pathfinder, a lot of work seems to go into custom rasterisation for high-quality results at higher performance. I would assume that Bevy wants its core canvas - used in both the editor and games - to be capable of those types of ultra-crisp looks.

@heavyrain266

We are currently shipping prebuilt versions of shaderc. The problem is that this requires a version for every supported target. Currently we for example don't ship anything for arm32/aarch64 linux, despite bevy fully working on these platforms. There is a limit on the amount of targets for which we can ship prebuilt versions due to the crate size limit on crates.io. Cart already had to ask for it to be increased.

I actually discovered a subset of Skia called tiny-skia, implemented in pure Rust. I'm using it as the UI backend for my Aseprite-style tilemap editor and it works perfectly fine; I've also seen someone use it as a desktop-widget backend on Linux.

@alice-i-cecile alice-i-cecile added C-Feature A new feature, making something new possible C-Tracking-Issue An issue that collects information about a broad development initiative and removed focus-area labels Jan 1, 2022
@heavyrain266

heavyrain266 commented Jan 29, 2022

As a side note, this project seems to be abandoned -> https://webglstudio.org/ | https://github.com/jagenjo/webglstudio.js

Perhaps it would be a good idea to pick it up, rewrite the filesystem in Rust, and migrate the UI to Tailwind with Actix networking, without even including a webview interface (just navigate to a localhost port).

*** It would be fun and interesting to see if, instead of a separate Bevy editor, we could make a 2D/3D UI internally within a Bevy project: a Bevy Engine game with editing and publishing capabilities, elegantly coded with modular entities, components, and systems for building developer tools and/or game features. We would also not be limited to making 2D GUI components and tools, and could step into VR situations, for example spray-painting textures, etc. Other game engines have markets, plugins, addons, etc.; we would essentially have an exponentially growing platform instead, where our community can work on their own games and commit functionality towards the editor/open-world base project. For one example, a network solution built modularly within the editor could be used to back up my Bevy workspaces across multiple devices, reused for collaborative in-game editing with other developers, used to add multiplayer to the same game, used as a client patch updater, or even as a community-based asset manager.

Instead of playing around with a dead project, a better idea would be to use eframe by the egui author; you can use it natively and, through WASM, as an editor for web games.

@Barugon

Barugon commented Jan 29, 2022

Instead of playing around with a dead project, a better idea would be to use eframe by the egui author; you can use it natively and, through WASM, as an editor for web games.

You don't need eframe; there's a bevy_egui plugin.

@Barugon

Barugon commented Mar 9, 2022

There is a bevy blender bridge someone started -> https://github.com/piedoom/blender_bevy_toolkit

Interesting. Thanks for the link.

@sharky-david

I was getting frustrated with trying to build UIs with all the boilerplate it involves, so I made myself a rudimentary way to style Bevy UI elements with CSS (the current implementation is basically a subset of web styling anyway). It's been sitting on my PC for 6 months, and I've found it useful, so maybe someone else will too.

https://github.com/sharky-david/bevy_prototype_css

I'd be very interested in what others think.

@ShalokShalom

I am with the others who think CSS has no place in a modern world.

@breebee

breebee commented Jul 23, 2022

I have been using blender+armory for the past couple of months; it's an absolutely amazing experience using Blender as an editor. Unbeatable, hands down, in every aspect from management and organization to workflow. I strongly recommend Cart reconsider Blender as the editor and improve upon Armory's approach with Bevy's ECS.

@heavyrain266

heavyrain266 commented Jul 23, 2022

I have been using blender+armory for the past couple of months; it's an absolutely amazing experience using Blender as an editor. Unbeatable, hands down, in every aspect from management and organization to workflow. I strongly recommend Cart reconsider Blender as the editor and improve upon Armory's approach with Bevy's ECS.

I don't think that you fully understand the scope of an editor for a game engine. It's not only 3D editing but also 2D, visual UI design, storytelling, quests, asset management, potential visual scripting with node graphs, and many other tools, plugins, etc.

For example, Guerrilla Games uses Autodesk Maya with a custom renderer for editing the game's world, while their editor is used for visual scripting, UI design, quests, and lots of other technical work.

The problem with Blender is that you cannot just plug Bevy's renderer into it... so you will have baked lights and a lot of other hardcoded data, while you still need an editor for the stuff unrelated to rendering.

@Pauan

Pauan commented Jul 23, 2022

I don't think that you fully understand the scope of an editor for a game engine. It's not only 3D editing but also 2D, visual UI design, storytelling, quests, asset management, potential visual scripting with node graphs, and many other tools, plugins, etc.

Yes, Armory has all of that. I suggest you give it a look. It heavily extends Blender by adding a lot of extra features. It is a full-featured editor and game engine.

The problem with Blender is that you cannot just plug Bevy's renderer into it... so you will have baked lights and a lot of other hardcoded data, while you still need an editor for the stuff unrelated to rendering.

I think it is possible to integrate Bevy with Blender. Blender's add-on API does support custom renderers (even Cycles is written as just an add-on). Blender's add-on system is incredibly powerful. And since it's possible to call C / Rust code from Python, I don't see any technical reason why it can't be done.
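
The "call Rust from Python" bridge mentioned here boils down to exporting a C-ABI function that Python (and therefore a Blender add-on) could load via ctypes. A minimal sketch, with a hypothetical function name and signature:

```rust
// Minimal C-ABI export sketch: a Blender add-on written in Python could load
// this from a compiled cdylib via ctypes. The name and signature are
// hypothetical, purely to illustrate the bridge mechanism.
#[no_mangle]
pub extern "C" fn bevy_export_entity_count(scene_objects: u32) -> u32 {
    // A real bridge would serialize scene data here; this just echoes a count.
    scene_objects
}

fn main() {
    // On the Python side this would look something like:
    //   lib = ctypes.CDLL("libbevy_bridge.so")
    //   lib.bevy_export_entity_count(5)
    println!("{}", bevy_export_entity_count(5));
}
```

In a real add-on the crate would be built with `crate-type = ["cdylib"]` so the shared library can be loaded from Blender's embedded Python.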

This sort of integration is actually quite common, it's fairly standard nowadays.

For example, BabylonJS is a JS 3D game engine which has integration with 3DS MAX, Blender, Cheetah, and Maya.

And Unreal Engine also has Blender add-ons which provide integration with Unreal.

It would be fairly easy to create a Blender add-on which exports a Blender scene into the Bevy editor.

@Weibye Weibye moved this to High Level Goals in UI Jul 28, 2022
@Weibye Weibye added this to UI Jul 28, 2022
@mrchantey
Contributor

I am with the others who think CSS has no place in a modern world.

I agree that the older parts of the spec have their issues, but the modern features, i.e. grid and flex, are super powerful.

Unity has resolved to use a CSS subset in their "Singular UI Solution" moving forward.

@ShalokShalom

I don't think that you fully understand the scope of an editor for a game engine. It's not only 3D editing but also 2D, visual UI design, storytelling, quests, asset management, potential visual scripting with node graphs, and many other tools, plugins, etc.

Yes, Armory has all of that. I suggest you give it a look. It heavily extends Blender by adding a lot of extra features. It is a full-featured editor and game engine.

The problem with Blender is that you cannot just plug Bevy's renderer into it... so you will have baked lights and a lot of other hardcoded data, while you still need an editor for the stuff unrelated to rendering.

I think it is possible to integrate Bevy with Blender. Blender's add-on API does support custom renderers (even Cycles is written as just an add-on). Blender's add-on system is incredibly powerful. And since it's possible to call C / Rust code from Python, I don't see any technical reason why it can't be done.

This sort of integration is actually quite common, it's fairly standard nowadays.

For example, BabylonJS is a JS 3D game engine which has integration with 3DS MAX, Blender, Cheetah, and Maya.

And Unreal Engine also has Blender add-ons which provide integration with Unreal.

It would be fairly easy to create a Blender add-on which exports a Blender scene into the Bevy editor.

I think one of the huge benefits we gain by going with Blender is obviously that we develop less code.

The maintenance is done for us, especially with Armory also doing work for us.

Another huge benefit that I see is familiarity: Blender is used and loved worldwide; there are literally tens of thousands of people who can already use 'our editor'.

I think this deserves consideration.

@cart
Member Author

cart commented Oct 28, 2022

I understand the "Blender integrated engine" value prop. I have used Armory a reasonable amount and have read a good portion of their code. However my stance on this hasn't changed. Bevy's Editor should be built using Bevy. I won't restate every reason I've stated above and elsewhere, but in short:

Blender has its own way to represent data, extend the editor (Python), represent scenes, handle UX, etc. This means that developing the editor experience would require Bevy developers to learn a brand new skill set. Bevy Users that want to tinker on the Bevy Editor can't rely on their existing Bevy experience to build the tooling. They need to learn a brand new set of skills. This raises the barrier to entry. The Bevy<->Blender translation layer would also introduce significant complexity (and almost certainly some runtime overhead). Building the Bevy Editor in Bevy means that we can use the same toolsets, the same skills, and the same paradigms everywhere. And it forces us to dogfood Bevy Engine features, ensuring Bevy is a good option for building "engine tooling".

Additionally, while Blender is a top-of-class 3D editor, they aren't optimizing their experiences (or data layers) for real-time game development (anymore). If Bevy's needs are ever mismatched with Blender's, or our users need something that Blender's developers aren't willing to accommodate, then we would just be stuck. If the Bevy Org owns the editor in its entirety, we can provide a holistic experience that optimizes for the exact needs of our community.

@mrDIMAS

mrDIMAS commented Dec 1, 2022

Just wanna highlight again that an editor-ready UI crate already exists: fyrox-ui (link). Not everything should be ECS-based; a solution is good if it does its job in reasonable time. Building your own UI library is cool in terms of satisfying programmers' curiosity, but counterproductive in terms of development time. Objections?

@nicoburns
Contributor

nicoburns commented Jun 2, 2023

This is a dump of my personal "UI is usable" checklist:

Essential

Nice to have

@rlidwka
Contributor

rlidwka commented Jun 5, 2023

Just wanna highlight again that an editor-ready UI crate already exists: fyrox-ui (link). Not everything should be ECS-based; a solution is good if it does its job in reasonable time.

@mrDIMAS, do you have any examples of using it with Bevy engine? I'm using bevy-egui currently for my projects, but don't mind exploring other options.

@Barugon

Barugon commented Jun 5, 2023

FYI, I don't think that fyrox-ui has any keyboard navigation.

@SorteKanin

Related to this UI project thingy and regarding these points from the top post:

  • Define a consistent way to implement widgets
  • A core set of widgets: buttons, inputs, resizable panels, etc

I'd like to just drop some of my experience using Bevy UI so far and how I've built a little widget system on top of it. I'm curious whether anyone else has done something similar (surely someone has?) and whether this is an approach that could be built into Bevy (or maybe used in the editor or something). Or maybe this is not desirable for some reason? I haven't explored this approach a lot so perhaps there's some disadvantages I'm not seeing yet.

I think that a big issue with Bevy UI at the moment is that it's very verbose to define UI and it's cumbersome to compartmentalize. It's also not intuitive how a "widget" (or element or whatever you want to call it) can have control over where to place its internals (i.e. its children UI elements).

After experimenting a bit, I ended up writing these couple of traits:

/// Types that can spawn new entities.
pub(crate) trait Spawner<'w, 's> {
    /// Spawns a new entity with the given bundle of components.
    ///
    /// Returns an [`EntityCommands`] to enable further commands on the entity.
    fn spawn<B: Bundle>(&mut self, bundle: B) -> EntityCommands<'w, 's, '_>;
}

impl<'w, 's> Spawner<'w, 's> for Commands<'w, 's> {
    fn spawn<B: Bundle>(&mut self, bundle: B) -> EntityCommands<'w, 's, '_> {
        self.spawn(bundle)
    }
}

impl<'w, 's> Spawner<'w, 's> for ChildBuilder<'w, 's, '_> {
    fn spawn<B: Bundle>(&mut self, bundle: B) -> EntityCommands<'w, 's, '_> {
        self.spawn(bundle)
    }
}
/// A trait for widgets that lets them be spawned neatly.
pub(crate) trait Widget<'w, 's> {
    /// Spawns the widget with the given spawner.
    fn spawn<S: Spawner<'w, 's>>(self, spawner: &mut S) -> EntityCommands<'w, 's, '_>;
}

The Spawner trait is just unifying the spawn methods of Commands and ChildBuilder. I'm honestly confused why this isn't built into Bevy already, but I can define it myself pretty easily anyway.
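To show why unifying the two spawn entry points pays off, here is a minimal self-contained sketch of the same pattern using toy stand-ins (the `Commands` and `ChildBuilder` structs below are hypothetical placeholders, not Bevy's real types): a widget written once against the trait works both at the top level and nested under a parent.

```rust
// Toy stand-ins for Bevy's Commands and ChildBuilder, just to illustrate
// the shape of the Spawner abstraction without depending on Bevy itself.
struct Commands {
    log: Vec<String>,
}

struct ChildBuilder<'a> {
    log: &'a mut Vec<String>,
    depth: usize,
}

trait Spawner {
    fn spawn(&mut self, name: &str);
}

impl Spawner for Commands {
    fn spawn(&mut self, name: &str) {
        self.log.push(name.to_string());
    }
}

impl<'a> Spawner for ChildBuilder<'a> {
    fn spawn(&mut self, name: &str) {
        // Indent by nesting depth so the hierarchy is visible in the log.
        self.log.push(format!("{}{}", "  ".repeat(self.depth), name));
    }
}

// A "widget" written once against Spawner works in both contexts.
fn spawn_label<S: Spawner>(spawner: &mut S, text: &str) {
    spawner.spawn(&format!("Label({text})"));
}

fn demo() -> Vec<String> {
    let mut cmds = Commands { log: Vec::new() };
    spawn_label(&mut cmds, "root");
    let mut child = ChildBuilder { log: &mut cmds.log, depth: 1 };
    spawn_label(&mut child, "nested");
    cmds.log
}

fn main() {
    for line in demo() {
        println!("{line}");
    }
}
```

The point is only the shape: one generic bound (`S: Spawner`) lets the same widget code be spawned from a system (via `Commands`) or inside a `with_children` closure (via `ChildBuilder`).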

The Widget trait then allows any implementor to be spawned under a given Spawner. For example, you could define a simple menu like this (some used functions returning NodeBundle's left out):

/// The menu widget consists of a container with a title.
pub(crate) struct MenuWidget<F: FnOnce(&mut ChildBuilder)> {
    /// The title is displayed at the top of the menu.
    title: &'static str,
    /// The content of the menu is filled by this closure.
    content: F,
}

impl<F: FnOnce(&mut ChildBuilder)> MenuWidget<F> {
    /// Creates a new menu with the given title and content.
    pub(crate) fn new(title: &'static str, content: F) -> Self {
        MenuWidget { title, content }
    }
}

impl<'w, 's, F> Widget<'w, 's> for MenuWidget<F>
where
    F: FnOnce(&mut ChildBuilder),
{
    fn spawn<S: Spawner<'w, 's>>(self, spawner: &mut S) -> EntityCommands<'w, 's, '_> {
        let mut root = spawner.spawn(menu_body());
        root.with_children(|root| {
            root.spawn(menu_background()).with_children(|background| {
                background.spawn(
                    TextBundle::from_section(
                        self.title,
                        TextStyle {
                            font_size: 72.0,
                            color: Color::WHITE,
                            ..default()
                        },
                    )
                    .with_style(Style {
                        margin: UiRect::all(Val::Px(48.0)),
                        ..default()
                    }),
                );

                (self.content)(background)
            });
        });
        root
    }
}

Note how this allows the MenuWidget to decide the location of its inner content (it's nested into the menu's background, not just as a direct child of the menu). The definition of the MenuWidget is still a little verbose, but now it's packed away in a neat self-contained package. By then also defining a ButtonWidget and a CheckboxWidget, I can define UI that uses all the widgets quite neatly. For instance, a very simple settings menu could look like this:

fn setup_settings_menu(mut cmds: Commands, settings: Res<Settings>) {
    info!("Setting up settings menu...");

    MenuWidget::new("Settings", |menu| {
        CheckboxWidget::new("Enable vsync:", settings.vsync, Action::ToggleVsync).spawn(menu);

        CheckboxWidget::new(
            "Enable development mode:",
            settings.dev_mode == DevMode::On,
            Action::ToggleDevMode,
        )
        .spawn(menu);

        ButtonWidget::new("Back", |world| {
            world
                .resource_mut::<NextState<AppState>>()
                .set(AppState::Menu(MenuState::Main));
        })
        .spawn(menu);
    })
    .spawn(&mut cmds)
    // Insert a marker component to easily despawn the menu later.
    .insert(SettingsMenu);
}

I find the above quite neat as each widget is self-contained. I only have to give the widget its required inputs and it'll take care of the rest.

In the above code, buttons take a label and an action (a fn(&mut World)) that they can use to do something once pressed. Checkboxes take a label, an initial state (bool) and an Action enum that is a list of user input actions. But the specific inputs aren't important and are mostly implementation details of how I've built these widgets for myself right now.

The point is that each widget is in control of what inputs it gets and what it uses those inputs for in order to construct the UI. It definitely still requires some additional systems to make these widgets do things, but I thought I'd still share this approach in case it is interesting to anyone.
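The "action as a function pointer" idea above can be sketched in isolation. This is a toy, self-contained illustration (the `World` struct and field names here are hypothetical stand-ins, not Bevy's `World`), showing how a button stores a `fn(&mut World)` and a press-handling system would invoke it later:

```rust
// Hypothetical stand-in for a world/resource container.
struct World {
    vsync: bool,
}

// A button that carries its label and an action to run when pressed.
struct ButtonWidget {
    label: String,
    action: fn(&mut World),
}

impl ButtonWidget {
    fn new(label: &str, action: fn(&mut World)) -> Self {
        Self {
            label: label.to_string(),
            action,
        }
    }

    // In a real app, a system would detect the press and call this.
    fn press(&self, world: &mut World) {
        (self.action)(world);
    }
}

fn toggle_vsync_demo() -> bool {
    let mut world = World { vsync: false };
    let button = ButtonWidget::new("Toggle vsync", |world| world.vsync = !world.vsync);
    button.press(&mut world);
    world.vsync
}

fn main() {
    println!("vsync after press: {}", toggle_vsync_demo());
}
```

Note that the non-capturing closure coerces to a plain `fn(&mut World)` pointer; in Bevy one would typically store something like this in a component and have an exclusive system with `&mut World` access drain and run the pending actions.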
