
Use-after-free if Styles are dropped after being assigned #110

Closed
nia-e opened this issue May 6, 2023 · 11 comments · Fixed by #141
Labels: bug (Something isn't working), help wanted (Extra attention is needed), question (Further information is requested)

Comments

@nia-e
Collaborator

nia-e commented May 6, 2023

Turns out that when adding a style to a widget, LVGL doesn't actually store that style internally anywhere; instead, it just stores a pointer to it. Thus, dropping a Style after it's been added to an object can cause undefined behaviour and we have no way of detecting it.

My first idea was to internally hold a static mut ALL_STYLES: Vec<Style> and, rather than having Style creation return a Style, have it return a &mut Style (giving the option of removing Styles from the global vector unsafely). This is... bad, though, since it would force anyone using us to write a lot of unsafe code, and it would also break if alloc is not enabled. Similar issues re: unsafe proliferation would occur if we made add_style() unsafe directly.

My next suggestion might be having every widget struct internally hold a Style, and on assigning a style, simply copy the new Style in-place. However, this makes assigning multiple Styles impossible. We could maybe find a way to "merge" styles together, but I fear this may require carrying a patched version of LVGL itself, and I don't think we want that headache. Carrying a Vec<Style> internally is better than the idea above but still breaks if we don't have alloc (though I guess we could write our own Vec with the LVGL allocator? That might be our best bet).
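For illustration, a minimal plain-Rust sketch of the copy-in idea, assuming alloc is available. All names here (Widget, add_style, etc.) are hypothetical stand-ins, not the real lvgl-rs or LVGL API:

```rust
// Hypothetical sketch: a widget that owns copies of the styles assigned to it,
// so the caller's Style can be dropped safely. Not the real lvgl-rs API.
#[derive(Clone, Default, Debug, PartialEq)]
pub struct Style {
    bg_color: u32,
    radius: i16,
}

#[derive(Default)]
pub struct Widget {
    // Requires `alloc`; a Vec backed by the LVGL allocator could replace this.
    styles: Vec<Style>,
}

impl Widget {
    // Copy the style in, so the caller's value may go out of scope freely.
    pub fn add_style(&mut self, style: &Style) {
        self.styles.push(style.clone());
    }

    pub fn style_count(&self) -> usize {
        self.styles.len()
    }
}

pub fn demo() -> usize {
    let mut w = Widget::default();
    {
        let s = Style { bg_color: 0x0000FF, radius: 4 };
        w.add_style(&s);
    } // `s` is dropped here; the widget still holds its own copy
    w.style_count()
}

fn main() {
    assert_eq!(demo(), 1); // the widget retains one owned style
}
```

This sidesteps the use-after-free entirely, at the cost of one copy per assignment and the multiple-styles bookkeeping discussed above.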

Lastly, we can try to make this Someone Else's Problem by making the compiler complain when Styles are dropped too early. The neatest way would be to carry one or more PhantomDatas and append to them the lifetimes we care about, causing a compile error if a Style we care about is dropped - thus forcing the consumer of our API to use some kind of collection on their end. Since Rust allows "summing" lifetimes (a value bounded by several borrows is simply bounded by the shortest of them), this might actually be workable. Neat.

I've also considered doing what we do with Objects, i.e. storing a raw pointer, but that leads right back to memory-leak issues, since Styles don't have parents to drop them. (We don't yet implement deletion on objects either, but it can be done relatively easily, and it might be worth doing on Screen types at least.)

@nia-e added the bug, help wanted, and question labels on May 6, 2023
@nia-e
Collaborator Author

nia-e commented May 6, 2023

@kisvegabor is there any way to merge together two styles to support the idea in paragraph 3 above?

@kisvegabor
Member

There is no built-in way at the moment to merge two styles. But even if it were possible, it'd be a little more complicated. Imagine a button with 3 styles:

  1. a normal style to make the button blue
  2. a style to make the button darker blue in the pressed state
  3. a style to add a slight shadow in the pressed state

In this case we need 2 styles in the button: one for 1) and another for 2) and 3). That is, we can't merge together the pressed, checked, etc. styles of the various parts (main, scrollbar, indicator, etc.). For each combination we would need a separate style.

Thus, dropping a Style after it's been added to an object can cause undefined behaviour and we have no way of detecting it.

How can "dropping a Style" happen in Rust? Is it an explicit act of the user, or is it dropped by Rust automatically?

@nia-e
Collaborator Author

nia-e commented May 8, 2023

Normally dropping happens automatically when a value goes out of scope; we could stop this quite easily, but I was hoping there's some way to tell when it's okay to drop a style and reclaim the memory.
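As a concrete reference for the question above, here is a minimal plain-Rust (no LVGL) illustration of automatic dropping at the end of a scope, using a Drop impl that records when it runs:

```rust
// Demonstrates that Rust drops a value automatically when it leaves scope.
use std::sync::atomic::{AtomicBool, Ordering};

// Flag flipped by the destructor so we can observe when the drop happened.
static DROPPED: AtomicBool = AtomicBool::new(false);

struct Style;

impl Drop for Style {
    fn drop(&mut self) {
        DROPPED.store(true, Ordering::SeqCst);
    }
}

pub fn demo() -> (bool, bool) {
    let before;
    {
        let _style = Style;
        before = DROPPED.load(Ordering::SeqCst); // still in scope: not dropped
    } // `_style` is dropped automatically here, no user action required
    let after = DROPPED.load(Ordering::SeqCst);
    (before, after)
}

fn main() {
    assert_eq!(demo(), (false, true)); // dropped only after the scope ends
}
```

This is exactly why a Style that was only assigned by pointer becomes dangling: the drop is implicit, not an explicit act by the user.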

@kisvegabor
Member

Normally (I mean in a common LVGL UI scenario), the styles should be created and initialized once at the beginning of the UI code and should never be dropped. At least this is how we do it in C.

There can be cases where a given group of styles is related, e.g., to a screen, and you want to free the styles too when you delete the screen. But that should be needed only if RAM is very limited.

@nia-e
Collaborator Author

nia-e commented May 9, 2023

Is it safe to drop a style if we can assert that no object is using it?

@kisvegabor
Member

It's safe, but the user also needs to be aware that unused styles will be dropped. For example, this can happen:

  1. During start-up, the user initializes styles for a popup
  2. The styles are not used yet, so they are freed
  3. When a button is pressed, a popup needs to be created with the styles, but the styles have already been freed

Basically the same issue can happen if some time elapses between initializing the styles and actually using them.

@nia-e
Collaborator Author

nia-e commented May 9, 2023

I was going to keep track of which styles to drop statically by taking advantage of lifetimes, so using a style after it went out of scope would be a compile error and force the user to deal with it (e.g. move the style to a greater scope). This way, there's no runtime overhead of checking which styles are in use and a style will never be dropped while an object exists which references it.

@kisvegabor
Member

So the styles in the largest (global?) scope will never be dropped, right?

Let's say I create a style in a function and assign it to an object. When the function exits, the style won't be dropped, as the style is being used; but if it wasn't assigned in the function for any reason, it will be dropped. And if all the objects to which the style was assigned are deleted, the style will be freed as well. Is that correct?

@nia-e
Collaborator Author

nia-e commented May 10, 2023

Sort of. Under my proposal, if I were to do this:

let mut obj = Obj::new();
{
    let style = Style::default();
    obj.add_style(&style);
    // ~~~~~~~~~~~^~~~~~~~
    // ERROR: `style` does not live long enough
}

It would be a compile-time error as we're defining style in a smaller scope than the object it's assigned to. Thus, we need to either move it into the same scope as the object (or greater), e.g. by creating some kind of growable collection or declaring it but not initialising it:

let style_uninit; // `style_uninit` lives in this scope, but is uninitialised
let mut style_vec = Vec::new(); // Assuming we have an allocator and heap access

let mut obj_1 = Obj::new();
let mut obj_2 = Obj::new();
{
    style_uninit = Style::default(); // Initialise `style_uninit`
    obj_1.add_style(&style_uninit);
    // No error, since we're just initialising the style here; the value itself lives in the parent scope

    let style_new = Style::default();
    style_vec.push(style_new);
    obj_2.add_style(style_vec.last().unwrap()); // `.last()` only fails if the `Vec` is empty
    // No error; we create `style_new` here, but move it into the parent scope, which is also where the object lives
}

Thus the compiler can assert that by the time it's ready to drop the Style (aka lv_style_t), it is also ready to drop the Obj (aka lv_obj_t). If we were to move the object into some still-greater scope (i.e. if the code above were inside a non-main function body and we moved the object into main()), the compiler would complain and ask us to drag the style along with it. We could also provide methods to intentionally leak a style and/or object. Since a style doesn't contain any other references or pointers that Rust tracks, a leaked style would have a 'static lifetime, i.e. it lives until the end of the program. Leaking an object would only give it the same lifetime as the shortest-lived Style we've assigned to it, so leaking all the Styles associated with an object, and then the object itself, means they all live forever.
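A compilable sketch of this scheme, with placeholder types: PhantomData ties the object's lifetime parameter to every borrowed style, and a counter stands in for the real lv_obj_t state. None of this is the actual lvgl-rs API:

```rust
// Hedged sketch of the lifetime-based proposal; types are illustrative
// stand-ins, not the actual lvgl-rs API.
use std::marker::PhantomData;

#[derive(Default)]
pub struct Style;

pub struct Obj<'a> {
    styles_added: usize,
    // Ties the object's lifetime to every style borrowed via `add_style`,
    // standing in for the raw pointer `lv_obj_add_style` would store.
    _styles: PhantomData<&'a Style>,
}

impl<'a> Obj<'a> {
    pub fn new() -> Self {
        Obj { styles_added: 0, _styles: PhantomData }
    }

    pub fn add_style(&mut self, _style: &'a Style) {
        self.styles_added += 1;
    }
}

pub fn demo() -> usize {
    let style = Style;       // declared before `obj`, so it outlives it: accepted
    let mut obj = Obj::new();
    obj.add_style(&style);

    // By contrast, this would be rejected at compile time:
    // {
    //     let inner = Style;
    //     obj.add_style(&inner); // ERROR: `inner` does not live long enough
    // }
    // obj.add_style(&style);    // ...because `obj` is used again afterwards

    obj.styles_added
}

fn main() {
    assert_eq!(demo(), 1); // one style successfully attached
}
```

The whole check happens at compile time, so there is no runtime bookkeeping at all, matching the "no runtime overhead" claim above.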

@kisvegabor
Member

kisvegabor commented May 11, 2023

I see now and it sounds good! 👍 Thank you for the explanation.

@cydergoth
Contributor

I just ran into this, and ended up storing a .clone() (made implicitly by Rust) of the Style object in a Rust struct which contains both the LVGL widget and the associated style. It requires a #[allow(dead_code)], which isn't ideal, and it requires manually adding all the correct styles to the encapsulating Rust struct.

  #[allow(dead_code)] // Used to protect the styles
  #[derive(Debug)]
  pub struct RustyMegaWidget<'a> {
      title: String,
      view: Obj,
      view_title: Label,
      // Styles are here to ensure they have the same lifecycle as RustyMegaWidget
      style_view: Style,
      style_title: Style,
      ....
  }

This is obviously not an ideal situation.
