Finger tracking using Godot 4.2 functions #546
Closed
There's no teleport that leads to this example scene. You can reach it by setting it as the main scene before you upload the demos.
The crucial lines using the new OpenXR functions in Godot 4.2 are:
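For context, a minimal sketch of those Godot 4.2 OpenXRInterface calls (not the exact lines from this PR; the node setup and variable names are placeholders):

```gdscript
extends Node3D

var xr_interface: OpenXRInterface

func _ready() -> void:
    xr_interface = XRServer.find_interface("OpenXR") as OpenXRInterface

func _process(_delta: float) -> void:
    if xr_interface == null:
        return
    for hand in [OpenXRInterface.HAND_LEFT, OpenXRInterface.HAND_RIGHT]:
        for joint in range(OpenXRInterface.HAND_JOINT_MAX):
            # Joint data is reported in the tracking (play) space, i.e. relative
            # to the XROrigin3D, so markers are typically placed under the origin.
            var joint_position: Vector3 = xr_interface.get_hand_joint_position(hand, joint)
            var joint_rotation: Quaternion = xr_interface.get_hand_joint_rotation(hand, joint)
            var joint_radius: float = xr_interface.get_hand_joint_radius(hand, joint)
            # ...use joint_position / joint_rotation here to place markers or drive bones...
```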
I lack access to the active/confidence values in the hand tracking; I would have to use the OpenXRHand nodes to get at that boolean data, but I can't make them work in this example, so the hands are shown by default.
godotengine/godot#78032 (comment)
(I am not hiding the controller hands, which move only when your hands are facing palm down.)
I am plotting a series of spikes at the transform positions so you can see how the joint_positions and joint_rotations relate. Then I use the joint_positions to curl the skeleton of some old OVR hand models bone by bone, conforming them to these joint_positions in the function setshapetobonesOVR(). In my opinion this code will have to be written for each different avatar form, because very few avatars or hand models are being made that are compatible with the OpenXR standard hand skeleton. With a bit more work the code could be made to guess how the bones might correspond (as is probably done now in VRChat).
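The real setshapetobonesOVR() works from the joint_positions against the OVR hand rig; purely as an illustration of driving a Skeleton3D bone by bone, here is a hedged sketch that uses parent-relative joint rotations instead. The bone names and the joint-to-bone map are hypothetical, and rest-pose axis differences between the rig and the OpenXR skeleton are ignored.

```gdscript
extends Skeleton3D

var xr_interface: OpenXRInterface

# OpenXR joint indices follow the standard joint order (0 = palm, 1 = wrist,
# 6..10 = index metacarpal/proximal/intermediate/distal/tip, and so on).
# Maps { openxr_joint: [parent_openxr_joint, bone_name_in_this_rig] }.
# The bone names below are made-up placeholders for an example left-hand rig.
const JOINT_MAP := {
    7: [6, "b_l_index1"],   # index proximal
    8: [7, "b_l_index2"],   # index intermediate
    9: [8, "b_l_index3"],   # index distal
}

func _ready() -> void:
    xr_interface = XRServer.find_interface("OpenXR") as OpenXRInterface

func _process(_delta: float) -> void:
    if xr_interface == null:
        return
    for joint in JOINT_MAP:
        var parent_joint: int = JOINT_MAP[joint][0]
        var bone := find_bone(JOINT_MAP[joint][1])
        if bone == -1:
            continue
        # Express the tracked joint rotation relative to its parent joint and
        # apply it as the bone's pose rotation; a per-avatar mapping would also
        # have to correct for the rig's own rest-pose axes.
        var joint_rot := xr_interface.get_hand_joint_rotation(OpenXRInterface.HAND_LEFT, joint)
        var parent_rot := xr_interface.get_hand_joint_rotation(OpenXRInterface.HAND_LEFT, parent_joint)
        set_bone_pose_rotation(bone, parent_rot.inverse() * joint_rot)
```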
Then we need to talk about how the hand tracking could be made compatible with the rest of the XR-tools library, since we only have the select_button action (no other buttons, float values, or joystick vectors are available). I have implemented an extremely simple locomotion method in the code as follows:
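Sketched roughly (not the exact code in this PR), a pinch-to-slide locomotion along these lines, with hypothetical node names and an arbitrary speed:

```gdscript
extends XROrigin3D

@onready var left_hand: XRController3D = $LeftHand  # hypothetical node name
const MOVE_SPEED := 2.0  # metres per second, arbitrary

func _physics_process(delta: float) -> void:
    # While the pinch (select_button) is held on the tracked left hand,
    # slide the XROrigin3D along the hand's forward direction.
    if left_hand.get_is_active() and left_hand.is_button_pressed("select_button"):
        # Treat -Z of the hand transform as "forward", flattened so the
        # player stays on the ground plane.
        var forward := -left_hand.global_transform.basis.z
        forward.y = 0.0
        if forward.length() > 0.01:
            global_position += forward.normalized() * MOVE_SPEED * delta
```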
There are other movement methods, such as the virtual joystick in Eolia, or highlight-and-teleport. We need to keep this select_button available for selecting buttons (or do locomotion with the left hand and selection with the right hand, as I do in TunnelVR).
I don't know how best to embed this as a movement type, but it might be possible to animate it separately from the hands. Similarly, there should be an aim pose that also comes from the system (as it used to in the Oculus API), based on your arm orientation rather than your finger orientation.