
finger tracking using Godot4.2 functions #546

Closed

Conversation

goatchurchprime
Contributor

There's no teleport to this example scene. You can get to it by setting it as the main scene before you upload the demos.

The crucial lines using the new OpenXR functions in Godot 4.2 are:

```gdscript
var xr_interface = XRServer.primary_interface
for i in range(OpenXRInterface.HAND_JOINT_MAX):
	joint_transformsLR[i] = Transform3D(Basis(xr_interface.get_hand_joint_rotation(hand, i)), xr_interface.get_hand_joint_position(hand, i))
```

I don't have access to the active/confidence values in the hand tracking; I would have to use the OpenXRHand nodes to access this boolean data, but I can't make that work in this example, so the hands are on by default.
godotengine/godot#78032 (comment)
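For reference, a sketch of what reading per-joint validity flags could look like, assuming a Godot build that exposes `OpenXRInterface.get_hand_joint_flags` (the kind of API discussed in the linked issue); this is not in the demo code above:

```gdscript
# Sketch only: assumes get_hand_joint_flags() is available in your Godot build.
var xr_interface = XRServer.primary_interface
for i in range(OpenXRInterface.HAND_JOINT_MAX):
	var flags = xr_interface.get_hand_joint_flags(hand, i)
	# Only trust the joint transform when the runtime reports it as valid.
	if flags & OpenXRInterface.HAND_JOINT_FLAG_ORIENTATION_VALID and flags & OpenXRInterface.HAND_JOINT_FLAG_POSITION_VALID:
		joint_transformsLR[i] = Transform3D(Basis(xr_interface.get_hand_joint_rotation(hand, i)), xr_interface.get_hand_joint_position(hand, i))
```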

(I am not hiding the controller hands, which move only when your hands are facing palm down.)

I am plotting a series of spikes at the transform positions so you can see how the joint positions and joint rotations relate. Then I use the joint positions to curl the skeleton of some old OVR hand models bone by bone, so it conforms to these joint positions, in the function setshapetobonesOVR().
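The actual setshapetobonesOVR() is model-specific, but the general bone-curling idea can be sketched as follows. This is a hypothetical helper, not the demo code: `boneindexes` (a mapping from skeleton bones to OpenXR joint indices) is an assumption, as is reusing the world-space transforms from `joint_transformsLR` above.

```gdscript
# Sketch: drive a Skeleton3D from tracked joint transforms.
# Assumptions (not from the demo): joint_transformsLR holds world-space
# joint transforms, and boneindexes[b] gives the OpenXR joint index
# that should drive skeleton bone b.
func set_bones_from_joints(skel: Skeleton3D, joint_transformsLR: Array, boneindexes: Array) -> void:
	for b in range(skel.get_bone_count()):
		var t: Transform3D = joint_transformsLR[boneindexes[b]]
		var parent_bone: int = skel.get_bone_parent(b)
		if parent_bone != -1:
			# Express the joint transform relative to its parent joint,
			# since bone poses are local to the parent bone.
			t = joint_transformsLR[boneindexes[parent_bone]].affine_inverse() * t
		skel.set_bone_pose_rotation(b, t.basis.get_rotation_quaternion())
```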

It is my opinion that this code will have to be written for each different avatar form because very few avatars or hand models are being made that are compatible with the OpenXR standard hand skeleton. With a bit more work the code could be made to guess how the bones might correspond (as is probably now done in VRChat).

Then we need to talk about how hand tracking could be made compatible with the rest of the XR-tools library, since we only have the select_button action (no other buttons, floats, or joystick-vector values are available). I have implemented an extremely simple locomotion method in the code as follows:

```gdscript
func _on_left_hand_button_pressed(name):
	if name == "select_button":
		var vel = 2.5 * Vector3(-XRCamera.global_transform.basis.z.x, 0, -XRCamera.global_transform.basis.z.z)
		PlayerBody.velocity = vel
```

There are other movement methods, such as the virtual joystick in Eolia, or highlight-and-teleport. We need to keep this select_button available for selecting buttons (or do locomotion with the left hand and selection with the right hand, as I do in TunnelVR).
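The left-hand-locomotion / right-hand-selection split could be sketched roughly like this. It is a hypothetical handler: the `hand_name` routing and the `select_pointer()` call are assumptions, not XR-tools API.

```gdscript
# Sketch: route the single select_button action by hand.
# Left hand drives the simple locomotion shown earlier; the right hand's
# select_button stays free for pointing at and selecting UI.
func _on_hand_button_pressed(hand_name: String, button: String) -> void:
	if button != "select_button":
		return
	if hand_name == "left":
		var fwd: Vector3 = -XRCamera.global_transform.basis.z
		PlayerBody.velocity = 2.5 * Vector3(fwd.x, 0.0, fwd.z)
	else:
		# Hypothetical: forward to whatever UI-pointer select routine you use.
		select_pointer()
```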

I don't know how best to embed this as a movement type, but it might be able to animate separately from the hands. Similarly, there should be an aim pose that also comes from the system (as it used to on the Oculus API), which was based on your arm orientation, not your finger orientation.

@goatchurchprime
Contributor Author

This was implemented in the AutoHands addon https://github.com/Godot-Dojo/Godot-XR-AH
