Releases · UCL-VR/ubiq-avatars-readyplayerme
v1.0.2
v1.0.1
v1.0.0
Initial release. Support is currently limited to Half-Body avatars.
Features
- Integration as a Ubiq avatar
- Runtime import of the avatar model from a URL (see the first sketch after this list)
- Speech indicator and mouth animation driven by the microphone (network synced using Ubiq's VoIP)
- Grip animation on the hands, activated by the VR controller (network synced; see the second sketch after this list)
- Eye animation (randomized; not linked to the user's eye motion and not network synced)
- Half-Body avatars have head and hand tracking in VR with appropriate offsets, and can be controlled with keyboard and mouse as a fallback
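The two sketches below are illustrative only and are not taken from this package. The first shows what runtime import of an avatar by URL can look like using the Ready Player Me Unity SDK's AvatarObjectLoader; the class, namespace and event names vary between SDK versions, and the loading path this package uses internally may differ.

```csharp
using UnityEngine;
using ReadyPlayerMe.Core; // namespace/class names vary between SDK versions

// Illustrative sketch: load a Ready Player Me avatar at runtime from its URL
// and parent it under this GameObject.
public class AvatarUrlLoader : MonoBehaviour
{
    // e.g. https://models.readyplayer.me/<avatar-id>.glb
    public string avatarUrl;

    void Start()
    {
        var loader = new AvatarObjectLoader();
        loader.OnCompleted += (sender, args) =>
        {
            // args.Avatar is the instantiated avatar GameObject.
            args.Avatar.transform.SetParent(transform, false);
        };
        loader.OnFailed += (sender, args) => Debug.LogError("Avatar load failed");
        loader.LoadAvatar(avatarUrl);
    }
}
```

The second shows the general shape of synchronising a per-avatar value (such as the grip amount mentioned above) with Ubiq's messaging; the component and field names (GripSync, gripAmount, isLocalAvatar) are hypothetical, and how components register with the network scene differs between Ubiq versions.

```csharp
using UnityEngine;
using Ubiq.Messaging;

// Illustrative sketch: synchronise a single float (a grip amount read from the
// local VR controller) over Ubiq. GripSync and gripAmount are hypothetical names.
public class GripSync : MonoBehaviour
{
    NetworkContext context;

    public bool isLocalAvatar;   // only the locally controlled avatar sends
    public float gripAmount;     // 0 = open hand, 1 = fully gripped

    struct Message
    {
        public float grip;
    }

    void Start()
    {
        // How components register and obtain a NetworkId differs between Ubiq versions.
        context = NetworkScene.Register(this);
    }

    void Update()
    {
        // A real implementation would throttle sends and only send on change.
        if (isLocalAvatar)
        {
            context.SendJson(new Message { grip = gripAmount });
        }
    }

    // Called by Ubiq when a message arrives for this component.
    public void ProcessMessage(ReferenceCountedSceneGraphMessage message)
    {
        gripAmount = message.FromJson<Message>().grip;
        // Drive the hand's grip blend shape or animation from gripAmount here.
    }
}
```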
What's missing
- Full-Body avatars currently look very strange when moved; supporting them is more complicated, as we need an open-source way to do full-body VR rigging. Consider them unusable in their current state.
- Real eye/face animation, driven by eye/face-tracking-capable hardware
- Real hand tracking, driven by hand-tracking-capable hardware