VTube Studio Settings
After clicking the settings icon (gear icon) in the main menu, the settings screen will be shown. It has several tabs, a help icon (links to this manual) and a language selector button that lets you change the interface language (app restart needed).
Global (not related to a specific model) settings are saved in this file:
<VTS-App-Path>/VTube Studio_Data/StreamingAssets/Config/vts_config.json
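Since it is a plain JSON file, you can inspect it with any text editor or a short script. Here is a minimal Python sketch, assuming the file parses as a single JSON object; the install path is a placeholder you need to replace yourself, and the printed keys are simply whatever the file contains:

```python
import json
from pathlib import Path

# Placeholder for <VTS-App-Path>; point this at your actual VTube Studio install folder.
vts_app_path = Path("path/to/VTube Studio")
config_path = vts_app_path / "VTube Studio_Data" / "StreamingAssets" / "Config" / "vts_config.json"

with config_path.open(encoding="utf-8") as f:
    config = json.load(f)

# List the top-level keys to see which global (non-model) settings the file stores.
for key in sorted(config):
    print(key)
```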
From here, you can purchase VTube Studio PRO (iOS/Android only). On iOS, there is also a button to restore purchases on this screen.
Furthermore, there are some general settings, for example for hiding the VTS logo or remembering the scene (saving the currently open background and model when restarting), etc.
Next, there’s the "Stream to PC" setup. This is for connecting your phone to your PC to stream the face tracking data to the PC app. This will be covered in the chapter "Connecting your Smartphone to your PC/Mac". On iOS devices, you can also start a USB connection here as an alternative to WiFi. This is recommended, as it tends to be more reliable/stable than WiFi. On Android, this option is unfortunately not available.
Finally, there's the "Mouse Input Config". This allows you to use the mouse/finger position as an input parameter to animate your model in addition to face tracking. You can select an X-Range and Y-Range in pixels. These ranges will be mapped to [-1, 1] for X and Y. With this setting, you could for example make your model follow your finger or mouse pointer with its eyes. Mouse clicks (left, right, middle) are also detected and can be used to trigger hotkeys. In the smartphone app, these mouse clicks can be triggered by touching the screen with one finger (left-click), two fingers (right-click) and three fingers (middle-click).
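To illustrate the mapping described above, here is a small Python sketch of how a pixel coordinate inside the configured range could be converted to [-1, 1]. The function name and numbers are made up for illustration and are not part of VTube Studio:

```python
def map_to_range(value_px: float, range_min_px: float, range_max_px: float) -> float:
    """Linearly map a pixel coordinate inside [range_min_px, range_max_px] to [-1, 1]."""
    # Clamp first so positions outside the configured range stay at -1 or 1.
    value_px = max(range_min_px, min(range_max_px, value_px))
    normalized = (value_px - range_min_px) / (range_max_px - range_min_px)  # 0..1
    return normalized * 2.0 - 1.0                                           # -1..1

# Example: an X-Range of 0..1920 px with the pointer at x = 480 px gives -0.5.
print(map_to_range(480, 0, 1920))
```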
VTube Studio supports voice-based lipsync. This can replace or improve the camera-based mouth tracking by using your voice, recorded from your microphone.
This is now available on Windows, macOS, Android and iOS. For more info, see "Lipsync".
This config tab will let you change settings related to rendering and the camera:
AR Camera Config
- Change face tracking camera FPS. This does not affect the app framerate.
- Change whether the camera preview is shown. If you want to make sure your face is never shown on your smartphone screen, you can configure that here (see third picture).
- Change whether the face mask is shown over your face.
Tracking Config
- Change behavior of model when face tracking is lost. You can either have the model freeze in place when your face is lost or have it return to its default pose.
- Change eye-blink linking behavior (a simplified sketch of both modes follows after this list):
- If "Always" is selected, the EyeOpen values of your left and right eye will always be set to the average of both.
- If "When face is rotated" is selected, the EyeOpen values of your eyes will be linked when your head is rotated too far to the side. In that case, the value of the eye that is still visible to the tracker is used for both eyes.
- Various sensitivities for the iOS/Android face tracker. If (for example) your mouth doesn’t open enough, a quick fix might be to increase the sensitivity here.
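As a rough illustration of the two eye-blink linking modes described above, here is a simplified Python sketch. It is not VTube Studio code; the rotation threshold and the left/right convention are made-up example values.

```python
def link_eye_open(eye_open_left: float, eye_open_right: float,
                  head_rotation_y: float, mode: str) -> tuple[float, float]:
    """Return the (left, right) EyeOpen values after applying the selected linking mode."""
    if mode == "Always":
        # Both eyes always receive the average of the two tracked values.
        avg = (eye_open_left + eye_open_right) / 2.0
        return avg, avg

    if mode == "When face is rotated":
        ROTATION_THRESHOLD = 25.0  # degrees; illustrative value only
        if head_rotation_y > ROTATION_THRESHOLD:
            # Head turned far to one side: reuse the value of the eye that is still visible.
            # (Which eye that is depends on the sign convention; this is just an example.)
            return eye_open_left, eye_open_left
        if head_rotation_y < -ROTATION_THRESHOLD:
            return eye_open_right, eye_open_right

    # Linking disabled or face roughly frontal: keep both values as tracked.
    return eye_open_left, eye_open_right
```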
iOS Blink Fix
- Due to a recent bug in iOS face tracking, your HeadY parameter will twitch upward very slightly when blinking. This can be counterbalanced using these sliders until it is fixed officially by Apple (hopefully soon). A value of around 30-45 for Y is recommended, but check what works well for you. X can usually be left at 0. Depending on your Live2D model, you may not need this setting at all because the effect is very subtle. A conceptual sketch of the compensation follows below.
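Conceptually, this fix subtracts a small, blink-dependent offset from the head rotation. The Python sketch below only illustrates that idea; the actual formula and scaling used by VTube Studio are not documented here, so the scale factor is a pure assumption.

```python
def compensated_head_y(raw_head_y: float, blink_amount: float, fix_y: float) -> float:
    """Illustrative only: counteract the slight upward HeadY twitch that appears while blinking.

    blink_amount: 0.0 = eyes open, 1.0 = eyes fully closed.
    fix_y: the slider value (e.g. 30-45).
    """
    ASSUMED_SCALE = 0.01  # purely illustrative, not a value taken from VTube Studio
    return raw_head_y - blink_amount * fix_y * ASSUMED_SCALE
```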
Quality and Effects
- Turn on Anti-Aliasing (smoother model edges but costs performance, not recommended on mobile)
- Change app FPS. VSync means the FPS will be synced to your display update rate. This is recommended on PC to prevent screen-tearing.
Tracking V2: Adds various experimental iOS tracking improvements. Use with caution!
- Better winking: When one eye is fully closed and the other one isn’t, this will force the partially open eye a bit further open. This makes winking easier and more reliable.
- Better frowning: Automatically frown more by lowering your brows. This works by detecting when the brows are lowered and subtracting a certain value from MouthSmile in that case. A simplified sketch of both heuristics follows below.
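The following Python sketch illustrates the two heuristics described above in a simplified form; the thresholds, boost factor, and parameter ranges are made-up example values, not the ones VTube Studio actually uses.

```python
def better_winking(eye_open_left: float, eye_open_right: float) -> tuple[float, float]:
    """If one eye is fully closed and the other only partly open, push the open eye further open."""
    WINK_THRESHOLD = 0.05  # below this, an eye counts as fully closed (illustrative value)
    BOOST = 1.5            # illustrative boost factor
    if eye_open_left <= WINK_THRESHOLD < eye_open_right:
        return eye_open_left, min(1.0, eye_open_right * BOOST)
    if eye_open_right <= WINK_THRESHOLD < eye_open_left:
        return min(1.0, eye_open_left * BOOST), eye_open_right
    return eye_open_left, eye_open_right


def better_frowning(mouth_smile: float, brow_down: float) -> float:
    """When the brows are lowered, subtract from MouthSmile so the model frowns more."""
    BROW_THRESHOLD = 0.5   # illustrative value
    FROWN_STRENGTH = 0.4   # illustrative amount subtracted at full brow-down
    if brow_down > BROW_THRESHOLD:
        # Parameter range assumed to be [-1, 1] for this illustration.
        return max(-1.0, mouth_smile - brow_down * FROWN_STRENGTH)
    return mouth_smile
```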
Record Screen
- Smartphone-only. Lets you directly record your screen, including microphone sound. A small red dot will be shown at the bottom-right while recording. Tap it to stop the recording at any time.
Starting with the Steam release of VTube Studio, you can use your webcam for face tracking. Please note that, just like with the iPhone/Android version, your webcam picture will never be shown within VTube Studio.
There is no way to reveal the webcam video in VTube Studio, and the app has no code that would even enable it to do so.
Most settings here should be self-explanatory. Generally, a resolution of 1280x720 and FPS of at least 10 are recommended for decent tracking. This is especially true for eye- and blink-tracking.
If "Autostart cam with VTS" is checked, the camera will start automatically a few seconds after the scene has finished loading when VTube Studio starts.
After starting the webcam for the first time, press "Calibrate" while looking at the camera with a neutral face. Your calibration data is saved so you don’t need to calibrate again, even after restarting VTube Studio, but if you feel like the angles of your model look wrong compared to your face, you can always recalibrate. Webcam calibration data is saved in the following file:
<VTS-App-Path>/VTube Studio_Data/StreamingAssets/Config/webcam_calibration.json
For details on the different tracking quality levels, please refer to the information shown when selecting them in VTube Studio. Important: Only Level 5 allows winking.
Right after turning the camera on, the camera calibrates. You can also manually calibrate afterwards by using the button or a hotkey. This resets the face position to looking forward. Calibration is important, so make sure to look straight at the camera with a neutral face when running it.
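Conceptually, calibration treats the head pose captured at that moment as the new "looking forward" pose, and later tracking values are reported relative to it. The following Python sketch only illustrates that idea and is not actual VTube Studio code:

```python
class HeadCalibration:
    """Simplified illustration of calibration as storing a neutral pose."""

    def __init__(self) -> None:
        self.neutral_x = 0.0
        self.neutral_y = 0.0

    def calibrate(self, raw_x: float, raw_y: float) -> None:
        # Called when you press "Calibrate" (or trigger the hotkey) with a neutral face.
        self.neutral_x = raw_x
        self.neutral_y = raw_y

    def apply(self, raw_x: float, raw_y: float) -> tuple[float, float]:
        # All subsequent head angles are reported relative to the stored neutral pose.
        return raw_x - self.neutral_x, raw_y - self.neutral_y
```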
The webcam tracking settings can be used to customize how face tracking values are calculated. Again, most are self-explanatory. Play around with different values until you find something that works for you. "Link Eye Blinking" and "Webcam Parameter Calc. Mode" are explained in a bit more detail below.
"Link Eye Blinking" lets you link your eyes in case you have problems with weird blink behavior. It's recommended to set this to "When face is rotated". In that mode, when the face is rotated left/right to the point where one eye becomes invisible to the tracker, the eye that is still visible will be used to control blinking for both eyes.
"Automatic" is the old way. If this works for you, feel free to keep using it. This requires an initial ~10 second warmup phase (see instructions in VTube Studio). The new "Manual" (now default) mode does not require this and may be more accurate for most people in most situations.
For the "Manual" mode, it’s important to set the sensitivity sliders to something that works for you and to calibrate the webcam once while looking straight at it with a neutral face. These sliders only affect the webcam tracking; if you want to set sensitivities for smartphone tracking, use the same sliders in the Android/iOS version.
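As a rough mental model, a sensitivity slider scales how strongly a raw tracking value drives the corresponding parameter before the result is clamped to the parameter's range. The Python sketch below only illustrates that idea; it is not the exact formula used by the "Manual" mode, and the example numbers are made up.

```python
def apply_sensitivity(raw_value: float, sensitivity: float,
                      out_min: float = 0.0, out_max: float = 1.0) -> float:
    """Scale a raw tracking value by a sensitivity factor and clamp it to the output range."""
    return max(out_min, min(out_max, raw_value * sensitivity))

# Example: the mouth barely opens in the raw tracking data (0.3); sensitivity 2.0 gives 0.6.
print(apply_sensitivity(0.3, 2.0))
```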
If you have any questions that this manual doesn't answer, please ask in the VTube Studio Discord!