This project shows ShaderGraph examples for visionOS 2.
- GitHub: https://github.com/ynagatomo/ShaderGraphByExamples
- Xcode 16.1+, visionOS 2.1+
- Some examples only work with a real Apple Vision Pro device.
An example of how to create a 3D texture image file (`.ktx`) with Blender and Apple TextureConverter:
- Render image slices of a volumetric 3D scene in Blender.
- Convert the slices to a `.ktx` file with Apple TextureConverter, using the `--build_volume` option.
- Use the `.ktx` texture file with the Image3D shader-graph node.
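The conversion step might look like the sketch below. Only the `--build_volume` option comes from this project; the tool name `textureconverter`, the slice file names, and the output flag are hypothetical, so check Apple's TextureConverter documentation for the exact syntax.

```shell
# Hypothetical sketch: pack rendered slices (slice_000.png ... slice_063.png)
# into a single 3D volume texture. Verify the tool name and flag spellings
# against Apple TextureConverter's own help output.
textureconverter --build_volume slice_*.png -o volume.ktx
```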
There are many ways to create depth-map textures; one is Apple Depth Pro, a foundation model for zero-shot metric monocular depth estimation.
- apple/ml-depth-pro: Sharp Monocular Metric Depth in Less Than a Second
- GitHub: https://github.com/apple/ml-depth-pro
- Depth Pro CLI: `% depth-pro-run -i mydata/girl.png -o results --skip-display`
To save a grayscale depth map, modify Apple Depth Pro's Python code in ml-depth-pro/src/depth_pro/cli/run.py (line 90):
- `cmap = plt.get_cmap("binary")  # for grayscale depth map`
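The one-line change above swaps the colormap so depth is written as plain grayscale. As a minimal illustration of what that mapping does (pure Python, independent of Depth Pro; the function name and `invert` flag are only for this sketch), normalizing metric depth into 8-bit grayscale looks like this:

```python
def depth_to_grayscale(depth, invert=True):
    """Normalize metric depth values to 8-bit grayscale (0-255).

    With invert=True, near points map to white and far points to black,
    matching the direction of matplotlib's "binary" colormap.
    """
    lo, hi = min(depth), max(depth)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat depth map
    gray = [int(round(255 * (d - lo) / span)) for d in depth]
    if invert:
        gray = [255 - g for g in gray]
    return gray

# Example: depths in meters for a few pixels; the nearest pixel becomes white.
print(depth_to_grayscale([0.5, 1.0, 2.0]))  # -> [255, 170, 0]
```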
To create side-by-side images for stereoscopic display on Apple Vision Pro, you can use Blender, for example with its stereoscopy (Multi-View) rendering.
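Blender's stereo camera produces a left-eye and a right-eye render; the side-by-side frame is simply the two images packed horizontally. A minimal, library-free sketch of that packing step (images modeled as lists of pixel rows; a real pipeline would use an image library or Blender's own side-by-side stereo output):

```python
def side_by_side(left, right):
    """Concatenate two images (lists of pixel rows) into one side-by-side frame.

    Both images must have the same height; the result keeps that height
    and has the combined width.
    """
    if len(left) != len(right):
        raise ValueError("left and right images must have the same height")
    return [l_row + r_row for l_row, r_row in zip(left, right)]

# Tiny 2x2 "renders": L marks left-eye pixels, R marks right-eye pixels.
left = [["L", "L"], ["L", "L"]]
right = [["R", "R"], ["R", "R"]]
print(side_by_side(left, right))  # 2 rows x 4 columns
```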
- Sample Code: ShaderGraph Examples in visionOS 1.2 (GitHub: ynagatomo/SGMExamples)
- Documentation: ShaderGraph Nodes in RealityKit (GitHub: ynagatomo)
- Article: ShaderGraph in visionOS, Jan 6, 2024 (Medium: ynagatomo)
- [Nov 3, 2024] Added Ex01, "Interior Mapping Shader"
- [Nov 17, 2024] Added Ex04, Ex05, and Ex06, "Nebula with a 3D texture"
MIT License
since Nov 2024