# Synth VR

Mixed Reality interaction with Synth humanoids on Meta Quest. Physics-based hand tracking, room-scale environment setup, and passthrough rendering.
## Features

- Physics Hand Tracking: MuJoCo-bound hand bodies driven by OVR hand tracking. Push, grab, pull, and interact with the Synth using your real hands.
- Room Integration: MRUK-powered room setup with physics colliders for walls, floor, ceiling, and furniture anchors. The Synth interacts with your physical room layout.
- Smooth Locomotion: Controller-based walk and rotate with configurable speed.
- Passthrough Rendering: Occluder materials for room surfaces, plus a PTRL (Passthrough Receive Light) shader for realistic lighting on the Synth.
- Ambient Light Estimation: Passthrough camera-based lighting estimation for harmonized rendering in mixed reality.
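Conceptually, the physics-hand feature is a per-frame loop that copies tracked bone poses onto kinematic physics bodies, which then collide with the Synth and the room. The sketch below is illustrative only: the shipped `PlayerHandBodies` component binds MuJoCo bodies, while this simplified version uses plain Unity rigidbodies to show the same idea (`OVRSkeleton` is the Meta XR hand-skeleton type it reads from).

```csharp
using UnityEngine;

// Illustrative sketch only: drive kinematic rigidbodies from tracked hand bones.
// The real PlayerHandBodies component binds MuJoCo bodies instead.
public class HandPhysicsSketch : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton;    // Meta XR hand skeleton (left or right)
    [SerializeField] private Rigidbody[] boneBodies;  // one kinematic body per tracked bone

    private void FixedUpdate()
    {
        if (skeleton == null || !skeleton.IsDataValid) return;

        for (int i = 0; i < boneBodies.Length && i < skeleton.Bones.Count; i++)
        {
            // Copy each tracked bone pose onto its physics body; the physics
            // engine then resolves contacts against the Synth and the room.
            Transform bone = skeleton.Bones[i].Transform;
            boneBodies[i].MovePosition(bone.position);
            boneBodies[i].MoveRotation(bone.rotation);
        }
    }
}
```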
## Ecosystem

synth-vr is part of a three-package architecture for creating, training, and interacting with physics-simulated humanoids:

| Package | Role | Status |
|---|---|---|
| synth-core | Humanoid creation, MuJoCo physics, skill architecture | Required |
| synth-training | On-device reinforcement learning via TorchSharp SAC | Optional |
| synth-vr (this repo) | Mixed reality interaction on Meta Quest | This package |

synth-core provides the physics body and motor system. synth-vr adds Meta Quest hand tracking, MRUK room integration, and passthrough rendering so the Synth lives in your physical space. Optionally add synth-training to enable on-device reinforcement learning: the Synth trains live on Quest while you interact with it.
## Requirements

- Unity 6000.x or later
- synth-core package
- MuJoCo Unity plugin (`org.mujoco`) via the arghyasur1991/mujoco fork (`synth-patches` branch)
- Meta XR SDK packages (v85+):
  - `com.meta.xr.sdk.interaction.ovr`
  - `com.meta.xr.mrutilitykit`

### Optional

- synth-training: Adds on-device reinforcement learning so the Synth trains directly on Quest while you interact with it in your room. Without this package, synth-vr provides physics interaction only (no learning).
## Installation

Add to `Packages/manifest.json`:

```json
{
  "dependencies": {
    "com.genesis.synth.vr": "https://github.com/arghyasur1991/synth-vr.git",
    "com.genesis.synth": "https://github.com/arghyasur1991/synth-core.git",
    "org.mujoco": "https://github.com/arghyasur1991/mujoco.git?path=unity#synth-patches"
  }
}
```
Meta XR SDK packages are installed separately via the Unity Package Manager or Meta's setup tools.
## Scene Setup

Use the built-in wizard to configure a VR scene: **Synth > Setup > VR Scene Wizard**.
### Step 1: Add Meta Building Blocks

Open **Meta > Tools > Building Blocks** and add these to your scene:
| Block | Purpose |
|---|---|
| Camera Rig | OVRManager + OVRCameraRig with tracking space and hand/eye anchors |
| Hand Tracking (x2) | OVRHand + OVRSkeleton on left and right hand anchors |
| Passthrough | OVRPassthroughLayer for mixed reality passthrough |
| OVRInteractionComprehensive | Ray/grab/poke interactions (optional) |
### Step 2: Run the VR Scene Wizard

The wizard validates Building Blocks, then configures everything else automatically:

- Conflicting objects: disables stray MainCamera, Ground/Plane, Haptics, and controller visuals
- MuJoCo scene: creates `MjScene`, `Global Settings` (with `MjGlobalSettings`), and `QuestPerformanceManager`
- Synth-VR components: adds `SceneMeshManager` (room integration) and `PlayerHandBodies` (physics hands, auto-wired to OVR skeletons)
- Lighting: creates a directional light if missing
- URP settings: applies Quest-optimized render pipeline settings (Render Scale 1.0, MSAA 4x, HDR off, Shadow Distance 50, SRP Batcher on)
- Permissions: fixes OculusProjectConfig for hand tracking, passthrough, scene support, and anchor support

Click **Setup Everything** to run all steps at once, or use per-section **Fix** buttons for granular control. All changes support Undo.
### Step 3: Add Your Synth

Add a single Synth to the scene via synth-core. Note: only one active Synth per scene is supported.
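The one-Synth rule can be enforced defensively at startup. This is a sketch only: the README does not name synth-core's component type, so the generic parameter stands in for it (a concrete subclass such as `class SynthGuard : SingleInstanceGuard<Synth> {}` would bind the real type).

```csharp
using UnityEngine;

// Sketch: warn at startup if more than one active Synth exists, since
// synth-vr supports only a single active Synth per scene.
// TSynth is a placeholder for the actual synth-core component type.
public abstract class SingleInstanceGuard<TSynth> : MonoBehaviour where TSynth : Component
{
    private void Start()
    {
        var instances = FindObjectsByType<TSynth>(FindObjectsSortMode.None);
        if (instances.Length > 1)
        {
            Debug.LogWarning(
                $"{instances.Length} Synths found; only one active Synth per scene is supported.");
        }
    }
}
```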
## Required Permissions

| Feature | OculusProjectConfig | OpenXR Feature | Android Permission |
|---|---|---|---|
| Hand Tracking | `handTrackingSupport: 1` | Hand Tracking Subsystem | Auto-injected |
| Passthrough | `insightPassthroughEnabled: 1` | Meta Quest: Camera | `android.permission.CAMERA` (auto) |
| Room Mesh (MRUK) | `sceneSupport: 1` | Meta Quest: Meshing / Planes | `com.oculus.permission.USE_SCENE` (auto) |
| Spatial Anchors | `anchorSupport: 1` | Meta Quest: Anchors | `com.oculus.permission.USE_SCENE` (auto) |
The wizard auto-fixes OculusProjectConfig settings. OpenXR features must be enabled manually in Project Settings > XR Plug-in Management > OpenXR.
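For reference, the auto-injected permissions from the table correspond to `AndroidManifest.xml` entries like the following. You normally do not add these by hand; the Meta XR SDK injects them at build time (the hand-tracking permission name is our assumption, since the table only marks it "Auto-injected"):

```xml
<!-- Auto-injected by the Meta XR SDK at build time; shown for reference only -->
<uses-permission android:name="android.permission.CAMERA" />           <!-- Passthrough -->
<uses-permission android:name="com.oculus.permission.USE_SCENE" />     <!-- Room mesh, spatial anchors -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" /> <!-- Hand tracking (assumed name) -->
```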
## Package Structure

```
synth-vr/
├── Runtime/
│   ├── Hands/          PlayerHandBodies (MuJoCo hand tracking)
│   ├── Locomotion/     PlayerLocomotion (smooth walk/rotate)
│   ├── SceneSetup/     SceneMeshManager (MRUK room integration)
│   ├── Performance/    QuestPerformanceManager (CPU/GPU levels)
│   ├── Lighting/       AmbientLightEstimator
│   ├── Shaders/        PTRLWithDepth
│   └── Resources/      InvisibleOccluder, PTRLHighlightsAndShadows materials
└── Editor/
    ├── VRSceneSetupWizard.cs (interactive scene setup)
    └── XRSetupEditor.cs (XR plugin configuration)
```
## Components

| Component | Purpose | Where |
|---|---|---|
| `PlayerHandBodies` | Creates MuJoCo bodies from OVR hand tracking for physics interaction | OVRCameraRig |
| `SceneMeshManager` | Sets up room geometry, furniture anchors, and physics colliders from MRUK | SceneMesh object |
| `QuestPerformanceManager` | Requests CPU/GPU performance levels from Quest OS | Global Settings |
| `PlayerLocomotion` | Smooth walk and rotate via controller thumbstick | OVRCameraRig |
| `AmbientLightEstimator` | Estimates ambient lighting from passthrough camera for realistic rendering | Runtime (auto) |
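Smooth locomotion boils down to reading the thumbsticks and translating/rotating the camera rig. A minimal sketch of that idea, assuming hypothetical field names and speed defaults (the shipped component is `PlayerLocomotion`; `OVRInput` is the Meta XR input API):

```csharp
using UnityEngine;

// Sketch of controller-based smooth locomotion; PlayerLocomotion is the
// real component, and these field names and defaults are hypothetical.
public class LocomotionSketch : MonoBehaviour
{
    [SerializeField] private Transform cameraRig;    // OVRCameraRig root
    [SerializeField] private Transform head;         // center eye anchor
    [SerializeField] private float walkSpeed = 1.5f; // m/s, hypothetical default
    [SerializeField] private float turnSpeed = 60f;  // deg/s, hypothetical default

    private void Update()
    {
        // Left stick: walk relative to the direction the user is facing,
        // flattened onto the ground plane so looking down doesn't dive.
        Vector2 move = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;
        cameraRig.position += (forward * move.y + right * move.x) * walkSpeed * Time.deltaTime;

        // Right stick: smooth rotation around the user's head position.
        Vector2 turn = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
        cameraRig.RotateAround(head.position, Vector3.up, turn.x * turnSpeed * Time.deltaTime);
    }
}
```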
## Roadmap

- Voice interaction: Speech-to-intent pipeline for verbal commands and conversational interaction with the Synth
- Haptic feedback: Controller and hand haptics driven by MuJoCo contact forces
- Spatial audio: 3D audio anchored to the Synth for immersive presence
- Better light harmonization: Improved passthrough lighting estimation for more realistic Synth rendering in your room
- Better occlusion: Depth-accurate occlusion between the virtual Synth and real-world objects
- Synth sees your world: Feed a pre-scanned Gaussian splat of the room into the Synth's eye cameras so the agent perceives the real environment
## License

Apache-2.0. See LICENSE for details.