Welcome! These docs will get you up and running with the VR Interaction Framework. If you come across any issues, please don't hesitate to contact me for support, or come join the Discord community!
The VR Interaction Framework is a collection of scripts and prefabs to help you develop interactions in VR. It is intended to make it easier for developers to create their own interactable objects and be productive quickly.
There are multiple prefabs available to provide you with examples of common VR interactions. These range from simple blocks and balls to switches, levers, weapons and rocket arms. You are encouraged to experiment with these, inspect the scripts to see how they were made, and create your own.
As of version 1.5, VRIF no longer requires Oculus Integration, which makes installation straightforward. Just start a new project with Unity 2019.4 LTS, import the VRIF asset from the Asset Store, and you should be good to go. Some devices may need an extra step or two.
If you're importing VRIF into an existing project, or just want to know how to set VRIF up from scratch, below are some notes for specific platforms :
I recommend using 2019.4 LTS as it is stable and has the widest support. However, I also understand the desire to use the latest and greatest :) Maybe there is a new editor feature you can't live without, a performance update you'd like to test, or you just want to use that lovely new dark theme that's available!
There are a couple of considerations to keep in mind with 2020 :
This largely depends on which devices you need to support or are developing with, as well as which editor version you are using. If you are using 2019.4 LTS you can pretty easily switch between the two. However, in Unity 2020 you are locked into using XR Management. While XR Management is the path forward for Unity, the lack of built-in support for OpenVR could be a deal-breaker depending on your target devices.
There are a few recommended settings to use when targeting the Oculus Quest :
Change your Build Settings (File -> Build Settings) Target to "Android". Make sure Texture Compression stays at "ASTC".
For Legacy input : go to Edit -> Project Settings -> Player. Under "XR Settings" make sure "Virtual Reality Supported" is checked, and that the Oculus SDK has been added.
Also for Legacy input : Make sure you have the Oculus Desktop Package and the Oculus Android Package installed. This will allow you to build to the Quest and use Oculus Link.
For XR Management, you only need to have the Oculus plugin installed.
For the Oculus Quest, enable V2 Signing; Low Overhead Mode is optional.
Under Project Settings -> Player, expand "Other Settings". Make sure "Vulkan" is not enabled under Graphics APIs if you are using a Unity version < 2019.3, otherwise an error may be thrown.
Set the "Minimum API Level" to Android 4.4 'KitKat' (API Level 19).
Make sure the API compatibility level is set to .NET 4.x.
I recommend changing your Fixed Timestep (Edit -> Project Settings... -> Time) to 0.01388889 if you are targeting the Oculus Quest. This matches the HMD's 72hz refresh rate (1/72). For the Oculus Rift you can try 0.0111111 (1/90).
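If you'd rather apply this from code than from the Time settings panel, a small startup script can set the same value. This is a hypothetical helper, not a script included with VRIF:

```csharp
using UnityEngine;

// Hypothetical helper (not part of VRIF): applies the recommended
// fixed timestep at startup instead of editing Project Settings by hand.
public class FixedTimestepSetter : MonoBehaviour
{
    void Awake()
    {
        // 1/72 matches the Quest's 72hz refresh rate (~0.01388889)
        Time.fixedDeltaTime = 1f / 72f;
        // For the Oculus Rift (90hz), use 1f / 90f instead.
    }
}
```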
I also recommend disabling shadows (Project Settings -> Quality) and setting Pixel Light count to 1 if you are targeting the Oculus Quest.
If you want access to Quest handtracking make sure to download the Oculus Integration from the asset store, and import the package into your project. Once that is installed you can extract the integration package and test out the demo scene.
Unity may prompt you to update plugins or restart the editor. Say yes to these prompts.
Unity 2019.3 and above now offer XRInput which provides standardized input across a variety of devices. However, OpenVR is not currently supported in Unity XR Management. You will need to use the Legacy system to access OpenVR, or the SteamVR asset.
In order to support OpenVR devices such as the HTC Vive, you will need to install the OpenVR Desktop package (Window -> Package Manager) and add the SDK in your project settings (Edit -> Project Settings -> Player -> XR Settings) :
Unity can support the HTC Vive through either the OpenVR package or SteamVR.
WindowsWMR devices can be supported through either the OpenVR package or SteamVR.
The Valve Index is not currently officially supported. However, you should be able to get this device working by installing the SteamVR asset and using the provided bindings, as described below :
While it is not required to have any specific Tags or Layers for VRIF to work, it is recommended to separate the Grabbable and Player components into their own layers so their collisions can be ignored. This helps with interactions such as pulling open a drawer that would otherwise collide with the player's capsule.
VRIF's layers are set up as "Grabbable" and "Player" by default, like so :
You may then set the layers to have their physics ignored by modifying the Physics Collision Matrix (Edit -> Project Settings -> Physics) :
I also recommend having a Tag on the Player, and another Tag on the CharacterController if you need to quickly find the main player.
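As a sketch of why a Tag on the Player is handy, assuming you've assigned a tag named "Player" to the player root (the tag name is your choice, not fixed by VRIF):

```csharp
using UnityEngine;

// Sketch: quickly locate the main player at runtime via its Tag.
// The "Player" tag name is an assumption - use whatever tag you assigned.
public class PlayerFinder : MonoBehaviour
{
    void Start()
    {
        GameObject player = GameObject.FindGameObjectWithTag("Player");
        if (player != null)
        {
            Debug.Log("Found player at " + player.transform.position);
        }
    }
}
```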
The demo scene is meant to provide a unified place to test out how different objects interact with each other, while keeping an eye on general performance. You can grab objects, interact with switches, levers, and buttons, and even do some climbing and combat.
Check out the script /Scripts/Scenes/Demo/DemoScript.cs for some additional code that is used in the demo scene.
The demo scene will be updated regularly with new features, so I wouldn't recommend saving your modifications to that scene. Instead, copy the prefabs over to your own scene, or just rename the scene.
The following components make up the core of this framework. The two main components are Grabber and Grabbable. The Grabber is in charge of picking up Grabbable objects that reside within its Trigger. Grabbables designate items as grabbable by these Grabbers and allow you to tweak parameters such as grab offsets, how to handle physics, and things of that nature.
The Grabber is an object that contains a Trigger Collider and is in charge of picking things up.
The Grabbable component lets Grabbers know an object can be picked up. There are many settings to help you tweak its functionality to your liking.
Grab Physics Allows you to specify how this object will be held by the Grabber
Physics Joint A ConfigurableJoint will be connected from the Grabber to the Grabbable. This allows held objects to still collide with the environment and not move through walls or other objects. The joint's rigidity will be tweaked depending on what it is colliding with, in order to make sure it aligns properly with the hands during interaction and movement.
Kinematic The Grabbable will be moved to the Grabber and its Rigidbody will be set to Kinematic. The Grabbable will not respond to collisions from other objects and can pass through walls. The object will remain firmly in place, and this is a reliable way of picking up objects if you don't need physics support.
None No grab mechanism will be applied. Climbable objects, for example, are not attached to the Grabber; they remain in place when grabbed. No Rigidbody is necessary in this case.
Grab Mechanic Specify how the object is held in the hand / Grabber
Precise The Grabbable can be picked up anywhere
Snap The Grabbable will snap to the position of the Grabber, offset by any Grab Points that have been specified.
You can extend the GrabbableEvents class in order to respond to all sorts of events that happen to a Grabbable. This is how many of the included prefabs are built: by either responding to Grabbable Events or by extending the Grabbable class to customize behaviour.
Check out /Scripts/Components/GrabbableHaptics.cs to see how easy it is to add haptics to an object when it becomes a valid pickup.
Check out /Scripts/Extras/Flashlight.cs for a simple example of how to turn a light on and off. Hello World!
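Extending GrabbableEvents might look like the minimal sketch below. The OnGrab / OnRelease override names follow the pattern used by scripts like GrabbableHaptics.cs, but verify them against the GrabbableEvents class in your installed version:

```csharp
using UnityEngine;
using BNG; // VRIF's namespace (assumed)

// Minimal sketch of responding to Grabbable Events by extending
// GrabbableEvents. Override names are assumptions based on the
// included prefab scripts - check GrabbableEvents.cs for the full list.
public class GrabLogger : GrabbableEvents
{
    public override void OnGrab(Grabber grabber)
    {
        base.OnGrab(grabber);
        Debug.Log("Grabbed by " + grabber.name);
    }

    public override void OnRelease()
    {
        base.OnRelease();
        Debug.Log("Released");
    }
}
```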
Another way to respond to these events is to add the GrabbableUnityEvents component to a Grabbable object. Then you can drag in your function to any event you wish to respond to :
Climbables are modified Grabbable objects that keep track of a position for the Character Controller to offset from. See the included custom CharacterController.cs to see how climbing works.
Climbing is accomplished by checking where the controller is this frame, and then offsetting the character position by that amount.
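Conceptually, the climbing offset described above could be sketched like this. All names are illustrative; see the included CharacterController.cs for the real implementation:

```csharp
using UnityEngine;

// Conceptual sketch of climbing: move the character opposite to the
// gripping hand's frame-to-frame movement. Not VRIF's actual code.
public class ClimbOffsetSketch : MonoBehaviour
{
    public CharacterController characterController;
    public Transform climbingHand; // the controller currently gripping a hold

    Vector3 previousHandPosition;

    void LateUpdate()
    {
        // Offset the character by the inverse of the hand's movement this frame
        Vector3 delta = climbingHand.position - previousHandPosition;
        characterController.Move(-delta);
        previousHandPosition = climbingHand.position;
    }
}
```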
Multiple Climbing objects can be held at once (one in each hand). You can set a "BreakDistance" if you want to prevent the player's hands from getting too far away from a hold.
The Input Bridge serves as the primary class for checking controller input such as position, velocity, button state, etc.
It is recommended to use this instead of something like OVRInput because this class can be more easily updated and can account for other Input SDKs in the future.
Additional information on the included prefabs and scripts.
Buttons, switches, and levers are generally controlled by using physics joints, such as a Fixed Joint and Configurable Joint.
For example, a lever consists of a Grabbable part that is attached to a base via a ConfigurableJoint. That base could also be a Grabbable object. Whenever the player grabs the lever, a joint is attached, but is still constrained to the base.
Sometimes the Physics Engine can become unstable under certain conditions, so a helper script, /Scripts/Helpers/JointHelper.cs, is available to help constrain objects to where they should be.
Hand models are independent from the Grabbers and can be easily swapped out in the editor or at runtime.
The demo scene includes an example of how to change out hands by clicking in the left stick.
See /Scripts/Helpers/HandControllers.cs for an example script you can use to animate a hand model based on input and Grabbable / Grabber properties.
Whenever a Grabbable is held, its HandPose ID will be sent to the Hand Model's Animator. You can use this to animate the state of the hands while an object is grabbed.
Custom Hand Poses can be assigned to each Grabbable. Hand poses work by setting a Pose ID parameter on the hand Animator whenever a Grabbable is held. These Hand Pose IDs are defined in HandPoseDefinitions.cs
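In code, pushing a pose ID to the hand Animator could look like the sketch below. The parameter name "HandPoseId" is illustrative; check HandPoseDefinitions.cs and your Animator Controller for the exact name VRIF uses:

```csharp
using UnityEngine;

// Sketch: send a hand pose ID to the hand model's Animator.
// "HandPoseId" is an assumed parameter name for illustration only.
public class HandPoseSketch : MonoBehaviour
{
    public Animator handAnimator;

    public void SetPose(int handPoseId)
    {
        handAnimator.SetInteger("HandPoseId", handPoseId);
    }
}
```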
To learn how to create custom hand poses, check out the second half of this excellent video by Valem :
Custom Hands start right around 7m:35s
The demo scene has a couple of examples of using arms, body, and head IK. These examples use the standard Unity IK system, but with a bit of trickery to get the hands and elbows to position correctly.
If you just use Unity's IK system, the character's hands won't always be able to reach where the controllers are, and finger IK isn't always rigged. To get around this, you can use a hand model as your controller. Then have your wrist model point at your character's elbow joint, and an attached upper arm look at the shoulder joint. This way the hands always match the controller, and the arms and elbows have targets to mimic.
Grabbable objects have a BreakDistance property you can set that will force a Grabber to drop an object if it gets too far away. This can be useful with arm IK, as you can have the player drop whatever they are holding if the arm length would be too great. For example, if a player was holding onto an axe stuck in a tree and walked backwards, you could force them to drop the axe once they move too far away, preventing the arms from stretching unrealistically.
Take a look at /Scripts/Components/CharacterIK to see how hands and head IK are positioned / rotated. You can hide different parts of the body (such as arms or legs) by scaling their joints down to 0.
Body IK can be as simple as rotating (or Lerping) the body to match the HMD's rotation, offset with the character's rotation.
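A minimal sketch of that body rotation idea, with illustrative references (this is not VRIF's CharacterIK script):

```csharp
using UnityEngine;

// Sketch: smoothly rotate the body's yaw toward the HMD's yaw so the
// torso follows the player's head. References are illustrative.
public class BodyFollowHMD : MonoBehaviour
{
    public Transform hmd;   // typically the camera / CenterEyeAnchor
    public Transform body;  // the character's body root
    public float turnSpeed = 5f;

    void Update()
    {
        // Only match rotation around the Y axis so the body stays upright
        Quaternion target = Quaternion.Euler(0, hmd.eulerAngles.y, 0);
        body.rotation = Quaternion.Slerp(body.rotation, target, Time.deltaTime * turnSpeed);
    }
}
```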
There is not yet an example for feet IK, but this would involve setting the feet height to the player's lower capsule position.
Check out Final IK as an option for Full Body IK. Keep in mind IK can be a computationally expensive feature.
Grab Points allow you to specify multiple Transforms to be used as a grip when holding an object. For example, you may want to grip a knife by different positions or angles on the handle, depending on where the user's hand is when it is grabbed. You can even specify a different hand pose depending on where the object is gripped.
Grab Points are Transforms that can be assigned to a Grabbable's "Grab Points" property. You can add as many Grab Points here as you need, and the closest one to the grabber will be used when the object is grabbed.
For finer control, you can add the "GrabPoint" script to the Transform. Here you can specify a different HandPose to use when this Grab Point is being gripped.
You can also specify a rotation constraint, specified as "Max Degrees Difference Allowed". If the angle between the hand and the grabpoint is greater than this value, then this grab point will not be considered valid.
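The rotation-constraint test amounts to an angle comparison, roughly like this sketch (names are illustrative, not VRIF's exact API):

```csharp
using UnityEngine;

// Sketch of the "Max Degrees Difference Allowed" check: a grab point is
// only valid if the hand's rotation is close enough to the grab point's.
public static class GrabPointValidator
{
    public static bool IsValid(Transform hand, Transform grabPoint, float maxDegreesDifferenceAllowed)
    {
        float angle = Quaternion.Angle(hand.rotation, grabPoint.rotation);
        return angle <= maxDegreesDifferenceAllowed;
    }
}
```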
The knife prefab includes multiple grab points and is a good example to inspect. You can grab the knife with the blade facing up or down, depending on which way the hand grabbed it, by using rotation constraints.
Snap Zones are triggers that can "grab" objects if they were recently released from a Grabber. You can use these to create inventory systems, attachment systems, or just snap objects together.
When an object is inside a SnapZone, its colliders are disabled. This prevents the object's physics from interacting with the world and causing erratic behaviour. In order to grab the item back out of the SnapZone, the snap zone's trigger responds to the grab event.
Objects are positioned at 0,0,0 local position and rotation by default. You can add the "SnapZoneOffset" component to a Grabbable if you wish to specify a custom offset.
In addition to the "SnapZoneOffset" component, you can also add a "SnapZoneScale" component, to modify the scale of the Grabbable when inside the snap zone.
If you are using a player inventory / toolbelt type setup, you can attach the "ReturnToSnapZone" component to a Grabbable, and specify which SnapZone to return to when not being held. You can also specify a return speed and delay.
For example, you could have a toolbelt with knives that will automatically return back to the toolbelt after thrown - just set the ReturnDelay to something like 1-2 seconds.
Slow motion is a fun effect in VR, and can also be a helpful way to troubleshoot physics and gameplay bugs.
In the demo scene you can slow time by pressing the "Y" button on the Left Oculus Touch Controller. Try shooting weapons, throwing objects, and observing sounds while time is slowed.
See /Scripts/Extras/TimeController.cs for an example on how to slow down time and apply a sound effect.
Whenever you are playing a sound, be sure to multiply your sound pitch by Time.timeScale. This way your sounds will be "slowed" down by decreasing pitch, relative to how you've scaled Time.timeScale.
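For example, this sketch (not code from TimeController.cs) scales an AudioSource's pitch by the current time scale so sounds slow down along with the rest of the game:

```csharp
using UnityEngine;

// Sketch: play a clip with pitch scaled by the current time scale,
// per the rule above. Illustrative only.
public class ScaledAudioSketch : MonoBehaviour
{
    public AudioSource audioSource;
    public float basePitch = 1f;

    public void PlayScaled(AudioClip clip)
    {
        audioSource.pitch = basePitch * Time.timeScale;
        audioSource.PlayOneShot(clip);
    }
}
```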
When adding forces to rigidbodies, use ForceMode.VelocityChange. This will properly scale based on Time.fixedDeltaTime. Otherwise physics may not work as expected.
There is a very simple damage system included that can be easily extended to work with your own custom setup.
A "Damageable" component has a health value. Damage can be dealt to it by calling its DealDamage(float damageAmount) method. Once this value is <= 0, the "DestroyThis" method will be called.
The "DamageCollider" component can be added to colliders. Whenever this collider collides with an object that has a Damageable component, the specified amount of damage will be dealt.
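A condensed sketch of that flow is below; the real Damageable and DamageCollider components include more options than shown here:

```csharp
using UnityEngine;

// Condensed sketch of the damage flow: a health value, a DealDamage
// method, and a DestroyThis call once health is depleted.
public class SimpleDamageableSketch : MonoBehaviour
{
    public float Health = 100f;

    public virtual void DealDamage(float damageAmount)
    {
        Health -= damageAmount;
        if (Health <= 0)
        {
            DestroyThis();
        }
    }

    public virtual void DestroyThis()
    {
        Destroy(gameObject);
    }
}
```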
If you want to integrate with another damage system, just override the "DealDamage" method and pass along the damage value. Here is an example of how to integrate with EmeraldAI's damage system :
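A hedged sketch of what that override could look like. This assumes VRIF's Damageable declares DealDamage as virtual, and the Emerald AI call shown in the comment is an assumption; check the Emerald AI documentation for the exact method signature in your version:

```csharp
using UnityEngine;
using BNG; // VRIF's namespace (assumed)

// Sketch: forward VRIF damage into another damage system by overriding
// DealDamage. The Emerald AI call below is commented out because its
// exact signature varies by version - treat it as an assumption.
public class EmeraldAIDamageableSketch : Damageable
{
    // public EmeraldAI.EmeraldAISystem emeraldComponent;

    public override void DealDamage(float damageAmount)
    {
        // Pass the value along to Emerald AI, e.g. :
        // emeraldComponent.Damage((int)damageAmount, EmeraldAI.EmeraldAISystem.TargetType.Player);

        base.DealDamage(damageAmount);
    }
}
```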
VRIF contains multiple integrations with other Unity assets :
SteamVR SDK Support is currently experimental. If you require Steam device support (such as Valve Index or HTC Cosmos) you may use the OpenVR plugin along with XR Plugin Management, or the SteamVR SDK.
If you want to give SteamVR a go, this integration package can get you started. It includes SteamVR bindings you can use to map SteamVR actions to raw inputs that the InputBridge can read.
Installation instructions found here.
Note : This essentially maps SteamVR Actions such as "Grip", "Trigger", etc. so that the InputBridge can convert it to be used as raw input. It's not how SteamVR's input system is intended to be used, but is currently the only way to get input from certain devices until Unity gets full OpenVR or OpenXR support.
Check out InputBridge.cs to see how Steam Actions are bound to inputs
The Oculus Integration shows how to integrate Quest Hand Tracking. There is a demo scene included that shows how to draw using hand tracking, as well as a sample scene with a player that uses Oculus OVR components, such as OVRManager.
First install the Oculus Integration asset. Then extract the integration package found at "/BNG Framework/Integrations/Oculus Integration". Finally, enable the integration by navigating to Window -> VRIF Settings and enabling the Oculus Integration checkbox.
After extracting the integration package you can check out the included demo scene for the Oculus player or Quest Hand Tracking.
Final IK allows you to set up a character with full body IK. A full-body character in VR can be tricky, but this integration should get you up and running with a pre-built FinalIK character quickly.
Installation Instructions :
After installation, you can run the demo scene that was just extracted into this directory (VRIK.unity). This scene contains a mirror and a few simple items so you can test out how IK works.
Two-handed weapons are not currently supported due to how VRIK positions its hands. Please let me know if you would like to see that feature in the future, and what other features you would like to see in this integration : beardedninjagames@gmail.com
VRIK Documentation from RootMotion can be found here : http://www.root-motion.com/finalikdox/html/page16.html
Please follow the Youtube video below for step-by-step instructions on how to set this up :
In order to damage Emerald AI objects, you need to pass along the damage amount from the VR Framework to the Emerald AI system.
To do this :
That's it! If you are having issues with Emerald AI detecting the player or receiving damage, make sure you double check the tags and layers are correct and that the EmeraldAIDamageable component has been added to the enemy.
EmeraldAI documentation can be found here : https://docs.google.com/document/d/1_zXR1gg61soAX_bZscs6HC-7as2njM7Jx9pYlqgbtM8/
Please follow the Youtube video below for step-by-step instructions on how to set this up :
VRIF contains an integration package for PUN that will demonstrate how to connect to a server, sync the head and hand positions, and sync hand animation states.
Installation instructions :
A collection of useful information :
Upgrading to new versions of any Unity Asset can feel like a daunting task. Below are a few tips to ensure you don't lose work and make upgrading as painless as possible.
First and foremost : Back up your work. The best way to do this is to use source control, such as the free and excellent GitHub.
If things don't work quite right, you can always revert back to your previous commit. This is also a good way to see what has changed in the script files since the last version.
Keep your own Player Prefab. The Player prefab that comes with the framework may change from time to time as features are introduced. If you make modifications to the included player prefabs, make sure you save your own prefab copy. This way your settings won't get overridden, and you can inspect the included player prefabs for any changes.
You don't need to import the project settings every time. These are included mostly for first time installs. If your project is already setup properly, you can skip this step by deselecting the ProjectSettings files at the bottom of the asset import.
Getting poor performance or strange visuals when running in the editor.
Can't shoot the bow properly and two-handed weapons behave poorly when close to the face.
Can't grab an item or climb an object
Can't interact with a UI Canvas