VR Interaction Framework

Welcome! These docs will get you up and running with the VR Interaction Framework. If you come across any issues, please don't hesitate to contact me for support, or come join the Discord community!


The VR Interaction Framework is a collection of scripts and prefabs to help you develop interactions in VR. It makes it easy to create your own interactable objects and be productive quickly.

There are multiple prefabs available to provide you with examples of common VR interactions. These range from simple blocks and balls to switches, levers, weapons and rocket arms. You are encouraged to experiment with these, inspect the scripts to see how they were made, and create your own.


  1. Start by downloading the Oculus SDK from the asset store, and import the package into your project. Grab a beverage, as this step can take a few minutes.

    VR Interaction Framework Oculus Integration Kit

    Unity may prompt you to update plugins and restart the editor - say yes to these prompts.

    If you are building for the Rift, you can skip all the way down to Step 5! Quest users continue below to get set up for Android development.

  2. Change your Build Settings (File -> Build Settings) Target to "Android". Make sure Texture Compression stays at "ASTC".

    VR Interaction Unity3d Build Settings
  3. Go to Edit -> Project Settings -> Player. Under "XR Settings" make sure "Virtual Reality Supported" is checked, and that the Oculus SDK has been added.

    *note : You may use Unity's new XR Plugin Management system instead of the legacy system if you prefer. I currently recommend the legacy system for compatibility / stability reasons.

    Oculus Quest should enable V2 Signing; Low Overhead Mode is optional.

    VR Interaction Unity3d Player Settings
  4. Still under Project Settings -> Player, expand the "Other Settings". Make sure "Vulkan" is not enabled under Graphics APIs if you are using a Unity version < 2019.3, otherwise an error may be thrown.

    Set the "Minimum API Level" to Android 4.4 'KitKat' (API Level 19).

    Make sure the API Compatibility Level is set to .NET 4.x.

    VR Interaction Unity3d Player Other Settings
  5. I recommend changing your Fixed Timestep (Edit -> Project Settings... -> Time) to 0.01388889 if you are targeting the Oculus Quest. This matches the HMD's framerate (1/72). For the Oculus Rift you can try 0.0111111 (1/90).

    VR Interaction Framework Physics.TimeStep Unity3d
  6. I also recommend disabling shadows (Project Settings -> Quality) and setting Pixel Light count to 1 if you are targeting the Oculus Quest.

  7. Go ahead and import the Interaction Framework package if you haven't already.

  8. That's it! You're ready to run the demo in /Scenes/Demo. Make sure that scene is added to the build if you are running it on a device.
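The Fixed Timestep from Step 5 is just 1 divided by the headset's refresh rate. If you would rather set it from code (for example, in a build that targets multiple headsets), here is a minimal sketch. The TimestepUtil helper is my own illustration, not part of the framework:

```csharp
// Compute a Fixed Timestep that matches a headset's refresh rate.
// TimestepUtil is a hypothetical helper, not included with VRIF.
public static class TimestepUtil
{
    public static float TimestepForRefreshRate(float refreshRateHz)
    {
        return 1f / refreshRateHz; // e.g. 72 Hz -> 0.01388889
    }
}

// In a MonoBehaviour's Awake() you might then apply it :
// Time.fixedDeltaTime = TimestepUtil.TimestepForRefreshRate(72f); // Quest
// Time.fixedDeltaTime = TimestepUtil.TimestepForRefreshRate(90f); // Rift
```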

Building for OpenVR and using XRInput

Unity 2019.3 and above now offer XRInput which provides standardized input across a variety of devices. However, OpenVR is not currently supported in Unity XR.

In order to support OpenVR devices such as the HTC Vive, you will need to install the OpenVR Desktop package (Window -> Package Manager) and add the SDK in your project settings (Edit -> Project Settings -> Player -> XR Settings) :

VR Interaction Framework Unity3d XR Settings

Tags and Layers

While it is not required to have any specific Tags or Layers for VRIF to work, it is recommended to separate the Grabbable and Player components into their own layers so their collisions can be ignored. This helps with certain interactions, such as pulling a drawer open without it colliding with the player's capsule.

VRIF's layers are set up as "Grabbable" and "Player" by default like so :

VR Interaction Framework Unity3d Layer Settings

You may then set the layers to have their physics ignored by modifying the Physics Collision Matrix (Edit -> Project Settings -> Physics) :

VR Interaction Framework Unity3d Tags and Layers Physics Matrix
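The same collision matrix rule can also be applied from code at startup. A sketch, assuming the two layers exist and are named exactly "Player" and "Grabbable" as shown above:

```csharp
using UnityEngine;

// Ignore collisions between the Player and Grabbable layers at runtime,
// equivalent to unchecking the pair in the Physics Collision Matrix.
public class IgnorePlayerGrabbableCollisions : MonoBehaviour
{
    void Awake()
    {
        int player = LayerMask.NameToLayer("Player");
        int grabbable = LayerMask.NameToLayer("Grabbable");

        // NameToLayer returns -1 if the layer hasn't been created yet
        if (player >= 0 && grabbable >= 0)
        {
            Physics.IgnoreLayerCollision(player, grabbable, true);
        }
    }
}
```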

I also recommend having a Tag on the Player, and another Tag on the CharacterController if you need to quickly find the main player.

Demo Scene /Scenes/Demo

The demo scene is meant to provide a unified place to test out how different objects interact with each other, while keeping an eye on general performance. You can grab objects, interact with switches, levers, and buttons, and even do some climbing and combat.

VR Interaction Framework Unity3d Demo Scene

Check out the script /Scripts/Scenes/Demo/DemoScript.cs for some additional code that is used in the demo scene.

The demo scene will be updated regularly with new features, so I wouldn't recommend saving your modifications to that scene. Instead, copy the prefabs over to your own scene, or just rename the scene.

Core /Scripts/Core/

The following components make up the core of this framework. The two main components are Grabber and Grabbable. The Grabber is in charge of picking up Grabbable objects that reside within its Trigger. Grabbables designate items as grabbable by these Grabbers and allow you to tweak parameters such as grab offsets, how to handle physics, and things of that nature.

Grabber /Scripts/Core/Grabber.cs

The Grabber is an object containing a Trigger Collider that is in charge of picking things up.

VR Interaction Framework Grabber Unity3d
  • Hand Side (Left, Right, None) Set to Left or Right if you are parenting this to a Controller. Set to None if this is not used on a Controller.
  • Grip Amount (0-1) How much Grip is required to be considered a grab. Ex : 0.9 = Grip is held down at least 90% of the way
  • Release Grip Amount (0-1) How little Grip is required to be considered letting go of a grab. Ex : 0.1 = the grab is released once Grip falls below 10%. This value should be lower than Grip Amount, which provides a buffer zone between gripping and releasing.
  • Held Grabbable (Grabbable) The Grabbable that is currently being held. Null if nothing is being held.
  • Hands Graphics (Transform) The parent object that holds any number of Graphics used to represent hands. This transform should be for the graphics / animations only. If a Grabbable's property 'ParentHandModel' is true, this transform will be parented to the Grabbable. On release the transform will return back to the Grabber's center.
  • Force Grab Force the Grabbing of this Grabber. Useful for debugging within the Editor to simulate holding down a grab button.
  • Force Release Force the Release of this Grabber. Useful for debugging within the Editor to simulate releasing a grab button.
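The two grip thresholds above form a simple hysteresis band : grip must rise above one value to grab, and fall below a lower value to release, so small flutters in the analog grip axis don't drop the object. A minimal sketch of that logic in plain C# (the GripState class is my own illustration, not a framework class):

```csharp
// Hysteresis between Grip Amount (grab) and Release Grip Amount (release).
public class GripState
{
    public float GripAmount = 0.9f;        // grab when grip >= 90%
    public float ReleaseGripAmount = 0.1f; // release when grip <= 10%

    public bool HoldingItem { get; private set; }

    // Call once per frame with the current grip axis value (0-1)
    public void Update(float gripValue)
    {
        if (!HoldingItem && gripValue >= GripAmount)
        {
            HoldingItem = true;
        }
        else if (HoldingItem && gripValue <= ReleaseGripAmount)
        {
            HoldingItem = false;
        }
    }
}
```

Because the release threshold is lower than the grab threshold, a grip value hovering around 50% never toggles the held state back and forth.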

Grabbable /Scripts/Core/Grabbable.cs

The Grabbable component lets Grabbers know an object can be picked up. There are many settings to help you tweak its functionality to your liking.

VR Interaction Framework Grabbable Unity3d
  • Remote Grabbable If true the object will be eligible to be picked up from far away. Remote Grabbables are found by being within a RemoteGrabber Trigger.
  • Remote Grab Distance If "Remote Grabbable" is true, then the object can be remote grabbed at a maximum of this distance.
  • Remote Grabbing True if the object is currently being moved towards a Grabber
  • Grab Button This property allows you to specify which button needs to be pressed to pick up the object. Typically this would be Grip, but sometimes you may want to use Trigger (like an arrow, for example).
  • Grab Physics Allows you to specify how this object will be held in the Grabbers

    1. Physics Joint A ConfigurableJoint will be connected from the Grabber to the Grabbable. This allows held objects to still collide with the environment and not move through walls / other objects. The joint's rigidity will be tweaked depending on what it is colliding with, in order to make sure it aligns properly with the hands during interaction and movement.

    2. Kinematic The Grabbable will be moved to the Grabber and its Rigidbody will be set to Kinematic. The Grabbable will not allow collision from other objects and can go through walls. The object will remain firmly in place and is a reliable way of picking up objects if you don't need physical support.

    3. None No grab mechanism will be applied. Climbable objects, for example, are not attached to the user; they remain in place when grabbed. No Rigidbody is necessary in this case.

  • Grab Mechanic Specify how the object is held in the hand / Grabber

    1. Precise The Grabbable can be picked up anywhere

    2. Snap The Grabbable will snap to the position of the Grabber, offset by "Grab Position Offset" and "Grab Rotation Offset".

  • Grab Speed How fast the Grabbable will Lerp to the Grabber when it is being grabbed.
  • Throw Force Multiplier Angular The Grabbable's Angular Velocity will be multiplied by this when dropped / thrown.
  • Throw Force Multiplier The Grabbable's Velocity will be multiplied by this when dropped / thrown.
  • Hide Hand Graphics If true, the Grabber's hand graphics will be hidden while holding this object.
  • Parent to Hands If true, the object will be parented to the hand / Grabber object. If false, the parent will remain null / untouched. You typically want to parent the object to the hand / Grabber if you want it to move smoothly with the character.
  • Parent Hand Model If true, the Grabber's Hand Model will be parented to the Grabbable. This means the Grabber and its Hand Graphics will be independent. Enable this option if you always want the hands to match with the grabbable, even if the hands don't align with the controller. See the demo scene weapon for examples.
  • Other Grabbable Must be Grabbed If this is not null, then the specified object must be held in order for this Grabbable to be valid. A weapon clip / magazine inside of a gun is a good example. You may only want the magazine to be grabbable if the pistol is being held.
  • Collision Spring The amount of Spring force to apply to the Configurable Joint during collisions.
  • Collision Slerp The amount of Slerp to apply to the Configurable Joint during collisions.
  • Collisions A list of objects that are currently colliding with this Grabbable. Useful for debugging.
  • Grab Position Offset A local offset to apply to the Grabbable if "Snap" Grab Mechanic is selected.
  • Grab Rotation Offset Local euler angles to apply to the Grabbable if the "Snap" Grab Mechanic is selected.
  • Mirror Offset for Other Hand If true, the "other" hand (typically the Left Controller) will have its X position mirrored. Example : a 1, 1, 0 offset would become -1, 1, 0

Grabbable Events /Scripts/Core/GrabbableEvents.cs

You can extend the GrabbableEvents class in order to respond to all sorts of events that happen to a Grabbable. This is how many of the included prefabs are built, by either responding to Grabbable Events or by extending the Grabbable class to customize behaviour.

Check out /Scripts/Components/GrabbableHaptics.cs to see how easy it is to add haptics to an object when it becomes a valid pickup.

Check out /Scripts/Extras/Flashlight.cs for a simple example of how to turn a light on and off. Hello World!

VR Interaction Framework Flashlight Script Unity3d
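For reference, a hedged sketch of what such a subclass can look like. The override names (OnGrab, OnRelease) are assumptions on my part here — check /Scripts/Core/GrabbableEvents.cs in your version for the actual virtual method signatures:

```csharp
using UnityEngine;

// Toggle a Light on grab / release, in the spirit of the included
// Flashlight.cs. Assumes GrabbableEvents exposes virtual OnGrab /
// OnRelease methods - verify against GrabbableEvents.cs in your project.
public class GrabLight : GrabbableEvents
{
    public Light LightSource;

    public override void OnGrab(Grabber grabber)
    {
        base.OnGrab(grabber);
        LightSource.enabled = true;
    }

    public override void OnRelease()
    {
        base.OnRelease();
        LightSource.enabled = false;
    }
}
```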

Another way to respond to these events is to add the GrabbableUnityEvents component to a Grabbable object. Then you can drag in your function to any event you wish to respond to :

VR Interaction Framework Unity3d Grabbable Events

Climbable /Scripts/Core/Climbable.cs

Climbables are modified Grabbable objects that keep track of a position for the Character Controller to offset from. See the included custom CharacterController.cs to see how climbing works.

Climbing is accomplished by checking how far the controller has moved this frame, and then offsetting the character position by that amount.

Multiple Climbing objects can be held at once (one in each hand). You can set a "BreakDistance" if you want to prevent the player's hands from getting too far away from a hold.
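The offset math is a per-frame delta applied in reverse : if the hand moves down, the body moves up, because the grabbed hand is effectively pinned to the hold. A pure sketch of that idea using System.Numerics (the real implementation lives in the included CharacterController.cs; ClimbMath is my own name):

```csharp
using System.Numerics;

public static class ClimbMath
{
    // While climbing, apply the controller's movement to the player
    // in the opposite direction, so the hand stays pinned to the hold.
    public static Vector3 ApplyClimbOffset(Vector3 playerPosition,
        Vector3 previousHandPosition, Vector3 currentHandPosition)
    {
        Vector3 handDelta = currentHandPosition - previousHandPosition;
        return playerPosition - handDelta;
    }
}
```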

Input Bridge /Scripts/Core/InputBridge.cs

The Input Bridge serves as the primary class to go to for checking controller input such as position, velocity, button state, etc.

VR Interaction Framework Unity3d Input Bridge

It is recommended to use this instead of something like OVRInput, because this class can be more easily updated and can account for other Input SDKs in the future.


Additional information on the included prefabs and scripts.

Buttons, Switches, and Levers

Buttons, switches, and levers are generally controlled by using physics joints, such as a Fixed Joint and Configurable Joint.

For example, a lever consists of a Grabbable part that is attached to a base via a ConfigurableJoint. That base could also be a Grabbable object. Whenever the player grabs the lever, a joint is attached, but is still constrained to the base.

Sometimes the Physics Engine can become unstable if certain conditions are met, so a helper script '/Scripts/Helpers/JointHelper.cs' is available that will help constrain objects to where they should be.

VR Hands

Hand models are independent from the Grabbers and can be easily swapped out in the editor or at runtime.

The demo scene includes an example of how to change out hands by clicking in the left stick.

See /Scripts/Helpers/HandControllers.cs for an example script you can use to animate a hand model based on input and Grabbable / Grabber properties.

Whenever a Grabbable is held, its HandPose ID will be sent to the Hand Model's Animator. You can use this to animate the state of the hands while an object is grabbed.

VR Interaction Framework Unity3d Hand Bones

Arms, Head, and Body IK

The demo scene has a couple of examples of using arms, body, and head IK. These examples use the standard Unity IK system, but with a bit of trickery to get the hands and elbows to position correctly.

If you just use Unity's IK system, the character's hands won't always be able to reach where the controllers are, and finger IK isn't always rigged. To get around this, you can use a hand model as your controller. Then have your wrist model point at your character's elbow joint, and an attached upper arm look at the shoulder joint. This way the hands always match the controller, and the arms and elbows have targets to mimic.

Grabbable objects have a BreakDistance property you can set that can force a Grabber to drop an object if it gets too far away. This can be useful with arm IK, as you can have the player drop whatever they are holding if the arm length would be too far. For example, if a player was holding onto an axe stuck in a tree and walked backwards, you could force them to drop the axe once they go too far, preventing the arms from being crazy long.

Take a look at /Scripts/Components/CharacterIK to see how hands and head IK are positioned / rotated. You can hide different parts of the body (such as arms or legs) by scaling their joints down to 0.

VR Interaction Framework Unity3d CharacterIK

Body IK can be as simple as rotating (or Lerping) the body to match the HMD's rotation, offset by the character's rotation.
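That rotation step can be sketched in plain C#. The LerpAngle function below mirrors Unity's Mathf.LerpAngle (it takes the shortest path around the 360° wrap); in a MonoBehaviour you would feed it the HMD's yaw each frame:

```csharp
public static class BodyYaw
{
    // Lerp between two angles in degrees along the shortest arc,
    // equivalent to Unity's Mathf.LerpAngle.
    public static float LerpAngle(float from, float to, float t)
    {
        float delta = ((to - from) % 360f + 360f) % 360f; // 0..360
        if (delta > 180f) delta -= 360f;                  // -180..180
        return ((from + delta * t) % 360f + 360f) % 360f;
    }
}

// Example usage inside a MonoBehaviour's LateUpdate() :
// float bodyYaw = BodyYaw.LerpAngle(transform.eulerAngles.y,
//     hmd.eulerAngles.y, Time.deltaTime * turnSpeed);
// transform.rotation = Quaternion.Euler(0, bodyYaw, 0);
```

The wrap handling matters : lerping naively from 350° to 10° would swing the body the long way around through 180°.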

There is not yet an example for feet IK, but this would involve setting the feet height to the player's lower capsule position.

Check out Final IK as an option for Full Body IK. Keep in mind IK can be a computationally expensive feature.

Grab Points

Grab Points allow you to specify multiple Transforms to be used as a grip when holding an object. For example, you may want to grip a knife by different positions or angles on the handle, depending on where the user's hand is when it is grabbed. You can even specify a different hand pose depending on where the object is gripped.

Grab Points are Transforms that can be assigned to a Grabbable's "Grab Points" property. You can add as many Grab Points here as you need, and the closest one to the Grabber will be used when the object is grabbed.

For finer control, you can add the "GrabPoint" script to the Transform. Here you can specify a different HandPose to use when this Grab Point is being gripped.

You can also specify a rotation constraint, specified as "Max Degrees Difference Allowed". If the angle between the hand and the grabpoint is greater than this value, then this grab point will not be considered valid.

The knife prefab includes multiple grab points and is a good example to inspect. You can grab the knife with the blade facing up or down, depending on which way the hand grabbed it, by using rotation constraints.

VR Interaction Framework Unity3d Grab Points
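The rotation constraint boils down to measuring the angle between two rotations. In Unity that is just Quaternion.Angle(hand.rotation, grabPoint.rotation); a pure sketch of the same check using System.Numerics quaternions (GrabPointMath is my own name, not a framework class):

```csharp
using System;
using System.Numerics;

public static class GrabPointMath
{
    // Angle in degrees between two rotations (Unity: Quaternion.Angle).
    public static float AngleBetween(Quaternion a, Quaternion b)
    {
        float dot = Math.Abs(Quaternion.Dot(a, b));
        dot = Math.Min(1f, dot);
        return (float)(2.0 * Math.Acos(dot) * 180.0 / Math.PI);
    }

    // A grab point is only valid if the hand is within the allowed cone.
    public static bool IsValidGrabPoint(Quaternion handRotation,
        Quaternion grabPointRotation, float maxDegreesDifferenceAllowed)
    {
        return AngleBetween(handRotation, grabPointRotation)
            <= maxDegreesDifferenceAllowed;
    }
}
```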

Snap Zones

Snap Zones are triggers that can "grab" objects if they were recently released from a Grabber. You can use these to create inventory systems, attachment systems, or just snap objects together.

When an object is inside a SnapZone, its colliders are disabled. This is to prevent the physics of the object from interacting with the world and causing things to go crazy. In order to grab the item back out of the SnapZone, the SnapZone's trigger responds to the grab event.

Objects are positioned at 0,0,0 local position and rotation by default. You can add the "SnapZoneOffset" component to a Grabbable if you wish to specify a custom offset.

In addition to the "SnapZoneOffset" component, you can also add a "SnapZoneScale" component, to modify the scale of the Grabbable when inside the snap zone.

If you are using a player inventory / toolbelt type setup, you can attach the "ReturnToSnapZone" component to a Grabbable, and specify which SnapZone to return to when not being held. You can also specify a return speed and delay.

For example, you could have a toolbelt with knives that will automatically return back to the toolbelt after thrown - just set the ReturnDelay to something like 1-2 seconds.

VR Interaction Framework Unity3d Snap Zones

Slow Motion

Slow motion is a fun effect in VR, and can also be a helpful way to troubleshoot physics and gameplay bugs.

In the demo scene you can slow time by pressing the "Y" button on the Left Oculus Touch Controller. Try shooting weapons, throwing objects, and observing sounds while time is slowed.

See /Scripts/Extras/TimeController.cs for an example on how to slow down time and apply a sound effect.

VR Interaction Framework Unity3d Slow Motion

Whenever you are playing a sound, be sure to multiply your sound pitch by Time.timeScale. This way your sounds will be "slowed" down by decreasing pitch, relative to how you've scaled Time.timeScale.
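For example, when firing a one-shot sound through an AudioSource (the PlayClipScaled helper name is my own, not part of the framework):

```csharp
using UnityEngine;

public static class AudioUtils
{
    // Play a clip with its pitch scaled by the current time scale,
    // so sounds deepen while time is slowed. Hypothetical helper,
    // not included with VRIF.
    public static void PlayClipScaled(AudioSource source, AudioClip clip)
    {
        source.pitch = 1f * Time.timeScale;
        source.PlayOneShot(clip);
    }
}
```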

When adding forces to Rigidbodies, use ForceMode.VelocityChange. This will properly scale based on Time.fixedDeltaTime; otherwise physics may not work as expected.

Dealing and Taking Damage

There is a very simple damage system included that can be easily extended to work with your own custom setup.

VR Interaction Framework Unity3d Damage System

A "Damageable" component has a health value. Damage can be dealt to it by calling its DealDamage(float damageAmount) method. Once this value is <= 0, the "DestroyThis" method will be called.

The "DamageCollider" component can be added to colliders. Whenever this collider collides with an object that has a Damageable component, the specified amount of damage will be dealt.

If you want to integrate with another damage system, just override the "DealDamage" method and pass along the damage value. Here is an example of how to integrate with EmeraldAI's damage system :

VR Interaction Framework Unity3d Damage System
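As a rough sketch of that pattern (not the framework's actual integration code), a subclass can simply override DealDamage and forward the value. The ExternalHealthSystem below is a hypothetical stand-in for whatever AI or health asset you use — substitute the real damage call per that asset's documentation:

```csharp
// Forward VRIF damage into an external health system by overriding
// Damageable.DealDamage. ExternalHealthSystem and TakeDamage are
// hypothetical placeholders - replace them with your own asset's
// damage API (e.g. EmeraldAI's, per its documentation).
public class ExternalDamageable : Damageable
{
    public ExternalHealthSystem HealthSystem;

    public override void DealDamage(float damageAmount)
    {
        // Pass the damage along instead of (or in addition to)
        // the built-in health value
        HealthSystem.TakeDamage(damageAmount);
    }
}
```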

Upgrading to New Versions of VRIF

Upgrading to new versions of any Unity Asset can feel like a daunting task. Below are a few tips to ensure you don't lose work and make upgrading as painless as possible.

  1. First and foremost : Back up your work. The best way to do this is to use source control, such as the free and excellent GitHub.

    If things don't work quite right, you can always revert back to your previous commit. This is also a good way to see what has changed in the script files since the last version.

  2. Keep your own Player Prefab. The Player prefab that comes with the framework may change from time to time as features are introduced. If you make modifications to the included player prefabs, make sure you save your own prefab copy. This way your settings won't get overridden, and you can inspect the included player prefabs for any changes.

  3. You don't need to import the project settings every time. These are included mostly for first-time installs. If your project is already set up properly, you can skip this step by deselecting the ProjectSettings files at the bottom of the asset import.

Common Issues and Fixes

  1. Getting poor performance or strange visuals when running in the editor.

    • Fix : Make sure nothing is selected in the Inspector during runtime as this can kill your framerate. Better yet, set the Game window to "Maximize on Play".
  2. Can't shoot the bow properly and two-handed weapons behave poorly when close to the face.

    • Fix : Make sure the "Player" and "Grabbable" layers are ignored in the Physics Collision Matrix (Edit -> Project Settings -> Physics)
  3. Can't grab an item or climb an object

    • Fix : Make sure the "BreakDistance" property of the Grabbable / Climbable is larger than the collider. Alternatively, set "BreakDistance" to 0 to disable this feature.
  4. Can't interact with a UI Canvas

    • Fix : Make sure the Canvas has an EventCamera assigned, the OVRRaycaster component is assigned to the Canvas, and that the OVRInputModule is on the EventSystem.

Tips & Guidelines

  1. Keep all of your objects at a 1:1 scale. If you were to scale a cube to 1, 2, 1, then remove the box collider and recreate it in a parent object that has a uniform scale. Otherwise weird things can happen with physics objects, such as floaty gravity, objects that fly out erratically, and other glitchy behaviour.
  2. Try adjusting your Physics Timestep (Project Settings -> Time -> Fixed Timestep) to something such as 0.01388889 or 0.01111111. Physics may not run as fast as your framerate, so you may want to match this to the headset you are targeting.
  3. There is an object highlight script included, but it is not enabled by default as it has poor performance on the Quest. For Rift, you can enable this by enabling the "OutlineCamera" object in the CenterEyeAnchor
  4. Move linearly to increase player comfort. Acceleration can cause motion sickness.
  5. Having a button to slow down time can give you some extra time to see how physics objects are interacting and debug any issues with joints or other behaviours.
  6. Use VRUtils.Instance.Log("Message Here") to log information to the menu attached to your character's hand. This can be a helpful way to see Debug information without having to take off the headset.
  7. If you are getting weird eye level positions on the Quest, make sure that "Floor Level" is checked under the OVRCameraRig.
  8. You can use Oculus Link to debug the Quest from within Unity. Just make sure the "Oculus Desktop Package" is added and you have the Oculus app installed. You can also use an app such as Virtual Desktop, which will let you connect wirelessly through SteamVR.
  9. If all of your materials are pink, then you are probably using a newer Render Pipeline and need to upgrade your project's materials : Edit -> Render Pipeline -> Universal Render Pipeline -> Upgrade Project Materials
  10. If you are finding your held objects collide with your player capsule, set your "Player" physics layer to ignore the "Grabbable" layer (Project Settings -> Physics -> Layer Collision Matrix). This way objects will pass right through the player capsule. If you still need some sort of player collision, add hit boxes on a separate layer - i.e. for the head.