Final-year university project.


The major features of this project are described below.

The Holodeck system supports a wide range of hardware.

Gesture Panels

Gesture panels are invisible volumes that enable gesture-based control of the world. Holodeck supports three types of gesture panel:

  1. Swipe - the user swipes a hand in a predefined direction (or against that direction).
  2. Circle - the user draws a circle in the air with a finger, sustained for a set length of time.
  3. Tap - the user "presses a button" by moving in a predefined direction.

These gesture panels are demonstrated in the "prison break" video below.

Gesture Detection

The Holodeck system allows developers to create custom gestures by specifying the relative angles of the user's fingers (with a degree of forgiveness).
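The "degree of forgiveness" amounts to a per-finger tolerance check against a template. A minimal sketch, assuming finger poses are summarized as one curl angle per finger (the function, template, and tolerance value are hypothetical):

```python
def matches_gesture(finger_angles, template_angles, tolerance_deg=15.0):
    """Return True if every measured finger angle is within tolerance_deg
    of the template's corresponding angle (the "degree of forgiveness")."""
    if len(finger_angles) != len(template_angles):
        return False
    return all(abs(measured - expected) <= tolerance_deg
               for measured, expected in zip(finger_angles, template_angles))

# Hypothetical "thumbs up" template: thumb extended (0 deg),
# the other four fingers curled (about 90 deg).
THUMBS_UP = [0.0, 90.0, 90.0, 90.0, 90.0]
```

For example, `matches_gesture([5, 85, 92, 88, 95], THUMBS_UP)` would accept the pose, while a half-extended thumb would not.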

The Holodeck system provides the ability to "listen" for each gesture type (swipe, circle, tap) regardless of the user's current location. This allows level designers to trigger player-specific actions anywhere in the world (e.g. swipe left to display the HUD, swipe right to hide it). Programmers can extend this system by implementing dedicated gesture listeners, which detect custom gestures and use the gesture information to modify the world in a more meaningful way. Level designers are free to activate and deactivate gesture listeners at runtime, which allows area-specific gestures to be recognized and processed.
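The listener mechanism described above can be sketched as a small dispatcher. This is not the project's actual API, just an illustration of the idea: listeners register for a gesture type, and level scripts toggle them on and off at runtime for area-specific behaviour.

```python
class GestureDispatcher:
    """Routes recognized gestures to named listeners. Level scripts can
    activate or deactivate listeners at runtime, so a gesture is only
    processed where it is meaningful (illustrative sketch)."""

    def __init__(self):
        self._listeners = {}   # name -> (gesture_type, callback)
        self._active = set()

    def register(self, name, gesture_type, callback):
        """Add a listener; listeners start active."""
        self._listeners[name] = (gesture_type, callback)
        self._active.add(name)

    def set_active(self, name, active):
        """Toggle a listener at runtime (e.g. on entering/leaving an area)."""
        if active:
            self._active.add(name)
        else:
            self._active.discard(name)

    def dispatch(self, gesture_type, data):
        """Forward a recognized gesture to every active matching listener."""
        for name in list(self._active):
            registered_type, callback = self._listeners[name]
            if registered_type == gesture_type:
                callback(data)
```

A "swipe left shows the HUD" listener would register for the swipe type, and a prison-cell puzzle could deactivate it while its own area-specific listener is active.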

Some gestures require dedicated code for detection and handling; these should be reserved for core gameplay actions (such as shooting a gun or picking up objects).

Electrotactile Interface

The Electrotactile Interface (hereafter ETI) is a device that delivers a range of electric pulses to give the user physical feedback from the VR environment. Much like gesture detection, the ETI can be driven both from code and from level scripts.

With the addition of the ETI, a fault in the Holodeck system could cause significant and prolonged physical pain to the user. Because of this, the code was required to be rock solid and very forgiving: for example, disabling the device during loading screens, disabling the device on crash, and minimizing latency between user action and haptic response.
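The fail-safe requirements above suggest a thin wrapper around the raw pulse driver that enforces the safety rules in one place. The sketch below is an assumption about how such an interlock could look (the class, method names, and intensity scale are invented for illustration): pulses are blocked during loading, clamped to a maximum intensity, and any driver fault fails silent rather than shocking the user.

```python
class SafeETI:
    """Safety interlock around a raw electrotactile pulse driver
    (illustrative sketch; not the project's actual driver API)."""

    MAX_INTENSITY = 1.0  # hypothetical normalized intensity ceiling

    def __init__(self, send_pulse):
        self._send_pulse = send_pulse  # raw driver callable (assumed)
        self._loading = False

    def set_loading(self, loading):
        """Called by the engine when a loading screen starts or ends."""
        self._loading = loading

    def pulse(self, intensity):
        """Fire a pulse if it is safe to do so; return whether it fired."""
        if self._loading:
            return False  # never fire during a loading screen
        try:
            # Clamp intensity so a bad value can't exceed the safe ceiling.
            self._send_pulse(min(max(intensity, 0.0), self.MAX_INTENSITY))
            return True
        except Exception:
            # On any driver fault, fail silent rather than risk the user.
            return False
```

Crash handling would sit one level up: an exception handler (or watchdog) that unconditionally powers the device off when the process dies.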


Progress video demonstrating the work completed to date (originally submitted mid-to-late 2014).

This is the earliest test video demonstrating the initial hand wireframe and button panel.

Another early video demonstrating hand interaction with NPCs (by proxy of physics props).

Another early video demonstrating the initial weapon implementation.

Demonstration of the final hand interaction + weapon system (and Giant Leap).

Misc videos.