Project Ideas

This page is intended to help you generate appropriate project ideas. It offers some suggestions, but mostly points to places to look for inspiration and explains how to think about the project.

Some of these ideas were inspired by previous classes and others were developed by the instructor or research collaborators.  Please don’t consider yourself limited to just these ideas.  The descriptions here are just a starting point, and you are encouraged to come up with cool 3D interaction techniques on your own. 

Note that the term "3D interaction" refers specifically to techniques that are spatial. For example, grabbing objects, pointing to targets, and making gestures are all examples of 3D interaction.  Pressing buttons or pushing the controller thumbstick is not sufficient and is also trivial to implement.  Although you are allowed to use controller input to support your interaction techniques, they must be primarily spatial in nature to count as a contribution for your project.

 

Project Scope

In general, good project ideas will represent a solid chunk of implementation work.  For projects with two team members, a good rule of thumb would be complexity equivalent to at least three homework assignments.  These expectations will be scaled down by 50% for individual projects and up by 50% for teams of three.

Note that short descriptions like those listed below merely convey the topic area of possible projects.  They do not describe all of the details about what would be sufficient to get a good grade on the project. In other words, for any of these it’s possible to do a very bad version or a very good version. Simply choosing a topic on this list is insufficient to guarantee any particular grade.

 

Project Types

Potential projects for this class can be roughly categorized into three general types.

Interaction projects are focused on the design and implementation of novel 3D interaction techniques. For example, merely re-implementing a 3D menu system from some 3D application you have seen is not very novel, since this has likely been done many times. Beyond that, a straightforward VR menu system is probably of insufficient complexity to make a good project.  (It is probably no more than one homework assignment's worth of effort, and you will already be doing this in one of the programming assignments.) However, in general, 3D menuing is not a solved problem for the virtual reality field, which will be discussed in both the lectures and the textbook chapter on system control. Building on previous work and a reasoned analysis of the problems and opportunities with current devices, and developing a system that is potentially better than existing techniques (in a way that you can explain and justify), would be a good project.  Interaction projects should include a virtual reality testbed environment that provides a way for users to test the proposed interaction techniques and gain an understanding of their advantages and disadvantages.

Technical projects are focused on providing advanced toolkit capabilities that are non-trivial to implement using the out-of-the-box functionality of Babylon.js.  For example, a multi-user shared virtual reality experience would require network synchronization of users' actions and the environment state, and would require the development of a suitable software infrastructure.  Note that the goal of a technical project is to expand the range of 3D interactions that are currently difficult to achieve with existing tools.  In other words, there still must be an interaction goal that justifies the need for technical development.  Technical projects should aim to provide standalone toolkit or library code that could theoretically be integrated with new applications, and should also include a virtual reality testbed environment that demonstrates the new capabilities.

Application projects are focused on applying virtual reality in a new context or to solve an important or interesting problem in a specific application domain.  The motivating application is usually (but not always) interdisciplinary; for example, using virtual reality to analyze or understand complex structures in CT scan data.  Similar to the other types, these projects should involve implementation of novel 3D interaction techniques, or at least the effective application of existing techniques in a new context.  (In both cases, the interaction techniques should be non-trivial to implement and should not be something you already implemented in a previous homework assignment.)    These projects should provide a proof-of-concept virtual reality prototype that demonstrates the developed interaction techniques in the context of the selected application domain.  Note that a video game is a perfectly  acceptable application as long as the mechanics involve some interesting 3D interaction techniques.

 

Researching Topics

To get started, you might consider looking at the VR and 3D interaction technique papers from the last few years of the following conferences:

Focus on papers that present interesting and novel 3D interaction techniques, not just applications of immersive technologies.  In addition to looking at the conferences, some example project ideas are provided below.

Note that ACM SUI 2020 is being held virtually from Oct 30 - Nov 1 and is free to attend!  However, you must register by October 26.

Project Ideas

Interaction Projects

This is a non-exhaustive list of advanced interaction techniques and topics that we will discuss in lecture.  Identifying some interesting existing techniques for inspiration would be a reasonable starting point for an interaction project.  More information about many of these techniques can be found in the appropriate section of your textbook or by searching for the original papers on Google Scholar.

Selection and Manipulation

  • 3D bubble cursor (see the sketch after this list)
  • Precise and rapid interaction through scaled manipulation
  • Intent-driven selection
  • Image plane pointing
  • Aperture selection
  • Bendcast
  • Depth ray
  • Absolute and relative mapping
  • Voodoo dolls
  • iSith
  • Spindle
  • Bimanual volumetric selection
  • HOMER
  • Scaled world grab
  • Redirected touching
  • Haptic retargeting
  • Sparse haptic proxy
  • Turkdeck
  • Mutual human actuation
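
As a concrete example of one technique from this list, below is a minimal sketch of a 3D bubble cursor in TypeScript with Babylon.js. It assumes you already have a cursor position (e.g., from a tracked controller) and an array of selectable meshes; the function name and structure are placeholders for illustration, not an existing API.

    import { AbstractMesh, Vector3 } from "@babylonjs/core";

    // Sketch of a 3D bubble cursor: the selection volume expands just enough to
    // capture whichever selectable object is currently closest to the cursor.
    function updateBubbleCursor(
        cursorPosition: Vector3,
        selectables: AbstractMesh[]
    ): { target: AbstractMesh | null; radius: number } {
        let target: AbstractMesh | null = null;
        let minDistance = Number.POSITIVE_INFINITY;
        for (const mesh of selectables) {
            const d = Vector3.Distance(cursorPosition, mesh.getAbsolutePosition());
            if (d < minDistance) {
                minDistance = d;
                target = mesh;
            }
        }
        // Render the bubble at this radius and highlight the target; some object
        // is then always selectable, even in sparse scenes.
        return { target, radius: minDistance };
    }

A full implementation would also render the bubble, handle the confirmation input, and deal with occlusion, but the core idea is this nearest-target query run every frame.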

Navigation

  • Walking-in-place
  • Redirected walking (e.g., rotation gains, translation gains, or curvature gains; see the sketch after this list)
  • Reorientation with distractors
  • Change blindness redirection
  • Impossible spaces
  • Flexible spaces
  • Virtual portals
  • Route planning (e.g., path drawing or marking points)
  • Dual-point world manipulation for vertical travel (e.g., climbing)
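
As a concrete example of the rotation-gain idea above, here is a minimal sketch in TypeScript with Babylon.js. It assumes the entire environment is parented under a single TransformNode (called worldRoot here, which is an assumption about your scene setup) and that the function is called once per frame.

    import { TransformNode, WebXRCamera } from "@babylonjs/core";

    // Sketch of a rotation gain for redirected walking: as the user physically
    // turns, the world is counter-rotated so that the virtual turn is amplified
    // (gain > 1) or damped (gain < 1).
    let previousYaw: number | null = null;

    function applyRotationGain(camera: WebXRCamera, worldRoot: TransformNode, gain: number): void {
        const yaw = camera.rotationQuaternion.toEulerAngles().y;
        if (previousYaw !== null) {
            // Wrap the yaw difference into (-PI, PI] before scaling it.
            const deltaYaw = Math.atan2(Math.sin(yaw - previousYaw), Math.cos(yaw - previousYaw));
            // Rotating the world by (gain - 1) * delta makes the user's virtual
            // rotation equal to gain times their physical rotation.
            worldRoot.rotation.y += (gain - 1) * deltaYaw;
        }
        previousYaw = yaw;
    }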

System Control

  • Diegetic graphical menus
  • Context sensitive widgets
  • Non-context sensitive widgets (e.g., command and control cube)
  • 1-DOF menus with virtual tools (see the sketch after this list)
  • Physical tools
  • Multimodal input (e.g., voice and gesture)
  • Alphanumeric input in VR
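
As a small example of the 1-DOF menu idea above, here is a sketch that maps a controller's twist (roll) angle onto a ring of menu items; the function and parameter names are placeholders for illustration.

    // Sketch of a 1-DOF ring menu: items sit on a ring around the controller,
    // and the controller's roll angle selects the active item.
    function selectRingMenuItem(controllerRollRadians: number, itemCount: number): number {
        const TWO_PI = 2 * Math.PI;
        // Normalize the roll into [0, 2*PI) and map it onto equal angular slices.
        const normalized = ((controllerRollRadians % TWO_PI) + TWO_PI) % TWO_PI;
        return Math.floor((normalized / TWO_PI) * itemCount);
    }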

 

Technical Projects

Multi-User Virtual Environments in WebXR.  Although there are multiple frameworks for implementing networked virtual environments in traditional game engines such as Unity, we do not currently have out-of-the-box tools for multi-user experiences using Babylon and WebXR.  Matrix is an open-source project for open, decentralized communication that includes an SDK for JavaScript.  This project would involve using Matrix to synchronize the actions of two or more users and the virtual environment state to enable a shared experience running on multiple Quest devices.  If successful, this project could result in an open-source library with the potential to generate significant interest in the virtual reality community, along with a possible publication.
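
To make the scope concrete, here is a sketch of the kind of per-frame state such a library might synchronize. The transport is abstracted as a callback so it could be backed by Matrix, a WebSocket, or something else; the interface and function names are assumptions for illustration, not part of any existing SDK.

    // Sketch of a pose-update message for a shared WebXR session.
    interface UserPose {
        userId: string;
        head: { position: number[]; rotation: number[] };
        leftHand: { position: number[]; rotation: number[] };
        rightHand: { position: number[]; rotation: number[] };
        timestamp: number;
    }

    // Pose updates are small but frequent; sending them at 10-20 Hz and
    // interpolating on the receiving side is a common compromise.
    function broadcastPose(pose: UserPose, send: (payload: string) => void): void {
        send(JSON.stringify(pose));
    }

    function onRemotePose(payload: string, apply: (pose: UserPose) => void): void {
        apply(JSON.parse(payload) as UserPose);
    }

Beyond avatar poses, a complete library would also need to handle object ownership and synchronization of other scene state changes.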

Full-Body Avatars with Inverse Kinematics.  Currently, it is difficult to provide the illusion of inhabiting a full-body humanoid avatar in WebXR.  This project would involve integrating a full virtual body for the user on the Oculus Quest.  The user should be able to directly puppet the avatar by moving their head and controllers, which will require a technique commonly used in robotics and animation known as inverse kinematics.  When the user moves in the environment, the avatar's legs should move using a walking animation.  There should be a way to easily demonstrate these capabilities to the user, such as by placing a mirror in the virtual environment.
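
As one possible starting point, Babylon.js includes a BoneIKController that can drive a two-bone chain toward a target mesh; a rough sketch is below. The bone name and rig variables are assumptions about your avatar model, and a complete solution would also need to handle the head, the legs, and pole targets for natural elbow orientation.

    import { BoneIKController, Mesh, Scene, Skeleton } from "@babylonjs/core";

    // Sketch: drive the avatar's right arm toward a target mesh (e.g., a small
    // mesh parented to the XR controller) using Babylon's built-in two-bone IK.
    function attachArmIK(scene: Scene, avatarMesh: Mesh, skeleton: Skeleton, targetMesh: Mesh): void {
        const forearm = skeleton.bones.find((bone) => bone.name === "forearm_R");
        if (!forearm) {
            return; // The rig does not match the assumed bone naming.
        }
        const ik = new BoneIKController(avatarMesh, forearm, { targetMesh });
        scene.onBeforeRenderObservable.add(() => {
            // Solve the elbow and wrist each frame so the hand reaches the target.
            ik.update();
        });
    }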

 

Application Projects

The following project ideas were submitted by collaborators in the University's Medical Devices Center.  Students will be allowed to consult with the researchers in order to develop goals for the proposal, and in many cases the researchers will provide data or models necessary for the project.  If successful, all of the projects below have the potential to result in a future publication.  If you are interested in pursuing one of these projects, please contact me for an introduction.

Deep Brain Stimulation Probe Trajectory. This project is to develop a pre-surgical planning and VR clinical training tool. The tool will consist of an anatomical model and a CAD model of the probes. The system can be operated in two modes. In the first mode, the clinician will be allowed to set the orientation of the probes along the X, Y, and Z axes. The system will then calculate two indices: the proximity to the predetermined (desired) target, and an index that identifies how many veins and arteries would be damaged by that placement choice.  In the second mode, given the locations of veins and arteries in the anatomical file and a target for where the probe will stimulate the brain, the system calculates the X, Y, and Z orientation that produces the lowest damage index and the highest accuracy for the probe target. There may be more than one solution set.
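
To illustrate the two indices, here is a sketch that treats the probe as a line segment and the vessels as sampled centerline points with radii. This data layout is an assumption for illustration, not the format of the provided anatomical files.

    // A vessel centerline sample: a 3D point and the vessel radius at that point.
    interface VesselSample { center: [number, number, number]; radius: number }

    function distance(a: number[], b: number[]): number {
        return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
    }

    // Distance from point p to the segment between the probe entry and tip.
    function distanceToSegment(p: number[], entry: number[], tip: number[]): number {
        const ab = [tip[0] - entry[0], tip[1] - entry[1], tip[2] - entry[2]];
        const ap = [p[0] - entry[0], p[1] - entry[1], p[2] - entry[2]];
        const lengthSquared = ab[0] * ab[0] + ab[1] * ab[1] + ab[2] * ab[2];
        if (lengthSquared === 0) {
            return distance(p, entry);
        }
        const t = Math.max(0, Math.min(1, (ap[0] * ab[0] + ap[1] * ab[1] + ap[2] * ab[2]) / lengthSquared));
        return distance(p, [entry[0] + t * ab[0], entry[1] + t * ab[1], entry[2] + t * ab[2]]);
    }

    // Index 1: distance from the probe tip to the desired target.
    // Index 2: number of vessel samples the probe path passes through.
    function probeIndices(entry: number[], tip: number[], target: number[], vessels: VesselSample[]) {
        const targetProximity = distance(tip, target);
        const damagedVessels = vessels.filter(
            (v) => distanceToSegment(v.center, entry, tip) < v.radius
        ).length;
        return { targetProximity, damagedVessels };
    }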

Dental Shape Optimization. Dental fillings and bridges fail because they are not designed to have the lowest possible stress in the filling/bridge or on the tooth. The tools used by dentists are also not optimally designed to ensure minimal interference. This project will create VR models and use finite element analysis (FEA) to identify the stresses of various designs, with the intent of producing designs with the lowest stress profile and highest material strength. Chosen designs will be built in the lab to validate the findings. Segmented files of the mouth and tooth will be provided. FEA data will also be provided. The software needs to have a “Design by Dragging” feature, where the design variables can be adjusted continuously and the corresponding design changes, displaying the predicted changes in stress.

Prostate Ablation Procedure. Prostates have a tendency to grow with age. Cancer can also form in and on the prostate. In both cases, the cancer and excess tissue need to be removed. This program will allow the clinician to evaluate different ablation probe placements to determine their effectiveness. An anatomical file of the prostate and a CAD file of the ablation probe will be provided.

3D Printed Prosthetics. Patient-specific prosthetics improve the quality of life for the amputee. This project will allow for the optimal design of a prosthetic based on the patient's anatomy before it is 3D printed. The VR system will also be used to allow the amputee to have input on the look of their prosthetic. Anatomical files will be provided.