Perspective 3D Projections for Augmented Reality Interfaces
I designed and implemented an interactive visualization system for rendering 3D objects on flat displays, viable for virtual and augmented reality platforms. The system uses a Kinect motion sensor to track the user's position and adjusts the displayed viewport accordingly, so that the viewer sees different parts of the 3D scene depending on their perspective. An IR-based gesture sensor adds another mode of interaction and provides haptic feedback for users' object-manipulating hand gestures.
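The core of head-coupled perspective is recomputing an asymmetric (off-axis) viewing frustum each frame from the tracked head position, so the screen behaves like a window into the scene. The sketch below shows one common way to derive the frustum bounds; the function name, coordinate conventions, and parameters are my own illustrative assumptions, not the project's actual code.

```python
def off_axis_frustum(eye, screen_w, screen_h, near, far):
    """Compute asymmetric frustum bounds for a tracked eye position.

    eye      -- (ex, ey, ez) viewer position relative to the screen center,
                in the same units as screen_w/screen_h; ez > 0 is the
                viewer's distance from the screen plane (hypothetical convention).
    screen_w -- physical width of the display.
    screen_h -- physical height of the display.
    near/far -- clipping plane distances.

    Returns (left, right, bottom, top, near, far), suitable for an
    OpenGL-style glFrustum call.
    """
    ex, ey, ez = eye
    # Project the screen's half-extents, as seen from the eye,
    # onto the near plane by similar triangles.
    scale = near / ez
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top, near, far
```

With the eye centered, this reduces to an ordinary symmetric frustum; as the Kinect reports the head moving off-center, the frustum skews in the opposite direction, which is what makes objects appear to "pop out" of the display.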
The objective of this project, completed for MIT's 6.835 (Prof. Randall Davis' Intelligent Multimodal User Interfaces course), was to build a VR environment from a few commodity hardware components, without the need for multi-angle projectors, sensor arrays, remote controls, or wearable displays. The result is an accessible platform that, to the user, mimics a realistic 3D scene, with the objects inside it "popping out" and appearing to have a convincing physical presence. The system can be extended for many different purposes; as a proof of concept, I showcased a simple interactive game that uses gestural input to manipulate objects while providing a realistic 3D experience.
The project has received positive attention from the MIT CSAIL Communications Department and will be showcased to the public in the Stata Building starting in 2017.