Planetary Rift
Project Team
Supervision
Goal
The goal of our project was to develop an application that solves an existing problem at the Planetarium Hamburg. Every time the Planetarium team produces a video file for a new show, they have to compile it, move from their workstation to the "Sternensaal" dome room, and project it onto the screen inside the dome. Only then are they able to see whether the show looks as intended. Whenever it does not, they have to return to their workstation and repeat the whole process. Because this is very time-consuming and the Sternensaal is not available at all times, our task was to provide a way to simulate the show presentation in a virtual environment.
Motivation
Our main goal was to develop a simulation of the planetarium and its presentations for the Oculus Rift. We decided to go a few steps further and create an immersive, user-friendly experience. Because the application is designed for daily use, we also had to find a way to reduce simulator sickness.
Approach
At the beginning of the project we familiarized ourselves with the technology while analysing, using different methods, the requirements our application had to meet. We built a simplified model of the Planetarium and started implementing the various required features in Unity. We used the floor plan provided by the Planetarium and combined it with data we collected ourselves to create the final model of the Sternensaal. Our goal was not only to show pictures and videos but also content created in Unity, so we added a solar system to the same scene as the planetarium model. We used several cameras to display the solar system as well as other content.
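One common way to realize this in Unity is to let a secondary camera render into a RenderTexture that is then applied to the dome's material. The following is only a minimal sketch of that idea, assuming illustrative field names (contentCamera, domeRenderer) and an assumed texture resolution; it is not the actual project code.

```csharp
using UnityEngine;

// Minimal sketch: render a secondary camera (e.g. the solar system) into a
// RenderTexture and show the result on the dome screen's material.
// Field names and the resolution are illustrative assumptions.
public class DomeProjection : MonoBehaviour
{
    public Camera contentCamera;   // camera filming the solar system or other content
    public Renderer domeRenderer;  // mesh renderer of the dome screen

    private RenderTexture domeTexture;

    void Start()
    {
        // Create an off-screen texture and let the content camera render into it.
        domeTexture = new RenderTexture(2048, 2048, 24);
        contentCamera.targetTexture = domeTexture;

        // Assign the texture to the dome material so the content appears on the screen.
        domeRenderer.material.mainTexture = domeTexture;
    }
}
```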
We split the interaction into two basic cases:
- In the first case, the user navigates from one seat to another by looking at it and clicking. We determine which seat the user is looking at using Unity's built-in raycast feature and give visual feedback by highlighting that seat (see the sketch after this list). The user can toggle between flying and teleporting from seat to seat. For a better overview, we implemented a bird's-eye perspective.
- In the second case, the user looks directly at the dome. They can open a context menu that is projected onto the dome and positioned according to the user's field of view, and use it to select the content shown on the dome screen. The user can arrange the projected content by rotating it and can load videos and pictures through a file browser.
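The gaze-based seat selection from the first case could look roughly like the sketch below: a ray is cast from the head-tracked camera, the hit seat is highlighted, and a click moves the user there. Tag names, the highlight-material swap, and the eye-height offset are illustrative assumptions rather than the project's actual implementation.

```csharp
using UnityEngine;

// Minimal sketch of gaze-based seat selection, assuming seats are tagged "Seat".
public class SeatSelector : MonoBehaviour
{
    public Camera headCamera;           // the Rift's head-tracked camera
    public Material highlightMaterial;  // material used to highlight the focused seat

    private Renderer focusedSeat;
    private Material originalMaterial;

    void Update()
    {
        // Cast a ray from the center of the user's view.
        Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);
        RaycastHit hit;

        ClearHighlight();

        if (Physics.Raycast(gaze, out hit) && hit.collider.CompareTag("Seat"))
        {
            // Highlight the seat the user is currently looking at.
            focusedSeat = hit.collider.GetComponent<Renderer>();
            originalMaterial = focusedSeat.material;
            focusedSeat.material = highlightMaterial;

            // On click, move the user to the selected seat (assumed eye-height offset).
            if (Input.GetMouseButtonDown(0))
            {
                transform.position = hit.collider.transform.position + Vector3.up * 1.2f;
            }
        }
    }

    void ClearHighlight()
    {
        // Restore the previously highlighted seat's original material.
        if (focusedSeat != null)
        {
            focusedSeat.material = originalMaterial;
            focusedSeat = null;
        }
    }
}
```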
Because Unity only accepts .ogv video files, we integrated a console application that converts the loaded video files. Unfortunately, Unity does not support reading from the console, so there is currently no way to inform the user about the progress of the conversion.
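Such an external converter can be launched from C# via .NET's Process class, roughly as sketched below. The executable name and arguments are placeholders, not the tool actually used in the project.

```csharp
using System.Diagnostics;

// Minimal sketch of launching an external command-line video converter from Unity.
// "converter.exe" and its argument format are placeholder assumptions.
public static class VideoConverter
{
    public static void ConvertToOgv(string inputPath, string outputPath)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "converter.exe",
            Arguments = "\"" + inputPath + "\" \"" + outputPath + "\"",
            CreateNoWindow = true,
            UseShellExecute = false
        };

        // Start the conversion and wait until it finishes; no progress
        // information is reported back to the user while it runs.
        using (var process = Process.Start(startInfo))
        {
            process.WaitForExit();
        }
    }
}
```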
We conducted a usability study to evaluate our prototype using different questionnaires. The results suggested that the navigation in general was satisfactory; the bird's-eye view in particular was highly appreciated. Despite the positive feedback, simulator sickness remains a problem in virtual environments. We used the feedback from participants and our supervisors to improve the application, and added a debug feature that lets the user move freely through the planetarium using the WASD keys.
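A free-movement debug control of this kind can be implemented in a few lines; the sketch below relies on Unity's default "Horizontal"/"Vertical" input axes (mapped to A/D and W/S), and the speed value is an illustrative assumption.

```csharp
using UnityEngine;

// Minimal sketch of WASD debug movement relative to the current view direction.
public class DebugFlyController : MonoBehaviour
{
    public float speed = 3.0f;  // movement speed in units per second (assumed value)

    void Update()
    {
        // "Horizontal" maps to A/D, "Vertical" to W/S in Unity's default input settings.
        float strafe  = Input.GetAxis("Horizontal");
        float forward = Input.GetAxis("Vertical");

        // Move relative to the current orientation of the user.
        Vector3 move = transform.right * strafe + transform.forward * forward;
        transform.position += move * speed * Time.deltaTime;
    }
}
```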
Outlook
There is, of course, still room for improvement and additional features. Alternative means of input could include voice recognition or the Microsoft Kinect. A multi-user mode could be useful for showing progress to several team members simultaneously. Updating the planets' positions in real time based on astronomical data might prove equally interesting for both astronomers and laypersons. The ability to edit video files directly would also be a great addition to the application's functionality.
Downloads
Video: "Planetary Rift" (.mp4, 237 MB)