Mindful Melody VR
for Hack the Valley 7
with Emily Goose, Reese D., and Siddhant (Sid) Das
in October 2022
We leveraged a Meta Quest 2 virtual reality headset and a piano keyboard for MIDI input to build an immersive, interactive experience using Unity.
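As a rough illustration of the input side, the sketch below shows one way to receive MIDI note events in Unity, assuming keijiro's Minis package (which exposes MIDI devices through Unity's Input System). This is a minimal example for illustration; our hackathon code may have been wired differently.

```csharp
// Minimal sketch of receiving MIDI note events in Unity, assuming the
// Minis package is installed. Not necessarily how our project did it.
using UnityEngine;
using UnityEngine.InputSystem;
using Minis;

public class MidiListener : MonoBehaviour
{
    void OnEnable()
    {
        // Watch for MIDI devices being connected (e.g. the piano keyboard).
        InputSystem.onDeviceChange += (device, change) =>
        {
            if (change != InputDeviceChange.Added) return;
            var midi = device as MidiDevice;
            if (midi == null) return;

            // Log note-on and note-off events as keys are pressed and released.
            midi.onWillNoteOn += (note, velocity) =>
                Debug.Log($"Note on: {note.noteNumber} velocity {velocity}");
            midi.onWillNoteOff += (note) =>
                Debug.Log($"Note off: {note.noteNumber}");
        };
    }
}
```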
Given the limited time of one hackathon weekend, we quickly got to know each other's interests, skills, and work styles, which helped us complete the project as a team while also satisfying each of our individual goals. We collaborated primarily in person, but used Discord to keep track of our discussions and GitHub to organize our code. Together, we learned the different software and hardware involved and applied it to build this VR experience.
We recognized that day-to-day life can be stressful, and although stepping outside to be surrounded by nature can be a great escape, in reality it is often impractical, whether because nature is too far away, the season is wrong, or for any other reason. Meditation apps and relaxing music are popular options people turn to for a quick, convenient break to reset or to get set for the day. As such, we decided our project would be designed to:
Portray nature (e.g. plants, animals, weather) and music
Interpret user input (e.g. movements, button presses)
React to user input (e.g. plant health, animal movement, weather change, music)
We were also interested in developing for a VR headset and a piano keyboard and learning how to integrate them, so using both devices together became an additional goal for our project.
As soon as we tested an early prototype, which was essentially an empty world with just a floor, we realized how difficult it is to interact with a real-world object when neither that object nor our hands are visible in the virtual world, especially without continuous contact with the object. Unlike hand-held controllers, a piano sitting somewhere in front of the user is easy to lose track of. We added virtual representations of the user's hands and the piano to help with locating and operating it.
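As a sketch of how the virtual keyboard can mirror the real one, the component below tints a virtual key while its note is held (our demo turned pressed keys red, as noted in the improvements list). The class name, the keyRenderers array, and the lowestNoteNumber offset are hypothetical names for illustration, not our exact implementation.

```csharp
// Sketch of mirroring physical key presses onto a virtual keyboard model.
using UnityEngine;

public class VirtualKeyboard : MonoBehaviour
{
    [SerializeField] Renderer[] keyRenderers;   // one renderer per virtual key, ordered by pitch
    [SerializeField] int lowestNoteNumber = 48; // MIDI note of the leftmost modeled key (assumed)

    // Called with the MIDI note number on note-on (pressed = true)
    // and note-off (pressed = false).
    public void SetKeyPressed(int noteNumber, bool pressed)
    {
        int index = noteNumber - lowestNoteNumber;
        if (index < 0 || index >= keyRenderers.Length) return; // note outside the modeled range

        // Tint the key red while held; restore it when released.
        keyRenderers[index].material.color = pressed ? Color.red : Color.white;
    }
}
```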
We considered training an AI model on existing songs to determine which melodies, harmonies, or rhythms are associated with which lyrics and album art, but due to the hackathon's time and resource constraints, we instead created a set of musical motifs, each associated with a particular response in the environment. As we tested our project on users with varying familiarity with VR and music, we adjusted the conditions and variables to ensure the environment responds satisfactorily to the many possible input combinations.
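To illustrate the motif idea, here is a minimal sketch that keeps a short history of played notes and fires an environment response when the tail of the history matches a known motif. The motif contents and responses are placeholders for illustration, not the exact set we shipped.

```csharp
// Sketch of motif matching: compare recent notes against known motifs.
using System;
using System.Collections.Generic;
using System.Linq;

public class MotifDetector
{
    readonly List<int> history = new List<int>();

    // Each motif is a note-number sequence paired with a response callback.
    // These two motifs are illustrative placeholders.
    readonly List<(int[] notes, Action response)> motifs = new()
    {
        (new[] { 60, 64, 67 }, () => Console.WriteLine("Rising triad: squirrel appears")),
        (new[] { 67, 65, 64, 62, 60 }, () => Console.WriteLine("Descending run: rain clears")),
    };

    public void OnNotePlayed(int noteNumber)
    {
        history.Add(noteNumber);
        if (history.Count > 16) history.RemoveAt(0); // bound the buffer

        foreach (var (notes, response) in motifs)
        {
            // Fire when the most recent notes exactly match a motif.
            if (history.Count >= notes.Length &&
                history.Skip(history.Count - notes.Length).SequenceEqual(notes))
            {
                response();
                history.Clear(); // avoid re-triggering on the same presses
                break;
            }
        }
    }
}
```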
An example of what a user sees while in Mindful Melody VR is shown in the image below. Virtual representations of the keyboard and the player's hands appear near the bottom, matching where the physical keyboard and hands are from the user's perspective. Looking around, the user sees a virtual park with various plants and animals in fair weather. Here, by pressing keys on the piano keyboard, the user has played music that attracted the squirrel.
Since the prototype had to be demoed at the end of 36 hours, the VR experience had limited features and plenty of room for improvement, such as:
Add a welcome screen
Offer an introductory tour around the VR environment for new users
Add environment selections (e.g. beach, forest, mountain, coral reef) and navigation between them
Make the virtual representations of the hands and keyboard more realistic (e.g. pressed keys move down rather than turning red)
Add more musical motifs and corresponding responses
Make help/guidance available to users
Offer customization options (e.g. key press speed)
Expand compatibility to more devices (e.g. other VR headsets or even regular screens, different pianos/keyboards and potentially acoustic pianos too)