Sound Spectrum
An interactive music controller project that combines music, visuals, and motion control. It provides a unique and immersive experience where users can manipulate the music and visuals by moving an accelerometer-based controller.
Music Interface with Gestures
This project aims to create a dynamic and visually captivating audiovisual experience.
The Controls
Changing Tracks
Flicking the phone to the right or left changes the music track and background image, while the color of the sound waves stays the same until it is changed with a different gesture.
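As a rough illustration (not the project's exact code), a flick like this can be detected by watching for a spike in the controller's X-axis acceleration. The threshold, cooldown, and track list below are assumed values:

```javascript
// Sketch (assumed values, not the project's exact code): detect a left/right
// flick as a spike in the controller's X-axis acceleration.
const FLICK_THRESHOLD = 15; // assumed m/s^2 spike that counts as a flick
const COOLDOWN_MS = 500;    // debounce so one flick only fires once
const tracks = ['redbone.mp3', 'feel-it-still.mp3', 'cleopatra.mp3']; // hypothetical files
let lastFlick = 0;
let trackIndex = 0;

function handleAccelX(ax) {
  const now = Date.now();
  if (now - lastFlick < COOLDOWN_MS || Math.abs(ax) < FLICK_THRESHOLD) return;
  lastFlick = now;
  // Flick right advances, flick left goes back; wrap around the playlist.
  trackIndex = (trackIndex + (ax > 0 ? 1 : -1) + tracks.length) % tracks.length;
  console.log('now playing', tracks[trackIndex]); // would also swap the background image
}
```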
Changing Volume
Rotating the phone to a vertical, upright position raises the volume to its highest setting, while rotating it to the opposite position mutes the music. The number of sound waves on the screen also increases or decreases with the volume, so the gesture works like a volume knob.
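In p5.js terms, this knob behavior could be a simple mapping from tilt angle to volume. This is a sketch under assumptions: it presumes a p5 sketch context (for map(), constrain(), and floor()) and a loaded p5.SoundFile, and the -90°..90° range and maximum wave count are illustrative:

```javascript
// Sketch (assumed angle range and wave count), meant to run inside a p5.js
// sketch with a loaded p5.SoundFile: map tilt to volume like a knob.
function setVolumeFromTilt(tiltDegrees, song) {
  // map() rescales -90..90 degrees onto 0..1; constrain() clamps sensor overshoot.
  const vol = constrain(map(tiltDegrees, -90, 90, 0, 1), 0, 1);
  song.setVolume(vol); // p5.sound: 0 = muted, 1 = full volume
  // More waves at higher volume; 40 is an assumed maximum.
  return floor(map(vol, 0, 1, 1, 40));
}
```

Clamping with constrain() keeps sensor overshoot from pushing the volume outside the 0..1 range.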
Changing Sound Wave Colors
Flicking the phone up changes the shade of the sound waves to match the image behind them, adding a degree of customizability while keeping each track's aesthetic intact.
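One way to derive that shade, sketched here as an assumption inside a p5 sketch with a loaded p5.Image, is to sample a pixel from the current background image:

```javascript
// Sketch (assumed approach), inside a p5.js sketch with a loaded p5.Image:
// pick the wave shade by sampling a random pixel from the background image.
function pickWaveColor(bgImage) {
  const x = floor(random(bgImage.width));
  const y = floor(random(bgImage.height));
  return color(bgImage.get(x, y)); // get() returns [r, g, b, a] at that pixel
}
```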
The Process
Conceptualization and Planning
After settling on the basic concept of the project, I decided to fill my music player's screen with sound wave lines that increase or decrease in number with the volume.
I also wanted the color of those lines to match the background and the feel of the song. I began by deciding on the images and songs I wanted to play, going with songs from different genres that I enjoy, such as Redbone, Feel It Still, Cleopatra, Favela, and Gorillaz. I then searched for images that I felt matched each song and chose colors to complement them.
The Tech
To bring my vision to life, I utilized several technologies and tools. The project is built using the p5.js library, which provides a powerful set of functions for creating interactive graphics and audiovisual experiences. I incorporated the p5.sound library for audio processing, enabling me to analyze the music and extract relevant data for visualization.
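As a rough sketch of that pipeline (illustrative, not the project's actual code), p5.Amplitude from p5.sound can report the playing track's loudness, which can then drive how many wave lines get drawn; the file name and scaling here are assumptions:

```javascript
// Illustrative p5.js sketch (not the project's code): measure the playing
// track's loudness with p5.Amplitude and draw that many wave lines.
let song, amp;

function preload() {
  song = loadSound('redbone.mp3'); // hypothetical file name
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  amp = new p5.Amplitude(); // p5.sound loudness analyzer
  song.loop(); // note: browsers may require a user gesture before audio starts
}

function draw() {
  background(20);
  const level = amp.getLevel(); // roughly 0.0 (silent) to 1.0 (loud)
  const waves = floor(map(level, 0, 0.3, 1, 30, true)); // assumed scaling, clamped
  stroke(255);
  for (let i = 0; i < waves; i++) {
    const y = (height / (waves + 1)) * (i + 1);
    line(0, y, width, y); // stand-in for the project's animated wave lines
  }
}
```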
Additionally, I integrated WebSocket communication to receive data from an accelerometer sensor, allowing users to control the music and visuals by moving the controller. This real-time interaction adds an extra layer of immersion and interactivity to the project.
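A bare-bones version of that client side might look like the following, where the server URL and the {ax, ay, az} message shape are assumptions:

```javascript
// Sketch (assumed URL and payload shape): receive accelerometer readings
// over a WebSocket and feed them to the gesture handlers.
const socket = new WebSocket('ws://localhost:8080'); // hypothetical sensor server

socket.onmessage = (event) => {
  const { ax, ay, az } = JSON.parse(event.data); // assumed {ax, ay, az} message
  console.log('accel:', ax, ay, az);
  // e.g. ax drives flick detection, ay/az drive the tilt-based volume
};

socket.onerror = (err) => console.error('WebSocket error:', err);
```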
Challenges
During the implementation phase, I encountered several challenges that tested my problem-solving skills and pushed me to learn new concepts. One of the main challenges was mapping the accelerometer data to control different aspects of the music and visuals. I had to experiment with different techniques to achieve smooth and responsive control while ensuring the movements were accurately interpreted.
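One common smoothing technique that fits this problem, shown here with an assumed tuning factor, is an exponential moving average over the raw readings:

```javascript
// Sketch: exponential moving average to tame jittery accelerometer readings.
// ALPHA is an assumed tuning value; lower means smoother but laggier control.
const ALPHA = 0.2;
const smoothed = { ax: 0, ay: 0, az: 0 };

function smoothAccel(raw) {
  for (const axis of ['ax', 'ay', 'az']) {
    smoothed[axis] += ALPHA * (raw[axis] - smoothed[axis]);
  }
  return smoothed;
}
```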
Another big challenge was simply knowing what to code at times, but with the help of ChatGPT I found new solutions for parts of my project.
Iteration & Development
Throughout the development process, I embraced an iterative approach. I continuously tested and refined the project, incorporating user feedback and making adjustments to improve the overall experience. This iterative cycle allowed me to fine-tune the responsiveness, visual aesthetics, and user interaction, ensuring a polished and captivating final product.
With the help of ChatGPT, I was able to polish my project into a result that I was happy with.
Tools: ChatGPT & Visual Studio Code
The Result
Key Learnings & Insight
Creating 'Sound Spectrum' taught me how to combine different technologies to control music tracks interactively. I learned to make the controls more responsive by smoothing and carefully mapping the accelerometer data, and I gained experience analyzing and visualizing audio with p5.js.
It was a creative project that showed me how code, music, and visuals can work together.