H(ear)s to Us
An interactive sound experience facilitating moments of human connection.
Design Process
Interviews
I conducted 10 interviews with Emily Carr students and staff, exploring potential causes of loneliness and strategies for overcoming it.
Interview Findings
1. 70% of interviewees experience loneliness regularly, particularly in Vancouver.
2. Nighttime, weather, phone use, and social media are common triggers for loneliness; introverted and extroverted participants reported different triggers.
3. Getting outside, engaging with nature, and listening to music or sentimental media bring comfort; introverted and extroverted participants relied on different comfort tools.
Co-Design
I facilitated two co-design sessions: one with Emily Carr students from various years and majors, and another with Interaction Design peers, to explore ideas for the form and sound of my concept.
Mindmap of ideas for physical forms gathered from both sessions.
Mindmap of sonic possibilities gathered from both sessions.
Co-Design Findings
1. People's proximity to one another is important in shaping an effective moment of connection.
2. Sounds of collective actions remind participants of human connection.
3. Organic, nature-inspired shapes were named the most comfortable and approachable forms.
Prototyping
I started by drafting a moodboard to refine my inspiration for the project before moving into testing with Arduino and touch sensors.
I gathered inspiration from a sonic swing set, light boxes that react to each other's proximity, and other installations that change visually or audibly with user input.
To conduct user testing, I built a prototype with DIY touch sensors that played sound once activated. I added a condition so that sound would only play when both sensors were activated simultaneously, to test how people felt about needing to collaborate to produce sound.
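The collaboration condition amounts to a simple AND gate across the two touch sensors. Below is a minimal Python sketch of that logic (the threshold value and function names are hypothetical illustrations — the actual prototype ran on Arduino hardware):

```python
TOUCH_THRESHOLD = 500  # hypothetical reading above which a pad counts as "touched"

def is_touched(reading):
    """A DIY touch pad counts as touched once its reading crosses the threshold."""
    return reading >= TOUCH_THRESHOLD

def should_play(reading_a, reading_b):
    # Collaboration condition: sound plays only when both pads
    # are touched at the same time.
    return is_touched(reading_a) and is_touched(reading_b)
```

One person touching a single pad produces silence; the sound only appears when two people (or two hands) activate both pads together.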
User Testing Findings
1. “How do you express to people that it’s a collaborative thing?”
2. “How close will the touch points be together?”
3. “Is this more like a playground of sound?”
The prototype advanced to an Arduino Pro Micro for ease during the grad show exhibition. I soldered the light sensors onto leads long enough that, once installed in a table, users could not reach all the sensors on their own, requiring them to ask others to collaborate.
My process then moved into Max/MSP, where I programmed the sound to play continuously, rising to an audible level whenever the light sensors were triggered.
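The patch behaved like a gate plus a volume fade. The sketch below restates that logic in Python purely for illustration (the threshold and ramp values are assumptions, not taken from the exhibited Max/MSP patch):

```python
LIGHT_THRESHOLD = 300  # hypothetical: a hand over the sensor drops the reading below this
RAMP = 0.2             # hypothetical smoothing factor applied on each update tick

def sensor_triggered(light_reading):
    """A light sensor counts as triggered when a hand blocks its light."""
    return light_reading < LIGHT_THRESHOLD

def next_volume(current, triggered, audible=1.0):
    """The song always plays; its level glides toward audible while the
    sensor is triggered, and back toward silence when it is released."""
    target = audible if triggered else 0.0
    return current + RAMP * (target - current)
```

Smoothing the level rather than switching it on and off avoids abrupt jumps, so covering a sensor feels like coaxing the sound out rather than flipping a switch.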
Below is my prototype prepared for exhibition at Emily Carr's graduation show.
My setup included four light sensors connected to an Arduino Pro Micro, with Max/MSP running on a Mac Mini and outputting to speakers. I designed the table as a circle to foster a sense of togetherness, avoiding the hierarchy a rectangular shape might create. The 4×4-foot scale ensured that no single person could activate all the sensors alone, encouraging collaboration. Scale and material were chosen to fit within exhibition guidelines.
I split a song into four parts—vocals, bass, drums, and instrumentals—so each sensor would trigger one element. I chose a song that feels lively yet incomplete until all sensors are activated, showing the impact of full participation. Users can experiment solo or invite others to join, discovering how the experience changes together.
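Mapping one stem to each sensor amounts to gating four parallel audio tracks. The Python sketch below (with hypothetical names — the real routing lived in Max/MSP) shows how the sensor states select which parts of the song are heard:

```python
STEMS = ("vocals", "bass", "drums", "instrumentals")

def active_stems(sensor_states):
    """Each light sensor unmutes one stem of the song.
    sensor_states: four booleans, one per sensor around the table."""
    return [stem for stem, on in zip(STEMS, sensor_states) if on]

def is_complete(sensor_states):
    # The song only feels whole when all four sensors are triggered at once.
    return all(sensor_states)
```

A lone visitor hears one or two sparse stems; only a group covering all four sensors hears the full, lively mix.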
Reflection
Throughout this process I carried many questions, hopes, and uncertainties. While I felt okay with the final result—especially given the balance of other coursework and unexpected life events—I see a brighter future for this concept. Watching people interact with it was exciting and informative, more so than with any digital design I’ve created.
Future improvements could include a more visually engaging interaction area (rather than bare light sensors poking through the table), a slightly larger scale to prevent one person from triggering two sensors, and interaction points that let users experiment with a wider variety of sounds, similar to a DJ drum pad.
Note to self: document things even when you don't feel proud of them!
Musings of Future Possibilities
It would be super neat if the sensor areas could be shaped like small mountains rising from the table, touch-responsive all the way around. For example, the base of a mountain could emit a deep sound when triggered, with a higher tone right at the peak.