Sensehouse

 
mockgood.jpg
 

Sensehouse is an installation that functions as both a bus stop shelter and a touch-based instrument; users touch the surface and receive immediate auditory feedback corresponding to where they touch. The more users actively participate, the richer the music becomes. This project aims to bring more delight, opportunity for play, and place-making into the public sphere.

 

Context

This project ran from fall 2018 to spring 2019 and was selected for presentation to Translink BC’s 2019 open call commission.

Responsibilities

Primary and secondary research, concept lead and ideation, sketching, storyboarding, inclusive persona creation, user testing and feedback, UX and UI design, prototyping, coding, set and sound design.

 

 
 

Problem space

I began by looking for a way to bridge the divide between less technologically fluent users and technology, but soon began to examine how we interact with each other and our surroundings in public spaces, specifically transit stops in Vancouver and how they act as places of gathering.

As technology evolves, we see a shift in our emotional and social behaviours, and subsequently a heavier investment of self across various online platforms. With this shift, we witness an increase in depression, anxiety, sleep issues, and social isolation in certain users (Barr, independent.co.uk).

Design goals

Sensehouse aims to use music as a means to help facilitate connection and conversation, fostering new ways in which we inhabit public space and interact with each other. It uses music and design as an intervention, encouraging creativity and spontaneity while breaking the monotony of repetitive infrastructure.

 

 
 

 
 

Design research

I began by researching bus shelters, public installations, and certain instruments: what their functions are and what advantages certain designs hold over others. I also researched the benefits of expression through music on both an individual and a collective level, then looked at precedents and tools that might help bring this design to life.

This design solution offers individuals common ground to connect and converse through more than just conventional forms of expression, using music as an accelerant for social cohesion, altruism, and place-making.

 
 

 

Design process

I conducted nine rounds of user testing, each with a different prototype, steadily narrowing down my design and trimming what was deemed unnecessary as per user feedback.

By presenting users with an instrument that requires no prerequisite musical knowledge and is activated by touch, the design levels the playing field and facilitates an easy, enticing onboarding for those who aren’t as creatively inclined. As an interaction designer and a maker, I opted for a physical interaction rather than something web- or screen-based. This makes our connections real and draws us to the present.

mat.jpg
 
 

I used paper, clay, and cardboard to make possible structural models; while they gave me a few ideas as to how the structure could affect the way the instrument is played, they also made me question whether a change of form was absolutely necessary for the design implementation.

I was particularly drawn to the geodesic dome structure built from cardboard; it made me think about how the shape of the structure could influence the way it was played. This insight led me to draft a schematic in which a wall or panel would have a pickup built into it.

 
 

The original concept, built around a guitar pickup, would have been costly and ineffective for this design, so I opted to build a piezo pickup, or contact microphone, using wire, a piezo speaker, a quarter-inch input, and a soldering kit.

While a piezo speaker usually emits sound, if the polarity of the solder is switched, it becomes a microphone, specifically a contact microphone. The speaker need only make contact with an object and that object immediately becomes amplified; because the signal follows a direct path, the sound can be intercepted and modulated.

In this iteration, I taped the contact mic to a piece of acrylic similar to the local bus stop structures in use now. I then ran the microphone’s signal through a delay modulation pedal and on to the amplifier. The delay was used to create a more interesting effect for the user to play with, rather than just an amplified plastic cutout. I taped the scenario above the installation and let users have a go.

 
 

 
pd.jpg

Learning new languages

Through this design process I learned programs such as TouchDesigner, Pure Data, and MaxMSP in order to elicit an audio-reactive visual response, and I paired these programs with Ableton so that I could have more control over the sounds produced.

This design had originally been intended to have an audio-reactive visual response involving a screen and motion graphics, but this was ultimately deemed unnecessary. If this interaction was to be about connecting people, it had to do just that without the common distraction of a screen; the design became static instead of reactive, leaving space for users to breathe and contemplate.

 
 

Makey Makey is a controller that lets the user connect it to any conductive object and turn that object into a functional trigger. In this case I drew nodes on paper using graphite and connected alligator clips to the ends, then connected the Makey Makey to Ableton. This turned the paper into an operational MIDI keyboard.
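In essence, that Makey Makey-to-Ableton link boils down to mapping touch events to MIDI notes. A minimal sketch of the idea follows; the key-to-pitch mapping and the helper function are hypothetical illustrations, not the project’s actual configuration.

```python
# Illustrative only: turning Makey Makey touch events into raw MIDI
# note-on messages, the way a DAW such as Ableton would receive them.
# The pitch assignments below are assumptions for this sketch.

# Makey Makey's default outputs are the arrow keys, space, and click.
KEY_TO_NOTE = {
    "up": 60,      # C4
    "down": 62,    # D4
    "left": 64,    # E4
    "right": 67,   # G4
    "space": 69,   # A4
    "click": 72,   # C5
}

def note_on(key: str, velocity: int = 100, channel: int = 0) -> bytes:
    """Build a raw 3-byte MIDI note-on message for a touched node."""
    note = KEY_TO_NOTE[key]
    status = 0x90 | (channel & 0x0F)  # 0x90 = note-on; low nibble = channel
    return bytes([status, note, velocity])

print(note_on("up").hex())  # -> "903c64"
```

Each graphite node effectively becomes one entry in that mapping, which is why adding players adds notes.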

I gave users my standard scenario and encouraged them to play with my model. I ran into a few problems in the beginning, including how to operate the controller and the ground at the same time, but that was easily solved and actually added to my insights greatly. One such insight was that participants are more than willing to make contact with others for the sake of playing with this design.

 
 

MaxMSP is a program that comes with Ableton 10; it can be used to create new sounds based on schematic-like grids as well as to program audio-reactive visuals, and in that sense it is extremely similar to Pure Data. In this particular iteration I came across a pre-coded schematic that allows the user to manipulate the parameters of a chosen shape. This was a good beginning for reactive visuals, but I couldn’t get the program to connect to any sort of audio input.

I then set up communication between Ableton and TouchDesigner: Ableton produces sound while sending a signal to TouchDesigner, which processes the signal and can produce live audio-reactive visuals.
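Conceptually, that hand-off reduces each block of incoming audio to a control value that drives a visual parameter. The sketch below illustrates the idea in plain Python; the function names and the scale range are assumptions, not the actual Ableton/TouchDesigner patch.

```python
# Illustrative sketch of an audio-to-visual link: reduce an audio block
# to an RMS level, then map that level onto a visual parameter (here, a
# scale factor). Both helpers are hypothetical names for this example.
import math

def rms_level(samples):
    """Root-mean-square level of one block of audio samples in -1..1."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_to_scale(level, min_scale=1.0, max_scale=3.0):
    """Map an audio level (0..1) onto a visual scale factor."""
    level = max(0.0, min(1.0, level))  # clamp before mapping
    return min_scale + (max_scale - min_scale) * level

block = [0.0, 0.5, -0.5, 0.5]  # a tiny fake audio block
print(level_to_scale(rms_level(block)))
```

Louder playing yields a higher RMS value and therefore a larger (or brighter, faster, etc.) visual response, which is the core of any audio-reactive patch.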

During my trials and tribulations learning these various programs, I found a bit of a shortcut to what I needed: a Max plugin created by Synnack called “Ganz graf mod X”, which let me run it in a MIDI chain so that it receives any input from a user and translates the sound produced into a visual in real time. Since this was what I was looking for, I began to user test.

 
 

For this iteration I used the Makey Makey and connected it to Ableton through my laptop. I then hid my laptop behind a larger screen and arranged the space to be as reminiscent of a bus stop as possible. The users were given the scenario, told to hold the ground, and left to their own devices.

 

 
 
novaliafocus.jpg

Less is more

After considering the user feedback I received, I moved forward with a stronger graphic approach rather than a screen-based one. At first I illustrated instructions showing how the user could interact with a Novalia ink board and observed how my participants interpreted what they saw. Upon receiving mixed feedback, specifically about the clarity of instructions from that round, I chose to make the graphic element serve as both the instructions and the sound triggers in the next iteration.

 
 

For this iteration, I used a Novalia conductive ink board paired via Bluetooth to my iPad. When the surface of the board is touched and there is an ink sensor beneath the finger, a signal is sent to the iPad and a sound from the app is triggered; no ground wire needs to be held to complete the circuit. This iteration came closest to the ideal interaction, including street ambience from a speaker. The user was given the scenario, observed, and then questioned afterwards. More often than not, without my prompting, groups would interact with the prototype together.

On the recommendation of Peter Bussigel, and after some research, I purchased a Teensy++ 2.0 board. The small golden holes on the board are potential input signals. I soldered header pins to the board and attached it to a breadboard to get more use out of the unit. I then found code on GitHub that allows the board to take input from each pin and convert it into a MIDI signal; this would prove to work perfectly with Ableton.
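I won’t reproduce that firmware here, but its core loop can be sketched roughly as follows. This Python version is illustrative only: the real code runs on the Teensy in C, and the pin-to-note offset below is an assumption.

```python
# Illustrative sketch of pin-scanning firmware logic: compare the
# previous and current pin states, and emit a MIDI note-on when a node
# is touched and a note-off when it is released.

BASE_NOTE = 60  # pin 0 -> middle C; this offset is an assumption

def scan(prev_states, new_states):
    """Compare two pin-state snapshots; return the MIDI events to send."""
    messages = []
    for pin, (was, now) in enumerate(zip(prev_states, new_states)):
        if now and not was:            # finger touched the painted node
            messages.append(("note_on", BASE_NOTE + pin))
        elif was and not now:          # finger lifted off the node
            messages.append(("note_off", BASE_NOTE + pin))
    return messages

# Pin 0 goes high, pin 2 goes low:
print(scan([0, 0, 1], [1, 0, 0]))  # -> [('note_on', 60), ('note_off', 62)]
```

On the hardware, this loop runs continuously and the messages go out over the Teensy’s USB MIDI interface, which is why Ableton sees each conductive node as a key.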

 
 

Since I knew graphite worked as a conductive material, I found a tutorial on how to make conductive paint by mixing acrylic and graphite powder together. The mixture had to be diluted with water in order to spread well, as the graphite powder soaked up all the moisture in the acrylic, turning it into more of a paste.

I soldered each nail point to a wire that ran longer than the length of the 2x4 ft. MDF panel so that the Teensy and breadboard could have enough space to be hidden from the user.

This iteration was run and briefly tested during my final presentation at the end of the term. Despite some issues I had run into previously, it worked perfectly, and I was able to gather a little user feedback between and after presentations.

 
 

Ideally, this design would occupy a corner spot in one of the many gallery spaces at ECUAD for exhibition, and then finally be implemented outside in a public setting.

This is the ninth prototype, installed in the Libby Leshgold gallery at Emily Carr University. The design was exhibited for the two-week duration of the 2019 grad show, which I not only participated in but also curated.

 
 

I would like to acknowledge and thank Haig Armen, Ryan Betts, Tim Rolls, Amanda Huynh, Peter Bussigel, Eugenia Bertulis, Benjamin Unterman, Bob Werner, Bobbi Kozinuk, and the entire ECUAD INTD 2019 cohort for helping me in every aspect of my project, from user testing to mentorship. Your input was invaluable and broadened my insights and views tremendously.

 
 

 

Get in touch.

For any inquiries, send me a message and I will get back to you as soon as possible.

bottom contact.jpg