Monday, January 20, 2014

User Tests

Last week we ran two user tests to check for potential issues. After a short introduction to our re:aktion application, the user had to "discover" the functions on his own. We really want to make sure that the application is as intuitive as possible; a quick learning process is key to grabbing the user's attention. In this post we present our findings and the changes we applied.

First, a short summary of the movement triggers implemented at the beginning of the user tests:
  • Height of the right hand (continuous) - cutoff frequency of the filter
  • Height of the right hand (discrete) - switching the filter between lowpass / highpass / off
  • Height of the left hand - delay amount
  • Proximity to the display - volume
  • Horizontal position - panning
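To make the discrete trigger concrete, the filter switching can be sketched as a simple threshold over the normalized hand height. This is an illustrative Python sketch; the thresholds, the function name and the coordinate convention (larger value = higher hand) are our own assumptions, not the exact values used in re:aktion.

```python
def filter_mode(hand_y, head_y, hip_y):
    """Map the right hand's height to a discrete filter mode.

    All heights use the same convention: larger value = higher up.
    The 1/3 and 2/3 thresholds are illustrative, not tuned values.
    """
    span = head_y - hip_y
    if span <= 0:
        return "off"  # degenerate skeleton reading, fail safe
    t = (hand_y - hip_y) / span  # 0.0 at hip height, 1.0 at head height
    if t > 0.66:
        return "highpass"
    if t > 0.33:
        return "lowpass"
    return "off"
```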

First test

The user was impressed by the tracking of his body shape. The recognition of overlapping body parts worked well and was appreciated. The user realised in no time that his right hand applies a filter. Panning and volume changes were identified after a few tries. The user tried clapping, which of course had no effect. The left hand was not used to change the delay settings; it appears the changes were too subtle to be clearly audible.

Fixes: The delay effect has been amplified and a flanger effect has been added.

Second test

The second user was more active, though he didn't identify the nature of the effects. He realised which movements seemed to trigger an effect and was able to trigger them on purpose. Some uncertainty remained about a few movements: the user assumed they had an impact they actually had not. He was not sure whether moving both arms at the same time would amplify the effect of the filter. The user also tried jumping and moving his legs. In a multi-user setting, major problems arose in identifying which user was directing the music.

Fixes: The user covering the largest area is now "in charge". This avoids unintentional switching of the user focus when someone passes by in the background. Furthermore, the brightness of the point cloud displaying the body shape has been adjusted to give better feedback.
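A minimal sketch of this selection, assuming an OpenNI-style per-pixel user map (0 = background, positive labels = tracked users); the function name and data layout are illustrative assumptions, not the project code:

```python
from collections import Counter

def user_in_charge(user_map):
    """Pick the user covering the largest area of the depth image.

    `user_map` is a flat sequence of per-pixel user labels, where
    0 means background. Returns None if nobody is being tracked.
    """
    counts = Counter(label for label in user_map if label != 0)
    if not counts:
        return None  # nobody in front of the camera
    # most_common(1) gives [(label, pixel_count)] for the biggest user
    return counts.most_common(1)[0][0]
```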

Saturday, January 11, 2014

Software Implementation

Initial Implementation

The initial approach was the following:
The data stream of the Kinect was interpreted by a C# program built on the official Kinect SDK. Gestures were to be mapped directly to their corresponding feature and sent out via OSC (implemented using the Bespoke OSC Library). The OSC data was then sent over an Ethernet connection to a second computer running Ableton Live, where all sound generation and manipulation would happen. The OSC stream was received and mapped through the Max4Live plugin Livegrabber.
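As an aside, the wire format of such a message is simple. The following Python sketch encodes a single-float OSC message by hand: the address padded to a four-byte boundary, a ",f" type tag, then a big-endian 32-bit float. It only illustrates the OSC 1.0 format; the actual project used the Bespoke OSC Library, and the "/volume" address in the test is a made-up example.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying a single float argument."""
    return (osc_pad(address.encode("ascii"))   # e.g. b"/volume\x00"
            + osc_pad(b",f")                   # type tag: one float
            + struct.pack(">f", value))        # big-endian float32
```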

This rather complicated setup was chosen both for overall performance and for our prior experience with Ableton Live and the .NET/C# environment. After many unsuccessful attempts at synchronising the many components, this configuration was deemed too complex and was abandoned in favour of the setup described in the following paragraph.


The final setup was surprisingly simple: Processing was used both to read and interpret the Kinect data and to play and modify the music accordingly. The Simple OpenNI Library was used for the first part and Minim for the second.
Performance is good enough to guarantee smooth image drawing and sound manipulation.


The mapping is still a work-in-progress:
  • Overall volume is controlled by the user's distance.
  • Cutoff frequency of a low-pass filter is mapped to the relative height of the right hand.
  • Delay volume is mapped to the relative height of the left hand.
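Each of these continuous mappings boils down to a clamped range conversion. Below is a hedged Python sketch; the numeric ranges and function names are illustrative assumptions, not the tuned values from the project.

```python
def clamp01(t):
    """Clamp a value to the [0, 1] interval."""
    return max(0.0, min(1.0, t))

def map_range(value, lo, hi, out_lo, out_hi):
    """Linearly map `value` from [lo, hi] to [out_lo, out_hi], clamped."""
    t = clamp01((value - lo) / (hi - lo))
    return out_lo + t * (out_hi - out_lo)

def volume_from_distance(dist_mm):
    # Closer to the camera = louder; beyond 4 m the sound fades out.
    # The 1 m..4 m range is an illustrative guess.
    return map_range(dist_mm, 4000, 1000, 0.0, 1.0)

def cutoff_from_hand(hand_height):  # 0.0 = hip level, 1.0 = head level
    # Exponential sweep from 200 Hz to 10 kHz (illustrative range).
    return 200.0 * (10000.0 / 200.0) ** clamp01(hand_height)
```

An exponential sweep is used for the cutoff because equal hand movements then produce roughly equal perceived changes in brightness, which tends to feel more natural than a linear sweep in Hz.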

For complexity reasons, it was decided to first implement full functionality for a single user before introducing multi-user interaction. If multiple users are detected, only the last one has an active role.

Visual Interface

As planned, the visual interface is implemented as a greyscale point cloud that draws the depth map of the camera and differentiates multiple users by colour. If no player is present, a MOVE! message is displayed. Below, a screenshot of the visual interface during the testing phase can be seen. Please note that the representation is much better in full screen.


Still on the to-do list:
  • Smoothing of big parameter jumps (probably simple interpolation)
  • Adding more than one musical pattern
  • Proper multi-user support
  • Implement special gestures (clapping, etc.)
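For the smoothing item, one simple option is exponential interpolation toward each new sensor reading. The class below is a hypothetical Python sketch of that idea, not the final implementation:

```python
class Smoother:
    """Exponential smoothing to soften large parameter jumps.

    `alpha` near 0 gives heavy smoothing, near 1 almost raw values.
    """
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None

    def update(self, target):
        if self.value is None:
            self.value = target  # jump straight to the first reading
        else:
            # Move a fixed fraction of the remaining distance each frame.
            self.value += self.alpha * (target - self.value)
        return self.value
```

Called once per frame with the raw sensor value, this turns a sudden jump into a short glide, which avoids audible clicks in the mapped parameters.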

Sunday, December 8, 2013

Testing of paper prototype

Initial User Tests

This week we tested our newly built paper prototype on three users. We gave each of them a short introduction and asked them to complete the following three tasks:

Task 1: Changing Volume

First of all we wanted to know how they would change the volume of the music or of their own part. All of them unanimously told us they would like to control it by changing their distance to the screen: moving away from the screen should lower the volume.

Task 2: Trigger special effects

The second task was to think of gestures or poses to trigger special effects. Our team had initially thought of triggering some effects by crouching down, but this was not a gesture any of the users would have tried by themselves, because it is quite an unnatural movement while dancing. They instead suggested gestures like touching their own hands, moving the head and turning their back to the screen. To initiate a crouching gesture, a visual hint was proposed.

Task 3: Pass instruments between users

Last but not least, we asked our users to pass the different instruments to each other. Every user intuitively tried this by touching the other player's hand. This confirms that our initial design decision was good practice.

Additional feedback

Some users pointed out that we have to be careful with gestures that require holding an arm above the head for an extended amount of time, because this gets exhausting pretty fast.
Furthermore, two users would prefer the first person to have full control over all instruments simultaneously until a second person joins the game.


Almost all of the suggestions concern the gestures and movements that control the music. As we already learned in our interviews, they should be easy to incorporate into dancing, so we shouldn't assume that positions such as crouching down will be tried without additional hints. We therefore introduced a new visual element that invites the user to try new and unusual gestures, as in the following picture.

Furthermore, additional gestures like moving the head or turning the body away from the camera should be implemented to trigger special effects. These are being ideated now.

To address the additional feedback, we will design the controls in such a way that a single user can control every aspect alone, as long as no other player has entered the game. Additionally, arm fatigue is a concern when implementing the final gestures.

Task Breakdown

Ideation: Full team
User tests and taking pictures: Chris
Blogpost: Alex, Chris

The following is a picture of our testing process. Please note that it also incorporates simple audio feedback to emulate the system's response to the user.

Monday, December 2, 2013

Paper Prototype

We prepared a paper prototype to check the basic interaction with re:aktion. It helps us observe difficulties in interacting with the system at an early stage, with low cost and effort. We looked into three scenarios representing the very basic features our system should provide.

List of tasks

Catching the user's attention

Abstract visuals and the invitation to "move" should attract people and get their attention. A simple music pattern is already playing, reaching people who can't actually see the display.

As the user comes closer, he sees himself as a kind of silhouette mirroring his movements.

When the user begins to move, he is probably triggering some implemented change of the music pattern. He becomes aware of the influence he has on the music... and it is on!

Multi user interaction

With re:aktion, you are in control of the music you're dancing to. The height of your right hand sets the complexity level of the instrument assigned to you.
A higher hand position will trigger a more complex pattern, while a lower hand position will reduce the complexity.
As visual feedback, abstract forms will be displayed around the hand of the silhouette. The variation of the forms is correlated with the complexity of the pattern.

The left hand and the body position are responsible for filtering the signal and other effects.

Changing instruments

To change instruments just touch the hands of another user.

The visuals have different colours for each instrument. After the exchange, the colour of the visuals will of course change accordingly.

Task breakdown

Brainstorm & UX design: Full team
Cutouts: Eduard
Photo edit: Benoit, Chris
Blog update: Xavier, Chris

Sunday, November 24, 2013

Storyboarding and Interview


In order to gather the expectations, preferences and opinions of our target audience, we conducted interviews with different people corresponding to our target group. The overall interview plan may be downloaded here. Even if the plan was of great help, the interviewers were encouraged to adapt their questions to the responses and reactions of the interviewee.

Three people were interviewed:
- A 20-year-old female trainee who often listens to music and goes to festivals.
- A 24-year-old male tech house DJ and producer who likes going to clubs and festivals.
- A 24-year-old male student who listens to techno, produces music and plays violin and guitar.


During the interviews, the following two storyboards were presented. They deliberately don't show the precise interaction with the application envisioned by our team, in order not to influence the interviewed person, so that the answers gathered may be as diverse and creative as possible.

Interview findings

The interviewed persons were really enthusiastic about the application and shared their expectations as well as what would keep them from using it.

People are really enthusiastic about the application idea and are definitely willing to try it. They also say they would use such an app not only at a festival but also in different contexts, such as an art gallery or even a private party. However, the main focus is to have fun; such an app would have too many limitations to allow creating music in a professional way.

In this context, the app needs to grab the potential user's attention within a few seconds as they walk by. The average session would be short to medium (5 to 20 minutes). Thus, the set-up needs to be straightforward and allow having fun immediately.
To reduce complexity, the tempo and the overall scale should be fixed. The application should not allow single notes to be entered but rather allow the selection of predefined melodic phrases.

Furthermore, the subjects expect to use the application as a medium either to share the interactive act of creating with friends or even to meet new people. Hence, the app should focus on cooperation and make people dance together at a common pace. In addition, as dancing in front of friends or strangers may be embarrassing, the app should not include awkward gestures and should maybe even reward the users.

Most important for all of the interviewees was that the gestures to change the music don't interfere with dancing. They have to be both intuitive and fluid.
For the targeted music, the exact instrument selection matters less to the interviewees than the overall sound and rhythm. Thus, players shouldn't be assigned an instrument but rather a general part of the song. The interviewees also gave us many gesture ideas, such as:

  • Distance from screen changes volume of assigned sound
  • Position of left hand changes melodic elements
  • Position of right hand changes filter cutoff
  • Special poses invoke special effects (such as crouching for reverb)

In order to allow a more intuitive interaction, the app should show abstract visualizations instead of just displaying the players' image or a fully developed graphical interface.

Empathy map

These interviews allowed us to determine the following empathy map:

Design principles

Thanks to their responses, we could determine multiple design principles which should be followed in order to create an application that corresponds to the users' expectations and provides the best user experience possible:
  • Quickly capturing the user's attention is crucial, as walking by might only give a time window of a few seconds.
  • Keep it simple, because complex mechanics will discourage inexperienced users.
  • Changing the sound must not interrupt the dance flow.

Challenges ahead

These are the challenges we will have to face to create an app that corresponds to the users' needs.
  • Designing and implementing intuitive gestures.
  • Creating musically suitable melodic elements.
  • Implementing fun musical modifications.
  • Creating a visually exciting experience.

Presentation of our results

Tasks repartition

The tasks were distributed as follows:
- Storyboard brainstorming: Xavier, Eduard, Alexander, Benoît
- Storyboard drawing: Benoît
- Interview questions: Chris, Eduard, Xavier
- Interviews: Chris, Alex
- Empathy Map: Xavier
- Design principles: Chris, Eduard, Xavier, Alexander, Benoit
- Presentation: Eduard, Chris
- Blog post: Benoît, Chris

Wednesday, November 20, 2013

We have a new name!

After carefully refining our expectations and goals, we decided it was time for a name change. The main reason for letting go of the playful reference 'kollektiv hero' was a need for distinction and maturity. We therefore proudly present the new name of our project:


Our new name is created by dividing the common word 'reaction' into its integral parts and fusing the two concepts back together:
The global prefix 're' indicates the nature of responding to a stimulus, as in re-sponse, re-ply or re-verb.
The germanized 'aktion' both indicates the physical involvement of the participants and at the same time helps to differentiate the project.

Monday, November 11, 2013

Project idea

We are proud to present the idea of

!Kollektiv Hero!

You know Guitar Hero? Wait for Kollektiv Hero!

People step into the area in front of the screen and are assigned a certain role in a musical piece. Moving your hands will change the particular sound and sequence of "your" instrument, so dancing in front of the screen will allow the user to create original music. Everything is synchronized to the same tempo, so people will start moving around and "dancing" to the same rhythm.
Specific roles will be assigned to each player according to their instrument. The drums/percussion player might enter patterns in 3D space by walking around and jumping at certain positions. The other roles include rhythmic and melodic elements.

Interaction mind map:

Targeted users and location

The location would be either a festival or an electronic arts exhibition; the targeted users are young, dynamic, interested in arts and music, and have at least minimal knowledge of this domain.


Environment and users mind map:

Technical requirements:

Project name:

The following names were proposed:

Kollektiv Hero
Music Playground
Instrument Playground
Jamm up!
Jamm out

Since the application idea reminded some of us of the game "Guitar Hero", we thought it would be nice if people could already get an idea of the game just by hearing the name. That's why it is called Hero.

The word Kollektiv is German for collective, which can also refer to a group of people. Maybe you also know the music group Kollektiv Turmstrasse? It is also a hidden hint that our application deals with music.