Shape Shifting
This project was created for the course Physical Interaction Design at KTH Royal Institute of Technology in Stockholm, Sweden. The assignment was to create an interactive shapeshifter, which of course could be anything. Together with Christiaan Boersma, Max Meijer and Daiman Khan we created an interactive mesh that gives the experience of encountering some being or thing you have never seen before. It reacts to certain input and seems to communicate with movement and light animations, yet you don’t know what it means. While interacting you discover that it seems to feel your touch and reacts to it. The more input you give, the more excited the being becomes. Is it just a thing or is it really alive?
Modern Art
After some brainstorming we had a general idea: a structure of spatial forms that is flexible or movable. The shape would not only shift by literally rearranging the structure; the mental image of the thing could also shift through a sound or a movement. By changing a movement, for example, a different mental image could be projected onto the same shape. For inspiration we went to the Museum of Modern Art in Stockholm and looked for spatial forms that had the potential to move around, and thus to be both a thing and a being.
Triangle Mesh
We created a grid of triangles with flexible connections between them. The flexibility and limits of this combination of shapes bring a specific kind of movement, and the grid can only be rearranged in certain ways. If you place it on another object it covers it like a blanket, yet still arranges itself in its own way. The surface of the tiles can be used to implement all kinds of technology.
The mesh covers the corner of the table like a sheet and at the same time takes on its own shape because of the limited flexibility.
When the mesh was picked up and let go, it would end up in a random shape formed by the limited flexibility.
Interactive Mesh (video)
Sensing, Moving and Communicating
The triangle mesh was able to ‘feel’ the user’s touch and respond to it with movement and light animations. We decided to let the mesh only feel, not see (with, for example, a camera) or make sound (with a speaker); it could only move and show patterns of light. It would hang vertically instead of lying on the ground. This made it easier to let it move and gave a better experience, since the user would be ‘face to face’ with the mesh.
We created capacitive sensors out of aluminum foil. The Arduino charges the foil, creating an electric field. The conductivity of the hand changes the field, and with it the signal that returns to the Arduino. This way it is possible to sense a hand within a certain range, and even the position of the hand within that range.
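To give an impression of how such a foil sensor can be read, here is a minimal sketch using the commonly available CapacitiveSensor library. The pin numbers and sample count are placeholders, not necessarily the values we used.

```cpp
#include <CapacitiveSensor.h>

// Send pin 4 charges the foil through a high-value resistor;
// receive pin 2 is connected to the foil itself (placeholder pins).
CapacitiveSensor foilSensor = CapacitiveSensor(4, 2);

void setup() {
  Serial.begin(9600);
}

void loop() {
  // The returned value rises as a hand approaches the foil.
  long reading = foilSensor.capacitiveSensor(30);  // 30 samples per reading
  Serial.println(reading);
  delay(50);
}
```

The raw value is unitless and depends on the foil size and wiring, so in practice it has to be calibrated per panel.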
The foil triangles were placed behind the front panel and LED strip.
The light was created with programmable WS2812B RGB LED strips, in which each LED can be controlled individually. We used an Arduino with the Adafruit NeoPixel library to drive them. Here, the light turns red as the hand approaches.
Each triangle consisted of 9 LEDs, which were placed vertically between the aluminum sensor and a front panel made out of milky acrylic to create a diffused effect.
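A minimal sketch of how one triangle’s nine LEDs could be driven with the NeoPixel library is shown below; the data pin and the analogRead stand-in for the capacitive value are placeholder assumptions.

```cpp
#include <Adafruit_NeoPixel.h>

#define LED_PIN   6    // data pin of the strip (placeholder)
#define LED_COUNT 9    // nine LEDs per triangle

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();        // start with all LEDs off
}

void loop() {
  // Stand-in for the capacitive reading, scaled to 0..255.
  uint8_t proximity = map(analogRead(A0), 0, 1023, 0, 255);

  for (int i = 0; i < LED_COUNT; i++) {
    // Shift from blue toward red as the hand comes closer.
    strip.setPixelColor(i, strip.Color(proximity, 0, 255 - proximity));
  }
  strip.show();
  delay(20);
}
```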
We created movement with servos, controlled by an Arduino. Here, the servo rotates as the hand approaches the yellow wire.
The servos were attached to the back of a panel and connected with wires to the surrounding panels. Again, the limited flexibility created lifelike movements.
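Mapping the sensor value to a servo angle can be sketched roughly as follows; the pin, the angle range and the analogRead stand-in for the capacitive reading are placeholder assumptions.

```cpp
#include <Servo.h>

Servo panelServo;
const int SERVO_PIN = 9;   // placeholder pin

void setup() {
  panelServo.attach(SERVO_PIN);
}

void loop() {
  // Stand-in for the capacitive value: the closer the hand,
  // the further the servo pulls on the neighboring panels.
  int reading = analogRead(A0);
  int angle = map(reading, 0, 1023, 0, 120);
  panelServo.write(constrain(angle, 0, 120));
  delay(20);
}
```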
Scaling Up
When we had working prototypes we started on a full grid of 20 panels, which brought a lot of challenges with it. One was the memory and processing power of the Arduino, which were not enough to control all 20 panels. We used dedicated control modules for the sensors and servos that communicated with the Arduino over the I2C protocol. Even with the modules it was a challenge to keep the code as light as possible. Another challenge was the sensitivity of the sensors: placed close together and close to the wiring, their electric fields were influenced by all the surrounding conductive components.
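We won’t reproduce the exact module code here, but reading a value from such a module over I2C looks roughly like this; the address and the two-byte payload are assumptions for illustration.

```cpp
#include <Wire.h>

const uint8_t SENSOR_MODULE_ADDR = 0x20;  // placeholder I2C address

void setup() {
  Wire.begin();          // join the I2C bus as master
  Serial.begin(9600);
}

void loop() {
  // Ask the module for a 16-bit proximity value (two bytes).
  Wire.requestFrom(SENSOR_MODULE_ADDR, (uint8_t)2);
  if (Wire.available() >= 2) {
    uint16_t value = (Wire.read() << 8) | Wire.read();
    Serial.println(value);
  }
  delay(50);
}
```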
The triangles are made of plywood and acrylic; we used a laser cutter to create all the pieces. With all this material the mesh became quite heavy for the torque of the small servos, which made the final mesh less movable than the cardboard prototype.
We sanded the edges at an angle to preserve the flexibility of the mesh. Although the back plates were a lot thinner, we sanded their edges as well.
We glued the back plates onto a piece of cloth, which keeps the plates in place while staying flexible. The wires for the light and sensor go through the hole, and the front sides of the panels are attached to the other side of the cloth with screws.
The servos were attached to the back plate with screws. So many panels required a lot of wires, and there was a chance one could break. That is why we made the grid modular: the wires could be detached and reattached on the panel itself, so every panel could be swapped easily.
To increase the pulling range of the servos we attached circles on top. Bigger circles also meant less pulling force from the servos, so we experimented with different sizes.
Every other panel had a servo with a circle. Unlike in the prototype, we randomized how the servos were attached to the other panels, which created more variety in the movements.
All the LEDs and servos required a lot of power, around 2.5–3 W per triangle. With 20 panels that adds up to roughly 60 W, and since everything runs on 5 V we used a 12 A power supply to power the whole mesh. We placed the power supply together with the Arduino, modules and wiring on a wooden base. The cloth was attached to this base, so it also functioned as a hanging mechanism.
With all those components, cable management was not to be taken lightly.
Designing an Experience
When the mesh was finished we were not done yet. To create the experience we had in mind we needed to program the right behaviour. We experimented with light animations, sensitivity, moving speed, reaction time and movements. Although we did not intend for it to make sound, the rotating servos created a mechanical noise that complemented the liveliness of the mesh. In idle state, the mesh shows a colorful gradient that slowly shifts across the panels. When the user’s touch is sensed, the feedback consists of local movement and intensifying illumination around that specific area. The more input the user gives, the more excited the movements become and the faster the light animation shifts. For a better understanding of the experience you should watch the video above if you haven’t already.
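The behaviour can be summarized as a simple ‘excitement’ loop. The sketch below is an illustrative single-panel version with made-up constants, pins and thresholds, not the exact code that ran on the mesh.

```cpp
#include <Adafruit_NeoPixel.h>
#include <Servo.h>

Adafruit_NeoPixel strip(9, 6, NEO_GRB + NEO_KHZ800);  // placeholder data pin
Servo panelServo;

float excitement = 0.0;      // 0 = idle, 1 = fully excited
float gradientPhase = 0.0;   // position of the shifting idle gradient

void setup() {
  strip.begin();
  panelServo.attach(9);      // placeholder pin
}

void loop() {
  // Stand-in for "a hand is near this panel".
  bool touched = analogRead(A0) > 500;

  if (touched) {
    excitement += 0.05f;                 // input builds excitement
    if (excitement > 1.0f) excitement = 1.0f;
  }
  excitement *= 0.995f;                  // excitement slowly fades without input

  // The idle gradient shifts faster as excitement rises.
  gradientPhase += 0.5f + 4.0f * excitement;
  for (int i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, strip.ColorHSV((uint16_t)(gradientPhase * 256) + i * 4000));
  }
  strip.show();

  // Movements become more energetic with more excitement.
  int angle = 60 + (int)(60 * excitement * sin(millis() / 300.0));
  panelServo.write(angle);
  delay(20);
}
```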