An interactive exhibit exploring the sensory development of prenatal life
Exhibition Design
Carla Schwarze & me
Research, Ideation, Concept, Prototyping
study project, second semester
four weeks
2022
Never again are we as safe and protected as in the womb. Everyone has experienced the prenatal feeling of closeness, warmth and containment, but no one is able to recall it. In Utero is an interactive installation that aims to mimic this feeling and convey information about the sensory development of the unborn child during pregnancy through audio and light installations.
The exhibit is intended for use in exhibitions, museums, or fairs. Our goal is to enrich its informational layer with an emotional dimension and thus create a unique experience. A central objective was to avoid touchscreens and instead make the key senses of sight, hearing, and touch directly experienceable.
To convey the topic of pregnancy more vividly, we decided to immerse visitors in the first-person perspective of an unborn child. The primary focus is on conveying the feeling of being secure in the mother's womb. Alongside this emotional experience, we embedded educational content on prenatal sensory development directly into the womb experience itself.
We decided against a purely technical VR application and sought a solution that could be experienced sensually and physically. Initial ideas of constructing a capsule similar to a sensory deprivation tank were discarded. Instead, we chose a suspended yoga hammock that envelops the body, creating a sense of concealment without feeling constricting, while still allowing gentle swaying. Embedding technical components in the hammock enabled us to convey information as a direct, sensory-rich experience through a light and sound installation.
Through headphones, the visitor is introduced to a narrating voice embodying the mother, which provides linear guidance through the stages of pregnancy. Addressing the visitor directly creates a special connection. We decided to focus on auditory and visual sensory development. The story is divided into nine sections corresponding to the months of pregnancy. For the storytelling style, we opted for a loving, personal tone and kept the information density moderate to avoid overwhelming the listener.
The recorded voices were embedded into a soundscape that dynamically evolves with the developmental stage of the embryo. Although the embryo cannot hear anything in the early weeks, the soundscape begins with an inviting, warm sound that resembles "nothingness". As development progresses, new sounds are added gradually: first fluid sounds, then the mother's heartbeat, voices muffled through the abdominal wall, laughter, and other external noises such as music.
Without a visual user interface, interaction design posed a significant challenge. Visitors interact with circular touch zones embedded in the fabric, which become visible only when they are active: a ring-shaped LED array gently pulses up and down, drawing attention to the correct spot, while the ring's diameter marks the boundary of the touch zone. In the center of each ring sits a stretchable, electrically conductive fabric that measures capacitance. When a visitor touches the inside of the light ring, the ring first lights up at maximum brightness to confirm that the touch has been registered; the story then continues, the soundscape adjusts, and the corresponding light effects are triggered via LED strips and room lighting. The touch serves as an analogy to the movement of the baby in the womb, which occasionally presses and kicks against the abdominal wall.
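The touch-and-feedback loop described above can be sketched in a few lines. This is only an illustration of the logic, not the installation's actual firmware; the threshold value, pulse period, and function names are assumptions:

```python
import math

# Illustrative capacitance threshold (arbitrary units); the real
# value would be calibrated against the conductive fabric.
TOUCH_THRESHOLD = 40

def is_touched(capacitance_reading):
    """A touch raises the measured capacitance of the fabric patch."""
    return capacitance_reading > TOUCH_THRESHOLD

def pulse_brightness(t, period=3.0, max_brightness=255):
    """Gently fade the LED ring up and down while waiting for a touch.

    Uses a raised-cosine curve so the brightness eases in and out
    rather than blinking on and off.
    """
    phase = (1 - math.cos(2 * math.pi * t / period)) / 2  # 0.0 .. 1.0
    return int(phase * max_brightness)

def confirmation_brightness():
    """On a registered touch, jump to full brightness as feedback."""
    return 255
```

In a real sketch this loop would run on the microcontroller, reading the capacitive sensor each cycle and writing the computed brightness to the LED ring.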
The technical components are controlled by an Arduino microcontroller, which is connected to a computer. To control the sound, the Arduino communicates via a MIDI interface with a sequencer on the computer. All sound files are stored in the sequencer and can be played independently. This allows for dynamic changes in the soundscape and adaptation to the desired situation.
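At the byte level, triggering a sound file over MIDI is simple. As a rough illustration of the principle (the note numbers, channel, and mapping to clips are hypothetical, not taken from our actual setup), a note-on message is three bytes:

```python
def note_on(note, velocity=127, channel=0):
    """Build a raw 3-byte MIDI note-on message.

    0x90 is the note-on status byte for channel 1; the low nibble
    selects the channel, and note/velocity are 7-bit values.
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build the matching note-off message (status byte 0x80)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# e.g. trigger whichever sound the sequencer maps to middle C (note 60)
msg = note_on(60)
```

Each sound file in the sequencer can then be mapped to a different note number, so the microcontroller can start and stop layers of the soundscape independently.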
In Utero is an experimental project focused primarily on the emotional experience. Developing an interface without conventional inputs and outputs such as buttons and screens was an exciting challenge, and our experimentation with fabric and sensor technology led to a genuinely unconventional solution.
A randomized sequencer that converts an audio source into a melody.
Interface Design
Jannik Aßfalg & me
Research, Ideation, Concept, Prototyping, User-Testing, Product Design, Rendering
study project, second semester
four weeks
2022
Many music producers struggle with writer's block and are constantly looking for new ways to stay inspired. Ran.Seq helps producers find new ideas by quickly offering them a variety of random melodies generated from environmental sounds.
Ran.Seq has an integrated microphone that continuously captures sounds from the environment and displays their waveforms on a screen. To generate a melody, the recording is stopped and notes are derived from the amplitudes of the captured waveform. The results sound organic and almost random. Melodies can be generated rapidly and iteratively until a suitable sequence of tones emerges, providing a starting point for further refinement.
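The amplitude-to-melody idea can be sketched as a simple mapping. This is a minimal illustration under assumed details (the scale, the normalized 0.0–1.0 amplitude range, and the function name are ours, not Ran.Seq's actual implementation):

```python
# A pentatonic scale keeps almost any random sequence sounding musical.
C_MINOR_PENTATONIC = [60, 63, 65, 67, 70]  # MIDI note numbers

def amplitudes_to_melody(amplitudes, scale=C_MINOR_PENTATONIC):
    """Map each sampled amplitude (0.0-1.0) onto a note of the scale,
    so louder moments in the recording become higher pitches."""
    notes = []
    for a in amplitudes:
        index = min(int(a * len(scale)), len(scale) - 1)
        notes.append(scale[index])
    return notes

# e.g. five amplitude samples from a captured waveform
melody = amplitudes_to_melody([0.1, 0.9, 0.5, 0.3, 0.7])
```

Because the input amplitudes come from ever-changing environmental sound, each stopped recording yields a different, organic-feeling sequence.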