An interface for natural interactions
Interface Design
Malte Fial & me
Research, Ideation, Concept, UI-Design, Sound-Design
study project, third semester
three weeks
2023
Aware UI is an innovative interface concept based on machine learning that enables natural interaction. The system combines touchscreen and voice control to create an interaction model that takes human gestures, speech, and non-verbal communication into account, providing a familiar and user-friendly experience.
The goal of the project was to offer a new way of communicating with an interface, one that comes closer to natural communication between people and is directly derived from it. Natural actions and reactions such as focusing on the other person, making eye contact, and listening should carry over into the interaction.
Aware UI uses voice control because conversation is a deeply human form of communication. Furthermore, users can move freely around the room during the interaction and use their hands for other activities.
In addition to the voice user interface (VUI), a touchscreen is used. The screen carries information that is difficult to convey via voice output, such as more complex lists and tables, maps, and videos. Depending on the user's position, the contents of the display are dynamically adjusted: the depiction varies in the amount of information and in size, so that the screen is always optimally utilized while remaining legible.
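A minimal sketch of how such distance-dependent scaling could work, assuming an estimated viewer distance in centimeters is already available (see the distance calculation further below); the breakpoints and layout modes are hypothetical, not values taken from the project:

type LayoutMode = 'near' | 'mid' | 'far';

// Hypothetical breakpoints; the project does not document exact values.
function layoutForDistance(distanceCm: number): LayoutMode {
  if (distanceCm < 100) return 'near'; // close: full detail, dense lists and tables, small type
  if (distanceCm < 250) return 'mid';  // medium: reduced detail, larger type
  return 'far';                        // far: headline-level content only, largest type
}

// Example: a viewer at 1.8 m gets the medium-density layout.
document.body.dataset.layout = layoutForDistance(180); // 'mid'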
To avoid a robotic-feeling interaction, Aware UI needed a character. The smart system is visualized as an animated circle on a dot grid. The animations are supported by sounds that mimic human non-verbal communication, bringing the circle to life. When Aware UI speaks, it uses a gender-neutral voice. The goal is to challenge the notion that a female voice is generally preferred for supportive tasks and a male voice for commanding tasks.
Aware UI was built with the ml5.js library, among others, and runs in the browser. It uses technologies such as Facemesh, object detection, and Teachable Machine to process webcam and microphone input and track the user. Once a person is detected in the room, the UI responds with visual feedback. The distance to the user is calculated from facial landmarks, the focal length of the webcam, and the pixel density. Speech input is processed through a ChatGPT connection, enabling context-sensitive responses.
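The distance calculation follows the classic pinhole-camera relationship: distance = (real-world width × focal length) / width in pixels. A minimal sketch in TypeScript, assuming two facial landmarks in pixel coordinates (for example the outer eye corners from Facemesh); the eye-corner span of roughly 9 cm and the focal length value are assumptions for illustration, not values documented in the project:

// Pinhole-camera distance estimate from two facial landmarks.
// Assumed values (not documented in the project): an average outer
// eye-corner span of ~9 cm and a focal length calibrated once
// against a face at a known distance.

type Point = { x: number; y: number };

const REAL_EYE_SPAN_CM = 9;    // assumed average outer-eye-corner distance
const FOCAL_LENGTH_PX = 950;   // assumed; calibrate per webcam

function estimateDistanceCm(leftEye: Point, rightEye: Point): number {
  const spanPx = Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
  // distance = (real-world width * focal length) / width in pixels
  return (REAL_EYE_SPAN_CM * FOCAL_LENGTH_PX) / spanPx;
}

// One-time calibration at a known distance d (in cm):
// FOCAL_LENGTH_PX = (spanPx * d) / REAL_EYE_SPAN_CM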
In the smart home scenario, Aware UI can be placed in the kitchen and allows the user to start music, get help with shopping, and control other smart home devices.
Aware UI is an innovative concept that enables natural and human-like interaction with digital systems. Using machine learning and a combination of voice and touchscreen interactions, it provides a user-friendly experience that adapts to the user's needs and context.
Redesign of a finance app for sharing expenses
Application Design
André Jacoby, Sarah Fütterling & me
Research, Concept, UX & UI Design
study project, third semester
four weeks
2022
With over 10 million downloads, Splitwise is a popular app for shared flats or travel groups to keep track of money spent together and calculate the corresponding shares of the group members. However, users often get lost in the abundance of features and have difficulty navigating its core functions.
All team members had used the app and found it very useful, but the user experience was at times frustrating, and the UI design, from our perspective, did not meet professional standards. We therefore decided to undertake a fundamental redesign of the app.