
SOUL AI
Overview
Soul AI is a 24-hour designathon project where our team explored how AI could respond with emotional context rather than purely functional output. Built under tight time constraints, the concept focused on designing an AI experience that adapts to user tone, intent, and mental state.
The result was a multi-screen interactive prototype demonstrating emotionally aware responses and adaptive UI behavior. The project was presented to an industry jury and secured Runner-Up for its human-centric approach and clarity of execution.
Categories
AI
Website / Desktop Application
Date
Oct 29, 2025
Problem Statement

In a world full of apps, technology still struggles to feel truly personal. People use AI assistants and recommendation systems every day, but most of them treat users as data points, not as individuals with emotional context, intentions, and evolving needs.
The question we asked ourselves was simple:
How might we make digital experiences feel emotionally intelligent without being intrusive?
How can AI feel like it understands you, not just predicts you?
That became the core problem we set out to address in the 24-hour designathon.
Solution at a Glance

Soul AI is a digital experience that tailors AI responses to your emotional context and intention, not just task triggers.
Instead of generic replies or static suggestions, Soul AI gauges tone, mood, and goals to deliver responses that feel human, supportive, relevant, and contextually aware.
We built a concept prototype in 24 hours and presented it to the judges, ultimately finishing as Runner-Up, not because it was perfect, but because it felt alive.
How It Started

We entered the 24-hour designathon with a few common frustrations:
Voice assistants that feel robotic.
Recommendations that ignore why you’re doing something.
Apps that treat all users the same.
My teammates and I, spread across UX, research, and content, knew we had to lean into something that felt alive, not mechanical.
After an initial brainstorm, someone said:
“What if AI could understand not just what I want, but how I feel when I want it?”
That question became our anchor.
User Research

Since we had only 24 hours, traditional research wasn't possible, but we didn't skip insight. Instead, we used micro-research methods:
Live mini-interviews with 8 peers
Observation of how users react to existing AI assistants
Quick sentiment testing by reading emotional tone of sample messages
What we learned:
People want empathy before solutions.
Users abandon assistant tools when responses feel robotic.
Emotional cues (tone, words, hesitation) matter more than raw task accuracy.
We distilled those insights into clear design priorities:
Emotional context, supportive language, adaptive UI tone.
Approach

Rather than jumping straight into UI screens, we first asked:
What does emotional intelligence in AI even look like?
What signals can we surface to the user without being creepy?
We mapped user states (calm, stressed, curious, overwhelmed) and matched them with response strategies:
| User State | AI Response Style |
|---|---|
| Stressed | Reassuring / simplified suggestions |
| Curious | Contextual explanations |
| Overwhelmed | Step-by-step guidance |
From that point, our goal was not to make every response smarter, but to make every response feel like it matters.
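The state-to-strategy mapping above can be sketched as a simple lookup. This is an illustrative sketch, not the actual prototype code; the state names come from our table, while the field names and the default style for "calm" are assumptions.

```python
# Hypothetical mapping of detected user states to AI response styles.
# The "calm" style and its "direct answers" strategy are illustrative defaults.
RESPONSE_STYLES = {
    "stressed": {"tone": "reassuring", "strategy": "simplified suggestions"},
    "curious": {"tone": "informative", "strategy": "contextual explanations"},
    "overwhelmed": {"tone": "calm", "strategy": "step-by-step guidance"},
    "calm": {"tone": "neutral", "strategy": "direct answers"},
}

def style_for(state: str) -> dict:
    """Return the response style for a detected user state, falling back to calm."""
    return RESPONSE_STYLES.get(state, RESPONSE_STYLES["calm"])
```

Keeping the mapping to four explicit states is what made the behavior designable within 24 hours.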
Solution

We landed on a prototype that:
Reads the emotional tone of a user’s input
Adjusts AI response style accordingly
Displays UI elements that reflect mood subtly
(e.g., softer colors for stress, bright accents for curiosity)
The result was not a single “right answer” but behavior that felt attentive:
A UI that shifts language tone
Prompts that feel supportive
A feedback loop where users can correct the AI’s emotional reading
This created an experience where interaction wasn't just reactive; it felt responsive.
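The adaptive UI and the correction loop described above could look something like the following. The color values and the correction flow are assumptions for illustration; the prototype expressed these as design behaviors, not shipped code.

```python
# Illustrative mood-to-accent mapping: softer colors for stress,
# brighter accents for curiosity. Hex values are placeholders.
MOOD_ACCENTS = {
    "stressed": "#AEC6CF",     # muted, calming blue
    "curious": "#FFB347",      # bright, energetic accent
    "overwhelmed": "#CFCFC4",  # low-contrast neutral
    "calm": "#B2D8B2",         # soft green
}

class EmotionReading:
    """The AI's current guess at the user's emotional state."""

    def __init__(self, state: str):
        self.state = state

    def accent_color(self) -> str:
        return MOOD_ACCENTS.get(self.state, MOOD_ACCENTS["calm"])

    def correct(self, actual_state: str) -> "EmotionReading":
        # Feedback loop: the user overrides the AI's emotional reading.
        return EmotionReading(actual_state)
```

The `correct` method is the key design choice: the system never insists on its guess, which keeps a wrong reading from feeling creepy.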
Design Decisions

Because we had limited time, we made intentional trade-offs:
Language over fancy visuals
We chose to invest in writing emotionally supportive copy rather than flashy graphics. The core idea had to feel real.
Tone matching, not mood guessing
Instead of making AI guess user emotions incorrectly, we limited it to four states that we could design for confidently (stressed, curious, calm, overwhelmed). This kept the system believable.
Simple UI that adapts
Rather than designing many complex screens, we built a system of adaptive micro-responses to ensure the UI supported the message rather than distracting from it.
1. Language
Language over fancy visuals
2. Tone
Stressed, curious, calm, overwhelmed
3. Simple UI
To ensure the UI supported the message
Key Insights

Some of the most surprising takeaways:
Users notice tone before they notice features.
AI that sounds empathetic can build trust even if the functionality is limited.
Personalization needs to feel human, not predictive.
One team member said it best:
“I didn’t want Soul AI to be smart; I wanted it to feel seen.”
That shifted our focus from problem solving to emotional resonance.



Flows

We mapped simple, realistic paths that reflected emotional context:
User types:
→ “I’m stressed and need help organizing my tasks”
→ “I’m overwhelmed by suggestions; simplify, please”
→ “I’m curious about learning; show me paths, not noise”
Each flow had:
Emotional context at input
Adaptive UI tone on output
Minimal cognitive load on the next step
Rather than force all tasks into a rigid flow chart, we built branches that feel natural.
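The "emotional context at input" step in these flows can be made concrete with a deliberately simple keyword heuristic. This is a sketch under assumptions: the keyword lists are invented, and the prototype's detection was a design concept, not this code.

```python
# Toy keyword heuristic for reading emotional context from a message.
# Keyword lists are illustrative; a real system would use far richer signals.
KEYWORDS = {
    "stressed": ["stressed", "anxious", "deadline"],
    "overwhelmed": ["overwhelmed", "too much", "simplify"],
    "curious": ["curious", "learn", "explore"],
}

def detect_state(message: str) -> str:
    """Return the first matching emotional state, defaulting to calm."""
    text = message.lower()
    for state, words in KEYWORDS.items():
        if any(word in text for word in words):
            return state
    return "calm"  # no emotional cue found
```

Even this crude version shows why branching on detected state feels more natural than a rigid flow chart: the same input field routes to different response styles.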
UI
