Product Design,
Design Research

CLIO

Meet CLIO, your personalized companion designed to help you set meaningful goals, achieve milestones, and connect with your friends, family, and community, all through a fulfilling and mindful approach away from the digital noise.

Roles

Research, 3D Design, Visual Design, Wireframing & Prototyping

Timeline

8 months

Tools

Figma, Womp, Adobe InDesign

Project Overview

AI Companionship & the Loneliness Epidemic

My thesis explores the loneliness epidemic and how AI companionship influences people's emotional well-being and interpersonal relationships, potentially contributing to the rise of this growing concern. It also investigates whether AI technology can be used to develop a sustainable solution, and to what extent it can help address the problem.

Goals & Challenges

  • Develop a sustainable solution rather than a temporary band-aid for the loneliness epidemic
  • Build a deep understanding of the emotional dynamics between humans and AI companions
  • Define what separates positive from negative AI companionship
  • Create a supportive experience that complements real human interactions rather than replacing them
  • Raise awareness about the psychological effects of over-reliance on AI companionship

Overall Analysis

AI companions can provide temporary relief, but they are not a sustainable solution. They are akin to painkillers, addressing the symptoms of emotional distress without resolving the underlying issues. Prolonged reliance on AI companions could worsen social isolation by deterring users from seeking genuine human connection. While these technologies have a place, they must be approached as a supplement to, rather than a replacement for, real relationships and real solutions to these problems.

So how can design strike a thoughtful balance—one that leverages the benefits of AI companionship while acknowledging the drawbacks that could deepen the loneliness epidemic?

The Solution: AI Companion

Introducing Clio

Meet CLIO, your personalized companion designed to help you set meaningful goals, achieve milestones, and connect with your friends, family, and community, all through a fulfilling and mindful approach away from the digital noise.

Your guardian (sea!) angel

CLIO is inspired by, and named after, a species of ethereal sea angel: Clione limacina. Like its namesake (but shortened), CLIO changes color—not through biological chemistry, but through emotion. Its glowing hues reflect how it feels, letting you know when it’s neutral, happy, excited, gloomy, deep in thought, surprised, or even a little frustrated.
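
To make the colour-changing idea a little more concrete, here is a minimal sketch in Python of how an emotion-to-colour mapping like this could be wired up. The emotion names come from the list above; the hex values, the Clio class, and the glow_colour helper are illustrative assumptions, not CLIO's actual firmware.

```python
# Hypothetical sketch of CLIO's emotion-to-colour mapping.
# The hex values and class names are illustrative, not real firmware.

from dataclasses import dataclass

# Each emotional state maps to a glow colour (illustrative palette).
EMOTION_COLOURS = {
    "neutral":    "#AEE3F5",  # soft blue
    "happy":      "#FFE066",  # warm yellow
    "excited":    "#FF8FAB",  # bright pink
    "gloomy":     "#7D8CA3",  # muted grey-blue
    "thinking":   "#9B7EDE",  # violet
    "surprised":  "#7CF5C8",  # mint green
    "frustrated": "#FF6B6B",  # red
}

@dataclass
class Clio:
    emotion: str = "neutral"

    def glow_colour(self) -> str:
        """Return the hex colour CLIO should glow, falling back to neutral."""
        return EMOTION_COLOURS.get(self.emotion, EMOTION_COLOURS["neutral"])

clio = Clio(emotion="excited")
print(clio.glow_colour())  # "#FF8FAB"
```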

The Solution: Handheld device

Not just another app (quite literally)

To help users step away from the digital noise, CLIO exists solely on a handheld device with a small display, rather than as a phone app that might tempt further screen time.

Interactions with CLIO are intentionally brief and focused—centered around setting goals, marking progress, and asking for advice or support related to those goals.

CLIO goes where you go

Thanks to its compact size, CLIO is portable enough to be carried like a keychain—attached to your belt, bag, or keys.

The Solution: Goal Creation

Set a goal for yourself

Users can easily define their personalized goals based on their unique needs and wants.

Set a goal all by yourself

To encourage autonomy and independence, users have the option to enter a specific goal they already have in mind.

Set a goal with a bit of help

Feeling stuck on what your goal should be? No worries! CLIO is here to help. Users can get help with goal creation by answering a few questions, which CLIO analyses to create a list of suggested goals.
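
As a rough illustration of this assisted flow, the sketch below shows one way the question-and-answer step could feed a list of suggested goals, assuming a handful of hypothetical questions and a keyword-based lookup. The actual CLIO concept would rely on its AI backend rather than this toy logic.

```python
# Toy illustration of assisted goal creation: the user answers a few
# questions and a short list of goals is suggested. The questions,
# keywords, and suggestions are hypothetical placeholders.

QUESTIONS = [
    "What area of your life do you want to improve?",
    "How much time can you set aside each week?",
    "Would you rather do this alone or with others?",
]

# Very simple keyword-to-goal lookup used only for this sketch.
SUGGESTIONS = {
    "health":   ["Take a 20-minute walk three times a week", "Cook one new meal"],
    "social":   ["Call a family member this weekend", "Join a local club or class"],
    "learning": ["Read one chapter a day", "Practice a new skill for 15 minutes"],
}

def suggest_goals(answers: list[str]) -> list[str]:
    """Return goal suggestions whose category keyword appears in the answers."""
    text = " ".join(answers).lower()
    suggested = []
    for keyword, goals in SUGGESTIONS.items():
        if keyword in text:
            suggested.extend(goals)
    # Fall back to a gentle default if nothing matched.
    return suggested or ["Set aside 30 minutes of offline time today"]

answers = ["I want to be more social", "about two hours", "with others"]
print(suggest_goals(answers))
```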

The Solution: Goal Completion

Complete goals

Users can choose to pursue their goals independently or alongside others.

Complete a goal all by yourself

Users can complete their goals on their own—whether it's something deeply personal or simply a goal best pursued independently.

Complete a goal with others

Users can also complete goals with others—whether it's a friend, family member, or the wider community. It’s a great opportunity to bond with loved ones or engage with those around you.

Research

Background

How it all started

Before developing my research, I came across videos of people swooning over the AI chatbots they were talking to. These chatbots usually take on the personas of characters from games they’ve played or shows and movies they’ve watched. From what I observed, these people posted themselves gushing over what the chatbots had said to them.

And so...

That is how my thesis topic came to revolve around the area of artificial intelligence, or as we know it, AI.

AI is the technology that enables computers and machines to simulate human learning, comprehension, problem-solving, decision-making, creativity, and autonomy.

In my case, I am specifically diving into the area of AI companionship.

There are two kinds of chatbots one may encounter in day-to-day life: (1) task-oriented bots, which function in a pre-defined way, only performing specific tasks, answering questions, and providing information (typically seen in customer service); and (2) conversational bots, which provide human-like interaction, simulating much more engaging conversations and connections.

I've noticed people falling in love with conversational bots—what you could reframe as AI emotional companions, since they essentially serve as companions, much like human ones.

These chatbots function through Natural Language Processing, or NLP.

NLP enables these bots to interpret language, detect emotions, retain context, and generate emotionally aware responses, which in turn simulate conversations that feel meaningful, empathetic, and human-like.
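
For a sense of what that loop looks like in code, here is a deliberately simplified, rule-based sketch in Python: detect an emotion in the user's message, then pick an emotionally aware (templated) reply. Real companion chatbots use trained NLP models rather than keyword rules; the keywords and responses here are purely illustrative.

```python
# Simplified sketch of the NLP loop described above: interpret the
# message, detect an emotion, and generate an emotionally aware reply.
# Real companions use trained language models, not keyword rules.

EMOTION_KEYWORDS = {
    "sad":   ["lonely", "sad", "down", "isolated"],
    "happy": ["great", "excited", "happy", "glad"],
    "angry": ["angry", "frustrated", "annoyed"],
}

RESPONSES = {
    "sad":     "I'm sorry you're feeling that way. Do you want to talk about it?",
    "happy":   "That's wonderful to hear! What made your day so good?",
    "angry":   "That sounds frustrating. What happened?",
    "neutral": "Tell me more, I'm listening.",
}

def detect_emotion(message: str) -> str:
    """Return the first emotion whose keywords appear in the message."""
    text = message.lower()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(word in text for word in keywords):
            return emotion
    return "neutral"

def reply(message: str) -> str:
    """Generate an emotionally aware (templated) response."""
    return RESPONSES[detect_emotion(message)]

print(reply("I've been feeling really lonely lately"))
# -> "I'm sorry you're feeling that way. Do you want to talk about it?"
```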

But then I wondered… why would they talk to these characters? What drew them in? What is going on inside their heads?

Then I came across the term ‘loneliness epidemic.’ I learned that one of the leading reasons people gravitate toward interacting with AI chatbots is that we are heading toward (if not already in) a loneliness epidemic, where people report feeling isolated and disconnected.

Despite the joys of interconnected living, today’s digital landscape appears to be fueling this epidemic, leading a growing number of people to lean on artificial intelligence (AI) in a bid to resolve it.

Loneliness is a growing issue worldwide, and AI is rapidly becoming more integrated into daily life. With more people relying on technology for emotional support, we need to understand the implications for mental health and social interaction.

Thus, my first research question asks:

How and to what extent do AI emotional companionships influence an individual's emotional well-being and interpersonal relationships?

Initial Findings

The answer to the first research question can be framed through one person’s anecdote from a New York Times article.

During the pandemic, Ms. Francola was going through personal struggles that left her feeling lonely and isolated. She turned to AI, and said that it was therapeutic and that she felt less depressed.

So, as you can see, there are some benefits to it, such as:

  • It can be something you can talk to
  • It can ease mental stress
  • It can build social and communication skills that transfer to real life
  • It offers convenient, judgement-free interaction, free from human biases
  • Overall, it offers empathy, companionship, and emotional support

And there are also some negative outcomes, such as:

  • Social dependency on these AI companions
  • Social disconnection, where users may come to prefer AI interactions over human ones
  • Parasocial relationships, where users develop one-sided relationships with machines
  • Overall, furthering social isolation or reshaping our understanding of human connection

So knowing these pros and cons, is AI helpful or harmful?

The bottom line is that we’re not really sure (!?). Especially in the long term, as the science and technology behind AI companions evolve, it’s still too early to tell. The impact of AI companions on a person's well-being and interpersonal relationships is nuanced. They can be beneficial when used as a supplemental resource with clear boundaries, but they can also be harmful if relied upon excessively or if they replace meaningful human connections.

Literature Review

To deepen my understanding, I conducted a literature review to gather more insights from experts.

Click here for the in-depth literature review.

The literature review helped me gain a deeper understanding of the nuanced emotional impacts of AI companionship on individual well-being and real-world relationships, challenging and reshaping my initial assumptions about the topic.

Each source enlightened me on a different facet of AI companionship: its potential benefits in alleviating loneliness, offering emotional support, fostering social skills, and supplementing real-life relationships, alongside significant risks like emotional dependency and over-reliance, parasocial relationships, and diminished real-life social connection.

For me, these insights highlight the need for balance in AI interaction and emphasize the ethical responsibilities AI developers hold in promoting healthy user interaction with these chatbots. I’ve come to understand that while AI can supplement human connections, it should not replace meaningful, in-person interactions.

Thus, moving forward, I am curious to explore how design can guide users toward a balanced engagement with AI, promoting emotional well-being and encouraging an awareness of its limitations. 

Informed by my original question and these findings, my second research question asks:

How can we maintain a balanced interaction with AI emotional companions, considering their influence on an individual’s emotional well-being and interpersonal relationships?

Primary Research Findings

Gathering user insights

As AI continues to evolve at an unprecedented pace, and given what we know so far, it is crucial for individuals to stay informed so they can harness the power of AI effectively.

So, how can we make sure we don’t end up in an impending technological doom? How can we find a healthy balance in interacting with AI while preserving human connections? How can we be more aware and make informed decisions when interacting with these AIs? Do we wait until they become Super AI before we start thinking critically about them?

To answer my second research question and the list of questions above, I conducted primary research: I sent out a survey, which received 27 responses, to gather different people's perspectives.

Question Set 1: I asked about my participants' well-being to understand the emotional and social context in which they may (or may not) be engaging with AI companions.

Overall, the responses suggest that many respondents rate their well-being and personal relationships as average or neutral, frequently seek emotional support through technology, and often feel the need for more meaningful connections or companionship. Together, these findings point to a demographic that is emotionally self-aware and socially curious but not fully fulfilled. They reveal a clear opportunity for intervention—whether through sustainable human-centered design, educational tools, or systems that encourage real-world connection over digital dependence. These insights provide critical context for understanding how AI companionship fits into the emotional lives of users and highlight the need to design solutions that prioritize genuine human interaction.

Question Set 2: I asked my participants about their familiarity with and opinions on AI companions (also known as AI chatbots), mainly their advantages and disadvantages.

Participants see AI companions as helpful for emotional support, daily tasks, and providing positive, always-available interactions—especially when real people aren’t accessible. However, many also recognize the risks: emotional overdependence, reduced real-world socialization, misinformation, and ethical concerns like data privacy and exposure to harmful content. While AI offers convenience, the findings show a clear need for more awareness, balance, and responsible design to ensure these tools supplement—not replace—human connection. Almost every response added another layer of confirmation to the research I had gathered on how most people perceive AI companions.

Question Set 3: I asked participants to imagine what AI companions might look like and how they could function in the future. Their thoughts on potential features, risks, and societal roles help us explore where this technology might be headed—and how we can design more intentional, human-centered interactions before it fully takes shape.

Question Set 3.1:

Participants value AI companions that offer practical advice and encourage real-life interactions the most. Empathy, customization, and some human-like traits are also appreciated, but less essential. Overall, users want AI that supports—not replaces—genuine human connection.

Question Set 3.2:

Participants envision a healthy AI interaction as one that is limited, honest, and non-intrusive—with minimal emotional mimicry, clear acknowledgment of its artificiality, and encouragement toward real-world social connection. They emphasized the importance of purposeful use over emotional dependency.

On the other hand, participants foresee major risks in AI development, including addiction, privacy and ethical concerns, replacement of human roles, and stunted social growth. Other concerns include emotional projection, intellectual property, and environmental impact. These responses reflect a strong call for responsible, transparent AI design that prioritizes human connection over artificial reliance.

Question Set 3.3:

When asked about anticipated trends in AI companionship technology, participants' responses reflect a mix of optimism and caution, revealing hopes for helpful, human-like support—and fears of social, ethical, and systemic overreach. Participants mentioned many features that already exist today, but a few stood out to me that I thought I could work with in my design solution later on.

Question Set 3.4:

Participants imagined their ideal AI companions as practical, non-intrusive tools designed to support real-life goals rather than replace human connection. Many preferred companions to exist as physical objects rather than apps, with limited human-like traits and clear, purposeful functions—such as helping with organization, prompting goal-setting, and encouraging time away from social media. Rather than seeking emotionally immersive or overly lifelike AI, participants expressed a desire for companions that enhance productivity and well-being without fostering dependency. This reinforces the broader theme of designing AI to supplement human life, not simulate or substitute it.

There were also some points that stood out to me and helped me later on when shaping my solution.

Thus, I have honed down to my final research question:

“How can we design a sustainable solution that helps foster genuine human relationships and combat loneliness and isolation, rather than serving temporary relief through AI companions?”

Development

Design Goal

So, everything from my survey findings was very helpful and informed the development of my design output...

My design goals aim to create an AI companion that encourages real-life connection, goal-setting, and community involvement—without fostering emotional dependency.

Based on participant insights, the companion should exist as a physical, portable object (not a screen-based app), avoid human-like traits, and motivate users to engage with their surroundings. It will help users stay organized, complete personal goals, and reconnect with loved ones—shifting the AI’s role from emotional crutch to supportive, balanced tool for healthier living.

Counter points

Hold on...

Why is it also a goal-setting solution, in addition to connecting with friends and family, when addressing the loneliness epidemic?

I believe (and my primary research backs this up) that improving one’s personal growth and well-being through goal-setting is an essential step toward forming stronger and more meaningful connections with others.

The idea is that by working on yourself first, you become a better and more supportive social presence for those around you.

Additionally, having clear goals can give people a sense of purpose, which helps combat feelings of emptiness and isolation.

And...

...you’re addressing the growing concerns around AI companions by still using AI?

Based on my research, AI companionship still offers many benefits despite its current drawbacks. My goal is to reduce those drawbacks and arrive at a more sustainable solution that continues to utilize AI technology in a thoughtful and responsible way.

But still...

Why consider stepping back from current AI technology? Why take a simpler approach?

Maybe the best solution is simply returning to simpler times. I strongly believe that AI is just a trend, and eventually we’ll reach a saturation point where society becomes overwhelmed by it. When that happens, people may start craving real human connection again and a return to more grounded, meaningful interactions. I’m drawing a parallel to this idea from the disappearance of the Evas in Evangelion—in a way, a symbolic reset (kinda! iykyk).

Wireframing & Prototyping

User Flow & IA

CLIO’s entire user flow is straightforward and intentionally simple, emphasizing minimal screen time so users can focus on what truly matters—making progress on their goals and spending more time offline.

Mid-fidelity Prototype

Due to time constraints, only two main flows, Goal Creation and Goal Completion, were prototyped, as they best represent CLIO’s core purpose and functionality.

Other notable features were wireframed but not fully prototyped, and will be presented as static screens in the final deliverable.

Visual Design

Inspiration

The main vibe I was going for reflects the Frutiger Aero design aesthetic, which emerged in the mid-2000s to early 2010s—a time I believe truly represented the peak of innovation. This style was characterized by skeuomorphism, glossy textures, and bright, vibrant colors.

Frutiger Aero is also seen as a reaction to the present-day technological landscape and as a desire for a simpler, more optimistic vision of technology and its relationship to nature. This deeply resonates with the essence of not only CLIO but the whole research dedicated to it, and with its reflection of modern-day society. To many, Frutiger Aero means hope for a better future, beauty, peace, and eco-friendliness—the optimistic utopia many felt they were promised.

The physicality of the device was heavily inspired by tiny physical gadgets with small screens, especially Tamagotchis and iPods. Truthfully, what inspired my solution the most was seeing an Apple Watch modded into a Tamagotchi frame.

The inspiration (and guide!) for the interface referenced the Apple watchOS library. For the look and feel, I wanted it to be fun and playful yet still clean and minimal.

The inspiration for the companion’s look came mostly from characters with no human-like features, or no faces at all! I wanted it to be cute and friendly, and able to take on different forms (whether that’s colour or shape).

Final Prototype