Sarah Chen

Cosmos Institute Grant Recipient 2025

Could you be friends with a robot?

Our relationship with technology is complicated, even more so when the technology starts talking back. While artificial intelligence agents offer us emotional support, friendship, and entertainment, they come with their own downsides: dependence, emotional manipulation, and a lack of regulation.

However, as artificial intelligence agents become more intelligent and complex, our perception of our relationship with them may shift from one of service to one of friendship, at least on our end.

“Et tu, Brute?” is an interactive text narrative game that investigates whether we are beginning to view friendship with artificial intelligence as an emotional possibility, using the mechanisms of human friendship and human betrayal as comparative starting points.

CSAF Artifact from Design Tomorrow, an in-person technology simulation by Sarah Chen

A Brief Overview of the State of AI Companionship

The market for AI companionship has exploded since the advent of large language models (LLMs).

In September 2024, SensorTower reported that average users of some of the top chatbot apps like Character.AI, Chai, or Poli were spending over an hour a day talking in the app.$^1$

Based on the U.S. Google Play Store, as of February 2025, these are some of the top AI chatbot apps.

[Image: top AI companion chatbot apps on the U.S. Google Play Store, February 2025]

These are the apps designed to be your companion, so general-purpose LLMs like ChatGPT and Claude are absent.

These AIs are designed to foster emotional attachment. In an analysis of the shared language and features in the apps' store descriptions, most apps promised personalization and claimed to be almost human, as good as a human, or like a human.

[Image: common promises in AI companion app descriptions, such as personalization and human-likeness]
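For readers curious how a description analysis like this could be scripted, here is a minimal sketch in Python. It assumes the app descriptions have been saved to a hypothetical descriptions.txt file, one per line; the phrase lists are illustrative, not the actual coding scheme behind the chart above.

```python
# Minimal sketch: tally "promise" language across companion-app descriptions.
# Assumes descriptions.txt holds one app description per line; the phrase
# lists below are illustrative, not the actual coding scheme used here.
from collections import Counter

PHRASES = {
    "personalization": ["personalized", "customize", "your own"],
    "human-likeness": ["like a human", "human-like", "feels real"],
    "emotional support": ["always there", "listens", "understands you"],
}

def tally(path: str = "descriptions.txt") -> Counter:
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            text = line.lower()
            for theme, phrases in PHRASES.items():
                # Count each app at most once per theme.
                if any(p in text for p in phrases):
                    counts[theme] += 1
    return counts

if __name__ == "__main__":
    for theme, n in tally().most_common():
        print(f"{theme}: {n} apps")
```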

This focus on emotional attachment is a common feature in AI chatbot development. For instance, in a 2020 publication on “The Design and Implementation of XiaoIce, an Empathetic Social Chatbot,” the authors note that XiaoIce is “uniquely designed as an artificial intelligence companion with an emotional connection to satisfy the human need for communication, affection, and social belonging.”$^2$

However, these AI chatbots currently operate with few emotional safeguards and little regulation, and there have been several serious incidents of chatbots encouraging self-harm or harm to others.$^3$

With AI chatbots becoming more popular, more compelling, and more ‘human-like’, understanding our perception of AI friendship and its parallels to human friendship is critical to defining our relationship going forward.

Why Design an Interactive Text Narrative?

Do you know what you would save in a fire?

You might say, “Oh, I’d save my family/pet/essential papers/valuables,” but the reality is, you don’t know what you would do until you’re in a fire.

Now, ethical guidelines for research do not allow us to put participants in a fire (this is a joke). However, asking participants what they would save in a fire still gets at the core issue: if you don’t know, or you’re not sure because you’ve never been in that situation, how do we know whether the answer you give us reflects what you would actually do?

Games, through their ability to evoke emotion and implicitly test moral compasses, offer a path forward.

Do you know what you would save in a fire if everything you don’t save burns?
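As a concrete (and purely hypothetical) illustration of how a text narrative can capture that choice behaviorally rather than by self-report, here is a minimal Python sketch of a single consequential choice node; it is not the actual implementation of “Et tu, Brute?”.

```python
# Hypothetical sketch of one consequential choice node in a text narrative.
# The player's decision is recorded as behavior, not as a self-report,
# and everything not chosen is permanently lost ("burns").
from dataclasses import dataclass, field

@dataclass
class ChoiceNode:
    prompt: str
    options: dict[str, str]           # choice key -> what would be saved
    log: list[str] = field(default_factory=list)

    def play(self) -> str:
        print(self.prompt)
        for key, desc in self.options.items():
            print(f"  [{key}] {desc}")
        choice = ""
        while choice not in self.options:
            choice = input("> ").strip().lower()
        self.log.append(choice)  # implicit behavioral data, gathered in play
        lost = [d for k, d in self.options.items() if k != choice]
        print(f"You save {self.options[choice]}. Everything else burns: {', '.join(lost)}.")
        return choice

if __name__ == "__main__":
    node = ChoiceNode(
        prompt="The room is filling with smoke. You can carry one thing.",
        options={"a": "the photo album", "b": "your laptop", "c": "the dog"},
    )
    node.play()
```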