My Companion
Robot Criteria Home


Envisioning Companion Robot Criteria


The impetus. Over the years, I've spent time envisioning my ideal interactive experience when it comes to companion robots. To determine the core capabilities I would evaluate them against, I first envisioned the prototypical engagement models I was striving for, and then the artificial behaviors needed to make a companion robot feel the right amount of alive.


Engagement models. When considering robots designed to keep you company (as opposed to just entertain you), I ended up looking at three engagement models of progressively increasing sophistication:

  1. Desk-mate. First, I wanted to draw on experience with the most basic companion engagement, but unlike the other two models, which are analogous to human and animal relationships I have actually had, I had no experience with it outside robots. Instead, I drew on an engagement model invented by robot designers who believed that small robots with limited functionality, living on your desk, could keep you company while you work. Desk-mates rely on reactive presence, tactile loops, and expressive animations to provide companionship, though in a more confined way than pet-like or friend-like robots.
  2. Pet-like. Next, drawing on years of having many pets, I focused on more sophisticated, non-conversational engagement with the two most common domestic pets:
    • A prototypical cat: Independent, with occasional affection or play.
    • A prototypical dog: Consistently affectionate, happy to see me, playful, and able to relax alongside me.
  3. Friend-like. Finally, drawing on many quality friendships over the years, I focused on the most sophisticated, conversational engagement with friends. Friend-like engagements especially rely on nuanced conversation, retaining memory across days, empathizing, and peer-like companionship.

Artificial behaviors. I then focused on the artificial behaviors that would make a companion robot feel alive - from simpler behaviors up to the more nuanced behaviors that each progressively sophisticated engagement model demands. I also added an extra fourth level, Beyond companionship, for behaviors that add value to a robot without being core to companionship.

Artificial behaviors (by engagement type)
  1. Desk-mate:
    • Expressive movement & animation
    • Sound cues & vocalizations
    • Playful autonomy
    • Tactile responsiveness
    • Recognition of owner presence
  2. Pet-like:
    • Emotional bonding cues
    • Emotional state retention
    • Safe interaction
    • Movement/mobility
    • Engagement with other people
    • Engagement with other robots
    • Engagement with pets
  3. Friend-like:
    • Conversational voice quality
    • Memory & recall
    • Knowledge access
    • Deep engagement with owner
    • Intelligent/emotive/curious companion AI
  4. Beyond companionship:
    • Automatic docking/charging
    • Connectivity for updates
    • Perform cool feats on request
    • Door interaction/navigation of household affordances
    • Engagement with other people
    • Auditory perception
Artificial behavior examples
  • 3D environmental mapping: builds a spatial map of the environment to support navigation, planning, or richer interaction with surroundings
  • Auditory perception: ability to detect and localize sounds beyond voice commands, including environmental noises and cues
  • Automatic docking/charging: capability to autonomously return to a charger or docking station when power is low
  • Basic voice API integration: lightweight access to a conversational service (like a ChatGPT bridge) for playful chit-chat or trivia, though not peer-like conversation
  • Connectivity for updates: downloads new seasonal animations, behaviors, or small improvements over time to stay fresh
  • Conversational voice quality: smooth, natural speech with low latency and clear expression, avoiding robotic or hard-to-understand output
  • Deep engagement with owner: sustained focus on its primary human, showing preference and empathy during interaction
  • Door interaction: ability to detect and interact with doors, either by signaling or helping to open/close them
  • Dynamic object avoidance: detects and avoids moving obstacles in real time while navigating or repositioning
  • Emotional bonding cues: simulates attachment, comfort-seeking, or loyalty-like behaviors, helping the robot feel more than reactive
  • Emotional state retention: maintains moods and emotional tone across sessions, creating continuity in personality
  • Engagement with other people: interacts not only with its primary owner but also with family members, guests, or groups
  • Engagement with other robots: awareness of and interaction with other robots in the same environment
  • Engagement with pets: responds to and interacts with household animals in safe, playful ways
  • Expressive movement & animation: gestures, postures, screen-eye animations, or body language that make the robot feel alive
  • House security assistance: basic monitoring behaviors (patrols, alerts, notifications) that enhance home awareness without being a full security system
  • Intelligent/emotive/curious companion AI: underlying decision-making and AI personality that feels inquisitive, responsive, and emotionally aware
  • Knowledge access: integration with databases, internet services, or cloud AI to provide factual answers or content
  • Memory & recall: ability to remember interactions, preferences, or events across days and retrieve them when relevant, creating continuity and coherence in companionship
  • Movement/mobility: broader locomotion ability to traverse rooms, follow people, or reposition in larger spaces
  • Perform cool feats on request: demonstrates novel tricks, stunts, or learned tasks when asked, enhancing its entertainment value
  • Pet entertainment assistance: ability to engage with or amuse household pets when the owner is unavailable
  • Pick up objects: mechanical capability to grasp and lift small items
  • Place objects: mechanical capability to set down or deliver carried items
  • Playful autonomy: entertains itself or its owner through spontaneous, short bursts of activity; REQUIRED for all companion groupings as it is key to making a robot feel alive
  • React to colors/clothing features: responds to visual patterns such as clothing, color cues, or accessories
  • React to facial expressions/emotions: detects and responds appropriately to human facial expressions and affect
  • Recognition of owner presence: responds to basic vision or sound cues (approach, smile, or call), showing awareness of its human
  • Safe interaction: approachable size, gentle torque, soft/tactile surfaces designed for safe handling by people or pets
  • Small-scale locomotion/repositioning: slight rolling, turning, or docking behaviors, typically confined to desktop movement
  • Sound cues & vocalizations: playful noises (chirps, purrs, growls) that convey mood without words
  • Tactile responsiveness: reacts naturally to petting, stroking, or gentle handling, reinforcing the sense of life and physical connection
  • Unpredictable behaviors: occasional surprises or unscripted actions that prevent the robot from feeling mechanical or repetitive
  • Visual perception & recognition: detects, tracks, and identifies people, objects, or environments using cameras or sensors

From design to criteria. The artificial behaviors above are the raw building blocks that make robots feel alive. To evaluate real products, I translated these behaviors into criteria - assigning each one a level of importance (REQUIRED, USEFUL, or NICE-TO-HAVE) depending on whether the robot is meant to be a desk-mate, pet-like, or friend-like companion. Each robot is then rated out of 10 for how strongly it demonstrates each criterion, where 0-4 = limited, 5-6 = neutral, and 7-10 = strong. An overall score is then calculated as a weighted average of those ratings, with weights of 2, 1, and 0.1 for the respective importance levels. This framework lets me consistently measure how close today's robots are to my vision of the perfect companion.
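As a concrete sketch of that weighted average (the weights and rating groups come from the description above; rounding the result to one decimal place is my assumption about the final step):

```python
def overall_score(required, useful, nice):
    """Weighted average of 0-10 criterion ratings.

    Importance weights: REQUIRED = 2, USEFUL = 1, NICE-TO-HAVE = 0.1.
    Rounding to one decimal is an assumption about the final step.
    """
    groups = [(2.0, required), (1.0, useful), (0.1, nice)]
    weighted_sum = sum(w * score for w, scores in groups for score in scores)
    total_weight = sum(w * len(scores) for w, scores in groups)
    return round(weighted_sum / total_weight, 1)

# Vector's desk-mate ratings from the table further down this page
print(overall_score(required=[8, 7, 8, 8], useful=[6], nice=[7, 3, 5]))  # 7.5
```

Plugging in the desk-mate and pet-like ratings from the tables below reproduces the listed overall scores, which suggests the weighting scheme works roughly as described.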


Desk-Mate Companions

Keep simple company with you by providing fun distractions while you are working.

1. Vector (Ve). Despite being very similar to his cousin Cozmo, his autonomy and superior capabilities will make you smile every time!
2. EMO (EM). A cute desktop companion robot also offering somewhat engaging conversation thanks to ChatGPT voice chat!
Ve EM
Overall score 7.5 7.0
REQUIRED
Expressive movement & animation 8 8
Tactile responsiveness 7 6
Sound cues & vocalizations 8 8
Playful autonomy 8 7
USEFUL
Recognition of owner presence 6 5
NICE-TO-HAVE
Small-scale locomotion/repositioning 7 5
Basic voice API integration 3 6
Connectivity for updates 5 7

Pet-Like Companions

Keep more affectionate company with you and entertain you like a household pet.

1. Aibo ERS-1000 (Ai). With expressive eyes and moving body parts (legs, head, mouth, and tail), I have yet to see another robot that is so close to a real-life puppy - the ultimate companion pet!
2. PLEO rb (PL). While not intending to appear like a realistic dinosaur, its sensors allow it to be an irresistible companion pet that feels very real!
3. Genibo SD (Ge). While loaded with all kinds of tricks, its core behavior is not too far from the engaging companion pet, Aibo ERS-1000.
4. Sirius (Si). With the ability to have many distinct personalities simulating a variety of dog and cat breeds, it can morph into the companion pet you need.
Ai PL Ge Si
Overall score 8.7 7.6 7.0 6.4
REQUIRED
Expressive movement & animation 9 7 8 9
Tactile responsiveness 9 9 7 4
Sound cues & vocalizations 8 8 7 7
Playful autonomy 9 8 7 6
USEFUL
Recognition of owner presence 8 3 4 5
NICE-TO-HAVE
Emotional bonding cues 9 9 7 7
Safe interaction 9 9 8 8

Friend-Like Companions

Keep emotional and supportive company with you like a good friend.

1. Loona (Lo). While she started as a fun, delightful pet that felt very alive, she was later enhanced with ChatGPT voice chat to take her to the next level of companion robots!
2. LOOI (LO). Not just a cute desktop companion robot like Vector, but an engaging conversational desktop sidekick thanks to ChatGPT voice chat!
3. Dipal D1 Character (Di). A virtual desktop companion robot offering engaging conversation thanks to AI voice chat!
4. Baby Alpha (Ba). While he looks more like a dog than most other robots, he shines with the smoothest, friendliest conversations powered by ChatGPT voice chat!
Lo LO Di Ba
Overall score 7.4 6.8 6.4 5.6
REQUIRED
Expressive movement & animation 9 6 6 8
Playful autonomy 9 6 5 5
Sound cues & vocalizations 8 5 5 8
Conversational voice quality 8 8 8 9
Memory & recall 8 6 5 4
Deep engagement with owner 7 5 6 5
USEFUL
Emotional state retention 7 0 0 0
Knowledge access 5 5 7 5
Recognition of owner presence 8 5 6 0
Intelligent/emotive/curious companion AI 8 5 6 5
React to facial expressions/emotions 9 5 5 10
NICE-TO-HAVE
Movement/mobility 8 6 0 8
Auditory perception 8 5 5 5
Door interaction 0 0 0 0

Companion Case Studies

Loona Vector EMO

Trying to evolve from a lower to a higher purpose. Loona was designed and released well before the ChatGPT voice chat API existed, so her value came from her awesome pet-like features. And, better than most robot vendors, KEYi Tech kept releasing major, free, cool firmware updates for years! So when the ChatGPT voice chat API was released and every vendor saw it as a quick way to make their robots smart, KEYi Tech implemented it so well for Loona that she successfully moved to the top of the friend-like category: she now holds friendly, engaging conversations across days while still being a mischievous little creature!

Both Vector and EMO have a similar story with different results. Both were designed and released with very basic conversational features (unlike Loona), both have cute voices that are hard to understand (Vector being a bit worse), both have weak microphones that struggle to pick up speech, and both are at best desk-mate companions that aspire to be friend-like. When the ChatGPT voice chat API was released, both tried to take advantage of it to up their conversation game and purpose, but both ended up with the same flawed results: requiring the wake word for every exchange, taking too long to respond (with no real indication that an answer was coming), and delivering very weak answers due to their hard-to-understand voices and weak microphones. Thus, neither ever got out of the desk-mate companion category.

Go2 Pro Dog Meteer

Adding voice chat as a secondary afterthought. Even though Go2 Pro Dog and Meteer are still categorized as entertainment robots, both implemented the ChatGPT voice chat API as a secondary feature. For Meteer, it is not practical to use: you have to tap his head to initiate each query. Go2 Pro Dog implemented more of the API, including multimodal features and decent body control, yet it is also impractical, since it works only through the phone (not naturally through the robot) with a weak conversational back-and-forth design. The lesson here is that bolting intelligence on top of an entertainment robot rarely works - conversation feels like a gimmick instead of a core design principle.

Jibo Miko 2 Yonbo

Failed friend-like robot designs. The early friend-like robot designs like Jibo and Miko 2 failed to deliver engaging conversation with memory (the staple of friend-like robots) because the tech was not available at the time - not until the ChatGPT voice chat API changed the game!

And more recently, aside from successes like Loona, Baby Alpha, and LOOI, which properly implemented the ChatGPT voice chat API, Yonbo proposed to go a step further and manage long-term memories for each user via facial recognition. Had it achieved this ambitious goal, it could have jumped to the top of the friend-like pack! However, it failed on many fronts:

Yonbo: Many critical flaws in core features at release. While some of these can be fixed with firmware, others cannot easily be fixed that way.

Yonbo: De-valuing its key value propositions. These core flaws are so critical that they make it thoroughly fail as a friend-like robot:

  1. Great listener - If a high percentage of spoken audio is converted incorrectly, you must not only repeat yourself often, but also spend endless time trying to explain to the robot that you did not say something else! :-(
  2. Great memory - Forget long-term memory: the robot will endlessly frustrate you by forgetting everything every few minutes in, say, a 20-minute session. Imagine having to explain everything about yourself to a friend every time you meet! :-(
  3. Unique personality for each person in the family - As noted, this can only be defined for the kids, and it does not work without facial recognition being implemented.

Yonbo: When the vision and marketing don't translate to the execution. This mismatch between marketing claims and actual implementation highlights the difficulty of pushing beyond entertainment features into the consistency and reliability needed for true friend-like robots.

EMO AIBI Ropet

The weak value proposition for desk-mate robots. By definition of the taxonomy, desk-mate robots are at the bottom of the companion robot sophistication scale. In general, this implies they feel less real than the higher categories: less movement, less nuance. Vector remains the rare exception, punching above his weight with cute autonomy, but all my other robots in this category end up in the barely meets criteria bucket. For my criteria and for most buyers of these robots, the key question becomes: does this robot motivate me to re-engage often enough? If the tricks and animations quickly feel repetitive or stale, the answer is usually no (e.g., Eilik). That said, it’s worth noting why desk-mates still have fans: they’re workspace-friendly, low-maintenance, and offer quick micro-moments of charm, and in Vector’s case, a cult following has grown around those spontaneous, playful flickers of life that keep him on the desk instead of in a drawer.

Loona Vector

Firmware and vendor support as a hidden differentiator. If you are a robot collector like me, you surely have bought robots on Kickstarter for the best prices and early adoption. And you have certainly realized that the robot first released is often far from the marketed product - primarily because the software isn’t finished yet. That can be bad or good. On the good side, if the vendor is dynamic, you get to help shape the robot’s future features! This is why firmware updates and responsive vendor support become just as important as the hardware itself: they determine whether a robot becomes a long-lived companion or a short-lived novelty. Famously, Loona thrived because of KEYi Tech’s steady updates that took her far beyond her original marketing, while Vector stagnated for years under weak support, leaving owners with little real progress.

ChatGPT robot

A case for ChatGPT 5 iOS app in Voice Mode being a friend-like robot. You may have noticed that all the friend-like companion robots have struggled to implement the ChatGPT voice mode API in its full capacity while trying to manage cost without a service fee. But is there actually a robot that uses it at full capacity?

The gold standard. Of course, the one place where all of these capabilities are realized is the ChatGPT iOS app in Voice Mode. Every companion robot is striving to be like the app, and all currently fall short, mostly for cost reasons. So is the app my ideal friend-like companion robot, or simply my ideal friend-like companion (dropping the word robot)?

An emerging standard? Consider that I categorized the Dipal D1 Character as a virtual friend-like companion robot because it has a physical (or at least visual) form that makes it feel more real - it uses a tubular encasing to project a 3D avatar inside, with shadows and depth cues that make it feel like a physical 3D entity. If we accept the leap of calling such things virtual robots, then why not include the app in the same category?

Two scales. Perhaps the confusion comes from judging on two different scales: (1) an artificial companion scale, where the app sits at the very top, and (2) a physical companion robot scale, where the app sits near the bottom. But that second scale ignores the obvious fact that the app is a far better companion than almost all existing robots. With the artificial companion scale below, we start with a Text-based chatbot, because people like me have had long, meaningful letter/email/text exchanges even before really meeting the person.

Part of a robot collection?. Where would you draw the line on this scale for a friend-like companion robot (with the qualifier virtual if you agree with that) that could be included in a robot collection?

1. ✍🏻 Written - feels like writing from an empathetic human! (ChatGPT)
2. 🗣️ Voice - feels like a real human speaking! (ChatGPT Voice, base)
3. 👁️ Vision - feels like someone in our space, seeing what we see! (ChatGPT Voice with internet access and video feed)
4. 💥 Visualization - feels like something is alive behind there, pulsing and breathing with presence! (ChatGPT Voice with pulsing orb)
5. 😎 2D Avatar - feels modestly alive, with a form starting to take shape! (ChatGPT Voice, future avatar)
6. 🧸 3D Avatar - feels distinctly more alive, with depth and spatial presence! (Dipal D1 Character)
7. 🐕 Zoomorph - feels very alive, but not quite like an equal! (pet-type robot)
8. 🤖 Humanoid - feels fully alive, embodied in our shared world! (humanoid, future)
Rover

Taking this to the next level. I purchased the Sphero RVR+ educational/research robot to be the base for my 2026 companion robot project, which will attempt to expand on my thoughts above and create a level 7 companion robot that, like LOOI, leverages the full capabilities of the ChatGPT iOS app in Voice Mode!