Artificial human companion
An artificial human companion is a device or application designed to simulate companionship through social, emotional, or relational interaction. Examples of such systems include conversational agents and chatbots, digital pets, virtual avatars, and physically embodied robots.
Unlike task-oriented digital assistants, artificial human companions are designed not only to offer practical support but also to provide emotional presence, maintain ongoing social engagement, and cultivate interactions that resemble interpersonal relationships. They can engage in natural and dynamic conversations, provide assistance, offer companionship, and even perform tasks such as scheduling or information retrieval. Their capabilities have expanded significantly with advances in artificial intelligence, large language models, affective computing, and social robotics.
Definition and categories
In fields such as human–robot interaction and social robotics, artificial companions are defined as non-living, machine-based entities created for human engagement. In this article, "artificial" refers to something that is humanly constructed, based on a natural model or achieved through the manipulation of natural processes, and able to exist and operate in an open-ended environment without human control, independent of its material composition. This definition does not imply a particular level of autonomy or learning capacity; as a result, both advanced generative AI companions and simple rule-based devices are included.
Categories
Researchers classify artificial companions in differing ways; one framework groups them by interaction type, that is, how they communicate and engage with people.
| Interaction type | Description |
| Physical | Physically embodied "robot" companions designed for direct physical engagement with humans or the environment. This type includes assistive robots deployed in geriatric healthcare, such as systems that help older adults with daily tasks, as well as therapeutic robot animals used with children with special needs. |
| Virtual | Virtual companions exist solely in digital environments, typically as autonomous avatars within virtual, augmented, or mixed reality systems. |
| Conversational | Conversational companions interact through natural language, such as text, speech, or multimodal dialogue interfaces. This category includes large language model chatbots, in which prompts are typically written by the user, and voice-controlled assistants. These chatbots have also been used as search engines, for tutoring, and for task support. |
History
Early conversational agents (1960s–1980s)
The earliest artificial companions were rule-based conversational programs. One of the first was ELIZA, created in 1966 by Joseph Weizenbaum at MIT. ELIZA simulated a Rogerian psychotherapist, using simple pattern-matching rules to reflect user statements back to them. Although ELIZA could not actually comprehend meaning, its pattern matching gave an illusion of understanding, and users sometimes attributed genuine understanding to the program. This phenomenon is known as the "ELIZA effect."
In 1972, PARRY, another rule-based conversational program, was developed to simulate a patient with schizophrenia. PARRY was considered more advanced than ELIZA because it had a "personality" and an improved control structure. Like ELIZA, however, PARRY had little ability to understand language, minimal ability to express emotion, and could not learn from the conversations it engaged in.
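To illustrate the kind of rule-based reflection ELIZA relied on, the following is a minimal, hypothetical Python sketch of pattern matching with pronoun reflection; the specific rules and responses are illustrative and do not come from the original program.

```python
import re

# Pronoun reflections used to turn a user's statement back on them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# Each rule pairs a regular expression with a response template;
# the captured fragment is reflected and inserted into the template.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."

print(respond("I am worried about my exams"))
# -> "How long have you been worried about your exams?"
```

Because the program only transforms surface patterns, responses like the one above can appear attentive without any understanding of meaning, which is the mechanism behind the ELIZA effect described above.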
First artificial intelligence tools and the emergence of chatbots (1980s–1990s)
In the late 1980s and early 1990s, artificial intelligence began to emerge; it was used in chatbots starting in 1988 with the introduction of Jabberwacky. Jabberwacky used CleverScript, a spreadsheet-based language, and, like earlier chatbot agents, relied on pattern matching; unlike its predecessors, however, it employed contextual pattern matching based on previous conversations.
The term "chatterbot" was first used in 1991, referring to an artificial player in the TINYMUD virtual world whose main role was to chat with real human players. Many human players seemed to prefer talking to the chatterbot over other players, and many assumed it was a real human player.
Another milestone came with the development of Dr. Sbaitso in 1991. While not sophisticated in its conversational design, Dr. Sbaitso used sound cards to synthesize human speech, going beyond the purely text-based interactions of its predecessors.
In 1995, ALICE, the first online chatbot inspired by ELIZA, was created. ALICE was written in a new language, Artificial Intelligence Markup Language (AIML), and its knowledge base consisted of about 41,000 templates and related patterns, which expanded its conversational range. Although it could not convey human emotions and attitudes, ALICE became one of the most influential early online chatbots and later won the Loebner Prize.
Consumer robotics and digital companions (1990s–2000s)
The late 1990s and early 2000s saw the rise of digital and robotic pet companions that encouraged caregiving behaviors and daily interaction with artificial companions. In 1996, Tamagotchi was released and, for a time, was referred to as the "world's most popular toy." Tamagotchi was a handheld digital pet that required users to attend to a virtual creature's needs through frequent interaction. Depending on the care it received, the Tamagotchi would respond differently to its user and develop characteristics that reflected the quality of care provided, reinforcing a sense of responsibility and emotional investment.
Another influential interactive toy was the Furby, released by Hasbro in 1998 and one of the first interactive robotic toys to achieve widespread commercial success. The toy was designed to encourage nurturing behavior by responding to actions such as petting, feeding motions, and touch. It was even marketed as being able to "learn" English: it initially spoke in its own constructed language, known as Furbish, but was programmed to gradually incorporate English words over time. Furby could display various behavioral states, including indications of hunger, sickness, drowsiness, and excitement, and could perform actions such as dancing or snoring, creating the impression of a developing personality. Later relaunches introduced LCD eyes and additional sensors that allowed the toy to adjust its behavior based on user interaction, intended to enhance its lifelike qualities and the sophistication of its responses.
Sony's AIBO, introduced in 1999, used autonomous movement, touch sensors, simple vision processing, and expressive behaviors to simulate aspects of real pets. AIBO demonstrated that consumers were willing to form emotional attachments to artificial companions and contributed to the mainstream visibility of robotic companionship.
Outside of entertainment, social robotics also made its way into other industries, particularly healthcare and elder care. PARO, a therapeutic robot seal created in 2004 by the Japanese engineer Takanori Shibata, was developed for use with older adults and people with dementia. Some studies showed that PARO was associated with improved mood, reduced agitation, and increased social engagement among dementia patients.
Although limited in computational complexity, these systems showed that artificial human companions could evoke sustained engagement, routine caregiving, and emotional investment from consumers. Early digital-pet devices laid the groundwork for later developments in social robotics and demonstrated the potential of artificial agents to serve relational and entertainment functions.
Mainstreaming of consumer chatbots and virtual assistants (2000s–2010s)
In parallel with the expansion of consumer interest in social robotics, the 2000s also saw significant progress in the development and spread of chatbots. In 2001, chatbot technology advanced significantly with the introduction of SmarterChild, a conversational agent available on messaging platforms such as AOL Instant Messenger and MSN Messenger. Unlike earlier chatbots, SmarterChild could assist users with practical daily tasks by retrieving information on movie times, sports scores, stock prices, news, and weather, marking a developmental shift in machine intelligence and human-computer interaction.
The evolution of AI chatbots accelerated with the emergence of smart personal voice assistants, which were integrated into smartphones and dedicated smart-home speakers. Popular examples include Apple's Siri, IBM Watson Assistant, Google Assistant, Microsoft Cortana, and Amazon Alexa. Unlike earlier text-based chatbots, these assistants rely on internet connectivity and can generate faster, more contextually relevant responses.
In 2016, chatbots also became popular on social-media platforms, which enabled developers to build chatbots that helped customers perform different tasks. By the end of 2016, 34,000 chatbots were available across fields including marketing, healthcare, entertainment, and education.
In 2014, Microsoft launched XiaoIce in China; its primary design goal was to be an AI companion with which users could develop a long-term emotional connection, satisfying the human need for sociability.
Expansion of embodied companionship (2010–2020)
This period introduced more sophisticated social robots and embodied conversational agents. In 2014, Jibo was announced as one of the first social robots designed specifically for private consumers. Jibo was marketed as a "family robot" intended to live in users' homes, form ongoing social relationships, and function as a personable assistant. Although Jibo did not achieve long-term commercial success and was eventually discontinued, its design represented a notable shift in social robotics toward companionship-oriented systems aimed at everyday domestic use and demonstrated growing interest in robots that could engage people through personality, emotion cues, and sociable behavior.
Also in 2014, SoftBank Robotics introduced Pepper, one of the first mass-produced humanoid robots. Pepper was designed for social interaction and was capable of exhibiting body language and interacting with its surroundings. It could also interpret facial expressions, vocal tone, and other affective cues, using built-in algorithms for emotion and speech recognition to initiate and sustain interaction. Pepper featured a range of multimodal communication capabilities that enabled it to interact with people in a more natural and socially responsive manner.
Realbotix, a division of the company RealDoll, develops highly anthropomorphic robotic companions that combine physical embodiment with advanced conversational AI. Its flagship system, Harmony, was designed as a customizable, human-scale companion capable of delivering consistent interaction through both virtual and physical interfaces. Harmony's software emphasizes long-term conversational engagement and customizability. Although often discussed in the context of intimacy robotics, Realbotix has also produced less sexualized busts and humanoid forms intended for mainstream applications such as personal assistance, companionship, and interactive customer service.
Regarding the development of artificial intelligence and social robots, De Greeff and Belpaeme wrote in 2015 that the social learning abilities of social robots had improved and might develop further in the coming decades. Research has shown that social robots are typically designed with role characteristics that promote anthropomorphism and encourage an interactive style consistent with natural human communication. A robot's appearance and behavior can lead people to perceive it as a social agent rather than an ordinary device. These findings indicate that artificial intelligence is being used to enhance the language and social-interaction abilities of robots in order to better support human communication and provide assistive functions.