
"In 1966, MIT's Joseph Weizenbaum created ELIZA, a conversational AI that mimicked a Rogerian therapist using simple pattern matching. Despite its limited design, many users felt understood and emotionally "seen," attributing empathy and intent to the system, a phenomenon later called the "ELIZA effect". From the very start, people related to even basic conversational software as if it had a mind of its own."
"Mailchimp's playful feedback or Slack's quirky copywriting showed how tone and character could give otherwise functional software a distinctive emotional presence. Design theory reinforces this! Norman's emotional design theory shows that products which evoke emotion foster attachment and loyalty, while anthropomorphismexplains our instinct to assign human traits to machines. Sherry Turkle's research at MIT shows that people form "artificial intimacy" with conversational agents assigning them emotions, motives, and moral qualities as if they were caring entities rather than tools."
ELIZA revealed that users project empathy and intent onto conversational systems even when responses use simple pattern matching. Brand and product design leverage tone and character to create emotional presence and differentiation. Emotional design and anthropomorphism explain why users form attachments and attribute human traits to machines. Research documents people forming artificial intimacy with conversational agents and assigning them emotions, motives, and moral qualities. Modern AI efforts embed values and ethical reasoning to shape model character. Designing personality therefore operates as a core interaction layer that influences trust, adoption, and long-term user relationships across applications such as virtual pets, companion robots, and AI tutors.