A chatbot’s personality forms the basis of the “human” user experience. From the way a bot says hello to the way it responds to our frustrations, every interaction can be phrased in many different ways.
Should it do an informal “Heya :-)” or a formal “Salutations”?
At first glance, this seems trivial. You may wonder why bots can’t just “act like bots”: be functional and get things done. Well, this article will explore why emotions are crucial to AI and why, instead of killing us all, they will give us more to live for.
There are three fundamental reasons why personalities and emotional intelligence matter in chatbots:
- A chatbot is an extension of a company’s brand image and customer service. Its personality should, therefore, be congruent with the corporate image
- Personality improves the experience and can be used as a differentiator
- How boring would bots be if they didn’t have personality?!
Personality types depend on context. The context for business chatbots will be defined by the industry they are in. Simply put, different sectors suit different personalities.
Here is a starting point with descriptors that match different industries:
- Banking and insurance – “robust, secure, professional”
- Millennial shopping – “open, hip, understanding”
- High-end shopping – “professional, helpful, knowledgeable”
- Kids – “engaging, colourful, patient”
- Health – “patient, calm, caring”
- Tech startup – “smart, quirky, high-tech”
- Marketing / PR – “creative, proactive, charming”
Keep these descriptors in mind when creating your chatbot personality. Matching a personality to an industry will make user experiences more immersive and engaging, not to mention more fun…
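To make the industry-to-personality mapping above concrete, it could be sketched as a simple configuration. This is an illustrative example only; the persona structure, greetings, and fallback wording are invented for this article, not taken from any particular platform:

```python
# Illustrative persona configuration keyed by industry.
# All greetings and keys here are hypothetical examples.

PERSONAS = {
    "banking": {
        "traits": ["robust", "secure", "professional"],
        "greeting": "Good morning. How can I assist you today?",
    },
    "millennial_shopping": {
        "traits": ["open", "hip", "understanding"],
        "greeting": "Heya :-) What are you after?",
    },
    "health": {
        "traits": ["patient", "calm", "caring"],
        "greeting": "Hello, I'm here to help. Take your time.",
    },
}

def greet(industry):
    """Return the greeting for an industry, falling back to a neutral tone."""
    persona = PERSONAS.get(industry)
    return persona["greeting"] if persona else "Hello! How can I help?"
```

Storing the descriptors alongside the greeting makes it easy to keep tone consistent as more dialogue is written for each persona.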
More than a personality
The way a bot speaks to you isn’t the only thing you can change. The entire UX and UI of the platform should be consistent with its personality. Imagine talking to your bank’s bot and it’s rainbow-coloured and uses Comic Sans font with a lot of exclamation points in its dialogue. Nobody would trust that bot with their money! That theme would suit a kid’s educational bot that is designed to be fun and engaging. A banking bot needs to feel secure and professional: darker colours with a smart font that matches a reliable personality.
How far should we go?
Bots can act more like humans when they have defined and congruent personalities. But how much do we want them to mimic us? Ambit’s approach is for bots to have a personality built around a number of different variables, but to also be transparent to the user that they are not humans. Chatbots need personalities, but we don’t want them to fool us into believing they’re actually humans.
Why? Ambit has found that when users are unsure whether they’re talking to a bot or a human, they start asking questions unrelated to its purpose to test it: “How old are you?”, “What is the meaning of life?”, and “Where should I hide this body?” (Siri can actually answer this), instead of, say, “What is my bank balance?” for a fintech chatbot. It also leads to frustration: we don’t give humans as much leniency for errors as we give bots.
Bots should make it clear that they are bots. If users think a bot is a human, they’re less likely to realise that they have changed the subject and confused it. When a bot acts like a bot (and doesn’t pretend to be human), we can understand why it messed up and adjust our questions, or simply ask the same question in a different way. Ambit chatbots learn from their mistakes, so in the future the same bot will be able to handle the situation.
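One way to keep a bot transparent is to build the admission into its fallback response. The sketch below is hypothetical (the intents and wording are invented for this article), but it shows the idea: when the bot can’t match a question to its purpose, it says so, plainly, as a bot:

```python
# Hypothetical sketch: a transparent fallback that admits the bot
# is a bot instead of bluffing when it can't match the user's intent.

RESPONSES = {
    "check_balance": "Your balance is available under Accounts.",
    "opening_hours": "We're open 9am-5pm, Monday to Friday.",
}

def handle(intent, bot_name="DemoBot"):
    """Route a recognised intent, or admit confusion as a bot would."""
    if intent in RESPONSES:
        return RESPONSES[intent]
    # Transparent fallback: no pretending to be human.
    return (f"Sorry, I'm {bot_name}, a chatbot, and I didn't understand that. "
            "Could you rephrase your question?")
```

A fallback like this invites the user to rephrase rather than leaving them to wonder whether a confused “person” is ignoring them.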
Another hidden benefit of “bots acting like bots” for businesses is that it never feels like you’re being sold on anything. In other words, rather than a person who could sweet-talk their way into an upsell – “Would you like fries with that?”, we know that bots are only coming up with recommendations based on algorithms. There’s no emotion in it, just pure commerce. Humans don’t feel “played” by a bot.
Bots with developed personalities, without pretending to be human, create higher value for businesses and consumers alike. Maybe the “artificial” in AI will one day be “emotional”.