



Technology

‘Sorry, I Don’t Understand That’ – the Trouble With Chatbots and How to Use Them Better

Lena Waizenegger and Angsana Techatassanasoontorn

January 4, 2022


Photo by Tim Gouw on Unsplash

Hands up if you’ve ever cursed, mocked or yelled at a chatbot. No surprise if you have. Those automated “helpers” – supposedly designed to make customer service smarter, faster and more efficient – can certainly be a source of frustration for sentient beings.

Interactions with chatbots have become increasingly common in our daily lives. But when asking for information or trying to solve a problem, we’re often annoyed when the chatbot either can’t understand or misinterprets our inquiry.

Even worse is when it advises us to contact the call centre or visit a web page, which defeats the purpose of using chatbots in the first place.

There are two main reasons for negative user experiences. First, organisations often present the chatbot as too “human”, leading to unrealistic expectations about the chatbot’s ability to understand human language, including nuanced questions and commands.

Second, many chatbots are rule-based and have a narrow knowledge base, which means grammatical and syntactical errors can throw them off and complex questions often can’t be answered, disappointing customers.
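
To make that failure mode concrete, here is a minimal, purely illustrative sketch of the kind of keyword-rule matcher many simple chatbots rely on. The intent names and patterns are hypothetical, not drawn from the article: an expected phrasing matches a rule, while a paraphrase or a typo drops straight to the fallback response.

```python
# Purely illustrative sketch of a rule-based intent matcher; the intent
# names and patterns below are hypothetical, not taken from the article.

RULES = {
    "track_order": ["track my order", "order status", "where is my order"],
    "reset_password": ["reset my password", "forgot password"],
}

def match_intent(message: str) -> str:
    """Return the first intent whose pattern appears in the message."""
    text = message.lower()
    for intent, patterns in RULES.items():
        if any(pattern in text for pattern in patterns):
            return intent
    return "fallback"  # i.e. "Sorry, I don't understand that"

# An expected phrasing works...
print(match_intent("Can you track my order please?"))  # -> track_order
# ...but a paraphrase or a typo falls through to the fallback.
print(match_intent("Where has my parcel got to?"))     # -> fallback
print(match_intent("I forgt my passwrd"))              # -> fallback
```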

Chatbots are not human and many can’t understand nuanced natural language. Photo by Volodymyr Hryshchenko

A two-way street

Although it’s easy to blame the chatbot for a miserable experience, we need to realise that, just as it takes two hands to clap, it takes both chatbot and customer to create a satisfactory interaction.

While previous studies have focused mainly on the chatbot, including why companies implement them and the design cues that characterise them, there hasn’t been much consideration of the customer’s role in these interactions.

In our latest research, we put the spotlight on how customers deal with chatbots and suggest ways to improve the experience.

We find that to create constructive, meaningful engagement with a chatbot, the actions and reactions of the customer and a willingness to make it work are as important as the chatbot’s own functionality.

Understanding chatbots

We identified six distinct types of human-chatbot interactions: socialising, collaborating, challenging, accommodating, committing, and redirecting.

These vary depending on who is driving the conversation (the chatbot or the customer), how “real” they perceive each other to be, their social cues, and the customer’s effort.

In the case of socialising, the chatbot tries to entertain the customer – for example, by telling jokes or trying to cheer them up if it detects a bad mood.

Collaborating interactions are those conversations where both the chatbot and the customer work together on the customer’s needs, such as booking a flight or understanding the root cause of a problem and identifying solutions.

Both socialising and collaborating interactions involve smooth exchanges between the chatbot and customer and mostly lead to positive outcomes.

‘What’s the meaning of life?’

Accommodating interactions are ones where the customer is in the driver’s seat, helping the chatbot understand their needs by changing the way they phrase the question or statement, repeating their request or clarifying their intent.

On the flip side, a committing interaction sees the chatbot more engaged than the customer, trying to answer a question or solve the customer’s problem.

In those cases, chatbots often ask follow-up questions and provide additional information that might be relevant. These two types of interactions, however, often leave customers without the required information.

In some cases, people see the novelty of chatbots as an open invitation to challenge them and see what it takes to break them. This type of interaction usually leads nowhere, since most chatbots aren’t trained for off-topic questions such as “do you want to marry me?” or “what is the meaning of life?”.

Lastly, when redirecting a customer, chatbots act more like a navigator, pointing to alternative information sources such as the company’s website, and don’t directly respond to inquiries. These interactions are very short and may not be an ideal outcome for the customer.

Three keys to success

Based on our research, we provide three tips for your next encounter with a chatbot:

  • remember that a chatbot is not human and many chatbots can’t understand nuanced natural language, so try not to use complex sentences or provide too much information at once
  • don’t give up too quickly – if the chatbot doesn’t understand your question or request the first time, try to use keywords, menu buttons (if available) or short sentences
  • give it a second chance – chatbots acquire new “skills” over time, so they might now be able to solve a problem or answer a question they couldn’t two months ago.

Organisational tips

The introduction of chatbots has redefined the way customers, employees and technology interact, and we encourage organisations to take a holistic view of their customer service systems when redesigning them.

Careful consideration should be given to the changing role of customer service employees who need to work with chatbots. Additionally, we recommend organisations:

  • reimagine a customer service team – involve people in the redesign of customer service delivery through a mix of chatbots and actual employees
  • treat chatbots like a new (digital) employee – spend time and effort extending their skills
  • find the sweet spot for escalating an enquiry to a contact centre employee – some chatbots refer people too early (causing congestion), while others offer the option frustratingly late. Experiment to find the right timing (a rough escalation sketch follows this list)
  • monitor the chat interactions – learn how and what questions customers ask and extend your chatbot’s knowledge base accordingly.
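
As a rough illustration of that escalation “sweet spot”, the sketch below hands a conversation over to a human agent after repeated misunderstandings or an overly long exchange. The thresholds, field names and functions are assumptions made for illustration; the article does not prescribe a specific rule.

```python
# Hedged sketch of a tunable escalation rule; the thresholds and names are
# assumptions for illustration, not a rule prescribed by the article.

from dataclasses import dataclass

@dataclass
class Conversation:
    failed_turns: int = 0   # consecutive turns that ended in the fallback reply
    total_turns: int = 0

MAX_FAILED_TURNS = 2    # hand off after two misunderstandings in a row
MAX_TOTAL_TURNS = 12    # ...or once a conversation has dragged on too long

def should_escalate(convo: Conversation) -> bool:
    """Escalate neither too early (congestion) nor too late (frustration)."""
    return (convo.failed_turns >= MAX_FAILED_TURNS
            or convo.total_turns >= MAX_TOTAL_TURNS)

def handle_turn(convo: Conversation, intent: str) -> str:
    """Update counters for one customer message and decide who answers next."""
    convo.total_turns += 1
    if intent == "fallback":
        convo.failed_turns += 1
    else:
        convo.failed_turns = 0  # a successfully understood turn resets the count
    return "handoff_to_agent" if should_escalate(convo) else "continue_with_bot"
```

Counting which customer messages end in the fallback also yields the monitoring data the last tip asks for: a list of the questions your chatbot’s knowledge base should cover next.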

Lena Waizenegger is a lecturer and Angsana Techatassanasoontorn is an associate professor at Auckland University of Technology. This commentary first appeared on The Conversation.


Candidates are customers - and a bad chat interaction can stop their job-shopping experience fast
