
DURBAN - Whether in a bid to be seen as a thought leader or out of a more business-driven need to improve customer service channels, organisations across the spectrum are investigating the potential AI offers.

Consumers are already using AI, whether they realise it or not. From AI poster-child Amazon to Google’s product suite and a ton of Facebook features, AI has gone from research lab to mainstream relatively quickly. And it’s starting to make its presence felt - for better or worse - in the customer experience space too, most visibly with chatbots.

Chatbots sound like a great idea - not quite self-service, and not quite human service, they ostensibly offer a means to help a customer with their problem easily, quickly and cost-effectively too. The reality is often quite different, however. Here we sum up five reasons why your chatbot might be driving you nuts.

1. It doesn’t speak English - this is the first challenge chatbot developers face: making sure that a person using free text or voice is correctly understood by the technology, which, obviously enough, doesn’t actually speak whatever language you’re trying to communicate in (and in SA you have quite a few to choose from). To make sure your chatbot knows what your customer means, a lot of slog work needs to happen to feed the chatbot all possible customer inputs and link them to the relevant intents.
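The slog work described above boils down to hand-captured phrase-to-intent mappings. A minimal sketch, assuming a simple keyword-matching approach (all phrases and intent names here are hypothetical examples):

```python
# Sketch of hand-built intent mapping: every phrasing a customer might use
# has to be captured up front and linked to an intent. Phrases and intent
# names are hypothetical.
INTENT_PHRASES = {
    "reset_password": ["reset my password", "forgot password", "can't log in"],
    "billing_query": ["charged twice", "query my bill", "invoice is wrong"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose known phrasing appears in the utterance."""
    text = utterance.lower()
    for intent, phrases in INTENT_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "unknown"  # anything not captured up front falls through

print(match_intent("Hi, I forgot password and need help"))  # reset_password
print(match_intent("Ngicela usizo ngebhili yami"))          # unknown
```

The second call illustrates the language problem: an isiZulu request falls straight through to "unknown" unless someone has done the same capture work for that language too.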

2. You’re hoping your customers will teach it - many organisations put out chatbots that use training data (knowledge bases and unstructured data) as a starting point and, based on the responses they get, use machine learning to improve over time. The problem is, your customers don’t want to be teachers; they want their problem solved - and quickly, too.

3. It’s in the documentation - if, like many organisations, you have a knowledge base in place, you are likely trying to leverage it for your AI initiatives. Chatbots hooked into knowledge bases end up being a smart search facility and not much else, though. At best they can find a sentence or paragraph to feed back to the customer, and if the answer isn’t in the documentation, they can’t give it.

4. It’s lacking logic - for a chatbot to truly be able to answer customers’ queries and solve their problems, it needs to understand the context of the problem and, if need be, ask questions to get that context. Chatbots using hard-coded decision-tree logic can’t do that: if a path isn’t coded, they don’t know how to handle it.
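The decision-tree limitation can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation; node and branch names are made up:

```python
# Sketch of hard-coded decision-tree logic: each branch must be coded in
# advance, so any conversation path the developers didn't anticipate
# dead-ends. Node and branch names are hypothetical.
TREE = {
    "start": {"billing": "billing_node", "technical": "tech_node"},
    "billing_node": {"refund": "refund_node"},
}

def walk(answers):
    """Follow the customer's answers through the tree."""
    node = "start"
    for answer in answers:
        branches = TREE.get(node, {})
        if answer not in branches:
            return "Sorry, I can't help with that."  # uncoded path dead-ends
        node = branches[answer]
    return node

print(walk(["billing", "refund"]))   # refund_node - a coded path works
print(walk(["billing", "upgrade"]))  # dead-ends: nobody coded this path
```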

5. Formatting error - if you want to operate in the digital era and drive logic through data, then you need to capture that logic as data - data tables, to be precise. This approach allows highly complex, prescriptive logic to be captured as structured data, and means the chatbot can use contextually relevant, adaptive logic to give the customer a relevant, and even intelligent, answer.
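A minimal sketch of what table-driven logic could look like, assuming a simple rule table matched against the customer's context (columns, values and answers are all hypothetical; the point is that new situations mean new rows, not new code):

```python
# Sketch of table-driven logic: rules live in structured data rather than
# hard-coded branches, so the bot matches on context and can ask for more
# context instead of dead-ending. All rows here are hypothetical.
RULES = [
    {"product": "mobile", "issue": "no_signal", "tier": "any",
     "answer": "Reset your network settings, then restart the handset."},
    {"product": "mobile", "issue": "billing", "tier": "premium",
     "answer": "A dedicated agent will call you back within the hour."},
]

def answer(context: dict) -> str:
    """Return the first rule's answer whose columns all match the context."""
    for rule in RULES:
        if all(rule[k] in (context.get(k), "any")
               for k in ("product", "issue", "tier")):
            return rule["answer"]
    return "Let me ask a few questions first."  # gather context, don't give up

print(answer({"product": "mobile", "issue": "no_signal", "tier": "standard"}))
```

Adding a new product, issue or customer tier is a matter of appending rows to the table, which is what lets the logic stay adaptive without a developer re-coding every path.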

Ryan Falkenberg, co-Chief Executive of CLEVVA