Customer experience automation is rapidly evolving, reshaping how companies engage with customers and clients. LLM-powered virtual assistants, chatbots, and virtual agents promise to become the new faces of customer experience automation. Are LLMs the right approach to CX automation? What other infrastructure and investments are required?
We will have to dive a bit deeper to understand whether LLMs are truly capable of automating CX in your business.
Pros of LLM-Powered CX Automation
- Reduced Costs
Automation should significantly reduce overhead. This could come from automating common tasks performed by agents, or from augmenting agent knowledge and training with virtual agent assistants. Something as basic as automatic call summarization and topic extraction can reduce pre- and post-call operations by 30%, representing millions of dollars in savings. Further advances, such as contextually aware AI-powered agent assistants, will have an even more profound impact.
- Increased Customer Convenience
Customers appreciate swift and round-the-clock service. 64% of consumers expect real-time responses from companies. While chatbots promised to eliminate long phone wait times and provide 24/7 service, customers were often left frustrated. However, a next-generation interaction with an LLM-powered AI agent, whether through chat, email, voice or video, could meaningfully close the customer experience gap.
- Less Stress for Customer Service Agents
Automation eases the load on human agents. For example, customer experience automation platforms handle initial query sorting and prioritization, alleviating agent workload. Collecting key customer information and visuals before an agent is assigned is an easy way to lighten agents' workload without heavy AI investment. Infusing Generative AI, such as LLMs, could further enhance digital customer interactions, making them more efficient.
- AI-Enhanced Human Customer Service Interactions
Agents, when unburdened by high workloads, can provide superior customer service to the customers who most need human support – whether due to high levels of anxiety or frustration, because they are higher-value customers, or because their issues are more complex.
Cons of LLM-Powered CX Automation
- Technical Hurdles: Complex Requests and Context Maintenance
Most of today’s early LLM chatbots struggle to understand complex requests or questions. At best, most LLM-powered chatbots can provide basic answers to common questions, with many duplicating the functionality previously relegated to a far-lower-cost FAQ page. Providing step-by-step guidance through a support flow requires a level of cognition and context that must be engineered as middleware between the LLM and the user; this cognitive layer maintains context throughout an interaction (a minimal sketch of the idea follows this list). Furthermore, LLM-powered CX automation may not be capable of, or appropriate for, handling all customer inquiries. At times, interactions must be routed to, or supervised by, human agents due to the limitations of LLM-powered chatbots and virtual agents.
- High Cost to Value of Many LLM-Powered Chatbots
LLM tokens do not grow on trees. In addition to the cost of securely hosting, training, testing, optimizing and possibly fine-tuning the LLM, one must factor in the cost of recruiting experienced AI developers, developing cognitive middleware to maintain context within the support flow, and so much more. If an LLM-powered chatbot is only capable of duplicating the functionality of an FAQ page, it will be very difficult to prove measurable or substantial ROI.
- Risk of Stilted or Rigid Customer Experience
Many chatbots are restricted to a limited set of interactions, leading to a less natural or intuitive customer experience. Early LLM chatbots have been similarly restricted due to concerns about LLM hallucinations and off-topic guidance. Furthermore, when chatbots cannot address the customer’s needs, they must be able to transition the conversation to a human agent. This service continuity unfortunately remains a challenge for many enterprise service organizations built around strict operational silos.
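To make the cognitive middleware idea from the technical-hurdles point above concrete, here is a minimal Python sketch. It is an illustration under stated assumptions, not a description of any particular product: the `llm_complete` stub, the `SupportContext` fields, and the three-step escalation threshold are all hypothetical. The point is simply that the middleware, not the LLM, owns the conversation state and decides when to hand off to a human.

```python
from dataclasses import dataclass, field


@dataclass
class SupportContext:
    """Conversation state the middleware preserves across turns."""
    customer_id: str
    issue: str = ""
    steps_taken: list[str] = field(default_factory=list)
    escalate: bool = False


def llm_complete(prompt: str) -> str:
    """Placeholder for whatever LLM endpoint you actually call."""
    return "Please power-cycle the hub and check the status light."


def handle_turn(ctx: SupportContext, user_message: str) -> str:
    """Cognitive middleware: wrap the raw LLM call with context and guardrails."""
    # 1. Escalate instead of looping once the scripted flow is exhausted.
    if len(ctx.steps_taken) >= 3:
        ctx.escalate = True
        return "Let me connect you with a human agent who can take this further."

    # 2. Build a prompt that includes who the user is and what was already tried,
    #    so the LLM does not simply repeat earlier suggestions.
    prompt = (
        f"Customer {ctx.customer_id} reports: {ctx.issue or user_message}\n"
        f"Steps already taken: {ctx.steps_taken or 'none'}\n"
        f"Customer says: {user_message}\n"
        "Suggest the single next troubleshooting step, not one already taken."
    )
    reply = llm_complete(prompt)

    # 3. Record the step so later turns stay grounded in the support flow.
    ctx.steps_taken.append(reply)
    return reply


if __name__ == "__main__":
    ctx = SupportContext(customer_id="C-1042", issue="Smart home hub offline")
    print(handle_turn(ctx, "My hub won't connect to the monitoring service."))
    print(handle_turn(ctx, "I restarted it, still not working."))
```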
The Need for Multi-Sensory AI & Contextual Cognition
We advocate for a multi-modal approach that combines vision, voice, and text through a contextual cognition framework to enhance CX in the real world.
Those are some complex concepts, so let’s break them down.
- Natural and Intuitive Interaction
Multi-sensory AI allows customers to engage naturally, mimicking human conversation. For example, multi-sensory AI integrates visual cues (such as images or videos) and voice-based interactions, allowing customers to interact with AI much like they would with any remote agent or technician. Visual and voice interactions have the added benefit of providing deeper contextual signals. For example, Visual AI can easily identify the make and model of a home security panel and diagnose the issue far faster than a chat interaction with a customer struggling to communicate complex technical information. This contextual data enables the AI-powered assistant or autonomous virtual agent to more quickly, efficiently and cost-effectively guide the customer to full resolution.
- Integrated Contextual Cognition
Think of an LLM as an over-confident teenager. They have all the answers but none of the life experience or expertise required to understand the context of the bigger picture. In colleges and universities, we train these young minds through cognitive development. This cognitive development opens their minds to the wider context of a given topic and the many factors they must consider. This training enables them to enter the adult world with the skills they need to contribute to society meaningfully. LLMs and AI follow a very similar process. Cognitive middleware enables efficient and effective LLM training while providing the context for who the user is, what steps have already been taken, and even what other information is known about this user and their issue from third-party systems.
For example, let’s say a customer needs help connecting their smart home system to the internet-based monitoring service.
- A naive LLM will ask them to restart their system. If that doesn’t work, the chatbot will ask the user to contact support. This will clearly frustrate the customer, who is already chatting directly with support; LLM-based chatbots will typically just repeat whatever is in the user manual or documentation.
- A virtual agent with integrated contextual cognition will follow the ideal support flow to resolve the customer’s issue (a code sketch of this flow follows the list below).
- The virtual agent will first ask the user to see the screen on their smart home hub.
- Visual AI will identify the make, model and status of the smart home hub, seeing that it is working properly and connected to the internet.
- Integrations into the CRM, however, will inform the virtual agent that this user never activated the cloud services on their account.
- The virtual agent can then activate the customer’s cloud services, just as a human agent would, resolving the customer’s issue while potentially increasing any recurring revenues from those cloud services.
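To show how such a flow might hang together in code, here is a minimal Python sketch. The `identify_device`, `cloud_services_active`, and `activate_cloud_services` functions are hypothetical stand-ins for real Visual AI, CRM, and provisioning integrations, and the in-memory `crm` dictionary is purely illustrative; the point is that context pulled from those systems, rather than the LLM alone, drives the resolution.

```python
from dataclasses import dataclass


@dataclass
class DeviceInfo:
    make: str
    model: str
    online: bool


# --- Hypothetical integration stubs; swap in real vision, CRM, and provisioning APIs ---

def identify_device(image_bytes: bytes) -> DeviceInfo:
    """Visual AI stub: recognize the hub's make, model, and status from a photo."""
    return DeviceInfo(make="Acme", model="HomeHub 3", online=True)


def cloud_services_active(crm: dict, customer_id: str) -> bool:
    """CRM stub: check whether the customer ever activated cloud monitoring."""
    return crm.get(customer_id, {}).get("cloud_active", False)


def activate_cloud_services(crm: dict, customer_id: str) -> None:
    """Provisioning stub: turn on the cloud monitoring entitlement."""
    crm.setdefault(customer_id, {})["cloud_active"] = True


def resolve_monitoring_issue(crm: dict, customer_id: str, hub_photo: bytes) -> str:
    """Follow the contextual support flow instead of generic 'restart it' advice."""
    device = identify_device(hub_photo)

    if not device.online:
        return f"Your {device.make} {device.model} is offline; let's reconnect it first."

    # The device is healthy, so the root cause is likely account-side, not hardware.
    if not cloud_services_active(crm, customer_id):
        activate_cloud_services(crm, customer_id)
        return (
            f"Your {device.make} {device.model} is working fine. The cloud monitoring "
            "service was never activated on your account, so I've activated it now."
        )

    return "Device and account both look healthy; let me bring in a human agent."


if __name__ == "__main__":
    crm = {"C-1042": {"cloud_active": False}}
    print(resolve_monitoring_issue(crm, "C-1042", hub_photo=b"<image bytes>"))
```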
Conclusion
The future of CX is not just about automating tasks; it’s about creating meaningful connections with customers through intelligent, intuitive experiences. So is LLM-powered CX Automation ready for prime time? LLMs and generative AI have the potential to revolutionize CX, but success lies in understanding their strengths and limitations. With the right infrastructure and cognitive middleware, LLMs and multi-sensory AI are ready to augment, automate and transform enterprise CX today!
To learn more about TechSee’s Sophie AI and how we can help automate and improve your customer experience, please contact us today.