
Contact Lens for Amazon Connect Provides Real-Time Insights From Machine Learning

Insights gained through ML allow supervisors and agents to improve customer interactions immediately

Key Takeaways:

  • Support center managers need actionable insights that can’t be attained by listening to a call here and there.
  • Contact Lens provides real-time call analytics powered by machine learning (ML), natural language processing, and speech analytics.
  • Real-time call analytics allow quick dissection of call transcripts to find sentiments and trends. 
  • Set up real-time alerts through automatic contact categorization, powered by ML.
  • Contact Lens helps identify opportunities for re-training and improving agent skills to provide the best customer experience in your industry.

To win the race against the competition, organizations must exploit customer insights at every touchpoint to deliver personalized experiences, reduce friction, and build relationships. This requires real-time insights, the kind that can’t be gathered simply by listening to random calls in progress.

With Contact Lens for Amazon Connect, Machine Learning (ML) and real-time call analytics provide contact center supervisors with immediate insights into customer experience issues and empower them to offer proactive assistance and resolve problems quickly. 

The valuable insights offered by Contact Lens mean supervisors can gauge customer sentiment, identify trending issues, and analyze customer feedback. Agent effectiveness can be judged, successful interactions can be replicated, and overall customer service improved. These tools offer unparalleled opportunities for consistency and improvement in customer service.

7 key benefits of Contact Lens powered by machine learning

Contact Lens doesn’t just enable a better understanding of how your customers feel and what they want – this data can also be used to train agents, ensure compliance with greeting and sign-off policies, highlight opportunities for re-training, and inform future employee onboarding.

Real-time capabilities mean supervisors are alerted to issues during live calls, and with transcripts of each call, supervisors can use a fast full-text search to highlight recurring issues and solve them. The tools offered through Contact Lens and ML include:

1. Comprehensive analytics 

Customer sentiment analysis is made easy with insights provided by ML, Natural Language Processing, and speech-to-text analytics. With a few clicks in your Contact Lens dashboard, you can dissect call transcripts, conversation characteristics, and sentiment to find issues and trends that provide coaching opportunities.

2. Conversational search

Using keywords, customer and agent sentiment scores, and non-talk time, you can identify the most relevant calls to help you gain CX insights and improve customer happiness. You’ll discover which words and phrases come up often in the way calls end, whether positive or negative, and can use this information to improve agent coaching. 

3. Alerts in real-time

Of course, not every call needs to involve a supervisor, so Contact Lens allows you to create rules that tag customer experience issues based on matching keywords and phrases. These can be anything from “not happy” to “I want a refund” to “cancel my subscription.” This feature lets supervisors know when an agent needs help on a live call, so they can step in via chat or ask the agent to transfer the call.

4. Immediate transcripts

Among frequent customer frustrations is the need to repeat their information over and over as their call is transferred from one support agent to another. With Contact Lens, when a call is transferred, the agent can pass along a real-time transcript and details of the conversation to the next agent. This means customers don’t have to give the same information more than once, and there’s no waiting on hold while the new agent tries to find the information they need.

5. Privacy guards

Contact Lens finds and redacts sensitive data automatically — including names, addresses, and social security numbers — from transcripts and recordings to protect customer privacy. You control access to this redacted data through user-defined permission groups.
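To make the redaction behavior concrete, here is a minimal Python sketch of scanning a Contact Lens analysis transcript for redacted turns and tallying customer sentiment. The JSON shape and the `[PII]` placeholder are modeled on the Contact Lens output format but should be treated as assumptions; the helper names are made up for illustration.

```python
import json

# Illustrative sample modeled on the Contact Lens analysis output;
# field names and the "[PII]" placeholder are assumptions, not guarantees.
sample_output = json.loads("""
{
  "Transcript": [
    {"ParticipantId": "AGENT", "Content": "Can I have your name, please?", "Sentiment": "NEUTRAL"},
    {"ParticipantId": "CUSTOMER", "Content": "Sure, it's [PII].", "Sentiment": "NEUTRAL"},
    {"ParticipantId": "CUSTOMER", "Content": "I want a refund.", "Sentiment": "NEGATIVE"}
  ]
}
""")

def redacted_turns(analysis):
    """Return the transcript turns whose content was redacted."""
    return [t for t in analysis["Transcript"] if "[PII]" in t["Content"]]

def sentiment_counts(analysis):
    """Tally sentiment labels across customer turns only."""
    counts = {}
    for turn in analysis["Transcript"]:
        if turn["ParticipantId"] == "CUSTOMER":
            counts[turn["Sentiment"]] = counts.get(turn["Sentiment"], 0) + 1
    return counts

print(len(redacted_turns(sample_output)))  # 1
print(sentiment_counts(sample_output))
```

Because the redacted data lives in ordinary JSON, downstream tooling like this can run over transcripts without ever touching the original sensitive values.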

6. An ML-powered categorization engine

Tracking every conversation for company policy and regulatory requirements is made easy with an ML-fueled engine trained to understand not just spoken phrases, but their intent and context. Using category labels, supervisors can set up scorecards to show the percentage of agents who complied with company greeting and sign-off policies. 

7. Detailed contact trace records

The Contact Search page of the Contact Lens dashboard allows a supervisor to see the contact trace record of any customer interaction, including ID, queue name, phone number, agent name, and the recording of the call. By listening to the call and viewing a visualization of the interaction, supervisors can easily identify things like interruptions, sentiment, and non-talk time.

How to set up Contact Lens

Getting started with and configuring Contact Lens is easy with these simple steps:

1. In your Amazon Connect contact flow, add a Set recording and analytics behavior block. Open the block and choose Agent and Customer. This enables call recording.
2. Next, configure the analytics behavior. Choose Enable Contact Lens for speech analytics. Speech analytics produces transcripts, customer sentiment, and other data either in real time, post-call, or both. While there are two modes of operation, you should enable both real-time and post-call analytics to get the most out of Contact Lens; this option enables real-time alerts and speech analytics for live calls.
3. Choose the language of the calls you want to analyze. Contact Lens supports 18 post-call and four real-time languages.
4. Enable redaction by selecting Redact sensitive data. Sensitive information will then be automatically detected and redacted from all call transcripts and audio recordings after the call disconnects.

    You can generate both redacted and original transcripts and audio by selecting Generate both redacted and original transcripts and audio, or you can generate only a redacted transcript by selecting Generate redacted transcript only and both redacted and original audio.

5. Now, set up automatic contact categorization to define the customer experience issues that trigger real-time alerts. Specify keywords and phrases using the categories on the Amazon Connect Rules page to identify scenarios, then choose between exact matching, pattern matching, or ML-based semantic matching.
6. After you’ve created your real-time categories, Contact Lens will analyze live calls, detect when rule criteria are met, and alert supervisors via Amazon Connect’s real-time metrics dashboard. Supervisors can listen in on live calls, provide guidance over chat, or, if necessary, transfer the call to themselves or another agent.
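The exact and pattern matching modes from step 5 can be sketched in a few lines of Python. This is a toy illustration of the idea, not the actual Contact Lens rules engine; the rule names and phrases are made up.

```python
import re

# Toy rule set: "exact" rules match a literal phrase, "pattern" rules a regex.
# These names and phrases are invented for illustration only.
rules = {
    "refund-request": {"mode": "exact", "phrases": ["i want a refund", "refund my money"]},
    "cancellation": {"mode": "pattern", "phrases": [r"cancel (my )?(subscription|account)"]},
}

def matched_categories(utterance, rules):
    """Return the rule names whose phrases match the (lowercased) utterance."""
    text = utterance.lower()
    hits = []
    for name, rule in rules.items():
        for phrase in rule["phrases"]:
            matched = (phrase in text) if rule["mode"] == "exact" else bool(re.search(phrase, text))
            if matched:
                hits.append(name)
                break  # one matching phrase is enough for this rule
    return hits

print(matched_categories("Please cancel my subscription today", rules))  # ['cancellation']
print(matched_categories("I want a refund right now", rules))            # ['refund-request']
```

ML-based semantic matching goes further by matching on intent rather than surface phrasing, which is why Contact Lens can catch a frustrated customer who never utters the exact trigger words.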

You can also build custom solutions for your own use cases with the synchronous real-time analytics API.
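As a rough sketch of what such a custom solution might look like, the boto3 snippet below polls for real-time analysis segments and surfaces negative customer utterances. The live call is commented out because it needs AWS credentials and an active contact; the instance and contact IDs are placeholders, the helper function is made up, and the response shape is assumed from the ListRealtimeContactAnalysisSegments API.

```python
# The actual API call (requires AWS credentials and a live contact):
#
# import boto3
# client = boto3.client("connect-contact-lens")
# response = client.list_realtime_contact_analysis_segments(
#     InstanceId="your-instance-id",  # placeholder
#     ContactId="your-contact-id",    # placeholder
# )

def negative_customer_segments(response):
    """Pick out customer transcript segments flagged with NEGATIVE sentiment."""
    flagged = []
    for segment in response.get("Segments", []):
        transcript = segment.get("Transcript")
        if (transcript
                and transcript.get("ParticipantRole") == "CUSTOMER"
                and transcript.get("Sentiment") == "NEGATIVE"):
            flagged.append(transcript["Content"])
    return flagged

# Sample response for illustration; the shape is an assumption based on the API docs.
sample_response = {
    "Segments": [
        {"Transcript": {"ParticipantRole": "AGENT", "Content": "How can I help?", "Sentiment": "NEUTRAL"}},
        {"Transcript": {"ParticipantRole": "CUSTOMER", "Content": "This is the third time I've called.", "Sentiment": "NEGATIVE"}},
        {"Categories": {"MatchedCategories": ["repeat-caller"]}},
    ]
}

print(negative_customer_segments(sample_response))
```

A custom dashboard or alerting pipeline could run a loop like this during live calls and route flagged utterances wherever your supervisors already work.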

Real-time analytics powered by ML means that, with Contact Lens for Amazon Connect, you can not only retain but also build customer trust by solving issues right away. You can also build a top-notch customer support team that’s fast, efficient, and builds profitable customer relationships.

We’re the AWS experts. From cloud consulting to managed services and beyond, contact the CloudHesive team today to learn how we can help you build a robust cloud strategy that increases operational efficiencies.