Enlighten Autopilot

Enlighten Autopilot is a full-service, data-driven intelligent virtual agent: a software application that handles customer interactions in place of a live human agent. It handles voice and digital interactions, such as voice calls, email, chat, and social media conversations. Autopilot understands human language and responds to contacts naturally. It can switch topics, understand context, and answer questions. It resolves issues through self-service, limiting escalations to live agents.

Autopilot combines with Enlighten XO artificial intelligence (AI). Enlighten XO comes equipped with over 30 years of interaction data. Autopilot uses that data, along with your organization's interaction data, to build conversation flows. These flows include lists of possible utterances and appropriate responses. Using AI, Autopilot continues to learn from interactions to improve and add to these flows.

Autopilot can also:

  • Identify tasks agents may need to complete to resolve a contact's issue.

  • Identify back-end tasks that Autopilot itself can do, such as fetching a contact's data.

  • Proactively send messages to contacts based on their identified needs and interests.

Enlighten Autopilot Knowledge is a separate knowledge base (KB) bot.

For example, Edward Ferrars is chatting with Autopilot, the virtual agent for Classics, Inc. He tells Autopilot he wants to pay the balance from his latest book order. Autopilot begins the conversation flow for payments. It fetches his billing information and finds two credit cards. It asks Edward which one he wants to use. He answers "Credit Card A," but then changes his mind and says, "Actually, use Credit Card B." Autopilot changes the credit card selection.

Edward then asks for help changing his password. Autopilot switches to the conversation flow for that intent.

When that's done, Autopilot remembers that Edward's first intent was payment. It switches back to that conversation flow and asks Edward if he'd like to remove Credit Card A from his billing information. He answers "Yes," and Autopilot removes it.

How Virtual Agents Work

The beginning of the conversation differs for voice and text virtual agents.

After the conversation has started, the virtual agent analyzes the contact's utterances to understand the purpose or meaning behind what a person says. This is known as the contact's intent. When the intent is identified, the virtual agent sends an appropriate response to the contact.

Requests and responses are sent through Virtual Agent Hub and the Studio script with each turn. This allows you to customize the virtual agent's behavior from turn to turn. For voice virtual agents, this is the utterance-based method of connection. All text virtual agent providers use this method.

At the end of the conversation, the virtual agent sends a signal to the Studio script. It can signal that the conversation is complete, or that the contact needs to speak with a live agent. If the conversation is complete, the interaction ends. If a live agent is needed, the script makes the request. The contact is transferred to an agent when one is available.

Once the conversation is complete, post-interaction tasks can be performed, such as recording information in a CRM (a third-party system that manages contacts, sales information, support details, and case histories).
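The sketch below illustrates this turn-by-turn exchange in TypeScript. It is a minimal mock, not an actual Virtual Agent Hub or Studio API: the names sendToVirtualAgent, VirtualAgentReply, and the signal values are hypothetical placeholders used only to show how each turn's request and response, the end-of-conversation signal, the escalation path, and the post-interaction step fit together.

```typescript
type Signal = "CONTINUE" | "COMPLETE" | "ESCALATE";

interface VirtualAgentReply {
  responseText: string; // message to send back to the contact
  intent?: string;      // intent identified from the contact's utterance
  signal: Signal;       // end-of-conversation signal, or CONTINUE for another turn
}

// Mock provider call. In a real deployment, this request goes through
// Virtual Agent Hub to the virtual agent provider on every turn.
async function sendToVirtualAgent(utterance: string): Promise<VirtualAgentReply> {
  if (/agent|human/i.test(utterance)) {
    return { responseText: "Connecting you to an agent.", intent: "escalate", signal: "ESCALATE" };
  }
  if (/thanks|bye/i.test(utterance)) {
    return { responseText: "Glad I could help!", intent: "goodbye", signal: "COMPLETE" };
  }
  return { responseText: "Sure, I can help with that.", intent: "billing", signal: "CONTINUE" };
}

// One request/response cycle per turn, until the virtual agent signals an end state.
async function runConversation(utterances: string[]): Promise<void> {
  for (const utterance of utterances) {
    const reply = await sendToVirtualAgent(utterance);
    console.log(`Contact:   ${utterance}`);
    console.log(`Autopilot: ${reply.responseText}`);

    if (reply.signal === "COMPLETE") {
      console.log("Conversation complete: run post-interaction tasks (for example, update the CRM).");
      return;
    }
    if (reply.signal === "ESCALATE") {
      console.log("Escalation: transfer the contact to a live agent when one is available.");
      return;
    }
  }
}

runConversation(["I want to pay my balance", "Thanks, bye"]);
```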

Enlighten Autopilot and Digital Channels

If your channel supports it, you can include rich media content, such as buttons, images, menus, and option pickers, in your messages. The type of rich media that can be sent differs from channel to channel, as shown in the following table.

| Channel | Adaptive Cards | HTML & Markdown Text | Rich Link | Quick Replies | List Picker | Time Picker | Form message |
|---|---|---|---|---|---|---|---|
| Apple Messages for Business | ✗ | ✗ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Digital Chat | ✓ | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ |
| Email | ✗ | ✓ | ✗ | Uses fallback text | ✗ | ✗ | ✗ |
| Facebook Messenger | ✗ | ✗ | ✓ | ✓ | ✓ | ✓ | ✗ |
| WhatsApp | ✗ | ✓ | ✓ | ✓ | ✓ | ✓ | ✗ |
| Google Business Messages | ✗ | ✗ | ✓ | ✗ | ✓ | ✓ | ✗ |

✓ = Supported   ✗ = Not supported

Learn more about digital channel support for rich media.

To include rich media content in text virtual agent responses, configure it in your virtual agent's management console, in the configuration for each response that sends the rich media.

Rich media content is sent as JSON. When building your rich media JSON, follow the schema for the digital channel you're using; the schemas differ for each channel. Find the JSON for the media content you want to use, then add it to the response message configurations you create in your virtual agent provider's configuration console. Learn more about working with rich media in Studio scripts. You can use the Digital Experience JSON mirror tool to verify your JSON before adding it to your scripts or virtual agent.
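As an illustration only, the TypeScript sketch below builds a quick replies payload as a typed object and serializes it to JSON. The property names (type, text, replies, label, value) are hypothetical and do not match any particular channel's schema; always follow the schema documented for the channel you're targeting.

```typescript
interface QuickReply {
  label: string; // text shown on the button
  value: string; // value returned when the contact taps the button
}

interface QuickRepliesMessage {
  type: "quickReplies";
  text: string;          // prompt shown above the buttons
  replies: QuickReply[];
}

// Hypothetical payload shape; each real channel defines its own schema.
const paymentPrompt: QuickRepliesMessage = {
  type: "quickReplies",
  text: "Which card would you like to use?",
  replies: [
    { label: "Credit Card A", value: "card_a" },
    { label: "Credit Card B", value: "card_b" },
  ],
};

// Serialize to JSON before adding it to a response configuration in your
// virtual agent provider's console, and verify it (for example, with the
// Digital Experience JSON mirror tool) before using it in a script.
console.log(JSON.stringify(paymentPrompt, null, 2));
```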

Requirements

Autopilot has the same prerequisites and required components as SmartAssist.

Set Up Enlighten Autopilot

The steps below describe the general process of setting up Enlighten Autopilot. If you need help completing this process, contact your CXone Account Representative. They may recommend the SmartAssist App Development package. With this package, CXone Services sets up Autopilot for you.

To set up Autopilot as a voice virtual agent, contact your CXone Account Representative. Setup for voicebots involves custom scripting, which requires additional assistance. The steps below apply to text virtual agents only.

  1. Collect interaction transcriptions from your organization.

  2. Feed those transcriptions into Enlighten XO, which analyzes them and identifies utterances and intents.

  3. Import that data from Enlighten XO into Autopilot. Autopilot uses those utterances and intents to build conversation flows.

  4. Review the custom scripting guidelines for Autopilot.

  5. Follow the steps to integrate Autopilot. This involves Studio scripting.

Integrate Autopilot with a Third-Party Knowledge Base

Your implementation team can integrate Autopilot with CXone Expert or any third-party knowledge base (KB). Autopilot can then use information from your KB articles to help answer your contacts' questions. Autopilot can provide your contacts with the following from your CXone Expert knowledge base:

  • A response composed of information from one or more KB articles.

  • Links and images within KB articles.

  • Links to full KB articles.

If the contact doesn't find the information helpful, Autopilot forwards their interaction to a live agent. For more information, reach out to your CXone Account Representative.
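A minimal, hypothetical sketch of this flow is shown below. The function and field names (searchKnowledgeBase, KbArticle, and so on) are placeholders, not actual Autopilot or CXone Expert APIs; the sketch only illustrates composing an answer from article excerpts, linking to the full articles, and falling back to a live agent when the contact says the answer didn't help.

```typescript
interface KbArticle {
  title: string;
  url: string;
  excerpt: string; // snippet used to compose the answer
}

// Mock search standing in for a real knowledge base query.
async function searchKnowledgeBase(_question: string): Promise<KbArticle[]> {
  return [
    {
      title: "Reset your password",
      url: "https://example.com/kb/reset-password",
      excerpt: "You can reset your password from the account settings page.",
    },
  ];
}

// Compose an answer from one or more articles, include links to the full
// articles, and hand off to a live agent if the contact wasn't helped.
async function answerFromKnowledgeBase(question: string, feedback: string): Promise<string> {
  const articles = await searchKnowledgeBase(question);

  const answer =
    articles.map((a) => a.excerpt).join(" ") +
    "\nRead more:\n" +
    articles.map((a) => `- ${a.title}: ${a.url}`).join("\n");

  if (/^(no|not really)/i.test(feedback)) {
    return "Transferring you to a live agent.";
  }
  return answer;
}

answerFromKnowledgeBase("How do I reset my password?", "yes").then(console.log);
```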

Voice Biometrics Authentication

Content in this section is for a product or feature in controlled release (CR). If you are not part of the CR group and would like more information, contact your CXone Account Representative.

Contacts can use their voice as their password. Instead of providing a username and password to prove their identity, they can simply begin speaking to Autopilot. As they explain their reason for calling, the system analyzes their voice in the background, so the call flow isn't interrupted. Contacts skip the authentication process and can immediately start solving their issue. Voice biometrics also removes the need to remember authentication details, making the call quick and efficient.

Voice biometrics works by comparing the contact's speech with their voiceprint in real time. A voiceprint is a distinct pattern of characteristics of the speaker's voice; the system creates one if the contact agrees to it. On future calls, the system compares the contact's speech with their voiceprint as they speak. If it matches, the call continues without interruption. If the match fails, Autopilot gracefully interrupts the conversation.
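The TypeScript sketch below shows one way this passive check could be structured. It is an assumption-heavy illustration: the names compareWithVoiceprint and VoiceprintMatch, and the 0.8 match threshold, are hypothetical and are not Omilia or Autopilot APIs.

```typescript
interface VoiceprintMatch {
  enrolled: boolean; // the contact consented and a voiceprint exists
  score: number;     // similarity between live speech and the stored voiceprint (0 to 1)
}

// Mock comparison standing in for the real-time biometric engine.
async function compareWithVoiceprint(_contactId: string, _audioChunk: Uint8Array): Promise<VoiceprintMatch> {
  return { enrolled: true, score: 0.93 };
}

// Runs in the background while the contact explains their reason for calling,
// so authentication never interrupts the call flow.
async function verifyCaller(contactId: string, audioChunk: Uint8Array): Promise<"verified" | "failed" | "not enrolled"> {
  const match = await compareWithVoiceprint(contactId, audioChunk);

  if (!match.enrolled) return "not enrolled"; // offer to create a voiceprint, with the contact's consent
  if (match.score >= 0.8) return "verified";  // hypothetical threshold: call continues without interruption
  return "failed";                            // Autopilot gracefully interrupts the conversation
}

verifyCaller("contact-123", new Uint8Array()).then(console.log);
```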

Voice biometrics authentication is powered by Omilia. To set it up, you must integrate Omilia with Autopilot using Agent Assist Hub.