Data Management

Contact centers deal with huge amounts of data, from reporting data to calling lists. You may need to transfer data to and from CXone and you may need to work with data within CXone. How you work with data in Studio scripts or APIs impacts the performance of your system and the quality of your interactions. This page helps you manage data efficiently.

The Right Tool for the Right Job

The primary purpose of Studio is to control contact routing. Anything you do in a Studio script should be contact-focused. Any data processing you need that is not contact-related should be done outside of Studio. Studio is not designed to process large amounts of data, so it has a data limit of 32KB. This is sufficient when working with contacts and keeps the servers running efficiently.

The following is a task that requires data management, with two examples of how you might accomplish it.

Example Task: Analyze agent data every day to identify potential issues like overly long breaks or unscheduled breaks.

Inappropriate Solution: Create a scheduled Studio script that runs every day. The script first makes API calls to pull the agent list for the day, then pulls the agent state history for the day. The script then checks whether any agent took an unscheduled break longer than 30 minutes. It also searches for any overly long breaks, like a four-hour bathroom break. The script also determines which agents spent the most time in an unavailable state, and which agents spent the most time on a particular skill.

Why this method is bad:

  • This task is not focused on a single contact; Studio is not the right tool for the job.

  • Studio is not intended for processing large amounts of data; it's designed for contact handling. Your tenant may suffer performance issues, since holding a lot of data in memory leads to poor server performance.

  • Studio has a data limit of 32KB. This method requires you to chunk data into small amounts to stay under that limit. Therefore, the script could run for a long period of time, which is resource intensive.

  • This task is better solved by an engineering team using CXone APIs, rather than by a Studio script.

  • At best, a Studio script could identify simple facts, like which agent spent the most time in a certain state or handled the fewest calls. Processing the data on an external server allows you to produce more valuable metrics.


Good Solution: This is only one potential method of solving the task. Solutions for your scenarios may require a different approach.

One method of solving this task is with Python running on an AWS server, which offers more processing power. From the server, make the same API calls to pull the agent list and agent state history for the day, then load the data into a database table or an array. A database table is preferable because it lets you compare and analyze the data more easily in the future. The returned data could be several megabytes, which is no problem for a database but exceeds the data limit in Studio. You could hold the entire state history for every agent in memory, giving you the full dataset in one place to work with instead of small pieces of data in a Studio script.

Now you could begin working with the data to produce your desired metrics, like an ordered list of agents based on time spent in certain states. A database helps you present this data as you see fit, rather than within the constraints of a Studio script.
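
As an illustration only, the following Python sketch follows that approach. The endpoint paths, field names, table schema, and date are placeholders for this example, not the actual CXone APIs; substitute the real reporting endpoints and your own authentication.

    # Minimal sketch: pull a day's agent state history into a local database
    # and flag unusually long breaks. The endpoint paths, parameters, and
    # field names below are placeholders, not the actual CXone API.
    import sqlite3
    import requests

    API_BASE = "https://api.example.com"           # placeholder base URL
    HEADERS = {"Authorization": "Bearer <token>"}  # substitute real authentication

    def fetch(path, params=None):
        response = requests.get(f"{API_BASE}{path}", headers=HEADERS, params=params)
        response.raise_for_status()
        return response.json()

    # Pull the agent list and each agent's state history for the day.
    agents = fetch("/agents")                                        # hypothetical endpoint
    names = {a["agentId"]: a["name"] for a in agents}
    history = fetch("/agent-state-history", {"date": "2024-05-01"})  # hypothetical endpoint

    # Load the full dataset into one table so it's queryable in one place.
    db = sqlite3.connect("agent_states.db")
    db.execute("""CREATE TABLE IF NOT EXISTS states
                  (agent_id TEXT, state TEXT, scheduled INTEGER, seconds INTEGER)""")
    db.executemany(
        "INSERT INTO states VALUES (?, ?, ?, ?)",
        [(r["agentId"], r["state"], int(r.get("scheduled", False)), r["durationSeconds"])
         for r in history],
    )
    db.commit()

    # Flag unscheduled breaks longer than 30 minutes.
    long_breaks = db.execute(
        """SELECT agent_id, seconds FROM states
           WHERE state = 'Break' AND scheduled = 0 AND seconds > 1800"""
    ).fetchall()

    # Rank agents by total time spent in an unavailable state.
    unavailable = db.execute(
        """SELECT agent_id, SUM(seconds) AS total FROM states
           WHERE state = 'Unavailable' GROUP BY agent_id ORDER BY total DESC"""
    ).fetchall()

    print("Unscheduled long breaks:", [(names.get(a, a), s) for a, s in long_breaks])
    print("Most time unavailable:", [(names.get(a, a), s) for a, s in unavailable[:10]])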

During active calls, The Jungle contact center wants to pull customer data from their CRM and display it in their agent application. In Studio, they first capture basic identifying information from the contact, like an account number. Then, the script makes a GET request to their CRM, checking if the information matches a customer record. If a record exists, it returns all of the customer data. The Jungle does not want to display all of the customer data, and the API does not allow them to specify what to return.

Their solution is to build a middleware layer that sits between Studio and the CRM. The API returns the JSON to the middleware, which parses out the desired customer details and passes them to Studio. This also allows The Jungle to stay within the 32KB data limit.
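
A middleware layer like this can be quite small. The following Python sketch uses Flask and assumes a hypothetical CRM endpoint and field names: it accepts an account number from Studio, calls the CRM, and returns only the fields the agent application needs.

    # Minimal middleware sketch (Flask): Studio sends an account number, the
    # middleware queries the CRM, and only a small, fixed set of fields goes
    # back to the script. The CRM URL and field names are placeholders.
    import requests
    from flask import Flask, jsonify

    app = Flask(__name__)
    CRM_URL = "https://crm.example.com/api/customers"     # placeholder CRM endpoint
    WANTED_FIELDS = ("name", "accountTier", "openCases")  # hypothetical field names

    @app.route("/customer/<account_number>")
    def customer(account_number):
        crm_response = requests.get(f"{CRM_URL}/{account_number}", timeout=5)
        crm_response.raise_for_status()
        record = crm_response.json()
        # Return only the fields the agent application needs, keeping the
        # payload well under Studio's 32KB limit.
        return jsonify({field: record.get(field) for field in WANTED_FIELDS})

    if __name__ == "__main__":
        app.run(port=8080)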

Push vs Poll

In general, you want to use a push-based architecture. Waiting for something to happen by polling inherently costs resources, whereas pushed data arrives on demand; it doesn't consume resources when it's not needed. Pushing data typically lets you work with smaller chunks of data and is often used for real-time needs like a single active contact. Polling is often used for integrations and can't be broken down or segmented as easily.

For example, perhaps you want to update the agent's UI when a phone call is put on hold. If you used the /get-next-event API in a script to listen (or poll) for the hold event from the agent client, that would constantly block a thread. Instead, push the data in a single instance to avoid constant use of server resources. In similar situations, perhaps with CRM integrations, instead of waiting for a request to return, make an API call that pushes the data and frees up the thread. Then, use the signal API to send the data back to the script. You could also make an API call to check whether that request is finished, and if so, send the data.
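
The difference can be summed up in a short Python sketch. The URLs below are placeholders, not actual CXone endpoints: the polling version occupies a thread for the entire wait, while the push version makes one call when the work is done.

    # Contrast of poll vs. push, with placeholder URLs (not actual CXone endpoints).
    import time
    import requests

    JOB_STATUS_URL = "https://api.example.com/jobs/{job_id}/status"     # placeholder
    SIGNAL_URL = "https://api.example.com/scripts/{contact_id}/signal"  # placeholder

    def poll_for_result(job_id):
        """Polling: a thread is occupied for the entire wait."""
        while True:
            status = requests.get(JOB_STATUS_URL.format(job_id=job_id)).json()
            if status["state"] == "done":
                return status["result"]
            time.sleep(5)  # the thread stays tied up between every check

    def push_result(contact_id, result):
        """Push: called once by the external system when the work finishes,
        so nothing holds a thread in the meantime."""
        requests.post(SIGNAL_URL.format(contact_id=contact_id), json={"result": result})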

Process Large Amounts of Data

If you want to manage large datasets, like historical data for your entire contact center, use CXone APIs or CXone Data Share. On a large scale, you can use Snowflake to pull all the data for your business unit; CXone Data Share gives you a direct pipeline for your contact center data, so you're not limited to making API calls to pull data. On a smaller scale, the approach in the example task above may be all you need. Small businesses that don't have a huge amount of data, or businesses that want to find specific metrics, could build small apps for similar tasks. For example, you could build an app that sends emails to managers who have employees who spent too much time in a certain state, as sketched below.
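
The following Python sketch shows what such an app might look like, assuming the metrics were already computed (for example, with the database approach above) and assuming your own SMTP server and addresses.

    # Minimal sketch: email a manager when an agent exceeds a time-in-state
    # threshold. The SMTP host, addresses, and metrics values are assumptions.
    import smtplib
    from email.message import EmailMessage

    THRESHOLD_SECONDS = 2 * 60 * 60  # flag more than two hours in the state

    # Output of an earlier analysis: agent -> (manager email, seconds unavailable)
    metrics = {
        "agent_42": ("manager@example.com", 9300),
        "agent_17": ("manager@example.com", 1200),
    }

    with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder SMTP host
        for agent, (manager, seconds) in metrics.items():
            if seconds <= THRESHOLD_SECONDS:
                continue
            msg = EmailMessage()
            msg["Subject"] = f"{agent} spent {seconds // 60} minutes unavailable"
            msg["From"] = "reports@example.com"
            msg["To"] = manager
            msg.set_content(f"{agent} exceeded the unavailable-time threshold today.")
            smtp.send_message(msg)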

Examples of Avoiding Large Datasets in Scripts

The following are common scenarios where you can avoid using large datasets in your Studio scripts. You can also refer to the example above, which avoids transferring too much customer data from a CRM into Studio.

  • Filtering API responses by field:

    Some APIs offer the ability to filter which information is returned. For example, if you request case information from a CRM, some CRM APIs let you specify exactly which fields you want returned. If the API you're using offers this functionality, you can avoid large datasets by only working with exactly the information you need (see the sketch after this list). If a vendor's API does not have this functionality, you may want to work with the vendor to add the option, or you can build a middleware layer. The middleware receives the data before Studio, filters out what is unnecessary, then returns the result to Studio.

  • Filtering API responses by time:

    If an API lets you filter by an amount of time, use this functionality to minimize the amount of data. If you only need a day or week's worth of data, be sure to filter the response to only include data within that time range.

  • Filter data for individual contacts:

    Studio is not a data management tool; it is a contact management tool. The capabilities of Studio and Studio actions let you work with data primarily focused on an individual contact. You can keep your data specific to individual contacts with techniques like gathering information through an IVR or pulling CRM data for one record or case at a time.
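
As a sketch of the first two scenarios, the following Python request asks a hypothetical CRM API for only specific fields and only one day of data. The parameter names are assumptions; every vendor's filter syntax is different.

    # Request only the needed fields and a narrow time range. The URL and the
    # parameter names ("fields", "createdAfter", "createdBefore") are
    # assumptions; check your vendor's API for its actual filter syntax.
    import requests

    response = requests.get(
        "https://crm.example.com/api/cases",         # placeholder CRM endpoint
        params={
            "fields": "caseId,status,priority",      # only the fields the script uses
            "createdAfter": "2024-05-01T00:00:00Z",  # one day's worth of data
            "createdBefore": "2024-05-02T00:00:00Z",
        },
        timeout=5,
    )
    response.raise_for_status()
    cases = response.json()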

Data Storage

You have many options for storing data. If you have a Snowflake account, you can move your contact center data from CXone to Snowflake using Studio. NICE stores your data for 24 months, and you can retrieve it from Snowflake. You can also use Cloud Storage Services to store files like call recordings or chat transcripts, or move them to your own servers. Contact your CXone support representative for more information.

Unexpected Costs

In general, you will not incur additional costs from making too many API calls or similar activity. However, how you store certain data can generate costs. The following are examples where improper data storage created unexpected billing charges:

  • Creating many script variables and storing them in a text file for every contact without including a process for cleansing those files (a retention sketch follows this list).

  • Storing IVR press path information in files for later use, such as for future reporting purposes.

  • Storing API results in a file that continually expands as new results are added. Over time and as the file grows in size, that generates storage costs.

  • Storing files on CXone Cloud Storage. If you use Cloud Storage, be sure you're aware of any parameters around storage. You can reference the help content for Cloud Storage or contact a support representative.
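
For the file-cleansing process mentioned in the first example, the following Python sketch shows a generic retention job for files you manage on your own storage; the directory path and retention period are assumptions. For files kept in CXone Cloud Storage, use its own retention settings instead.

    # Generic retention sketch for files you manage on your own storage. The
    # directory path and 30-day retention period are assumptions; for files
    # kept in CXone Cloud Storage, use its own retention settings instead.
    import time
    from pathlib import Path

    RETENTION_DAYS = 30
    cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

    for path in Path("/var/data/contact-variables").glob("*.txt"):
        if path.stat().st_mtime < cutoff:
            path.unlink()  # delete files older than the retention period
            print(f"Removed {path}")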

API Calls from Studio

APIs help you work with data efficiently and effectively. You can make API calls from Studio with the following actions. The list explains the technical differences between them:

  • SNIPPET: Lets you add custom code to a script. You can use this action to make API calls, prepare payloads, parse dynamic data objects, and so forth. When making API calls with this action, be wary of the response speed. This action utilizes a thread the entire time it's active. If the response is slow, meaning the thread is blocked the entire time, it may negatively impact server performance. For example, a caller may hear dead air if all threads are utilized at one time.

  • REST API: Lets you make REST API calls and uses fewer server resources. You should use this action to make API calls whenever possible; however, it only accepts JSON. If an API does not return JSON, you may need to use the SNIPPET action instead. Depending on your task, you may need to use this action in combination with SNIPPET, since SNIPPET can do things like prepare payloads.

  • ConnectAuth: Authenticates a connector created in Integration Hub. Integration Hub is a centralized source for building, managing, and executing integrations from CXone into third-party platforms. This action does not block threads.

  • ConnectRequest: Triggers an Integration Hub request after it has been authenticated. This action does not block threads.

Other Data Related Studio Actions

Studio has several actions that temporarily store and retrieve small amounts of data from a database table in order to make the data accessible to other scripts. These actions behave like a list of fields or values. Use them for storing multiple values, or values needed further along in other scripts. The complete list of actions is: PUTVALUE, GETVALUE, REMVALUE, GETLIST, and CLEARLIST.

These actions use a unique data type that can only be accessed with this set of Studio actions. The data is not accessible any other way. Users cannot access this database directly, regardless of their permissions.

The values are kept in a database table for a limited amount of time, as configured in the TTL hrs (time to live) property of the PUTVALUE action. The default is 24 hours, but it can range from one hour to 168 hours (seven days). You can use the REMVALUE action to delete data before the TTL expires. This gives you complete control over the data within your scripts. Best practice is to delete values when they will no longer be used and to leave the TTL at the default 24 hours.

Notes:

  • If several variables need to be accessed by other scripts or contacts, a database is generally the best solution.
  • Non-persistent public variables can be shared by other scripts or contacts throughout the life of the script that sets those variables. The variables are automatically cleaned up once they are released.
  • These actions have a limit of 1000 items in the "list". A single item, or piece of data, also has a 5KB limit.