Calibrate Forms and Evaluations

Regardless of how well a form (a collection of questions used to evaluate agent interactions) is crafted, evaluators may still interpret it differently. Calibration lets you validate a form's accuracy and usability by having several evaluators score the same interaction and comparing how closely their evaluations align.

To run a calibration, select an interaction, either new or already evaluated, and distribute it together with a form to several evaluators. The evaluators receive the form on the Tasks page of their My Zone application and are not aware that they are participating in a calibration; they evaluate the interaction as part of their regular evaluation flow.

You can view the results on the Calibrations page and compare scores across evaluators. If the deviation between scores is high, consider refining the form to improve its validity.

Here are two use cases for calibration: