
Keep Your QA Team On The Same Page With Calibration


    Quality assurance is human-driven: every quality analyst understands and approaches a conversation in their own way while evaluating it, which can lead to variation in how conversations are scored. Rubrics are also updated from time to time, which requires the quality team to share the same interpretation and understanding of the rubric to rate conversations fairly.

    It is therefore important for the QA lead to make sure the team is on the same page when evaluating conversations. Some deviation is inevitable, but the goal is to keep it to a minimum.

    With the Calibration module in Level AI, you can have QA auditors evaluate the same conversation to understand how similar or different their evaluation approaches are. Doing this periodically will help you understand your team and their practices, and course-correct if necessary.

    Accessing the Calibration module

    You can access the calibration module from the left sidebar. 

    Initiating a Calibration Session

    To initiate a calibration, please follow the steps below: 

    • Log into your Level AI account, head to Calibration, and click on New Session.
    • Choose the date and time for the calibration to begin, then select a conversation for the session by entering its conversation ID. You can find the ID on the Interaction history screen or search for it in the dropdown.
    • Add the QAs you want to participate in the calibration.
    • Click on Create and send invites to participants.

    Each calibration session is given a unique ID.

    Once the session is initiated, the invite will be sent to all the chosen QAs in your team. When they click on the invite, they’ll be taken to the calibration page where they’ll be asked to score the conversation. 

    As a moderator, you can track the progress of the calibration session from the Calibration History screen. You can see the number of participants, the average QA score, the calibration scores provided by the analysts on your team, and the status of the session.

    The session's status remains Pending until all the analysts have completed the calibration.

    Interpreting the Results of a Calibration Session

    Before or during the session, the moderator can review the participants’ responses to each question in the rubric and mark the correct responses, explaining the reasoning in the comment section.

    On the right side of the screen, you can see the calibration score, which is calculated by comparing the responses marked by participants against the moderator’s. The higher the score, the more aligned your team is when evaluating a conversation. You can also dig into each individual QA’s detailed evaluation by clicking on their name in this section.
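    Conceptually, a score of this kind can be thought of as the share of rubric questions where a participant’s answer matches the response the moderator marked as correct. The sketch below is a hypothetical illustration of that idea, not Level AI’s actual formula; the question names and yes/no answers are invented for the example:

    ```python
    def calibration_score(participant: dict, moderator: dict) -> float:
        """Percent of rubric questions where the participant's answer
        matches the moderator's marked-correct answer (hypothetical formula)."""
        if not moderator:
            return 0.0
        matches = sum(
            1 for question, answer in moderator.items()
            if participant.get(question) == answer
        )
        return round(100 * matches / len(moderator), 1)

    # Invented example rubric responses
    moderator = {"greeting": "yes", "empathy": "yes", "resolution": "no"}
    participant = {"greeting": "yes", "empathy": "no", "resolution": "no"}

    print(calibration_score(participant, moderator))  # → 66.7 (agrees on 2 of 3 questions)
    ```

    Under this reading, a fully aligned participant scores 100, and each disagreement with the moderator lowers the score proportionally.
    
    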

    You can use this section to track who has and hasn’t completed the session. You can also see individual responses of participants if needed. 

    Deleting a Session

    To delete a session, open the calibration session of your choice and click on the Delete Session button on the top right corner. 

    Get a free demo today!

    Your customers will thank you for it!