Hi Joseph
I haven't worked much with evaluation forms (they're rarely used by my clients here in Brazil; in six years, I believe I've only configured one). But I did some research, so see if this helps you or confuses you more haha... there are a couple of confirmed behaviors worth highlighting.
1 - N/A questions
For AI-scored or auto-complete evaluation forms, every question must have a configured default answer. This is required so the system can always complete the evaluation automatically. If a question contains an N/A option, it can be used as the default fallback when the interaction does not meet the criteria for any other answer.
In practice, the recommended approach is:
- Include N/A as an explicit answer option when the question might not apply.
- Define clear help text telling the AI when N/A should be selected, ideally based on transcript-verifiable conditions.
This works well because AI scoring only evaluates content present in the transcript, so questions must be written so that the AI can objectively confirm the answer from the conversation.
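To make the idea concrete, here is a rough sketch of what such a question could look like, written as a plain Python structure. The field names (`helpText`, `defaultAnswer`, etc.) are my own illustration, not the actual Genesys Cloud form schema, and the resolver function is just a toy model of the "fall back to the default when nothing matches" behavior:

```python
# Hypothetical sketch of an AI-scorable question with an N/A fallback.
# Field names are illustrative, NOT the real Genesys Cloud form schema.

question = {
    "text": "Did the agent offer the customer a callback?",
    "helpText": (
        "Select Yes if the agent explicitly offers a callback in the transcript. "
        "Select No if the customer asks for one and the agent does not offer it. "
        "Select N/A if callbacks are never mentioned in the conversation."
    ),
    "answers": ["Yes", "No", "N/A"],
    # Every question needs a default so auto-complete can always finish:
    "defaultAnswer": "N/A",
}

def resolve_answer(detected, question):
    """Return the detected answer, or the configured default when nothing matched."""
    if detected in question["answers"]:
        return detected
    return question["defaultAnswer"]
```

The important part is the help text: each option is tied to a condition the AI can verify directly from the transcript, and N/A explicitly covers the "never mentioned" case.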
---
2 - Branching with AI scoring
Branching (conditional visibility of questions) is supported in evaluation forms in general, allowing questions to appear only when certain previous answers are selected.
However, when using AI scoring / auto-complete forms, all questions still require a scoring method (AI Scoring or Evaluation Assistance) and a default answer so the evaluation can be fully completed automatically.
Because of this requirement, branching logic usually needs to be designed carefully - many teams simplify the form structure or rely on N/A defaults instead of complex branching to ensure the AI can complete the evaluation without ambiguity.
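As a rough illustration of that "simplify instead of branching" approach, here is how a branched parent/child pair could be flattened so every question is always answerable. Again, the structures and field names are my own assumption for the sketch, not the real form schema:

```python
# Illustrative only: flattening a branched question pair so every question
# can always be completed. Field names are assumptions, not the real schema.

parent = {
    "text": "Did the customer raise a billing dispute?",
    "answers": ["Yes", "No"],
    "defaultAnswer": "No",
}

# Instead of showing this question only when the parent is "Yes" (branching),
# it is always present, with N/A covering the non-applicable path:
child = {
    "text": "Did the agent follow the billing-dispute escalation steps?",
    "helpText": (
        "Select N/A if no billing dispute is raised in the transcript. "
        "Otherwise select Yes or No based on whether the escalation steps appear."
    ),
    "answers": ["Yes", "No", "N/A"],
    "defaultAnswer": "N/A",
}

def score_child(parent_answer, child_detected):
    """N/A when the branch does not apply; otherwise the detected answer."""
    if parent_answer != "Yes":
        return "N/A"
    return child_detected if child_detected in child["answers"] else child["defaultAnswer"]
```

The net effect is the same as branching (the child only really "counts" when the parent applies), but the AI always has an unambiguous answer to select.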
---
3 - Summary
- Use N/A answers with clear help text for questions that may not apply.
- AI scoring requires default answers for all questions.
- Branching exists, but auto-scored forms still require every question to be scorable, so complex branching often needs redesign.
------------------------------
Kaio Oliveira
GCP - GCQM - GCS - GCA - GCD - GCO - GPE & GPR - GCWM
P.S.: I apologize if there are any mistakes in my English; my primary language is Portuguese (Brazil).
------------------------------
Original Message:
Sent: 04-09-2026 11:55
From: Joseph Sutich
Subject: AI Scoring N/A Questions and Form Branching
Hello all-
We are in the process of building our AI Scoring Form and have come across a few questions: how will AI scoring handle N/A questions where the interaction doesn't apply to certain questions, and how should we script the question, answers, and help text in those instances?
Also, one of our evaluation forms currently uses branching to decide which questions appear based on earlier questions. Does branching work with AI scoring? What would the scripting look like for branching questions, answers and help text?
Thank you for your assistance.
#AIScoring(VirtualSupervisor)
#QualityEvaluations
------------------------------
Joseph Sutich
------------------------------