Question Stats Report

The Question Stats report provides detailed insights into the performance of individual questions within an assessment. These analytics help instructors and administrators evaluate question quality, difficulty, and overall assessment reliability.

Figure 1: The Question Stats report

Questions are displayed one at a time in the report. If a question has a Title set, it will appear next to the Question number header for easier identification.

For multi-part questions, each part is shown indented beneath the question header to display individual statistics for each component.

Clicking on the question stem opens a popup with the Responses Report, which includes additional details such as the full question stem and a breakdown of each unique response submitted.

To quickly find a specific question in large assessments, you can use the search icon to look for any text contained in the question title or stem.

Metrics available

Each question in the report includes the following metrics (an illustrative computation sketch follows the list):

  • Times Shown - The number of times the question was presented to learners. This may vary depending on whether the assessment is adaptive or includes randomized question pools.
  • Max Points - The maximum number of points that could be earned for the question.
  • Average - The mean score achieved on the question, calculated across all attempts where the question was shown.
  • Correct - The percentage of responses that were correct.
  • Not Answered - The percentage or count of instances where the question was not answered.
  • Upper 27% - The average score or correct response rate for the top 27% of scorers on the assessment. Useful for evaluating how well high-performing learners handled the question.
  • Lower 27% - The average score or correct response rate for the bottom 27% of scorers on the assessment.
  • Standard Deviation (SD) - Indicates how much the scores for the question vary from the average. A high SD may suggest varied performance or question ambiguity.
  • Discrimination Index (DI) - Reflects how well the question distinguishes between high and low scorers. Values typically range from -1 to +1, with higher positive values indicating better discrimination.
  • Point-biserial Index (PBI) - Measures the correlation between scores on this question and overall assessment performance. A higher PBI suggests the question is a good indicator of overall ability.
  • Difficulty P-Value (PV) - The proportion of learners who answered the question correctly. Lower values indicate harder questions, while higher values indicate easier ones.
  • Median Attempts - The median number of times learners attempted the question. This is particularly relevant for assessments allowing multiple attempts or retries.
  • Response Frequencies - Shows how often each answer choice was selected, either as a count or percentage. This is helpful for identifying distractors that may be too obvious or too misleading.
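
The statistical columns above follow standard item-analysis formulas. As an illustrative sketch (not platform code), the Python example below computes the Difficulty P-Value, Standard Deviation, Upper/Lower 27%, Discrimination Index, and Point-biserial Index from a small, hypothetical set of per-learner results; the data and variable names are assumptions for demonstration only.

    import statistics

    # Hypothetical data: one row per learner, as (score on this question, total assessment score).
    results = [
        (1, 9), (1, 8), (0, 7), (1, 7), (0, 5),
        (1, 6), (0, 4), (0, 3), (1, 8), (0, 2),
    ]
    max_points = 1
    scores = [q for q, _ in results]
    totals = [t for _, t in results]

    # Difficulty P-Value: proportion of learners who earned full credit.
    pv = sum(1 for s in scores if s == max_points) / len(scores)

    # Standard Deviation of question scores (population SD here; the report may use sample SD).
    sd = statistics.pstdev(scores)

    # Upper/Lower 27%: average question score in the top and bottom 27% of overall scorers.
    ranked = sorted(results, key=lambda r: r[1], reverse=True)
    k = max(1, round(0.27 * len(ranked)))
    upper = sum(q for q, _ in ranked[:k]) / k
    lower = sum(q for q, _ in ranked[-k:]) / k

    # Discrimination Index: upper-group minus lower-group performance, scaled to the -1..+1 range.
    di = (upper - lower) / max_points

    # Point-biserial Index: correlation between question score and total score (Python 3.10+).
    pbi = statistics.correlation(scores, totals)

    print(f"PV={pv:.2f}  SD={sd:.2f}  DI={di:.2f}  PBI={pbi:.2f}")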

Filtering by Section

If your course is organized into sections of students, you can filter the report to a specific section and analyze performance trends within that group. Select the section name from the dropdown menu to view statistics for only the students in that section.

Download a CSV

You can download the complete question statistics data as a CSV file for offline analysis or record-keeping. Click the Download CSV button at the top right of the report to export all available data.
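
The exported CSV can be analyzed with any spreadsheet or scripting tool. As a hedged example, the Python sketch below reads an exported file and flags questions with a low Discrimination Index; the filename ("question_stats.csv") and column headers ("Question", "Discrimination Index") are assumptions and should be matched to the headers in your actual export.

    import csv

    # Load the exported report; the filename is an assumption.
    with open("question_stats.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Flag questions whose Discrimination Index (assumed column name) is below 0.2.
    weak = [
        r for r in rows
        if r.get("Discrimination Index") and float(r["Discrimination Index"]) < 0.2
    ]
    for r in weak:
        print(r.get("Question"), r.get("Discrimination Index"))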