Introduction
An Individual Report maps how well a student has performed on the
different questions within a test. Although the Individual Reports for
PAT:Mathematics, PAT:Reading Comprehension and PAT:Reading Vocabulary
look different, they are all built around a set of common concepts.
There are two versions of the Individual Report for PAT:Mathematics:
the standard view and the alternative view. When viewing the Individual
Report for PAT:Mathematics, users may switch between the two views using
a clickable button on the results bar at the top right of the report.
Understanding the standard view
The standard view of the Individual Report for PAT:Mathematics mirrors
the student report that may be constructed by hand using the blackline
masters contained in the teacher’s manual. The report displays the
questions positioned according to their location on the PAT:Mathematics
scale and grouped according to their content category. Questions that
the student has answered correctly are shown using a black-filled
circle, while those answered incorrectly are shown using unfilled
circles. Questions that were omitted by the student are shown as
grey-filled circles. The student’s overall level of achievement is
indicated by the dotted line that crosses the page and intersects the
scale and the stanine score distributions for three different year
levels. The dashed lines above and below the dotted line are used to
indicate the measurement error associated with the student’s score. If
the test could be repeated we would expect the student to score in the
range indicated by the dashed lines about two thirds of the time.
Students who achieve very highly or very poorly on a test will have a
larger error associated with their score.
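The band marked by the dashed lines can be thought of as the student’s score plus or minus one standard error of measurement. As an illustration only (this is not NZCER’s implementation, and the score and error values below are hypothetical), the range works out as follows:

```python
def score_band(scale_score, standard_error):
    """Return the (lower, upper) range within which a retested student
    would be expected to score about two-thirds of the time, taking the
    dashed lines as one standard error either side of the score."""
    return (scale_score - standard_error, scale_score + standard_error)

# Hypothetical example: a scale score of 45 with a standard error of 4
low, high = score_band(45.0, 4.0)
# the dashed lines would then sit at 41.0 and 49.0
```

A student with a very high or very low score would have a larger standard error, and so a wider band.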
Comparing the location of questions with a student’s achievement
The student’s location on the scale may be compared with the locations
of the questions on the scale. Typically, a student is more likely to
answer correctly the questions located below their achievement level (as
shown by the dotted line) than those located above it. When a question
is located well below the
student’s own level of achievement there is a strong expectation that
the question will be answered correctly. In contrast, when a question
is located well above the student’s level of achievement there is a low
expectation that it will be answered correctly.
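This relationship between a student’s scale location and the expected chance of success on a question can be sketched with a Rasch-style model, in which the probability of a correct answer depends on the gap between ability and question difficulty. The sketch below is illustrative only: it assumes both values sit on a common logit-style scale, which is a simplification of how the PAT scale units are actually constructed.

```python
import math

def p_correct(ability, difficulty):
    """Rasch-style probability that a student at `ability` answers a
    question located at `difficulty` correctly (same scale for both)."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

# A question at the student's own level is roughly a 50/50 prospect;
# questions well below that level carry a strong expectation of success,
# and questions well above it a low expectation.
at_level = p_correct(50.0, 50.0)
well_below = p_correct(50.0, 40.0)
well_above = p_correct(50.0, 60.0)
```

Whatever the exact scaling, the shape of the curve is what matters for reading the report: the further a question sits below the dotted line, the more surprising an incorrect answer becomes.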
Question descriptions
A short description of each question is provided on the right-hand side
of the report. These descriptions can also be displayed by hovering the
mouse over a question in the report. A tick symbol next to the question
description indicates that the question was answered correctly.
The actual questions can be displayed by clicking on a question number. This opens the Individual Item Report for that question, which shows the question as it was presented in the test and provides a range of information about it, including which options were selected by students and links to other similar questions in the test.
Gaps and strengths
A gap in a student’s knowledge may be indicated when a student has
incorrectly answered a question that he or she was expected to answer
correctly. Although the error could simply have been a “silly mistake”,
it should be followed up and investigated further. Similarly, when a
student has correctly answered a question that was expected to be
answered incorrectly, it could be evidence that he or she has a
particular strength in an area. Again, there could also be other
reasons; for instance, the student may simply have made a lucky guess.
In the report shown below, the student has answered Question 7
incorrectly (it is presented as an unfilled circle). Given the
student’s overall level of achievement, this question was expected to be
answered correctly. It is possible that the student has a gap in this
area, which could be confirmed with further questioning.
Comparing performance in different content categories
The Individual Report can be used to provide an indication of how a
student has performed in the different content categories. However, it
is important to note that each content category has a different number
of questions and that these questions vary in difficulty. This makes it
unwise to make direct comparisons between content categories on the
basis of the number of questions answered correctly. There is evidence
that a student is performing less successfully in a content category
when he or she is unable to answer the majority of the questions in that
category that sit below the student’s own location on the scale.
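That check can be made concrete by looking, within each content category, at the share of below-level questions answered correctly. The sketch below is purely illustrative (the data structure, category names, and values are hypothetical, not taken from an actual report):

```python
# Hypothetical report data: each question has a content category, a scale
# location, and whether the student answered it correctly.
questions = [
    {"category": "Number", "location": 38.0, "correct": True},
    {"category": "Number", "location": 42.0, "correct": True},
    {"category": "Algebra", "location": 37.0, "correct": False},
    {"category": "Algebra", "location": 41.0, "correct": False},
    {"category": "Algebra", "location": 55.0, "correct": False},
]

def below_level_success(questions, student_level):
    """For each category, the share of questions located below the
    student's level that were answered correctly."""
    counts = {}
    for q in questions:
        if q["location"] < student_level:
            total, right = counts.get(q["category"], (0, 0))
            counts[q["category"]] = (total + 1, right + q["correct"])
    return {cat: right / total for cat, (total, right) in counts.items()}
```

With a student level of 45.0, this data would show Algebra at 0.0 against Number at 1.0, flagging Algebra for follow-up; a tally of raw correct answers across categories would not support that comparison.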
Understanding the alternative view
The Individual Report for PAT:Mathematics can be viewed in an alternative configuration. In this view, questions within a test are displayed according to their content categories, using the question number and a short, one-sentence description of what the question involves. Bold print is used to indicate when the student has correctly answered a question. Within each content category the questions are ordered according to their difficulty levels, with the more difficult questions placed higher on the page.
The dashed line that runs through the report is used to indicate how the student’s overall level of achievement compares with the difficulty of the questions. Typically, a student is more likely to answer correctly the questions listed below the line than those above it. The report uses coloured shading to indicate when questions are well above or well below the student’s achievement level. A question below the line with a dark green background is expected to be very easy for the student, and a question above the line with a dark red background is expected to be very difficult. By scanning the different content areas and looking at the student’s pattern of responses, especially for questions that were expected to be easy or difficult, the reader can quickly get a sense of the student’s performance in the different content categories.
Gaps and strengths
As above, a gap in a student’s knowledge may be indicated when a
student has incorrectly answered a question that he or she was expected
to answer correctly. Although the error could simply have been a “silly
mistake”, it should be followed up and investigated further. Similarly,
when a student has correctly answered a question that he or she was
expected to answer incorrectly, it could be evidence of a particular
strength in an area. Again, there could also be other reasons.