Test developers experienced in writing selected-response and even constructed-response items are often at a loss when asked to develop content for performance assessments. A white paper by Wallace Judd, Ph.D., titled "Item Writing for Performance Tests," provides much-needed guidance. It is the fifth in a series of white papers on performance testing.
Dr. Judd explains that incorporating the right amount of context is critical for performance items. The right context also adds to the utility and credibility of the test. “Candidates who can answer the item will feel that the test fairly evaluates their competence. Candidates who cannot answer the item should feel that this is something they should know, and hence will be motivated to learn the required knowledge or skills.”
Because many test developers are already familiar with multiple-choice (MC) items, the paper compares the steps for writing performance items with those for writing MC items, both in prose and in a helpful overview table.
“In many ways, writing performance items is easier than writing MC items, since you don’t have to write credible distractors,” Dr. Judd writes. He adds, “On the other hand, quality assurance is much more difficult, since you have to assure that all potential correct answers are accommodated.”
The papers in this series are intended to be a resource for test developers, illustrating best practices in the performance-testing industry. They provide advice on performance testing topics from initial concept to test delivery.
These topics are presented in language that does not assume formal study in psychometrics. Any mathematics is expressed in plain terms rather than Greek notation, and all computations are illustrated with Excel examples.
The papers are published by Authentic Testing, a leader in the field of performance testing, in the hope of expanding access to performance testing to a wider audience of practitioners and inspiring them to explore the possibilities it offers.