Multiple-choice tests are a popular method of providing reliable and valid measures of achievement in many courses on campus. In recent years, their use has increased as class sizes have grown, making other measures of achievement practically infeasible. TSQS uses Optical Mark Reader (OMR) technology to capture and score responses on appropriate OMR forms, and then offers a comprehensive array of analysis and reporting procedures.
Optical Mark Reading (OMR) is accurate and efficient, and results are delivered with a maximum 24-hour turnaround on weekdays. Services may include:
- raw scores printed on each answer sheet
- class lists with names, id numbers and scores
- comprehensive item analysis reports
- comma-delimited results files suitable for opening in Excel or uploading to eClass
- merging results from alternate forms of a test into a single report
- combining (weighted) scores from sub-sections of a test into a single score
- formula (right minus wrong) scoring
- rescaling scores to percentages or other totals, or to specified means and standard deviations
- reports of right and wrong answers given by each student, and more
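Two of the options above — formula (right minus wrong) scoring and rescaling to a specified mean and standard deviation — can be sketched in a few lines. This is a hypothetical illustration only; the function names and data are invented here and do not reflect TSQS's actual software.

```python
# Hypothetical sketch of formula scoring and linear rescaling;
# names and data are illustrative, not TSQS's implementation.
from statistics import mean, pstdev

def formula_score(responses, key):
    """Right minus wrong: +1 per correct answer, -1 per incorrect answer.
    Blank responses (None) count as neither right nor wrong."""
    score = 0
    for given, correct in zip(responses, key):
        if given is None:
            continue
        score += 1 if given == correct else -1
    return score

def rescale(scores, target_mean, target_sd):
    """Linearly rescale raw scores to a specified mean and standard deviation."""
    m, s = mean(scores), pstdev(scores)
    return [target_mean + target_sd * (x - m) / s for x in scores]

key = ["A", "C", "B", "D", "A"]
raw = [formula_score(r, key) for r in (
    ["A", "C", "B", "D", "A"],   # all 5 correct
    ["A", "C", "B", None, "B"],  # 3 right, 1 wrong, 1 blank
    ["B", "D", "A", "C", "B"],   # all 5 wrong
)]
print(raw)                       # [5, 2, -5]
print(rescale(raw, 50, 10))      # raw scores mapped to mean 50, SD 10
```

Rescaling to percentages is the same idea with a simpler transformation (divide by the maximum possible score and multiply by 100).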
The following list provides an overview of the steps involved in the test scoring process:
1) Select scoring options.
2) Select the appropriate answer sheet for your test: Answer Sheet Style 1 or Answer Sheet Style 2.
3) Obtain answer sheets. The UAlberta Bookstore has answer sheets available for purchase if more than 100 are required. Otherwise, you can get them from the TSQS office (240 GSB).
4) Fill out the answer sheet key.
5) After your test is done, gather the students' answer sheets, making sure that all of them are included.
6) Complete a Request for Service form.
7) Send a printed copy of the Request for Service form, the key, and the students' answer sheets to the TSQS office (240 GSB).
8) After TSQS processes your exam, you will receive an email including the class list, item analysis, and any other files requested.
9) The exam will be ready for pickup or can be sent by mail to your office.
There are several scoring options available: GPSCOR, MRSCOR and alternate test versions.
GPSCOR is the most common scoring option. If students are asked to choose only one answer for each question, your exam format is GPSCOR. GPSCOR can also accommodate questions that have more than one acceptable answer, as long as students are asked to select only one.
With GPSCOR, up to 8 key sheets may be used to generate scores, or sub-scale scores, on various sets of items in the test. Scores may be computed as the sum of correct answers or the sum of incorrect answers. Incorrect answers are counted only for items that have a correct answer indicated on the key; the absence of a response to an item is not counted as an incorrect answer. For additional information on GPSCOR options, view the guide.
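The GPSCOR counting rules described above (unkeyed items skipped, blanks never counted as incorrect) can be illustrated with a short hypothetical sketch; this is not TSQS's actual code, and a key entry of `None` is an invented convention here meaning the item is not keyed on a given key sheet.

```python
# Hypothetical illustration of the GPSCOR counting rules; not TSQS's code.
# A key entry of None means the item is unkeyed on this key sheet
# (e.g. it belongs to a different sub-scale) and is ignored entirely.
def gpscor_counts(responses, key):
    correct = incorrect = 0
    for given, keyed in zip(responses, key):
        if keyed is None:        # item not keyed on this key sheet: skipped
            continue
        if given is None:        # blank response: not counted as incorrect
            continue
        if given == keyed:
            correct += 1
        else:
            incorrect += 1
    return correct, incorrect

key = ["A", None, "C", "D"]      # item 2 unkeyed on this key sheet
print(gpscor_counts(["A", "B", None, "B"], key))  # (1, 1)
```

Running several key sheets over the same answer sheets would then yield separate sub-scale scores for different sets of items.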
MRSCOR (Multiple Response Scoring Program) is provided for situations where students are allowed or expected to respond with more than one answer per question. MRSCOR also provides a variety of scoring options (in all cases, only a single key sheet is used). For additional information on MRSCOR options, view the guide.
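MRSCOR's actual scoring options are defined in its guide; purely as a hypothetical illustration of multiple-response scoring, here is one common scheme (credit only when a student's set of responses matches the keyed set exactly):

```python
# Purely hypothetical multiple-response scoring sketch; MRSCOR's real
# options are described in its guide. Each item scores 1 only when the
# set of responses matches the keyed set exactly.
def exact_match_score(responses, key):
    return sum(1 for given, keyed in zip(responses, key)
               if set(given) == set(keyed))

key = [{"A", "C"}, {"B"}, {"A", "B", "D"}]
student = [["C", "A"],        # matches {A, C}: credit
           ["B", "C"],        # extra response C: no credit
           ["A", "B", "D"]]   # matches: credit
print(exact_match_score(student, key))  # 2
```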
Alternate Test Versions
A popular practice in a number of departments is to create multiple forms of a test for purposes of administration in large (crowded) classes. This is usually done by reordering the items in the test. In order to analyse all forms of a test as a single set, mapping instructions must be provided so that the items can be aligned to correspond from one form to another before proceeding with the item analysis. See example.
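The realignment step for alternate forms can be sketched as follows. This is a hypothetical illustration of the idea only; TSQS's actual mapping-file format is shown in its example.

```python
# Hypothetical sketch of aligning an alternate test form to the master
# form before combined item analysis; the real mapping format is in
# TSQS's example. Form B is a reordering of form A, and the map gives,
# for each form-A item, its position on form B (1-based).
def align_to_master(responses_b, b_position_of_a_item):
    """Reorder a form-B answer sheet into form-A item order."""
    return [responses_b[pos - 1] for pos in b_position_of_a_item]

# Form A items 1..4 appear on form B at positions 3, 1, 4, 2.
mapping = [3, 1, 4, 2]
form_b_sheet = ["D", "B", "A", "C"]   # answers in form-B order
print(align_to_master(form_b_sheet, mapping))  # ['A', 'D', 'C', 'B']
```

Once every form's sheets are mapped into the master order, all forms can be analysed as a single set.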
There are two types of reports available: class lists and item analysis.
A class list is a report that shows each student's name, ID, and score (see example). Additional columns may be added, such as an itemized list of the questions each student answered incorrectly, or scored responses, which list the item numbers corresponding to incorrect responses across all test questions. View the guide on how to generate this list.
An item analysis is a comprehensive report which depicts an overview of the test results and includes a histogram which shows the distribution of test scores. This report breaks down each question to show performance from the class as a whole, including a score breakdown for the highest, middle and lowest performing students.
Many textbooks on constructing multiple-choice tests are available, including the following:
Haladyna, Thomas M. Developing and Validating Multiple-Choice Test Items. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
Haladyna, Thomas M. Writing Test Items to Evaluate Higher Order Thinking. Needham Heights, MA: Allyn & Bacon, 1997.
Jacobs, Lucy Cheser & Chase, Clinton I. Developing and Using Tests Effectively: a Guide for Faculty. San Francisco: Jossey-Bass Inc., 1992.
Osterlind, Steven J. Constructing Test Items. 2nd ed. Boston: Kluwer Academic Publishers, 1998.
Jacobs, Lucy C. How to Write Better Tests: A Handbook for Improving Test Construction Skills. Bloomington, IN: Indiana University.
IDEA Papers from Kansas State University, such as No. 16: Improving Multiple-Choice Tests.