- Designing Your Exams and Creating the Answer Key
- Submitting Your Exams for Grading
- Typical Scantron Problems
- Data Files Created by Scantron
- How to Upload Grades to cuLearn
- Item Analysis
- Scantron Instructions for Students
How many total questions can a Scantron exam have?
The answer form used by the Scantron system can accommodate up to 250 multiple choice or true/false questions. There is no minimum number of questions required.
How many possible answers can you have for each question?
For each question, the answer form accommodates up to five alternative answers: A, B, C, D and E. We recommend using four options for multiple-choice format. True/false questions by definition will use only two alternatives, A and B, one designated true and the other false.
You may assign multiple correct answers for each question; however, the student may only choose one answer and will get only one point regardless of how many correct options are indicated (shaded) on the answer sheet.
How many versions of the exam can there be?
If you are concerned about cheating, consider using more than one version of the examination. All versions contain the same questions, but the correct answers to each question are presented in a different order on each version. Copying from a neighbouring student is then almost guaranteed to produce wrong answers.
You must provide an answer key for each version. So long as the students’ answer sheets correctly identify which version of the examination they’re using, the Scantron will select the appropriate answer key for scoring. It is not necessary to sort the students’ answer sheets into separate bundles for scoring.
Important: If students do not fill in the exam version number on a multi-version exam, or fill it in incorrectly, their mark may be affected; examine all such cases carefully.
How do you prepare the answer key(s)?
For each version of the examination, you must prepare an answer key. Remember, different versions of the exam do not mean different questions; they mean the correct answers to the same questions in a different order. Different answer keys reflect these different orders. The information fields will be the same for each answer key, except for the exam version field.
Please follow the instructions below on how to fill out each field:
- LAST NAME: Fill in (shade) the instructor’s last name.
- FIRST NAME: Instead of the instructor’s first name, fill in the word “KEY”.
- COURSE NO. & S: This field identifies the course code and section with no blank spaces. For example, 3800A.
- DATE OF EXAM: Indicates the date of the examination.
- STUDENT NUMBER: Leave blank.
- EXAM VERSION NO.: Use this field to indicate which alternate version of the examination this particular key corresponds to. Keys should be numbered starting from 1, up to a maximum of 16.
After the exam has been written, you need to bring it to the Educational Development Centre in 410 Dunton Tower. You can submit the exam for scanning Monday-Friday, between 8:30 a.m. and 4:30 p.m.
Scantron answer sheets
Answer sheets need to be stacked face up, with the timing tracks on the left-hand side. If you require separate marking records for different sections of your course, please divide the answer sheets into separate stacks for each section.
Place your answer key(s) in ascending order by version number on top of the students’ answer sheets. For detailed instructions on how to properly fill out your answer key, please refer to the section above.
Scantron scanning request form
Our Scantron Scanning Request Form is available online. Please complete the required sections prior to dropping your materials off at the EDC. Packages that have missing information on the request form or keys will not be accepted.
Once processed, instructors will be sent an email with the Scantron Data Files attached. After you receive this email, you are required to pick up your Scantrons from the EDC and keep them for a minimum of one year.
It is the responsibility of the student and the instructor to have Scantron sheets correctly completed. Improperly completed or incomplete Scantron exams can result in incorrectly graded exam(s).
How the Scantron machine works
The Scantron machine scans exams one at a time, recording which boxes have been shaded on an exam form. If the machine is unable to scan an exam, it will reject the student’s exam for later examination by EDC staff and instructors.
The machine cannot read:
- Forms that are damaged, bent or have folded corners. This obscures the bar code/timing lines and will cause the exam to be rejected.
- Forms that have pencil marks over the bar code black lines on the left-hand side of the exam. Each exam sheet contains small black bars along the left side of the page that are used to align the form and locate the questions. Drawings or writing on these bars will cause an exam to be rejected. EDC staff will try to erase such marks and rescan the exam.
Potential problems that can affect a student’s mark
- Exam version error. A student does not fill in the exam version number on a multi-version exam, or fills it in incorrectly. Check the “Problem Keys.csv” file for students who did not fill in the exam version and hand mark these exams.
- A pen was used to complete the exam form rather than a pencil. Ink cannot be erased. When a student makes an error, they will often cross off the answer and fill in another box. Questions with two shaded boxes will be marked incorrect even if the student crosses off the original answer. These exams should be hand marked.
- Circled or very lightly filled in boxes. Students must always shade the appropriate boxes. Check marks, circled boxes or other marks will not be read by the machine, and neither will very lightly shaded boxes. In such cases, the student will usually receive an abnormally low mark. Requires hand marking.
- A column of the form was jumped. A student may skip a column on the exam sheet due to exam stress or unfamiliarity with the Scantron exam form. This problem is usually discovered when the affected student asks to see their exam. Requires hand marking.
- Severe incomplete erasing. If a student changes an answer and does not completely erase their mistake, it can lead to them receiving no mark for the question. The Scantron machine can only read one answer per question. Requires hand marking.
To prepare your students and instruct them on how to fill in Scantron answer sheets properly, please refer to our Scantron instructions for students below.
OUTPUT.TXT contains the raw data as read from the Scantron answer sheets. The answer key(s) appear in the top row(s) of the file. A blank space indicates a field with no boxes shaded on the answer sheet, and a “?” indicates a field where multiple boxes were shaded. Each field is separated (delimited) by commas.
CULEARN UPLOAD.CSV file is comma delimited to allow easy importing to cuLearn Grades and other spreadsheet applications. Double-clicking this file will automatically open it in Excel. This file contains the following information:
- Student ID: Student number
- Last Name: Last name
- First Name: First name
- #C: Number of questions the student answered correctly
- %C: Percentage of questions the student answered correctly
- WC: Weighted number of questions answered correctly (only applicable if the value of each question was changed)
- W%C: Weighted percent of questions answered correctly (only applicable if exam total weight was changed)
- LG: Carleton’s letter grade equivalent
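Since the upload file is comma delimited, it can be read programmatically as well as in Excel. Below is a minimal sketch of reading it with Python’s standard `csv` module; the sample rows and the exact header spellings are assumptions based on the field list above, not taken from a real output file.

```python
import csv
import io

# Hypothetical sample of a cuLearn Upload.csv file; the column names
# follow the field list above, but the exact header spelling in the
# real file is an assumption.
sample = """Student ID,Last Name,First Name,#C,%C,WC,W%C,LG
100123456,Doe,Jane,42,84.0,42,84.0,A
100987654,Smith,Alex,31,62.0,31,62.0,C+
"""

# DictReader maps each data row to a dict keyed by the header row.
rows = list(csv.DictReader(io.StringIO(sample)))
for r in rows:
    print(r["Student ID"], r["Last Name"], r["%C"], r["LG"])
```

Replace the in-memory sample with `open("cuLearn Upload.csv", newline="")` to process a real file.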
PROBLEM IDS.CSV lists students who have omitted or incorrectly indicated their student number. A student may also appear in this file if they are not currently registered in this course or in this particular section. Marks for students in this file CANNOT be uploaded to cuLearn; instructors must enter these grades manually. Please note: this file is only created if a problem ID occurs. The absence of this file indicates there were no problem IDs.
PROBLEM KEYS.CSV is only applicable when multiple versions (or keys) of an exam are marked at once. Students appearing in this file have omitted or incorrectly indicated an exam version, so their Scantron could not be processed. Instructors must mark these sheets by hand or resubmit them with the exam version indicated. Please note: this file is only created if a problem key occurs. The absence of this file indicates there were no problem keys.
FULL REPORT.CSV contains the same information as the ‘cuLearn Upload.csv’ file plus key/version information, Carleton Gradepoint equivalent and letter grade information.
ADVISING REPORT.TXT contains a list of each student and information about every incorrect answer. This file includes the following:
- #Correct: Number of questions the student answered correctly
- #Incorrect: Number of questions the student answered incorrectly
- #Multi: Number of questions in which the student provided more than one answer
- MultiIndex: The question number(s) for which the student provided more than one answer
- #Blank: Number of questions the student left blank
- BlankIndex: The question number(s) the student left blank
- IncorrectItem > CorrectAnswer: Each of the incorrect answers listed as: the question number, student answer>correct answer
EXAM STATS.TXT is created for each version of the exam. This file contains the Item Analysis of the exam. Information pertaining to bonus questions, eliminated questions, exam total weight, question weight, mean, mode and variance can be found here. Find out more about item analysis.
SURVEY is only created if the Scantron sheets being submitted are for an anonymous survey. It contains only the answers from the Scantron sheets, with each letter value converted to a corresponding number (A=1, B=2, etc.). When scanning surveys, only this file will be created.
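The letter-to-number conversion described above is simple to reproduce. This is a sketch of the rule (A=1, B=2, …, E=5); how the real software represents blank responses is an assumption here.

```python
# Sketch of the SURVEY file's letter-to-number conversion
# (A=1, B=2, ..., E=5); treating blanks as empty strings is
# an assumption, not documented behaviour.
def letter_to_number(answer):
    if not answer:
        return ""  # leave blank responses empty
    return str(ord(answer.upper()) - ord("A") + 1)

converted = [letter_to_number(a) for a in ["A", "C", "E", ""]]
print(converted)  # ['1', '3', '5', '']
```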
For detailed information on uploading Scantron grades to cuLearn, please visit our cuLearn support page.
Item analysis refers to a statistical technique that helps instructors gauge the effectiveness of their test items. In developing quality assessments, and effective multiple-choice items in particular, item analysis contributes to the fairness of the test and helps identify content areas that may be problematic for students.
Generally, the process of item analysis works best when class sizes exceed 50 students. In such cases, item analysis can help in identifying potential mistakes in scoring, ambiguous items, and alternatives (distractors) that don’t work. When performing item analysis, we are analyzing the following important statistical information.
Proportion answering correctly (item difficulty)
This indicates the proportion of students who got the item right. A high percentage indicates an easy item/question and a low percentage indicates a difficult one. In general, items should have values of difficulty no less than 20 per cent correct and no greater than 80 per cent. Very difficult or very easy items contribute little to the discriminating power of a test.
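The calculation is just a proportion. A minimal sketch, using made-up 0/1 item scores:

```python
# Item difficulty: the proportion of students who answered an item
# correctly. The scores below are invented for illustration.
def difficulty(item_scores):
    return sum(item_scores) / len(item_scores)

scores = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 7 of 10 students correct
p = difficulty(scores)
print(p)  # 0.7, inside the recommended 0.20-0.80 band
```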
Discrimination index
This is the difference between the proportion of the top scorers who got an item correct and the proportion of the bottom scorers who got it right (each of these groups consists of 27 per cent of the total group of students who took the test, ranked by total test score).
The discrimination index ranges between -1 and +1. The closer the index is to +1, the more effectively the item distinguishes between the two groups of students. Sometimes an item will discriminate negatively. Such items should be revised or eliminated from scoring, as they indicate that the lower performing students actually selected the correct response more frequently than the top performers.
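The top-minus-bottom calculation can be sketched as follows; the item scores and totals are invented for illustration, and rounding the 27% group size to the nearest whole student is an assumption.

```python
# Discrimination index sketch: proportion correct in the top 27% of
# students (ranked by total test score) minus the proportion correct
# in the bottom 27%. Data below is made up for illustration.
def discrimination(item_scores, total_scores):
    n = len(total_scores)
    k = max(1, round(0.27 * n))  # size of each 27% group (assumption)
    order = sorted(range(n), key=lambda i: total_scores[i])
    bottom, top = order[:k], order[-k:]
    p_top = sum(item_scores[i] for i in top) / k
    p_bottom = sum(item_scores[i] for i in bottom) / k
    return p_top - p_bottom

item = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]      # 0/1 scores on one question
totals = [12, 15, 18, 20, 22, 30, 33, 36, 40, 44]  # total test scores
d = discrimination(item, totals)
print(d)  # 1.0: all top scorers got it right, no bottom scorer did
```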
Point biserial correlation
This is the correlation between an individual student’s performance on an item and their total score on the test. Values range from -1 to +1. High positive values are desirable for the correct answer because they indicate that students who did well on the exam also did well on this question. Negative values are desirable for the alternatives (distractors) that were not the correct answer.
A score of 0 or less for the correct alternative indicates the question has difficulty distinguishing between those students who know the material and those who do not. The question should be examined and revised and potentially eliminated from scoring.
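This item-total correlation can be sketched as an ordinary Pearson correlation between the 0/1 item score and each student’s total; the sample data is made up for illustration.

```python
from statistics import mean, pstdev

# Item-total correlation sketch: Pearson correlation between a 0/1
# item score and each student's total test score. Sample data is
# invented for illustration.
def point_biserial(item, totals):
    mi, mt = mean(item), mean(totals)
    cov = mean((i - mi) * (t - mt) for i, t in zip(item, totals))
    return cov / (pstdev(item) * pstdev(totals))

item = [0, 0, 1, 1]          # two students missed the item
totals = [10, 20, 30, 40]    # higher scorers got it right
r = point_biserial(item, totals)
print(round(r, 3))  # a strongly positive value, as desired
```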
Distractor analysis
All of the incorrect options, or distractors, should actually be distracting. Preferably, each distractor should be selected by a greater proportion of the lower scorers than of the top group. To be acceptable, a distractor should attract at least one candidate. If no one selects a distractor, revise the option to make it a more plausible choice.
Reliability of the test
The summary statistics found at the beginning of your item analysis include an estimate of the test’s reliability. KR-21 is a measure of the internal consistency of your exam, or how well the items work together to obtain a measure of student achievement.
Generally, values of .60 or greater are acceptable for the purposes of classroom tests; however, accurate interpretation of this value often requires considering a number of factors that affect student performance. Factors that lower the reliability of a test include poorly written items, too many items that are very easy or very hard, and too few items overall to obtain a stable estimate of a student’s ability.
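The standard KR-21 formula uses only the number of items k, the mean total score m, and the variance v of total scores: KR-21 = (k/(k-1)) × (1 − m(k−m)/(k·v)). A minimal sketch, with invented totals and population variance assumed:

```python
from statistics import mean, pvariance

# KR-21 reliability sketch: k = number of items, m = mean total score,
# v = variance of total scores (population variance assumed here).
def kr21(total_scores, k):
    m, v = mean(total_scores), pvariance(total_scores)
    return (k / (k - 1)) * (1 - m * (k - m) / (k * v))

scores = [35, 40, 30, 45, 25]  # made-up totals on a 50-item test
estimate = kr21(scores, 50)
print(round(estimate, 2))  # 0.81, above the .60 classroom guideline
```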
Adapted from the Michigan State University website and Barbara Gross Davis’s Tools for Teaching.
Please have your students fill out their Scantron sheets using the steps below.
- Print their LAST and FIRST NAME and fill in (shade) the appropriate boxes below.
- Starting at the left, print their entire STUDENT NUMBER, including the 100 or 110 (i.e., 100123456). Fill in (shade) the appropriate boxes below.
- The EXAM VERSION NO. indicates which version of the exam they are writing. If you’ve handed out different versions of the exam, this field must be filled in; otherwise, the exam cannot be graded accurately and the student may receive an F. If there is only one version of the exam, students should leave this box blank.
- The COURSE NO. field is for the full course code, i.e., PSYC1001A. Have students print the course code and fill in the appropriate boxes below.
- In the DATE OF EXAM boxes, students should print the date of the exam and fill in the appropriate boxes below.
It’s important to remind them of the following:
- Use a dark lead pencil (HB #2 works well)
- Fill in (shade) answers firmly and neatly
- Completely erase any changed answers with a soft eraser
- Do NOT staple, fold, tear or crumple the form
- Do NOT draw or write on the bar code black lines on the left hand side of the exam