1. Examinations and Grading Regulations
    1. Using Scantron to Grade Multiple-Choice Exams
    2. Exams
    3. Deferred Term Work
    4. Grades
    5. Final Grade Standards
  2. Assignment prompts: What do you really mean?
  3. Assignment Design: Alternatives to Term Papers
  4. Multiple-Choice Exams
    1. Overall formatting and face validity
    2. Composing question stems
    3. Composing response options
    4. Final point for consideration
  5. Short Answer Questions
    1. Why use SAQs
    2. Tips on developing SAQs
  6. Rubrics
    1. Why use scoring rubrics
    2. How to create a scoring rubric
    3. Hints for creating a scoring rubric
  7. ePortfolio Grading Rubrics
  8. Online Assessments
    1. FAQs
    2. How to’s

Examinations and Grading Regulations

Detailed information about Carleton’s course evaluation policy is available in Section 2 (Regulations) of the undergraduate and graduate calendars, both of which can be found at carleton.ca/academics. Of note is Carleton’s early feedback guideline. Wherever possible, and especially in first and second year courses, instructors are urged to include academic work that is assigned, evaluated and returned prior to the 25th teaching day of each term. More generally, all instructors are urged to do this prior to the 40th teaching day of each term.

Course outlines should provide an indication of approximately when the first graded piece of work will be returned to students. In cases where a course does not lend itself to early feedback, this should be clearly noted on the course outline.

Using Scantron to Grade Multiple-Choice Exams

For multiple-choice exams, you may choose to use Scantron answer sheets (i.e., bubble sheets), which are available through your departmental administrator. The EDC can scan the exams and usually has them scored and returned to you within 48 hours. Find out more about Scantron here.


In-class Exams (Informal Exams)

Your midterm and in-class exams (including in-class finals) are generally your responsibility. There are exceptions: for example, if you are teaching one of several sections of the same course, the exams may already be organized. In a number of units, however, separate sections of the same course are run as separate courses and you have full autonomy.

You are responsible for a number of tasks when it comes to in-class exams, including:

  • Copying the exam papers to distribute to the class. Check with your departmental administrator to see if your academic unit has a photocopying budget.
  • Proctoring the exams. If you have TAs, it’s best to have them on site. You may be able to request additional help from your academic unit if needed.
  • Meeting any special accommodations students with disabilities may require for tests and exams.
  • Ensuring the exam’s integrity. Exercise good judgment in the seating arrangement and ensure the test or exam doesn’t exceed the time allotted for the class.

Have an airtight system for collecting exams and exchanging them among TAs for marking. If you lose the exams, contact your chair/director right away for advice on how this is handled within your academic unit.

You set the policy and procedure for missed exams. This policy should be clearly written in the course outline. It’s often a good idea to schedule one make-up exam and have a clear policy for what happens if both exam sessions are missed. You may have to draft an alternate exam in this case. Consult the EDC for strategies to re-work your assessment on the fly.

Once all of your students have written the exam, you can release the grades through cuLearn. This protects your students’ privacy and fulfills your responsibilities under FIPPA. Never post student numbers, names and grades in any public way.

You decide whether or not to return exams to students; however, they should have the opportunity to review their exams regardless of your choice. Even if you don’t want them to keep the exams, it’s a good idea to go over the exams with your students so they know what material they have mastered and where they need to place additional focus.

Administering your midterm in a large class is often difficult. Here are some tips for managing the situation:

  • Give different versions of the exam. Often, just scrambling the questions on the first page is enough to deter copying.
  • Have the first page of each version appear in a different colour.
  • Hire extra proctors. Many academic units have TAs available for hire to help with proctoring.

Final Exams (Formal Exams)

Exam Services schedules final exams. Your departmental administrator will contact you near the end of your course with the details. You will have to provide a copy of your final exam to Exam Services so they can make copies and bring them to your exam.

You must be present for the entire duration of your exam. Report to the head proctor when you arrive so they can find you easily, if needed. If a class is seated in a room by itself, you are expected to proctor. TA(s) are welcome to accompany you (this is recommended if your class is divided into separate rooms for the exam), but remember that this has to be counted as part of their contract hours for the term. If you can’t attend the exam, you are responsible for finding and securing an appropriate replacement (sending TAs in your place isn’t acceptable).

When the exam ends, you are responsible for removing the exams from the room. The turn-around time for calculating final grades is typically quite short (within five calendar days of the exam). You can find the deadline on Carleton Central under “E-Grades.”

Exam Services provides accommodations for PMC students. These exams are usually held at the same time as, but in a different location than, the rest of the class. Finished exams will be mailed to your academic unit.

Students who don’t show up for the exam should be referred to the Registrar’s Office to apply for a deferral. Keep in mind that you will be responsible for grading the deferred exam and will be given the option to set a new exam. If a student knows ahead of time that they cannot make the final exam and you are satisfied that the reason is valid (if unsure, check with your chair/director), you can re-schedule it to occur before the formal exam.

Students are permitted to review their final exam up to a year after the course ends. You must therefore keep exams for a full year after the term ends and supervise the review. Once the year is up, you need to shred the exams to protect students’ privacy. Several academic units have storage facilities and shredding services, but some leave it up to the instructor (check with your departmental administrator).

Assigning a Take-Home Exam

Very clear guidelines must be established about your expectations of your students for take-home exams, including: word limits, guidelines about consultation with each other and how much external material they are expected to use. Take-home exams are due no later than the last day of the official exam period.

Deferred Term Work

If a student is unable to complete a significant term assignment late in the term because of illness or other circumstances beyond their control, it may be necessary to delay the due date beyond the deadline for reporting final grades. In this case, the student may contact their instructor and request an alternate arrangement. For details, please see section 2.6 of the undergraduate calendar.


Informal/Formal Appeal of Grades

Wherever possible, both during the term and after, concerns about the grading of student work should be settled informally between the student and the instructor. As a result of this informal appeal process the original grade may be raised, lowered or left unchanged.

If the issue cannot be settled informally, the student may submit a formal appeal of grade through the Registrar’s Office. For details, please see section 2.7 of the undergraduate calendar.

Posting Grades

You may use cuLearn to post your course work grades. For final grades, you must use E-Grades (accessible through Carleton Central). Training on E-Grades is available through the Registrar’s Office at select times of the year. Find out more about entering grades here. More information about cuLearn and E-Grades can be found in the computing for instructors section of this website.

Final Grade Standards

Pass/Fail Standards

Before you set your tests/exams and grade assignments in your course, it is best to ask your chair or director and a few faculty members what the grading and distribution standards in your academic unit are. Whether you agree with the standards or not (philosophically or practically), it is better to know about these standards before you are called by the dean’s office to explain your grading.

Carleton’s Grading System

As set out in section 2.3 of the undergraduate calendar, the following letter and 12-point grade system is used at Carleton. Standing in courses is shown by alphabetical grades.

A+ 12
A 11
A- 10
B+ 9
B 8
B- 7
C+ 6
C 5
C- 4
D+ 3
D 2
D- 1
F 0

Number to Letter Grade Transfer

You determine the grades in your course, but they are subject to approval and adjustment by your chair/director and faculty dean. All final grades at Carleton are letter grades. The fairest approach to grading is to record term work as number grades, calculate final grades, and then translate those numbers to letter grades before submitting them.

The following table indicates conversion from percentage grades to letter grades.

A+ 90-100
A 85-89
A- 80-84
B+ 77-79
B 73-76
B- 70-72
C+ 67-69
C 63-66
C- 60-62
D+ 57-59
D 53-56
D- 50-52
F 0-49
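For instructors who compute final grades in a spreadsheet or script, the conversion table above translates directly into a lookup. The following is a minimal sketch (the function name is illustrative, but the cutoffs come straight from the two tables above); it maps a percentage to both the letter grade and the corresponding 12-point value:

```python
# Percentage-to-letter conversion per the table above, paired with the
# 12-point value from Carleton's grading system.
GRADE_BANDS = [
    (90, "A+", 12), (85, "A", 11), (80, "A-", 10),
    (77, "B+", 9),  (73, "B", 8),  (70, "B-", 7),
    (67, "C+", 6),  (63, "C", 5),  (60, "C-", 4),
    (57, "D+", 3),  (53, "D", 2),  (50, "D-", 1),
    (0,  "F",  0),
]

def to_letter_grade(percent: float) -> tuple[str, int]:
    """Return (letter grade, 12-point value) for a percentage grade."""
    for cutoff, letter, points in GRADE_BANDS:
        if percent >= cutoff:
            return letter, points
    raise ValueError(f"invalid percentage: {percent}")

print(to_letter_grade(86))    # ('A', 11)
print(to_letter_grade(49.5))  # ('F', 0)
```

Note that fractional percentages fall into the band below the next cutoff (e.g. 89.5 is an A, not an A+); whether and how you round borderline grades is a policy decision for you and your academic unit, not something the table dictates.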

Grade Adjustments

Systematically adjusting student grades on course work or final grades is discouraged in some faculties and endorsed in others. If you are going to systematically adjust grades, you must indicate this in your course outline. You may be asked to make adjustments to final grades if your chair/director or dean feels the distribution signals a problem (e.g., the assessment was too hard or too easy). Discuss the situation openly with your chair/director. If you have a reason for the skew, communicate it. Discuss a possible solution and be sure you understand how grades will be adjusted.

Assignment prompts: What do you really mean?

The following tips tell you what is really meant by some common assignment task descriptors. When you review the assignment with your students it may be useful to go over the terms you’ve used and be sure they understand what you meant by them.

Assignment prompts

  • Identification terms: cite, define, enumerate, give, identify, indicate, list, mention, name, state
  • Description terms: describe, discuss, review, summarize, diagram, illustrate, sketch, develop, outline, trace
  • Relation terms: analyze, compare, contrast, differentiate, distinguish, relate
  • Demonstration terms: demonstrate, explain why, justify, prove, show, support
  • Evaluation terms: assess, comment, criticize, evaluate, interpret, propose

What do you really mean?

  • Analyze: Divide a complex whole into its parts or elements, laying bare parts or pieces for individual scrutiny, so as to discover the true nature or inner relationships.
  • Compare: Look for qualities or characteristics that resemble each other. Emphasize similarities among them, but in some cases also mention differences.
  • Contrast: Stress the dissimilarities, differences, or unlikeness of things, qualities, events or problems.
  • Critique: Express your judgment about the merit or truth of the factors or views mentioned. Give the results of your analysis of these factors, discussing their limitations and good points.
  • Define: Give concise, clear, and authoritative meanings. Don’t give details, but make sure to give the limits of the definition. Show how the thing you are defining differs from other things.
  • Describe: Recount, characterize, sketch, or relate in sequence or story form.
  • Diagram: Give a drawing, chart, plan, or graphic answer. Usually you should label a diagram. In some cases, add a brief explanation or description.
  • Discuss: Examine, analyze carefully, and give reasons pro and con. Be complete, and give details.
  • Enumerate: Write in list or outline form, giving points concisely one by one.
  • Evaluate: Carefully appraise the problem, citing both advantages and limitations. Emphasize the appraisal of authorities and, to a lesser degree, your personal evaluation.
  • Explain: Clarify, interpret, and spell out the material you present. Give reasons for differences of opinion or of results, and try to analyze causes.
  • Illustrate: Use a figure, picture, diagram, or concrete example to explain or clarify a problem.
  • Interpret: Translate, give examples of, solve, or comment on a subject, usually giving your judgment.
  • Justify: Prove or give reasons for decisions or conclusions, taking pains to be convincing.
  • List: As in “enumerate,” write an itemized series of concise statements.
  • Outline: Organize a description under main points and subordinate points, omitting minor details and stressing the arrangement or classification of things.
  • Prove: Establish that something is true by citing factual evidence or giving clear logical reasons.
  • Relate: Show how things are related to, or connected with, each other or how one causes another, correlates with another, or is like another.
  • Review: Examine a subject critically, analyzing and commenting on the important statements to be made about it.
  • State: Present the main points in brief, clear sequence, usually omitting details, illustrations, or examples.
  • Summarize: Give the main points or facts in condensed form, like the summary of a chapter, omitting details and illustrations.
  • Trace: In narrative form describe progress, development, or historical events from some point of origin.

Common descriptions of information sources

  • Web-based: There are many types of online information, including e-journals, home-pages, newsgroups, and more. When you discuss “web-based” resources, be specific about what sort of online information you are referring to.
  • Scholarly journals: Articles are long, use terminology or jargon of the discipline, usually begin with an abstract and include a bibliography (e.g., Canadian Journal of Experimental Psychology; Journal of Academic Librarianship; IEEE Transactions on Microwave Theory and Techniques).
  • Popular journals: These are geared towards a more general audience and available on your local newsstand. Articles are short and rarely have bibliographies. (e.g., Maclean’s, Newsweek).
  • Current: Specifically define your boundaries for “current.” Do you mean “current” as in this week, this year, this decade, this century, etc.? Can they refer to older material at all, if it is relevant?
  • Peer reviewed (or refereed) journal articles: Explain the process of having experts in the field examine an article before it is published to ensure that the research described is sound and of high quality. Refer students to the Notes for Authors section of a journal to determine if it follows peer review.
  • Primary sources: These provide firsthand information in the original words of the creator or eyewitness and may include creative works, original documents, reports of original research, or ideas.
  • Secondary sources: These provide reviews, evaluations, analyses, or interpretations of primary sources.

Reproduced with additions from: Skidmore College, NY: Common Terms for Paper Topics and Essay Questions. Permission from: Professor Michael Steven Marx, Associate Professor of English and Coordinator of Liberal Studies 1, English Department, Skidmore College.

Assignment Design: Alternatives to Term Papers

There are many different ways to have students go through the research and writing process without relying on the oft-assigned term paper. Assigning something different will generate enthusiasm for the assignment and encourage original work. Be creative!

  • Reflections on the process – At various times students turn in written descriptions of their research process.
  • Problem solving approach – What steps would be taken to solve a problem.
  • Literature review – Evaluative annotated bibliography.
  • Science in the news – Find evidence in literature for news release claims.
  • Event initiated examples – Evaluate a current event based on literature findings.
  • Web assignment – Design an informative webpage.
  • Evaluate thinking – Have students discuss what they found and compare sources.
  • Read and find facts – Read an editorial, article, reflection and find facts to support it.
  • Create a webpage – Select a topic to do with course content.
  • Biography – Select a scholar/researcher in the field and report on their career, influences, major ideas, and trends in their research program.
  • Follow a piece of legislation through parliament – What groups are lobbying for/against it and why?
  • Follow a current foreign policy issue as it develops – Have students adopt the perspective of one of the various groups involved and predict the next move.
  • Nominate someone for a Nobel prize – Justify the nomination.
  • Adopt a persona – Write journal entries, letters, commentaries from that person’s perspective.
  • Write an exam – Provide answers and provide a rationale for the responses.
  • Write a review of a performance, a movie, a book, a journal article, a guest speaker lecture, etc.
  • Write a newspaper, magazine, webpage story on a topic.
  • Describe your dream job – Research careers in the field and justify the choice of company, location, job, etc.
  • Compare and contrast primary and secondary sources.
  • Evaluate a website.
  • Compare and contrast the state of knowledge on a topic in two different decades or eras.
  • Conduct research but don’t write the final draft.
  • Prepare for a hypothetical interview – Do background research on a company or job offer and how you fit with job description.
  • Compare and contrast the content, style, and audience of three different scholarly journals in a field.
  • Compare and contrast a scholarly journal article with an article from a popular magazine.
  • Prepare for an interview with a top figure in the field – Justify responses, provide background notes.
  • Compare and contrast the ways different disciplines deal with the same subject matter.

Multiple-Choice Exams

It’s often assumed that administering multiple-choice tests is solely a matter of convenience: testing large numbers of students simultaneously with minimal time spent grading, or assessing students’ passive recognition of key concepts. There is no denying that these are real advantages, but this doesn’t mean that multiple-choice tests can’t be developed to promote and assess deep student engagement with course content. This document provides some useful tips about formatting and composing effective multiple-choice tests, including question stems and response options.

Overall formatting and face validity

  • Use the one-best-response format; avoid true/false, multiple-correct and complex K-type formats, which test logic and reading skills rather than content knowledge
  • Present questions and options vertically instead of horizontally to make the break between responses explicit, thereby ensuring the readability of individual questions
  • Ensure that all of the options for a particular question appear on the same page as their corresponding question; do not split items across two pages
  • If referencing media (e.g. illustrations or charts), ensure that its location is explicit and obvious; whenever possible place that media on the same page as, and directly above, the question
  • Avoid overly specific and overly general content; keep questions and options short and concise
  • Keep vocabulary appropriate for the group being tested; avoid the use of acronyms
  • Although three options may be adequate, four options can help maintain the validity of a question stem and the overall test. Five options increase work effort (e.g. reading time) without significantly improving the ability to discriminate between strong and weak performers
  • Use relevant material to test higher level learning, such as inclusion of typical settings; application questions (versus simple recall) can increase validity of exam
  • Proof and edit – and have someone else proof and edit – each question stem and response option for proper and consistent grammar, punctuation, capitalization and spelling

Composing question stems

  • Avoid trick questions which test neither mastery of content nor achievement of learning objectives; they erode students’ confidence, making them second-guess themselves (and you)
  • Focus on a single topic in each question so that if a student chooses an incorrect response, it is easy to identify which content they have not mastered
  • Keep the content of each question independent from that of other items on the test; this way, difficulty with one question does not mean a student is unable to complete other test questions
  • Phrase the stem as a question; students should be able to come up with a reasonable potential answer prior to looking at the choices
  • Frame the stem positively; avoid negatives such as NOT and EXCEPT; if negative words are used, ensure that they are CAPITALIZED and boldfaced
  • Present stems in such a way as to question FACTS rather than personal opinions or preferences (e.g. avoid using the pronoun “you” in the stem of the question)

Composing response options

  • Make all distractors plausible yet definitively incorrect; silly or implausible distractors increase students’ chances of guessing the correct answer, even if they have not studied
  • Use familiar yet incorrect phrases or typical student errors as distractors to ensure that students cannot guess the answer based on the familiarity of only one of the choices
  • Keep length of choices about equal to avoid guessing based on common assumptions that the longest answer is always the correct answer (i.e. because the professor is careful to make it precisely correct and defensible)
  • “None of the above” should be used carefully as it increases question difficulty; if this option is not used, students know that the correct answer is included in the offered list and, thus, may be able to reason their way to the answer
  • “All of the above” should always be avoided; it’s difficult to create a valid question in this format and it’s an easy one for test-wise people to figure out
  • Avoid giving clues to correct responses by using either specific determiners (e.g. “always”, “never”, “completely”, “absolutely”) or choices identical to or resembling words in the stem
  • Avoid providing clues to the right answer via, for example: grammatical inconsistencies; conspicuous correct choice; pairs or triplets of options; blatantly absurd options

Final point for consideration

Consider the overall difficulty of the test in light of the fact that university exams are supposed to assess mastery of course materials as taught. Some questions should test items that most students should know from the course materials, while others should allow you to discriminate between highly competent and less competent students.

Short Answer Questions

Short answer questions (SAQs) are brief, to the point, and a useful means of assessing students’ knowledge and comprehension of foundational information. You can use the following guidelines to draft SAQs for student assessments. Make sure that:

  1. Questions can be answered in a few words or phrases
  2. Each question has a single focus (if you need to ask several questions, use a vignette as a preface, then ask several focused questions)
  3. Response guidelines are built into the assessment tool (e.g. space allocated on the page indicates the length and complexity of the answer)
  4. Questions build on each other, where appropriate (i.e. increasing degrees of specificity without giving away the answer to the previous question)
  5. The number of marks per question is indicated

For example, you can ask students the following two SAQs:

  1. How many countries make up Africa in total? (2 marks)
  2. List six African countries. (1 mark each; 6 total)

Why use SAQs

Short answer questions are generally easy to develop. Like multiple-choice tests, they are great for assessing students’ ability to remember foundational information, such as terminology, yet they do not require you to come up with viable distractors. Well-written SAQs provide students with clear tasks and provide you and/or your TAs with a clear marking guide. Rather than asking students to explain in detail the process of eukaryotic cell reproduction, for instance, you might ask them to:

  1. Identify the number of phases
  2. Name those phases
  3. Explain what happens during two of those phases

Students tend to provide more concise answers to such questions, and TAs generally find them easier to grade than long answer questions.

Tips on developing SAQs

  • Create questions that (a) can be realistically answered in a few words/phrases and (b) have a single focus per question or subquestion.
  • Restrict the length of the answer by using precise wording to define the task. In other words, ask direct questions (what is…) and use action verbs such as (list, name, identify).
  • Choose a topic area based on what you have covered in class and readings.
  • Review questions to make sure that (a) difficulty level is appropriate for students, (b) amount of information is clearly identified, (c) mark assignment is consistent, (d) content of the questions relates to the overall goal of the assessment.
  • Provide students with a guideline for length of answers (e.g. list three items…, define three of the following four terms in one or two sentences each…).
  • If you use vignettes, (a) take into account how much time they will take to read, (b) make sure you do not give away the answer in the vignette, (c) give students a relatively plausible situation for the specific context.


Rubrics

Scoring rubrics are documents which indicate (1) the criteria according to which students’ work is graded and (2) descriptions of various (three or four) levels of performance for each criterion. Instructors create these documents and, ideally, provide them to students and teaching assistants before the assignment is due. A rubric is thus a useful guideline for students, teaching assistants, and instructors. Rubrics can be constructed for classes in any discipline and for a variety of different types of assignments.

Why use scoring rubrics

There are many reasons you might want to use a scoring rubric. A well-constructed rubric functions as an instructional tool which guides students in developing their skills. It does so by clearly indicating what constitutes a beginner level of a skill through to an exemplary level of performance. This allows them to attempt to measure their own work against the criteria before submitting an assignment. A rubric also provides students with structured formative feedback: it gives them a sense of their strengths and areas for improvement as well as indicating how they can move from one level of performance to another. Rubrics also make the work of grading less tedious because you do not have to keep writing the same comments over and over as you move from one submission to another. Another useful element of rubrics is that they give multiple evaluators (i.e. TAs) a shared tool to use. This allows for more consistency across grading and provides specific suggestions they can give to students about how to improve their work. Finally, for some learners, the visual representation of grading makes it easier to understand and accept an assigned mark.

How to create a scoring rubric

Although rubrics may speed up the amount of time grading takes, they do take some time to prepare. If possible, you might want to spend a little time looking over other rubrics to get a sense of what would or would not work for the particular assignment you have in mind. Once you are ready to start, there are several steps to follow:

  • Decide on the 3-6 most important criteria for a particular type of assignment. For example, communication style is a key criterion in an oral presentation.
  • Once you have selected the most important criteria, consider the specific elements you would want to include in your description of each level of performance. In the oral presentation example, as you describe the quality of each level of performance for communication style, you might refer to several items: how audible the presenters were, whether they pronounced technical terms correctly, whether they explained all terms, whether they made eye-contact with their audience.
  • Decide how many levels of performance you want to include. Ideally, there should be either 3 or 4 levels of performance. Using fewer than three levels means losing specificity and nuance in the descriptions. Yet, using more than four levels makes it difficult to write meaningful descriptors.
  • Decide on the weight you want to give to each criterion and each level of performance. For example, a poor performance on a criterion graded out of four might get one mark (or a D) while an exemplary performance might get four marks (or an A).
  • Create a table which includes the list of criteria in the left-hand column, then add another three or four columns depending on how many levels of performance you have chosen. Across the top of the table, you can label each column with a name (e.g. skill level: beginner, emergent, adequate, exemplary). For each criterion, assign a number and enter a description of the attributes describing each level of skill. In the oral presentation example, you might describe an exemplary level as follows: voice clear and audible, pronounced all technical terms correctly, clearly and correctly defined all terms, consistently made eye contact with audience throughout presentation. You could describe an adequate level as follows: voice mostly clear and audible, pronounced one or two technical terms incorrectly, defined most terms correctly, did not always maintain eye contact.
  • Once you have finished filling in the descriptions in each square in the table, you have a rubric!
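For those who like to keep rubrics in electronic form, the finished table amounts to a small data structure: each criterion maps levels of performance to descriptors, and grading is a lookup plus a sum. The sketch below is purely illustrative (the criterion names, levels and descriptors are placeholders, not a prescribed format):

```python
# A sketch of the oral-presentation rubric described above. Each criterion
# maps performance levels (1 = beginner .. 4 = exemplary) to a descriptor.
# Criterion names and descriptors are hypothetical examples.
RUBRIC = {
    "communication style": {
        4: "voice clear and audible; all terms pronounced and defined correctly; consistent eye contact",
        3: "voice mostly clear; one or two terms mispronounced; most terms defined; eye contact not always maintained",
        2: "voice sometimes inaudible; several terms mispronounced or undefined; little eye contact",
        1: "voice inaudible; terms not defined; no eye contact",
    },
    "organization": {
        4: "clear structure throughout",
        3: "mostly clear structure",
        2: "structure hard to follow",
        1: "no discernible structure",
    },
}

def score(levels: dict[str, int]) -> int:
    """Sum the level (1-4) awarded for each criterion in the rubric."""
    return sum(levels[criterion] for criterion in RUBRIC)

marks = score({"communication style": 4, "organization": 3})
print(marks)  # 7 out of a possible 8
```

Storing the rubric this way also makes it easy to hand the same criteria and descriptors to several TAs, which supports the consistency benefits described above.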

Hints for creating a scoring rubric

  • Where possible, look at examples. You can ask your colleagues or search for examples online. Check out rcampus.com (it includes a searchable database of rubrics, and you can get a free membership to use the site to generate your own rubrics).
  • Start with the highest level of performance for each criterion. This is usually easiest to describe and provides you with a good starting point for imagining a slightly less exemplary performance.
  • Keep your rubric as simple as possible – too much detail is overwhelming for you to come up with, your TA to keep in mind, and students to take in.
  • Do not go over four levels of performance.
  • Make sure your criteria and descriptions are general enough that students can transfer their learning to the next assignment (or one for another class), but narrow enough to provide meaningful feedback.
  • Avoid relative terms! In other words, do not describe a level of performance only in relation to another. For example, if one level of performance includes the description “no grammatical errors,” the lower levels should say “few grammatical errors” (not “more grammatical errors”), “some grammatical errors,” and “many grammatical errors.” The idea is to allow each student to understand their own level of performance against the criteria, not how they measure against other students.

ePortfolio Grading Rubrics

Carleton’s ePortfolio Faculty Learning community has designed rubrics to help instructors assess course-level undergraduate ePortfolio assignments. The group is sharing these rubrics as an open educational resource so that other instructors can adapt and use them in their own courses.

You can change, remove or add language within the rubrics or use language from the rubrics to help create your assignment descriptions. You have permission from the authors to use content from the rubrics however you see fit for your ePortfolio assignment. Learn more about the rubrics here.

Online Assessments

More and more instructors are opting to move their tests, quizzes and midterms online because of the convenience and the reduced paper use: assessments are graded automatically and can be reused from year to year.

One of the biggest concerns facing instructors today is cheating. There are a number of things you can do to mitigate this concern; the most popular is using question categories (see the first FAQ below). Remember, if you have any questions or would like assistance, the EDC can help you move from paper-based assessments to an online version.

FAQs


Q: Is there a way to prevent cheating on online assessments?
A: There are many ways to reduce the risk of students cheating on an assessment:

  • Create multiple versions of the same question and randomly select one using question categories.
  • Add a time limit to your assessments, giving students less time to consult outside sources.
  • Set the questions to shuffle answers so that the answer order will differ for each student.

Q: Will my students know how to take the test or have technical problems while trying?
A: For first-year students or students not familiar with online testing, we suggest you create a pre-test to ensure that students do not run into trouble or technical issues with the testing software. It’s important to catch the main issues before the real test.

Q: What happens if a student is writing an exam and loses their internet connection? Or purposefully turns off their computer?
A: If a student loses their internet connection or their computer shuts off, everything up to that point is saved. You will just need to override the test settings for that particular student, allow them one more attempt, and adjust the time limit to reflect the time remaining.

Q: Are students able to go over their exams afterwards to learn from their mistakes or see comments on a short essay question?
A: Students can see their answers as well as the correct answers after submitting if you enable the proper settings. You can find more information on how to enable these settings by consulting “Quizzes” on the instructor technical support site.

Q: What happens if a student misses the exam day because of some emergency and has to write at a later date?
A: You can either:

  • Create a copy of the exam and release it at a different date to the selected student. This process is relatively easy and is described below under the PMC students section of the How to’s.
  • Or, using a user override, you can simply adjust the test settings to open the test for those particular students.

Q: I have PMC students that need extra time. Can I extend the exam time for a couple of students?
A: Yes. See the PMC Students section in the How To’s below.

Q: I have 50 questions for a midterm. I want cuLearn to randomly select 25 from a pool of easy questions, 15 from a medium pool, and 10 from a hard pool. Is this possible?
A: To do this, you would create three question categories in the question bank. Question categories let you select a group of questions from which a certain number will be chosen at random for the assessment. For information on how to do this, see the Quizzes section.
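
The selection described above amounts to a random draw from three pools. The sketch below models it in Python so you can see the idea; the pool contents are placeholders, and cuLearn performs the equivalent selection itself.

```python
import random

# Placeholder question pools, one per category.
easy = [f"easy-{i}" for i in range(1, 51)]       # 50 easy questions
medium = [f"medium-{i}" for i in range(1, 31)]   # 30 medium questions
hard = [f"hard-{i}" for i in range(1, 21)]       # 20 hard questions

def build_exam():
    """Draw 25 easy, 15 medium, and 10 hard questions without repeats."""
    return (random.sample(easy, 25)
            + random.sample(medium, 15)
            + random.sample(hard, 10))

exam = build_exam()
print(len(exam))  # 50 questions, drawn fresh for each attempt
```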

Q: Is there a way I can see all the questions I’ve generated?
A: Click the test title and then click “Preview” under quiz administration. Note: if the quiz contains question categories, the preview will not display every possible question. It will randomly select questions from the question categories and display those.

How to’s

Exporting and importing questions from the question database


To export questions:

  1. Click question bank on the left side under course administration
  2. Click Export
  3. Select the format you want the questions be exported as
  4. Select the category of questions you’d like to export
  5. Click “export category to file”
  6. Select the destination for your exported files and click OK.


To import questions:

  1. Click question bank under course administration on the left
  2. Click Import
  3. Select the format of the question file you want to import – cuLearn can currently only import files in Aiken or Moodle XML format
  4. Select the category you’d like to import the questions to
  5. Upload the questions file into the box under “Import questions from file”
  6. Click import. If the file has no compatibility issues with cuLearn, you will see a list of the questions to be imported and a request to confirm that you want to import them into the question bank.
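
The Aiken format mentioned in step 3 is a simple plain-text layout: the question on one line, each option on its own line as “A. …”, and a final “ANSWER:” line naming the correct letter. The sketch below shows a minimal parser for that layout; the sample question is invented, and the parser is a simplified illustration rather than cuLearn’s actual importer.

```python
# A made-up question in Aiken format.
SAMPLE = """\
What is the capital of Canada?
A. Toronto
B. Ottawa
C. Montreal
D. Vancouver
ANSWER: B
"""

def parse_aiken(text):
    """Parse Aiken-format text into dicts with stem, options, and answer."""
    questions, current = [], None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.upper().startswith("ANSWER:"):
            current["answer"] = line.split(":", 1)[1].strip()
            questions.append(current)
            current = None
        elif len(line) > 2 and line[0].isupper() and line[1] in ".)":
            current["options"].append(line[2:].strip())  # "A. text" or "A) text"
        else:
            current = {"stem": line, "options": []}      # a new question begins
    return questions

q = parse_aiken(SAMPLE)[0]
print(q["answer"])  # B
```

Running a file through a quick check like this before uploading can catch formatting slips (a missing ANSWER line, for instance) that would otherwise surface as compatibility errors during import.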

Creating categories in the question bank

Question categories allow you to randomly choose one or more questions for a test from a set of selected questions.

  1. Click question bank on the left side under course administration
  2. Click categories
  3. Under “Add category”, select the parent category for the new category
  4. Type a name for the new category you want to create
  5. Click Add category

Moving questions to categories in the question bank

  1. Click questions under “question bank” on the left side under course administration.
  2. Select the category that the questions currently belong to
  3. Check the little box on left side of the question title
  4. Select the category that you want to move the question to in the box next to “Move to”
  5. Click Move to

PMC students

One of the most common issues with online assessments is that Paul Menton Centre (PMC) students often require more time than other students. To accommodate this, you can adjust the quiz settings for those students only, using the quiz’s user overrides.

To release the assessment to selected students

  1. Click the quiz title
  2. Click “User overrides” on the left side under quiz administration
  3. Click Add user override
  4. Select the students you want to release the quiz to
  5. Adjust the open and close dates and times for the quiz, the time allocated, and the number of attempts allowed for those students
  6. Click Save
