
Time-Saving Assessment Workflows in Blackboard

This post brings together details of a range of tools and workflows that you can use through Blackboard to help you develop more efficient approaches to marking student work.

If you are looking for more efficient marking processes, particularly in cases where you have large cohorts of students, there are a number of tools available that may be able to help you. The purpose of this post is to help you identify the options available by giving more details of each, supported by examples of efficient marking workflows currently in place. This will hopefully go some way to easing fears about using automated marking workflows, and show how the initial time investment to set up the workflow will reap rewards with the efficient marking of student submissions down the line.

We have also created an accompanying post on Writing Strong Test Questions. This resource gives good practice advice to help with the creation of well-constructed, high-quality assessment questions to effectively test student reasoning and synthesis skills.

Why Use Auto-Marked Assessments?

Auto-marked assessments with formats such as multiple-choice questions (MCQs), true or false, matching pairs, and fill in the blanks can significantly streamline the marking workflow. As there are known and pre-defined correct answers, these question types can be instantly graded by computer systems upon submission. This rapid processing not only saves staff time but also guarantees grading accuracy and consistency. Feedback can be provided immediately to students, supporting their learning by quickly identifying areas for improvement. This is especially beneficial in large class settings where manual grading is impractical and time-consuming.
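
As a purely illustrative sketch (this is not Blackboard's own code, and the question names are invented), the logic behind auto-marking amounts to comparing each response against a stored answer key, which is why a score can be returned the moment a student submits:

    # Toy illustration of auto-marking against a pre-defined answer key.
    answer_key = {"Q1": "B", "Q2": "True", "Q3": "mitochondrion"}  # hypothetical questions

    def mark_submission(responses):
        """Count how many of a student's responses match the answer key."""
        return sum(1 for question, correct in answer_key.items()
                   if responses.get(question) == correct)

    # A submission is scored instantly, with no manual marking step.
    print(mark_submission({"Q1": "B", "Q2": "False", "Q3": "mitochondrion"}))  # prints 2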

It’s important to be aware that efficiency gains don’t happen straight away; there is significant workload in setting up good tests, so the first time you do this, the work shifts from marking to assessment design.

Auto-marked assessments can be useful in a range of contexts: 

  • as diagnostic tools to identify areas of misunderstanding
  • as formative exercises to help students master concepts and reinforce understanding
  • to provide feedback and help students develop self-assessment skills
  • for summative assessment 

Select what you wish to do from the following options to be directed to the relevant section:

  • Create Auto-Marked Assessments with Known Answers directly in Blackboard
  • Create Assessments for Maths Notation Using Numbas
  • Create Numeric Assessments with Known Answers using Gradescope Online Assignment


Create Auto-Marked Assessments with Known Answers directly in Blackboard


Blackboard allows the creation of auto-marked assessments with formats such as multiple choice questions (MCQs), matching pairs and fill in the blank. Because the correct answers are known and pre-defined, these question types are graded instantly on submission, providing immediate marking and feedback to students. This assessment creation route is particularly helpful for large cohorts of students, where it can significantly streamline the marking workflow.

MCQ and numerical tests using Blackboard are already in use as both formative and summative assessments in the departments of Computer Science, Psychology and Earth Sciences within the Faculty of Science.

Question Types

Blackboard allows the creation of a variety of question types that may be marked automatically, including multiple choice, true or false, matching pairs and fill in the blank.

Question Pools

Blackboard allows the creation of Question Pools: an assessment can be set up so that Blackboard randomly picks x of y questions from a pool and delivers them to each student, so that different students undertaking the same assessment are presented with different questions of equal difficulty and equal value.
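
The underlying idea is easy to picture. The short sketch below is purely illustrative (it is not Blackboard code) and simply shows the "randomly pick x of y" selection that a Question Pool performs for each student:

    # Illustrative only: choose x questions at random from a pool of y.
    import random

    question_pool = [f"Question {n}" for n in range(1, 21)]  # a pool of y = 20 questions

    def draw_paper(pool, x):
        """Return x questions drawn at random, so each student sees a different set."""
        return random.sample(pool, x)

    print(draw_paper(question_pool, 5))  # one student's 5 questions
    print(draw_paper(question_pool, 5))  # another student's (likely different) 5 questions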

Question Banks

If large numbers of questions are created, they may be added to a Question Bank, where they can be grouped by topic or difficulty level and indexed so that they are searchable. When creating an assessment, instructors can select one or more questions from a Question Bank and copy them into the new assessment.

Categorising questions into Question Banks gives staff the opportunity to build up large numbers of questions, reuse them in other assessments on the same module, or share them with other modules, so the potential gains from drafting well-crafted, shareable questions are clear. Sharing questions in this way not only greatly increases the efficiency of assessment creation, but can also better engage staff in the workings of the VLE, encouraging them to reflect on their academic practice when it comes to assessing students. Further guides are available with extra information on Question Banks.

Generating Questions Using AI

Blackboard also contains a new Design Assistant feature that allows questions to be built using generative AI. To read more about the scope of Blackboard’s AI Tools, visit the AI Design Assistant support page.

Assessment Exceptions

Students requiring extra time for assessments may have this set up for them by means of an Assessment Exception. To learn more about how this is deployed in Blackboard, view this Grant Assessment Exceptions YouTube clip.

Return to selection area



Create Assessments for Maths Notation Using Numbas

The open source tool Numbas may be used to devise mathematical assessments that can include expression, number entry, matrix entry, matching and selection question types, all of which are auto-marked. Developed by Newcastle University but released as an open-source product, Numbas lets academic staff create tests in an online editor; these may then be shared with students via a link, or uploaded directly into Blackboard. Students receive randomised questions generated from the set of variables defined by the test creator, and assessments are marked automatically with instant feedback. As Numbas runs entirely through a browser, it can even cope with loss of internet connection during a test: in that situation, student submissions are saved locally and uploaded to Numbas once the web connection is restored.
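
To illustrate how randomised, auto-marked questions work in principle (this is a conceptual sketch in Python, not Numbas's own syntax, and the question is invented), each student's question is generated from random variables, and the correct answer is computed from those same variables so the attempt can be marked automatically:

    # Conceptual sketch of a randomised, auto-marked question (not actual Numbas code).
    import random

    def generate_question():
        """Build one instance of 'differentiate a*x^2' with a randomly chosen coefficient."""
        a = random.randint(2, 9)            # variable chosen afresh for each student
        question = f"Differentiate {a}x^2 and give the coefficient of x."
        correct = 2 * a                      # correct answer derived from the same variable
        return question, correct

    question, correct = generate_question()
    student_answer = correct                 # stand-in for the student's entry
    print(question, "- correct!" if student_answer == correct else "- try again.")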

You do not need to understand coding to make Numbas work for you, as it uses a graphical editor. Questions may include graphics, video clips and interactive diagrams, as demonstrated on a page of example multimedia questions. As the product is open source, many of its previous users have shared tests and questions they have created under Creative Commons licensing, and these may be explored in the Numbas Public Database.

Numbas supports a range of Question Types:

  • Mathematical expression
  • Number entry
  • Matrix entry
  • Match text pattern
  • Choose one from a list
  • Choose several from a list
  • Match choices with answers
  • Gap-fill
  • Information only
  • Extension

For examples of what each of these look like when created, view the Numbas Website Demo page. There is exceptionally comprehensive Supporting Documentation for Numbas available to help users get started.

Numbas is already in use within Mathematical Sciences in the Faculty of Science.

Return to selection area



Create Numeric Assessments with Known Answers using Gradescope Online Assignment

Gradescope is a tool integrated with Blackboard that offers many options for aiding online assessment. While its strengths lie in supporting written assessments, it can also be used to create online tests, although its functionality in this area is more limited than that of the other tools listed in this section.

The Online Assignment tool (currently in public beta) allows the creation of several question types, including MCQ, short answer, check boxes, and file or image uploads. Students answer the questions either by typing, selecting, or even uploading scanned or photographed handwritten responses.

Within Gradescope, there are three question types that may be graded automatically and which require no manual marking intervention. They are:

  • Multiple Choice
  • Select All (multiple response)
  • Short Answer

To see an example of how easy it is to set these question types up, view this Creating an Online Assignment Gradescope Support YouTube clip.

Return to selection area


Summary

Whatever type of assessment you have set up for your students, one of the above tools is likely to help you streamline the marking process by automating the marking to some extent, helping to reduce the marking load and save time without compromising the quality of feedback given to students.

If you have an interest in exploring one of these marking options, please contact the Science Digital Education team or Ross Parker, the Senior Digital Education Consultant (DEC) for the Faculty of Science in DCAD.