
Best practice for inclusive assessment: Delving into digital

In part one of our two-part blog, we considered general accessibility in paper-based and digital assessments – check it out if you missed it! Part two looks at specific ways to make digitally delivered assessments more accessible.



1. Embed digital accessibility

We mentioned this in part one, but there are a few specific ways you can improve accessibility for digital products.

  • Don’t forget to use colour-blind-friendly palettes and high contrast!
  • Make interactive elements as large as reasonably possible. Some students (and, in all likelihood, some adults) may struggle with the fine motor skills needed to click/press on a tiny radio button or checkbox. Ease of use should be built into your assessment design.
  • Avoid using text as part of an image. Some screen readers can’t pick up on this, and as a result will ignore it. If you have a labelled diagram, for example, ensure the labels are actual text. An added bonus here is that you’ll avoid the potential pitfalls of blurry or mismatched text when compared with the rest of your assessment.
  • Videos need captions or transcripts. In some contexts (language teaching, for example), providing a transcript can defeat the purpose of the assessment itself. Use your judgement as to when these features should be selectively excluded, but include them by default.
  • Add alt text to images. Yes, all of them.

The WCAG 2.1 guidelines are the current gold standard for digital accessibility, but it’s worth considering cultural accessibility alongside them.
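The WCAG guidelines define contrast numerically, so you can check colour pairs programmatically rather than by eye. Here's a minimal sketch of the WCAG 2.1 relative-luminance and contrast-ratio formulas (the function names are our own):

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance for an sRGB colour given as 0-255 channels."""
    def linearise(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours: ranges from 1:1 (identical) to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# WCAG 2.1 Level AA asks for at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white -> 21.0
```

A quick check like this during content production catches low-contrast text before a student ever sees it.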

2. Keep instructions simple

Writing quality question rubrics is an oft-overlooked skill. Instructions need to be clear, concise, and coherent. The most complex vocabulary in the question should be subject-specific and relevant, and any other language should be as simple as possible. Use short sentences and split up contextual and functional information.


  • Instead of: ‘Read these sentences and then click and drag the answer options into the table to show whether they are positive or negative impacts of climate change on people currently living in the UK.’
  • Try: ‘Are these positive or negative impacts of climate change in the UK? Drag each option into the table.’

A student might know everything about climate change but if they can’t understand what a question is asking them to do, they won’t get the marks.

3. Images: quality over quantity

This one can be tough to action, especially when it comes to digital material. Everyone wants eye-catching content, but in reality every image added is just another distraction from the actual assessment.

If there is an image, it should be pertinent to the question being asked and large enough to be easily legible. This can pose a particular challenge for reflowable digital assessments. Zoom or popup functionality can work well, but try to make sure that the image is visible alongside the question text as well (even at a smaller size).

This is especially applicable to graphs or diagrams. Embedded image text doesn’t reflow on smaller screens: it just gets tinier and tinier.

4. Tailor your questions

Is this a friendly, low-stakes formative quiz, or an end-of-topic assessment? Some question styles are more suited to one over the other. True or false questions, for example, can be a great way to test recall of basic concepts, but students have a 50:50 chance of guessing correctly through random selection. Useful for quizzes – less so for summative assessment and progress evaluation.
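The 50:50 problem compounds across a whole quiz, and the binomial distribution makes it easy to quantify. This sketch (our own illustration, not from the blog) shows how likely a student is to reach a pass mark by guessing alone:

```python
from math import comb

def p_guess_at_least(n, k, p=0.5):
    """Chance of answering at least k of n questions correctly by pure guessing,
    where p is the per-question guess probability (0.5 for true/false)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Ten true/false questions with a pass mark of 6 correct:
print(round(p_guess_at_least(10, 6), 3))  # -> 0.377

# The same quiz built from four-option MCQs (p = 0.25) is far harder to guess:
print(round(p_guess_at_least(10, 6, p=0.25), 3))
```

Roughly a 38% chance of passing by random clicking is fine for a warm-up quiz, but clearly unacceptable for anything summative.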

Similarly unhelpful are ‘all/none of the above’ answer options in MCQs. Ideally avoid these in any assessment: they’re a cop-out and can make anything feel like a trick question.

This kind of thinking is relevant to paper-based assessments too, but the breadth of digital assessment's applications means we've included it here.

5. Make it user-friendly

Test the functionality of questions: ideally as an integral part of your proofreading stage. Does it make sense for MCQ answer options to randomise, or should this be switched off if, for example, answer options are numbered? If a question requires two answers, are two answers selectable – and only two? If it’s a drag and drop question, should all draggable options be the same size or does there need to be a visual distinction between them? 
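Checks like these can be partly automated if your questions live in a structured format. A minimal sketch, assuming a simple dictionary per question (the field names `randomise`, `options`, `required_answers`, and `max_selectable` are illustrative, not from any real platform):

```python
def check_question(q):
    """Flag common functionality problems in a question definition.
    The schema here is hypothetical - adapt the field names to your platform."""
    issues = []
    # Numbered answer options usually shouldn't be shuffled.
    if q.get("randomise") and any(opt.lstrip()[:1].isdigit() for opt in q["options"]):
        issues.append("options look numbered - consider switching randomisation off")
    # If a question requires two answers, exactly two should be selectable.
    required = q.get("required_answers", 1)
    if q.get("max_selectable", required) != required:
        issues.append("students can select more (or fewer) answers than required")
    return issues

quiz_item = {
    "options": ["1. Oxygen", "2. Nitrogen", "3. Argon"],
    "randomise": True,
    "required_answers": 2,
    "max_selectable": 3,
}
print(check_question(quiz_item))  # both problems flagged
```

Automated checks won't replace a human proofread, but they catch the mechanical slip-ups early so your fresh pair of eyes can focus on the student experience.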

Make time to reread your assessment with your UX hat on (or ideally get a fresh pair of eyes on the job). Are there any ways a student could ‘break’ things, and could a quick fix prevent this?


Most thinking about assessment – digital or not – falls under the umbrella of instructional design. This is a wide-ranging and ever-shifting field with research being added all the time. This blog only scratches the surface of what’s out there! Some further reading to get you started:

Digital accessibility checklist

  • Embed digital accessibility
  • Keep instructions simple
  • Images: quality over quantity
  • Tailor your questions
  • Make it user-friendly