Maritime Training Issues Blog

Guidelines for Multiple Choice Questions

Jul 13, 2016 | Murray Goldberg

Introduction

In my last article I began a discussion of the use of multiple choice questions (MCQs) in maritime assessment. In this second article, I continue the discussion by looking at two aspects to consider when using MCQs:

  1. the importance of using them in combination with other assessment techniques, and
  2. the importance of understanding cultural and gender issues as they relate to MCQs.

Although this second post is somewhat “high level”, in the next and final article of this series, I will switch gears to look at some very concrete and practical tips for writing and maintaining multiple choice questions. If you would like to receive a notification when the third part of this series is available (as well as for subsequent maritime training articles) and have not yet done so, please subscribe to this blog.


Recap of Previous Post

MCQ tests are one of the oldest and most widely used assessment techniques in existence. Yet they are also one of the most maligned. Like every assessment technique, they have both strengths and limitations.

If you plan to use MCQs as part of maritime training and assessment, there are some basic guidelines that can be followed to help maximize their value and minimize their limitations. It is important to note that the limitations of MCQs, and other techniques, can never be fully neutralized. Every form of assessment, including MCQs, can be royally botched if the trainer does not have a basic understanding of what they would like to achieve, how the various techniques fit into the overall picture, and how to avoid common mistakes.

While the discussion in this series of articles is hardly comprehensive, it will hopefully serve to whet your appetite to find out more and to discuss the issues with your colleagues. It is an opportunity for us all to learn.

Let’s look at some guidelines to follow:


Guideline #1: Multiple Choice Questions Should Never Be the Sole Assessment Technique

This is the first guideline here because I feel it is the most important. As I mentioned in the previous article, every form of assessment has its strengths and limitations. Each technique has something to offer. As such, combining multiple techniques to form an assessment program is a very effective way of broadening assessment coverage and negating the weaknesses of individual techniques.

Consider three assessment techniques:

  1. Multiple Choice Questions (MCQs)
  2. Constructed Response Questions (CRQs) – a “written test” where trainees answer in sentence or paragraph form
  3. Demonstrations of skill (Demo)

Looking at each of these from a high level, I have chosen a few assessment goals and indicated whether that goal could be viewed as a strength (S) or a limitation (L) of the technique:

Assessment Goal                                    | MCQs | CRQs | Demo
Is highly objective                                | S    | L    | L
Ability to test basic knowledge                    | S    | S    | L
Avoids potential to guess correctly                | L    | S    | S
Ability to test higher order thinking              | L    | S    | S
Ability to test skillful performance               | L    | L    | S
Requires ability to recall (rather than recognize) | L    | S    | S

The table above is not meant to be a rigorous assessment of the strengths and limitations of these techniques. Its point is not to demonstrate that one technique is better than the others, but to demonstrate that the strengths and weaknesses of different techniques are largely non-overlapping.


A Real-World Example:

Each assessment technique “brings something to the table” and accommodates the shortcomings of the others. This fact was not lost on British Columbia Ferry Services Inc. (BC Ferries) when they designed the assessment portion of their training program.

At BC Ferries, assessment varies according to position and trainee background, but it typically consists of four activities, each meant to assess the trainee from a different perspective:

  1. A multiple choice exam.
    This tests critical basic knowledge and is 100% objective in grading.
  2. An oral exam on a number of scenarios (e.g. “You spot a fire in a trash can on the car deck. What actions do you take?”).
    While not objective in grading, this tests a trainee’s ability to assess a scenario and “think on their feet”.
  3. A demonstrative exam on a number of skills (e.g. “Show the proper technique to pass through a watertight door and explain the various hazards associated with such an act.”).
    This tests the ability to perform required skills under normal situations.
  4. A meeting with the master.
    This meeting helps assess attributes which typical assessment techniques are unable to: communication skills, attitude, professionalism, etc.

I have greatly simplified the BC Ferries assessment approach here, but this is just a real-life example of combining assessment techniques to achieve a better overall assessment of a trainee. Other combinations of techniques are also valid. Whatever you choose, consider the strengths and limitations of each technique, and choose a combination that ensures all of your assessment goals are met.


Guideline #2: Consider Cultural and Gender Differences in Using Multiple Choice Questions

I received a very thoughtful comment from Dan Connors, who has a great deal of experience in maritime training in various parts of the world. His observation was that there are significant cultural differences in people’s ability to perform well on MCQs: many who performed poorly on them could still demonstrate competence in other ways.

At the same time, I recalled attending a presentation of an outstanding paper on the use of MCQs in the maritime industry given by co-author Denis Drown Ex.C., F.N.I. at the 20th International Maritime Lecturers Association Conference (IMLA). The paper was titled “Multiple Choice Question Assessment: A Question Of Confidence”.

One of the interesting results presented was that there are gender differences in the ability to perform successfully on MCQ tests. This was a bit of an eye-opener for me. I had always been aware of cultural differences, but never considered gender differences.

Although this is an oversimplification (and possibly a misrepresentation), I consider cultural and gender differences to boil down to two basic issues: familiarity with assessment techniques, and real differences between people based on their cultural and gender context.


Familiarity with MCQs

The first issue is that some people are completely unaccustomed to the MCQ format. After all, MCQs are largely a North American phenomenon – less popular in some parts of the world and almost unheard of in others. We cannot expect people who have never seen an MCQ to perform well when confronted with one.

This problem can partially be mitigated through practice. Fortunately, practice is one of the great strengths of MCQs. MCQs are an excellent tool for formative testing – tests taken to provide feedback on student learning. With technology, MCQs can be easily delivered over and over again, with a different set of questions each time. In this way, trainees gain insight into whether they are learning the material to the depth expected, and they also become accustomed to the assessment technique itself.
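To make the idea concrete, here is a minimal sketch in Python of how a practice quiz can be drawn from a larger question bank so that each attempt presents a different selection. The question bank, field names, and function are all illustrative assumptions for this post, not the interface of any particular learning management system.

```python
import random

# Illustrative question bank -- prompts and answers here are placeholders,
# not real exam content. Each entry holds a prompt, its answer options,
# and the index of the correct option.
QUESTION_BANK = [
    {"prompt": f"Placeholder question {i}",
     "options": ["A", "B", "C", "D"],
     "answer": 0}
    for i in range(20)
]

def build_practice_quiz(bank, n_questions, seed=None):
    """Return a random subset of the bank, so each practice attempt
    presents a different selection of questions. A fixed seed makes
    the draw repeatable (useful for testing)."""
    rng = random.Random(seed)
    # Sample without replacement so no question repeats within one quiz.
    return rng.sample(bank, n_questions)

# Each call with a fresh (unseeded) RNG yields a different practice quiz.
quiz = build_practice_quiz(QUESTION_BANK, 5)
```

Real systems layer more on top of this (shuffled option order, per-topic balancing, feedback on each answer), but the core idea is the same: a large enough bank plus random selection lets trainees practice repeatedly without simply memorizing one fixed test.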

The value of practice cannot be overstated, and the same holds for any other assessment technique. Trainees must have the opportunity to practice assessments before they face their “final exam”. Otherwise, when exam time comes, although they may be very familiar with the material, they may be confounded by the assessment technique and thus perform poorly. You want to test a trainee’s familiarity with maritime concepts, not their familiarity with a style of assessment.


Yes – People are Different

The second issue with respect to cultural and gender differences is that people of different genders and cultural backgrounds are…different. As such, their performance is going to reflect not only their actual knowledge, but also differences in personality traits molded by cultural norms and gender influences. As with many of the issues raised, this is true of all types of assessment – MCQs are no exception.

In the paper presented at IMLA, Denis Drown indicates that gender differences in MCQ testing need to be considered, given increased female participation and recruitment in the maritime industry. He cites a number of studies which indicate that:

“MCQ tests promote values of objectivity, factual knowledge, and rapid performance (male socialisation), and devalue subjectivity, reflection, introspection and feelings (female socialisation)”.

And that:

“Females do less well than males on MCQ tests and better on essay tests … with lower MCQ test scores for females attributed to social and cultural differences”.

Although the above text addresses gender differences, I suspect that very similar findings are true when considering cultural differences. We have all had the experience of seeing people from different cultures perform very differently on our tests.

These are deep differences inherent in culture and gender, affecting not only the ability to perform on various assessment types, but also how people learn. I suspect that these differences will be similarly evident on all forms of assessment – not only on MCQs. This, to me, is yet another strong argument for combining multiple assessment techniques. Doing so would hopefully help to level the playing field, as people with different backgrounds will have more or less success with each assessment type. At the very least, it is important to be aware of these issues and consider them when designing your assessment program.


Conclusion of Part 2

As I concluded in the first article of this series, I believe MCQs to be of value despite their shortcomings. All assessment techniques have shortcomings. Should MCQs be used as the sole technique to assess seafarers? Absolutely not. Nor should any other technique. Likewise, in designing an assessment program, we need to be aware of the differences between people of different cultures and genders – and the fact that those differences affect their ability to perform on different types of assessments.

The next article will conclude this series with some practical guidelines on how to create good multiple choice questions. While writing them is not difficult, a little advice goes a long way in ensuring that the time you put in yields good results.

If you would like to receive notification of the final article of this series (and subsequent maritime training articles) and have not yet done so, please sign up for blog notifications here.

Until then – thanks for reading!

Murray Goldberg

Murray Goldberg is the founder and President of Marine Learning Systems. He began research in eLearning in 1995 as a faculty member of Computer Science at the University of British Columbia. He went on to create WebCT, a highly successful LMS for higher education, serving 14 million students in 80 countries. Murray has won over a dozen university, national, and international awards for his pioneering contributions to the field of educational technology. Now, with Marine Learning Systems, Murray is hoping to play a part in advancing the art and science of learning in the maritime industry.
