Improving Multiple Choice Test Assessment Items Using The "Item Writing Flaws Evaluation Instrument" (IWFEI)

This workshop presents item-writing guidelines that can support the development of high-quality multiple-choice test items. The session will adopt a recently published guideline called the Item Writing Flaws Evaluation Instrument (IWFEI). These guidelines, whilst not exhaustive, draw on common guidelines represented in the literature. The session will begin with a short presentation of the evaluation instrument. The presentation is designed not only to introduce each criterion but also to explain why each guideline exists. Once we have presented and explained the instrument, we will run two different activities:
Using the IWFEI to evaluate an existing multiple-choice exam. For this activity we will ask you to evaluate a multiple-choice exam that you have administered in the past year. The exam may have been given in person or online, and it may be one you created yourself, one set by an educational organization, or one assembled from a test bank. Please bring all the questions and, if available, the exam statistics, such as item difficulty and item discrimination. After this activity you will better understand the quality of the exam you have set.
Using the IWFEI to design your own multiple-choice questions. In this activity we will give you a selection of topics to choose from and ask you to design a multiple-choice question at the appropriate level for your students, using the IWFEI to support the design process. After this activity you will have at least one question, designed yourself, that you could use on a multiple-choice exam.
This workshop is suitable for all attendees who are involved with setting, designing, and/or administering multiple-choice assessments in their classrooms. The assessments can be formative or summative, high- or low-stakes.

High School
Undergraduate Education
Graduate Education
Assessments and Research Methods
Cross-cutting Thread(s):

Christopher Randles

Erin Saitta

Julie Donnelly

Jared Breakall