Trends in Adult Education – “Aha” Moment
For the final portion of this assignment we have been asked to identify the point or topic that had the most effect on us and caused us to reflect and consider how we can use these new trends in our courses.
My biggest “Aha” moment actually came from a different article, one that follows the same trend as Making Multiple-Choice Exams Better. This other article is titled Optimizing Multiple-Choice Tests as Tools for Learning; I’ve placed a link to a PDF version at the end of this paragraph. The article explains how a well-written, challenging exam question can serve as a lesson on its own. As students work through an exam with quality distractor answers, they’re actually learning from those questions, which gives them additional knowledge to help with questions later in that test and even on tests scheduled for the coming weeks. My “Aha” moment happened while I was auditing another instructor’s test review class. This instructor had a very well-written exam that challenged the students to think critically on nearly every question. As he went through each question in the review, he didn’t just state the answer and move on. He walked through every available answer and explained why it could not be the correct one. He even posed additional questions throughout the review to help students clarify why they got some questions wrong. The students weren’t happy with the test at first; they had become so used to memory-recall questions that they were caught off guard. By the end of the course, though, they had all learned far more than they would have with basic questions. Seeing these students in the shop environment later confirmed my thoughts that the tests helped solidify the theory for them. They were able to diagnose the vehicle systems much more accurately than previous classes that couldn’t think at the “analysis” level. Optimizing Multiple-Choice Tests as Tools for Learning.
My “Aha” moment for Adaptive Learning came when the article mentioned pretests. I had always run group discussions at the start of each subject as a way to pre-assess the students, but I found that generally only the most vocal students, who obviously knew some of the subject matter, would contribute. This left the other students to sit back and wait for me to deliver the curriculum. I feel that if I make an open-book, take-home pretest with questions based on information straight out of the students’ textbooks, they will read the chapters before I cover them in the next few classes. This will hopefully give them a base of knowledge and give me more time to expand on the subject than I would have otherwise. After reading this article I talked with another instructor, and he mentioned that he already uses this style of pretest. For the subjects he has written pretests for, he claims the students’ final marks jumped about 10% over previous years. While this still isn’t Adaptive Learning, I feel that adopting even a portion of the idea behind it will benefit my students.
Trends in Adult Education – Implications
Part two of this assignment involves discussing the implications of these trends on our own instructional styles and techniques.
When it comes to the article Making Multiple-Choice Exams Better, I feel that I have already started making strides in this direction. Throughout the last course I taught, I consistently went back to the tests and added questions that I felt better challenged the students’ knowledge. I then uploaded all the tests into Brightspace and had the students write them using a LockDown browser. I now have data and statistics on every test I administered, covering both the older questions and the new ones I have written. From these statistics I can see which questions are too easy (95% of the class answered correctly), which ones are potentially worded poorly (the majority of the class answered incorrectly), and which answer options are implausible or make weak distractors. As I change the questions over to a higher level of thinking, I will start to see a more accurate picture of how students are actually doing. Right now my only chance to see that is when my students come down into the shop. Some may have done well on the older recall-based exams, but once in a shop environment they cannot diagnose the vehicle systems we just discussed. It becomes apparent that their understanding has stopped at the “Knowledge” level of Bloom’s Taxonomy.
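As a rough illustration of the kind of check those statistics allow, here is a minimal Python sketch of the same item analysis. The CSV layout, file name, and column names are assumptions made for the example, not Brightspace’s actual export format, so they would need to be adapted to the real data.

```python
# Minimal item-analysis sketch. Assumes a hypothetical CSV export with one
# row per student response: question_id, chosen_option, correct_option.
import csv
from collections import defaultdict

TOO_EASY = 0.95          # flag questions nearly everyone answers correctly
WEAK_DISTRACTOR = 0.05   # flag options fewer than 5% of students chose

# Group responses by question: question_id -> [(chosen, correct), ...]
responses = defaultdict(list)
with open("exam_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        responses[row["question_id"]].append(
            (row["chosen_option"], row["correct_option"])
        )

for qid, answers in responses.items():
    n = len(answers)
    right = sum(1 for chosen, correct in answers if chosen == correct)
    difficulty = right / n  # proportion of the class answering correctly

    if difficulty >= TOO_EASY:
        print(f"{qid}: too easy ({difficulty:.0%} correct)")
    elif difficulty < 0.5:
        print(f"{qid}: possibly worded poorly ({difficulty:.0%} correct)")

    # Distractor analysis: how often was each wrong option actually chosen?
    picks = defaultdict(int)
    for chosen, correct in answers:
        if chosen != correct:
            picks[chosen] += 1
    for option, count in picks.items():
        if count / n < WEAK_DISTRACTOR:
            print(f"{qid}: option {option} is a weak distractor "
                  f"({count / n:.0%} chose it)")
```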
The trend of Adaptive Learning may be a bit harder for me to implement in the classroom at this time, at least to the full extent that the article Adaptive Learning: An Innovative Method for Online Teaching and Learning describes. The article explains that the software gives students a pretest and, depending on their answers, tailors a learning path based on their existing knowledge. This would be a fairly expensive method to get up and running, but once it is, I think it’s a great idea. One thing I am going to try based on this trend is the idea of a pretest at the beginning of every new section. While not Adaptive Learning, I think it will give me a better idea of where my students stand with their knowledge. Ideally it will also encourage them to pre-read the modules before class to gain some base knowledge of the material. That way they’ll be able to ask more direct questions when we have face-to-face time, and I can either expand on the subject matter or help reinforce the foundations, depending on the student.
Trends in Adult Education – Trends
For this assignment we were asked to find an article related to a trend within adult education. Once we’d found an article that interested us, we were paired up with a learning partner to share our articles and discuss our insights into these trends. My learning partner is Dazy, an Instructor and Faller Blaster from beautiful Vancouver Island. I’ve placed a link to Dazy’s blog in the Links section. Both of the articles we’ve chosen can be found in the Resources section or just by scrolling further down to some older posts.
My article was titled Making Multiple-Choice Exams Better. It covers some pros and cons of multiple-choice tests, from students saying they prefer them to the issue that the majority of exam questions sit fairly low on Bloom’s Taxonomy. The article explains how rewriting questions to target critical thinking can show students’ progress more accurately. Most multiple-choice questions, especially ones from exam banks, rely too heavily on memory recall, which is why students seem to prefer them: they’ve become accustomed to study strategies like flashcards and memorizing definitions without truly understanding them. If the exam questions were more analytical or diagnosis-based, we would get a much better understanding of whether or not the student grasps the material. The author also discusses getting rid of answers that are obviously wrong: track the data from tests over a period of time and throw out any answer that fewer than 5% of students chose. This strengthens the exam. If, say, two out of four answers are obviously wrong, the question becomes a 50/50 guess for the student, which makes it difficult to tell whether the student has a good handle on the subject matter or is just getting by because of easy exams.
Dazy’s article is titled Adaptive Learning: An Innovative Method for Online Teaching and Learning. It discusses a method of instruction called Adaptive Learning, which doesn’t follow a traditional, linear path through the course. Using an algorithm, the software determines individual learners’ needs based on their past knowledge and experience. This style of teaching can be an excellent way to personalize lessons, avoid “teaching to the middle,” and identify which students need more support. Students seem to enjoy the course structure too, since they learn only new, exciting material instead of going over facts they’ve already committed to long-term memory.
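To make the idea concrete, here is a minimal sketch of how pretest results could drive a personalized path. This is only an illustration of the concept; the mastery threshold, topic names, and ordering logic are assumptions for the example, not how any particular adaptive-learning product actually works.

```python
# Sketch of an adaptive path: skip topics the learner has already mastered
# on the pretest, and start with their weakest remaining topics.
MASTERY = 0.8  # assumed threshold for "already knows this"

def build_path(pretest_scores):
    """pretest_scores maps topic -> fraction of pretest questions correct."""
    todo = {topic: score for topic, score in pretest_scores.items()
            if score < MASTERY}
    return sorted(todo, key=todo.get)  # weakest topics first

# A learner strong on brakes skips that module entirely.
print(build_path({"electrical": 0.3, "fuel systems": 0.6, "brakes": 0.9}))
# -> ['electrical', 'fuel systems']
```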
Trends in Adult Education – My Learning Partner’s Article Choice
Adaptive Learning: An Innovative Method for Online Teaching and Learning
Here’s a link to my learning partner Dazy’s article of choice for “Trends in Adult Education.” Go to the “Links” section to find his own blog as we both go through the PIDP.
Source: Adaptive Learning: An Innovative Method for Online Teaching and Learning
Trends in Adult Education – My Article Choice
Making Multiple-Choice Exams Better

The relatively new Scholarship of Teaching and Learning in Psychology journal has a great feature called a “Teacher-Ready Research Review.” The examples I’ve read so far are well organized, clearly written, full of practical implications, and well referenced. This one on multiple-choice tests (mostly the questions on those tests) is no exception. Given our strong reliance on this test type, a regular review of common practices in light of research is warranted.
This 12-page review covers every aspect of multiple-choice exams, at least every aspect I could think of. What follows here are bits and pieces culled from the review. For teachers serious about ensuring that their multiple-choice exams (both those administered in class and online) assess student knowledge in the best possible ways, this article should be kept on file or in the cloud.
Perhaps the most important ongoing concern about multiple-choice tests is their propensity to measure surface knowledge, those facts and details that can be memorized without much (or any) understanding of their meaning or significance. This article documents studies showing that students’ preference for multiple-choice exams derives from their perception that these exams are easier. Moreover, that perception results in students using study strategies associated with superficial learning: flashcards with a term on one side and the definition on the back, reviewing notes by recopying them, and so on. Students also prefer multiple-choice questions because they allow guessing. If there are four answer options and two of them can be ruled out, there’s a 50 percent chance the student will get the answer right. So students get credit for answers they didn’t know, leaving the teacher to wonder how many right answers reflect knowledge and understanding the student doesn’t actually have.
In one of the article’s best sections, the authors share a number of strategies teachers can use to make multiple-choice questions more about thinking and less about memorizing. They start with the simplest strategy. If the directions spell out that students should select “the best answer,” “the main reason,” or the “most likely” solution, then some of the answer options can be correct but not as correct as the right answer, and those questions therefore require more and deeper thinking.
Another strategy that promotes more thinking has students indicate a confidence level along with each answer. The greater their certainty that the answer is correct, the higher the confidence level. When scoring, a right answer with a high confidence level is worth more than a right answer with a low confidence level.
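The review doesn’t spell out an exact scoring formula, so as a rough sketch, the point values below are illustrative assumptions, including the penalty for a confidently wrong answer:

```python
# Assumed confidence-weighted scoring: a sure right answer earns the most,
# and (an assumption, not stated in the article) a sure wrong answer costs.
POINTS = {
    (True, "high"): 1.0,    # right and certain of it
    (True, "low"): 0.5,     # right but essentially guessing
    (False, "low"): 0.0,    # wrong, but knew they were unsure
    (False, "high"): -0.5,  # confidently wrong
}

def score(answers):
    """answers: list of (is_correct, confidence) pairs, one per question."""
    return sum(POINTS[a] for a in answers)

# Two right answers score differently depending on the student's certainty.
print(score([(True, "high")]))  # 1.0
print(score([(True, "low")]))   # 0.5
```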
Perhaps the most interesting way to make students think more deeply about the question doesn’t use questions per se but contains several sentences presented as a short essay. Some of the information in the essay is correct, and some of it is incorrect. The students must identify mistakes in the essay. The multiple-choice options list different numbers of errors so that students need to select the correct number.
There’s also some pretty damning evidence cited in the article. A lot of us don’t write very good multiple-choice questions, especially those who write questions for test banks. The most common problem in an analysis of almost 1,200 multiple-choice questions from 16 undergraduate courses was “nonoptimal incorrect answers,” so designated if they were selected by fewer than 5 percent of the test takers. In other words, they’re obvious wrong answers because almost nobody is picking them, and that means more correct answers are selected by guessing.
Students want test questions to be fair. If they believe the tests are fair, research indicates students are more likely to study and do, in fact, learn more as a result of their studying. What they mean by “fairness” is pretty straightforward. The questions need to be clear: students should be able to figure out what the question is asking even if they can’t answer it. There should be a broad range of questions. If the test covers three chapters, there should be content from all three chapters. The authors recommend giving students sample test questions so they know what to expect and can’t persuade themselves that memorizing a few definitions will be all it takes to conquer the test.
The article recommends several guides for writing good multiple-choice questions. It’s good to remember: the easiest questions to write are those that focus on specific bits of information and have one right answer. Other details also merit consideration. For example, how many answer options should a question have? Four? More? A fairly recent study looked at three, four, or five options in terms of test reliability, response behaviors, item difficulty, and item fit statistics and found no evidence of significant differences between the numbers of answer options. A meta-analysis of 27 studies (across different content areas and age levels) recommends three options. There’s an interesting reason to favor three answer options: on average, students can answer three-option questions five seconds faster than those with four or five options. That means more questions can be included on the exam. Imagine how popular that will be with students!
“Consistent meaningful feedback (e.g., detailed explanation of why certain answers were correct or incorrect) is an important component of student learning outcomes, enjoyment, engagement in the course and rating of teaching quality” (p. 151), are the findings of another study referenced in the review. This argues for more than posting the right answers on the professor’s office door or on the course website. The authors recommend an interesting way of providing this high-quality feedback: giving students opportunities to self-correct. Research shows that students who were allowed to turn in a self-corrected midterm performed better on the final than students who weren’t given this option. Both the in-class (or online) exam and the self-corrected exam are scored. Students earn full credit if the answer on both exams is correct, partial credit if the question is missed on the in-class test but corrected on the take-home version, and no credit if it’s wrong on both tests.
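That credit rule is concrete enough to sketch in a few lines. The partial-credit value of half marks is an assumption; the article names full, partial, and no credit but doesn’t give exact weights, and it doesn’t address the unlikely case of an answer that is right in class but changed to a wrong one at home.

```python
# Credit rule for the self-corrected exam described above.
def self_correct_credit(in_class_correct, take_home_correct):
    if in_class_correct and take_home_correct:
        return 1.0   # full credit: right on both versions
    if take_home_correct:
        return 0.5   # partial credit (assumed value): fixed on the take-home
    return 0.0       # no credit: wrong on both versions

print(self_correct_credit(False, True))  # missed in class, corrected: 0.5
```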
As these highlights illustrate, this is an article packed full of good information. Students take tests seriously. We need to do our best to make those exams fair and accurate measures of learning.
Reference: Xu, X., Kauer, S., & Tupy, S. (2016). Multiple-choice questions: Tips for optimizing assessment in-seat and online. Scholarship of Teaching and Learning in Psychology, 2(2), 147–158. Reprinted from Teaching Professor, 30.9 (2016): 3, 7. © Magna Publications. All rights reserved.
Weimer, M. (2018, March 2). Making Multiple-Choice Exams Better [Blog post]. Retrieved from https://www.facultyfocus.com/articles/educational-assessment/making-multiple-choice-exams-better/
The Journey Begins
Thanks for joining me!
Good company in a journey makes the way seem shorter. — Izaak Walton
