How the Marines do it
The Marine Corps’ known-distance rifle qualification course has worked well for the better part of a century. How does it produce such consistently good results? It works because Marines get timely, relevant, necessary & accurate feedback on their shots. They use that feedback to improve their performance as they practice and during their final day of shooting. The process is very effective.
Hold that thought…
Are we even hitting the target?
For years I didn’t think enough about asking the people we trained whether what we were doing was helpful to them. I wrote training plans, developed curriculum, integrated activities and exercises into mundane training, re-imagined some critical training events, etc. I think we were making things better, but I was so wrapped up in the work that it only occasionally crossed my mind to ask the question: Are we really making things better?
Asking that, and a few other key questions, would have provided valuable feedback about how the audience felt & thought about what we were doing. In turn we could have used that feedback to re-engage the design & development process and provide a better product that hit the mark a lot more consistently…
But to get useful feedback, you must take the time to ask the right questions…
“Ah-ha!” versus “Rah-rah!”
A woodsman* was once asked, “What would you do if you had just five minutes to chop down a tree?” He answered, “I would spend the first two and a half minutes sharpening my axe.”
Put some thought and planning into developing evaluation questions that deliver useful information. If the learners’ responses to your questions don’t occasionally give you an ‘ah-ha!’ moment, you should consider revisiting how and why your student evaluations are constructed.
Asking ‘rah-rah!’ questions like, “Did everybody learn something today?” is not helpful. You may get 50 people shouting ‘yes’, but invariably there will be four students thinking, ‘the instruction didn’t meet my needs or my expectations.’ You need to hear from those four people.
No one likes busy work. Course critiques should not be busy work or space fillers. Evaluation done right is a critical and dynamic part of any instructional design process and should always lead to reviewing and improving instructional material. Don’t waste your time or learners’ time on ‘opinionnaires’ or answering generic questions (“did you like the course?”) that don’t help you improve the product.
When it comes to getting relevant student feedback, make sure you sharpen your axe.
“Anything worth doing is worth doing well.”
My Aunt Kay said that to encourage me to always strive to do better. If your students’ course evaluations are meant to help you meet your instructional goals, then apply that maxim to designing critique questions. Review every question you are asking learners and then ask yourself:
- Is it necessary?
- Does it have a purpose?
- Does it provide relevant information?
- Are we using the information to provide a better product?
- Does it tell us if customers are getting value for their effort?
- Does it contribute to meeting the goals of the instruction?
If you don’t answer ‘yes’ a lot, then maybe some questions have outlived their usefulness…
If you are dedicated to the instructional design process then do not use course critiques or student feedback sheets simply because they have been used in the past. View every form, process and question with a critical eye. Regularly cull the herd and eliminate questions that provide no relevant or actionable information. Remember, it is worth doing well.
All roads from evaluation lead to redesign
Whether you subscribe to Gerlach and Ely, ADDIE, Dick and Carey, or you are shooting from the hip (my preferred method for years), the evaluation phase leads back into the process. Evaluation from learners has a purpose. That purpose is to drive you to where you need to re-engage in the design process in order to improve the curriculum.
- Read learners’ responses
- Look for patterns/themes
- Listen to verbal feedback
- Analyze testing results (don’t just count scores)
When you find points that need improvement, turn them into action items and re-engage in the instructional design process.
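The “don’t just count scores” step can be made concrete. As a minimal sketch (the data, question IDs, and the 50% threshold here are all hypothetical, not from the original), the idea is to look at per-question miss rates rather than overall averages, so that a weak spot in the instruction surfaces as an action item:

```python
from collections import Counter

# Hypothetical test results: one dict per learner, mapping
# question IDs to True (answered correctly) / False (missed).
results = [
    {"q1": True,  "q2": False, "q3": True},
    {"q1": True,  "q2": False, "q3": False},
    {"q1": True,  "q2": True,  "q3": False},
    {"q1": False, "q2": False, "q3": True},
]

def miss_rates(results):
    """Fraction of learners who missed each question."""
    misses = Counter()
    for learner in results:
        for q, correct in learner.items():
            if not correct:
                misses[q] += 1
    n = len(results)
    return {q: misses[q] / n for q in results[0]}

def action_items(results, threshold=0.5):
    """Questions missed by at least `threshold` of learners --
    candidates for re-engaging the design process."""
    return sorted(q for q, rate in miss_rates(results).items()
                  if rate >= threshold)

print(action_items(results))  # -> ['q2', 'q3']
```

A class average of 58% hides the pattern above; the per-question view shows that q2 and q3 are where the material (or the question itself) needs rework.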
Now, back to the Marine Corps
The Marine Corps’ known-distance rifle qualification course works extremely well. Marines are able to gain or improve marksmanship skills over a relatively short period of time because of their process.
- They have a clear goal (high marksmanship scores)
- They only ask for information that they know they can use to help them reach that goal
- They analyze the answers to the questions
- They apply everything they learn to the next shot
- They repeat this process until they are hitting consistent bullseyes
Think about your process for evaluating your product and integrating that information into redesign. You don’t have to be a Marine to be deliberate about making this an effective part of your work. Be able to clearly state the instructional goal. Only ask evaluation questions that will yield useful information. Digest and analyze the answers to your questions. Apply what you learn back into improving the instructional material.
You will hit more bullseyes. Those bullseyes will help you teach with a purpose so your audience wins.
*my wife said I should use ‘lumberjack’ since woodsman evokes images of fairy tales. I think she’s right, but the original quote (often incorrectly attributed to Lincoln) from a 1956 agricultural education paper used ‘woodsman’ and I didn’t want to alter the original thought. So, woodsman it is.