After you ENVISION, RESOURCE, PLAN, and DEVELOP to bring your Online Learning Strategy into focus, you probably feel ready to launch. But I cannot stress enough the importance of TESTING, and of expanding the audience that evaluates your creation.
I’ve been surprised how often this essential part of any design process is overlooked or undervalued in course design. For example, I was the only driving force behind a pilot of the last online program I developed for a university. I had to ask for it repeatedly, and when an attempt to recruit pilot students through an email to alums yielded no results, my team members, a mix of faculty, administrators, and leaders, shrugged and said, “No problem, we can log in and act like students.”
NO, I said to them. You can’t. You can’t unknow what you know about the online program and course we are testing. You cannot see it with the fresh eyes of a novice who needs to use the system, or of an expert who has internalized, well-informed criteria for what the system or course should be able to do. The newbie point of view comes through piloting. The expert point of view comes through external evaluation.
It’s essential that the testing and evaluation process be rigorous, not just a matter of gathering opinions or “smile sheets” asking, “How did you like it?” Use rubrics that spell out what should or shouldn’t be in the course, so the review reads like an assignment grade from an expert instructional designer or experienced online faculty member. Then cross-reference those reviews with usage data, such as the learning analytics that can be collected inside learning management systems: what do student pathways and behaviors in the class actually look like?
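The pathway check described above can be sketched in a few lines of code. This is an illustrative example, not a real LMS export: the event records, the item names, and the 50% “reach” threshold are all assumptions for the sake of the sketch. Real systems (Canvas, Moodle, and others) export similar clickstream logs, though field names and formats vary.

```python
# Sketch: which course items did pilot students actually reach?
# The events below are made-up stand-ins for an LMS clickstream export.

# Hypothetical clickstream: (student, course item visited)
events = [
    ("s1", "syllabus"), ("s1", "module_1"), ("s1", "quiz_1"),
    ("s2", "syllabus"), ("s2", "module_1"),
    ("s3", "syllabus"), ("s3", "module_1"), ("s3", "quiz_1"),
    ("s3", "module_2"),
]

def item_reach(events, total_students):
    """Fraction of pilot students who ever opened each course item."""
    visitors = {}
    for student, item in events:
        visitors.setdefault(item, set()).add(student)
    return {item: len(s) / total_students for item, s in visitors.items()}

reach = item_reach(events, total_students=3)

# Items that fewer than half the pilot students ever reach are
# candidates for redesign (the 0.5 cutoff is an arbitrary assumption).
low_reach = [item for item, frac in reach.items() if frac < 0.5]
print(reach)
print(low_reach)
```

Even a summary this simple can surface problems that smile sheets miss, such as a module that most pilot students never opened.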
And finally, just completing these testing steps is not enough. Design teams, faculty, and leaders have to hold themselves accountable for making changes based on the feedback the pilot produces.
Here’s your TO DO List:
- Require a pilot of program courses with faculty and students
- Request reporting on data from the pilot through surveys, interviews, and analytics
- Include a faculty and staff feedback group in the evaluation process
- Arrange for external evaluation of courses
- Arrange for technology-focused testing of platforms and tools
- Use pilot results to inform design tweaks and implementation