Jul 20 2016
This is the second (slow!) of four pieces reflecting on the experiences of Emilio, a subject matter expert who was tasked with converting his successful F2F training into an elearning offering. Emilio has let me interview him during the process.
This piece focuses on the thorny issue of learning objectives at the front end of an elearning project and assessment at the other end. You can find the context in part 1 here. (Disclaimer: I was an adviser to the project and my condition of participation was the ability to do this series of blog posts, because there is really useful knowledge to share, both within the colleague’s organization and more widely. So I said I’d add the blog reflections – without pay – if I could share them.)
Nancy: Looking back, let’s talk about learning objectives. You started with all of your F2F material, then had to hone it down for online. You received feedback from the implementation team along the way. What lessons came out of that process? How do we get content more precise when you have fewer options to assess learner needs and interests “in the moment” as you do face to face, and with the limited attention span for online?
Emilio: I realize now that I had not thought about staying disciplined with learning objectives. I had created them with care when I first developed my F2F offering. But once I had taught the course several times, I recognized that I had forgotten my own initial learning objectives, because in a F2F setting I adapted to students' interests and knowledge gaps on the spot, and I was also able to clarify any doubts about the content. Over time, those learning objectives became malleable depending on the group of students, and thus lost presence in my mind.
This became apparent as I was writing the quizzes for the online work and got comments back from Cheryl (the lead consultant). She noted which of my quiz questions were and were NOT grounded in the learning objectives and content. I realized I was asking a bunch of questions that were not crucial to the learning objectives.
With that feedback, I narrowed down to the questions that actually achieved and measured the learning objectives. It was an aha moment. This is something that is not necessarily obvious or easy. You have to put your mind to it, especially when you are developing an elearning course. It applies to the F2F context as well, but in an elearning setup you are forced to be more careful because you cannot clarify things on the spot. There is less opportunity for that online. That was very critical. (Note: most of the course was asynchronous. Clarifications happened in weekly "office hours," and the learners who participated in those office hours also had higher completion rates.)
It was clear I had to simplify the content for the elearning setup – and that was super useful. While my F2F materials were expansive so that I could adapt to local context, online that same breadth became overload.
Nancy: What was your impression of the learners’ experiences?
Emilio: It was hard to really tell, because online we were dealing with a whole different context. Your indicators change drastically. When I'm F2F I can probe and sense whether the learners are understanding the material. It is harder online to get that interim feedback and know how people are doing. For the final assessment, we relied on a final exam with an essay question. The exam was very helpful in assessing the learners' experience, but since it is taken at the end of the course, there are no corrective measures one can take.
Nancy: Yes, I remember talking about that as we reviewed pageviews and the unit quizzes during the course. The data gives you some insight, but it isn’t always clear how to interpret it. I was glad you were able to get some feedback from the learners during your open “office hours.”
We used the learning objectives as the basis for some learner assessment (non-graded quizzes for each unit and a graded final exam which drew from the quizzes). How did the results compare with your expectations of the learners' acquisition of knowledge and insights? How well did we hit the objectives?
Emilio: We had 17 registered learners and 7 completed. That may sound disappointing. Before we started, I asked you about participation rates and you warned me that they might be low and that is why I am not crying. The 7 that completed scored really well in the final exam and you could see their engagement. They went through material, did quizzes and participated in the Office Hours. One guy got 100% in all of the quizzes, and then 97% in the exam.
We had 8 people take the final exam. One learner failed to reach the required 70% benchmark, but digging deeper into it, Terri (one of our consultants) discovered that the way Moodle was scoring the multiple-choice answers was not programmed precisely: it was awarding full credit for partially correct answers. We need to fix that. Still, only one learner fell short of the 70% benchmark, even with that error in their favor.
The essay we included in the exam drew really good responses. It achieved my objective of getting an in-depth look at the contexts the learners were coming from. Most of them described an institutional context. Then they noted what they thought was most promising from all the modules, and what was most applicable or relevant to their work. The answers were very diverse, but I saw some trends that were useful. However, it would have been useful to know more of this before and during the course.
Nancy: How difficult was it to grade the essays? This is something people often wonder about online…
Emilio: I did not find it complicated, although there is always some degree of subjectivity. The basic criteria I used were the learners' focus on the question asked, and their application of all the principles taught during the course that related to the context described in the question.
Nancy: One of the tricky things online is meaningful learner participation. How did the assessment reflect participation in the course?
Emilio: We decided not to give credit for participation in activities because, in this first beta test, we were not fully confident that we had designed those activities appropriately for an elearning environment. I think this decision was the right one.
First, I feel that I did not do a good job at creating an atmosphere, this sense of community, that would encourage participation. Even though I responded to every single comment that got posted, I don’t really feel that people responded that much in some of the exercises. So I would have penalized students for something that is not their fault.
Second, we had one learner who did every exercise but did not comment on any of the posts. He is a very good student, and I would have penalized him if completion had relied on participation. Conversely, another learner did participate, went to the office hours, and still did not pass the final exam.
We failed miserably with the group exercise for the second module. I now realize the group exercise requires a lot of work to build the community beforehand. I sense this is an art. You told me that it is completely doable in the elearning atmosphere, but after going through the experience I really feel challenged to make it work. Not only with respect to time, but how do you create that sense of community? I feel I don’t have a guaranteed method for it to work. It is an art to charm people in. I may or may not have it!
Nancy: The challenges of being very clear, what content you want to share with learners, how you share it, and how you assess it should not be underestimated. So often people think it is easy: here is the content! Learning design in general is far more than content and learning design online can be trickier because of your distance from your learners – and not just geographic distance, but the social distance where there is less time and space for the very important relational aspects of learning.
Up Next: Facilitating Online