Learning While Building eLearning: Part 4 – Lessons from the Pilot

This is the last of four pieces reflecting on the experiences of Emilio, a subject matter expert who was tasked with converting his successful F2F training into an elearning offering. This one focuses on the lessons learned from the pilot, and we are pulling in Cheryl Frankiewicz, the project manager. You can find the context in part 1, part 2 and part 3. (Disclaimer: I was an adviser to the project and my condition of participation was the ability to do this series of blog posts, because there is really useful knowledge to share, both within the colleague’s organization and more widely. So I said I’d add the blog reflections – without pay – if I could share them.)

Nancy: Emilio and Cheryl, what is your advice for someone else embarking on this process?

Emilio: Solve the prep work for the launch. Pay a lot of attention to the thing that matters in every single project, no matter what it is: the process of getting the people there. The enrollment, selecting a good partner, and staying on top of your partner so that nothing goes wrong in this introduction process. The key thing is to get the people there at the start of your course. That has to go flawlessly. If it starts flawlessly, it is almost a piece of cake to run a good learning course. Then everything flows easily.

Cheryl: I would encourage others who embark on this process to start by revisiting their objectives and making sure that they measure the most important learning outcomes. Once the objectives are clear, focused and measurable, it’s much easier to make wise choices about which content and activities to include in the course design. Interaction is just as important in elearning as in F2F learning, but that doesn’t mean that all the interaction that takes place in the face-to-face environment should be transferred to the online environment. Attention spans are more limited and the demands on learners’ time are greater in an online environment, so you have to be careful not to include so much interaction that it becomes overwhelming to learners.

If you haven’t facilitated online before, take an elearning facilitation course before you deliver for the first time. I took one before I delivered my first online training and it was worth every penny I paid. Not only did I get useful tips on how to manage participation in a virtual environment, but I also had the opportunity to practice them before going “live”.  The big surprise for me was how much I depended on participants’ body language for feedback in a F2F environment, and how lost I felt online without it. The course helped me identify other strategies for gathering and giving feedback online. Emilio wanted to take one of these courses but his travel schedule didn’t allow it.

One other recommendation I’d make is to plan for regular communication with learners. In a F2F setting, facilitators don’t have to think about how this will happen because they are in constant contact with learners, but in an elearning environment, extra effort has to be made to design and time communication in a way that helps keep participants on track and motivated to participate. Regular bulletins from the facilitator that remind participants what is happening in a given week or unit are a valuable tool for accomplishing this. These bulletins can also highlight key lessons learned or insightful contributions from participants during the previous week. The review can help re-engage those that have fallen behind, and the recognition can help motivate quality participation in the future.

Nancy: Emilio, I have done quite a bit of work with your organization around learning, facilitating and elearning. As you think about your own experiences and what you’ve learned from other colleagues doing elearning in FAO, what capacity is needed to do this sort of work in an organization like yours?

Emilio: We have our own elearning team at FAO doing their own projects for specific groups. Their services are relatively expensive. If I had done with them the same thing I did with MEDA, I would likely have paid more. And they have a limited number of people; they don’t have enough capacity to be service providers to the rest of the organization. We have so many different units. Our organization is structured so that we have to provide services to each other and we have to pay for them.

Nancy: I know there is a lot of talent spread through the organization, but it is not clear that they are aware of each other, talk to each other, learn and support each other.

Emilio: You are right. I have a colleague doing a training. She decided to work with Unitar. She is thrilled with the experience. Then she started talking about her very different needs and experiences. From what she tells me, I would not be inclined to use that model; I would need something different. It is hard, at the end of the day, to come up with a corporate, well-coordinated approach to this elearning, to cultivate that knowledge among all of FAO’s staff, or at least expand it as much as possible.

But you are right, the result is that we don’t leverage or learn from each other. A colleague is having a very valuable experience, and I still have to go through the painful process of learning on my own.

Nancy: Cheryl, how about you? What is your advice?

Cheryl: Don’t aim for the moon in your beta test. Aim to learn. As Emilio mentioned, only 41% of those who registered for the course actually completed it. But 100% of those who completed  it said they would recommend it to their colleagues. Learning happened, and more learning will happen the next time around because Emilio and his team are observant, open to learning, and patient with themselves and the process.

Make sure you bring together a good team of people who can cover all the bases that need to be covered when converting a F2F training into an elearning offering. Don’t expect that any one person is going to be your subject matter expert, instructional designer, programmer, learning strategist, platform troubleshooter and project manager all in one. Ultimately, a team of six people contributed to this conversion, none of us working on it full time, but all of us contributing expertise in a particular area. Make sure that someone on the team takes responsibility for organizing the work and keeping your timeline on track. And avoid the temptation to outsource everything because you’ll miss the opportunity to learn how to do it yourself. Emilio’s probably not ready to develop his next course entirely in-house, but he and Milica have built the capacity to maintain and adapt the courses that now exist.

Speaking of adaptation, one last piece of advice is to take advantage of the opportunities that elearning provides to monitor how participants are learning as they are learning and make adjustments to the course design as you go along. Emilio mentioned earlier that the feedback he received in the office hours helped him adjust the course materials, but our analysis of the quiz, final exam and evaluation results also helped us identify which concepts could be better explained, and which objectives could be better supported. We monitored how, when and where learners engaged (and did not engage) and this is helping Emilio to improve his next offering of the course. For example, we learned that participants who did not complete the course tended to follow one of two patterns: approximately one-third logged in only once or twice and did not finish even the first module; the other two-thirds participated fairly regularly and completed module 2, but then dropped out. With this information, and with feedback from participants who completed the course, Emilio is revising the design of the Module 2 group work, and he and Milica are planning to follow up more quickly with inactive participants during the first module of the course to identify if there are any barriers to participation that they might help learners address.

Here’s mine (Nancy)…

I’m really glad the decision was made to have a beta test, which helped us sharpen the content, process, assessment and technology. The example of figuring out how the exam was graded shows that there are always technical things to learn, and the careful attention to assessment as it relates to learning objectives helped us learn a lot.

We learned some things about the process of having a marketing partner, the importance of lead time and a very real need to  do some pre-course orientation for the learners about the technology and course expectations. We have talked about developing some short videos and having a short “week 0” prior to the actual start of the course to ensure the tech is working for learners before we dive so quickly into content and community building.

We need to get the participation rate higher because I’m convinced that is key to successful completion – look at the people who participated in the office hours — they stayed engaged and completed! I think this starts with a clearer ramp up and explicit expectations (including pre-course communications), regular emails during the course and refinement of our pre-course learner survey that would help the facilitator understand the learners a bit before the course.

That said, there were SO many things to pay attention to that it was easy to spend less time on the social aspects of learning: initial engagement with the learners, building a learning community (which is difficult in three weeks with a limited expectation of learner hours), and helping learners apply the content to their own contexts. I had warned Emilio beforehand that facilitating online learning is a bit different than teaching face to face. The learning management system delivers a lot of the content. The real role is connecting learners to the content and to each other.

Thanks to Emilio, Cheryl, FAO and MEDA for supporting these four blog reflections!

Learning While Building eLearning: Part 3 – Facilitating Online

This is the third of four pieces reflecting on the experiences of Emilio, a subject matter expert who was tasked with converting his successful F2F training into an elearning offering. This one focuses on the facilitation aspects of the course! You can find the context in part 1 and part 2. (Disclaimer: I was an adviser to the project and my condition of participation was the ability to do this series of blog posts, because there is really useful knowledge to share, both within the colleague’s organization and more widely. So I said I’d add the blog reflections – without pay – if I could share them.)

I want to kick this off with a quote from the amazing Beck Tench talking about facilitating online learning:

Learning and change are super complex. Consider we may never know the effects of our work. Every snapshot lacks context in some way. Proceed with listening, kindness, observation, and experimentation. Accept that there will be uncertainty, as in all things, and move forward anyway.

I love this quote because it reminds us that facilitating online learning is about the teacher’s expertise. And about engagement. And about our stance as an online facilitator – something I think is often invisible or ignored.  Emilio stepped into that stance with a lot of grace, tolerance for the unknown and comfort with trying, learning, and even with a little failure. In my experience this is not that common!

Nancy:  Let’s talk a bit about stepping into reality, the launch of the course. This was your first time facilitating an online learning course. What happened?

Emilio: The beginning was very stressful. There was a moment where I had to reset the vision I had created at the beginning of this project. We thought we had everything planned by the Thursday before the course. We were prepared to send a message out to the people who had signed up for the course, expecting them to register on the actual Moodle site, begin surfing it, and be fully on board by the first Monday of the course.

Then our partner failed to send us the list of participants in time and we had to postpone the launch. Once we got the list, we sent the welcome message on a Thursday. And yet by Monday people had not surfed the website and registered. I had to say, “wait wait, convince yourself, just don’t get frustrated.” This is what we were paying for: a pilot to experience everything, anything that can go wrong. It is better to experience it now. Next time we will do it better. That will be the real start.

This process takes a little bit of emotional intelligence. You can’t lose your focus. You have to learn from the experience. Don’t think of it as the official worldwide launch of your elearning program; think of it as a learning experience. So it was not a big deal. Just a couple of hours of freaking out.

Nancy: Now that you have had the experience, what reflections do you have about moving your successful F2F course online and facilitating it? How did you engage people?

Emilio: Other than wanting to respond more quickly? (Laughter: Emilio was amazing – he was not only teaching online for the first time, but he was doing it WHILE he was on the road for work!) Here are some of my lessons.

First, what should I do about participants in a group who are not responding to each other? I saw the first person in a group post and get no response. I wondered, should I intervene? I wondered about how to group participants in some way, to point out some challenges and invite others to react. But I didn’t, hoping they would eventually engage. There were two groups where no one commented at all. If I were to do it again I would immediately ask others to post something.

Nancy: There are more experiments with gamification online, where, for example, you get points towards badges for responses. I’m not always sure of the long-term benefit of these kinds of incentives, or whether they actually support the learning, but they appear to get people engaged in the moment. Maybe they can trigger learner socialization quicker and be something useful to explore. Because as you noted, participation in the design of this course assumes people will interact with each other. So socialization of the group is the first step towards that participation, and later it is essential for successful group work.

Emilio: Second, I can teach from anywhere. I could see that in our pilot. I was travelling like crazy. Another takeaway is the real leverage of technology. I could be doing different things in different places in the world and still deliver a course. You see people learning from anywhere. If you compare that to the level of effort for a F2F course, it is a trade-off. But the value is there, and you, as an officer, can become much more productive once you invest in the up-front work of design and planning, which was more than I expected.

There are some challenges to this anytime/anywhere though! I feel a bit guilty; I could have done a better job of dedicating a bit more time overall. Once I did not realize the time difference for the office hours and had to wake up at 3am. There were a couple of times I knew I was responding two days later. I know that is not how I wanted it to be; I wanted to respond within 24 hours.

Emilio: Third, include a synchronous element. The most effective tool I feel I had was our weekly synchronous “Office Hours.”  They gave me an opportunity to introduce a dose of F2F interaction which is fantastic.

During the office hours I got a chance to interact with the participants. They would post several questions. Sharing the screen was super critical. I could navigate and take them where we wanted to go, to a question related to a graph or slide, and explain it. You could sense it in the comments – “oh yes, thank you, this clarifies a lot.” We quickly solved problems.

Also, just by hearing their questions I could pinpoint those slides where the message may not be that clear and I would edit a couple of things right away. So it helped me get clearer as well.

We tried to record and post the recordings for those who could not attend due to work or time zones, but we had some technical problems. We will try and fix that next time. But I will also really encourage the participants to attend, because it brings the passion for the subject matter and the collegiality which is needed for the group work and active participation. The people who attended office hours were also the people who completed the course!

One idea for next time is to expand the use of office hours to help better set up the groups and the process for the group work. Maybe teams could have a private chat or meeting once a week, and I could use some questions to help them get to know each other in the context of the course. That leads to my fourth learning: group work requires building relationships. Our group exercises need to be redesigned, and I need to figure out how to get people comfortable enough with each other to actually engage in the group work.

Nancy: Yes, that is really hard, particularly when the participants have allocated an hour a day for three weeks and there is a lot of material to cover!

Emilio: Fifth, don’t do this alone! Milica was my assistant and she was always there. One time I could not log into the office hours and Milica took care of it. In hindsight, we should have included her earlier in the facilitation conversations and planning, as part of the team. You and the other consultants – Cheryl, Terri and everyone – were very helpful.

Nancy: What was the facilitation highlight for you?

Emilio: The first and second Office Hours were critical. The course was mostly asynchronous. I knew people were coming in. I logged in and saw people logging in, and that made it real. There are people there! They had interest, were asking questions, actually reading the slides. I could see the numbers (page views). But until you talk to them and see them asking questions, it is hard to know if they really are reading the material. When we held our weekly synchronous Office Hours, this became much more real.

Nancy: So would you keep doing this?

Emilio: Absolutely yes, I’ll keep doing this. Reflecting on it now, and putting what I produced during those four weeks of the course into perspective from an administrative standpoint, there is an increase in efficiency. I delivered a course – granted, for 7 people – while I was working in Bangkok, Mexico and then Peru. Pretty impressive. Amazing, yeah. I had good connectivity, fortunately.

Up Next: Reflections from the whole team

Learning While Building eLearning: Part 2 – Learning Objectives & Assessment

This is the second (slow!) of four pieces reflecting on the experiences of Emilio, a subject matter expert who was tasked with converting his successful F2F training into an elearning offering. Emilio has let me interview him during the process.

This piece focuses on the thorny issue of learning objectives at the front end of an elearning project and assessment at the other end. You can find the context in part 1 here. (Disclaimer: I was an adviser to the project and my condition of participation was the ability to do this series of blog posts, because there is really useful knowledge to share, both within the colleague’s organization and more widely. So I said I’d add the blog reflections – without pay – if I could share them.)

Nancy: Looking back, let’s talk about learning objectives. You started with all of your F2F material, then had to hone it down for online. You received feedback from the implementation team along the way. What lessons came out of that process? How do we make content more precise when you have fewer options to assess learner needs and interests “in the moment” as you do face to face, and when attention spans are more limited online?

Emilio: I realize now that I had not thought about being disciplined with learning objectives. I had created them with care when I first developed my F2F offering. Once I had tested the course several times, I recognized that I had forgotten my own initial learning objectives, because in a F2F setting I adapted to students’ interests and knowledge gaps on the spot, and I was also able to clarify any doubts about the content. Therefore, over time, these learning objectives became malleable depending on the group of students, and thus lost presence in my mind.

This became apparent as I was doing the quizzes for the online work and got comments back from Cheryl (the lead consultant). She noted which of my quiz questions were and were NOT grounded in the learning objectives and content. I realized I was asking a bunch of questions that were not crucial to the learning objectives.

With that feedback, I narrowed the quizzes down to the most important questions for achieving and measuring the learning objectives. It was an aha moment. This is something that is not necessarily obvious or easy. You have to put your mind to it, especially when you are developing an e-learning course. It applies to the F2F context as well, but in an e-learning setup you are forced to be more careful because you cannot clarify things on the spot. There is less opportunity for that online. That was very critical. (Note: most of the course was asynchronous. There were weekly “office hours” where clarifications happened. Those learners who participated in the office hours had higher completion rates as well.)

It was clear I had to simplify the content for the elearning setup – and that was super useful. While my F2F materials were expansive, to enable me to adapt to local context, that became overload online.

Nancy: What was your impression of the learners’ experiences?

Emilio: It was hard to really tell because online we were  dealing with a whole different context. Your indicators change drastically. When I’m in F2F I can probe and sense if the learners are understanding the material. It is harder online to get the interim feedback and know how people are doing. For the final assessment,  we relied on a final exam with an essay question. The exam was very helpful in assessing the learner’s experience, but since it is taken at the end of the course, there are no corrective measures one can take.

Nancy: Yes, I remember talking about that as we reviewed pageviews and the unit quizzes during the course. The data gives you some insight, but it isn’t always clear how to interpret it. I was glad you were able to get some feedback from the learners during your open “office hours.”

We used the learning objectives as the basis for some learner assessment (non-graded quizzes for each unit and a graded final exam which drew from the quizzes). How did the results compare with your expectations of the learners’ acquisition of knowledge and insights? How well did we hit the objectives?

Emilio: We had 17 registered learners and 7 completed. That may sound disappointing. Before we started, I  asked you about participation rates and you warned me that they might be low and that is why I am not crying. The 7 that completed scored really well in the final exam and you could see their engagement. They went through material, did quizzes and participated in the Office Hours. One guy got 100% in all of the quizzes, and then 97% in the exam.

We had 8 people take the final exam. One learner failed to pass the 70% required benchmark, but digging deeper into it, Terri (one of our consultants) discovered that the way Moodle was scoring the multiple choice answers was not programmed precisely: it was giving full credit for partially correct answers. We need to fix that. Still, only one person failed to pass the 70% benchmark even with the error.

The essay we included in the exam had really good responses. It achieved my objective of getting an in-depth look at the context the learners were coming from. Most of them described an institutional context. Then they noted what they thought was most promising from all the modules, what was most applicable or relevant to their work. There were very diverse answers, but I saw some trends that were useful. However, it would be useful to know more of this before and during the course.

Nancy: How difficult was it to grade the essays? This is something people often wonder about online…

Emilio: I did not find it complicated, although there is always some degree of subjectivity. The basic criteria I used were their focus on the question asked, and their application of the principles taught during the course that relate to the context described in the question.

Nancy: One of the tricky things online is meaningful learner participation. How did the assessment reflect participation in the course?

Emilio: We decided not to give credit for participation in activities because we were not fully confident of how appropriately we had designed such activities for an e-learning environment in this first beta test. I think this decision was the right one.

First, I feel that I did not do a good job at creating an atmosphere, this sense of community, that would encourage participation. Even though I responded to every single comment that got posted, I don’t really feel that people responded that much in some of the exercises. So I would have penalized students for something that is not their fault.

Second, we had one learner who did every exercise but did not comment on any of the posts. He is a very good student and I would have penalized him if completion relied on participation. Another learner who failed did participate, went to the office hours and still did not pass the final exam.

We failed miserably with the group exercise for the second module.  I now realize the group exercise requires a lot of work to build the community beforehand.  I sense this is an art. You told me that it is completely doable in the elearning atmosphere, but after going through the experience I really feel challenged to make it work. Not only with respect to time, but how do you create that sense of community? I feel I don’t have a guaranteed method for it to work. It is an art to charm people in. I may or may not have it!

Nancy: The challenges of being very clear about what content you want to share with learners, how you share it, and how you assess it should not be underestimated. So often people think it is easy: here is the content! Learning design in general is far more than content, and learning design online can be trickier because of your distance from your learners – not just geographic distance, but the social distance where there is less time and space for the very important relational aspects of learning.

Up Next: Facilitating Online

Social Media Planning and Evaluation for NGOs

I’ve been co-designing and co-facilitating a number of workshops for the CGIAR and FAO over the past few years about knowledge sharing and, more recently, this phenomenon people call “social media.” Part of this work has been to comb through resources and create some launch pads that are relevant to NGOs and non profits. I thought I’d share a few of them on this blog. I’ve edited this one a bit more since the first writing.

Over time, most of this material will also be added to the ever-growing “KS Toolkit,” another collaborative resource I’ve pointed to frequently.

Simone Staiger, my frequent collaborator in these efforts, pointed out this quote and URL from Margaret Wheatley that is a good kick off for the topic.

In nature, change never happens as a result of top-down, pre-conceived strategic plans, or from the mandate of any single individual or boss. Change begins as local actions spring up simultaneously in many different areas. If these changes remain disconnected, nothing happens beyond each locale. However, when they become connected, local actions can emerge as a powerful system with influence at a more global or comprehensive level. (Global here means a larger scale, not necessarily the entire planet.)

Social Media Strategy Planning & Measurement – What’s Working?

As people responsible for getting things done in your organization, you know the value of having a clear strategy and a way of evaluating whether your strategy is working. With social media, however, strategy is a compass, not a map, because it is a fast-changing territory.

This topic is designed to give you some tools and ideas for including social media appropriately in your overall  organizational strategic plan and to measure its effectiveness.

Strategic Social Media Planning

You might want to look at the very useful “Social Media Strategic Planning Worksheet” from WE ARE MEDIA. Like any good communications strategic planning, social media strategy takes into consideration goals and target audiences AND the technology implications. This is the fundamental part that most of us are familiar with.

Bill Anderson (in a comment on this post, which was so good I’m editing it into the post) wrote:

I have three engineering-like questions to add to the list that come directly from the late Neil Postman.

From an engineering perspective, any technology, be it a tool, software, processes and procedures, or new work practices, is a solution. Whenever considering adopting a solution, consider asking the following three questions.

(1) What problem will it solve?
(2) Whose problem is it?
(3) What new problems are likely to arise by adopting it?

These three simple questions help me clarify my (sometimes hidden) assumptions about what I’m doing and why I think a particular technology is useful. I think they complement the set of questions you suggest in this post.

While it might be easy to say most of your constituents are not even online, some of your strategic audiences may be, such as funders, researchers and policy makers. So scan your audiences and look for possibilities.

Social media, however, is like a river you swim in. It is always flowing past, sometimes carrying us along, sometimes dumping us on the rocks of the shore. It is important to think iteratively about your strategy so you can adjust to changing conditions. The advice is to experiment often, fail quickly and learn, learn, learn, so you can adapt your strategy. Think in 6-week or 6-month cycles, not 3-year cycles. Keep an eye on the goal, but be ready to switch how you get to it.

Social Media Policies

Often people’s first questions are “how do we manage and control this stuff?” Organizations working with limited bandwidth want to block applications to prioritize internet use. Organizations working in more conservative parts of the world worry about what people will access if they start using web based tools.  The first thing to know here is that you can’t control all of this. So building on your core values and developing agreements is a sound strategy.

Some organizations find having a social media policy useful — as long as the policy doesn’t squash the initiatives right from the start! Always try and look at policies from two perspectives: control and emergence. Too much control and  you will miss the innovation and inventiveness that is a core benefit of social media.

Here are two articles that you might find helpful from IBM:

And a few more if you like to read…

What we have observed is that NGOs have been slower to consider their policies. This can be an advantage for early innovators (few barriers), but it may cause worry when leadership, not familiar with social media themselves, is inclined to overreact rather than thoughtfully consider policy.

Work Iteratively – Measure as You Go

The good thing about using social media is that it is fairly simple to experiment, iterate, or throw out an experiment that is not working for you. Think small, frequent, low-risk experiments, rather than trying to build “the perfect system” and over-investing in any one thing before you understand the value. For example, you may try a blog as an alternative to a traditional email newsletter. Track how many times a blog post has been viewed (using your blog software or a tool like Google Analytics). See how many comments you get when you post entries that specifically ask for feedback. (People are more likely to respond to open-ended questions than to traditional press releases!) Do a search to see who has linked to that post. (Do you know how to do this on Google, Yahoo or Microsoft search? What about the new Bing.com?)

These are examples of using quantitative metrics. For a great list of more metrics you might consider, see Rachel Happe’s blog post on Social Media Metrics. See which blog posts are read more, and then start adjusting your posting style. Some people call this “social listening.” In the early phases of using social media, you are trying things out and “listening” for the response as indicated by page views, links, responses or even action by your target audience. To read more about this, check out Beth Kanter’s blog post about evaluating first projects.
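If you keep those numbers somewhere simple, even a tiny script can help you compare posts over time. Here is a minimal sketch (my own illustration, not something from the workshops) that assumes a hypothetical CSV export – say from your blog software or an analytics tool – with one row per post and columns named title, views, comments and inbound_links; the file name and column names are placeholders, not a real API.

```python
import csv

def load_posts(path):
    """Read the hypothetical analytics export (one row per blog post) into a list of dicts."""
    with open(path, newline="", encoding="utf-8") as f:
        return [
            {
                "title": row["title"],
                "views": int(row["views"]),
                "comments": int(row["comments"]),
                "inbound_links": int(row["inbound_links"]),
            }
            for row in csv.DictReader(f)
        ]

def engagement_rate(post):
    """Comments plus inbound links per 100 views: one rough 'social listening' signal."""
    if post["views"] == 0:
        return 0.0
    return 100.0 * (post["comments"] + post["inbound_links"]) / post["views"]

if __name__ == "__main__":
    posts = load_posts("blog_metrics.csv")  # hypothetical export file
    # Rank posts by engagement so you can see which posting styles draw responses.
    for post in sorted(posts, key=engagement_rate, reverse=True):
        print(f'{post["title"]}: {post["views"]} views, {post["comments"]} comments, '
              f'{engagement_rate(post):.1f} engagement per 100 views')
```

Run against such an export, it simply prints posts ranked by a rough engagement rate, which is enough to start noticing which kinds of posts draw responses before you invest in anything heavier.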

Qualitative Evaluation

There is more to evaluating your social media ROI than quantitative metrics. As you know, communications is as much a qualitative thing as a quantitative thing. Some things are intangible. Like a funder who reads a blog post that tells the STORY of some work and begins to engage more deeply to support the project. Or the people who start following the messages you send out on Twitter, gain a deeper appreciation for food and hunger in the world, and start making small changes in their own lives. These things require a deeper listening – finding stories, doing interviews with people from your target audience. For more on this, here is another blog post from Beth Kanter.

As you get a sense of how social media is helping you achieve your communications strategy, you can begin to fold social media evaluation into your overall communications evaluation work. Keep what is working. Adjust the things that might work with some changes. Stop doing the things that aren’t working. Just a note on this: sometimes it takes both experimentation and time to find out if something is working, so don’t give up too quickly.

Examples of social media evaluation efforts:

Questions:

  • What communications objective do you want to try and support with social media?
  • Do you want or need to have a social media policy?
  • What are the benefits, both tangible and intangible, that a social media strategy might offer? What value does your social media strategy provide to your organization or stakeholders?
  • What type of quantitative and qualitative information do you need to track to measure your success or learn how to improve your social media strategy?

Additional Resources: