aLearning Blog

Online Learning for Trade Associations

Posts Tagged ‘ASAE’

How Time Flies…

Posted by Ellen on January 27, 2012

…when you’re having fun, right?!?

And we have been having fun here at the aLearning Blog! Suddenly, it seems, we’re publishing our 250th post and celebrating five years.

Yep, five years. And so much has changed!

When aLearning published its first post back on January 27, 2007:

  • no LMSs (that we knew of at the time) were designed especially to meet the needs of associations and nonprofits
  • few (if any) research endeavors about online learning focused on associations and nonprofits
  • few (if any) organizations bothered to survey association learning leaders to find out what we were doing in the field and how things were going
  • the number of association-specific blogs could be counted on the fingers of one person’s hands
  • social learning and virtual learning environments were mysterious, hocus-pocus, scary entities

A lot has changed over just five changes of the seasons, hasn’t it?!?

Top 100 aLearning Blog Posts

To celebrate this milestone, we’ve compiled an ebook of our Top 100 aLearning Blog Posts. Just skimming through these selections made us realize how quickly the elearning sands shift, affecting the landscape, even moving the horizon.

At over 200 pages, this compilation brings together in one place the best — and most controversial — writing from the aLearning Blog. We’ve included most comments (the fine print is that we’ve deleted pingbacks, trackbacks, and outright sales pitches) and are proud of the attention the aLearning Blog has garnered over the years from elearning and education experts.

To Get Your Copy

We’ve made this e-publication very affordable at just $5. To order, go to www.ellenbooks.com/store.html and click the “Buy Now” PayPal button. You should be able to read this PDF from any device with a PDF reader (such as Adobe Reader).

Special Offer

If you’ve purchased aLearning: A Trail Guide to Association eLearning, we’ll send you a copy of the Top 100 Posts for free. Just send Ellen an e-mail at ellenbehr@aol.com and attach an electronic copy of your Lulu receipt, and we’ll send you the Top 100 Posts by return e-mail. We appreciate your support and are happy to say “thank you” in this small way.

Thank You!

Posted in aLearning Strategies, aLearning Surveys, aLearning Trends, Conferences, eLearning Marketing, eLearning Resources, Financing eLearning, Justifying aLearning, Learning in General, LMS, Measuring Results, Online Learning in General, Social Learning, Webinars

Empowering Subversive Implementation

Posted by Ellen on August 14, 2011

Odd title, eh? If you haven’t yet read Maggie McGary’s post over at Acronym (“Are you empowered to implement what you learn?”), read it first; it will give you a head start on where I’m going with this.

First, I completely agree with Maggie. I’d add that — at least in my case — I had piles of notes from books, magazines, conversations, social networking threads, blogs, and so on, as well as those notes from conferences I attended.

Of those, I managed to make a couple of changes, create one or two new inspired projects, and otherwise implement what I had learned. Sometimes it was with the support and encouragement of the organization’s leadership.

Sometimes it was through sheer determination and what I started to call “subversive leadership.”

Call it manipulation. Call it whatever you want, but it worked.

And it was simple: I just did it. I kept my efforts under the radar, and worked slowly yet patiently — sometimes through lunches or other “lag” times. Then when I had something to show — a demo, a bit of the project, a sliver that hinted at what could be done or the results of what I managed to accomplish — I shared it with the appropriate leader. Sometimes that person was the executive director. Sometimes the chair of the education committee.

You don’t always need permission to do something. Sometimes you just have to give yourself permission.

Sometimes you have to empower yourself.

Posted in aLearning Strategies, Justifying aLearning, Learning in General

Measuring Level Four

Posted by Ellen on June 24, 2011

Sounds like something from a sci-fi flick, doesn’t it?!? But of course I’m referring to Donald Kirkpatrick’s four-level model for measuring learning outcomes. The first level, you’ll remember, is “reaction.” We do a good job of measuring that by using “smile sheets” — those feedback forms that we issue right after learning has occurred (for more on Smile Sheets, see the article “Smile Sheets To Smile About” in the April 2010 issue of ASAE’s Associations Now magazine).

And whenever we “test” our learners on what they absorbed from a session, we’re measuring whether they learned (Level Two on Kirkpatrick’s scale).

Levels Three (Behavior/Transfer) and Four (Impact/Results) are admittedly more difficult. They’re a challenge for corporations — and they have access to employee records, performance reviews, business outcome data, and all of that. How could we possibly begin to tackle these evaluative levels — and why would we want to try?

Let’s start with why. The answer is because.

Because we want our members to see evidence for themselves of the effectiveness of the training we’re delivering to them. The more we can demonstrate to them that they are benefitting (and their employers are likewise benefitting) from the educational sessions we provide, the more likely they are to renew their membership, register for more events, and tell others about the advantages they’re experiencing.

Because we want our association leaders to bear witness to the results of the programs we offer. Yes, they’ll see the attendance data, the revenue, and all of that, but the more we can show them how members are contributing to their workplaces in ways they hadn’t before the training they took with us, the more powerful the rest of the numbers will be. This builds credibility for your department and should make it easier to gain their support for future program investments.

You can get insight into Level Three (behavior/transfer) by following up six to eighteen months after the event with an evaluation written specifically for this purpose (see “Nothing To Smile Sheet About”  and Chapter 17 of aLearning: A Trail Guide to Association eLearning for more on how to construct these evaluations).

But how do we get to Level Four? Much the same way we got to Level Three — by sending the session participants an evaluation that’s been carefully designed to solicit responses that show whether they are experiencing the positive business impact we intended as a learning outcome.

Here’s how you might do that (adapt this to your own purposes, of course):

1. Get the learning objectives in front of you. If they were written well, they should provide the desired outcomes. For example, “The learner will be able to write effective broadcast e-mails that result in increased numbers of click-thrus.”

2. If your learning objectives weren’t written this clearly, brainstorm the possible business outcomes when the learning objectives are correctly applied.

3. Write questions that solicit specific business outcomes as a result of the session. Using our earlier example of broadcast e-mails, one question could be, “As a result of taking this training, have you experienced an increase in the number of click-thrus for your broadcast e-mails?”

4. Write follow-up questions that probe for details. For example, “What percentage of an increase in click-thrus have you experienced?”

5. Allow for exceptions — you can learn from these, too. For example, “If you haven’t experienced an increase in the number of broadcast e-mail click-thrus, describe the factors that could be affecting this result.” You might learn that they stopped sending broadcast e-mails or that someone else is now sending them and the learner doesn’t have the data. It could be that they always had a high rate of click-thrus so an increase that doesn’t seem significant is still a positive outcome.

Here are some examples of phrases to get you started:

“As a result of taking this training, have you experienced a decrease in…

…the cost of [X,Y,Z]?”

…employee turnover?”

…number of claims?”

…number of errors in [A,B,C]?”

…number of complaints?”

…complaints about [A,B,C]?”

“As a result of taking this training, have you experienced an increase in…

…productivity?”

…sales?”

…profitability?”

…frequency of orders?”

…amount per order?”

…repeat business?”

…employee retention?”

…employee satisfaction?”

…customer satisfaction?”

…customer retention?”

“As a result of taking this training, have you experienced a savings in [X,Y,Z]?”

“As a result of taking this training, have you experienced enhanced creativity?”

“As a result of taking this training, have you reduced…

…waste?”

…re-work?”

…accidents?”

“As a result of taking this training, have you cultivated innovation?”

“As a result of taking this training, have you shortened your time to market with new products or services?”

“What other business outcomes have you experienced as a result of taking this training?”

Most importantly… after each question, ask for specifics:

How many? By how much? By what percentage did this change?

And of course you’ll want to emphasize that the data you’re collecting is strictly for evaluative purposes — you don’t need specific financial or other data; you just want some indication of the effect the training has had. Most members won’t release data that’s confidential to their company anyway, and some might be reluctant even to share that the training has made a business-side impact. That’s okay. Find out what you can from those who are able to share and consider yourself lucky to have that.

If the results are particularly stunning, follow up with individual respondents to see if you can use a quote from them for reports to the education committee, board of directors, or even in marketing materials. Offer to show them the quote and obtain their permission before releasing it. Being able to use specific testimonials is a plus — the real purpose of conducting this evaluation isn’t marketing, however.

When you have enough responses, aggregate the data so you can see the overall picture: How did learners benefit in general from the session? Was one objective particularly valuable? Was there any learning objective that seemed especially challenging? Why?
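
If your evaluation responses come back as a spreadsheet export or a simple list of records, even a small script can handle the aggregation. Below is a minimal sketch in Python; it isn’t tied to any particular survey tool, and the field names and sample records are made up, but it illustrates one way to group responses by learning objective and summarize how many members reported a business impact and the average change they reported.

    from collections import defaultdict

    # Hypothetical Level Four responses. Each record notes the learning objective,
    # whether the member reported a business impact, and (if they shared it) the
    # self-reported percentage change.
    responses = [
        {"objective": "broadcast e-mail click-thrus", "impact": "yes", "pct_change": 12},
        {"objective": "broadcast e-mail click-thrus", "impact": "no", "pct_change": None},
        {"objective": "broadcast e-mail click-thrus", "impact": "yes", "pct_change": 8},
    ]

    summary = defaultdict(lambda: {"yes": 0, "total": 0, "changes": []})

    for r in responses:
        s = summary[r["objective"]]
        s["total"] += 1
        if r["impact"] == "yes":
            s["yes"] += 1
        if r["pct_change"] is not None:
            s["changes"].append(r["pct_change"])

    for objective, s in summary.items():
        share = 100 * s["yes"] / s["total"]
        average = sum(s["changes"]) / len(s["changes"]) if s["changes"] else 0
        print(f"{objective}: {share:.0f}% reported an impact; "
              f"average reported change {average:.1f}%")

Run against your real responses, a tally like this gives you exactly the kind of aggregate picture you can hand to the education committee or the board.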

Thank every respondent, especially if their names are attached to the evaluations you get. Let them know how much you appreciate their taking the time to provide feedback so you can continue to improve the program. A simple thank you goes a long way!

Has your organization conducted Level Four evaluations? How have you conducted them? What did you learn from the results? We’d love to hear your stories here at aLearning!

Posted in Learning in General, Measuring Results, Online Learning in General

Deciding Not to Learn at Conferences?

Posted by Ellen on September 24, 2010

Remember a while back when I posted on ASAE’s “Associations and CEOs: A Report on Two Studies During a Down Economy” [“Why ANY Revenue Increase is a Good Thing”]? It got some follow-up (and a needed correction to an interpretation of the data) from ASAE and — for the record — I’m still convinced the report drew some faulty cause-and-effect conclusions.

The good news is that the new report, “Decision to Learn,” seems to clarify things.

According to Lillie R. Albert and Monica Dignam, writing in “Exploring the Decision to Learn,” from the August 2010 issue of Associations Now:

Though face-to-face learning is a major preference, it is clear adult learners will participate in distance-learning formats as well. The current abundance of research and experimentation into distance learning by learning providers of all types, from the smallest association to graduate-level academic programs, suggests we are in a period of significant innovation as it applies to how learning is delivered. Distance-learning offerings on topics that are easily applied to current problems and needs, are personalized and adapted to the individual learner’s learning style, and readily available and cost-effective will continue to grow.

Leaving aside the reference to “the individual learner’s learning style” as a topic worthy of its own post (when will ASAE finally accept what others are coming to realize, which is that the concept of learning styles is a myth?!?), at least this “Decision to Learn” summary admits that while learners might prefer face-to-face learning, the reality is that they are accessing online learning as well. The report’s own data support this notion: over 51% of respondents reported they attended face-to-face and “distance” learning events in the past year.

But that’s not the startling thing in this report, at least as noted in this summary article. Here’s the sentence that should make people sit up and take notice:

The preferred education format is in person, led by an instructor or presenter but not at a conference, tradeshow, or convention.

Whoa! Think of all those dollars you’re investing in the education sessions at conferences when it’s not the preferred face-to-face learning environment! What will you do about that?

What’s that you say? You’re going to leave your conference with its education sessions alone, despite what the report says? There’s so much other value that members get from it, you say? Too many better reasons to continue to offer it than to abandon it because of some report?

So here’s my conclusion from this report:

  • We’ve been doing conferences for years and years, are still trying to figure out the best ways to deliver effective learning via this type of event, and will not give it up.
  • We’ve been doing online learning for a very short period of time, are still trying to figure out the best ways to deliver it effectively, and should not give it up, either.

And forgive me, but I can’t help noticing that the two Web ads appearing on the page describing the report are for a major city’s convention and visitors bureau and a major hotel chain, while the Associations Now article Web page has two destination city ads.

Hmmm….

Anyway, I applaud ASAE — especially the volunteers who worked behind the scenes on the report — for examining learning in associations. There’s great ammunition here for beleaguered association learning leaders who need something to point to when justifying the value of the educational programs they offer.

But it’s just a start. Now that ASAE has put some real data behind the generally held belief that members find educational events to be a key factor in their affiliation with an association, it makes sense that they provide more support when it comes to professional development.

Maybe an ASAE PD Conference? Oh — wait — people don’t prefer conferences for learning….

What do you think? What was your reaction to the report? Where do you think ASAE should go from here? Where should we go from here?

Posted in aLearning Strategies, eLearning Resources, Justifying aLearning

It Doesn’t Have To Be That Hard

Posted by Ellen on June 6, 2010

Marsha Rhea has great food for thought in her Acronym post, “The Hard Work of Collaborative Learning”:

Let’s be honest about collaborative learning for a moment. People who just want an answer–fast–would rather listen to experts or click their way to a solution.

And those experts–well–they just barely have time to spew forth some of what they know before racing to their next great achievement.

And too many association executives are forced to crank out educational opportunities, because they are programming too many sessions, meetings and workshops to have enough time to inspect their products for learning outcomes and quality experiences.

Is this assessment too harsh?

No, Marsha, it’s not. And I agree that we need to spend more time and energy creating collaborative learning opportunities.

HOWEVER….

We have to be careful we’re not trying to put a bandaid on an inch-wide gash or stitch up a tiny scratch.

Remember once upon a time when listservs and forums and bulletin boards were pretty new and less moderated than they are today? Remember that you were expected to lurk until you understood the lingo and other basics of the group before chiming in? Remember those awful flaming messages that were launched at any innocent “newbie” who asked a question the group deemed too basic?

Of course none of us wants to go back to that, but here’s my point: if you’re wondering why your collaborative learning events aren’t as successful as you think they should be, the answer might lie in the first part of Marsha’s post.

— People do need answers, and often they need them quickly.

— Those with the answers have been asked for those answers so often, over and over, that at some point they start to pull away from the conversation (forum, listserv, educational event, volunteer opportunity, etc.).

Marsha’s suggestions for kick-starting collaborative learning in your organization are good ones.

Allow me to add another.

One of the first rules in instructional design is to know your learner. Level of experience or knowledge of the subject/topic and the reason(s) they need to know the content are especially important.

Here’s why:

This is especially helpful for education leaders in trade associations and professional organizations: learners who are early in their careers have different training needs than the “veterans” who have more experience. Early careerists need more fundamentals that can generally be provided through more structured learning situations; experienced veterans who have lived through that learning curve benefit more from direct peer-to-peer (PTP) interactions.

Not that the early careerists wouldn’t benefit from PTP learning as well — but unless the exchange is clearly set up for mentoring or coaching, you risk alienating your vets by putting them in a “learning” environment in which they won’t be the ones learning.

You think this isn’t happening at your learning events? Think again. Have you seen this combination of comments in your event feedback?

“I’ve heard this before. One or two new ideas but mostly a repeat of what I already know and do.”

“This was great! I took a bunch of notes and can’t wait to get back to put them into action!”

“Old stuff… isn’t there a new angle?”

“Loved it! Learned so much!”

These “contradictory” comments are evidence that you’ve attracted a range of early careerists and vets, and that your content was better received by the former than the latter. Figuring out how to make the session more collaborative could work, but you need to do it in a way that balances what the newbies need with engagement for the vets so they will learn something as well.

It’s not an easy balance to find, especially when you don’t know who is sitting in the room. Who are the vets? How many are there? What’s their level of experience with the topic? Why did they show up — what do they hope to gain from the session? Who are the early careerists? What do they hope to gain from the session?

The experience of the vets needs to be valued and appreciated while the curiosity and enthusiasm of the early careerists is nurtured.

It doesn’t have to be that hard. Here’s one idea of how to set up such a session:

1. Think of a problem they’re all likely to face related to the topic.

2. Design a scenario around that problem (better yet, design several — one for each table of attendees).

3. Organize the way you’ll present the scenario by assigning roles to various levels of experience.

4. Set the room in rounds. At the start of the session, tell everyone they’re probably going to end up at another table, with other people.

5. Using the four corners of the room, ask those who’ve been in their positions less than a year to go to one corner, those with 1-3 years to another corner, those with 4-9 years to a third, and those with 10 or more years to the fourth (of course, you should change these ranges so they’ll make sense for the averages in your industry).

6. Assign at least one person from each corner to each table until everyone is assigned and each table has roughly the same number of individuals from the various experience levels (a rough sketch of this dealing-out step appears after this list).

7. Present the scenario. If you can, use a variety of scenarios or case studies so the tables aren’t all working on the same ones. The scenarios/case studies should be designed around a problem that must be solved. Because everyone knows their group has at least one vet and at least one “newbie,” encourage (or better yet, set up) specific roles that prompt the learners to share experiences and questions, expose their curiosity, and exchange ideas.

8. Remind the groups that there is no absolute answer, and that the value in the exercise is learning how the problem could be solved, maybe in different ways. Let them know each group will have a chance to describe their situation and what they decided needs to be done.
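
As promised in step 6, here is a rough sketch of the dealing-out step. It assumes you collected experience levels at registration so the assignment can be worked out before the session; the names, experience bands, and table count below are made up, and the snippet is only an illustration of rotating people from each corner across the tables so every table gets a mix.

    import itertools

    # Hypothetical attendee list grouped by years in the role (the "four corners").
    corners = {
        "under 1 year": ["Ana", "Ben"],
        "1-3 years": ["Cam", "Dee", "Eli"],
        "4-9 years": ["Fay", "Gil"],
        "10+ years": ["Hal", "Ida"],
    }

    num_tables = 3
    tables = [[] for _ in range(num_tables)]

    # Deal attendees to the tables in rotation, corner by corner, so each table
    # ends up with roughly the same mix of experience levels.
    rotation = itertools.cycle(range(num_tables))
    for level, people in corners.items():
        for person in people:
            tables[next(rotation)].append(f"{person} ({level})")

    for number, seated in enumerate(tables, start=1):
        print(f"Table {number}: " + ", ".join(seated))

Doing it live with the four corners works just as well; either way the goal is the one in step 6: each table ends up with roughly the same spread of experience levels.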

Any pushback you get on collaborative and social learning usually comes from individuals who expected to learn something and didn’t. Sometimes they ended up being the “facilitator” (because of their level of experience related to the topic) when they weren’t expecting it.

Setting up the learning event so those attending know immediately that their strengths will be leveraged so they can learn from each other is the key.

Is that so hard?

Posted in Conferences, Learning in General, Social Learning