aLearning Blog

Online Learning for Trade Associations

Measuring Level Four

Posted by Ellen on June 24, 2011

Sounds like something from a sci-fi flick, doesn’t it?!? But of course I’m referring to Donald Kirkpatrick’s four-level model for measuring learning outcomes. The first level, you’ll remember, is “reaction.” We do a good job of measuring that by using “smile sheets” — those feedback forms that we issue right after learning has occurred (for more on smile sheets, see the article “Smile Sheets To Smile About” in the April 2010 issue of ASAE’s Associations Now magazine).

And whenever we “test” our learners on what they absorbed from a session, we’re measuring whether they learned (Level Two on Kirkpatrick’s scale).

Levels Three (Behavior/Transfer) and Four (Impact/Results) are admittedly more difficult. They’re a challenge even for corporations, which have access to employee records, performance reviews, business outcome data, and all of that. How could we possibly begin to tackle these evaluative levels — and why would we want to try?

Let’s start with why. The answer is because.

Because we want our members to see evidence for themselves of the effectiveness of the training we’re delivering to them. The more we can demonstrate to them that they are benefitting (and their employers are likewise benefitting) from the educational sessions we provide, the more likely they are to renew their membership, register for more events, and tell others about the advantages they’re experiencing.

Because we want our association leaders to bear witness to the results of the programs we offer. Yes, they’ll see the attendance data, the revenue, and all of that, but the more we can show them how members are contributing to their workplaces in ways they hadn’t before the training they took with us, the more powerful the rest of the numbers will be. This builds credibility for your department and should make it easier to gain their support for future program investments.

You can get insight into Level Three (Behavior/Transfer) by following up six to eighteen months after the event with an evaluation written specifically for this purpose (see “Nothing To Smile Sheet About” and Chapter 17 of aLearning: A Trail Guide to Association eLearning for more on how to construct these evaluations).

But how do we get to Level Four? Much the same way we got to Level Three — by sending the session participants an evaluation that’s been carefully designed to solicit the sorts of responses that show whether they’re experiencing the positive business impact we intended as a learning outcome.

Here’s how you might do that (adapt this to your own purposes, of course):

1. Get the learning objectives in front of you. If they were written well, they should provide the desired outcomes. For example, “The learner will be able to write effective broadcast e-mails that result in increased numbers of click-thrus.”

2. If your learning objectives weren’t written this clearly, brainstorm the possible business outcomes when the learning objectives are correctly applied.

3. Write questions that solicit specific business outcomes as a result of the session. Using our earlier example of broadcast e-mails, one question could be, “As a result of taking this training, have you experienced an increase in the number of click-thrus for your broadcast e-mails?”

4. Write follow-up questions that probe for details. For example, “By what percentage have your click-thrus increased?” (The arithmetic behind this question is sketched just after this list.)

5. Allow for exceptions — you can learn from these, too. For example, “If you haven’t experienced an increase in the number of broadcast e-mail click-thrus, describe the factors that could be affecting this result.” You might learn that they stopped sending broadcast e-mails, or that someone else is now sending them and the learner doesn’t have the data. It could be that they always had a high rate of click-thrus, so even an increase that doesn’t look significant is still a positive outcome.
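Since Step 4 asks respondents for a percentage, here’s the arithmetic behind that question as a tiny Python sketch (the function name and the click-thru numbers are hypothetical, just for illustration):

def pct_increase(before: float, after: float) -> float:
    """Percentage increase from a before value to an after value."""
    return (after - before) / before * 100

# Hypothetical example: click-thrus per campaign rose from 120 to 150.
print(pct_increase(120, 150))  # 25.0, i.e., "a 25% increase"

If a respondent can share even rough before-and-after numbers, you can compute or verify the percentage yourself rather than relying on their estimate.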

Here are some examples of phrases to get you started:

“As a result of taking this training, have you experienced a decrease in…

…the cost of [X,Y,Z]?”

…employee turnover?”

…number of claims?”

…number of errors in [A,B,C]?”

…number of complaints?”

…complaints about [A,B,C]?”

“As a result of taking this training, have you experienced an increase in…

…productivity?”

…sales?”

…profitability?”

…frequency of orders?”

…amount per order?”

…repeat business?”

…employee retention?”

…employee satisfaction?”

…customer satisfaction?”

…customer retention?”

“As a result of taking this training, have you experienced a savings in [X,Y,Z]?”

“As a result of taking this training, have you experienced enhanced creativity?”

“As a result of taking this training, have you reduced…

…waste?”

…re-work?”

…accidents?”

“As a result of taking this training, have you cultivated innovation?”

“As a result of taking this training, have you shortened your time to market with new products or services?”

“What other business outcomes have you experienced as a result of taking this training?”

Most importantly… after each question, ask for specifics:

How many? By how much? By what percentage did this change?
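One way to make sure every question travels with its probe-for-details follow-up is to keep the two paired. Here’s a hypothetical sketch in Python (the wording is adapted from the examples above; this isn’t a real survey tool):

# Each entry pairs a yes/no stem with its "probe for details" follow-up.
questions = [
    ("As a result of taking this training, have you experienced an increase"
     " in the number of click-thrus for your broadcast e-mails?",
     "By what percentage have your click-thrus increased?"),
    ("As a result of taking this training, have you experienced a decrease"
     " in employee turnover?",
     "By how much did turnover decrease?"),
]

for stem, follow_up in questions:
    print(stem)
    print("  Follow-up:", follow_up)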

And of course you’ll want to emphasize that the data you collect is strictly for evaluative purposes — you don’t need specific financial or other figures; you just want some indication of the effect the training has had. Most members won’t release data that’s confidential to their company anyway, and some might be reluctant even to share that the training has made a business-side impact. That’s okay. Find out what you can from those who are able to share, and consider yourself lucky to have that.

If the results are particularly stunning, follow up with individual respondents to see if you can use a quote from them for reports to the education committee, board of directors, or even in marketing materials. Offer to show them the quote and obtain their permission before releasing it. Being able to use specific testimonials is a plus, though the real purpose of conducting this evaluation isn’t marketing.

When you have enough responses, aggregate the data so you can see the overall picture: how did learners benefit in general from the session? Was one objective particularly valuable? Was there any learning objective that seemed especially challenging? Why?
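If your responses end up in a spreadsheet export, even a few lines of script can surface that overall picture. Here’s a minimal sketch in Python, with hypothetical field names and made-up numbers, just to show the idea:

from statistics import mean

# Hypothetical responses, one dict per returned evaluation.
# "pct_change" holds the answer to "By what percentage did this
# change?" (None when the respondent couldn't share a figure).
responses = [
    {"increase_clickthrus": "yes", "pct_change": 25.0},
    {"increase_clickthrus": "yes", "pct_change": 10.0},
    {"increase_clickthrus": "no",  "pct_change": None},
]

yes_rate = sum(r["increase_clickthrus"] == "yes" for r in responses) / len(responses)
reported = [r["pct_change"] for r in responses if r["pct_change"] is not None]

print(f"{yes_rate:.0%} reported an increase")             # 67% reported an increase
print(f"Average reported change: {mean(reported):.1f}%")  # 17.5%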

Thank every respondent, especially if their names are attached to the evaluations you get. Let them know how much you appreciate their feedback, which helps you continue to improve the program. A simple thank you goes a long way!

Has your organization conducted Level Four evaluations? How have you conducted them? What did you learn from the results? We’d love to hear your stories here at aLearning!
