aLearning Blog

Online Learning for Trade Associations

Posts Tagged ‘surveys’

Pulling Back the Curtain on the 2011 aLearning Association Survey

Posted by Ellen on October 15, 2011

A few things about the recent aLearning Association Survey that you need to know, and that I need to get off my chest.

Being a lifelong learner doesn’t just mean continuing to take classes, stay informed, and all of that… it also means being willing to learn from each experience. Maybe the “post mortem” experiences I had as an elearning project manager for a custom content development company have become a part of the way I naturally do things, so that after nearly every experience I think: “How did that go? What went well? What didn’t? What can I do differently next time to make the experience/situation go better?”

I’ve already mentioned that I didn’t ask at least one question clearly enough to get numbers that could be used to answer a few other questions as well… something that will be corrected next time.

And here are some other questions you might be asking about the survey itself:

Why Was the Survey So Short??

Everything aLearning does is the result of balancing what we can do ourselves, get for free or accomplish at minimal expense. There are a lot of reasons for this, and one of them is to demonstrate to you (practicing what we preach) what can be done with a shoestring staff and budget.

So we use the free version of Survey Monkey for our surveys. This limits us to 10 questions and requires that any report breakdowns be done manually, which is what we did to give you the results by category. I can see only a summary of all responses or anonymous, individual responses; there are no built-in reports.
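For anyone curious what a manual breakdown involves: once the raw responses are exported, a few lines of script can tally one question's answers by category. Here's a minimal sketch in Python, assuming a hypothetical CSV export with illustrative column names (`org_size`, `answer`) — not Survey Monkey's actual export format:

```python
import csv
from collections import Counter, defaultdict

def breakdown_by_category(csv_path, category_col="org_size", answer_col="answer"):
    """Tally one question's answers, grouped by a category column.

    Returns a dict mapping each category value to a Counter of answers.
    Column names are illustrative placeholders, not a real export schema.
    """
    tallies = defaultdict(Counter)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            tallies[row[category_col]][row[answer_col]] += 1
    return tallies
```

That's essentially the by-category summary we assembled by hand, just automated.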

I suppose we could have done a few surveys of more depth that focus on one area (and maybe we’ll try this in the future), but we’re also aware that you get requests to do a LOT of surveys… so we like to keep them short and easy to complete.

Using the free version of Survey Monkey is also why we’re unable to automatically enter everyone who completes the survey into the drawing — and why you have to let me know separately that you’ve completed the survey and want to be entered. As with all things, we get what we pay for, and in this case, the price is still worth what we get in return.

How Did You Decide What to Ask?

Curiosity. We’re not motivated by anything else. We don’t have sponsors or advertisers or others who might have agendas, so we aren’t in danger of being wooed into asking questions that could support some hidden motive. In this case, we tried to ask questions that you might want to know about what others are doing.

How Many Respondents Did You Get?

As we said, too few respondents answered our call to offer what could have been a nice benchmark, unfortunately. We had fewer than 100 responses, and of those, some dropped out of the questionnaire before answering all questions.

Why Did So Few Respond?

That’s a good question and it has plenty of answers. We put the word out via the blog and through direct e-mails to aLearning supporters and contacts. Thanks to those of you who promoted the survey via your own blogs, Tweets, and other means, we saw a “lift” in the number of respondents. We’re grateful for the word-of-mouth attention we got, and hope to see this sort of support in the future. The endeavor isn’t large enough to afford other means of promotion, so we’re happy to see this sort of response.

What’s It All Mean?

So what does all of this mean? Well, on the downside, fewer respondents than we would have liked means results that don’t provide reliable benchmarking.

On the upside, we’re able to respond individually to the requests for information we get and otherwise form friendly bonds with the aLearning blog readers. We got enough responses to provide a broad picture of what various organizations are doing based on the size of the organization, and to look at the data from a few other perspectives as well.
We don’t accept advertising and — despite lots of requests to allow guest bloggers who seem to want to promote some specific LMS or learning service — remain independent. Our independence means we can say what we want without worrying about alienating sponsors, advertisers, subscribers, clients, dues-paying members or other entities.

The only agenda you’ll find at aLearning is advocacy for association learning leaders (aka: YOU). The smaller your organization, the closer you are to aLearning’s heart.

So we don’t attract hundreds or thousands of survey participants. We often see several posts go by without a comment. But that’s okay. We’ll continue to provide what we can in the best way we can do it….

…and to serve as an inspiration along the way, we hope!

And though the summary of the results is now concluded (those of you who requested a report will get one soon), we hope this opens a dialogue — let us know what you’re doing that might not have been mentioned. Feel free to clarify any of the responses you gave, if you think they need it.


Let’s open the door as wide as we can, so that everybody who wants to improve their elearning and social learning options can get into the room :)

Posted in aLearning Strategies, aLearning Surveys, eLearning Resources | 2 Comments »

2011 aLearning Association Survey Results Summary — Part 4

Posted by Ellen on October 14, 2011

If you’ve been following our recent posts that summarize our 2011 survey, you’ve seen that organizations of all sizes are leveraging online learning in some way or another. (Click here to see part 1 covering profiles and budget, here for part 2 on elearning programs, here for part 3 on social learning.)

But how are associations and other non-profit organizations making decisions about which programs to pursue? Do they have a strategic plan? Do they have a different method they follow?

Again, results were scattered. But, again, there’s a lot we can learn from taking a look at them.

Half or more of responding organizations have some sort of method for planning educational programs (click the image to see it enlarged):

Here’s the question that was asked: “Do you have a strategic plan for your association’s educational offerings? If not, how do you decide how and when to make changes regarding your educational offerings?”

Many respondents didn’t seem to see a distinction between getting input from an education committee (to use one example) and creating and implementing a strategic plan for the education function. Other organizations were quite clear about the differences, saying (for example) they were in the process of developing a strategic plan.

What are the different methods for deciding how and when to make changes in educational offerings? Here are some responses:

  • “courses are evaluated on an ongoing basis by the education committee”
  • “an annual education plan”
  • “analytic and sales results judge whether programs are implemented”
  • “content changes/edits occur at every event, different volunteers lead the program content, including Webinars”
  • “Our decisions about educational offerings are guided by our association’s overall strategic plan, which includes some direct strategic directions related to education and online engagement.”
  • “input from committees, board and membership”

So does it really matter whether you evaluate your programs in these ways or have a more formally created (and attended to) strategic plan?

We were curious about this, so we decided to compare the changes organizations anticipate in the next year against whether they have a strategic plan (or follow the organization’s overarching strategy).

See what you think. Does having a strategic plan make a difference?

Certainly major decisions — about whether to incorporate an LMS or get a new one, for example — can be made without a strategic plan.

But as you can see, organizations with a plan had a greater variety of anticipated changes — from implementing mobile learning to adding virtual experiences into the mix.

Did you also notice that organizations with a strategic plan are adding education-dedicated staff members?!?!?

I sure did.

One of the biggest challenges paid education staffers face is limited time. With only so many hours in a day, it’s hard to get everything done. So when the case can be effectively made to add personnel, it’s worth celebrating.

Can such a case be made without an education strategy? Probably. And of course this survey wasn’t designed to try to show a causal relationship between having a strategic plan and being able to hire additional staff (or purchasing an LMS, or making other significant changes), but there does seem to be some relationship between them.

So if you’re thinking you’re okay moving from event to event, making changes here and there, adding a program and subtracting one as the numbers seem to fluctuate… think again. Are you really moving your organization forward in leaps and bounds toward a clear destination, or inching it along to who knows where?

Your organization is relying on you to lead them. Don’t let them down.

My sincere thanks to all of the survey participants, and special congratulations to Mary Beth Ciukaj, Director of Education for the Council of Residential Specialists in Chicago, who won a signed copy of aLearning: A Trail Guide to Association eLearning.

More general comments about the survey next time, then I’ll put the survey and its results to rest.

Posted in aLearning Strategies, aLearning Surveys, aLearning Trends, Financing eLearning, Justifying aLearning, LMS, Social Learning | 1 Comment »

2011 aLearning Association Survey Results Summary — Part 3

Posted by Ellen on October 13, 2011

Once again, our sincere thanks to the many association learning leaders who responded to our request to participate in the 2011 aLearning Association Survey and to those who promoted it. While we had the best response yet to an aLearning Survey, the number of responses wasn’t high enough for us to confidently suggest that the results serve as any sort of benchmark. Instead, we recommend that you use this summary as a way of seeing what other associations and non-profit organizations are doing in the way of online learning.

Past posts have summarized profiles of the survey participants, their staffing, budget, and online programs.

In this post we’ll take a look at how many of the respondents are using social learning.

Respondents were given these answer choices to a question about how often they have been using social learning:

Every Event
Every Online Event
Every Face-2-Face (F2F) Event
Some Online & Some F2F
Sometimes for Online Only
Sometimes for F2F only
Tried it but haven’t used it consistently
Have Never Used It

For summary purposes, we’ll use the following abbreviations:

EE = Every Event
EO = Every Online Event
EF2F = Every F2F Event
SOSF2F = Some Online & Some F2F
SO = Sometimes for Online Only
SF2F = Sometimes for F2F Only
T = Tried it but haven’t used it consistently
N = Have Never Used It

Remember, respondents were asked to use the number of members served, rather than the number of memberships, to identify the size of their organization. (For example, a trade organization with 500 institutional members that serves 5000 individuals should have identified itself as an organization in the 3001-6000 category.) We can’t be sure all respondents followed this request, but we’re trusting that they did :)
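The mapping from individuals served to a size category is a simple bucket lookup. As a sketch (band boundaries taken from the survey’s own categories; the function name is just illustrative):

```python
# Upper bounds for each survey size band; anything above the last is "10000+".
SIZE_BANDS = [(500, "1-500"), (1000, "501-1000"), (3000, "1001-3000"),
              (6000, "3001-6000"), (10000, "6001-10000")]

def size_category(individuals_served):
    """Return the survey size band for the number of individuals served."""
    for upper_bound, label in SIZE_BANDS:
        if individuals_served <= upper_bound:
            return label
    return "10000+"
```

So the trade organization in the example, serving 5000 individuals, lands in the 3001-6000 band regardless of its 500 institutional memberships.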

Take a look at this table showing how various organizations are (or aren’t) using social learning elements with their programs:


I don’t know about you, but a few things stand out for me:

  • A lot of organizations, regardless of the number of members or staff size, have incorporated social learning in some way. And while we might assume that the larger organizations are more aggressive in this area, our results don’t support that assumption.
  • Some organizations have opted to incorporate a social learning component with every event; it seems that this would only happen if the benefit of doing so had proven well worth the additional time and resources required.
  • Social learning components are primarily tied to face-to-face events, rather than online events.

This last item is a bit puzzling… Maybe social learning isn’t being implemented as an element of online events because those events are structured to allow for interaction with others — so there is no perceived need for a supporting “social” element. But that wouldn’t explain why, then, a “social” element would be desirable to supplement a face-to-face event, where — presumably — people are able to meet and talk one-on-one. Hmmm! I confess to expecting to see social learning linked to asynchronous events, as those tend to be situations with solitary learners. Supplementing them with social learning elements seems to make sense, don’t you think?

Some of those who commented remarked that they use Twitter but in a general way, rather than tied to specific programs. Another respondent remarked that every event incorporates social learning because all elearning is connected to their organization’s social network. Yet another said they require a social component within a formal, online certification program.

As usual, these variations indicate that associations and other organizations are navigating their way along the social learning and elearning paths… but what big changes do they see coming in the next year?

For that insight, watch for our next post, summarizing more of the survey results.

Posted in aLearning Strategies, aLearning Surveys, aLearning Trends, Asynchronous Learning Types, Justifying aLearning, Social Learning, Webinars | 1 Comment »

2011 aLearning Association Survey Results Summary — Part 2

Posted by Ellen on October 12, 2011

Last post we looked at the general profiles of respondents to the recent aLearning Association survey and some of the outliers we noticed in the data. We also summarized the educational staffing and general budget information.

In this post we’ll take a look at how respondents are spending their money. As has been the case throughout, you’ll notice what we did: use of online and social learning is uneven. Some organizations are neck-deep while others are not involved at all. The survey didn’t explore reasons, but differences in how education supports each organization’s strategy probably account for most of the cases.

Remember, we asked that the size of the organization be identified by the number of individuals served in the membership, even if the organization is a trade association with institutional members.

The pattern is easy to spot (click the table to see it larger):

The larger the organization, the more likely it is to be involved in synchronous and asynchronous learning. Remember that these include Webinars and Webinar recordings. (Though we asked respondents to differentiate via a sub-question, many didn’t make this distinction, so it’s entirely possible that the only synchronous elearning being delivered is via Webinar and that most asynchronous elearning consists of recorded Webinars.)

Of course what’s most interesting is the uneven implementation of blended options. First, it should be noted that although none of our respondents in the 501-1000 category uses blended learning, we can’t conclude that no organization of this size uses it; nor, because half of our respondents in the 1001-3000 category report using blended learning, can we conclude that half of all organizations of this size use it. (As much as we hoped to have adequate responses for true benchmarking, we didn’t…. To everyone’s disappointment, I’m sure.)

The uneven implementation of blended learning has (we believe) everything to do with the various types of “blend” that are going on. Here are various ways some of the respondents described their use of blended instructional modes:

  • Online forum discussions before & after face-to-face events
  • Webinars with structured face-to-face activities
  • Face-to-face programs with follow-up Webinars
  • Recordings from face-to-face programs made available online
  • Live sessions from the annual conference streamed online
  • Incorporating Webinars, online workspace and conference calls into a year-long training program

So what does all of that say about using social learning across associations? Our next post covers what the survey revealed about that.

Posted in aLearning Strategies, aLearning Surveys, aLearning Trends, Asynchronous Learning Types, Justifying aLearning, Webinars | 2 Comments »

2011 aLearning Association Survey Results Summary — Part 1

Posted by Ellen on October 11, 2011

A heartfelt THANK YOU to everyone who completed the recent aLearning Survey for Associations and to those who helped promote it! We were thrilled to see more respondents to this survey than those in the past, although we were disappointed that we didn’t achieve the numbers desired for it to be a reliable benchmark.

Even so, the results are revealing and worth a close look. Those of you hoping to use the results as a benchmark will find some valuable insights as you compare your elearning status to other organizations.

It was clear at the start that one of the Profile questions might not have been worded correctly for accurate responses. We’d hoped to get some kind of ratio for the number of paid staff members to the number of members served (in the case of trade organizations, individuals served, rather than institutions). But when we saw organizations listing their size as “1001-3000” saying they had 100 (in one case) and 300 (in another) staff members fully dedicated to education, we knew something was off. And when we saw an organization of 1-500 members say they have an education staff of 300, we guessed that these responses weren’t very reliable. (It’s possible that volunteer-driven associations count all of their volunteer education leaders as staffers… but that’s just a guess for why the numbers seem off.)

Despite some outliers, generalities can be made.

That said, here’s the first installment of a series covering the results of the 2011 aLearning Association Survey.

Organization Size and Education Staffing

Some respondents completed the initial profile information, then opted out of the additional pages for various reasons (in one case, the respondent was a vendor and realized her responses would skew results). Respondents who did not complete the full survey have been omitted from this summary.

The single largest group of respondents came from organizations representing more than 10,000 individuals, and the second largest group serves 1001-3000 members. Generally, the respondents were pretty evenly spread across all sizes of organizations. Organizations representing more than 10,000 individual members were asked to note the specific number, and (of those responding) these ranged from 20,000 up to 180,000.

You’d expect this would mean that these organizations are also all over the board in their other responses, and you’d be right.

We asked how many staff members in the respondents’ organizations are dedicated full-time to education, including directors, meeting planners, and support personnel. The numbers were all over the place, as already mentioned, so we have to be careful in interpreting the answers. But here’s what’s interesting:

1-500 members: 1-5 education staff members
501-1000: 0-9
1001-3000: 0-200 (or take your pick: 0-100; 0-15)
3001-6000: 4
6001-10000: 1-13
10000+: 1/2 – 100+

Talk about all over the place! If we take the most conservative numbers, a single staff member could be representing anywhere from 50 to 80,000 individuals! That’s quite a range. (The 80,000 number comes from a respondent who listed individuals served as 40,000 with one person dedicated 1/2 time to education.)
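For the record, that ratio is just members served divided by full-time-equivalent education staff; the extreme quoted above works out as follows (the function name is purely illustrative):

```python
def members_per_fte(members_served, fte_education_staff):
    """Members represented per full-time-equivalent education staffer."""
    return members_served / fte_education_staff

# The 80,000 figure: 40,000 individuals served by one half-time staffer.
high_end = members_per_fte(40000, 0.5)  # 80000.0
```

Counting a half-time person as 0.5 FTE is what doubles the effective load, which is why that respondent tops the range.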

Budget

What surprised me the most about these results was the number of respondents who didn’t know what percentage of their organization’s overall budget is dedicated to education. Also surprising were those who said they didn’t know whether their education funding would be increased, decreased, or stay the same in the next year. (More on this in a future post.)

Here are the ranges from  those who did answer:

1-500 members: 5-70% of the organization’s budget is dedicated to education
501-1000: 0-100%
1001-3000: 1-30%
3001-6000: 30%
6001-10000: 8-50%
10000+: 4-80%

Budget is always rough territory — so much depends on the organization’s mission and how critical education is in supporting the organizational strategy. So we expected some range within these numbers.

The question is: what are you doing with those funds, and how are you deciding what to do with them?

So let’s take a closer look at two specific respondents from the 1-500 member category:

  • Respondent 19 said their education funding is just 5% of the overall budget. They have 1 individual fully dedicated to education, yet they offer up to 11 online synchronous and 2 blended events each year. They’ve tried social learning but haven’t fully implemented it. They expect their education funding to increase in the next budget cycle.
  • Respondent 15 said their education funding is 70% of their overall budget, but they aren’t doing any synchronous, asynchronous, or blended learning events (including Webinars). Their focus (it seems) is completely on in-person, face-to-face events. They expect their education budget to remain the same for the next year. Like Respondent 19, they have one fully-dedicated education staff member. They’re doing a bit more with social learning by incorporating it with some of their face-to-face events.

Of course, lots of unknowns are in play here: even a 5% budget can be larger than someone else’s 70%… educational needs aren’t always best met online… etc etc.

BUT:

Unless Respondent 15’s organization is reaching 100% of their membership with face-to-face events (and maybe they are) they could be leveraging online learning more effectively than they are. Do they have a plan? No. Does Respondent 19 have a strategic plan for their organization’s educational offerings? Yes. (And which organization is getting an increase in funding?!?)

Maybe this is the real difference between the two.

Anticipated Budget Changes

What about the organizations’ expectations regarding whether their budget will increase, decrease, or stay the same?

1-500: 50% of respondents expect an increase; 50% expect their budget to remain the same
501-1000: 50% increase; 50% stay the same
1001-3000: 60% increase; 30% stay the same; 10% decrease
3001-6000: 100% stay the same
6001-10000: 75% increase; 25% stay the same
10000+: 30% increase; 70% stay the same

In a time of cutbacks all around, it’s great to see educational initiatives holding their own or, even better, seeing their funding increased. We can guess that this means more organizations are appreciating the value that good educational programming brings to the organization.

So what are organizations doing with their money when it comes to elearning and social learning?

Details on those next time….

Posted in aLearning Strategies, aLearning Surveys, aLearning Trends, Financing eLearning, Justifying aLearning, Online Learning in General | 4 Comments »
