How Do You Know They Know? Evaluating Adult Learning

How can you evaluate training effectiveness? “Happy Sheets” don’t go far. Outcome-based evaluations reach Level 2 of Kirkpatrick’s evaluation model.

by Guila Muir
info@guilamuir.com

I continue to be surprised at the use of “Happy Sheets” as evaluation tools in training. Beyond letting the trainer know if he or she was loved and if the room was too cold, what else do they tell us?

In 1959, Donald Kirkpatrick developed his famous model of training evaluation. Since then, it has provided basic guidelines for assessing learning. Experts have found that 85% or more of all training programs use “Happy Sheets,” which reveal nothing about actual learning. And because data becomes much harder to collect and attribute directly to the training the deeper you go, fewer than 10% of training programs use a Level 4 evaluation.

| Level | Issue    | Question Answered                              | Tool                                         |
|-------|----------|------------------------------------------------|----------------------------------------------|
| 1     | Reaction | How Well Did They Like The Course?             | Rating Sheets                                |
| 2     | Learning | How Much Did They Learn?                       | Tests, Simulations                           |
| 3     | Behavior | How Well Did They Apply It To Work?            | Performance Measures                         |
| 4     | Results  | What Return Did The Training Investment Yield? | Cost-Benefit Analysis (Return on Investment) |

Outcome-Based Evaluations

By creating and using an evaluation based on the course’s learning outcomes, you can get closer to an honest answer to the question “How do you know they know?” This is evaluation at Level 2 of Kirkpatrick’s model. Typically, an outcome-based evaluation asks participants to rate their own ability to perform the learning outcome, as in the following example:

As a result of this training, please rate your ability to do the following action from 1 (“I can’t do this at all”) to 5 (“I feel totally confident doing this”): “I can explain at least five features of the Get Fit program without using notes.”

In many cases, the outcome-based evaluation would also ask the participant to list or explain those five features, in this way acting as a test.
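To make the shape of such an item concrete, here is a minimal Python sketch. The class name, fields, and validation rule are all invented for illustration (no particular survey tool is assumed); it models the 1-to-5 self-rating above, plus a free-text field that lets the item double as a test:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutcomeItem:
    """One outcome-based evaluation item (hypothetical structure)."""
    outcome: str                       # the learning outcome being rated
    self_rating: Optional[int] = None  # 1 = "can't do this at all" ... 5 = "totally confident"
    evidence: str = ""                 # optional free-text answer, acting as a test

    def rate(self, rating: int) -> None:
        # Keep ratings on the 1-5 scale from the example above.
        if not 1 <= rating <= 5:
            raise ValueError("rating must be between 1 and 5")
        self.self_rating = rating

item = OutcomeItem(
    outcome="I can explain at least five features of the Get Fit "
            "program without using notes."
)
item.rate(4)                                  # the participant's self-rating
item.evidence = "Tracks steps; logs meals"    # the participant lists features
```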

Keep in mind that unless you ask additional questions, you are still simply collecting data on your participants’ perceptions of their own learning. Sadly, those perceptions of learning are usually much higher immediately after the training session than a few days or weeks later. This is why follow-up training and reinforcement are so important.

Nonetheless, using an Outcome-Based Evaluation can provide information on:

  • Performance issues about which the participants feel less confident.
  • Issues you could improve or clarify for the next round of training.

All of this data is valuable to you as you (1) improve the class itself, and (2) follow the participants into the workplace to observe and support them. We invite you to download free examples of Outcome-Based Evaluations from Guila’s book, Instructional Design That Soars.
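As a rough sketch of point (1), the Python snippet below averages each outcome’s 1-to-5 self-ratings across participants and flags the outcomes where confidence is lowest. All data, outcome names, and the 3.5 cutoff are invented for illustration, not a standard from Kirkpatrick’s model:

```python
from statistics import mean

# Each response maps a learning outcome to one participant's 1-5 self-rating.
responses = [
    {"explain five Get Fit features": 4, "demonstrate the enrollment screen": 2},
    {"explain five Get Fit features": 5, "demonstrate the enrollment screen": 3},
    {"explain five Get Fit features": 4, "demonstrate the enrollment screen": 2},
]

CONFIDENCE_THRESHOLD = 3.5  # an assumed cutoff, chosen for illustration

# Average the ratings for each outcome across all participants.
averages = {
    outcome: mean(r[outcome] for r in responses)
    for outcome in responses[0]
}

# List outcomes from least to most confident, flagging the weak ones.
for outcome, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    flag = "  <-- revisit in the next round of training" if avg < CONFIDENCE_THRESHOLD else ""
    print(f"{avg:.1f}  {outcome}{flag}")
```

The flagged outcomes are exactly the ones to clarify in the next class and to watch for when you follow participants into the workplace.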
