Crowdsourcing learning design with Larry Israelite – ATD2015

For the final session of ATD2015 day 1, session SU401 addressed the topic of “crowdsourcing”. According to our speaker Larry Israelite, the learning world has a problem: Often, we are not in the right place, nor fast enough, to respond to business needs. Crowdsourcing can help. “But”, he hears us say, “we already do that. We have subject matter experts who help us.” A good start, but an SME is not a crowd. So how is the concept of crowdsourcing applicable to what we do in learning? Read on…


Who is it good for?
The man sitting next to me works for the U.S. Federal Government’s procurement department. He needs to be sure that people around the world and across departments comply with rules and regulations, follow processes and do a good job. He gets a request for some formal learning programme. He makes it. He facilitates it. Now he wants to know if people learnt something. It’s time to test them.

According to speaker Larry Israelite, our learning designer will have to book a meeting with a (busy) subject matter expert in order to create a test. And after his first design effort, he will no doubt go back to that busy person to correct and refine the test. If he instead asked the crowd to make the test for him, he would get to a perfectly acceptable test much faster.


How does it work?
In 1906, the British statistician Francis Galton observed at a country fair that the average answer of a crowd of about 800 people guessing the weight of an ox was correct to within 1% of the actual answer. He proposed that, provided the size of the crowd reached a critical mass, this would always be the case: The crowd is smarter than the sum of its parts. And he was right.
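
To see why the averaging works, here is a minimal Python sketch (my own illustration, not the speaker’s; the error size and crowd size are assumptions, though the ox’s weight is the figure usually cited for Galton’s experiment). Each person guesses with their own large, independent random error, and those errors largely cancel out in the crowd’s average:

```python
import random

# Minimal simulation of Galton's observation. The error spread (150 lbs)
# is an illustrative assumption, not historical data.
TRUE_WEIGHT = 1198  # lbs; the weight usually cited for Galton's ox
CROWD_SIZE = 800

# Each guess = true weight + an independent random error.
guesses = [random.gauss(TRUE_WEIGHT, 150) for _ in range(CROWD_SIZE)]

crowd_average = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"Average individual error: {avg_individual_error:.0f} lbs")
print(f"Error of the crowd's average: {abs(crowd_average - TRUE_WEIGHT):.0f} lbs")
```

Run it a few times: a typical individual is off by well over 100 lbs, while the crowd’s average usually lands within a few lbs, because errors in both directions cancel each other out.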

To see the principle in action, our speaker asked us to make a test together using an online tool from Smarterer. The 200-odd attendees created quiz questions on “the 80s” in different categories (trends, movies, hair bands). Then we corrected the test questions that other people had written: Are the answer options correct? Is the question clear? Are there issues with the answers cited as correct? And so on. Within about 5 minutes, we had a test of 300 questions, signed off by over 100 people.
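
As a thought experiment (this is not how Smarterer actually works; the names and thresholds below are my own hypothetical model), the review step can be pictured as a simple approval tally: a question makes it into the test once enough reviewers have signed it off and nobody has flagged a problem.

```python
from dataclasses import dataclass, field

# Hypothetical crowd-review model -- my own sketch, not Smarterer's mechanics.
APPROVALS_NEEDED = 3  # assumed sign-off threshold

@dataclass
class QuizQuestion:
    text: str
    options: list[str]
    correct: int                                    # index into options
    approvals: int = 0
    flags: list[str] = field(default_factory=list)  # reviewer objections

    def review(self, ok: bool, comment: str = "") -> None:
        if ok:
            self.approvals += 1
        else:
            self.flags.append(comment)

    def accepted(self) -> bool:
        return self.approvals >= APPROVALS_NEEDED and not self.flags

q = QuizQuestion("Which band released 'Livin' on a Prayer'?",
                 ["Bon Jovi", "Poison", "Whitesnake"], correct=0)
q.review(ok=True)
q.review(ok=True)
q.review(ok=True)
print(q.accepted())  # True: three sign-offs, no flags
```

The point of a design like this is that no single busy SME is a bottleneck: any reviewer’s sign-off counts, and disagreement surfaces as a flag rather than a meeting.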

My first reaction was: This is awesome! Crowdsourcing is brilliant. Where is the app for this? I want it!

Then I thought a little more…


Firstly, what about skills?
My neighbour made a knowledge-based training programme. For him, it might be interesting to test that knowledge. But how much do I really care about knowledge testing? How can I get the crowd involved in skills assessment?


Actually, do I even care about testing at all?
If I slow down a bit and think about the final result I want from my learning initiative, it is hardly ever (never?) really about people passing a test. What I want is for people to do what they are supposed to do, to get the business results I need. Provided they are consistently doing that, do I really care what they learnt or how? Wouldn’t it be better to put the crowd’s wisdom and resources into supporting actual performance in the real workplace?


And finally: Am I really sold on the wisdom of the crowd?
In Francis Galton’s original crowdsourcing experiment, the participants were “country folk” living in an era of agriculture and farming. They might have known a thing or two about oxen. And they could see the ox: a physical thing, “weighed up” in real facts.

But today, we were talking about pop culture, movies and random 80s opinions. There was no ox in the room and the questions did not concern physical, factual attributes. Yes, we made a test together and yes, we agreed on the questions and answers. But were we right? And if we are not yet sure and this has to be checked, then didn’t we just defeat the whole (speed) mission of crowdsourcing the test creation in the first place (instead of just asking an “80s SME”)?

I suppose, therefore, that this question of crowdsourcing expertise and testing is not about speed and test answers at all, but about trust and control. Can our U.S. Federal Government learning designer put his faith in the crowd of government employees to make his test? Or will he feel the continued need to control and verify everything with someone who knows best?

To be or not to be, THAT is the question.

Should I ask the crowd?

Author

Dan Steer is a freelance trainer at Kluwer and a learning & development consultant. Dan Steer is an Infinite Learning© champion, driven by everything to do with SoMe, SoLearn and Enterprise 2.0.
