Ask Why before Measuring

Recently I had the following scenario put forward to me:

If someone asked you: “How do I get started thinking about and moving towards a measurement-based approach to training?” How would you answer them?

Why?

My first question to them would be “Why?”
“Why are you feeling the need to move towards a measurement-based approach to training?”
To be honest, it’s a trick question.  Seeking to have a measurement-based approach to training is wrongly focused.  Training (and all learning) should be focused on improving specific and measurable organizational and/or individual performance.  Measurement (and data collection, metrics, and learning analytics) is in service of a performance-based, data-driven approach to training and learning.
However, there is value in the question of why a measurement-based approach to training is desired.  The answers, some worse than others, will reveal how much work is to be done.

Nice Try, But Try Again

The really bad answers will be along the lines of:

  • My supervisor asked me to look into it
  • Every other department has great looking graphs and charts
  • At DevLearn/ATD-ICE/HR Tech Conference everyone was talking about measurement and metrics
  • xAPI enables us to measure all kinds of things

The problem with these answers should be obvious.  Taking on any initiative to “keep up with the Joneses” or to meet an unrationalized task given by a supervisor is a waste of time and money.  Without a purpose, you’re simply taking stabs in the dark hoping to land on something valuable.  My response to these answers is “don’t waste your time.”
Go back to the drawing board and determine if there is a business need to drive your exploration of measurement or forget about the initiative completely.  With no real rationale, it’s doomed to failure.

That’s Better, but…

There is a set of “middle ground” answers that are headed in the right direction.  Yet they fall short of providing an answer to the question “Why?”

  • The stakeholders for our new sales training want to measure the success of our solution by sales metrics (percentage of deals closed in first three months, overall increase of sales per representative)
  • We know we aren’t capturing Kirkpatrick Level 3 and 4 data and have to figure out how to do that quickly.
  • Finance won’t accept our budget requests without an explanation of how we are going to determine if each program is meeting its operational and financial goals.
  • We keep reading industry reports that say senior leadership of most organizations feel they aren’t getting adequate data from learning and development.  We approached our senior management and found out they feel this way as well.  But we don’t know what data we need to gather to satisfy them.

At this level of understanding,  the push for measurement is coming from external (to L&D) agents – various stakeholders, industry thought leadership,  organizational gatekeepers (finance/IT), and others.  These answers still reflect a reactionary stance regarding how we report on our initiatives.  Input from these external agents is important.  But it should be input, not strategic direction.  We need to synthesize this input and build a coherent and achievable strategy for projects and learning as a whole.

This may seem a little overblown, but generally, it’s not.  We are just at the beginning of the transformation of L&D to being evidence-based and data-driven.  Most of us don’t yet understand the nuances of performance measurement tied to business objectives.

Purposeful Strategy

To this point, asking “why?” in answering the original question has served to identify 1) a lack of true business goals, 2) a scattered, unfocused approach to data collection and measurement, and 3) the potential resources and roadblocks to performance-based, data-driven reporting and decision making.

But the real “why?” (or whys) gets at the heart of the purpose of each initiative and the desired change in organizational and/or individual performance.  Once this purpose is fully understood, a preliminary learning strategy and supporting measurement strategy can be developed.  Data collection, measurement, metrics, performance evaluation, and learning analytics are in support of the overall learning strategy. They are the means to an end, not an end in themselves.

It Comes Down to the Data

With all of this said, I still haven’t answered the original question, which, rewritten to address my initial concern, would be:

If someone asked you: “How do I get started thinking about and moving towards a performance-based, data-driven decision and reporting system to support learning here at XYZ Corporation?” How would you answer them?

In general, I advocate choosing one or two new projects that are small-to-medium in scale to serve as a pilot and/or guinea pig.  One, if something goes haywire, it will have less of an impact. Two, you’ll be able to cycle through it faster – validating your new approach more quickly – so you can replicate your success rapidly.

On a more specific level, my response would be: Do you have a Data Strategy?

Have you worked through the following steps?

  1. Figure out who the stakeholders in your project’s success are. What role will they want to play in the project? Who is ultimately responsible for achieving the desired performance change?
  2. Make sure that the requested intervention has a clearly identifiable expected impact upon the business. If the stakeholders can’t define the benefit the change will have on the business, how will you create appropriate learning or performance management experiences?
  3. Understand how the stakeholders for this intervention expect to determine the project’s success or failure. What is the measure of success? Some may be qualitative.  Some may be quantitative.
  4. With your stakeholders, determine how each measure of success should be measured and set a SMART goal. If the measure of success is “increased sales,” will it be measured by region? Individual? Company-wide? Will it be recorded in units? Currency? Signed contracts? Will the goal be an increase? A raw number? A percentage over last period?
  5. Determine the specific data that is needed to complete each measurement. What is it? Where in the process would it occur? Is it quantitative or qualitative? How might you measure it? Does the measure of success have component points of data that must be collected separately and then calculated together? This should be done in an “ideal world” exercise.  Don’t worry about technologies, policies, collection methodologies, etc. at this point. What data would you need to provide the best information possible?
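The five steps above can be sketched as a simple structure to fill in with your stakeholders.  This is a minimal illustration in Python; all of the class names, fields, and sample values are hypothetical, not a prescribed tool:

```python
from dataclasses import dataclass, field

@dataclass
class MeasureOfSuccess:
    """One stakeholder-defined measure, with its SMART goal and data needs (steps 3-5)."""
    name: str                 # e.g. "increased sales"
    kind: str                 # "quantitative" or "qualitative"
    smart_goal: str           # e.g. "+10% deals closed per rep vs. prior quarter"
    data_points: list = field(default_factory=list)  # raw data needed to compute it

@dataclass
class DataStrategy:
    """Captures the answers to the five questions (steps 1-2 plus the measures)."""
    stakeholders: list
    responsible_owner: str    # who owns the desired performance change (step 1)
    business_impact: str      # the clearly identifiable expected impact (step 2)
    measures: list = field(default_factory=list)

    def is_complete(self) -> bool:
        # A strategy is actionable only when every measure traces to concrete data.
        return bool(self.business_impact) and all(m.data_points for m in self.measures)

strategy = DataStrategy(
    stakeholders=["VP Sales", "Sales Enablement"],
    responsible_owner="VP Sales",
    business_impact="Close more deals in reps' first three months",
    measures=[MeasureOfSuccess(
        name="increased sales",
        kind="quantitative",
        smart_goal="+10% deals closed per rep vs. prior quarter",
        data_points=["signed contracts per rep", "rep start date", "deal close dates"],
    )],
)
print(strategy.is_complete())  # True: every measure traces to data
```

The point of `is_complete` is the discipline of step 5: the strategy isn’t actionable until every measure of success traces down to concrete data points.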

At this point, if not before, I’d stop and let them know that while there is much more beyond this, I’m guessing that their head is reeling.  My point is, it all has to begin with the business purpose behind the initiative which needs to be analyzed down to the data points needed.

The five steps I’ve outlined above are required if you want to establish valid measurements that meet the business objective(s) of the initiative.  You will have powerful stakeholder buy-in and a foundation for valid and accepted reporting.  You’ll have the basis for ROI figures that are supported by senior management.  You also will be on your way to becoming a trusted business partner.

Determining the data you need to collect before you begin the design phase of your project is crucial, otherwise your design may leave out critical moments that are needed to generate the correct data.

SECRET: You Already Know How to Do This

You are doing stakeholder analysis already.  You’re already talking to your stakeholders while doing a needs analysis; you’ll just add a few questions about their dreams and aspirations (and ask them to quantify them if they haven’t already).  You do task analysis of the process to be taught. You’ll add a few columns to your task analysis table for information regarding the related data.  You already know how to set SMART goals.

What do you think? Is this doable? Do you agree that sweating the data is worth it?  What would you change?  Why?  Please add to the conversation in the comment section below.

xAPI Resource Center Update

I’ve added 10 new resources to the xAPI Resource Center, including a subsection on Talking to Your Techies on the Statements page.  Your IT contacts will be amongst your most important stakeholders on an implementation of xAPI.  The resources I’ve included are written to be a bridge between non-technical L&D folks and the technical professionals who will have to endorse projects like this in order for them to move forward.  These resources should get them to the point of feeling like they know what xAPI is and able to decide whether they are ready to dig into the technical side of the spec for you.
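For the non-technical side of those conversations, it may help to know that the heart of xAPI is the actor–verb–object statement recorded to a Learning Record Store.  A minimal sketch in Python (the name, email, and activity URL are invented for illustration; the verb URI is from the ADL registry):

```python
import json

# A minimal xAPI statement: actor-verb-object, serialized as JSON.
# The actor and activity identifiers are made up for illustration.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Pat Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/sales-onboarding",
        "definition": {"name": {"en-US": "Sales Onboarding"}},
    },
    "result": {"success": True, "score": {"scaled": 0.92}},
}
print(json.dumps(statement, indent=2))
```

Every interaction you decide to measure ultimately becomes a statement like this, which is why agreeing on verbs and activity ids with your techies early matters.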

Several new resources regarding the cmi5 profile for content update that section, including the launch of the SCORM Cloud Testing Utility.  The remainder are various items I think fit the criteria for inclusion in the Resource Center.

I’m working on two other Resource Centers that I hope to launch this spring.  Watch for opportunities to help me with those as I have a couple of “Work out Loud” activities that I’ll be seeking input on.

As always, your thoughts on xAPI or suggestions for resources I should include in the Resource Center are welcome in the comments section below.

Hey L&D! Your Underpants are on Fire!

This is the second of a 2-post series on the challenges L&D faces and solutions to overcome them.  In the first post, I addressed the challenges.  In this second post, I offer a suggested solution.

In my last post, Hey L&D, Your Underpants (gnomes) are Showing!, I discussed the black box that training and development has historically worked in.  Mimicking a South Park episode, I suggested that we have utilized the following model:

[Image: lnd-underpants-model]

If we’re honest, we don’t have any real clue whether what we create will actually increase the ability of our learners to do their jobs better and impact business goals in a positive fashion.  We’ve gotten away with it because senior leadership knows that talent development is crucial to the organization’s health and ability to deal with ever more complex and changing markets.

But they are being pressured to justify every expenditure in the budget to ensure it is driving business priorities, goals, and profits.  The free pass we in L&D have been receiving is about to be revoked.

We need to overcome what Matthew Syed calls “Black Box Thinking.”  We have been enabled by our organizations and the general culture to cover up and hide from our failures. One example: if we don’t look at long-term retention of information, ignoring the forgetting curve, we can be satisfied with Kirkpatrick Level 1 and 2 responses that say our training was “great” or “just fine.”  Our learners don’t want to point out that they don’t remember what they learned a month ago – because it might make them look stupid.

We have to open up that black box and determine how to work in a new way, and we don’t have much time.  A full-scale overhaul of L&D will take time, and I’m sure that most senior leaders will work with us. If we set out a strategic plan to transform learning in our organizations and get working on the needed changes to make it so, they will give us the time to do it.  But the time to take action is now.

So what do we have to do now that we haven’t been doing?  (NOTE: these actions are interrelated.  This is not a step-by-step process.)

Data, Data, Data

And I don’t mean a compilation of smile sheets. We need data-driven evidence of the business impact of learning.  As CEB put it in a Learning Analytics whitepaper last year:

You can’t defend L&D activities to executives with anecdotal evidence, buzz from hallway conversations, or surveys that show program participants liked their instructors and were satisfied with the learning experiences.

We need to build a comprehensive Learning Data and Analytics Strategy that will enable the collection, storage, analysis, and reporting of data: data about training efficiency, learning effectiveness, target behaviors, and business results.  Some of this data doesn’t exist today, some exists but our stakeholders “own” it (e.g., sales performance data), some might be at our fingertips but we don’t know how to access it, and we may even need benchmarking data from outside sources.  It is vital that we work to develop an understanding of what data we need in order to answer the questions, and then figure out where it will come from.  Is it collected manually or automatically?  Where will it be stored?  Who will analyze the data?  Who needs to see reports of what? When? How?

We’ll have to understand how data, particularly big data, is handled, warehoused, and managed within our organizations.  What software tools do we need?  Are there IT resources to connect APIs?  To write scripts to link programs?
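As a toy example of what “scripts to link programs” can mean at the smallest scale, here is a sketch that joins training records L&D owns with sales figures the sales team owns.  All names and numbers are invented for illustration:

```python
# Training records we own; quarterly sales records the sales team owns.
completed_course = {"rep_a": True, "rep_b": False, "rep_c": True}
deals_closed = {"rep_a": 12, "rep_b": 7, "rep_c": 11}

def average(values):
    return sum(values) / len(values)

# Join the two data sets on the shared key (the rep) and compare groups.
trained = [deals_closed[rep] for rep, done in completed_course.items() if done]
untrained = [deals_closed[rep] for rep, done in completed_course.items() if not done]

print(f"avg deals, trained reps:   {average(trained):.1f}")    # 11.5
print(f"avg deals, untrained reps: {average(untrained):.1f}")  # 7.0
```

In practice the join happens across systems (LMS exports, CRM APIs, a data warehouse), but the logic is the same: a shared key linking our data to the stakeholder’s data.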

Change and Promote a Learning Brand

Even if you don’t think your learning has a brand, it does.  Jeff Bezos, CEO of Amazon, said: “Your brand is what other people say about you when you’re not in the room.”  More importantly, the quality of a brand determines how much people will trust you, how loyal they will be to you, and whether they will have your back when others attack.

What do you want L&D to be known for in your organization?   A trusted business colleague?  A problem solver?  A group that uses evidence to make decisions?  A brand is a consistent message of who you are.  That consistent message is built through conversations and commitments fulfilled.  It is enhanced by actively listening, by collaboratively determining business goals with your stakeholders, and by using data to drive decision making and to report results.

A Learning Culture Focused on Continuous Learning

Move away from event-based training and toward socially energized learning experiences.  Executives and Managers need to understand that helping their employees to enhance their knowledge and skills both specific to their current job and as professionals is a part of being a leader.  L&D needs to provide support, tools, and training so that managers can fulfill this role.  If you can get learning objectives incorporated into the performance management process, all the better.

Help employees to become self-directed learners.  Our employees know that in today’s world it is imperative that they expand their knowledge, hone their skills, take on new challenges, and become better professionals.  We need to aid them in this process: help them learn how to learn, scaffold social learning experiences online and in their everyday interactions, and encourage them to reflect on their work – on their own and with others.  We need to provide them with easy access to the resources they need to do their jobs.

Change Everything about L&D

We need to take a deep and honest look at how we are, or more likely aren’t, meeting the needs of our organizations and ourselves.   This will require the courage to face the fear of admitting past deficiencies and proposing radical change.

This goes to how we design our learning experiences, which need to be grounded in business goals and data-driven decisions.  How we deliver learning needs to move to a multi-faceted, extended, multi-contact sequence of experiences.

We need to take the time and make the effort to learn the businesses we work in.  We must become peers with our stakeholders by understanding how they contribute to the success of the business.  We need to be able to engage in conversations that explore the details of their goals and the knowledge and skills necessary to meet them.  We need to walk the walk of a true business partner.

We also need to open ourselves to new ways of thinking about learning and living in a new world that is fast approaching. The science of learning, neuropsychology, big data, learning analytics, artificial intelligence, and the internet of things are changing the world around us.  Think about how mobile devices have changed not only how we need to design and deliver learning, but also how humans interact with each other and the information they need.  The changes coming are even bigger than the impact of mobile.

Partner with Stakeholders as Never Before

Any major change in an organization that affects all employees (as learning does) is going to be disruptive, difficult, and met with resistance.  In order to effect the changes discussed above, we will need partners who are committed to our success.  Partners who trust that we have their best interest in mind and who will in return have our backs when things get rough.

Change the conversation.  Let them know that you know what their goals are and that you understand their needs.  We need to find an outcome or set of outcomes and related measures for learning that they can feel co-ownership of, because they tie to their business goals.  We have to be consultants to their business.

But simply setting agreed-upon outcomes and measures isn’t enough.  We also must demonstrate a knowledge of the influence chain within our organizations.  Who do they listen to?  Does their boss delegate authority, or do we need to help them advocate to their boss?  Do they defer to another manager or operations group (if IT says it’s ok, then…)?  Are there roadblocks that they may not be aware of?  Knowing the political terrain around the solution will help both of you.  Because L&D isn’t siloed, you may have a view of the organization that they don’t have.

Communicate Early and Often

L&D should have a comprehensive, well articulated Communication Plan.  How do we plan to promote individual programs?  How do we plan to share our brand? What reports or dashboards can we provide to our stakeholders? sponsors? learners?

We need to be transparent with the organization about what we are doing, why, what our goals are, and the results.  Provide reports and dashboards for our stakeholders and sponsors that show how we are contributing to their success.  Work out loud.

What “Phase Two” Should Say

In the end, we want to be able to say what Phase Two involves.  It won’t be easy, but we need to break the cycle of accepting less than we are capable of.  Our new model then would look like this:

[Image: our-new-model]

What do you think?   Do we need to stop being underpants gnomes?  Can we make these changes?  What do you see as obstacles to making these changes?  Please share your thoughts in the comment section below.  I’d love it if you used the sharing buttons below to share this post with your social network. 

Hey L&D, Your Underpants (gnomes) are Showing!

This is the first of a 2-post series on the challenges L&D faces and solutions to overcome them.  In this first post, I’ll address the challenges.  In the second post, I’ll offer a suggested solution.

In Season 2, Episode 17 of South Park, the bitingly satirical animated cartoon on Comedy Central, Cartman discovers that gnomes are stealing his underpants.  He and the boys follow the gnomes to their village and demand an explanation.  The gnomes provide the following explanation:

  Phase 1: Collect underpants
  Phase 2: ?
  Phase 3: Profit

For a very long time, Learning and Development has worked with a version of this business model that goes something like this:

[Image: lnd-underpants-model]

We spend millions of dollars creating and delivering ILT courses, e-learning, blended learning, games, simulations, job aids, webinars, etc. After delivery, we do limited evaluation (mostly at Kirkpatrick Levels 1 and 2), check with our stakeholders to see if they are happy (to which the general answer is some form of yes/maybe when they really mean “I have no clue”), and we hope everyone believes that we’ve added value to the organization.

But we don’t really know. We don’t know if the experiences we deliver are effective at the learning level or the organizational value level.  The crazy thing is Leadership knows we don’t know, but knows that learning is important.  So the conclusion is that we must be having some impact and they approve next year’s budget.

Then there are things we do know.

Like the Forgetting Curve research showing that learners will forget nearly everything they learn within 90 days if it isn’t reinforced regularly.  In past practice, this would include just about everything we’ve tried to teach people.  There are those who refer to this as “Training’s Dirty Little Secret.”
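The forgetting curve itself is often modeled as simple exponential decay (Ebbinghaus’s R = e^(−t/S)).  A quick sketch, with a made-up stability value just to show the shape of the problem:

```python
import math

def retention(days: float, stability: float = 20.0) -> float:
    """Fraction of material retained after `days` without reinforcement.
    `stability` is a hypothetical memory-strength parameter; larger
    values mean slower forgetting."""
    return math.exp(-days / stability)

# Retention collapses without reinforcement:
# day 1: ~95%, day 30: ~22%, day 90: ~1%
for d in (1, 30, 90):
    print(f"day {d:>2}: {retention(d):.0%} retained")
```

Spaced reinforcement effectively resets and flattens this curve, which is the measurable argument for moving beyond one-time training events.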

We also know that by defaulting to an “order taker” mode of operation in which we build the training or tools we’re asked/told to create, we are creating training and tools that won’t have impact.  We fall into this default because we don’t have the time to do a proper needs assessment, we lack the relationship with stakeholders that allows us to challenge them, stakeholders don’t value what L&D does, and/or we don’t understand how the business works.

We know that butts in seats and course completion rates have no actual bearing on learning outcomes or business goals, but we report them earnestly and leadership humors us, then gets back to real business.

This has recently been documented by two research reports released by LEO Learning and LinkedIn Learning Solutions.

In the LinkedIn Learning Solutions report, the No. 1 desired result of training for leadership is proven business impact, but only 8% are seeing this reported.  The No. 2 desired result is ROI, yet only 4% are receiving this for training.

Learners aren’t any happier either. 46% of L&D leaders say they have a hard time engaging learners.

The ability to report business impact is hampered by four key challenges that L&D faces:

  • Having a limited budget
  • Getting employees to make time for L&D
  • Having a small L&D team
  • Demonstrating ROI (lack of data)

Perhaps the most shocking finding in the LinkedIn study is that less than 1/4 of L&D employees would recommend their own offerings!  Yet we are expecting our organizations to see enough value to keep funding us next year.

The LEO Learning report shows that 85% of L&D and talent leaders want to be doing the data analysis to demonstrate learning’s impact on the organization.  77% believe it is possible.  When compared to the single-digit LinkedIn numbers regarding how much leadership is seeing such impact data, the difference is shocking.  Why the huge discrepancy between desire (85%), intent (77%), and execution (8% and 4%)?

The LEO Learning study also asked about how L&D was evaluated.  The results were all over the board – including 14% reporting they weren’t evaluated at all.

[Image: evaluation-of-ld]
Data from LEO Learning, Measuring the Impact of Learning at Work, FEB 2017.

Only 35% of the LEO Learning respondents indicated they are feeling pressure to report on the impact of learning. They quote Mike Rustici, CEO of Watershed, “I think a lot of business leaders just don’t know what’s possible. …They’ve set their expectations very low in terms of what they can hold L&D accountable for and what kind of results they can expect.”

In addition, drawing on data from the 2016 Towards Maturity Benchmark Report: while 93% of L&D leaders say data analysis skills are important, only 31% feel their departments have those skills.

Finally, LEO Learning identified 7 categories of difficulties in measuring the impact of learning:

[Image: challenges-for-lnd]
Data from LEO Learning, Measuring the Impact of Learning at Work, FEB 2017.

The Towards Maturity 2016 Benchmark Report identified this list of barriers to change in L&D:

  • 66% – Cost of development, set-up, and maintenance
  • 62% – Lack of skills amongst employees to manage their own learning
  • 59% – Lack of skills amongst L&D staff to implement and manage e-learning
  • 58% – Reluctance by line managers to encourage new ways of learning
  • 57% – Unreliable ICT infrastructure/low bandwidth/technical restrictions (firewall)
  • 54% – L&D staff lack knowledge about the potential use and implementation of technology

Towards Maturity also reported that 35% of learners find eLearning content uninspiring and 26% say it’s irrelevant to their needs.

To show that these are not new concerns, I went back to an ASTD (now ATD) study sponsored by IBM from 2006 – C-Level Perceptions of the Strategic Value of Learning Research Report.  That report found the following (note: CXO is used to refer to any C-level executive other than the CLO).

CXOs and CLOs agreed on the following:

  1. Learning provides strategic value at the enterprise, business unit, and individual capability level of an organization.
  2. Learning’s value contribution is seen primarily in business outcomes and human capital.
  3. It is difficult to isolate and measure learning’s value contribution to business outcomes.
  4. Perceptions of stakeholders (employees, business unit leaders, and executives) are a key indicator of learning’s value.
  5. The strategic value of learning is increased by strong governance processes for planning, allocating, and managing learning investments.
  6. Learning’s alignment with business needs is indicated by integration, proactivity, and responsiveness.
  7. The learning function’s efficiency can be increased by streamlining and standardizing processes, leveraging technology, and selectively outsourcing components of the function.

CXOs and CLOs differed in the following ways:

  1. CXOs are less concerned than CLOs with quantitative metrics that show learning’s value contribution to business outcomes. CXOs are more concerned with qualitative alignment of learning investment with the changing needs of the business, and with the perceptions of employees and business leaders, than they are with data from ROI studies.
  2. CLOs are more focused than CXOs on improving the performance of business units, through understanding business unit goals and performance gaps, and identifying learning and non-learning solutions to close the gap.

While stated in a different way, you can see the same themes running through these. #5 and #6 clearly show how all parties “agree” to gloss over real inadequacies.  Planning, allocating, and managing a budget are not factors in strategic value.  Integration, proactivity, and responsiveness are not business needs that L&D should align to (they are operational efficiency and relationship building goals). The issues and challenges for L&D haven’t varied much in the past 10-15 years.

For Learning and Development, the black box that is Phase 2 in the Underpants Gnome model glosses over:

  • Apathy of leadership toward holding L&D accountable in a real business sense.
  • Resource allocations that are “ballparked”, often ending up insufficient.
  • Managers who don’t understand the need or role of learning in day-to-day business.
  • Learners who don’t value L&D offerings because they perceive them to be boring, irrelevant, and wasteful of their time.
  • L&D staff who lack the skills needed and aren’t properly trained to do the jobs they are supposed to do.
  • Lack of business acumen amongst L&D staff.
  • Lack of learning acumen amongst operational staff.
  • Urgency that overrides strategic need in L&D resource deployment.

There has been a mutual understanding between all stakeholders that these things are unimportant, too hard to address, or irresolvable.  L&D tends to be cordoned off from the rest of the organization.  And we’ve enjoyed the perceived autonomy it provides.

But there are signposts warning us that this way of operating isn’t going to last much longer.  We need to address these issues – starting yesterday.  Many feel it is an adapt-or-die situation.  But I blogged about this very issue in May of 2004 (adapt or die on eelearning). And we’ve not addressed it yet.  Will we now?

What do you think?  Please share your thoughts in the comment section below.  I’d love it if you used the sharing buttons below to share this post with your social network.