Learning Analytics are about to Go Big League

Will new SEC regulations regarding disclosure of human capital investments have an impact on learning and development?

On August 9, the US Securities and Exchange Commission (SEC) adopted new rule amendments that, among other things, now require all publicly traded corporations to report on human capital metrics and objectives “as far as they are material to the success of the business.”

The interesting aspect of the new regulations is that the disclosures are not prescriptive and do not follow a framework. Rather, the regulations take a “principles-based approach” to what must be disclosed. What this means is that a company is required to disclose any human capital measures that reflect how human capital affects the organization’s ability to generate profit or enable growth. This approach acknowledges that appropriate measurements may vary based on any number of variables, including type of business, regionality, seasonality, etc.

These regulations have been officially under discussion for well over a year in the US. But the International Organization for Standardization (ISO) issued voluntary standards in 2018, which companies have been utilizing. While I haven’t yet seen the metrics the SEC suggests as part of the amendments to Item 101(c), the ISO guidelines prescribe 10 human capital measures, of which 2 are L&D related: the percentage of employees who have completed training on compliance and ethics, and development and training costs. In an article in CLO Magazine in April of this year, Dave Vance argues that these regulatory changes are the catalyst that is going to ignite greater and greater demand for, and scrutiny of, learning metrics and reporting.
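For concreteness, those two ISO measures reduce to simple arithmetic over employee records. A minimal sketch in Python – the record fields here are my own invention, not a standard schema:

```python
# Sketch of the two L&D-related ISO measures: compliance-training
# completion rate and total development/training cost.
# The dictionary keys below are hypothetical, not part of any standard.

def ld_metrics(employees):
    """Return (% who completed compliance training, total training cost)."""
    total = len(employees)
    completed = sum(1 for e in employees if e["compliance_training_done"])
    cost = sum(e["training_cost"] for e in employees)
    pct = 100.0 * completed / total if total else 0.0
    return pct, cost

staff = [
    {"compliance_training_done": True, "training_cost": 1200.0},
    {"compliance_training_done": False, "training_cost": 300.0},
    {"compliance_training_done": True, "training_cost": 950.0},
    {"compliance_training_done": True, "training_cost": 0.0},
]

pct, cost = ld_metrics(staff)
print(f"Compliance completion: {pct:.0f}%  Training cost: ${cost:,.0f}")
# Compliance completion: 75%  Training cost: $2,450
```

The hard part in practice is not the arithmetic but agreeing on what counts as a training cost and keeping completion records clean.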

While the SEC’s principles-based approach leaves much room for interpretation, especially in early implementation, I think Vance is right that it will lead to an expectation of a culture of measurement in workplace learning – and the standard will be the financial impact on the business. Sound familiar?

But this time, Wall Street will be watching. That’s the Big League. Are you ready to play ball?


Do you think these regulations will change anything? Will they drive greater support for data collection in learning? Motivate more collaboration between the business units and L&D?

When Good Social Networks Go Bad

Not all social networks are healthy. What is the impact of unhealthy social networks?

Harold Jarche has a great blog post about trust and social networks entitled “when trust is lost.” In this post he points to how social networks in China erupted when Doctor Li Wenliang, who had identified the coronavirus and was reprimanded by the Chinese government for going public about it, died from the virus. The government had tried to tamp down the news by not sharing information and silencing whistle-blowers.

But once the news of the virus was out and information began to be shared, the world community rallied and seems to be containing the virus’s spread.

Jarche makes the point that trust is vital for social networks to thrive and that, when it is present, networked learning increases trust. I agree with Harold on this.

I think the real linchpin is the information allowed into the system and the social validity that information can achieve. Unfortunately, we can see a negative version of that being played out here in America. If you can subvert or at least call into doubt information – say, that it was Ukraine, not Russia, who tampered with US elections – and you have enough of the social network that will repeat this information as being true, you can subvert the network effect that normally would hone down falsehoods to leave the truth standing free.

Social networks honed down the falsehoods China was building to hide the epidemic, but it’s also clear that China was close to succeeding before Doctor Li’s death broke the story open. Unfortunately, social networks can also give credence to falsehoods and erode trust.

Social networks enable knowledge-sharing but don’t guarantee that the knowledge shared is truthful. Healthy social networks with authentic, service-oriented leaders; that welcome dissent and questioning of current knowledge; and are open to change will tend to weed out falsehoods, build trust in the network and its members, and provide knowledge that can be trusted to the point when other networks may test it and revise it.

But there are social networks whose leaders are self-serving; whose members fail to question “known” knowledge – either out of convenience or by coercion; and which resist changing what they hold to be true. These networks seldom issue information that is “true”; instead they rely on propaganda, diffusion, and bluster to cover the fact that they haven’t vetted their information against all other information. They put on a charade of network behavior that seems to generate truth.

Unhealthy social networks aren’t necessarily negative or nefarious. There are plenty of well-meaning groups whose leaders prefer to lead from authority, not with authenticity; whose members cling to the “truths” they know; and who reject any dissent from group norms. These groups, too, generate erroneous “truths.”

Jarche’s model holds well under the assumption of healthy social networks. What it doesn’t address is when the system has been corrupted and unhealthy social networks begin to change the equation. The challenge is how we can stem the influence of unhealthy social networks – without trampling on their right to believe what they believe.



Gauging the Impact of Learning

The Talent Development Optimization Council (TDOC) releases information about its promising Impact Optimization Model.

The Talent Development Optimization Council (TDOC), led by Kent Barnett, CEO of Performitiv, is beginning to roll out its Impact Optimization Model (IOM), which aims to be “a comprehensive and practical approach to address the issues of impact, business results, ROI and optimization for programs.”

They have published a white paper that outlines the framework for the IOM (it’s free if you provide your information) and a podcast interview with Kent Barnett by Conrad Gottfredson of The Five Moments of Need.

The TDOC is an all-star panel, including Dr. Gary Becker, Dr. Nick Bontis, Dr. Jac Fitz-enz, Fred Reichheld, Jack and Patti Phillips, and Matt Barney.

The goal of the model is “to help professionals create an automated, repeatable process to tell their story of impact, by demonstrating value, while identifying improvement opportunities in their programs.” There are four components to IOM:

  1. Evangelizing the Story from Learning to Impact
  2. Connecting Learning to Business Results
  3. Linking Learning’s Value to the Business
  4. Continuous Improvement

Overall, the model looks quite good. It draws from non-learning-focused business models like Total Quality Management (TQM), Six Sigma, Net Promoter Score (NPS), and others.

The white paper outlines a five-step process for moving from Descriptive Analytics to Predictive to Prescriptive:

  1. Upgrade the traditional learning evaluation process to gather the right data
  2. Identify measurable business outcome indicators (BOIs) tied to the desired outcomes of each strategic program
  3. Gather and process the data to ensure the programs are operating within acceptable impact ranges
  4. Analyze the prescriptive data to find ways to improve impact
  5. Take action on the findings of your analysis and monitor the link to business results from your original performance goals
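The monitoring in step 3 can be sketched as a simple range check on each BOI. The indicator names and thresholds below are illustrative, not taken from the white paper:

```python
# Illustrative sketch of step 3: flag strategic programs whose business
# outcome indicators (BOIs) fall outside an acceptable impact range.
# Indicator names and ranges are invented for illustration.

ACCEPTABLE_RANGES = {
    "sales_conversion_rate": (0.18, 0.30),   # proportion of leads closed
    "avg_call_handle_time": (240, 420),      # seconds
    "customer_retention": (0.85, 1.00),      # proportion retained
}

def flag_out_of_range(observed):
    """Return BOIs whose observed value is outside the acceptable range."""
    flags = {}
    for name, value in observed.items():
        low, high = ACCEPTABLE_RANGES[name]
        if not (low <= value <= high):
            flags[name] = (value, (low, high))
    return flags

observed = {"sales_conversion_rate": 0.15,
            "avg_call_handle_time": 300,
            "customer_retention": 0.88}
print(flag_out_of_range(observed))
# {'sales_conversion_rate': (0.15, (0.18, 0.3))}
```

A flagged indicator feeds steps 4 and 5: analyze why the program is under-performing and act on what you find.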

The white paper goes through these steps in some detail. Overall, the model stands upon various measures and models that are standard business practice for most organizations. This is a great step forward for Learning and Development in its effort to show its value to the business.

However, a significant, and disappointing, amount of the data and analytics still rests upon opinion, self-reporting, and feelings. The authors of the report are clearly aware of this, as they provide a weak defense of having done so:

Some people may complain that is self-reported data, but so is NPS, and this is much more insightful. NPS does not tell you if learning was applied, what business result was impacted or if it created value.

Impact Optimization Model, p.10

The “NPS uses self-reported data” argument is a strawman defense. NPS (Net Promoter Score) is a measure intended to report a group’s perception and opinion of a company, program, or product. By definition it is based upon self-reported attitudes. Most of the measures where the IOM leans upon opinion and perception to generate evaluation data could easily be replaced by behavioral observation measures.
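For context, NPS itself is simple arithmetic over self-reported 0–10 scores: the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / n

# Two promoters (10, 9), two passives (8, 7), two detractors (6, 3).
print(nps([10, 9, 8, 7, 6, 3]))  # 0.0
```

That transparency is exactly the point: NPS never claims to be anything other than aggregated opinion, which is why it is a weak shield for measures that could be behavioral instead.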

For example, the authors suggest that “Are you a more effective manager after this program?” is a valid measurement of impact. But 1) research clearly shows that we are terribly bad at self-assessing our own performance, and 2) there is no assurance that the manager and the L&D team share the same definition of effectiveness. On the other hand, if we have crafted the learning intervention well, we should understand the behavioral changes we were trying to create. If so, then there should be measurable behavior changes that will demonstrate impact.

This last concern may already be in the process of being dealt with by the Council. In a new post on LinkedIn, Barnett explains that they have identified 8 metrics that will support the IOM. They look great, but the Council is still working out the details. If the TDOC can incorporate more financial, business-result, and behavioral data and steer away from opinion-based, self-reported, and perceptual data, they will have a model that could indeed change the game for L&D as a strategic function within organizations.



Feature Photo by Ryan McGuire provided by Gratisography.

living in the past

I’ve started a somewhat daunting task. I am in the process of adding my posts from eelearning – my first blog – to new eelearning.

Unfortunately, I can’t import them over via WordPress because I can’t remember my user name and password from 10 years ago. Nor can I produce my bill for the domain name. So WordPress won’t give me access to my old account. Understandable.

Knowing that old, inactive blogs eventually get taken down by their hosts, I’ve decided to do the next best thing – cut and paste. Fortunately, posts can be predated so I can post them “on the date” they were posted on eelearning. I’m including comments to the original blogs in the body of the posts here.

I know that such activity could bring action by the copyright holder. But since he is me, I think I’m safe from his wrath.



What will be hot for L&D in 2020?

Don Taylor is conducting his 7th annual global sentiment survey regarding what is hot in workplace L&D. Join in!

Don Taylor is conducting his global sentiment survey of learning and development professionals. His 7th annual survey contains one required question – “What will be hot in workplace L&D learning in 2020?”.

He reports that having 6 years of survey results is now providing clear trends and response patterns. In an email soliciting respondents for this year’s survey he points out:

  1. Not every trend or technology that is hot makes it to mainstream adoption.
  2. There are two clear groups of respondents – early adopters and folks who are more cautious before they will jump on the bandwagon.
  3. It can take time for a fad to reach widespread adoption.
  4. What leads a hot trend or technology to move to widespread adoption is “intricate, messy, and absolutely fascinating.” (Don is working on a model that he hopefully will share soon.)

Take a couple minutes to answer Don’s survey at http://bit.ly/GSS20GenM. After you answer the required question and an optional one (What makes something hot?), you’ll see the aggregate responses for this year. Answers are anonymous.

Feature Photo by Pixabay on Pexels.com

When Will We Learn?

How well is our learning organization transforming to meet the demands of the modern workplace?

I recently was doing some digging through content I previously created on blogs and wikis and came across a post I wrote on the Learning Circuits Blog that I feel deserves a revisit.

Back in July 2006, I wrote We’re #3! We’re #3! in which I suggested that training departments, at that time, weren’t the home of the best training in organizations when it came to helping employees or customers learn what they needed to learn. I suggested that IT and Customer Service, on average, did a better job of meeting their audiences’ needs. I also pointed to Sales, OD and Finance employee training as perhaps better than what training departments were producing.

Comments in reaction to my post pointed out that IT Help Desks and Customer Service teams often fell short of helping those who called them. But my general point that the training department might not be the home of the best training within an organization was supported.

I provided a long list of practices that are found in IT and Customer Service for developing their own employees and helping customers. A summary of those practices would be:

  • They diagnose the immediate need of the learner in real time
  • They provide a response at the moment of the learner’s greatest need for that response
  • They provide a proposed solution as quickly as possible
  • They check in with the learner after they have had a chance to implement their learning to see if the need was resolved
  • They track the question, the answer provided and the result – in detail
  • They maintain a knowledge base of previous responses to learners’ needs and the effectiveness of the response provided
  • They maintain a practice of sharing best practices and peer-to-peer learning
  • They provide feedback to parts of the organization that might be able to solve the situation that created the need or better anticipate future needs
  • They constantly gather feedback from key stakeholders regarding their performance
  • They do very little of this through a pre-designed curriculum of courses.
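Several of these practices – tracking the question, answer, and result, and maintaining a knowledge base of responses and their effectiveness – amount to a searchable log of need, response, and outcome. A minimal sketch (the field names are mine, not from any ticketing product):

```python
from datetime import date

# Minimal help-desk-style knowledge base: log each question, the response
# given, and whether the follow-up check found the need resolved.
knowledge_base = []

def log_request(question, response, resolved, logged_on=None):
    entry = {"question": question, "response": response,
             "resolved": resolved, "date": logged_on or date.today()}
    knowledge_base.append(entry)
    return entry

def search(keyword):
    """Return previously *resolved* responses matching a keyword."""
    kw = keyword.lower()
    return [e for e in knowledge_base
            if e["resolved"] and kw in e["question"].lower()]

log_request("How do I reset my VPN password?",
            "Use the self-service portal.", True)
log_request("VPN drops every hour", "Escalated to network team.", False)
print([e["response"] for e in search("vpn")])
# ['Use the self-service portal.']
```

Filtering on the resolved flag is the detail that matters: the knowledge base only surfaces responses that actually worked, which is the feedback loop most L&D content repositories lack.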

I suggested that learning professionals might have a lot to learn from the practices used by other parts of the organization to better serve our “customers”, key stakeholders, and organizations.

Yet 13.5 years later, only a small percentage of L&D teams are implementing more than a few of these best practices. Why?

With all the talk about being learner-centered, building a culture of learning, self-directed microlearning, etc., you’d think we’d be further along than we are. What’s the hold-up?

Here are some thoughts as to why we aren’t making progress towards the change we as a profession have been advocating for over a decade:

We don’t want to or don’t know how to change. We’re comfortable with the paradigm we’ve been using. We’re used to providing wide swathes of knowledge and instruction to prepare employees for the situations they may face in the future. We work with SMEs and do research to provide the ideal corpus of knowledge and set of skills they will need to know.

We haven’t developed the processes needed to provide immediate solutions at the moment of need.

  • Do we understand the day-to-day work well enough to suggest appropriate solutions?
  • How well do we understand the strategic objectives of the organization and how the target audience fits into them?
  • Do we identify ourselves as problem solvers and pain relievers?
  • Do we understand how various parts of the organization currently learn what they need to know?

Our stakeholders keep demanding or expecting standard training. The way we’ve done things in the past is the only way they ask us to do things. We, in turn, in our order-taking tradition, build what they ask for. After all, they often hold the purse strings to the funding we need and the access to SMEs and the target employees to be trained.

We aren’t used to pushing back when their requested solution may not be the best solution to the problem they are trying to solve.

  • Perhaps we don’t know their business well enough to suggest an option that will work better.
  • Maybe we don’t have the knowledge of their work or understanding of organizational strategic goals to build a compelling argument for a different approach.
  • Or we haven’t developed a business relationship of trust in which they understand and appreciate our professionalism at what we do.
  • We lack the awareness of different ways of organizing learning solutions or the tools to deliver them.

We lack evidence that shows we are effective or don’t see the need to promote what we do. We’re happy to keep doing what we are doing rather than rocking the boat. Neither we nor our stakeholders truly understand what the other does, so we keep focused on what we do and don’t challenge each other. We’re comfortable with the status quo.

We aren’t creating evidence-based solutions that can be pointed to as changing individual performance and driving organizational strategy.

  • Do we gather evidence that we are adding value to the organization?
  • Are we providing managers with the tools they need to support their employees who are involved in our solutions? Are we holding them accountable?
  • Do our solutions form an ecosystem of learning that builds a culture of collaborative, work-based learning?

Obviously, these thoughts are not equally applicable from organization to organization. There are learning professionals who are successfully changing and becoming better partners to their organizations. But as a field, we are lagging.



Feature Photo by Suzanne D. Williams provided by Unsplash.

Retooling for the Future

Connie Malamed does a nice job of defending us Learning and Development professionals in her blog post, Retooling Instructional Design: 7 Ways Learning Professionals are Preparing for the Future. There has been a massive wave of change that has often left us subject to criticism that we’ve fallen behind or become obsolete. Connie points out that many of us have been working at changing our methods, approaches, and tools in order to prepare for new ways of doing what we do.

She provides a list of 7 ways learning professionals have been working to meet the demands of the modern workplace that is evolving quickly.

  1. Acceptance of Evidence-based Strategies
  2. Focus on Human-centered Design
  3. Adopting UX Techniques
  4. Use of Agile Models
  5. Creating Learning Journeys
  6. Applying Learning Analytics
  7. Designing for Newer Technologies

I wholeheartedly agree with Connie that these 7 trends are at the core of what learning professionals will be doing now and in the future. I do feel she slightly missed the mark on #6 and #7. And I would add a #8 to the list.

Applying Learning Analytics

While she does indicate we are making more data-driven decisions, she only mentions “the value of learning analytics for continuous improvement.” While this is true, it’s not a huge change from what we’ve always done in evaluating the effectiveness of programs. Big data is enabling faster, more responsive analysis, but that’s not the game changer when it comes to Learning Analytics.

The real power of Learning Analytics comes in our ability to use data to:

  • make predictions of what is needed and what will work,
  • combine learning data with business data to determine the true business value of learning activities, and
  • use data in real time to provide truly personalized learning experiences in the flow of work.

These are the game-changing promises of Learning Analytics that will enable us to get in-sync with our business unit colleagues and finally demonstrate our real value to the organization.
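Combining learning data with business data can start as simply as joining training-completion records to a business metric and comparing the trained and untrained groups. A toy sketch – every name and figure below is invented for illustration, and a real analysis would control for confounds rather than compare raw averages:

```python
# Toy sketch: join completion records to a business metric and compare
# trained vs. untrained groups. All names and figures are invented.

completed_training = {"ana", "ben", "cara"}
quarterly_sales = {"ana": 120_000, "ben": 95_000, "cara": 110_000,
                   "dev": 80_000, "eli": 85_000}

def group_means(completed, metric):
    """Average the metric separately for trained and untrained employees."""
    trained = [v for k, v in metric.items() if k in completed]
    untrained = [v for k, v in metric.items() if k not in completed]
    return sum(trained) / len(trained), sum(untrained) / len(untrained)

trained_avg, untrained_avg = group_means(completed_training, quarterly_sales)
print(f"Trained avg: {trained_avg:,.0f}  Untrained avg: {untrained_avg:,.0f}")
# Trained avg: 108,333  Untrained avg: 82,500
```

The join itself is trivial; the hard, and game-changing, part is getting L&D and the business units to share the underlying data in the first place.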

Designing for Newer Technologies

Here I feel Connie oversimplified by limiting her discussion to the impact that virtual reality (VR), augmented reality (AR), and conversational interfaces (I’m guessing she is referring to chatbots and other tools that take advantage of voice recognition and natural language processing) are having.

She is right that learning professionals are leveraging the latest technologies. I’d even argue that this is a trend we’ve been at the forefront of for decades dating back to pre-internet days.

She correctly points out that we have an awareness that “a new tool will not magically solve a performance problem.” Yet we fall for the bright shiny new toy as quickly as others. There are all kinds of new technologies emerging (artificial intelligence, machine learning, xAPI, geo-presence, sensors and other internet of things devices, image recognition, pattern recognition, robotics, and more) and the “old” technologies are still viable (ink on paper remains a great, cost effective delivery mechanism for learning) depending on the solution needed.

Designing for Newer Technologies really points to the necessity to determine which technology:

  • resonates with our learners (regardless of whether it is a “learning” technology or not),
  • can deliver the best learning experience for the given need, and
  • does so in a cost effective manner.

Be Marketers of Learning

Connie does touch on a bit of this trend when she discusses using personas and conducting learning campaigns. But I believe it should be called out separately. One, because there are numerous learning professionals and organizations who are starting to do this and, two, I believe it is vital to our successful transition into our future state.

We need to be champions of individual and organizational learning – the evangelists of a new learner-centered, lifelong culture of learning that is supported by senior leadership and frontline managers. The learning journeys that Connie discusses need to be supported with well-articulated marketing campaigns.

Like our Marketing colleagues, we need to have an intimate knowledge of who our audience is. Who are the thought leaders? Who are the saboteurs? Who are the influencers? Who are the champions of change? What social networks already exist? Can we leverage them to help, or will they resist?

Finally, we need to target managers and provide them with meta-learning tools – and evidence that those tools are working – that will lead to a conversion experience about learning.



Finding the Right Tech for Learning – Wherever It Is

Surfing – the (bad?) habit of clicking on a link in a webpage just to see what’s on the other side.

From the first day I came in contact with the internet, I was a surfer – sometimes to my detriment, as I’d wander around the web aimlessly with little to show for the effort other than a couple of wasted hours. But then there are the times I’d come across a find. Like today.

I clicked on a link on a Microsoft page and found this marketing content regarding a professor in Australia doing learning the right way. David Kellermann at University of New South Wales has cobbled together a learning environment that should be a model for how learning can, and should, happen given the existing, available technologies.

While his project is for a university audience, I believe the lessons that can be gleaned from it are just as applicable to workplace learning.

Now Microsoft, of course, wants you to know that he’s done it with only their technology (and with their support). But I think the real news is the process Kellermann took to build it. He started with a simple solution: using Microsoft Teams to connect the 500+ students and teaching assistants in one place along with his content.

From there, Kellermann added tools that helped better fulfill his goal of moving the students from “500 islands” to a single team, working together no matter where they are or what their individual situations might be. He now has a system that is creative, simple, collaborative, and individualized.

He uses AI tools that help students ask questions and identify content that meets their learning needs. He offers his lectures and notes in a way that they can be searched by students, TAs, and AI. The system can accommodate learners with different needs (autism, blindness, deafness, etc.). Student/TA communication is improved by tools that can route questions to the right TA.
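That kind of routing could be approximated with even a trivial keyword match from question text to the TA covering that topic. A hypothetical sketch – this is not how Kellermann’s Microsoft-based system actually works, and all names and topics are invented:

```python
# Hypothetical keyword router: send a student question to the TA whose
# topic keywords best match it. TA names and topics are invented.

TA_TOPICS = {
    "Priya": {"thermodynamics", "entropy", "heat"},
    "Marcus": {"statics", "beam", "truss", "load"},
    "Jia": {"matlab", "code", "plot", "script"},
}

def route_question(question):
    """Pick the TA with the most keyword overlap; None means no match."""
    words = set(question.lower().split())
    best = max(TA_TOPICS, key=lambda ta: len(TA_TOPICS[ta] & words))
    # If even the best TA has zero overlap, route to the lecturer instead.
    return best if TA_TOPICS[best] & words else None

print(route_question("My matlab plot script crashes"))  # Jia
```

A production system would use embeddings or a trained classifier rather than bag-of-words overlap, but the shape of the problem is the same: map a question to the person best placed to answer it.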

But here are the lessons I gleaned from this project:

  1. Have an unwavering focus on what motivates your learners (and what decreases their motivation).
  2. Understand how your learners learn in the real world. HINT: They don’t go to an LMS or a training class.
  3. Look to technologies your learners already use and either use them or mimic them.
  4. Work iteratively. Work in manageable chunks.
  5. Experiment. If you find a technology that you think might work to meet your goals – try it, test it, and, if you like it figure out how best to incorporate it.
  6. Understand what various tools can do. You don’t have to be a programmer, but you do need to know how the tool can enhance your ability to meet your goals.

He admits it’s a lot of work, but he is motivated by a very inspiring mission:

I’m just trying to be a good engineer and create a good system that solves a problem, by rethinking education end-to-end and using AI to enhance humanity and make our interactions richer.

David Kellermann, University of New South Wales



ATDChi Winter Conference

“So what do you do?” A common question we’ll all be asked at holiday parties from Halloween to New Year’s Eve. Do you have a persuasive answer that shows off your best capabilities? 

After ATDChi’s Winter Conference you’ll be ready to impress and inform when it comes to tooting your own horn.

Sessions include:

  • Know Thyself: The Art of Meaningful Connection (Erich Kurschat)
  • LinkedIn is a Love Fest: How to Build a Powerful Network of Influential and Talented Friends (Callista Gould)
  • Coaching: Benefits & Tools to Accelerate Progress Toward Your Career Goals (Dan Johnson, CPC, CNTC)
  • People Skills: Learn How to Master Your Value Proposition (Hayward Suggs)
  • The Power of Networking (Rose Pagliari & Kris Felstehausen)

The day long conference will be at Benedictine University in Lisle. You can read more about the sessions and register for the conference at http://bit.ly/2nYZPLv

Feature Photo by Aaron Burden on Unsplash

Content and Community

I’ve been doing a lot of thinking recently about the role of content in a learning community, and as a thought piece I’ve come up with a diagram that I think is at least interesting and hopefully helpful.

I would love any and all advice on whether this diagram is effective as is, and what should be changed, added, or removed to make my point.

I want to represent the diversity of content and activities that a community can use to create content and drive learning throughout the community. Furthermore, truly moving a community of professionals requires a wide spectrum of experiences that, as they grow in sophistication, can gradually lift the community to higher levels of activity and growth.

In this diagram, I’m trying to represent that content is multi-dimensional and varies in character (the five vertical lines along the horizontal axis) and complexity (represented by the positioning of content and activities from bottom to top).

Not only is there a multitude of options across a wide spectrum that a community can utilize to learn, but these options are often interrelated. I’ve thought about drawing connecting lines between items, but I think that would be far too cluttered. I’m hoping that the items’ proximity to each other will convey this sense of connection.

Ultimately, I hope to drive home the idea that the best approach to moving a community of learners forward is to encourage a wide range of activities that touch on multiple dimensions of interaction, gradually building capability and competency that drives the community’s health and growth.

