Thursday, May 19, 2011

Another Way of Looking at Learning Metrics

This is part 3 – the rest of my notes. I just wanted to separate out the calculations so that they were easier to find.
----------------------
After I jumped through the usual higher-education fiery hoops to get a room, my university hosted a seminar on eLearning Impact by SkillSoft and Knowledge Advisors.

Key topic: How to get and present metrics the senior execs understand.

There were a number of ideas that I thought could help the community at large, or at least spark some discussion.

Thank you, Kevin Duffer (SkillSoft), for giving me permission to blog these notes. Please comment on what you think of these models and I will forward your thoughts on to Kevin.

--------------------------------
General trend: more pressure to report than last year. Expect more pressure next year.
- 47.4% – I can report on L&D
- 31.6% – Don’t know that I can report

What folks want to report on
- ROI
- Business Impact
- Relevance
- Certification
- Hours / “Pants in seats”
- What was actually used?
- Compliance regulations
- Tracking demographics
- Why employees don’t take/complete
- More efficient data collection

Performance.gov
- Government trying to get standards and consistency

Bersin – 4 dynamics in gov’t sector
- Budget cuts – need to demonstrate value and articulate impact
- Aging workforce – knowledge management to maintain institutional memory
- Low retention of new hires
- Federal mandates / shifting politics

During this turning point, learning investments have been proven to accelerate performance.

Learning fuels adaptability and innovation, improves engagement and retention.
- Needs transparency and accountability
----------
6 Disciplines of Breakthrough Learning (Wick, Pollock, Jefferson and Flanagan) – link

No metrics + no value evidence = Risks
- Sustainability
- Budget
- Resources
- Credibility

If you don’t have a value story, that program is at risk.
Demonstrate IMPACT.
It doesn’t matter how many people actually come to the program.
CFO: why should we retain this?

What gets measured improves.
----------
eLearning does many things the classroom can’t scale to
- Efficient
- Adaptable
- Available
- Scalable
- Effective
- Economic

Department of Education – Evaluation of evidence-based practices (link)

No significant difference phenomenon (link)

If you use Google, you are eLearning
- eLearning = using technology to answer a question or build knowledge

Organizations must embrace this
- Blended and classroom really don’t scale
-----------
FAQs
- Did the program make a difference?
- Recover costs?
- ROI?
- Do we need measurement expertise in-house?
- How do we standardize measurement?
- How do we benchmark?

Findings – Moving the Needle [SkillSoft sponsored, but I think it can apply to any well-designed library of content]
- SkillSoft has an increased effect on performance
- Learners feel SkillSoft has an impact on critical areas of org performance
- SkillSoft helps close skill gaps and build confidence
- Highly valued
“I would rather go to Books 24x7 than Google”
- Faster and more relevant
- Comments like “was worst, now best” [peer marketing]

Alignment (to org) / Adoption (visible! Easy access!) / Value (measurement strategy)

Focus on the following levels (though most questions will still be level 1 or 2)
- 3 Application – is this put into practice on the job?
- 4 Impact (organizational results)
+ Productivity
+ Revenue
+ Quality
+ Time Efficiency
+ Customer Satisfaction
+ Employee Engagement
- 5 Value
+ ROI
+ Benefit / Cost ratio
----------
CLO April

Calculating 2 main Level 5 metrics (put this in a separate post)

Monetizing Program Effect (also put this in a separate post, because I thought it was that important)
------------
CLO December 2010 – Patty and Jack Philips
How Execs View Learning Metrics

Executives want to see more about impact.
- If it has impact, the ROI is naturally there

The learning industry lacks a framework
- measures
- definitions
- principles

Mistakes to avoid
- Assuming ALL eLearning participation has offset classroom training
- Measuring all learning programs
+ Focus on the most strategic 5% of programs
- Not being conservative in business outcome claims
- Inconsistent reporting approaches
- Measuring the wrong things
- Only communicating usage metrics
- Not establishing expectations for learning
+ What did they expect they were getting out of it?

To evaluate
- Demonstrate effectiveness
- Improve curriculum
+ Get rid of underperforming courses
+ Improve others
+ Also better with strategy

Talent Development Reporting Principles (TDR)
- Still in Development
- 5 of 7 CLOs of the Year helped to create and adopt this model
-----------
3 Key Learning Report Statements
- Start with the end in mind

Efficiency
- Cost, # of people
- Activity and investment

Effectiveness – how well learning contributes

Business Outcomes
- Desired business results
- Learning Impact
- Quality improvement, cost reduction
- Examples
+ Time to proficiency – new hires
+ Lost mail
+ Dress code compliance
----------
Multiple data systems should be used
- Data-mining systems to make the data relevant
- Process to pull and/or calculate data
+ What is relevant to you? Determine this with stakeholders.
- Find location to pool calculated measures
+ reference easily
+ compare periods
+ can even be an Excel spreadsheet if necessary
- Turn into a scorecard / dashboard
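
The “pool calculated measures where you can reference them and compare periods” step can be as simple as a keyed table, whether that lives in Excel or code. A minimal Python sketch, with hypothetical metric names and figures:

```python
# Pooled, pre-calculated measures keyed by reporting period.
# Metric names and values are hypothetical illustrations.
measures = {
    "2011-Q1": {"completion_rate": 0.72, "cost_per_learner": 310.0},
    "2011-Q2": {"completion_rate": 0.78, "cost_per_learner": 295.0},
}

def period_delta(measures, metric, earlier, later):
    """Change in one pooled metric between two reporting periods."""
    return measures[later][metric] - measures[earlier][metric]

print(round(period_delta(measures, "completion_rate", "2011-Q1", "2011-Q2"), 2))  # 0.06
```

Once the measures live in one place in a consistent shape, the scorecard/dashboard step is just presentation on top of it.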

Give reports targeted to each audience
- Quarterly, business execs – efficiency, effectiveness, outcomes
- Monthly, learning execs – outcomes, learning effect
- Weekly, learning execs – efficiency

It will be a long haul to get everything compiled
- Build consensus around it
- What info do you want?

This is a way of valuing human capital with the CFO [basically making human capital more quantitative]
--------------
Standardize your evaluation forms
- Smart Sheet with at least 1 metric for levels 2-5.
- Roll out to every course in curriculum
- Ask same questions across every course

1) Business Outcome Statement
- In terms of level 4 measure
- Can be qualitative and quantitative
- [What are we measuring currently?]

2) Effectiveness Statement
- Measure at all 5 levels
- At least 1 question each for level 4 and level 5 with several level 1 questions

3) Efficiency statements – primarily out of LMS (usage, completion etc)

Metrics that Matter – impact analysis from Knowledge Advisors, available through SkillSoft. A service that will help develop these evaluations and compile the data for you.
---------
Remember to communicate in a way the leaders understand

When reporting upward – consider your audience

Organizational execs
- Quarterly
- Synopsis of L&D (total)
- How does learning contribute to the executive strategic initiative?
- Think long term
- Get prior year results if you can
- Scope and build what you need

Remember to get others involved

Can do dashboard – but limit it to keys for stakeholders. Don’t clutter with a bunch of pictures.

-----------
SkillSoft users – there is apparently a Business Impact calculator available in SkillSoft that is complimentary. Work with the Learning consultant / SkillSoft team.
