Training Diagnostics to Improve Learner and Class Outcomes
Sponsored by: Special Operations Forces Language Office, USSOCOM; Prepared by: ALPS Insights Raleigh, NC
Introduction
Since 2007, ALPS Solutions has worked closely with training programs in the Special Operations Forces (SOF) community to evaluate training effectiveness and to identify where interventions by program administrators and instructors can positively affect learner- and class-level outcomes. While much of that effort has focused on evaluating instructor behaviors (see Evaluating Instructional Behaviors for Improved Training Outcomes), many other components and characteristics of the training process and environment have also been shown to influence outcomes. The ultimate goal of our training diagnostics research was to give program managers an evidence-based, easy-to-use tool for monitoring learner and instructor issues during training and for initiating results-focused interventions while the class is still in session, because once a class is completed it is too late to help current learners or maximize the training investment. The full report is available as a PDF download.
RECENT INSIGHTS
Create More Value With Your Learning Evaluation, Analytics, and Feedback (LEAF) Practice
TK2020 Event: Thurs, February 6, 10:15AM-10:45AM
Optimizing your LEAF practice is your best opportunity to improve learning and its impact. Fewer than half of organizations report that evaluation helps them meet their learning and business goals. Data alone doesn't create value; people acting on data create value. Our ALPS Ibex™ platform drives effective, purpose-driven evaluation, empowering L&D stakeholders with insights and fostering a culture of continuous improvement in the workplace. Examples demonstrate how ALPS Ibex helps L&D stakeholders Act on Insights™ to drive improvement and impact.
Improving Instructor Impact on Learning with Analytics
Each of us can recall an instructor who made learning engaging, relevant, and impactful, inspiring us to apply what we learned. Unfortunately, each of us can also recall an instructor who fell short in one or more of these areas. Instructors are force multipliers, reaching hundreds, if not thousands, of learners and shaping both their learning experience and their motivation to transfer what they learn. So how can we improve instructor impact on learning?
Two Fundamental Questions L&D Stakeholders Should Answer to Improve Learning
During recent conference presentations and webinars focused on analytics, big data, and evaluation, we noticed audience members asking, "What questions should I be asking [and answering with evaluation data and analytics]?" Speakers typically answer in one of two ways: either by recommending the collection of specific types or "levels" of data, as if all relevant questions for every learning and development stakeholder could be identified and addressed up front; or by recommending that as much data as possible be collected and tagged so that data analysts can figure it out, as if the important questions will emerge only from analyzing all the data after the fact.
Qualitatively Different Measurement for Training Reactions
Training evaluation should provide insights not only about the effectiveness of training but also about how it can be improved for learners and organizations. In this context, the term "insights" implies a deep understanding of learning, the training process, and its outcomes, as well as of evaluation procedures: designing, measuring, collecting, integrating, and analyzing data from both formative and summative perspectives.
Beyond Levels: Building Value Using Learning and Development Data
For many learning and development (L&D) professionals, training evaluation practices remain mired in the muck. What ends up being evaluated hasn't changed much over the past two or three decades. We almost universally measure whether trainees liked the training, most of us measure whether they learned something, and beyond that, evaluation is a mix of "We'd like to do that" and "We're not sure what to make of the data we get." Perhaps more critically, in one recent national survey, nearly two-thirds of L&D professionals did not see their learning evaluation efforts as effective in meeting their organization's business goals.
The Key To Quality Comments is Asking the Right Questions
Anyone who has participated in a training event is familiar with open-ended survey items like this one: "Please provide any additional comments you have about the training program you just completed." After getting into the rhythm of clicking bubble after bubble in response to closed-ended survey items, many trainees hit a roadblock when presented with a blank text box and asked to provide feedback in their own words.
Request a Demo
Learn how ALPS Insights can help your organization with its L&D and HR analytics, measurement, and feedback needs. Tell us a little about yourself, and one of our specialists will contact you to schedule a demo.