Evaluating Instructional Behaviors for Improved Training Outcomes
Sponsored by: Special Operations Forces Language Office, USSOCOM; Prepared by: ALPS Insights, Raleigh, NC
Introduction
In 2007, ALPS Solutions conducted a series of analyses on training evaluation data being collected at the U.S. Army John F. Kennedy Special Warfare Center and School (USAJFKSWCS) to determine training effectiveness (Ellington & Surface, 2007). Findings from the analysis showed that a significant amount of variance in post-training language proficiency outcomes was related to the instructor: 31% of the variance in Defense Language Proficiency Test (DLPT) listening scores and 42% of the variance in DLPT reading scores. The fact that instructors appeared to have so much influence on student outcomes led to a series of research studies and interventions designed to improve instructional behaviors and, ultimately, student outcomes. In this summary, we review this research and discuss how the initial problem uncovered by our research led to a successful intervention for students and instructors. … Read more by downloading the PDF.
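As a rough illustration of how an instructor-level share of variance like the 31% and 42% figures above can be estimated, the sketch below fits a random-intercept mixed model and computes the intraclass correlation (ICC). This is a minimal, hypothetical example on simulated data: the column names (instructor_id, dlpt_listening), the simulation parameters, and the modeling choices are our assumptions for illustration, not the analysis actually reported in Ellington & Surface (2007).

```python
# Hypothetical sketch: estimating the share of outcome variance attributable
# to instructors with a random-intercept model. Data and names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 40 instructors with 15 students each.
n_instructors, students_per = 40, 15
instructor_effect = rng.normal(0, 6, n_instructors)  # between-instructor spread
rows = []
for i in range(n_instructors):
    for _ in range(students_per):
        score = 50 + instructor_effect[i] + rng.normal(0, 9)  # student-level noise
        rows.append({"instructor_id": i, "dlpt_listening": score})
df = pd.DataFrame(rows)

# Intercept-only model with a random intercept per instructor.
model = smf.mixedlm("dlpt_listening ~ 1", data=df, groups=df["instructor_id"])
result = model.fit()

# ICC: between-instructor variance over total variance.
var_between = result.cov_re.iloc[0, 0]
var_within = result.scale
icc = var_between / (var_between + var_within)
print(f"Share of variance at the instructor level: {icc:.1%}")
```

A null (intercept-only) model like this is the standard starting point for variance partitioning: the ICC it yields is the proportion of total outcome variance that sits between instructors rather than between students taught by the same instructor.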
RECENT INSIGHTS
Create More Value With Your Learning Evaluation, Analytics, and Feedback (LEAF) Practice
TK2020 Event: Thursday, February 6, 10:15-10:45 AM
Optimizing your LEAF practice is your best opportunity to improve learning and its impact. Fewer than half of organizations indicate that evaluation helps them meet their learning and business goals. Data alone doesn’t create value. People acting on data create value. Our ALPS Ibex™ platform drives effective, purpose-driven evaluation, empowering L&D stakeholders with insights and creating a culture of continuous improvement in the workplace. Examples demonstrate how using ALPS Ibex helps L&D stakeholders Act on Insights™ to drive improvement and impact.
Improving Instructor Impact on Learning with Analytics
Each of us can recall an instructor who made learning engaging, relevant, and impactful, inspiring us to apply what we learned. Unfortunately, each of us can also recall an instructor who failed in one or more of these areas. Instructors are force multipliers, reaching hundreds, if not thousands, of learners and impacting both their learning experience and their motivation to transfer. So, how can we improve instructor impact on learning?
Two Fundamental Questions L&D Stakeholders Should Answer to Improve Learning
During recent conference presentations and webinars focused on analytics, big data, and evaluation, we noticed audience members asking, “What questions should I be asking [and answering with evaluation data and analytics]?” Speakers typically answer in one of two ways: either by recommending collecting specific types or “levels” of data, as if all relevant questions for all learning and development stakeholders could be identified and addressed up front; or by recommending collecting and tagging as much data as possible and letting the data analysts figure it out, as if the important questions will only emerge from analyzing all the data after the fact.
Qualitatively Different Measurement for Training Reactions
Training evaluation should provide insights not only about the effectiveness of training but also about how it can be improved for learners and organizations. In this context, the term “insights” implies a deep understanding of learning, the training process and its outcomes, as well as evaluation procedures – designing, measuring, collecting, integrating, and analyzing data from both a formative and a summative perspective.
Beyond Levels: Building Value Using Learning and Development Data
For many learning and development (L&D) professionals, training evaluation practices remain mired in the muck. What ends up being evaluated hasn’t changed much over the past two or three decades. We almost universally measure whether trainees liked the training, most of us measure whether they learned something, and beyond that, evaluation is a mix of “We’d like to do that” and “We’re not sure what to make of the data we get.” Perhaps more critically, in one recent national survey, nearly two-thirds of L&D professionals did not see their learning evaluation efforts as effective in meeting their organization’s business goals.
The Key to Quality Comments Is Asking the Right Questions
Anyone who has participated in a training event is familiar with open-ended survey items like this one: “Please provide any additional comments you have about the training program you just completed.” After getting into the rhythm of clicking bubble after bubble in response to closed-ended survey items, many trainees hit a roadblock when presented with a blank text box and asked to provide feedback in their own words.
Request a Demo
Learn how ALPS Insights can help your organization with its L&D and HR analytics, measurement, and feedback needs. Share a little about yourself, and one of our specialists will contact you to schedule a demo.