
Title: Implementing Training Scorecards: Prove That Training Pays

Description: Training programs can be a significant monetary investment for an organization. Senior leadership often challenges training departments to prove that training can provide a return on investment. By implementing a proven measurement and evaluation methodology that applies to any training initiative, training organizations can demonstrate the benefit they provide to the bottom line. Incorporating sales training, leadership training, and technical training case studies and scorecard examples, the speaker of this session will provide an overview of the components of a training scorecard, along with tools and techniques that have been used successfully to help training organizations prove that training pays.

Objectives: Implement the components of a training scorecard to evaluate training results. Create a training scorecard to track training results. Apply tips and techniques for implementing training scorecards back on the job.

Date/Time: 5/7/2006 3:30 PM-4:45 PM
Speaker: Lynn Schmidt
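The core arithmetic behind a training scorecard's financial column can be sketched as follows. This is a generic, Phillips-style ROI calculation, not the speaker's own tool; the program figures are hypothetical examples.

```python
# Minimal sketch of the standard training ROI calculation used on scorecards.
# The dollar figures below are hypothetical, not data from the session.

def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage: net benefits divided by fully loaded costs."""
    if costs <= 0:
        raise ValueError("costs must be positive")
    return (benefits - costs) / costs * 100


def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Return the benefit-cost ratio (BCR), often reported alongside ROI."""
    if costs <= 0:
        raise ValueError("costs must be positive")
    return benefits / costs


# Hypothetical sales-training program: $240,000 in monetary benefits
# against $150,000 in fully loaded program costs.
roi = training_roi(240_000, 150_000)        # 60.0 (%)
bcr = benefit_cost_ratio(240_000, 150_000)  # 1.6
```

The ROI percentage uses *net* benefits (benefits minus costs), while the BCR uses gross benefits; scorecards commonly report both so executives can compare training to other capital investments.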

Title: Evaluating Training Programs: The Four Levels

Description: In this session, the speaker will describe 10 requirements for effective programs and three reasons for evaluating them. Guidelines, forms, and procedures for all four levels of evaluation will be presented. Session M216, conducted by James Kirkpatrick, is a follow-up to this session.

Objectives: Use the guidelines for all four evaluation levels. Apply level 1 evaluation to all programs. Implement the other three levels of evaluation depending on the number and expertise of your training professionals.

Date/Time: 5/8/2006 12:00 PM-1:15 PM
Speaker: Donald Kirkpatrick

Title: Make Evaluation Work

Description: In the age of evaluation, a question still demands attention in boardrooms and break rooms: How can we make evaluation work? Three simple steps can help you make training evaluation work: set the stage, develop the practice, and implement the practice. In this highly interactive session, you will reflect on your own situation, develop answers to many of the issues, and obtain tools to create a working evaluation system that can be integrated and sustained so that it becomes a seamless part of the training and performance-improvement process. Corporate and organizational teams can benefit from attending this session together. During the session you will analyze the 12 basic problems with evaluation, develop plans for using the three steps to build a successful evaluation practice, and strategize about how to overcome resistance to evaluation.

Objectives: Apply solutions to the 12 basic problems with evaluation. Use the three steps to build a successful evaluation practice. Use these techniques to overcome resistance to evaluation.

Date/Time: 5/8/2006 12:00 PM-1:15 PM
Speaker: Patti Phillips

Title: Demonstrate Training Value: Aligning Kirkpatrick's Four Levels With Balanced Scorecards

Description: Training professionals all over the world feel pressure to demonstrate that training is an effective method to improve profitability and retention. This session will deliver a novel, yet extremely effective, approach to demonstrating the value of training by linking Donald Kirkpatrick's four levels of evaluating training programs with the speaker's modification of Kaplan and Norton's Balanced Scorecard. He will give examples of balanced scorecard measures for each of the four levels.

Objectives: Develop training evaluation methods and measures for each of Kirkpatrick's four levels. Design a balanced scorecard for your department that includes key measures from each level. Use this knowledge and these skills to provide evidence to senior executives that training positively impacts the bottom line.

Date/Time: 5/8/2006 2:15 PM-3:30 PM
Speaker: James Kirkpatrick

Title: Training Impact Evaluations That Senior Managers Believe and Use

Description: Use the Success Case Method to tell training's story with impact data that is credible and compelling. You need an evaluation method that is simple, valid, and does not mislead senior management with suspicious statistical gyrations and overblown ROI estimates. The Success Case Method is an innovative procedure that quickly digs out and documents the very best results that training is achieving, and then pinpoints the replicable factors and practices that managers can leverage to increase ROI and drive performance improvement. Better yet, this breakthrough method's results make a CFO-proof business case for manager involvement in training and help your organization build the capability to leverage learning investments into sustained performance improvement. In this session, you'll review examples from Allstate Insurance, Hewlett-Packard, Grundfos (Denmark), and Coffee Bean & Tea Leaf, among others, that demonstrate this highly effective, innovative, and above all practical evaluation method.

Objectives: Apply the five-step Success Case Method to tell training's story in your organization. Accurately estimate the contribution of your training function to building effectiveness. Analyze actual case examples from leading global organizations for application to your organizational needs.

Date/Time: 5/9/2006 10:00 AM-11:15 AM
Speaker: Robert Brinkerhoff

Title: See the Organizational Big Picture: An Action-Oriented Evaluation Case Study

Description: Executives and managers often want to step back from all of the detailed reports on their desks to see the big picture. They want answers to the larger questions that keep them up at night. This interactive case study will address an innovative, action-oriented evaluation approach that answers the big-picture questions regarding sales, marketing, training, coaching, and on-the-job support for sales representatives. Action-oriented evaluations provide a proactive, systemic, and systematic way to determine which materials work well and why, which materials don't work well, and how to improve them. They identify potential breaks in the performance chain and emerging trends. Using this information, decision makers can implement blended solutions that support multiple employees. Come for the discussion and the exercises; stay for the job aids, samples, and lessons learned.

Objectives: Form evaluation questions that address stakeholder concerns. Describe a high-level process to conduct action-oriented evaluations that improve job performance. Apply lessons learned to use action-oriented evaluation strategies in your organization.

Date/Time: 5/9/2006 10:00 AM-11:15 AM
Speakers: Steven Villachica, Deborah Stone

Title: Meaningful Metrics to Measure the Value of Knowledge Management

Description: While managing a large effort to consolidate global knowledge-management repositories and processes across the enterprise, Accenture has established a centralized measurement framework to assess the status and value of knowledge-management activities. Based on this work, the speaker will prepare you to accomplish some of the key activities in establishing a knowledge-management measurement framework, including developing a knowledge-management evaluation model, defining and prioritizing metrics across levels of the model, and designing a knowledge-management reporting system. You also will gain tips and techniques for dealing with critical, practical issues such as balancing standardization against localization, identifying common metrics across diverse knowledge-management areas (for example, search of knowledge repositories, communities of practice, and expert forums), and working with stakeholders to obtain support for the measurement framework. Of particular interest will be establishing a knowledge-management evaluation model, recognizing that the well-known training evaluation levels do not directly generalize to knowledge-management evaluation.

Objectives: Develop a measurement model for knowledge management. Design measures and reports to summarize the status and value of knowledge-management activities. Communicate effectively with stakeholders to design evaluation of knowledge-management initiatives.

Date/Time: 5/9/2006 1:45 PM-3:00 PM
Speaker: Bruce Aaron

Title: How to Evaluate "Soft" and "Hard" Data

Description: In this session, the speaker will explain how to convert qualitative data to quantitative data so you can apply standard mathematical and statistical analysis methods. He will cover how to analyze data from interviews, focus groups, and observations, and explain how to apply descriptive and inferential statistics to identify the significant results needed to show a return on investment.

Objectives: Evaluate soft data, such as opinions and feelings, from interviews and surveys. Convert soft data to hard data so you can analyze it mathematically. Evaluate hard data from surveys, individual and group interviews, and document searches by determining the average, percentage, frequency, and standard deviation.

Date/Time: 5/9/2006 1:45 PM-3:00 PM
Speaker: Judith Hale
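The soft-to-hard conversion described above can be sketched in a few lines: map qualitative responses onto a numeric scale, then compute the descriptive statistics the session names (average, percentage, frequency, standard deviation). The Likert mapping and the responses below are hypothetical illustrations, not material from the session.

```python
# Illustrative sketch: converting "soft" survey responses into "hard" numbers.
# The scale mapping and response data are hypothetical examples.
from collections import Counter
from statistics import mean, pstdev

LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "neutral", "agree",
             "disagree", "agree", "strongly agree", "agree"]

# Soft data -> hard data: each opinion becomes a number on a 1-5 scale.
scores = [LIKERT[r] for r in responses]

average = mean(scores)          # average rating across respondents
std_dev = pstdev(scores)        # population standard deviation of ratings
frequency = Counter(responses)  # frequency count of each response category
# Percentage of favorable responses (rating of 4 or 5).
favorable_pct = 100 * sum(s >= 4 for s in scores) / len(scores)
```

With numbers in hand, the same data can feed inferential tests or an ROI calculation; the qualitative nuance lives on in the frequency table rather than being discarded.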

Title: Measuring Human Capital

Description: Learn how to use newly developed metrics for measuring human capital in the workplace. Find out how to calculate the exact dollar value of any competence change in your workforce through the use of a simple equation. Be able to set up a dashboard to see the dollar value from your performance-improvement efforts, such as new training courses, management coaching, and many other interventions. Part II of the session is based on 20 years of research. Learn how to forecast financials, such as growth in sales and profits, from changes in 80 different people-management areas. Be able to develop root-cause survey questions that predict an organization's financial success. Use the data collected here to calculate the exact dollar value of changes such as a better training curriculum, better selection tools, and an improved career-development system.

Objectives: Develop new metrics for measuring human capital in the workplace. Use two simple equations that convert changes in competencies or changes in people-management practices into exact dollar amounts. Develop systems to automate the dollar value of human capital change and do away with specialized studies.

Date/Time: 5/9/2006 1:45 PM-3:00 PM
Speaker: Dennis Kravetz

Title: Create an Annual Report for Training and Development

Description: This session introduces an innovative evaluation process that combines the best of two evaluation methodologies (Brinkerhoff's Success Case Method and Phillips's ROI) to produce a complete annual report on training. The report provides stakeholders with the data they need to understand the full impact of training on an organization. The annual report documents stories that capture best practices and areas for improvement on a training initiative, calculates the return on investment of developing and implementing the training initiatives, and links performance outcomes to business metrics. A sample annual report for a qualification program for operators in a pharmaceutical manufacturing facility is used as a model to guide you through the process of creating your own report.

Objectives: Apply two evaluation methodologies, Brinkerhoff's Success Case Method and Phillips's ROI, to your programs. Design an evaluation plan that links skills and knowledge to individual performance improvement and to the impact on achieving business results. Partner with the business to define and collect business metrics.

Date/Time: 5/9/2006 4:00 PM-5:30 PM
Speaker: Rachelle Verkamp

Title: Evaluating Training Programs: The Four Levels

Description: In this session, the speaker will describe 10 requirements for effective programs and three reasons for evaluating them. He will present guidelines, forms, and procedures for all four levels.

Objectives: Use the guidelines for all four levels. Apply level 1 to all programs. Implement the other three levels, depending on the number and expertise of your training professionals.

Date/Time: 5/10/2006 8:00 AM-9:15 AM
Speaker: Donald Kirkpatrick

Title: Take Charge: Design Training for Results and ROI

Description: The training process has its share of program designers who build a great event and trainers who are intent on telling the world everything they know about their topic and expertise. Too often, a focus on execution in the work setting and business results is missing. Interventions must be properly aligned to influence execution and business outcomes. You will learn how you, as a training manager, can take charge of managing the training process to influence and achieve business results. During this session, five areas of importance will be explored: aligning and sustaining a results focus; guiding the training design for optimum transfer to the work setting; keeping training delivery on track; influencing performance transfer to the work setting; and achieving the ultimate ROI. Tools presented during the session will help you use an eyes-on but hands-off approach to guide the training process toward achieving business results and ROI.

Objectives: Monitor five key factors in the training process that will influence results and ROI. Apply specific actions to influence the five key factors to achieve results and ROI. Ask key questions to analyze the risk of transfer and to influence client involvement in learning transfer.

Date/Time: 5/10/2006 8:00 AM-9:15 AM
Speaker: Ron Stone

Title: Demonstrate Training Value: Aligning Kirkpatrick's Four Levels With Balanced Scorecards

Description: Training professionals all over the world feel pressure to demonstrate that training is an effective method to improve profitability and retention. This session will deliver a novel, yet extremely effective, approach to demonstrating the value of training by linking Donald Kirkpatrick's four levels of evaluating training programs with the speaker's modification of Kaplan and Norton's Balanced Scorecard. He will give examples of balanced scorecard measures for each of the four levels.

Objectives: Develop training evaluation methods and measures for each of Kirkpatrick's four levels. Design a balanced scorecard for your department that includes key measures from each level. Use this knowledge and these skills to provide evidence to senior executives that training positively impacts the bottom line.

Date/Time: 5/10/2006 10:45 AM-12:00 PM
Speaker: James Kirkpatrick

Title: Leveraging Technology for Practical and Scalable Learning Analytics

Description: In this session, the speakers will discuss how technology tools and templates can automate the collection, storage, processing, and reporting of learning data. They will discuss challenges and best practices, and use Grant Thornton as an example of the business case and implementation approach.

Objectives: Use available technologies for learning analytics. Implement learning analytics technology from a practitioner's perspective. Apply the key elements that are important in learning analytics technologies.

Date/Time: 5/10/2006 2:15 PM-3:45 PM
Speakers: Jeffrey Berk, Bob Dean
