I-TECH Technical Implementation Guide #3

Piloting a Curriculum:
Evaluating the Effectiveness of a New Training

I-TECH’s Technical Implementation Guides are a series of practical and instructional papers designed to support staff and partners in their efforts to create and maintain quality programs worldwide.

A successful training program requires multiple
components, including skilled trainers with adequate content knowledge, training participants who
have the necessary baseline knowledge and motivation to learn, and a well-written curriculum to guide
everyone through the process. Implementing training programs often requires a significant amount
of training staff time, trainee time spent away from
work, and dollars spent on transportation, lodging
and per diem. Pilot-testing a curriculum is an important aspect of quality control in training and can
help to ensure that the time and investment in training really pays off. This guide presents an approach
for conducting such an evaluation.

Why Pilot?
The purpose of piloting a curriculum is to make sure
the curriculum is effective, and to make changes
before it is distributed or offered widely. Piloting
a curriculum helps to identify which sections of
the curriculum worked and which sections need
strengthening. The process should include a comprehensive evaluation of the curriculum’s effectiveness and usefulness in achieving the course’s training objectives. The information gathered from the
pilot is used to strengthen and improve the course
content, materials, and delivery strategies in the
next version of the curriculum.
Pilot testing will only be effective if you have the
resources, time, and ability to revise the curriculum
based on the evaluation feedback received from
the pilot. If, for some reason, you will not be able to
make changes to the curriculum, then completing
the pilot-evaluation process described below is not
advisable.
The process described in this guide for piloting a curriculum is appropriate for extensive, classroom-based curricula in which you have invested significant resources. This type of process is not as relevant for an on-site training where training topics might vary with the situation, or for a less structured curriculum in which experts present case studies or their own PowerPoint slides.

Preparing for the Pilot
The pilot evaluation is an extensive assessment
carried out during the first time that a newly developed or adapted curriculum is used to conduct
training. Participants should be informed that they
are participating in a pilot session of the training
workshop when they are invited and will be asked
to provide extensive and honest feedback as part
of the pilot-evaluation process. Emphasizing that
their feedback is critical to the development and
improvement of the training will often help motivate participants to complete written-evaluation
forms in detail and participate actively in group discussions on the training experience. On the first
day of the workshop, facilitators should mention that evaluation activities will be more extensive than normal because this is a pilot, and they should describe the specific evaluation activities that will be conducted.
It is also critical to brief the facilitators in advance
that this is a pilot test of the curriculum. Some facilitators may have presented similar material before
and may have their own presentations that they
are used to giving. For the curriculum pilot, it is important that all facilitators follow the curriculum as
closely as possible. All facilitators should be provided with copies of the curriculum well in advance
of the workshop so that they have an opportunity to review their assigned sessions.

Recruit at least one observer who will be present
at the whole training. The observer is often someone who participated in the development of the
curriculum, but this is not a requirement. If it is not
possible to get someone who participated in the development of the curriculum to attend the pilot,
then the observer can be someone with listening
and observation skills who is knowledgeable about
training design. The observer role is described in
more detail later in this guide.

Areas to Evaluate During the Pilot
The evaluation of a pilot curriculum should assess teaching methods, appropriateness of content,
materials, issues of timing and flow, and general effectiveness of the training. Some key questions to
answer through the evaluation of the pilot are listed below:
Teaching methods
• Were the teaching methods (lecture, discussion, group work, etc.) used in the training successful in increasing participant knowledge/understanding?
• Did some methods work particularly well?
• Did some methods not work and need to be changed?

Content
The pilot session is not the time to review clinical accuracy of the content. The curriculum should have undergone a technical review prior to piloting to ensure that the content is clinically accurate and reflects current guidelines.
• Was the content at the appropriate depth and breadth for the audience?
• Was the reading level of the curriculum too difficult/easy?
• Were the right topics covered?
• Were there topics missing?
• Were there stories, examples, or cases mentioned during the workshop that could be incorporated into the curriculum?

Materials
• Were the materials user friendly for both trainers and participants?
• Did the trainers use all of the materials (PowerPoint slides, handouts, worksheets)?
• Did participants refer to the training materials?
• Were there additional materials and resources that would enhance the training?

Effectiveness
• Did participants acquire the intended skills and knowledge from the training? If not, what were the weak areas?

Timing and flow
• Was there too little or too much time allocated for individual activities?
• Was there too little or too much time allocated for the workshop as a whole?
Conducting the Evaluation
The evaluation of the pilot session of a new curriculum should encompass feedback from three
key groups in order to triangulate data and assess
the training from different perspectives: the participants, the facilitators/trainers of the workshop, and
the observer(s).
Participant Evaluation of the Workshop
The purpose of a training event is to enhance participants’ capacity in a specific, defined area. Participant feedback on the workshop is thus a central
piece in assessing the effectiveness of a curriculum.
Participants’ feedback should include their subjective response to the workshop (What worked well?
What did they feel they learned? What did they not
understand? How will they apply what they learned
in the workshop in their work setting?) and objective
measures, such as a change in their knowledge and/
or skills resulting from the workshop. Following are
specific tools for collecting participants’ feedback:
Tool 1: Daily written evaluations
A daily evaluation, to be completed by participants at the end of each day of the workshop,
captures participants’ reactions to workshop
sessions and activities while the information is
still fresh in their minds. A daily evaluation form should be brief and easy to complete to avoid overburdening participants. Often, daily evaluations only ask what participants liked and did not like about the day’s session. Since this is a pilot, you may want to include some knowledge-related questions specific to the content covered that day to see if participants absorbed some of the key themes of the day.
Generally, the data from the daily evaluation form
does not need to be rigorously tallied and analyzed; rather, the trainers should scan the forms to
determine general trends. For example, if a particular session seems to be receiving notably lower
scores than other sessions, this would indicate
the need for further investigation to determine the
reason. The daily evaluation should also include a
question about information not understood by participants so that facilitators can clarify any misunderstandings at the training the following day.
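If the daily form uses numeric ratings for each session, a quick tally can make notably lower-scoring sessions easier to spot. The short Python sketch below is purely illustrative and is not part of the I-TECH toolkit; the session names and scores are invented, and it assumes the ratings have already been transcribed from the paper forms.

    # Hypothetical example: daily-evaluation ratings transcribed from paper forms
    # (1 = poor, 5 = excellent). Session names and scores are invented.
    from collections import defaultdict

    ratings = [
        ("Session 1: Overview", 5), ("Session 1: Overview", 4),
        ("Session 2: Case studies", 3), ("Session 2: Case studies", 2),
        ("Session 3: Group work", 4), ("Session 3: Group work", 5),
    ]

    scores_by_session = defaultdict(list)
    for session, score in ratings:
        scores_by_session[session].append(score)

    # Print the average rating per session, lowest first, so weaker sessions stand out.
    for session, scores in sorted(scores_by_session.items(),
                                  key=lambda item: sum(item[1]) / len(item[1])):
        print(f"{session}: average {sum(scores) / len(scores):.1f}")

A tally like this is only a prompt for follow-up questions; the written comments and the daily debrief remain the main source of insight.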
Tool 2: Final written evaluation
Participants should complete a final written evaluation at the end of the final day of the workshop.
This evaluation allows participants to reflect back
on the knowledge and skills acquired during the
training within the context of the workshop as a
whole. It can often capture different information
than the daily evaluation, as material that was
unclear or confusing on Day 2 may have become
clear by Day 5. A final written evaluation also allows participants to reflect on their key learning
from the overall workshop and assess how they
will be able to use what they have learned when
they return to their work settings.
Tool 3: Focus group discussion
Focus group discussions with participants at
the end of the workshop can be a useful method for obtaining feedback about the workshop
and curriculum. The group discussion format inspires synergies that can result in a rich discussion, evoking different information than might
be provided in a written format. It also facilitates
feedback from those who are more comfortable
expressing themselves verbally than in writing.

Participants should be divided into groups of approximately 6-8 for the focus groups. Trainers
should identify individuals not directly involved
with the workshop to facilitate the focus group
discussions and take notes. A focus group discussion guide should be developed and used to
structure the discussion, but should not limit or
constrain the discussion. Key themes emerging
from the groups should be identified and summarized. Refer to I-TECH’s Organizing and Conducting Focus Groups for more detailed information on focus group discussions.
Tool 4: Pre- and post-test
The purpose of the pre- and post-test is to provide an objective measure of changes in knowledge and/or skills resulting from the training, and
thus can serve to provide valuable information
about the effectiveness of the curriculum.
Caution must be used in interpreting the pre- and post-test results in terms of implications for
the curriculum. The pre/post-test must also be
piloted to determine whether it is valid. For example, significant increases in scores between
the pre-test and the post-test can serve as an
indication that the workshop achieved its objectives and the curriculum was effective; lack
of substantial increase in scores, however, requires further analysis and interpretation. A lack
of change or decrease could reflect a poorly designed test rather than indicate a problem with
the curriculum.

A simple method of gauging the validity of the test is to discuss the responses to the test during the training and after participants have completed it on their own. This will enable facilitators to see if the questions themselves were unclear or whether there were gaps in the way the material was covered during the training. Refer to I-TECH’s Guidelines for Pre- and Post-Testing for more detailed information about designing, validating, and interpreting pre/post tests.

Facilitator Evaluation of Curriculum Materials
The pilot evaluation should include structured and systematic feedback from the trainers using the curriculum and supporting training materials. This assessment should include the trainers’ experiences in using the materials, their feedback on the thoroughness and appropriateness of the content, and their suggestions for activities, stories, case studies, and examples that can be incorporated into future editions of the curriculum.

Tool 1: Written evaluation
A written evaluation should be completed by each facilitator at the end of each session he/she facilitates. This evaluation form should ask the facilitator to reflect on what worked well in the session and what did not work well. The evaluation can also ask the facilitator to comment specifically on how he or she used the facilitator manual and the slides for that specific session. Although it may seem like a lot of work for the facilitators to complete an evaluation after each session of the training, it is important to capture their thoughts and very specific feedback while it is still fresh and clear in their minds.
Tool 2: Daily debrief meetings
Trainers and observers (see next section) should
hold a short debriefing at the end of each day.
The purpose of this debrief is to discuss the day’s
activities (what worked well, what needs to be
changed/improved) as well as to prepare for the
following day. A simple form can be used to guide
the discussion and a note taker should be designated to ensure that key points are captured. Debriefings can be extremely effective in identifying
issues or concerns with the curriculum as well as
strategies to address these concerns.


Observer Evaluation of Workshop and Use of
Curriculum Materials
The observer(s) plays an invaluable role during
the piloting of a curriculum. Since observers are
not directly involved in the implementation of the
workshop, they are able to carefully monitor elements, such as use of the materials by trainers and
participants, the interaction between participants
and trainers, and participants’ level of engagement.
They can also be tasked with capturing stories, cases, and examples used by trainers to incorporate
into the curriculum and providing general overall
feedback on each session.
The observer can make notes directly on a copy
of the facilitator’s guide during the presentation of
each session. Observers should note if an activity did not work with the group or if the facilitator
skipped a specific activity or session. The observer
can also note which activities took more time than
anticipated or which ones led to interesting discussions about topics of importance to the learning
objectives of the curriculum. If participants appear
confused at certain times during the training session, the observer can note at what point this confusion occurred. Likewise, if the facilitator deviates
from the curriculum to clear up a point of confusion, the observer should note this as well.
It is helpful to develop an “Observation Guide” for
the observer to fill out at the end of each session
of the training. This form is a helpful tool to remind
observers what to look for, such as the timing and
flow of the session, the use of the facilitator and
participant manuals, participant involvement, and
the quality of the content in the session.

Observers should also participate in the daily debrief meeting with the facilitators to review the successes and challenges of the day’s sessions and
to share their observations. Lastly, the observers
should summarize their detailed comments each
day so that they can be easily fed back into the curriculum review process.
Evaluating Non-Classroom Training Tools
Your curriculum may include components that take place outside of the classroom, like field visits or preceptorships. You may also have created specific tools, like trigger tape scenarios, to enhance your curriculum. Each of these additional pieces should be evaluated individually through separate evaluation forms or group discussions with facilitators and participants. You should also include specific questions about these experiences or tools in the final participant-evaluation forms, observation guides, and facilitator-feedback forms.

Revising the Curriculum
After the pilot training is complete, it is time to review all that you have learned and use it to make
changes to the curriculum. As mentioned earlier, this
is the most important part of the entire pilot-evaluation process. The first step is to compile and analyze
all the data collected. The pre- and post-test should
be scored and results entered into a spreadsheet by
question; this will enable you to determine if there
were particular content areas that were not well understood by participants (this assumes that the pre/
post-test has been validated). Participant evaluation
results, both written and from the focus groups,
should be summarized and analyzed to identify key
themes and issues, and recurring comments about
what worked and what did not. Observer and trainer
feedback on specific sessions should be compiled
in one place, and relevant feedback from the trainer-debrief sessions should be summarized.
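The per-question analysis described above can be done in a spreadsheet or with a few lines of code. The Python sketch below is only an illustration under assumed conditions: it supposes hypothetical files pretest.csv and posttest.csv in which each row is one participant, each column is one question, and answers are scored 1 (correct) or 0 (incorrect). It prints the proportion answering each question correctly before and after the training so that content areas with little or no gain stand out.

    # Illustrative sketch only; the file names and column layout are assumptions,
    # not part of the I-TECH guide. Each CSV row is one participant and each
    # column is one question scored 0 (incorrect) or 1 (correct).
    import csv

    def proportion_correct(path):
        """Return {question: proportion of participants answering correctly}."""
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        return {
            question: sum(int(row[question]) for row in rows) / len(rows)
            for question in (rows[0] if rows else [])
        }

    pre = proportion_correct("pretest.csv")    # hypothetical file name
    post = proportion_correct("posttest.csv")  # hypothetical file name

    print(f"{'Question':<12}{'Pre':>6}{'Post':>6}{'Gain':>7}")
    for question in pre:
        gain = post[question] - pre[question]
        note = "  <- little or no gain; review this content area" if gain < 0.10 else ""
        print(f"{question:<12}{pre[question]:>6.2f}{post[question]:>6.2f}{gain:>7.2f}{note}")

As the guide cautions, a flat score on a question may reflect a weak test item rather than weak content, so results like these should prompt discussion rather than automatic revisions.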
Once you have a good sense of what the data are telling you, the next step is to revise the curriculum and the supporting materials based on the feedback. This can be an extensive process depending on the type of feedback received; it may be necessary to prioritize all the suggested changes to address the most important issues first, depending on available time and resources. Revising the curriculum may include:
• Modifying slides or handouts;
• Enhancing and expanding facilitator’s notes;
• Developing new activities or methodologies, such as case studies;
• Adding, deleting, or reordering content;
• Rewriting sections of the facilitator manual;
• Simplifying the level of language used;
• Adding more clinical depth;
• Developing additional resources, such as a Participant Reference Manual or Pocket Guide;
• Rethinking the length of time of the workshop, or making difficult decisions about what can be covered in the time available for training.

Be sure that the revision includes any changes or updates to local, national, or regional guidelines relevant to the training content.
If extensive changes are made to the curriculum
during the revision process, it is helpful to conduct
a second pilot evaluation. This involves repeating
the methodologies described here during the first
presentation of the revised curriculum. For a large,
technical curriculum that is expected to be adopted
widely in training health care workers, the time and
energy spent evaluating the curriculum will pay off
in the long run with a more effective training and
better prepared health care workers.

Acknowledgments
Funding
This document was developed with funding from Cooperative Agreement U69HA00047 from the US Department of Health and
Human Services, Health Resources and Services Administration (HRSA); its contents are solely the responsibility of the authors and do not
necessarily represent the views of HRSA.
About I-TECH
The International Training and Education Center for Health (I-TECH) is a collaboration between the University of Washington and the
University of California, San Francisco. Its mission is to support the development of a skilled health care work force and well-organized national health delivery systems to provide effective prevention, care, and treatment of infectious diseases in resource-limited
settings. Staff work in Africa, Asia, and the Caribbean in partnership with local ministries of health, universities, non-governmental
organizations, and medical facilities.
I-TECH
901 Boren Avenue, Suite 1100
Seattle, Washington 98104 USA
www.go2itech.org
© 2008 University of Washington
Please seek the permission of I-TECH before reproducing, adapting, or excerpting from this guide.


January 2010 ©I-TECH
