I-TECH TECHNICAL IMPLEMENTATION GUIDE #3



Piloting a Curriculum:
Evaluating the Effectiveness of a New Training
I-TECH’s Technical Implementation Guides are a series of practical and instructional papers designed
to support staff and partners in their efforts to create and maintain quality programs worldwide.




A successful training program requires multiple components, including skilled trainers with adequate content knowledge, training participants who have the necessary baseline knowledge and motivation to learn, and a well-written curriculum to guide everyone through the process. Implementing training programs often requires a significant amount of training staff time, trainee time spent away from work, and dollars spent on transportation, lodging, and per diem. Pilot-testing a curriculum is an important aspect of quality control in training and can help to ensure that the time and investment in training really pay off. This guide presents an approach for conducting such an evaluation.

Why Pilot?

The purpose of piloting a curriculum is to make sure the curriculum is effective, and to make changes before it is distributed or offered widely. Piloting a curriculum helps to identify which sections of the curriculum worked and which sections need strengthening. The process should include a comprehensive evaluation of the curriculum's effectiveness and usefulness in achieving the course's training objectives. The information gathered from the pilot is used to strengthen and improve the course content, materials, and delivery strategies in the next version of the curriculum.

Pilot testing will only be effective if you have the resources, time, and ability to revise the curriculum based on the evaluation feedback received from the pilot. If, for some reason, you will not be able to make changes to the curriculum, then completing the pilot-evaluation process described below is not advisable.

The process described in this guide for piloting a curriculum is appropriate for extensive, classroom-based curricula in which you have invested significant resources. This type of process is not as relevant for an on-site training where training topics might vary with the situation, or for a less structured curriculum in which experts present case studies or their own PowerPoint slides.

Preparing for the Pilot

The pilot evaluation is an extensive assessment carried out the first time a newly developed or adapted curriculum is used to conduct training. Participants should be informed when they are invited that they are taking part in a pilot session of the training workshop and will be asked to provide extensive and honest feedback as part of the pilot-evaluation process. Emphasizing that their feedback is critical to the development and improvement of the training will often help motivate participants to complete written-evaluation forms in detail and participate actively in group discussions on the training experience. On the first day of the workshop, facilitators should mention that evaluation activities will be more extensive than normal because this is a pilot, and they should articulate the specific evaluation activities that will be conducted.

It is also critical to brief the facilitators in advance that this is a pilot test of the curriculum. Some facilitators may have presented similar material before and may have their own presentations that they are used to giving. For the curriculum pilot, it is important that all facilitators follow the curriculum as closely as possible. All facilitators should be provided with copies of the curriculum well in advance of the workshop so that they have an opportunity to review their assigned sessions.




Recruit at least one observer who will be present for the entire training. The observer is often someone who participated in the development of the curriculum, but this is not a requirement. If it is not possible for someone who participated in the development of the curriculum to attend the pilot, then the observer can be someone with listening and observation skills who is knowledgeable about training design. The observer role is described in more detail later in this guide.



Areas to Evaluate During the Pilot

The evaluation of a pilot curriculum should assess teaching methods, appropriateness of content, materials, issues of timing and flow, and general effectiveness of the training. Some key questions to answer through the evaluation of the pilot are listed below.

Teaching methods
- Were the teaching methods (lecture, discussion, group work, etc.) used in the training successful in increasing participant knowledge/understanding?
- Did some methods work particularly well?
- Did some methods not work and need to be changed?

Content
The pilot session is not the time to review the clinical accuracy of the content. The curriculum should have undergone a technical review prior to piloting to ensure that the content is clinically accurate and reflects current guidelines.
- Was the content at the appropriate depth and breadth for the audience?
- Was the reading level of the curriculum too difficult or too easy?
- Were the right topics covered?
- Were there topics missing?
- Were there stories, examples, or cases mentioned during the workshop that could be incorporated into the curriculum?

Materials
- Were the materials user-friendly for both trainers and participants?
- Did the trainers use all of the materials (PowerPoint slides, handouts, worksheets)?
- Did participants refer to the training materials?
- Were there additional materials or resources that would enhance the training?

Effectiveness
- Did participants acquire the intended skills and knowledge from the training? If not, what were the weak areas?

Timing and flow
- Was there too little or too much time allocated for individual activities?
- Was there too little or too much time allocated for the workshop as a whole?




[Photo: Participants' feedback should include their subjective response, i.e., what worked well, and objective measures, i.e., a change in knowledge.]

Conducting the Evaluation

The evaluation of the pilot session of a new curriculum should encompass feedback from three key groups in order to triangulate data and assess the training from different perspectives: the participants, the facilitators/trainers of the workshop, and the observer(s).

Participant Evaluation of the Workshop

The purpose of a training event is to enhance participants' capacity in a specific, defined area. Participant feedback on the workshop is thus a central piece in assessing the effectiveness of a curriculum. Participants' feedback should include their subjective response to the workshop (What worked well? What did they feel they learned? What did they not understand? How will they apply what they learned in the workshop in their work setting?) and objective measures, such as a change in their knowledge and/or skills resulting from the workshop. The following are specific tools for collecting participants' feedback.

Tool 1: Daily written evaluations

A daily evaluation, to be completed by participants at the end of each day of the workshop, captures participants' reactions to workshop sessions and activities while the information is still fresh in their minds. A daily evaluation form should be brief and easy to complete to avoid overburdening participants. Often, daily evaluations only ask what participants liked and did not like about the day's session. Since this is a pilot, you may want to include some knowledge-related questions specific to the content covered that day to see whether participants absorbed some of its key themes.

Generally, the data from the daily evaluation forms do not need to be rigorously tallied and analyzed; rather, the trainers should scan the forms to determine general trends. For example, if a particular session seems to be receiving notably lower scores than other sessions, this would indicate the need for further investigation to determine the reason. The daily evaluation should also include a question about information not understood by participants so that facilitators can clarify any misunderstandings at the training the following day.

Tool 2: Final written evaluation

Participants should complete a final written evaluation at the end of the final day of the workshop. This evaluation allows participants to reflect on the knowledge and skills acquired during the training within the context of the workshop as a whole. It can often capture different information than the daily evaluation, as material that was unclear or confusing on Day 2 may have become clear by Day 5. A final written evaluation also allows participants to reflect on their key learning from the overall workshop and assess how they will be able to use what they have learned when they return to their work settings.

Tool 3: Focus group discussion

Focus group discussions with participants at the end of the workshop can be a useful method for obtaining feedback about the workshop and curriculum. The group discussion format inspires synergies that can result in a rich discussion, evoking different information than might be provided in a written format. It also facilitates feedback from those who are more comfortable expressing themselves verbally than in writing.




Participants should be divided into groups of approximately 6-8 for the focus groups. Trainers should identify individuals not directly involved with the workshop to facilitate the focus group discussions and take notes. A focus group discussion guide should be developed and used to structure the discussion, but it should not limit or constrain the discussion. Key themes emerging from the groups should be identified and summarized. Refer to I-TECH's Organizing and Conducting Focus Groups for more detailed information on focus group discussions.

Tool 4: Pre- and post-test

The pre- and post-test provides an objective measure of changes in knowledge and/or skills resulting from the training, and it can thus provide valuable information about the effectiveness of the curriculum.

Caution must be used in interpreting pre- and post-test results in terms of their implications for the curriculum. The pre/post-test must also be piloted to determine whether it is valid. For example, significant increases in scores between the pre-test and the post-test can serve as an indication that the workshop achieved its objectives and the curriculum was effective; a lack of substantial increase in scores, however, requires further analysis and interpretation. A lack of change, or a decrease, could reflect a poorly designed test rather than indicate a problem with the curriculum.

A simple method of gauging the validity of the test is to discuss the responses during the training, after participants have completed the test on their own. This will enable facilitators to see whether the questions themselves were unclear or whether there were gaps in the way the material was covered during the training. Refer to I-TECH's Guidelines for Pre- and Post-Testing for more detailed information about designing, validating, and interpreting pre/post-tests.

Facilitator Evaluation of Curriculum Materials

The pilot evaluation should include structured and systematic feedback from the trainers using the curriculum and supporting training materials. This assessment should include the trainers' experiences in using the materials, their feedback on the thoroughness and appropriateness of the content, and their suggestions for activities, stories, case studies, and examples that can be incorporated into future editions of the curriculum.

Tool 1: Written evaluation

A written evaluation should be completed by each facilitator at the end of each session he or she facilitates. This evaluation form should ask the facilitator to reflect on what worked well in the session and what did not work well. The evaluation can also ask the facilitator to comment specifically on how he or she used the facilitator manual and the slides for that specific session. Although it may seem like a lot of work for the facilitators to complete an evaluation after each session of the training, it is important to capture their thoughts and very specific feedback while it is still fresh and clear in their minds.

[Photo: Participants should complete a written evaluation at the end of the final day of the workshop.]




Tool 2: Daily debrief meetings

Trainers and observers (see the next section) should hold a short debriefing at the end of each day. The purpose of this debrief is to discuss the day's activities (what worked well, what needs to be changed or improved) as well as to prepare for the following day. A simple form can be used to guide the discussion, and a note taker should be designated to ensure that key points are captured. Debriefings can be extremely effective in identifying issues or concerns with the curriculum, as well as strategies to address them.

[Photo: Trainers and observers should hold a short debriefing at the end of each day.]

Observer Evaluation of Workshop and Use of Curriculum Materials

The observer(s) plays an invaluable role during the piloting of a curriculum. Since observers are not directly involved in the implementation of the workshop, they are able to carefully monitor elements such as the use of the materials by trainers and participants, the interaction between participants and trainers, and participants' level of engagement. They can also be tasked with capturing stories, cases, and examples used by trainers that could be incorporated into the curriculum, and with providing general overall feedback on each session.

The observer can make notes directly on a copy of the facilitator's guide during the presentation of each session. Observers should note if an activity did not work with the group or if the facilitator skipped a specific activity or session. The observer can also note which activities took more time than anticipated or which ones led to interesting discussions about topics of importance to the learning objectives of the curriculum. If participants appear confused at certain times during the training session, the observer can note at what point this confusion occurred. Likewise, if the facilitator deviates from the curriculum to clear up a point of confusion, the observer should note this as well.

It is helpful to develop an "Observation Guide" for the observer to fill out at the end of each session of the training. This form is a helpful tool to remind observers what to look for, such as the timing and flow of the session, the use of the facilitator and participant manuals, participant involvement, and the quality of the content in the session.

Observers should also participate in the daily debrief meeting with the facilitators to review the successes and challenges of the day's sessions and to share their observations. Lastly, the observers should summarize their detailed comments each day so that they can be easily fed back into the curriculum review process.




Evaluating Non-Classroom Training Tools

Your curriculum may include components that take place outside of the classroom, like field visits or preceptorships. You may also have created specific tools, like trigger tape scenarios, to enhance your curriculum. Each of these additional pieces should be evaluated individually through separate evaluation forms or group discussions with facilitators and participants. You should also include specific questions about these experiences or tools in the final participant-evaluation forms, observation guides, and facilitator-feedback forms.

Revising the Curriculum

After the pilot training is complete, it is time to review all that you have learned and use it to make changes to the curriculum. As mentioned earlier, this is the most important part of the entire pilot-evaluation process. The first step is to compile and analyze all the data collected. The pre- and post-test should be scored and the results entered into a spreadsheet by question; this will enable you to determine whether there were particular content areas that were not well understood by participants (this assumes that the pre/post-test has been validated; see the illustrative scoring sketch at the end of this section). Participant evaluation results, both written and from the focus groups, should be summarized and analyzed to identify key themes and issues, and recurring comments about what worked and what did not. Observer and trainer feedback on specific sessions should be compiled in one place, and relevant feedback from the trainer-debrief sessions should be summarized.

Once you have a good sense of what the data are telling you, the next step is to revise the curriculum and the supporting materials based on the feedback. This can be an extensive process depending on the type of feedback received; it may be necessary to prioritize the suggested changes so that the most important issues are addressed first, depending on available time and resources. Revising the curriculum may include:

- Modifying slides or handouts;
- Enhancing and expanding facilitator's notes;
- Developing new activities or methodologies, such as case studies;
- Adding, deleting, or reordering content;
- Rewriting sections of the facilitator manual;
- Simplifying the level of language used;
- Adding more clinical depth;
- Developing additional resources, such as a Participant Reference Manual or Pocket Guide;
- Rethinking the length of the workshop, or making difficult decisions about what can be covered in the time available for training.

Be sure that the revision includes any changes or updates to local, national, or regional guidelines relevant to the training content.

If extensive changes are made to the curriculum during the revision process, it is helpful to conduct a second pilot evaluation. This involves repeating the methodologies described here during the first presentation of the revised curriculum. For a large, technical curriculum that is expected to be adopted widely in training health care workers, the time and energy spent evaluating the curriculum will pay off in the long run with a more effective training and better prepared health care workers.
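Teams that record per-question pre/post-test results in a simple CSV file rather than a spreadsheet application can automate the "by question" tally described above. The sketch below is illustrative only and is not part of the original guide; the file name (scores.csv), the column layout (one row per participant per question, with 1/0 correctness flags), and the 20-percentage-point gain threshold are assumptions that a review team would adapt to its own test.

```python
# Illustrative sketch (not from the I-TECH guide): tally pre/post-test results
# by question to flag content areas that may need strengthening.
# Assumed input: scores.csv with columns question,pre_correct,post_correct,
# one row per participant per question, values 1 (correct) or 0 (incorrect).
import csv
from collections import defaultdict

def summarize(path="scores.csv", min_gain=20.0):
    totals = defaultdict(lambda: {"pre": 0, "post": 0, "n": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["question"]]
            t["pre"] += int(row["pre_correct"])
            t["post"] += int(row["post_correct"])
            t["n"] += 1

    for question, t in sorted(totals.items()):
        pre_pct = 100.0 * t["pre"] / t["n"]
        post_pct = 100.0 * t["post"] / t["n"]
        gain = post_pct - pre_pct
        # The threshold is arbitrary; set it with the curriculum review team.
        flag = "  <-- review this content area" if gain < min_gain else ""
        print(f"{question}: pre {pre_pct:.0f}%, post {post_pct:.0f}%, "
              f"gain {gain:+.0f} pts{flag}")

if __name__ == "__main__":
    summarize()
```

Read any flagged questions alongside the observer notes and facilitator feedback for the corresponding sessions; as noted earlier, a weak gain may point to a content gap, but it may equally reflect a poorly designed test item.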




       Acknowledgments

       Funding
This document was developed with funding from Cooperative Agreement U69HA00047 from the US Department of Health and Human Services, Health Resources and Services Administration (HRSA); its contents are solely the responsibility of the authors and do not
       necessarily represent the views of HRSA.

       About I-TECH
       The International Training and Education Center for Health (I-TECH) is a collaboration between the University of Washington and the
       University of California, San Francisco. Its mission is to support the development of a skilled health care work force and well-orga-
       nized national health delivery systems to provide effective prevention, care, and treatment of infectious diseases in resource-limited
       settings. Staff work in Africa, Asia, and the Caribbean in partnership with local ministries of health, universities, non-governmental
       organizations, and medical facilities.

       I-TECH
       901 Boren Avenue, Suite 1100
       Seattle, Washington 98104 USA
       www.go2itech.org

       © 2008 University of Washington
       Please seek the permission of I-TECH before reproducing, adapting, or excerpting from this guide.




                                                           January 2010 ©I-TECH
