Best Practices
IDEA Feedback for Chairs and Other Administrators
Insight. Improvement. Impact.®


The primary purpose of all IDEA Feedback survey instruments is to provide information that
may be used to guide professional development efforts. Employing processes that encourage
conscientious responses by those surveyed and thoughtful reflection by those receiving results
is essential if the survey process is to be of value.

The following considerations and recommendations are intended to assist in the successful
implementation of the survey process.

Determining the Survey Purpose
Is the purpose of the feedback to determine continuance, to make other “summative” decisions,
and/or to guide improvement efforts?
The IDEA administrator feedback tools were designed to guide formative improvement.
Although the surveys can be used as part of a summative judgment process, a tangible
benefit of the IDEA tools comes from using them formatively. How to encourage reflection
using IDEA results is addressed later in this document.

When the survey is being used for making administrative decisions, The IDEA Center strongly
recommends that other sources of evidence beyond the IDEA feedback tool be used and that
survey participants be informed of this.

Typically, expectations and goals for administrators vary from one unit to the next. For
example, respondents often are not aware of the charge an administrator received when he or
she assumed the position. Identifying and using other sources of information to assess those
unique expectations is clearly important.

Adopting a Systematic Rather than Sporadic Survey Administration Schedule
Routine ongoing survey administration is beneficial for all concerned. Predictable, ongoing
assessment processes reduce concerns about the intended purpose of an evaluation.
Campuses risk creating the impression that the evaluation/feedback process is tied to poor
performance when a survey is administered only when there is a “problem.” This is
counterproductive to creating a culture in which constructive feedback is valued for its
support of professional development.

The IDEA Center makes no recommendation as to how frequently these ongoing, systematic
administrator evaluations should occur, with one exception. A desirable time to seek initial
formative feedback is in one’s third semester of service. By this time respondents will have had
enough experience with the administrator to provide valuable feedback, but perceptions will not
be so entrenched that they will be difficult to change. Knowing what may need to be changed
early in one’s career increases the likelihood of a longer and more successful tenure.

Identifying Appropriate Respondents
Identifying who should be asked to complete a survey is not as simple a task as it might first
appear. Often political considerations interfere with gaining the most useful and valid
information. Some campuses select respondents by categories of employment, and within those
categories there may be individuals new to campus, or part-time employees, who have had no
contact with the administrator. Criteria for removing individuals from the survey list
should be created so that only those who can provide valid feedback are included. In general, only
individuals who have had enough exposure to the administrator being rated to provide useful
feedback about that individual should be asked to complete a survey.
Whatever process is used to select potential survey respondents, it should be applied
consistently for everyone being rated.
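
To make this screening rule concrete, a minimal sketch in Python is shown below. The roster
fields, names, and the six-month exposure threshold are hypothetical assumptions for
illustration, not IDEA Center requirements; substitute your campus's own criteria.

    from datetime import date

    MIN_MONTHS_EXPOSURE = 6  # hypothetical threshold, not an IDEA Center rule

    def months_between(start, end):
        """Whole months elapsed between two dates."""
        return (end.year - start.year) * 12 + (end.month - start.month)

    def eligible(person, survey_date):
        """Include only respondents with enough exposure to the administrator."""
        if not person["has_contact"]:  # e.g., no working relationship at all
            return False
        return months_between(person["start_date"], survey_date) >= MIN_MONTHS_EXPOSURE

    roster = [
        {"name": "Alvarez", "start_date": date(2009, 1, 15), "has_contact": True},
        {"name": "Baker",   "start_date": date(2010, 8, 1),  "has_contact": True},   # too new
        {"name": "Chen",    "start_date": date(2008, 9, 1),  "has_contact": False},  # no contact
    ]

    respondents = [p for p in roster if eligible(p, date(2010, 10, 1))]
    print([p["name"] for p in respondents])  # ['Alvarez']

Applying the same rule to every administrator's respondent list keeps the selection
consistent, as recommended above.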

Using Additional Questions
The IDEA feedback systems allow for the addition of 20 user-defined questions. This provides
the opportunity to ask questions that may be of unique importance to your campus or to the
individual being rated. Given the effort required to administer these surveys, this opportunity
should not be lost.

Another use of extra questions is to collect useful information not directly related to the
individual being rated. Asking a few questions about the campus, college, or departmental
climate (or other issues) is an efficient way to collect information without conducting another
survey. This approach has a side benefit as well: such surveys typically draw higher response
rates than more general campus climate or issue surveys.

Determining Dissemination of Results to Relevant Constituencies
There is no clear right or wrong approach to providing survey results to relevant constituencies,
but the approach must be considered thoughtfully and determined before the survey is administered. It is
recommended that the “official” survey results be distributed in a consistent fashion. This is
especially important when a group of individuals is being surveyed at a similar time for a similar
purpose (e.g., all senior administrative staff, all of the deans in a university, or all of the chairs in
a college).

If the results are to be provided beyond the individual administrator and his or her supervisor, it
is important to determine how they will be conveyed and to whom.
    •   How will the results be communicated? Determine whether a written or oral summary
        will be provided. If the results are provided in writing, take care: written results
        may be distributed beyond the intended audience, especially if they are distributed
        electronically.
    •   Who will prepare the summary? It is typical for the summary to be prepared
        collaboratively by the individual who was rated and his or her supervisor.
    •   Can the administrator provide additional information about his or her results? For
        example, chairs have provided more detailed survey results to their faculty at
        department meetings and described their plans for improvement at that time. On one
        campus, an administrator early in his career with particularly poor results extended a
        public “mea culpa” to his staff and was proactive in acknowledging his poor
        performance as well as his plans to change. Reports of these more comprehensive
        feedback sessions have generally been received positively.
    •   What are the legal requirements for dissemination of survey results? Different states
        have different legal requirements concerning what survey results (if any) must be
        made available to the general public. Check your state’s requirements so that you
        collect data in a manner that protects you from disseminating results more broadly
        than you wish.

Preparing Respondents
Inform respondents by email, mail, or personal communication at least one week before the
survey is to be administered that they will receive a survey from The IDEA Center requesting
feedback about their administrator’s performance. The communication may come from one or
all of the following: the individual being rated, his or her superior, or a human resources contact. The
purpose of the survey often helps determine who should send the communication. If the primary
purpose is to determine the individual’s continuance in his/her position, the administrator’s
superior might inform the raters. On the other hand, if the primary purpose is to guide personal
reflection, it might be more appropriate for the individual being rated to provide the information.
Some specific points to cover in the communication include:
  • Emphasize the value and importance of the feedback. No matter how well someone is
      currently performing, something can always be learned from constructive feedback from
      valid sources. And, without it, improvement rarely occurs.
   •   Inform respondents about the intended audience. If the results are used only to guide
       improvement, it may be that the only person who will see the results is the administrator
       being rated. If used for administrative purposes, the individual’s supervisor and perhaps
       others will view the findings. In other instances, the results may be published or
       summarized for a broader audience, including the raters.
   •   Set the tone that encourages constructive feedback. Encourage respondents to offer
       constructive and not mean-spirited or vindictive comments. This is especially important
       when soliciting responses to open-ended questions.
   •   Inform respondents of the survey’s purpose. It is important for them to know if the results
       will be used to determine continuance, to make other “summative” decisions, and/or to
       guide improvement efforts.
   •   Summarize other sources of evidence used in the evaluation process. A failure to inform
       those surveyed that other sources of evidence will be used in the evaluation process
       may lead them to conclude that their responses have more influence than they actually
       do for the administrative decision being made.
    •   Indicate whether, how, and to whom results will be disseminated, and what will be
        shared. Having a clear plan and communicating it early will set clear expectations
        for those participating in the process.

Additional information about preparing respondents is available in the Guide to Administering
IDEA Feedback Systems for College and University Administrators
(www.theideacenter.org/AdminGuide).

Reflect, Reflect, Reflect
A primary purpose of all IDEA feedback surveys is to encourage reflection intended to elicit
higher performance. At the very least, the individual being rated should be asked to respond to
a few basic questions such as:
    • What results were affirming?
    • What results were surprising?
    • What may have been of concern?
    • How will you use this information to guide your professional development efforts?

The individual’s supervisor should reflect on the same questions and schedule a time free from
interruptions in a comfortable setting to discuss the results and how they might be used to
support future improvement efforts.

A practice that many individuals have found useful is for the individual being rated to print
a copy of the survey being used from The IDEA Center’s website (www.theideacenter.org) and
then complete a self-rating. When the survey results are available, the individual being rated
can compare his or her self-rating to the average of the raters. When the rater average for an
item differs from the self-rating by more than .5 (on a five-point scale) or .75 (on a
seven-point scale), a difference in perception between the raters and the administrator may exist.
Perceptions are important, even if they are inaccurate, and it can be beneficial to ask why
individuals might view you differently than you view yourself. In addition, some campuses will
ask the administrator’s supervisor to also participate in this process to discover where additional
differences in perceptions might exist.
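
As an illustration, the comparison described above might be computed as in the brief Python
sketch below. The item names and scores are hypothetical; the .5 and .75 thresholds are those
cited in the preceding paragraph.

    THRESHOLDS = {5: 0.5, 7: 0.75}  # maximum gap before flagging, by scale length

    def perception_gaps(self_ratings, rater_ratings, scale_points):
        """Flag items where the rater average differs notably from the self-rating."""
        threshold = THRESHOLDS[scale_points]
        flagged = []
        for item, self_score in self_ratings.items():
            scores = rater_ratings[item]
            rater_avg = sum(scores) / len(scores)
            if abs(rater_avg - self_score) > threshold:
                flagged.append((item, self_score, round(rater_avg, 2)))
        return flagged

    self_ratings = {"communicates goals": 4, "supports faculty": 5}
    rater_ratings = {
        "communicates goals": [3, 4, 3, 3],
        "supports faculty":   [5, 4, 5, 5],
    }

    for item, mine, theirs in perception_gaps(self_ratings, rater_ratings, scale_points=5):
        print(f"{item}: self-rating {mine}, rater average {theirs}")
    # communicates goals: self-rating 4, rater average 3.25

Flagged items are starting points for discussion, not verdicts; as noted above, even
inaccurate perceptions are worth exploring.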

The opportunity for the administrator who was rated to reflect on the survey results with
someone who is not responsible for evaluating him or her is another nearly essential component
of a useful professional development process. The consultant should be knowledgeable about
leadership development and should understand the administrator’s role. A resource to help
individuals reflect upon results in a non-threatening way will enhance the value of the process.
Such a resource demonstrates the institution’s commitment to professional development.

A thoughtfully designed reflection process is essential to truly foster benefits from the survey
process. The above recommendations along with the Guide to Administering IDEA Feedback
Systems for College and University Administrators are intended to support the effective
implementation of the administrator feedback surveys.




©2010 The IDEA Center
