ACON is NSW's largest community-based HIV/AIDS and GLBT health organisation. An organisational review recommended improved Planning & Evaluation, and dedicated P&E roles were created in 2012; Ryan and I started in July. As a larger organisation, ACON has the capacity to support specific roles like these. In smaller organisations the same work may be done without dedicated roles, so the aim today is to give practical advice and tips you can take back to your own workplace. I'll try not to confuse things with ACON jargon.
Current processes and data collection are:
• Non-standardised
• Highly segmented
• Independent
This leads to inconsistent reporting and evaluation, which are:
• Isolated
• Fragmented
• Ad hoc
Together these decrease the potential to:
• Have accurate and meaningful data
• Have proactive, rather than reactive, reporting
• Evaluate and gauge activities
• Maximise outcomes
• Share knowledge and learnings
• Report and communicate feedback, internally and externally
We ran a consultation process with all teams within ACON to find out what they are currently doing (if anything) and what challenges they face. This resulted in the PEKM framework.
As a parallel process to this, we ran an anonymous staff evaluation survey in September to diagnose the culture and practice of evaluation across the organisation. We are fortunate that the responses showed genuine interest in evaluation and a desire to do it. Staff understand its value, so we are in a good position to implement this framework. Overall, there was strong consensus that evaluation plays a number of important roles at ACON, mainly in improving services, and responses indicated that a majority of staff have incorporated evaluation results into program planning and delivery.

The biggest weakness was a lack of shared learning and communication of evaluation processes and outcomes. Addressing this has become a core objective of the PEKM framework: making the 'sharing' of evaluation a key aspect in order to improve organisational and individual learning. It's important that we create a 'safe' place where it's OK to talk about strategies and outcomes, even if they don't work as intended. The main thing is that evaluation is planned in such a way that the right information can be captured to inform that process, so that everyone learns from it and doesn't repeat the same mistakes.

Staff generally want the time to be able to do evaluation properly, to ensure they're capturing the right information and that the information is going somewhere useful. This requires the development of templates and clear guidelines.
The foundation of the PEKM framework is a hybrid of the well-known health promotion and project management cycles, with the added component of 'SHARE', as this is the aspect from which ACON can leverage the greatest benefit. The purpose is to build the capacity of all health promotion practitioners to do their own planning and evaluation. Many health promotion officers are very good at 'doing', but evaluation of the 'doing' is lacking. Our team is here to support the P&E components and integrate them so this becomes a standard part of everyone's role.
The PAS is structured into a thematic model, grouping similar-natured activities and operations into categories. The categories are tiered from a base of fundamental operational activities (the 'Enablers'), through partnership- and human resource-enhancing activities ('Capacity Building'), and finally into wider community engagement activities ('Marketing and Communication', 'Peer Support and Education', and 'Client Support'). This system allows a list of activity types to be standardised for planning and evaluation mechanisms, and reporting requirements, whilst retaining flexibility for program delivery.
So what information do they need to capture? Output is measured the most, and Quality fairly regularly. These make up your 'Process' evaluation: how much did we do, and how well did we do it? If your indicator is to provide 1,000 counselling sessions, that's fairly easy to count, but have you made any difference? That's where impact indicators come in, and it's where people get stuck when setting indicators.
From an organisational perspective, it's important to be able to select clear indicators that tell us whether we've met our goals or not. For example, one objective may be to reduce the transmission of HIV. What might be an indicator that tells us whether we've reduced it? Perhaps the number of HIV notifications: an external, credible data source that is collected anyway, so there's no extra workload for ACON or health practitioners.

BUT what if one of the strategies used to deliver on this objective were to increase testing rates? What if the corresponding rise in the number of people getting tested meant we picked up more diagnoses? Does that mean we've failed if the notification numbers increase? It's important to ensure that indicators are not viewed in isolation. Down at the operational end, evaluation is much more concerned with Impact- and Process-level evaluation.
Evaluation Indicator Bands — Theme: PEER SUPPORT

Primary Activities                 | Output | Quality (SVR) | Impact (∆ KSC) | Impact (∆ Behaviour)
-----------------------------------|--------|---------------|----------------|---------------------
Peer Education Workshops           |   X    |       X       |       X        |          X
Peer Education Outreach            |   X    |       X       |                |
Social Inclusion and Connectedness |   X    |               |                |
Needle and Syringe Program (NSP)   |   X    |       X       |       X        |
Priority Support and Treatments    |   X    |       X       |       X        |          X
Key Benefits and Desired Outcomes
• Strong and adaptable foundation
• Ability to harvest accurate data to inform organisational activities
• Build ACON's capacity to develop knowledge in order to become a learning and sharing organisation
• Ability to monitor and reflect on health trends, and provide targeted responses to community needs
• Increased quality in service(s) delivery
• Greater operational efficiencies
• Increased community, client and employee satisfaction.
Thank You
A special thanks to:
Ryan Jackson, Planning Coordinator
Helen Conway, Monitoring & Evaluation Officer
Alan Brotherton, Director, Policy, Strategy & Research