
RHINO Forum Presentation on DQR Framework

Slides from the RHINO Forum DQR Webinar Kickoff - Wednesday, September 27, 2017


  1. Data Quality Review (DQR) Framework: Review of the quality of health facility data
  2. Why is health facility data important? For many indicators it is the only continuous/frequent source of data. It is most often the only data source available at the subnational level, which is important for equity. For many key indicators (e.g., PMTCT, ART, TB treatment outcomes, TB notification, confirmed malaria cases, causes of death) it is the sole source of data.
  3. Quality of health facility data: why do we care? High-quality data provide evidence to providers and managers to optimize healthcare coverage, quality, and services. High-quality data help: form an accurate picture of health needs, programs, and services in specific areas; inform appropriate planning and decision making; inform effective and efficient allocation of resources; and support ongoing monitoring by identifying best practices and areas where support and corrective measures are needed.
  4. Most common problems affecting data quality: lack of guidelines for filling out the main data sources and reporting forms; inadequately trained personnel; misunderstanding of how to compile data, use tally sheets, and prepare reports; unstandardized source documents and reporting forms; arithmetic errors during data compilation; and lack of a review process before report submission to the next level.
  5. Many tools have been used to address data quality: the GAVI DQA; the WHO Immunization Data Quality Self-assessment (DQS); the Global Fund/MEASURE Evaluation DQA; the RDQA, a self-assessment version of the DQA (with in-country adaptations); the Global Fund OSDV; PRISM; and the WHO Data Quality Report Card (DQRC).
  6. DQR: a harmonized approach to assessing and improving data quality. The DQR is a multi-pronged, multi-partner framework for country-led data quality assurance that proposes a harmonized approach to assessing data quality. It builds on the earlier program-specific quality tools and methods while proposing a more systemic examination of data quality that can meet the needs of multiple stakeholders. It also includes the examination of existing facility data (requiring no additional data collection), which was missing from earlier tools. And it provides valuable fitness-for-purpose information to support the Health Sector Strategic Planning Cycle (e.g., health sector or program reviews).
  7. Why are we recommending a harmonized approach to data quality? Data quality is a systems issue: multiple assessments for different diseases/programs are inefficient and burdensome for the health system. Can we satisfy the data quality assurance needs of all stakeholders with one holistic data quality assessment? Applying a standard framework to evaluate data quality enables an understanding of the adequacy of routine data used for health sector planning: can we link data quality assessment to health planning efforts? It also permits stakeholders to know that the routine data have undergone a known minimum level of scrutiny, which lends the data credibility and confidence.
  8. DQR Framework
  9. Recommended Core Program Indicators. Maternal Health: antenatal care 1st visit (ANC1), the number (%) of pregnant women who attended ANC at least once during their pregnancy. Immunization: DTP3/Penta3, the number (%) of children under 1 year receiving three doses of DTP/Penta vaccine. HIV/AIDS: ART coverage, the number (%) of people living with HIV who are currently receiving ART. TB: notified cases of all forms of TB, the number (%) of all forms of TB cases (bacteriologically confirmed plus clinically diagnosed) reported to the national health authority in the past year (new and relapse). Malaria: confirmed malaria cases, the number (%) of all suspected malaria cases confirmed by microscopy or RDT.
  10. DQR methodology: components
  11. Domains of Data Quality
  12. Data Quality Review (DQR): Desk review
  13. Metrics for Data Quality Performance. Completeness and timeliness: completeness of reports, completeness of data, timeliness of reports. Internal consistency: accuracy, outliers, trends, consistency between indicators. External consistency: data triangulation, comparison with survey data, consistency of population trends. External comparisons (population denominators).
  14. Completeness and Timeliness of Data. This examines the extent to which: data reported through the system are available and adequate for the intended purpose; all entities that are supposed to report are actually reporting; data elements in submitted reports are complete; and reports are submitted/received on time through the levels of the information system data flow.
  15. Completeness and Timeliness of Data. Completeness of reports (%) = (# total reports available or received) / (# total reports expected). Completeness of indicator data (%) = (# indicator values entered, i.e. not missing, in the report) / (# total expected indicator values). Timeliness (%) = (# reports submitted or received on time) / (# total reports available or received).
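The three metrics above are simple percentages. As a minimal sketch (function and variable names are my own, not part of the DQR tools), they can be computed as:

```python
def completeness_of_reports(received: int, expected: int) -> float:
    """Completeness of reports (%): reports available or received / reports expected."""
    return 100.0 * received / expected

def completeness_of_indicator_data(entered: int, expected: int) -> float:
    """Completeness of indicator data (%): values entered (not missing) / values expected."""
    return 100.0 * entered / expected

def timeliness(on_time: int, received: int) -> float:
    """Timeliness (%): reports on time / reports available or received."""
    return 100.0 * on_time / received

# For example, 45 of 50 expected reports received, 40 of them on time:
print(completeness_of_reports(45, 50))  # 90.0
print(round(timeliness(40, 45), 1))     # 88.9
```

Note that timeliness is measured against reports actually received, not against reports expected, so a district can score high on timeliness while scoring low on completeness.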
  16. Internal Consistency of Reported Data. This dimension examines: the accuracy of reporting of selected indicators, by reviewing source documents; whether data are free of outliers (within bounds), by assessing whether specific reported values within the selected period (such as monthly) are extreme relative to the other values reported; trends in reporting over time, to identify extreme or implausible year-to-year values; and the program indicator compared to other indicators with which it has a predictable relationship, to determine whether the expected relationship exists between the two indicators.
  17. Internal Consistency: Outliers (analyze each indicator separately). Extreme outliers (at least 3 standard deviations from the mean): at the national level, the % of monthly subnational unit values that are extreme outliers; at the subnational level, the # (%) of subnational units in which ≥1 of the monthly values over the course of 1 year is an extreme outlier. Moderate outliers (between 2 and 3 standard deviations from the mean, or >3.5 on the modified z-score method): at the national level, the % of subnational unit values that are moderate outliers; at the subnational level, the # (%) of subnational units in which ≥2 of the monthly values over the course of 1 year are moderate outliers.
  18. Example: Outliers in a Given Year. Monthly values for 12 months, followed by the count and % of outlier months; months with at least one moderate outlier on the district monthly reports were shown in red on the slide. District A: 2543, 2482, 2492, 2574, 3012, 2709, 3019, 2750, 3127, 2841, 2725, 2103 (1 outlier, 8.3%). District B: 1184, 1118, 1195, 1228, 1601, 1324, 1322, 711, 1160, 1178, 1084, 1112 (2 outliers, 16.7%). District C: 776, 541, 515, 527, 857, 782, 735, 694, 687, 628, 596, 543 (0 outliers). District D: 3114, 2931, 2956, 4637, 6288, 4340, 3788, 3939, 3708, 4035, 3738, 3606 (1 outlier, 8.3%). District E: 1382, 1379, 1134, 1378, 1417, 1302, 1415, 1169, 1369, 1184, 1207, 1079 (0 outliers). National: 4 outlier months across all districts (6.7%).
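The outlier classification described above can be sketched with the slide's thresholds (≥3 SD for extreme; 2-3 SD from the mean, or modified z-score >3.5, for moderate). The function name and the use of Python's statistics module are my own choices, not part of the DQR tools:

```python
from statistics import mean, median, stdev

def classify_outliers(values, extreme_sd=3.0, moderate_sd=2.0, mod_z_cutoff=3.5):
    """Label each monthly value 'extreme', 'moderate', or None.

    Extreme: at least extreme_sd standard deviations from the mean.
    Moderate: between moderate_sd and extreme_sd SDs from the mean, or a
    modified z-score (0.6745 * |value - median| / MAD) above mod_z_cutoff.
    """
    mu, sd, med = mean(values), stdev(values), median(values)
    mad = median(abs(v - med) for v in values)  # median absolute deviation
    labels = []
    for v in values:
        z = abs(v - mu) / sd if sd else 0.0
        mod_z = 0.6745 * abs(v - med) / mad if mad else 0.0
        if z >= extreme_sd:
            labels.append("extreme")
        elif z >= moderate_sd or mod_z > mod_z_cutoff:
            labels.append("moderate")
        else:
            labels.append(None)
    return labels

# District B's monthly values from the example slide: two moderate outliers
district_b = [1184, 1118, 1195, 1228, 1601, 1324, 1322, 711, 1160, 1178, 1084, 1112]
print(classify_outliers(district_b).count("moderate"))  # 2
```

Running this on District B flags the May spike (1601) and the August dip (711), matching the slide's count of 2 outlier months for that district.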
  19. Internal Consistency: Trends Over Time (analyze each indicator separately). Conduct one of the following, based on the indicator's expected trend. For indicators or programs with expected growth, compare the current year to the value predicted from the trend in the 3 preceding years. For indicators or programs expected to remain constant, compare the current year to the average of the 3 preceding years. National level: # (%) of districts whose ratio of current year to predicted value (or current year to the average of the preceding 3 years) differs from the national ratio by at least ±33%. Subnational level: graphic depiction of the trend, to determine plausibility based on programmatic knowledge.
  20. Example: Trends over Time. Consistency of trend: comparison of district ratios to the national ratio; any district whose ratio differs from the national ratio by ≥33% is highlighted in red. Values are annual totals for 2010-2013, followed by the mean of 2010-2012, the ratio of 2013 to that mean, and the % difference between the national and district ratios. District A: 30242, 29543, 26848, 32377; mean 28878; ratio 1.12; difference 0.03. District B: 19343, 17322, 16232, 18819; mean 17632; ratio 1.07; difference 0.08. District C: 7512, 7701, 7403, 7881; mean 7539; ratio 1.05; difference 0.09. District D: 15355, 15047, 14788, 25123; mean 15063; ratio 1.67; difference 0.44 (flagged). District E: 25998, 23965, 24023, 24259; mean 24662; ratio 0.98; difference 0.16. National: 98450, 93578, 89294, 108459; mean 93774; ratio 1.16.
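The district-versus-national comparison in this example can be sketched as follows. The names and data structures are my own; the flagging rule matches the example's ≥33% relative difference between district and national ratios:

```python
from statistics import mean

def trend_ratio(preceding_years, current_year):
    """Ratio of the current-year value to the mean of the preceding years
    (for indicators expected to remain roughly constant)."""
    return current_year / mean(preceding_years)

def flag_trend_divergence(districts, national, threshold=0.33):
    """Return {district: relative difference} for districts whose ratio
    differs from the national ratio by at least `threshold`.

    districts: {name: ([preceding-year totals], current-year total)}
    national:  ([preceding-year totals], current-year total)
    """
    nat = trend_ratio(*national)
    flagged = {}
    for name, (history, current) in districts.items():
        diff = abs(trend_ratio(history, current) - nat) / nat
        if diff >= threshold:
            flagged[name] = round(diff, 2)
    return flagged

# Districts A and D from the example slide; only D (difference 0.44) is flagged
flagged = flag_trend_divergence(
    {"A": ([30242, 29543, 26848], 32377), "D": ([15355, 15047, 14788], 25123)},
    ([98450, 93578, 89294], 108459),
)
print(flagged)  # {'D': 0.44}
```

For an indicator with expected growth, `trend_ratio` would instead compare the current year to a value forecast from the preceding trend, as the slide describes.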
  21. Internal Consistency: Comparing Related Indicators. Maternal Health: ANC1 vs. IPT1 or TT1 (should be roughly equal); # (%) of subnational units with an extreme difference (≥ ±10%). Immunization: DTP3 dropout rate = (DTP1 - DTP3)/DTP1 (should not be negative); # (%) of subnational units with more DTP3 immunizations than DTP1 immunizations (negative dropout). HIV/AIDS: ART coverage vs. HIV coverage (should be <1); # (%) of subnational units with an extreme difference (≥ ±10%). TB: TB cases notified vs. TB cases on treatment (should be roughly equal); # (%) of subnational units with an extreme difference (≥ ±10%). Malaria: # confirmed malaria cases reported vs. cases testing positive (should be roughly equal); # (%) of subnational units with an extreme difference (≥ ±10%).
  22. Example: Internal Consistency. % difference between ANC1 and IPT1, by district; districts with a % difference ≥10% are flagged in red. District A: ANC1 20995, IPT1 18080, ratio 1.16, difference 0.02. District B: ANC1 18923, IPT1 16422, ratio 1.15, difference 0.02. District C: ANC1 7682, IPT1 6978, ratio 1.10, difference 0.07. District D: ANC1 12663, IPT1 9577, ratio 1.32, difference 0.12 (flagged). District E: ANC1 18214, IPT1 15491, ratio 1.18, difference 0. National: ANC1 78477, IPT1 66548, ratio 1.18.
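The ANC1:IPT1 check in this example can be sketched like this; the 10% default comes from the slide's threshold, while the function name and data structure are my own:

```python
def related_indicator_flags(district_pairs, threshold=0.10):
    """Flag districts whose ratio of two related indicators (e.g. ANC1:IPT1)
    differs from the national ratio by at least `threshold` (10%).

    district_pairs: {name: (indicator_a, indicator_b)}; the national ratio
    is computed from the summed counts across districts.
    """
    total_a = sum(a for a, _ in district_pairs.values())
    total_b = sum(b for _, b in district_pairs.values())
    national_ratio = total_a / total_b
    return [
        name
        for name, (a, b) in district_pairs.items()
        if abs(a / b - national_ratio) / national_ratio >= threshold
    ]

# Data from the example slide: only District D (difference 0.12) is flagged
districts = {
    "A": (20995, 18080), "B": (18923, 16422), "C": (7682, 6978),
    "D": (12663, 9577), "E": (18214, 15491),
}
print(related_indicator_flags(districts))  # ['D']
```

The same function applies to the other program-area pairs on the previous slide (e.g. TB cases notified vs. TB cases on treatment), since each uses the same ratio-and-threshold logic.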
  23. External Consistency with Other Data Sources. This dimension examines the level of agreement between two sources of data measuring the same health indicator. The two most common sources are: routinely collected and reported data from the health management information system (HMIS) or a program-specific information system, and a periodic population-based survey.
  24. External Consistency: Compare with Survey Results. ANC 1st visit: ratio of facility ANC1 coverage rates to survey ANC1 coverage rates; # (%) of aggregation units used for the most recent population-based survey (such as province/state/region) whose ANC1 facility-based coverage rates and survey coverage rates differ by at least 33%. 3rd dose of DTP vaccine: ratio of DTP3 coverage rates from routine data to survey DTP3 coverage rates; # (%) of aggregation units whose DTP3 facility-based coverage rates and survey coverage rates differ by at least 33%.
  25. Example: External Consistency. Comparison of HMIS and survey coverage rates for ANC1; differences ≥33% between the facility-based and survey rates are highlighted in red. District A: facility rate 1.05, survey rate 0.95, ratio 1.10, difference 10%. District B: facility 0.93, survey 0.98, ratio 0.96, difference 4%. District C: facility 1.39, survey 0.90, ratio 1.54, difference 54% (flagged). District D: facility 1.38, survey 0.92, ratio 1.50, difference 50% (flagged). District E: facility 0.76, survey 0.95, ratio 0.80, difference 20%. National: facility 1.10, survey 0.94, ratio 1.17, difference 17%.
  26. External Comparison of Population Data. This dimension examines two points: the adequacy of the population data used in the calculation of health indicators, and the comparison of two different sources of population estimates (for which the values are calculated differently) to see the level of congruence between the two sources.
  27. External Comparison of Population Data. Consistency of population projections (national level only): the ratio of the population projection of live births from the country census bureau/bureau of statistics to a United Nations live-birth projection for the country. Consistency of denominators between program data and official government population statistics: the ratio of the population projection for selected indicator(s) from the census to the values used by programs; at the subnational level, the # (%) of subnational units with an extreme difference (e.g., ±10%) between the two denominators.
  28. External Comparisons of Population Denominators. Comparison of official government and health program estimates of live births, by administrative unit; units with differences ≥ ±10% are highlighted in red. District A: official 29855, program 29351, ratio 1.02. District B: official 25023, program 30141, ratio 0.83 (flagged). District C: official 6893, program 7420, ratio 0.93. District D: official 14556, program 14960, ratio 0.97. District E: official 25233, program 25283, ratio 1.00. National: official 101560, program 107155, ratio 0.95.
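The denominator comparison above reduces to another ratio-and-threshold check. A minimal sketch, with names of my own choosing:

```python
def denominator_flags(official, program, threshold=0.10):
    """Flag units where the ratio of the official government estimate to the
    program estimate deviates from 1 by more than `threshold` (±10%)."""
    return [
        name
        for name in official
        if abs(official[name] / program[name] - 1.0) > threshold
    ]

# Live-birth estimates from the example slide: only District B (ratio 0.83) is flagged
official = {"A": 29855, "B": 25023, "C": 6893, "D": 14556, "E": 25233}
program = {"A": 29351, "B": 30141, "C": 7420, "D": 14960, "E": 25283}
print(denominator_flags(official, program))  # ['B']
```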
  29. How do we do the desk review? A data quality app has been created for DHIS and can be downloaded by users and applied to their country DHIS databases. For those that do not have DHIS, an Excel tool has been developed to support data quality analysis. The principles of the desk review can be applied in any software a country has; they are not limited to the tools presented.
  30. Data Quality Review (DQR): Data verification and system assessment
  31. Facility Survey Component of the DQR. There are 2 components: data verification, which examines the accuracy of reporting of selected indicators by reviewing source documents; and system assessment, which reviews the adequacy of the system to collect, compile, transmit, analyze, and use HMIS and program data. The survey is conducted at 2 levels: health facility and district.
  32. Accuracy: Data Verification. Quantitative: compares recounted to reported data, to assess on a limited scale whether sites are collecting and reporting data accurately and on time. Implement in 2 stages: in-depth verifications at the service delivery sites, then follow-up verifications at the intermediate and central levels.
  33. Data Verification Following the Data Flow
  34. Accuracy: Verification Factor. Verification factor = (recounted data) / (reported data). A value <100% indicates over-reporting; >100% indicates under-reporting. Suggested range of acceptability: 100% ± 10% (90%-110%).
  35. Verification factor. A weighted mean of verification ratios that summarizes the reliability of the data reporting system and indicates the degree of over-/under-reporting in the system. E.g., VF = 0.80 indicates that of the total reported number of events, approximately 80% could be verified in source documents, i.e. over-reporting.
  36. Verification Factor Example: data accuracy by district; verification factors flagged in red deviate from 1 by ≥10%. District A: Indicator 1 recounted 1212 vs. reported 1065 (VF 1.14, flagged); Indicator 2 recounted 4009 vs. reported 4157 (VF 0.96). District B: Indicator 1 recounted 1486 vs. reported 1276 (VF 1.16, flagged); Indicator 2 recounted 3518 vs. reported 3686 (VF 0.95). District C: Indicator 1 recounted 357 vs. reported 387 (VF 0.92); Indicator 2 recounted 672 vs. reported 779 (VF 0.86, flagged). District D: Indicator 1 recounted 2987 vs. reported 3849 (VF 0.78, flagged); Indicator 2 recounted 1361 vs. reported 1088 (VF 1.25, flagged). District E: Indicator 1 recounted 4356 vs. reported 4509 (VF 0.97); Indicator 2 recounted 4254 vs. reported 3970 (VF 1.07).
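The verification factor calculations above can be sketched as follows; the function names are my own, and the 90%-110% acceptability band comes from the earlier slide:

```python
def verification_factor(recounted, reported):
    """VF = recounted / reported. VF < 1 suggests over-reporting;
    VF > 1 suggests under-reporting."""
    return recounted / reported

def weighted_verification_factor(site_counts):
    """Weighted mean of verification ratios across sites:
    total recounted events / total reported events.

    site_counts: iterable of (recounted, reported) pairs.
    """
    total_recounted = sum(rc for rc, _ in site_counts)
    total_reported = sum(rp for _, rp in site_counts)
    return total_recounted / total_reported

def outside_acceptable_range(vf, tolerance=0.10):
    """True when VF falls outside the suggested 90%-110% band."""
    return abs(vf - 1.0) > tolerance

# District A, Indicator 1 from the example slide: VF 1.14, flagged
vf = verification_factor(1212, 1065)
print(round(vf, 2), outside_acceptable_range(vf))  # 1.14 True
```

Weighting by total counts (rather than averaging per-site VFs) means large sites dominate the system-wide factor, which is what makes it a summary of the reporting system as a whole.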
  37. Verification Factors Plotted Graphically (chart showing districts falling in the over-reported and under-reported ranges)
  38. Data verification. A recommended maximum of 5 indicators for review: ANC1, DTP3/Penta3, ART coverage, TB cases, and confirmed malaria cases. Select a time period for the verification (3 months), e.g., July-September 2016. For each indicator: review the source documents and reports, recount the number of events, compare the recount to the reported events, and determine the reasons for any discrepancies.
  39. System Assessment Indicators (assessed at both facility and district level): presence of trained staff; presence of guidelines; no recent stock-out of data collection tools; recently received supervision and written feedback; evidence of analysis and use of data.
  40. If the sampling permits it, system assessment findings can be disaggregated by strata.
  41. Are there tools to support data verification and system assessment? A paper-based questionnaire is available that can be adapted to the country situation. A data collection program has been developed in CSPro for tablets. An analysis tool has been developed in Excel to support the analysis of the data collected during this exercise.
  42. Data Quality Review (DQR): Desk review Excel tool
  43. Opening Screen
  44. Input Basic Information tab: parameters of the analysis
  45. Input Administrative Units for the Analysis
  46. Input Program Areas and Indicators
  47. Input Quality Thresholds. Quality thresholds are the values that set the limits of acceptable error in data reporting; the DQR analyses compare results to these thresholds to judge the quality of the data. Recommended values are included for each metric in column 1; user-defined thresholds can be entered in column 2 and take precedence over column 1. For Domain 1 (completeness and timeliness of reporting from health facilities and aggregation levels: district, region, province), the recommended threshold is 75% at each level. For completeness of indicator reporting (% of data elements that are non-zero and non-missing), recommended thresholds include 90% for ANC 1st visit (Maternal Health), 67% for the 3rd dose of DTP-containing vaccine (Immunization), 90% for the number of HIV+ persons currently on ART, 90% for the number of confirmed malaria cases reported, and 75% for notified TB cases (all forms).
  48. Input Information on Completeness and Timeliness
  49. Input Population Data
  50. Input Data on Indicator Trends
  51. Input Indicator Data
  52. Summary Dashboard (example: Burundi Annual Data Quality Review results, 2016). Domain 1 (completeness of reporting): 1a completeness of district reporting 99.1% (1 district, 2.1%, below threshold); 1b timeliness of district reporting 90.3% (3 districts, 6.4%); 1c completeness of facility reporting 96.1% (8 districts, 17.0%); 1d timeliness of facility reporting 89.6% (11 districts, 23.4%); 1e completeness of indicator data, scored per indicator for missing values (1e.1) and zero values (1e.2), e.g., ANC 1st visit 98.9% with 1 district (2.1%) flagged; 1f consistency of reporting completeness over time, 105.8% for district reporting and 103.5% for facility reporting. Domain 2 (internal consistency of reported data) follows on the dashboard.
  53. Domain 1: Completeness of Reporting
  54. Domain 1: Completeness of Indicator Data. Screen showing Indicator 1e (completeness of indicator reporting: presence of missing and zero values, 2016) with, for each indicator, the national score and the number, %, and names of districts exceeding the user-defined % of zero or missing values (e.g., ANC 1st visit: 98.9% for missing values, District 44 flagged, against a 90% threshold), plus space for interpretation of the results.
  55. Domain 2: Internal Consistency - Outliers. Screen showing Indicator 2a.1 (extreme outliers, >3 SD from the mean, 2016) with, for each indicator, the national score and the number, %, and names of districts with extreme outliers (e.g., District 39 for ANC 1st visit; Districts 9, 38, and 41 for OPV3), plus space for interpretation of the results.
  56. Domain 2: Consistency over Time. Screen comparing each district's current-year value for an indicator (here, OPD total visits, 2014) to the value forecast from up to 3 preceding years, with a 20% quality threshold and a trend graph; 5 districts (38.5%) diverged (Districts 6, 7, 8, 9, and 11). The interpretation notes that outpatient visits were increasing over time, as expected given social mobilization for public health services; that comparison of forecast to actual 2014 values yields 5 districts exceeding the 20% threshold (3 below, 2 above); that the errors are not systematic (i.e., not all in one direction); and that district outpatient registers in the affected districts should be reviewed to confirm reported values.
  57. Domain 2: Consistency Between Related Indicators. Screen showing Indicator 2c (ratio of two related indicators and districts with ratios significantly different from the national ratio): ANC 1st visit vs. IPT 1st dose, 2014, expected relationship equal, quality threshold 10%, national score 114%, with 2 districts (15.4%) divergent (Districts 5 and 6), plus a scatter plot. The interpretation notes that the data look fairly good, with only District 5 largely discrepant; that IPT appears consistently lower than ANC1 (more pregnant women should be receiving IPT); that a stock-out of Fansidar in Region 2 could explain the low IPT numbers in District 5, so the DHIO should be called to investigate; and that District 6 performs well relative to the other districts but appears discrepant relative to the national rate, so no follow-up is needed.
  58. Domain 3: External Consistency - Consistency with survey values
  59. Domain 4: Consistency of Population Data - Comparison of denominators in use in-country. Screen showing Indicator 4b1 (comparing the official live-births denominator to a program denominator, if applicable, 2014): quality threshold 10%, national score 106%, with 4 districts (30.8%) divergent (Districts 1, 5, 7, and 12). The interpretation notes that the program denominators in Districts 1, 7, and 12 seem too large, and too small in District 5; the growth rates used by the program to estimate intercensal yearly live-birth values should be reviewed.
  60. Data Quality Review (DQR): Data verification and system assessment Excel chartbook
  61. Opening Screen
  62. CSPro Data Entry Application
  63. Create DQR Indicators in CSPro
  64. Export Data from CSPro and Paste into the Chartbook
  65. Input Analysis Disaggregations
  66. General Facility Information
  67. Availability of Source Documents and Reports
  68. Timeliness and Completeness
  69. Verification Factors
  70. Program-Specific Verification Factors
  71. Program-Specific Reasons for Discrepancies
  72. System Assessment Results
  73. System Assessment: Subnational Results
