Data quality and information use: A systematic review to improve evidence, Ethiopia
Health management information systems (HMIS) provide the data health systems need to monitor the utilization and quality of health services and to make evidence-based decisions. Ethiopia has undertaken an extensive reform and redesign of its HMIS, introduced in six of Ethiopia’s nine regions in 2008. An assessment was undertaken to evaluate the data management and reporting systems, verify data quality, gauge the level of information use for decision-making, and identify areas for improvement. Data were collected via questionnaires to evaluate the functioning of data collection and reporting. Six functional areas of the data management and reporting systems were assessed on a scale of 0 to 2.0. Data accuracy was assessed by comparing data at the three reporting levels for consistency. These levels were: service delivery site (SDS); intermediate aggregation level (IAL), where reports from SDSs are aggregated; and programme monitoring and evaluation units at national level (M&E). Data accuracy for the nine selected key national indicators was compared at each level. Data were collected from 17 health districts and 32 service delivery facilities (26 health centres and 6 hospitals). Integrated data collection and reporting tools, standard operating guidelines and procedures were in place at all levels. Documentation and sources were available at all SDSs. However, resources were weakest at the SDS and IAL (scores of 0.5). Data management processing had an average score of 1.2. Content completeness and reporting timeliness remained below the 85% national target at all levels (SDSs: 76.7% and 67.7%; IALs: 62% and 39%; national level: 29% and 53%). Data accuracy was 76% at the SDS level and 71% at the IALs. At the facility and district levels, 37% had used data from the HMIS in discussions and decision-making activities. Sustained monitoring and action to maintain a good HMIS and data accuracy are essential for evaluating progress on health outcomes.
In 2008, to strengthen the HMIS in Ethiopia, the Federal Ministry of Health (FMOH) introduced a new system. The newly designed HMIS was implemented in six of nine regions, namely Benishangul-Gumuz, Dire Dawa, Gambella, Harari, Amhara and Southern Nations, Nationalities and People’s Region.(1),(2) The objective of the new system is improved measurement and standardization that ensure good quality data – enabling better decisions and thus better health outcomes.
The quality of reported data and use of information is dependent on the underlying data management and reporting systems.(3),(4) Stronger systems ought to produce better quality data. In other words, for good quality data to be produced by and flow through a data management system, key HMIS functional components need to be in place at all levels of the system(4) (see Figure 1).
An assessment was performed to inform users and stakeholders of the current status of the functioning of the HMIS and its ability to provide quality monitoring and data to decision-makers. Three areas of HMIS were assessed:
Six functional components of the data management and reporting systems:
- M&E capabilities, roles and responsibilities;
- data management processes;
- links with the national system;
- data collection and reporting forms;
- indicator definitions and reporting guidelines;
- resources.
Data quality, in terms of completeness, timeliness and accuracy of reporting.
Information use.
Methods and Materials
A cross-sectional design was used, and data were collected through observation, interviews and data review at the critical levels of the information flow.
Selection of Study Sites
Study sites were selected where activities supporting the indicators were implemented. Selection involved identifying regions, districts and individual health facilities using a multistage cluster sampling technique. As the study units have different volumes of service, sites were stratified before random sampling.
Data Collection Procedures and Tools
The assessment included three protocols,(4),(5) with data collection for all protocols occurring at all sites.
Firstly, the functional components protocol assessed six areas of the data management and reporting systems. Trained data collectors with experience in data management visited each site and collected observational data using a standardized checklist and a questionnaire. Scores were generated for each functional area at the three levels – SDS, IAL and M&E. For each of the six functional components, questions were asked, with responses coded as follows:
0 – no, not at all;
1 – partly;
2 – yes, completely.
The scores were intended to be compared across functional areas as a means of prioritizing system strengthening activities.
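The scoring described above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual analysis code: it assumes each functional area's score is simply the mean of its coded question responses, which is consistent with the 0 to 2.0 scale used here.

```python
# Hypothetical sketch of the functional-area scoring described above.
# Each question is coded 0 (no, not at all), 1 (partly) or 2 (yes, completely);
# an area's score is assumed to be the mean of its question codes, which keeps
# scores on the 0 to 2.0 scale reported in this assessment.

def area_score(responses):
    """Mean of coded responses (each 0, 1 or 2) for one functional area."""
    if not responses:
        raise ValueError("no responses for this functional area")
    return round(sum(responses) / len(responses), 2)

# Example: a site's coded answers for the 'resources' area.
print(area_score([0, 1, 0, 1]))  # 0.5 – the weakest score reported at SDS/IAL
```

Computing one score per functional area and per level (SDS, IAL, M&E) then allows the cross-area comparison used to prioritize system strengthening activities.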
The second protocol, data verification, comprised two stages: an in-depth verification at the SDS and a follow-up verification at the IAL and programme M&E level (region or FMOH). The relationship between the SDS and IAL data was measured to establish whether the selected indicators were reported accurately and on time. Nine indicators were selected among the key national indicators in the HMIS.(3),(4) Selection was based upon their relative application for decision-making. Indicators were evaluated over two months (May and June 2010). The data quality assessment determined whether a sample of SDSs had accurately recorded the activity related to the selected indicators via a documentation review, and trace and verification. Data accuracy at the SDS was calculated by comparing the verified numbers to the site-reported numbers during the specified period. Data accuracy at the IALs was calculated by dividing the sum of reported counts from all SDSs by the total count contained in the summary report prepared by the relevant IAL. Likewise, data accuracy at the M&E unit level was calculated by dividing the sum of reported counts from all IALs by the total count contained in the summary report prepared by the relevant M&E unit. Values under 85% represent over reporting and values over 130% under reporting. Availability, completeness and timeliness of reports from all SDSs and IALs were determined based on the MOH guidelines.
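The verification arithmetic above can be illustrated with a short sketch. The function names and the example counts are hypothetical; the ratio and the 85%/130% thresholds come from the protocol described in the text.

```python
# Hypothetical sketch of the accuracy-ratio calculation described above.
# The ratio compares verified (recounted) events with the counts reported
# upward; per the protocol, values under 85% represent over reporting and
# values over 130% represent under reporting.

def accuracy_ratio(verified_count, reported_count):
    """Verified events as a percentage of the count in the submitted report."""
    return 100.0 * verified_count / reported_count

def classify(ratio):
    """Interpret an accuracy ratio against the 85%/130% thresholds."""
    if ratio < 85:
        return "over reporting"
    if ratio > 130:
        return "under reporting"
    return "within accepted range"

# Example: a facility's report claims 120 deliveries, but tracing the
# register during verification finds only 96.
r = accuracy_ratio(96, 120)  # 80.0
print(classify(r))           # over reporting
```

The same calculation applies one level up: the sum of all SDS counts becomes the "verified" numerator and the IAL's summary report the "reported" denominator, and likewise from IALs to the national M&E unit.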
The third protocol concerned information use.
Data were entered and analysed using SPSS version 19 software.
A total of 17 health districts (woredas) were randomly selected from the six regional states implementing HMIS in Ethiopia. These districts contributed a total of 32 SDSs, of which 26 health centres and 6 district hospitals were reviewed. The HMIS had been fully implemented over the previous two years in 19 of the 32 (59.4%) SDSs, in 7 of 17 (41.2%) IALs and in 1 of 6 (16.7%) regions.
Observations on the basic infrastructure required for HMIS showed that a card room was a standard specification in 7 of 32 (21.9%) sites, and a standard Master Patient Index (MPI) box was available in 13 of 32 (40.6%) sites. Moreover, standard shelves were available in 15 of 32 (46.9%) of SDS (see Table 1). HMIS reporting formats were observed at 32 of 32 (100%) of the SDSs.
An assigned focal person for HMIS was observed at 25 of 32 (78%) facilities with 7 of 25 (28%) focal persons having information technology training. Regular budget allocation for HMIS running costs were found at 7 of the 32 (22%) facilities, 5 of 17 (29.4%) districts and 2 of 6 (33.3%) region level offices (see Table 1).
Data Management and Reporting Systems Performance
SDS: The performance of the six functional areas (see Figure 2) of the data management and reporting systems gave a mean score of 0.5 for resources. Data management processes and data quality controls scored 1.2 and links with the national reporting system scored 1.3. For SDSs the system shows relative strengths in the data collection and reporting tools (score 1.5).
IAL and M&E: The IALs had data collection and reporting tools scores of 0.5 at district, 1.25 at regional and 1.5 at national levels (see Figure 3). For resources the mean score ranged from 1.5 at district level to 2.0 at regional and national levels. The M&E structure and capabilities scored 1.0 at district level, 1.5 at regional level and 1.6 at national level. Data management processes and data quality controls scored 1.0 at district, 1.25 at regional and 1.1 at national levels. Furthermore, links with the national reporting system scored 1.0, 1.6 and 1.4 at district, regional and national levels respectively, indicating the presence of parallel reporting.
Documentation and reporting performance
SDS: Indicator source documents were available in 31 of the 32 (95%) sites. Completion of reporting forms was seen in 24 out of the 31 (77%) sites. However, regarding the dates for the indicator source documents only 21 of 31 (68%) fell within the agreed national reporting period (see Figure 4).
IAL & M&E: reports were available in 12 of 17 districts (71%). Completion of the fields for the key indicators was seen in 7 of 12 (62%) available district reports, and in 6 of the 17 districts (39%) reports were received on time at the district level (see Figure 5). Furthermore, reporting performance at regional and national levels showed that although representative completeness reached 87% or above for both administrative levels, content completeness and timeliness of reporting were as low as 39% and 73% at regional level. At national level, content completeness and timeliness were 29% and 53% respectively.
The accuracy ratio indicated over reporting at the SDS and IAL levels and under reporting at the M&E level. In 24 of 32 (76%) SDSs the observed accuracy ratio fell within the accepted range; 11% had an accuracy ratio of less than 70% and 7% were above 130%, indicating under reporting. Comparing services, under reporting was more common for voluntary counselling and testing (VCT), the proportion of deliveries attended by skilled persons (SBA) and the tuberculosis case detection rate (TBCDR). There was over reporting for measles, pentavalent vaccine, antiretroviral therapy (ART) and the contraceptive acceptance rate (CAR) in the majority of SDSs (see Figure 6).
Data were compiled on a quarterly basis at 30 of 32 (94%) SDSs and 12 of 17 (71%) IALs. Feedback reports based on HMIS data were observed in 35.3% of IALs and 50% of M&Es. Discussion and decisions based on HMIS data occurred in 37% of the facilities; these covered patient utilization of services, service coverage and medicine stock-outs. Routine meetings to review managerial or administrative matters were conducted in 23.5% and 50% of cases at the studied IAL and M&E levels respectively.
Results from the data management and reporting systems assessment enabled the team to understand, qualitatively and quantitatively, the operationality as well as the relative strengths and weaknesses of the functional areas that affect the overall quality of data in the health system. Findings showed that standard data collection and reporting tools were in place; however, implementation was low. Major factors were inadequate provision of the required resources or inputs, including a lack of trained focal persons and inadequate start-up funding for basic infrastructure such as card rooms, standard MPI boxes and shelves. The majority of administrative levels tended to allocate inadequate funding to operationalize the new HMIS on a regular basis. A previous study on the implementation progress of the country’s HMIS observed similar findings.(6)
Furthermore, though source documents for the selected indicators were available for the reporting period being verified, content completeness and reporting timeliness remained far below the national 85% target. Accuracy of reported data was also generally inadequate: data were over reported in nearly all of the health facilities, districts and regions, and under reported at the national M&E level.
The study further acknowledged the inadequacy of regular supervision and feedback from senior levels to address the problems of inadequate documentation, late and incomplete reporting and inaccurate reporting.
These findings indicate the extent to which data quality can be adversely affected by limited investment in infrastructure and human resource capacity as well as by the performance of the data aggregation and reporting units of the system.
Furthermore, the study observed a limited culture of using information for decision-making in the planning and management of implementing programmes: just 37% of the facilities had held discussions and made decisions using findings from routine health information.
The present study documents the challenges and limitations of the information systems to serve as the foundation of decision-making and for monitoring the quality of service delivery. While achieving and maintaining data quality requires ongoing attention and a comprehensive approach in addressing the issues of data management and reporting systems and data accuracy, strengthening health information systems is one of the most powerful ways of improving health outcomes. To this effect, the assessment recommends instigating:
- A favourable administrative and legal environment that ensures or reinforces mandatory routine reporting;
- Sound data archiving;
- The designation of institutional responsibilities for the approval of national data collection instruments and methods;
- Infrastructure support to enhance the efficiency and quality of reporting as well as building capacities of health information experts. The latter will enhance the use of evidence based practices during supervision, planning and budgeting;
- Adopting procedures to address late, incomplete or inaccurate reports received from sub-reporting levels and corrections to earlier discrepancies in reports through regular integrated supportive supervision.
This study recommends that follow-up assessments of data management and reporting systems should be integrated into the routine supervision systems as a means of identifying and monitoring necessary improvements.
The authors wish to express their appreciation to the WHO country office for the technical as well as financial support to undertake this relevant and timely work. Thanks also go to all experts at the M&E unit of the FMOH. The team offers its appreciation to the service delivery facilities and the various tiers of the health care system for their willingness and active participation in this important endeavour.
1. Federal Democratic Republic of Ethiopia Ministry of Health, Health Management Information Systems, Monitoring and Evaluation. Information Use Guideline and Display Tools, January 2008.
2. Federal Democratic Republic of Ethiopia Ministry of Health, Health Management Information Systems, Monitoring and Evaluation (M&E). HMIS Procedure Manual: Data recording and reporting procedures, January 2008.
3. WHO. Monitoring the building blocks of health systems: a handbook of indicators and their measurement strategies. Geneva: WHO, 2010.
4. WHO. World Health Organization guideline on DQS and LQS, 2008, 2009.
5. USAID and MEASURE Evaluation. PRISM: Performance of Routine Information System Management. PRISM Tools for Assessing, Monitoring, and Evaluating RHIS Performance, Version 3.1, March 2010.
6. Federal Democratic Republic of Ethiopia Ministry of Health, Health Management Information Systems, Monitoring and Evaluation. Implementation status of HMIS and M&E in Ethiopia, September 2009.