
HMIS

STANDARD
OPERATING
PROCEDURES

Ministry of Health

February 2022
TABLE OF CONTENTS

LIST OF ABBREVIATIONS 4

DEFINITION OF TERMS 4

1.0 INTRODUCTION 5
1.1 BACKGROUND & OVERVIEW 5
1.2 ABOUT THE DOCUMENT 5
1.3 DISSEMINATION 6

2.0 DATA MANAGEMENT SYSTEM 6


2.1 PREAMBLE 6
2.2 DATA PROTECTION 6
2.3 DATA SECURITY 7
Security of Non-Electronic Data 7
Security of Electronic Data 7

3.0 DATA COLLECTION AND REPORTING 8


3.1 DATA COLLECTION TOOLS 8
3.2 DATA COLLECTION PROCEDURE 10
3.3 DATA REPORTING PROCEDURE 10
At Health Facility Level 10
At District Level 11
At Regional Level 11
At State Level 12
3.4 LATE REPORTS 12
3.5 DATA SUMMARY REPORTS 12

4.0 DATA QUALITY AUDITS 13


4.1 OVERVIEW 13
4.2 DQA OBJECTIVES 13
4.3 DQA DATA MANAGEMENT 14
4.4 DQA STANDARD PROCEDURES 15
Preparing for DQA 15
Conducting DQA 15
DQA Timelines 16
4.5 DQA ASSESSMENT REPORT 16
4.6 DQA SCORECARD 17
4.7 STAFF ROLES AND RESPONSIBILITIES IN DQA 17
Health Facility Staff and MOH Managers 18
Health Facility Records Officers 18
Health Management and Information Officers 18
Implementing Partners – Data & Program Officers 18

5.0 DATA DEMAND & USE 19

5.1 DATA UTILIZATION AT FACILITY LEVEL 20
5.2 DATA ANALYSIS AT DISTRICT / REGIONAL / NATIONAL LEVEL 21
5.2.1. Overview 21
5.2.2 Basic Epidemiological Concepts 22
5.3 DATA ANALYSIS AND PRESENTATION 25
Conduct a Data Quality Check 25
Data Analysis & Presentation 25
Date of Data Analysis 25

6.0 RESPONSIBILITIES FOR INDIVIDUAL STAFF CATEGORIES 29

7.0 FUTURE CONSIDERATIONS 29


7.1 HMIS UPDATE 29
7.2 ELECTRONIC DATA CAPTURE 29

8.0 ANNEXES 31
ANNEX 8.1: DISTRICT DATA LOG 32
ANNEX 8.1 b: DISTRICT PERFORMANCE 33
ANNEX 8.2: REGIONAL DATA LOG 34
ANNEX 8.2 b: REGIONAL PERFORMANCE 35
ANNEX 8.3: NATIONAL DATA LOG 36
ANNEX 8.3 b: NATIONAL PERFORMANCE 37
ANNEX 8.4: DQA TEMPLATE 38
ANNEX 8.5: DQA LOG 39
ANNEX 8.6: SCORECARD 40
ANNEX 8.7: SUPPORTIVE SUPERVISION TOOL 41
ANNEX 8.8: DATA VISUALISATION 42
ANNEX 8.9: MENTORSHIP GUIDELINES 43
BACKGROUND 43
GOALS AND OBJECTIVES OF MENTORSHIP 43
MENTOR AND MENTEE 44
MENTORSHIP PROCESS 44
HMIS MENTORSHIP 44

LIST OF TABLES AND FIGURES

TABLE 1: DHIS2 REGISTERS (DEC 2021) 8


TABLE 2: DHIS2 SUMMARY SHEETS (DEC 2021) 9
TABLE 3: FLOW OF PATIENT DATA 10
TABLE 4: DATA QUALITY DIMENSIONS 13
TABLE 5: INDICATORS & CORRESPONDING DATA SOURCES PER PROGRAM AREA 14
TABLE 6: DATA VALIDATION RULES FOR DHIS DATA ENTRY 14
TABLE 7: DQA/SCORE CARD COLOR-CODES 17
TABLE 8: SOMALIA NATIONAL INDICATORS (EXCERPT) – AVAILABLE IN DHIS 20
TABLE 9: RELEVANT EPIDEMIOLOGY TERMS 22
TABLE 10: EXAMPLE FOR HMIS DATA ANALYSIS PLAN 26
TABLE 11: RESPONSIBILITIES PER STAFF CATEGORY 29

FIGURE 1: AN APPROACH TO ENHANCE DDU CAPACITY TO IMPROVE DATA DEMAND AND USE. 19

LIST OF ABBREVIATIONS

CDQ Continuous Quality Assurance


DDU Data Demand & Use
DHIS District Health Information Software
DQA Data Quality Assessment / Audit
HF Health Facility
HIS Health Information System
HMIS Health Management Information System
M&E Monitoring & Evaluation
MFL Master Facility List
MOH Ministry of Health
SOP Standard Operating Procedures

DEFINITION OF TERMS

Data Elements: Data collected at health facility level.

Data Analysis: Process of inspecting, cleaning, and modelling data with the goal of discovering useful information, informing conclusions, and supporting decision making.

Data Demand & Use: Strategy to identify opportunities for, and constraints to, effective & strategic data collection, availability, analysis, and use.

Data Quality: Measure of the condition of data based on accuracy, completeness, consistency, reliability and whether it is up to date.

Data Quality Assessment / Audit: Process of scientifically and statistically evaluating data to determine its validity, identify incorrect data and implement corrective action.

Evaluation: Periodic assessment of the quality, relevance, and impact of a program.

HMIS: System for the collection, collation, analysis, presentation, utilization, and dissemination of health data / information.

Indicators: Calculated formulae based on a combination of data elements; indicators are a core element of data analysis.

Information: Processed, organized & structured data.

Monitoring: Systematic and periodic process of collecting, analyzing, and using information to track a program's progress from planning stage to completion.

1.0 INTRODUCTION

1.1 BACKGROUND & OVERVIEW


Health information is one of the six building blocks of a health system. A well-
functioning health information system supports the delivery of health services by
ensuring the production, analysis, dissemination, and use of reliable and timely
information on health determinants, health system performance and health status.1
In early 2020, the need to revise the national list of health indicators was expressed in
the Health Sector Coordination Meeting. While reviewing the list of indicators between
August and December 2020, it was observed that some of the desired indicators were
not being captured by the existing HMIS tools and those indicators were added
accordingly. This in turn prompted the review and update of the existing data
collection tools / registers and summary sheets. Revision of the HMIS tools took place
between August 2020 and December 2021. Review workshops on the proposed
indicators and data collection tools were held with the MOH and other stakeholders in
Mogadishu, Hargeisa and Garowe to review the draft tools and provide the
Oslo experts with feedback on areas for improvement. The draft tools were then
shared with the stakeholders for final inputs before submitting for HSC endorsement in
December 2021.
All public health facilities are provided with the updated standard registers and
summary forms for HMIS data. The inside covers of the revised HMIS tools contain
standard guidelines on the collection and reporting of HMIS data. The tools are
identical irrespective of the level of service delivery, except for inpatient / outpatient
services (separate for hospitals, HC and PHUs) and LMIS / Supply (hospital & HCs vs.
PHUs).
In addition, the reporting and data collation for nutrition, HIV, TB, IDSR, Supply / LMIS,
and HR has now also been integrated into DHIS.
Going forward, the focus is now on data quality, data analysis, and data presentation
& use for monitoring and planning at all levels.

1.2 ABOUT THE DOCUMENT


This document serves multiple purposes:
• It describes the procedures for data collection and data management; these procedures apply to both public and private facilities.
• It describes the steps required to prepare for and conduct DQAs and presents the templates to be used in the DQA process.
• It gives guidelines for basic data analysis, interpretation and presentation using selected indicators from the national HMIS indicators list.
• It complements the national HMIS tools users’ manual; hence the two should be used together.

Routine health management information system (HMIS) data should be collected, recorded, and managed in accordance with Ministry of Health policies to protect patient confidentiality. This document aims to ensure that quality data is collected more efficiently by describing how to carry out operations correctly and consistently. Following these procedures will help achieve uniformity in carrying out HMIS activities. The document should therefore be available at every unit where HMIS data is generated, aggregated, or analyzed, be it a public or private HF.

1 Everybody’s business: strengthening health systems to improve health outcomes: WHO’s framework for action. World Health Organization 2007; ISBN 978 92 4 159607

1.3 DISSEMINATION
The SOP should be disseminated as follows: the FMOH / SMOH HMIS teams are trained
first and then facilitate the training of the regional teams, who will in turn train the
district teams. The latter should then train and mentor the HF teams of each facility in
their district / area to complete the cascade.

2.0 DATA MANAGEMENT SYSTEM

2.1 PREAMBLE
• There should be a trained HMIS focal person at all levels where HMIS data is
either generated, aggregated, or processed.
• Each facility should have a unique code from the DHIS master facility list (MFL), which is
issued and maintained by the national HMIS office. The code should be indicated
on all HF HMIS tools. Having an MFL is important for establishing the exact
number of HFs, and for resource allocations including supportive supervision.
• All HMIS data should be collected using the updated standard health facility
registers. The inside cover of all registers clearly explains how data should be
collated, and the instructions have also been translated into Somali. This should
help minimize data collection and reporting errors.
• Data should be summarized using standard monthly summary forms. The forms
contain guidelines, both in English and Somali, on how to summarize data from
the various registers.

2.2 DATA PROTECTION


All person-identifiable data must be treated with utmost confidentiality.
1. Patient records should be accessible only to the minimum number of authorized
people who need access to ensure delivery of medical services.
2. All staff accessing medical data shall be made aware of their responsibility to
maintain patient confidentiality. Staff should undertake an initial training prior
to assignment and regular refresher trainings. The latter should include data
ethics especially with regards to the handling of patient data.
3. Data containing patient names (Registers / Tracker App) should not be
transmitted in a way that could allow unauthorized interception. This includes
sending patient data as email attachments, or sending flash disks and CD-ROMs
containing patient data by postal or courier services.

4. Patient data should be analyzed at health facility level, where it is generated,
and only the summarized data (summary sheets) should then be shared and
distributed.

ALL MEDICAL DATA SHOULD BE SECURELY STORED TO SAFEGUARD AGAINST UNAUTHORISED ACCESS

2.3 DATA SECURITY


Security of Non-Electronic Data
HMIS data should be securely stored at all levels, including health facility level.
1. Data should be securely stored at health facility level, preferably in lockable
metallic or wooden cabinets.
2. The most current health files (last 3 years) should be kept within the records
office while files older than 3 years should be safely archived.
3. Data should be stored chronologically and alphabetically using clear labels to
ease retrieval. For example, January 2022 reports should be adjacent to
December 2021 reports, while Bay region should appear before Sool in the filing
cabinet.
4. Anyone seeking access to the aggregated HF data should obtain permission
from the national, regional or district health offices or the HF manager in-
charge. Such request should include the reason why data is being requested,
the variables and period of interest.
5. Anyone seeking access to patient-level records from HF registers should obtain
written permission from the national HMIS office. Such request should include
the reason why data is being requested, the variables and period of interest.
Exceptions to this requirement include supportive supervision / mentorship visits
by relevant authorities.

Security of Electronic Data


Electronic data should only be stored on devices that are routinely and securely
backed up:
1. A daily backup of data is recommended. This should be done on the
computer’s hard disk AND an external hard disk.
o A weekly backup should be done on a CD-ROM or external hard drive or
Cloud. This should be kept in a separate location.
o To reduce risk of overwriting newer files with old data, names of electronic
files should include the date the file was last saved.
2. Data for analysis should be anonymized and contain only the minimum personal
identifiers necessary for analysis, such as age, sex, and geographic location.
Patient names, DOB, mobile phone numbers, etc. should never appear in the
summary tools or reports of analyzed data (see the sketch after this list).
3. Patient data must not be stored or transmitted on removable media or laptops
without encryption.
4. Computers used to enter or access data should have updated anti-virus
software to safeguard against data corruption and phishing.

5. Access to personalized data should be limited to authorized personnel. Each
user of the system should have a password protected individual account and
passwords must not be shared between users.
6. Assignment of access rights to DHIS should be handled by the national HMIS
coordinator. S/He should keep a record of authorized users and their access
levels.
7. Users should be advised against sharing their database login details with
anyone.
8. The database should have an audit trail for any edits made to the data (this is
inbuilt in DHIS). The trail should include original value, date of the change and
user details, and a brief explanation of why the change was made.
9. New staff should be taken through an induction program to familiarize them
with the HMIS system and data safety features.
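
For exports prepared for analysis (point 2 above), one simple way to enforce the minimum-identifier rule is to keep only an approved list of non-identifying columns and drop everything else before the file is shared. The following is a minimal sketch using Python with pandas; the file name and column names are hypothetical placeholders, not prescribed formats.

    # Illustrative sketch: keep only approved, non-identifying columns before an
    # export is shared for analysis. File name and column names are hypothetical.
    import pandas as pd

    APPROVED_COLUMNS = ["age_years", "sex", "district", "diagnosis", "visit_month"]

    records = pd.read_csv("facility_export.csv")      # hypothetical export file
    dropped = [c for c in records.columns if c not in APPROVED_COLUMNS]
    anonymized = records[APPROVED_COLUMNS]            # drops names, DOB, phone numbers, etc.

    print("Dropped identifier columns:", dropped)
    anonymized.to_csv("facility_export_anonymized.csv", index=False)

The same column-whitelist approach can be applied in any spreadsheet tool; the point is that identifiers are removed before the file leaves the facility.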

ACCESS TO ELECTRONIC MEDICAL DATA SHOULD BE LIMITED THROUGH PASSWORD-PROTECTED USER ACCOUNTS

NO DATA SHOULD BE CHANGED OR DELETED WITHOUT PROPER DOCUMENTATION

3.0 DATA COLLECTION AND REPORTING

3.1 DATA COLLECTION TOOLS


1. HMIS data should be collected using only the standard HMIS tools developed
and produced by the MOH. As a basic requirement, each health facility should
have all the necessary registers and summary sheets (Tables 1 & 2).
2. For ease of reference, and to reduce the overlap of numbering across service
delivery levels, data recording tools have a unique number depending on the
level of service delivery (Tables 1 & 2).
3. All organizations and implementing partners who provide services via the
existing health facilities should use the national HMIS tools for reporting.
4. Existing HMIS tools should not be updated or edited without the approval of the
Ministry of Health.
5. For harmonization and standardization, no additional data collection and
reporting tools should be introduced without the approval of the MOH.
6. Any changes or updates to the system must be documented as part of the
MOH change protocol process.
7. All health workers should be trained on the data collection and summary tools.

Table 1: DHIS2 Registers (Dec 2021)


DHIS2 Registers (Dec 2021) New Label

Hospital Inpatient Register R 01
OPD Over-5 Register Hospital & HCs R 02
OPD Under-5 Register Hospital & HCs R 03
OPD Register PHUs & CH R 04
Pre-ART Register R 05-A
ART Register R 05-B
Immunization (EPI) Register R 06
Nutrition Register R 07
Birth Spacing (BS) Register R 08-A
Antenatal Care (ANC) Register R 08-B
Labor, Delivery & Maternity Register R 09-A
Postnatal Care (PNC) Register R 09-B
Laboratory Services Register R 10-A
Lab Results Register R 10-B
LMIS Essential Medicines Hospitals & HCs R 11-A
LMIS Program Supplies Hospitals & HCs R 11-B
LMIS Essential Supplies PHUs R 12
Theater Register R 13
TB Register R 14
STI Register R 15
VCT Register R 16

Table 2: DHIS2 Summary Sheets (Dec 2021)


DHIS2 Summary Sheets (Dec 2021) New Label
Hospital Inpatient Services MF 01
Hospital Outpatient Services MF 02
Health Center Outpatient Services MF 03
PHU / FHW / CHW Services MF 04
HIV / ART Services MF 05
EPI & Child Health Services MF 06
EPI Tally Sheet MF 07
Maternal & Reproductive Services MF 08
Mortality MF 09
Laboratory Services MF 10
Logistic Data Hospitals & HCs MF 11
Logistic Data PHUs MF 12
IDSR Hospitals & HCs WF 13
DS & TB Services QF 14
HR & Training QF 15

Refer to the DHIS User & HF Manual for details on how to use the above tools.

3.2 DATA COLLECTION PROCEDURE
HMIS data should be collected from each Hospital, District Hospital, Health Center
(HC) and Primary Health Unit (PHU). All staff involved in the collection and
management of patient-related information must ensure that data use does not
compromise patient confidentiality:
1. Client data should initially be captured in the designated patient medical
cards / charts by the health care provider.
2. The provider should then immediately transfer the recorded data from the
medical form into the appropriate register (see Table 1).
3. The registers should be updated with each patient visit.
4. Each month should start on a new page of the register.
5. Staff should compile the page summaries at the bottom of each fully completed
page of the register.
6. At the end of each month, staff should add up all page totals for the relevant
month to get the monthly summary.
7. District / Regional HMIS officers, MOH and IP staff should strengthen this process
through routine supportive supervision and continuous mentorship.

Each register and summary sheet has a version number and version date clearly printed
on the cover page. This reduces chances of using outdated registers.
The instructions / summary instructions / tables inside the front cover of every register
and summary sheet:
1. Describe what data should go into which column or cell
2. State if the tool is to be used together with any other tool, e.g., a tally sheet.

3.3 DATA REPORTING PROCEDURE


Data should be reported using the designated MOH standard summary sheets. This
should be done within the stipulated time to ensure availability of data for timely
decision making. Table 3 shows the general data flow for monthly reports and the
reporting deadlines.

Table 3: Flow of Patient Data


Source Destination Deadline
Individual patients Register Immediately
Register Facility Summary Form End of month
Facility Summary Form District / Regional HMIS officer 5th day of month
District / Regional HMIS officer DHIS Platform 10th day of month

At Health Facility Level


1. Health facilities should compile the monthly report in triplicate using data
from the page summaries (and EPI tally sheet). This should be done using
designated standard monthly summary sheets (Table 2).

2. The staff member completing the monthly summary sheet should
indicate his/her name and designation, and sign in the appropriate
section of the form.
3. The facility in-charge should cross-check all reports for errors before
submitting to the district / regional HMIS officer and use DQA forms and
data validation rules as a guide (Table 6).
4. If the report has no errors, the in-charge should sign the requisite section
of the report. If any errors are identified, the in-charge should request the
staff member to correct the form and then review it again. Ensure all
expected reports are prepared for submission.
5. Two copies of the report should then be submitted to the district /
regional HMIS officer by the 5th day of each month to enable data entry
to DHIS by the 10th of each month.
6. The facility should retain a copy of the report for their records as well as
for data quality audits (DQAs) by district/regional HMIS officers, MOH
and IPs.

At District Level
1. The district HMIS officer should ensure that s/he has received all expected
reports from all functional health facilities in the district and update the district
monthly data log (see Annex 1 for the District Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All fields in the received health facility reports are filled.
o All received reports are stamped and dated.
3. S/he should contact any facilities whose reports are either missing or
incomplete for clarifications before compiling the district report and update
the district reporting performance report (see Annex 1b).
4. All data queries should be resolved before submitting the data to the next
level, or before entering data into DHIS.
5. The district monthly HMIS report should then be compiled using the standard
Excel template. Skip this step if using DHIS.
6. The district monthly Excel file (if used), the district data log and Annex 1b should
be submitted to the regional HMIS officer by 7th day of each month.

At Regional Level
1. The regional HMIS officer should ensure that s/he has received all expected
reports from all districts in the region and update the regional data log (see
Annex 2 for the Regional Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All fields in the received monthly reports have been completed.
o All received reports are stamped and dated.
3. S/he should contact any districts whose reports are either missing or incomplete
for clarifications before compiling the regional report.
4. All data queries should be resolved before submitting the data to the next
level, or before entering data into DHIS.

5. The regional HMIS report should then be compiled using the standard Excel
template. Skip this step if using DHIS.
6. The regional monthly Excel file, copies of district & regional data logs and
Annex 2b should then be submitted to the national HMIS office by the 10th day
of each month.

At State Level
1. The National HMIS coordinator or his/her designee should ensure that s/he has
received all expected reports from all the regions and update the national
monthly data log (see Annex 3 for the National Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All fields in the received monthly reports have been completed.
o All received reports are stamped and dated.
3. S/he should contact any regions whose reports are either missing or incomplete
for clarifications before compiling the national HMIS report.
4. All data queries should be resolved before commencing data analysis.
5. A national HMIS report should then be compiled and shared with various
stakeholders and feedback provided to regions and districts. The analysis is
done using DHIS visualization features, such as pivot tables and maps.

Any queries identified after running DHIS validation rules should be discussed with the
respective facility/ facilities and resolved:
- HFs should recheck registers and – if required - make necessary adjustments on
summary forms.
- Any changes should be documented both on the summary sheet (hard copy) and DHIS
(soft copy) for future reference. In DHIS this is done by double-clicking the value that
is being modified.

3.4 LATE REPORTS


1. In case of delayed facility reports, the HMIS officer supporting the HF should
contact the HF within a day after the deadline and request a status update.
2. If it is not possible to receive the report in good time, the officer should update
the data log with the available reports and submit it.
3. Reports submitted after the deadline should still be entered into DHIS.

3.5 DATA SUMMARY REPORTS


During the first quarter of each year, a summary report should be compiled for the
previous year including:
1. Data summary / Performance data / Findings of interest / DQA highlights.
2. Quality assurance including significant DQA findings, and any quality control
measures taken to ensure good quality data.
3. Data graphics / tables / maps / etc. either as part of the report or in
appendices.
4. Attached activity reports for more details.

4.0 DATA QUALITY AUDITS

4.1 OVERVIEW
Routine monitoring and supportive supervision activities generate important data for
patient care and programme improvement. Data quality assessments help to identify
errors and provide information on the facilities’ mentorship needs for strengthening
good data practices.
This SOP contains detailed instructions for conducting HF data quality audits (DQAs).
It looks at completeness of data records and compares register records with
compiled data on the monthly summary sheets and DHIS reports.
The DQA process verifies key data characteristics and facilitates the process of
evaluating data standards. Table 4 summarizes the various dimensions of data quality.

Table 4: Data Quality Dimensions


Characteristics Definitions
Accuracy One of the components of data quality which refers to whether data
values are correct / accurate.
Completeness Measures whether all data fields within a data collection tool / summary
sheet have been filled.
Consistency Logical coherence among related aspects of data. For example, the
number of pregnant women given Iron/Folate for treatment should not
be greater than the number of anemic pregnant women.

Validity Measures the ability of data to reflect events/outcomes. For example, the
screening of a child’s nutritional status by other means besides MUAC
and Weight/Height is not valid.

Reliability The extent to which we can rely on the source data; it evaluates
whether data show similar results when indicators are measured more
than once, using similar characteristics.

4.2 DQA OBJECTIVES


1. To check data variables for timeliness and completeness.
2. To evaluate data consistency, through comparison of data recorded in the
registers with the monthly summary sheets and DHIS online database.
3. To appraise data reliability by comparing data summaries in registers with
monthly summary forms.
4. To assess validity of reported data by confirming method of measurement from
source documents (and facility staff).
5. To assess the accuracy of data through double checking all data fields against
source documents.

4.3 DQA DATA MANAGEMENT
As DHIS data is aggregated from different registers, the DQA needs to compare reported
data against all sources. Due to the time-consuming nature of DQAs (4 hours on
average), the DQA team ought to focus on selected data elements for each
program area. A sample list of DQA indicators and their source documents is shown
in Table 5.

Table 5: Indicators & Corresponding Data Sources per Program Area


Program Area: OPD
  Indicators: 1. Number of new patients treated at HF this month; 2. Number of watery diarrhea cases among U5s; 3. Malaria cases confirmed by RDT/slide
  Registers: R-01, R-02, R-03, R-04 | Summary Forms: MF-01, MF-02, MF-03, MF-04

Program Area: Nutrition
  Indicators: 5. Number of children screened for malnutrition; 6. Number of U5s moderately malnourished; 7. Number of U5s severely malnourished
  Registers: R-03, R-04, R-06 | Summary Forms: MF-04, MF-06

Program Area: ANC
  Indicators: 9. Number of PW with ANC-1 visits; 10. Number of PW with ANC-4 visits
  Registers: R-04, R-08 B | Summary Forms: MF-04, MF-08

Program Area: Deliveries
  Indicators: 12. Number of deliveries in the health facility; 13. Number of deliveries monitored with partograph; 14. Number of low-birth-weight babies
  Registers: R-04, R-09 A | Summary Forms: MF-04, MF-08

Program Area: Immunization
  Indicators: 15. Number of children <1 immunized with Penta 1; 16. Number of children <1 immunized with Penta 3; 17. Number of children <1 immunized against Measles
  Registers: R-04, R-06 | Summary Forms: MF-04, MF-06

Program Area: PNC
  Indicators: 18. Number of first PNC checks within 48 hours; 19. Number of new mothers counselled on IYCF
  Registers: R-04, R-09 B | Summary Forms: MF-04, MF-08

Program Area: Laboratory
  Indicators: 20. Number of RDTs for Malaria; 21. Number of positive RDTs
  Registers: R-10 A | Summary Forms: MF-10

Program Area: Dispensary
  Indicators: 23. RDT / ORS opening balance; 24. RDT / ORS received; 25. RDT / ORS in store
  Registers: R-11 A, R-12 | Summary Forms: MF-11, MF-12

Program Area: PMTCT (ANC)
  Indicators: 27. Number of PW counselled and tested for HIV; 28. Number of PW with positive HIV test result
  Registers: R-05 A, R-08 A | Summary Forms: MF-05, MF-08

Program Area: HR
  Indicator: Staff who attended RMNCH training
  Source: HF Records | Summary Form: QF-15

Note: Registers R-04 & R-12 and Summary Sheets MF-04 & MF-12 are only applicable for PHUs.

Table 6: Data Validation Rules for DHIS Data Entry


MF-01
- Medical/General Admissions + Surgical Admissions + Maternity/Gynecological Admissions + Pediatric/NICU Admissions = Total Inpatient Admissions
- Medical/General Deaths + Surgical Deaths + Maternity/Gynecological Deaths + Pediatric/NICU Deaths = Total Inpatient Deaths

MF-02 / MF-03
- OPD curative child new (5-9yrs) + OPD curative child follow-up (5-9yrs) = Total OPD Visits Child (5-9yrs)

MF-02 / MF-03 / MF-04
- OPD curative child new (0-59m) + OPD curative child follow-up (0-59m) = Total OPD Visits Child (0-59m)

MF-03
- Fever case tested for Malaria (RDT / Microscopy) < Malaria confirmed & treated (ACT / Primaquine)

MF-04
- Fever case tested for Malaria RDT < Malaria confirmed & treated (ACT / Primaquine)
- Live birth in community <= Delivery in community

MF-05
- Antenatal client HIV tested <= Antenatal client HIV positive
- Postnatal care HIV tested <= Postnatal care HIV positive

MF-06
- FACILITY BCG Immunizations (0-11m) < FACILITY Measles (0-11m)
- FACILITY Penta-1 Immunizations (0-11m) < FACILITY Penta-3 (0-11m)
- MUAC Red + Yellow + Green = Child screened MUAC

MF-06 / MF-07
- Total BCG Immunizations during the month <= BCG used this month

MF-08
- ANC-1 Visits <= ANC-4 Visits
- Delivery assisted vaginal / Delivery Caesarean < Delivery normal
- Live birth weight < 2.5 kg < Live birth in facility

MF-09
- Maternal Death during Pregnancy <= Maternal Death Total

MF-10
- Total RDT Tests done >= Positive RDT Tests
- Total Slide Microscopy done >= Positive Microscopy Slides

MF-10 / MF-11
- Total RDT done = RDT used this month
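
These validation rules are run in DHIS, but they can also be checked before data entry. The following is a minimal illustrative sketch in Python showing how two of the rules above (the MF-01 admissions rule and the MF-10 RDT rule) could be checked against a monthly summary; the dictionary keys are hypothetical placeholders, not DHIS data element names.

    # Illustrative only: checks two Table 6 validation rules against a monthly
    # summary held as a simple dictionary. Keys are hypothetical placeholders.

    def check_mf01_admissions(report):
        expected = (report["medical_admissions"] + report["surgical_admissions"]
                    + report["maternity_admissions"] + report["pediatric_admissions"])
        return expected == report["total_inpatient_admissions"]

    def check_mf10_rdt(report):
        # Total RDT tests done should be >= positive RDT tests (Table 6, MF-10).
        return report["total_rdt_done"] >= report["positive_rdt"]

    report = {
        "medical_admissions": 40, "surgical_admissions": 25,
        "maternity_admissions": 30, "pediatric_admissions": 20,
        "total_inpatient_admissions": 115,
        "total_rdt_done": 200, "positive_rdt": 35,
    }

    for name, rule in [("MF-01 admissions", check_mf01_admissions),
                       ("MF-10 RDT", check_mf10_rdt)]:
        status = "OK" if rule(report) else "QUERY - refer back to the facility"
        print(f"{name}: {status}")

Any rule that fails should be treated as a data query and resolved with the facility before the data is entered or analyzed, as described in Section 3.3.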

4.4 DQA STANDARD PROCEDURES


Preparing for DQA
The following steps should be completed before conducting the facility DQA and are
crucial for a successful DQA:
1. Contact health facility manager in charge to schedule a DQA visit and agree
on date, time, and itinerary. Inform IPs about the confirmed date and request
their participation.
2. During the conversation with the HF manager, also ensure that all registers / summary
sheets / other source documents will be available.
3. Familiarize the DQA team with results from previous DQA assessments at the
health facility.
4. Refer to the facility’s monthly summary sheets and DHIS data and review the
predetermined DQA indicators.

Identify any missing reports / data prior to the planned HF visit. Copies of any missing
reports should be delivered by the HF during the DQA visit.

Conducting DQA
1. Brief HF & IP staff about purpose and steps of conducting DQA.
2. Open the DQA tool and enter the facility name, DQA date, and names of the
staff conducting the DQA into the appropriate fields.
3. Obtain copies of relevant registers and summary sheets.

4. Start the DQA process addressing all DQA areas documented in Tables 5 & 6.
Allow time for discussions on issues that may arise during the DQA visit.
5. Complete the DQA supportive supervision tool.
6. At the end of the DQA, hold a joint meeting with the facility staff to discuss the
results and agree on how to resolve the identified gaps or inconsistencies.
7. A copy of the DQA report with action plan should remain with the HF and IP for
reference in preparation for tailored mentorship.
8. The HF should register the visit in their monitoring log which should contain the
following information: Date of DQA / DQA score / action points including
timeframe when action items should have been addressed.

Following the DQA visit, the team should also summarize their findings through graphs
and tables which should be shared with the HF and IP.
These illustrations should also be made available to HF managers who should use these
sheets during their presentation of the DQA findings to health facility / clinical staff.
Ideally, these pages should be printed out for the facility staff to review and post on their
walls.

DQA Timelines
1. DQA visits should ideally be conducted at each facility every 3 months, and at
a minimum every 6 months, as this provides an opportunity to continuously
review gaps and weaknesses and their eventual resolution. The following
criteria ought to be applied:
2. Conduct DQA every 3-6 months (or earlier if an immediate follow-up DQA is
required).
3. Data for the current month should not be included in the DQA process since it
is likely to be incomplete and therefore not representative.
4. Do however check register entries of the current month for completeness and
accuracy.
5. DQA follow-up & action points need to be agreed immediately after
completing the DQA exercise.
6. During each DQA exercise, the team should review monthly reports for previous
3 months against source documents, as well as DHIS entries.

District / Regional HMIS officers, program managers, IPs and MOH staff should support
and mentor facilities on proper recording and reporting to ensure that all DHIS reports
are complete, accurate, reliable, and timely.

4.5 DQA ASSESSMENT REPORT


There should be a written report following each DQA which includes specific
guidance for facility staff on how to improve data collection and recording. The
report should also indicate the timelines for both priority activities as well as joint
follow-up reviews and a scorecard highlighting performance levels.

For the DQA to be helpful to the program and facility staff:

• A written final DQA report should be shared with facility staff within 7 days after
the DQA.
• Include DQA reports in quarterly data review meetings and ensure that 2-3
slides in the quarterly data review presentation are dedicated to DQA. The
slides should summarize district/regional DQA results and should always include
action points.
• After the facility DQA report is shared and discussed with facility staff,
mentorship on implementation of corrective activities should be provided by
MOH and IP staff. Follow-up and continued mentorship are critical for
improving data as well as service provision.

4.6 DQA SCORECARD


To assess and improve both service provision and HMIS data quality, a scorecard has been
developed. The scorecard is one of the outcomes of a data quality audit and assists in the
comparative assessment of performance at HF, district, regional, and national levels. The
scorecard uses data quality aspects from DHIS (timeliness, completeness, data values and data
validation rules), HF registers and monthly summary sheets to rate service delivery
using a composite index. The latter combines several indicators in a standardized way
to assess the overall performance of the HF. The objectives of the HMIS scorecard are to:
1. Provide a visual presentation of the status / rating of different indicators.
2. Compare levels of service indicators using a composite index.
3. Highlight differences across HFs / regions etc. for focused mentorship.
4. Encourage competition for service / data quality across all levels, fostering a
data use culture.
The HMIS scorecard is generated using the DQA template. The template generates a
score for tools in each program area, and an aggregate score for each health facility.
The performance results are color-coded into 3 groups as shown in Table 7.

Table 7: DQA/Score Card Color-Codes


Score Color Implication

< 50 % Red Immediate and intensive mentorship required

50 - 74 % Yellow Enhanced mentorship / supportive supervision required

≥ 75 % Green Routine mentorship required
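
To illustrate how the Table 7 bands can be applied, the sketch below computes a composite score as a simple (equal-weight) average of per-program-area scores and assigns the corresponding band. The equal-weight average and the area scores are assumptions made only for illustration; the official DQA template defines the actual scoring method.

    # Illustrative sketch: average per-program-area DQA scores into a composite
    # score and map it to the Table 7 color bands. The equal-weight average and
    # the example scores are assumptions; the DQA template defines real scoring.

    def color_band(score):
        if score >= 75:
            return "Green - routine mentorship required"
        if score >= 50:
            return "Yellow - enhanced mentorship / supportive supervision required"
        return "Red - immediate and intensive mentorship required"

    area_scores = {"OPD": 82, "Nutrition": 64, "ANC": 71, "EPI": 90}   # percent
    composite = sum(area_scores.values()) / len(area_scores)
    print(f"Composite score: {composite:.1f}% -> {color_band(composite)}")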

4.7 STAFF ROLES AND RESPONSIBILITIES IN DQA


Collaboration of IPs, district /regional HMIS officers, MOH and HF staff is essential for
effective implementation of DQAs. This section describes the roles of each staff
category in DQA process.

Health Facility Staff and MOH Managers
• Allocate a date and time for the DQA exercise every quarter and make sure
all required data tools are readily available (all registers in use and the monthly
summary forms from the past 3 months).
• Notify all HF staff about the exercise and ensure clinical services are not
disrupted.
• Ensure as many HF staff members as possible actively participate in
conducting DQA.
• Review results and identify measures for improving services/ data.
• Facilitate feedback sessions to share DQA results with all facility staff.
• Maintain a regularly updated log showing DQA dates as well as the respective
scores, and the dates feedback was provided to HF staff.

Health Facility Records Officers


• Actively participate in the DQA exercise at HF level.
• Distribute HF DQA reports to clinical staff and discuss strategies for
implementing service / data improvement measures.
• Ensure that refresher trainings for health facility staff match the highlighted
mentorship requirements and are conducted at regular intervals.

Health Management and Information Officers


• Support regular HF data quality reviews by checking the data submitted via
the monthly summary sheets with the source data from registers.
• Discuss and explain DQA findings and data improvement measures with HF
staff.
• Compile data DQA reports in collaboration with HF staff and send to program
manager for review.

Implementing Partners – Data & Program Officers


• Contact HF in-charge to agree on date and time for DQA exercise.
• Oversee DQA process to ensure:
o All agreed services / data sources are assessed.
o HF staff understand the purpose and support the process of DQA.
o Staff know how to interpret DQA results.
• Review DQA report to ensure service/ data improvement measures are
clearly articulated.
• Schedule a meeting with HF management – preferably on the same day or
within 3 days of the assessment - to discuss the DQA findings and suggested
action points / improvement measures.
• Schedule refresher trainings at regular intervals, ensuring that all HF staff take
part and that topics match the DQA findings / mentorship recommendations.
• Submit DQA reports and feedback to regional and national HMIS Officers.
• Prepare quarterly DQA reports including corrective actions taken and share
with MOH.

5.0 DATA DEMAND & USE

HMIS data should be presented in a form appropriate to the needs of various
stakeholders. As such, data should be analyzed at all levels to generate outputs that
can be used to assess / improve health service delivery as well as to plan and monitor
programs and services. To achieve this, capacity should be built in data use core
competencies.

Descriptive epidemiology covers Time, Place, and Person by using the 5 Ws:
What = Health issue of concern
Who = Persons affected
Where = Place or geographical location
When = Time
Why / How = Causes, risk factors, modes of transmission
Health data should be presented in a way that enables decision makers to answer these 5
questions. Data analysis and appropriate presentation are key.

Figure 1 is an illustration of data demand and use (DDU) capacity improvement. The
goal of DDU capacity-enhancement efforts is increased data use for decision-
making, planning and advocacy, resulting in improved health outcomes.

Figure 1: An approach to enhance DDU capacity to improve data demand and use (Source: MEASURE Evaluation 2).

The following procedures are recommended for improving DDU capacity at various levels:

2 MEASURE Evaluation Technical Brief (Nov 2015).

5.1 DATA UTILIZATION AT FACILITY LEVEL
1. HF staff involved in data management should be trained on basic data
interpretation/ analysis and presentation skills.
2. Each HF should identify 5 priority data elements and/or indicators (from the
national HMIS indicator list) for tracking and visualization (Tables 5 & 8).
3. Each HF should hold monthly data review meetings to review the 5 selected
data elements / indicators, prior to submission of their monthly summary sheets.
4. HF data analysis should at a minimum include trend analysis which will also assist
HF management in assessing the HF’s progress towards set targets.
5. District / Regional HMIS officers should support each of the HFs in their areas to
generate graphs showing monthly trends, to encourage the use of data for
decision making (see the sketch after this list). If the required hardware / software /
Wi-Fi is available at HF level, this should also include capacity building on how to
access DHIS and generate HF dashboards and graphics.
6. MOH and IP staff should facilitate quarterly forums where HF staff can present
their analysis, exchange lessons learnt and receive feedback.
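
As an illustration of the kind of monthly trend graph a facility or its supporting HMIS officer could produce for the review meeting, the sketch below plots twelve months of values for one tracked data element using Python and matplotlib. The values are made up for demonstration; a spreadsheet or the DHIS dashboards can of course be used instead.

    # Illustrative sketch: plot a 12-month trend for one tracked data element.
    # The monthly values below are made up for demonstration.
    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
              "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
    anc1_visits = [42, 38, 45, 51, 47, 55, 60, 58, 49, 52, 57, 61]

    plt.figure(figsize=(8, 4))
    plt.plot(months, anc1_visits, marker="o")
    plt.title("Facility X: ANC 1st visits per month (example data)")
    plt.ylabel("Number of ANC-1 visits")
    plt.grid(True)
    plt.tight_layout()
    plt.savefig("anc1_trend.png")   # or plt.show() on a workstation with a display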

Table 8: Somalia National Indicators (Excerpt) – AVAILABLE IN DHIS


Indicator
1 Antenatal Client 1st Visit Coverage
Numerator: Antenatal clients (ANC) 1st visit
Denominator: Estimated number of pregnant women

2 Antenatal Client Dropout Rate


Numerator: ANC clients 1st visit MINUS ANC Clients 4th visit
Denominator: ANC clients 1st visit
3 Antenatal Client HIV Testing Rate
Numerator: Antenatal clients HIV test done
Denominator: ANC clients 1st visit

4 Skilled Birth Attendant Delivery Rate


Numerator: Deliveries conducted by a skilled birth at health facility
Denominator: Estimated number of deliveries / PW
5 PNC Rate (0-48 hours)
Numerator: Postnatal 1st visits (0-48 hours)
Denominator: Total deliveries in health facility

6 Penta-3 Immunization Coverage


Numerator: Pentavalent 3rd doses (0-11 months)
Denominator: Estimated catchment population 0-11 months
7 Pentavalent Dropout Rate
Numerator: Children <1 who received Penta 1 MINUS Children <1 who got Penta 3
Denominator: Children <1 who received Penta 1 vaccination during the month
8 Diarrhea Treatment Rate (6-59 months)
Numerator: Children (6-59 months) with diarrhea treated with ORS and/or Zinc
Denominator: Children (6-59 months) diagnosed with diarrhea
9 Malaria ACT Treatment Rate
Numerator: Patients with malaria treated with ACT
Denominator: Malaria RDT positive + Malaria microscopy positive + Fever cases

10 SAM / MAM Rate Assessed with MUAC


Numerator: SAM / MAM children assessed with MUAC
Denominator: Children screened using MUAC
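
All the indicators in Table 8 are simple numerator / denominator fractions expressed as percentages. The short sketch below illustrates the calculation for the first two indicators using made-up monthly values; the figures are purely illustrative, not targets.

    # Illustrative calculation of two Table 8 indicators from made-up values.

    def rate(numerator, denominator):
        """Return numerator/denominator as a percentage (None if denominator is 0)."""
        return None if denominator == 0 else 100.0 * numerator / denominator

    anc1_visits = 180
    estimated_pregnant_women = 240
    anc4_visits = 95

    anc1_coverage = rate(anc1_visits, estimated_pregnant_women)        # Indicator 1
    anc_dropout = rate(anc1_visits - anc4_visits, anc1_visits)         # Indicator 2

    print(f"ANC 1st visit coverage: {anc1_coverage:.1f}%")
    print(f"ANC dropout rate: {anc_dropout:.1f}%")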

5.2 DATA ANALYSIS AT DISTRICT / REGIONAL / NATIONAL LEVEL
5.2.1. Overview
• The goal is that HF data is not just used to gain information but is also
analyzed, with feedback provided to facilities at regular (at least quarterly)
intervals. It is recommended that DHIS information products, such as data
dashboards, are used during feedback sessions to generate talking points and
identify success stories / areas for improvement, as this will improve
understanding and action (see the sketch after this list).
• Prior to data analysis, data should always be checked for quality /
accuracy / inconsistencies. These quality checks should include checking for
missing data, missing reports, numbers outside the expected range (outliers)
and any other inconsistencies. Any errors identified should be corrected – IN
COOPERATION WITH THE RESPECTIVE HEALTH FACILITY - using the source
documents (= registers).
• Data analysis should also always include:
o Trend analysis (as this will most likely highlight seasonal variations).
o Performance comparison across geographical regions.
o Possible causes of the observed trends (simple logistic regressions should
be performed to test the strength of any associations).
• Analyses should be shared with relevant stakeholders through meetings, print
media and appropriate information products.
• An annual HMIS report should be prepared so that it can also be shared with
relevant stakeholders.
• Every 3-5 years, trend reports should be prepared based on comprehensive
statistical analysis of HMIS and other relevant data with the objective to:
o Show patterns / trends / new characteristics.
o Demonstrate correlations between different variables.
o Identify any predictors of disease burdens.
o Provide data analysis / visualizations in a regional, national, and global
context.
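
As referenced above, where the DHIS online database is accessible, its analytics Web API can be used to pull the values needed for such feedback sessions and trend reviews. The following is a minimal sketch using Python's requests library; the server address, credentials and UIDs are placeholders that must be replaced with values from your own DHIS instance.

    # Illustrative sketch: pull a 12-month trend for one indicator from the DHIS2
    # analytics Web API. URL, credentials and UIDs below are placeholders.
    import requests

    BASE_URL = "https://dhis.example.org"          # placeholder DHIS2 instance
    INDICATOR_UID = "IndicatorUID01"               # placeholder indicator UID
    ORGUNIT_UID = "OrgUnitUID01"                   # placeholder region/district UID

    resp = requests.get(
        f"{BASE_URL}/api/analytics.json",
        params={"dimension": [f"dx:{INDICATOR_UID}", "pe:LAST_12_MONTHS"],
                "filter": f"ou:{ORGUNIT_UID}"},
        auth=("username", "password"),             # placeholder credentials
        timeout=30,
    )
    resp.raise_for_status()

    # Each row holds the indicator UID, the period and the value.
    for dx, period, value in resp.json()["rows"]:
        print(period, value)

The same values are available interactively through the DHIS dashboards, pivot tables and maps mentioned in Section 3.3; the API route is simply convenient when the figures feed into a standard feedback slide set or report.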

IF DQA RESULTS CONSISTENTLY SHOW DATA IS OF POOR QUALITY, CAUTION SHOULD BE EXERCISED BEFORE USING SUCH DATA FOR ANY ANALYSIS OR DECISION MAKING

A NOTE ON DATA SECURITY


If HF data is exported for analysis, all patient-identifiers must be removed from the dataset
prior to transmission. In exceptional cases where files containing patient identifiers must
be shared for off-site analysis, such files must be encrypted, and password protected
before transfer. The passwords for these files should be sent to the recipient via a separate
communication.

5.2.2 Basic Epidemiological Concepts
MOH and HMIS officers should have a basic understanding of epidemiological
concepts and skills, as this will help them to analyze and interpret health data. The
table below provides a brief introduction to the basic, relevant terms.
Table 9: Relevant Epidemiology Terms

Average: The number expressing the central value in a data set, calculated by dividing the sum of the values by their number.
Example: The weight of 4 children is 11, 15, 16 and 18 kilograms respectively. Sum of values = 11+15+16+18 = 60; number of children assessed = 4; average weight = 60 / 4 = 15 kilograms.

Completeness & Timeliness: Data completeness refers to the comprehensiveness / wholeness of data; there should be no gaps or missing information for data to be truly complete. Timeliness refers to the availability and accessibility of data in time for monitoring, analysis and decision making.
Example (Completeness): MF-03 has 9 sections that should be populated every month. If some sections are blank, then the Health Center Outpatient Services summary sheet is not complete. Another example: Region A has 21 HCs, so 21 copies of MF-03 are expected every month. If only 18 forms are submitted, the completeness rate is 18/21 = [0.857 * 100] = 85.7%.
Example (Timeliness): If 5 of the 18 reports were submitted after the submission deadline (5th of each month), then the timeliness rate for the month is (18-5)/18 = 13/18 = [0.722 * 100] = 72.2%.

Composite Indicator: Composite indicators are mathematical combinations (or aggregations) of a set of indicators from different data sources.
Example: Body mass index (BMI) is a common composite indicator. It is a measure of body fat based on two variables, height and weight.

Coverage: (Health) coverage is the proportion / extent to which eligible patients have received an intervention.
Example: 12,000 children under 5 years live in Village B. If only 6,000 children in Village B have received deworming medication, then the coverage of the deworming service is 6,000/12,000 = [0.5 * 100] = 50%.

Cumulative: The process of increasing a quantity by successive additions.
Example: HC A immunized 11 children under 1 year in January, 15 U1s in February and 16 U1s in March 2022. The cumulative number of children receiving immunizations during Q1 2022 is 11+15+16 = 42 children.

Dropout Rate: The dropout rate refers to the number of people who fail to complete a service as recommended.
Example: If 10 children U1 receive Penta-1 and only 5 receive Penta-3 vaccinations, then the dropout rate is (10-5)/10 = 5/10 = 50%. Similarly, if 20 PW access ANC-1 services but only 5 attend ANC-4 services, the dropout rate is (20-5)/20 = 15/20 = [0.75 * 100] = 75%. A dropout rate of ≥ 10% suggests a problem with the respective service.

Fraction (= Proportion): A fraction represents a part of a whole and is also known as a proportion. It describes how many parts of a certain size there are, for example one half (1/2), two fifths (2/5) or three quarters (3/4). The top number is called the numerator (= how many parts there are) and the bottom number the denominator (= how many parts the whole consists of).
Example: The fraction 5/10 means that we are looking at 5 out of a total of 10 units. 5 is the numerator whilst 10 is the denominator.

Incidence: Incidence is the occurrence rate or frequency of new cases of a disease during a given period.
Example: Village B has a population of 12,000 children under 5. In February 2022, 120 children U5 were diagnosed with measles. The measles incidence rate in the U5 age group during February is 120 / 12,000 = [0.01 * 100] = 1%.

Indicator (DHIS): In DHIS, the indicator is a core element of data analysis. An indicator is a calculated formula based on a combination of data elements / category options or constants.
Example: The indicator “Outpatient Utilization Rate” is calculated as the fraction of the data element “Total of all OPD visits” (= numerator) over the “Total Population” (= denominator).

Maximum & Minimum: Maximum refers to the highest value in a list of values, whereas minimum is the lowest value.
Example: The weight of 4 children is 11, 15, 16 and 18 kilograms respectively. In this list the maximum / highest value = 18 kilograms and the minimum / lowest value = 11 kilograms.

Median: The median is the middle number in a list of values / a data sample sorted in ascending or descending order; it separates the higher half from the lower half.
Example: The weight of 5 children is 11, 15, 16, 18 and 20 kilograms respectively. In this case the median weight is 16 kilograms.

Mode: The mode is the value that occurs most often in a data set.
Example: The weight of 5 children in OPD is 12, 15, 16, 12 and 20 kilograms respectively. The modal weight is 12 kilograms.

Percentage: A percentage is a number or ratio expressed as a fraction of 100.
Example: The fraction 2/5 equals 2/5 * 100 = 40%, and 1 percent represents a fraction of 1/100.

Prevalence: In epidemiology, prevalence is the proportion of a particular population found to be affected by a medical condition / disease at a specific time. It is derived by comparing the number of people found to have the condition with the total number of people studied, and is expressed as a fraction or percentage.
Example: Village B has a population of 12,000 children under 5. In January 2022, 120 children U5 were diagnosed with measles, and another 210 U5s were diagnosed in February. Therefore, by the end of February, a total of 330 U5s (120 + 210) had been diagnosed with measles. Assuming all the U5s diagnosed with measles were alive at the end of February, the measles prevalence in Village B would be 330 / 12,000 = [0.0275 * 100] = 2.75%.

Proportion (= Fraction): See the description of Fraction.

Range: The range is the area of variation between the upper and lower limits of a data set.
Example: The weight of 4 children is 11, 15, 16 and 18 kilograms respectively. Lowest value / weight = 11 kilograms; highest value / weight = 18 kilograms; range of values / weights = 11 – 18 kilograms.

Ratio: A ratio indicates how many times one number contains another.
Example: If 8 girls and 6 boys have been immunized against measles, then the ratio of girls to boys is 8 to 6 = 8:6.

Performance: Performance evaluates service delivery against set targets or standards.
Example: Village B has a population of 12,000 children U5. If only 2,000 children are immunized, Village B’s performance in terms of uptake of immunization services would be 2,000 / 12,000 = [0.167 * 100] = 16.7%.

Trend: A trend is the general direction in which something is developing or changing over time.
Example: If 50 children U1 received Penta-1 vaccines in January and 80 U1s in February 2022, this is an upward or positive trend.
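
The rate calculations above (completeness, timeliness, dropout) recur throughout this SOP, for example in the data logs and performance annexes. As a small illustration, the Python sketch below reproduces the worked examples from the Completeness & Timeliness and Dropout Rate entries; it is not part of the official tools.

    # Illustrative sketch reproducing the Table 9 worked examples.

    def percentage(part, whole):
        return 100.0 * part / whole

    expected_reports, received_reports, late_reports = 21, 18, 5
    completeness = percentage(received_reports, expected_reports)               # 85.7%
    timeliness = percentage(received_reports - late_reports, received_reports)  # 72.2%

    penta1, penta3 = 10, 5
    dropout = percentage(penta1 - penta3, penta1)                               # 50.0%

    print(f"Completeness:  {completeness:.1f}%")
    print(f"Timeliness:    {timeliness:.1f}%")
    print(f"Penta dropout: {dropout:.1f}%")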

5.3 DATA ANALYSIS AND PRESENTATION
The following steps are recommended when preparing for the quarterly data review
meetings:

Conduct a Data Quality Check


• Check reporting rates to confirm all reports have been received and are
complete.
• Run validation rules to confirm all data fields have been correctly entered.
• If both are acceptable, decide on the analysis and presentation plan (Table 10).
o If not, drill down to flag HFs with missing reports / data and data outliers,
and then contact HF managers to provide / check the correct values from
the registers.
o Proceed with the analysis plan once the DHIS data is cleaned (Table 10).

Which errors should the HMIS officer correct, and which ones must be referred to HFs
for confirmation?
- Questionable results and extreme outliers must not be changed without written
confirmation from the managers.
- Corrections should not only be made in DHIS but also on the respective monthly summary
forms at the HF level.
- If ≥65% of entries from a HF do not pass validation rules, all data from that HF should be
rejected / should not be included in the analysis.

Data Analysis & Presentation

• Tables are the simplest form of presentation; however, graphs provide better
visualization.
• Aim to provide sufficient detail based on the 5 Ws.
• Provide an interpretation of the data based on the context and, if required,
look for additional data to confirm the analysis / hypothesis.

Date of Data Analysis

Since it is likely that more recent DHIS data is still being updated by regional and facility
HMIS officers, it is advised to document the date of analysis.

Table 10: Example for HMIS Data Analysis Plan

Unless noted otherwise, the indicators below are available from DHIS dashboards and dataset reports.

HMIS
- HMIS scorecard (generated by the DQA exercise; provides analysis / evidence on data accuracy)

Service Uptake – OPD
- Average monthly OPD consultations last quarter
- OPD workload trend last 12 months
- Leading causes of OPD visits & contribution to case load

Service Uptake – Inpatient
- Average monthly admissions last quarter
- Admissions trend last 12 months
- Leading causes of admissions
- Leading cause of death in hospitals

Child Health – EPI
- BCG coverage (incl. missed opportunities – Birth / Polio)
- Penta dropout rate
- Measles coverage
- Vaccine wastage rate

Child Health – Diarrhea
- Average monthly diarrhea cases and 6-monthly trend
- Proportion of OPD visits related to diarrhea
- Percentage of U5s with diarrhea treated with Zinc / ORS

Child Health – Pneumonia
- Average monthly pneumonia cases and 6-monthly trend
- Proportion of OPD visits related to pneumonia
- Percentage of U5s with pneumonia treated with antibiotics

Child Health – Nutrition
- Number / Percentage of U5s screened for nutrition status
- Percentage of MUAC Red / Yellow / Green
- Percentage of SAM / MAM referred for management
- Percentage of children U5 provided with Vitamin A
- Percentage of children U5 dewormed

Maternal Health – ANC
- ANC coverage
- ANC 4+ completion rate
- ANC dropout rate

Maternal Health – Delivery
- Delivery coverage in HFs
- Skilled birth attendant delivery rate
- Percentage of births monitored with partograph
- Low birth weight rate
- Post-delivery uterotonic use rate

Maternal Health – PNC
- PNC-1 (0-48 hours) coverage
- Breastfeeding within one hour rate

Maternal Health – FP
- Percentage of WCBAs counselled on modern BS
- Modern BS new / repeat user rate
- Percentage of HFs with stock-out of any FP commodities

Maternal Health – EPI
- TT coverage among pregnant women
- TT coverage among women of childbearing age

Community Health – OPD
- Curative care consultation rate child / adult last quarter
- Household visit rate per FHW
- Malaria / Diarrhea / Pneumonia treatment rate child

Community Health – Maternal Health
- Antenatal client referral rate
- Home deliveries
- Postnatal visit new mother / newborn within 48 hours

Community Health – Nutrition
- MUAC screening rate U5s / PLW
- MUAC Red / SAM rate U5s / PLW
- Vit A supplementation rate
- Deworming rate

Individual Diseases – HIV
- Clients counselled on HIV incl. VCT, OPD, ANC, Delivery
- HIV tests performed
- HIV test positivity rate
- ART coverage

Individual Diseases – TB
- TB case rate per 100,000 population
- TB new pulmonary cases bacteriologically confirmed
- TB treatment success rate
- TB drug-resistant cases per 100,000 population

Individual Diseases – Malaria
- Malaria incidence rate (confirmed) per 100,000 population
- Malaria suspect / fever test rate
- Malaria RDT testing rate
- Malaria test positivity rate
- Malaria ACT treatment rate
- Malaria ANC LLIN distribution rate

Miscellaneous
- Progress towards program targets: monthly progress, progress trend, cumulative progress

6.0 RESPONSIBILITIES FOR INDIVIDUAL STAFF CATEGORIES

To ensure adherence to the standard operating procedures, recommendations for each
staff category are summarized in the table below.
Table 11: Responsibilities per Staff Category

Health Facility Staff
Appoint a focal person for HMIS data reporting to ensure:
• Accurate data generation
• Timely submission of monthly reports
• Data visualization & analysis
• Data use for program monitoring & decision making

HMIS Officers
• District HMIS officers to support & mentor health facility staff
• Regional HMIS officers to support & mentor district / HF teams
• State HMIS officers to support & mentor regional teams
• Active participation in the DQA process

Implementing Partners
• Active participation in the DQA process
• Provide feedback to supported HFs
• Encourage HF staff to use data for monitoring / planning
• Support MOH through mentorship and training

Ministry of Health
• Ensure standard data tools are available at HF level
• Ensure staff are adequately and sufficiently trained
• Routinely share data with partners and stakeholders
• Oversee periodic review and update of DHIS / tools / SOPs

7.0 FUTURE CONSIDERATIONS

7.1 HMIS UPDATE


During the 2021 DHIS revision and update, reporting tools were simplified,
integrated, and merged to ensure harmonized reporting and standardized data
collection, and to abolish parallel / vertical reporting structures.

7.2 ELECTRONIC DATA CAPTURE

The future vision and next step are electronic medical registers (EMRs), with traditional paper registers either partly or completely replaced by their electronic versions. EMRs have the advantages of speed, accuracy, and efficiency. They are also easy to update and minimize data compilation errors, and eliminating the regular printing of registers is beneficial with regard to both costs and the environment.

Prior to the introduction of EMRs, a readiness assessment will be conducted evaluating the following points:
1. The electronic patient data system should be compatible with DHIS and allow easy / automated data flow as required.
2. Electronic patient data capture systems should reflect the layout and design of the hard copy documents (registers) to minimize data entry errors.
3. Electronic systems should include data validation, range checks and consistency checks to ensure good data quality (a minimal sketch of such checks follows this list).
4. The system should be secure to prevent unauthorized access to the data.
5. The system should have automated levels of access and privileges allowing the customization of relevant functions for individual users.
6. The system should automatically record user IDs and time and date stamps at the time of data entry to enable data audits.

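The checks an EMR will eventually need depend on its final design; purely as an illustration of point 3, the minimal Python sketch below shows the kind of range and consistency rules such a system could apply before accepting a monthly summary record. The field names (anc1_visits, rdt_tested, etc.) are hypothetical examples, not the official register layout.

    # Illustrative only: range and consistency checks an EMR could run before
    # accepting a monthly summary record. Field names are hypothetical examples.
    def validate_monthly_summary(record: dict) -> list:
        """Return a list of data quality issues found in one monthly summary record."""
        issues = []

        # Range checks: counts must be non-negative whole numbers.
        for field in ("anc1_visits", "deliveries", "low_birth_weight",
                      "rdt_tested", "rdt_positive"):
            value = record.get(field)
            if not isinstance(value, int) or value < 0:
                issues.append(f"{field}: expected a non-negative whole number, got {value!r}")

        # Consistency checks: a subset can never exceed its total.
        if record.get("rdt_positive", 0) > record.get("rdt_tested", 0):
            issues.append("rdt_positive is greater than rdt_tested")
        if record.get("low_birth_weight", 0) > record.get("deliveries", 0):
            issues.append("low_birth_weight is greater than deliveries")
        return issues

    # Example record with one deliberate error (more positive tests than tests done).
    example = {"anc1_visits": 40, "deliveries": 20, "low_birth_weight": 2,
               "rdt_tested": 15, "rdt_positive": 18}
    for issue in validate_monthly_summary(example):
        print("Validation issue:", issue)

Comparable rules can also be configured in DHIS as validation rules, so the same quality checks can apply whether data are entered from paper summaries or pushed from an EMR.
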

8.0 ANNEXES

ANNEX 8.1: DISTRICT DATA LOG
Name of District:                                   Month & Year of report:
Name of Region:                                     Number of health facilities that have submitted reports:
Number of registered health facilities:             Number of HFs with incomplete reports:
Number of functional health facilities:             Number of HFs with missing reports:

                          Expected Reports for     Received Reports for     Complete Reports       Totals
Health Facility           calendar month           calendar month           (Yes = 1 / No = 0)     (Received / Complete /
                          (Yes = 1 / No = 0)       (Yes = 1 / No = 0)                              Incomplete / Missing)
MF-01
MF-02
MF-03
MF-04
MF-05
MF-06
MF-07
MF-08
MF-09
MF-10
MF-11
MF-12
QF-14
QF-15
(Include all HFs in District)

Log completed by:                                   Date:

Designation:                                        Signature:

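The district data log above is normally completed by hand or in a spreadsheet. Purely as an illustration, the Python sketch below shows how the summary figures at the top of the log (facilities that submitted, incomplete and missing reports) follow from the per-facility rows; the data structure is an invented example, not a prescribed format.

    # Illustrative only: derive the district data log totals from per-facility rows.
    # The input format is an invented example, not an official data structure.
    facility_reports = {
        "MF-01": {"received": 1, "complete": 1},
        "MF-02": {"received": 1, "complete": 0},   # incomplete report
        "MF-03": {"received": 0, "complete": 0},   # missing report
    }

    expected = len(facility_reports)   # every registered / functional HF owes one report
    received = sum(r["received"] for r in facility_reports.values())
    complete = sum(r["complete"] for r in facility_reports.values())
    incomplete = received - complete
    missing = expected - received

    print(f"HFs that have submitted reports: {received} of {expected}")
    print(f"HFs with incomplete reports:     {incomplete}")
    print(f"HFs with missing reports:        {missing}")
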
ANNEX 8.1 b: DISTRICT PERFORMANCE
Name of District:                                   Number of operational health facilities in the district:
Name of Region:                                     Number of registered health facilities in district:
Year:

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a facility did not submit any reports.

Health Facility Names                Jan  Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Oct  Nov  Dec   Completeness   Timeliness
(Include all HFs in District)

Total Expected Reports (E)
Total Received Reports (R)
Total Timely Reports (T)
Total Late Reports (L)

Log completed by:                                   Date:

Designation:                                        Signature:

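The SOP does not spell out how the totals E, R, T and L are converted into the completeness and timeliness columns. Assuming the usual definitions (completeness = received reports divided by expected reports, timeliness = timely reports divided by expected reports), a minimal Python sketch of the calculation is given below.

    # Illustrative only: turn the performance log totals into percentages.
    # Assumes completeness = R / E and timeliness = T / E, expressed as %.
    def performance_rates(expected: int, received: int, timely: int):
        """Return (completeness %, timeliness %) for a facility, district or region."""
        if expected == 0:
            return 0.0, 0.0
        completeness = 100.0 * received / expected
        timeliness = 100.0 * timely / expected
        return round(completeness, 1), round(timeliness, 1)

    # Example: 12 expected monthly reports, 11 received, 9 of them submitted on time.
    completeness, timeliness = performance_rates(expected=12, received=11, timely=9)
    print(f"Completeness: {completeness}%   Timeliness: {timeliness}%")

The same calculation applies unchanged to the regional and national performance logs in Annexes 8.2 b and 8.3 b.
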
ANNEX 8.2: REGIONAL DATA LOG
Name of Region:                                     Month & Year of report:
Number of registered health facilities:             Number of districts with missing reports:
Number of functional health facilities:             Number of health facilities that submitted reports:
Number of HFs that submitted reports:                Number of health facilities with missing reports:

                          Expected Reports for     Received Reports for     Complete Reports       Totals
District Names            calendar month           calendar month           (Yes = 1 / No = 0)     (Received / Complete /
(Include all Districts    (Yes = 1 / No = 0)       (Yes = 1 / No = 0)                              Incomplete / Missing)
in the Region; one
row per district)

Log completed by:                                   Date:

Designation:                                        Signature:

ANNEX 8.2 b: REGIONAL PERFORMANCE
Name of Region:                                     Number of Districts in Region:
Year:

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a district did not submit any reports.

District Names                         Jan  Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Oct  Nov  Dec   Completeness   Timeliness
(Include all Districts in the Region)

Total Expected Reports (E)
Total Received Reports (R)
Total Timely Reports (T)
Total Late Reports (L)

Log completed by:                                   Date:

Designation:                                        Signature:

ANNEX 8.3: NATIONAL DATA LOG
Number of registered health facilities:             Month & Year of report:
Number of functional health facilities:             Number of regions / districts / HFs with missing reports:
Number of HFs that submitted reports:                Number of health facilities with missing reports:
                                                     Number of health facilities with incomplete reports:

                          Expected Reports for     Received Reports for     Complete Reports       Totals
Region Names              calendar month           calendar month           (Yes = 1 / No = 0)     (Received / Complete /
(Include all Regions;     (Yes = 1 / No = 0)       (Yes = 1 / No = 0)                              Incomplete / Missing)
one row per region)

Log completed by:                                   Date:

Designation:                                        Signature:

ANNEX 8.3 b: NATIONAL PERFORMANCE
Number of Regions:
Year:

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a region did not submit any reports.

Region Names                           Jan  Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Oct  Nov  Dec   Completeness   Timeliness
(Include all Regions)

Total Expected Reports (E)
Total Received Reports (R)
Total Timely Reports (T)
Total Late Reports (L)

Log completed by:                                   Date:

Designation:                                        Signature:

ANNEX 8.4: DQA TEMPLATE
NAME OF HEALTH FACILITY:

DATE OF DQA:

NAMES OF STAFF CONDUCTING DQA:

Please complete the table below using values obtained from the HF registers, the monthly summary forms and the DHIS. Indicate the analysis (%) in the last 2 columns if values in the 3 sources are available.

Program        Indicator                                            Register   Form   DHIS   Available   Matching

OPD            4. New patients treated at HF this month
               5. U5 watery diarrhea cases
               6. RDT/Slide confirmed Malaria cases
Nutrition      8. Children U5 screened for malnutrition
               9. U5 MAM cases
               10. U5 SAM cases
ANC            11. ANC-1 visits
               12. ANC-4 visits
Deliveries     15. Deliveries in the health facility
               16. Deliveries monitored with partograph
               17. Low-birth-weight babies
Immunization   18. Children U1 immunized with Penta-1
               19. Children U1 immunized with Penta-3
               20. Children U1 immunized against Measles
PNC            20. PNC-1 checks within 48 hours
               21. New mothers counselled on IYCF
Laboratory     22. RDT tests for Malaria
               23. Positive RDT tests
Dispensary     26. RDT / ORS opening balance
               27. RDT / ORS received
               RDT / ORS in store
PMTCT (ANC)    29. PW counselled & tested for HIV
               30. PW with positive HIV test results
HR             Staff trained on RMNCH topics
HMIS           Timeliness
               Reporting Rate
               Validation Passes (DHIS validation rules)
               DQA Score from supportive supervision visits

SUMMARY
Total indicators selected
Total indicators with available entries (# and %)
Total indicators with matching entries (# and %)

In the absence of a complete DQA, the DHIS validation rules and the reporting & timeliness rates should be used to assess data quality; the score is then the average of these two values. During a HF DQA, please also do a visual inspection of all the forms submitted, identify the proportion of missing data elements, and check for delays in data entry based on DHIS records and the date recorded on the monthly summary forms. (A minimal sketch of the "available" and "matching" calculation is given below.)

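The template asks how many of the selected indicators have values available in all three sources and how many of those values match. Assuming that "available" means all three sources hold a value and "matching" means the three values are identical, the small Python sketch below shows the calculation; the indicator names and numbers are invented.

    # Illustrative only: "available" and "matching" percentages for the DQA template.
    # Assumed definitions: available = value present in register, form and DHIS;
    # matching = those three values are identical. All figures are invented.
    dqa_values = {
        "ANC-1 visits":      {"register": 42, "form": 42, "dhis": 42},
        "ANC-4 visits":      {"register": 30, "form": 28, "dhis": 28},
        "Deliveries in HF":  {"register": 18, "form": 18, "dhis": None},  # not entered in DHIS
    }

    total = len(dqa_values)
    available = sum(1 for v in dqa_values.values() if None not in v.values())
    matching = sum(1 for v in dqa_values.values()
                   if None not in v.values() and v["register"] == v["form"] == v["dhis"])

    print(f"Indicators selected:        {total}")
    print(f"Available in all 3 sources: {available} ({100 * available / total:.0f}%)")
    print(f"Matching in all 3 sources:  {matching} ({100 * matching / total:.0f}%)")
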
ANNEX 8.5: DQA LOG
This log should be maintained for every HF / District / Region, and an overall one for national level DQAs. It should be updated every time a DQA is conducted so that quality trends can be tracked at all levels. This log should be completed and reviewed together with the DQA template.

HF / DISTRICT / REGION:

DQA LEVEL (tick one): HF [ ] District [ ] Region [ ] National [ ]

DQA TEAM MEMBERS:

DQA     Feedback   DQA          Timeliness   Reporting   Key Action Points                       Next DQA
Date    Date       Score (%)                 Rate        (Description / Date Resolved)           Date

ANNEX 8.6: SCORECARD
Enter the % values that were available and / or matching for every program area, and the % of validation rules passed per program area.

The final score is the average of all entries in Column I and should be coded Red, Yellow, or Green as shown in the color-code table below (a worked sketch of this calculation follows that table).

Note: This scorecard is used to assess the status / quality of HMIS data; however, scorecards can also be used to assess the quality of clinical services and the overall performance of a program.

                    Register vs Forms        Forms vs DHIS            Overall
Program             Available   Matching     Available   Matching     Available      Matching       Validation   Overall
(A)                 (B)         (C)          (D)         (E)          (F)=(B+D)/2    (G)=(C+E)/2    (H) (%)      (I)=(F+G+H)/3
OPD
Nutrition
ANC
Deliveries
Immunization
PNC
Laboratory
Store
PMTCT - ANC
Timeliness
Reporting
Validation
DQA Score
Average

DQA Score Card Color-Codes

Score        Color     Implication
< 50 %       Red       Immediate and intensive mentorship required
50 - 74 %    Yellow    Enhanced mentorship / supportive supervision required
≥ 75 %       Green     Routine mentorship required

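Purely as an illustration of the arithmetic described in this annex, the Python sketch below computes the overall columns (F, G and I) for individual program rows, averages column I into the final score, and applies the color codes above. All percentages are invented.

    # Illustrative only: scorecard arithmetic as described in this annex.
    # B, C = Register vs Forms (available %, matching %); D, E = Forms vs DHIS;
    # H = % of DHIS validation rules passed. All figures below are invented.
    def overall_row(b, c, d, e, h):
        """Return column I for one program area."""
        f = (b + d) / 2            # column F: overall available
        g = (c + e) / 2            # column G: overall matching
        return (f + g + h) / 3     # column I: overall score

    def color_code(score):
        """Map a score to the DQA color code used above."""
        if score < 50:
            return "Red"
        if score < 75:
            return "Yellow"
        return "Green"

    # Two example program areas.
    rows = [overall_row(b=90, c=80, d=85, e=75, h=70),
            overall_row(b=60, c=50, d=55, e=45, h=40)]
    final_score = sum(rows) / len(rows)    # final score = average of column I
    print(f"Final score: {final_score:.1f}% -> {color_code(final_score)}")
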
ANNEX 8.7: SUPPORTIVE SUPERVISION TOOL

ANNEX 8.8: DATA VISUALISATION

See DHIS2 Trainers’ Manual:
• Section 2 – Trainers Guide to Data Visualizer (p 5-18)
• Section 3 – Pivot Tables (p 19-42)
• Section 4 – Maps (p 43-69)
• Section 5 – Dashboards (p 70-80)

ANNEX 8.9: MENTORSHIP GUIDELINES

BACKGROUND
The Somali health sector is committed to the provision of high-quality health services. This depends on good quality monitoring systems, including adequate documentation tools, trained medical / HF staff, and resources for monitoring and supportive supervision of health programs. Ideally, supportive supervision should also include a mentorship program which fosters the capacity of health staff to respond to the health needs of their catchment populations.
This section is meant to serve as a first step and guide towards a mentorship program for staff members who are part of health systems monitoring and evaluation.

HIGH-QUALITY MONITORING & EVALUATION IS NOT ONLY A DONOR REQUIREMENT.
IT IS HOW THE IMPACT OF INTERVENTIONS IS MEASURED AND
THE ACHIEVEMENT OF TARGETS IS ASSESSED.

Supportive supervision and mentorship are a joint effort / partnership between a mentor and his/her mentee to improve motivation, knowledge, skills, and performance. Although mentorship and supportive supervision have several commonalities, the former is generally less hierarchical and more hands-on, building the skills of health workers and thereby the capacity of the health worker – and hence the health facility – to achieve their aims. Mentorship is a relationship between a mentor and a mentee to bring about an exchange of learning and foster development. Required systems and skills include data collection & collation, data validation, analysis & interpretation, as well as data visualization and dissemination.

ENABLING HEALTH WORKERS TO COLLECT & USE DATA
TO GUIDE PROGRAMS AND POLICIES.

GOALS AND OBJECTIVES OF MENTORSHIP

The goal of mentorship is to provide health workers with the necessary knowledge and skills to undertake the tasks required for the effective running of health facilities and systems.

The objective of mentorship is to improve the knowledge & skills of health workers on
1. Assessing data systems at health facilities.
2. The importance of data and information.
3. Simple data analysis to provide information for decision making.
4. Data visualization & presentation.

Mentoring staff is important as it
• Contributes to the development of the organization's talent.
• Helps new staff members adjust quickly to a new role and organizational culture.
• Promotes diversity.
• Provides a broader perspective on the challenges facing staff at all levels.
• Creates a greater sense of involvement.
• Supports an innovative working environment.

Effective mentors
• Serve as a role model for effective organizational behaviors and attitudes.
• Give actionable advice and feedback.
• Resist the temptation to solve the problems of their mentees.
• Challenge the people they mentor to develop a plan for success.
• Create a foundation of support.
• Suspend judgment.

MENTOR AND MENTEE

Who can be a mentor?
Anyone who has undergone the necessary mentorship training, has knowledge, skills,
and experience in the required field and possesses the characteristics of an effective
mentor (see above).

Characteristics of a Mentee
• Able to accept constructive criticism.
• Transparent and sincere.
• Asks relevant questions to improve his/her knowledge, skills, and performance.
• Keeps his/her work supervisor informed on progress.
• Documents progress made during the mentorship program.

MENTORSHIP PROCESS
Preparing for a Mentorship
• A schedule of regular meetings should be agreed between mentor and mentee with the approval of the facility manager in charge.
• Prepare for the topic of the planned visit, e.g., hand-outs, exercises, tools.

How to mentor
• Listen & communicate effectively.
• Acknowledge achievements.
• Identify areas of strengths and weaknesses.
• Identify challenges and opportunities in the work environment.
• Identify appropriate resources and give tips on how to utilize these resources.

HMIS MENTORSHIP
How to mentor health workers on data monitoring
Data system mentorship fosters continuous capacity building of health workers on the
upkeep of high-quality data records and reports as well as their dissemination and
use.

Prior to mentorship commencing
• Assess needs based on DQA and HF assessment reports.
• Identify staff and specific topic requiring capacity building.
• Identify a mentor for the required topic from the in-country team.
• Agree on date & topic of mentorship visit with facility manager in charge.
• Ensure relevant staff are available and can dedicate at least 2-3 hours.
• Ensure necessary resources are available (Wi-Fi, computer, etc.)
• Prepare a short exercise to improve skills, ideally with data from the respective HF.

At the time of the first mentorship visit
• Prepare for the topic of the planned visit e.g., hand-outs, exercises, tools.
• Plan to reach the facility well in advance to allow at least 2-3 hours of dedicated
time with the relevant staff member(s).
• On arriving at the health facility:
- Introduce yourself, by name and designation, to the HF manager and staff.
- Inform them about the purpose of the visit.
- Obtain staff names and designations, especially the HMIS data point person.
- Agree on the point person (HMIS officer / nurse) who will be the mentee.
- Review tools and ensure all needed registers are available.
- Perform a quick DQA and assess data tools, data flow, monthly data reports,
data display and use.
- Identify and agree on strengths and weaknesses.
- Develop a work plan for the mentorship and improvement plan with timelines.
- Debrief with all staff.
- Agree on next visit date and areas for mentorship.

After the mentorship visit
• Prepare mentorship visit report, attaching the agreed mentorship plan.
• Follow-up on mentorship plan and prepare for next visit.
• Provide a progress report at quarterly review meetings.