HMIS
STANDARD
OPERATING
PROCEDURES
Ministry of Health
February 2022
TABLE OF CONTENTS
LIST OF ABBREVIATIONS 4
DEFINITION OF TERMS 4
1.0 INTRODUCTION 5
1.1 BACKGROUND & OVERVIEW 5
1.2 ABOUT THE DOCUMENT 5
1.3 DISSEMINATION 6
5.1 DATA UTILIZATION AT FACILITY LEVEL 20
5.2 DATA ANALYSIS AT DISTRICT / REGIONAL / NATIONAL LEVEL 21
5.2.1. Overview 21
5.2.2 Basic Epidemiological Concepts 22
5.3 DATA ANALYSIS AND PRESENTATION 25
Conduct a Data Quality Check 25
Data Analysis & Presentation 25
Date of Data Analysis 25
8.0 ANNEXES 31
ANNEX 8.1: DISTRICT DATA LOG 32
ANNEX 8.1 b: DISTRICT PERFORMANCE 33
ANNEX 8.2: REGIONAL DATA LOG 34
ANNEX 8.2 b: REGIONAL PERFORMANCE 35
ANNEX 8.3: NATIONAL DATA LOG 36
ANNEX 8.3 b: NATIONAL PERFORMANCE 37
ANNEX 8.4: DQA TEMPLATE 38
ANNEX 8.5: DQA LOG 39
ANNEX 8.6: SCORECARD 40
ANNEX 8.7: SUPPORTIVE SUPERVISION TOOL 41
ANNEX 8.8: DATA VISUALISATION 42
ANNEX 8.9: MENTORSHIP GUIDELINES 43
BACKGROUND 43
GOALS AND OBJECTIVES OF MENTORSHIP 43
MENTOR AND MENTEE 44
MENTORSHIP PROCESS 44
HMIS MENTORSHIP 44
LIST OF TABLES AND FIGURES
FIGURE 1: AN APPROACH TO ENHANCE DDU CAPACITY TO IMPROVE DATA DEMAND AND USE. 19
LIST OF ABBREVIATIONS
DEFINITION OF TERMS
1.0 INTRODUCTION
1 Everybody's business: strengthening health systems to improve health outcomes: WHO's framework for action. World Health Organization 2007; ISBN 978 92 4 159607
• It describes the steps required to prepare for and conduct DQA and presents
templates to be used in DQA process.
• It gives guidelines for basic data analysis, interpretation and presentation using
selected indicators from the national HMIS indicators list.
• It complements the national HMIS tools users’ manual, hence the two should
be used together.
1.3 DISSEMINATION
The SOP should be disseminated as a cascade: the FMOH / SMOH HMIS teams are trained
first and then facilitate the training of the regional teams, who will in turn train the
district teams. The latter should then train and mentor the HF teams of each facility in
their district / area to complete the cascade.
2.1 PREAMBLE
• There should be a trained HMIS focal person at all levels where HMIS data is
either generated, aggregated, or processed.
• Each facility should have a unique code from the DHIS master facility which is
issued and maintained by national HMIS office. The code should be indicated
on all HF HMIS tools. Having an MFL is important for establishing the exact
number of HFs, and for resource allocations including supportive supervision.
• All HMIS data should be collected using the updated standard health facility
registers. The inside cover of all registers clearly explains how data should be
collated and the instructions have also been translated into Somali. This is
hoped to minimize data collection and reporting errors.
• Data should be summarized using standard monthly summary forms. The forms
contain guidelines, both in English and Somali, on how to summarize data from
the various registers.
4. Patient data should be analyzed at health facility level, where it is generated,
and only the summarized data (summary sheets) should then be shared and
distributed.
5. Access to personalized data should be limited to authorized personnel. Each
user of the system should have a password protected individual account and
passwords must not be shared between users.
6. Assignment of access rights to DHIS should be handled by the national HMIS
coordinator. S/he should keep a record of authorized users and their access
levels.
7. Users should be advised against sharing their database login details with
anyone.
8. The database should have an audit trail for any edits made to the data (this is
inbuilt in DHIS). The trail should include original value, date of the change and
user details, and a brief explanation of why the change was made.
9. New staff should be taken through an induction program to familiarize them
with the HMIS system and data safety features.
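DHIS keeps such an audit trail automatically. Purely for illustration, a minimal sketch of the record described in step 8 (the data element, values, user name and reason below are all hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """One edit to a stored data value, as described in step 8."""
    data_element: str     # e.g. a monthly summary data element
    original_value: int   # value before the change
    new_value: int        # value after the change
    user: str             # account that made the change
    reason: str           # brief explanation of why the change was made
    changed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Example: a facility corrects a transcription error after a recount.
trail = []
trail.append(AuditEntry("OPD visits under 5", 130, 103,
                        "district_hmis_officer",
                        "Register recount: digits transposed"))
```

Each change appends a new entry, so the full edit history of a value remains available for audits.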
Register (applicable facilities)                      Code
Hospital Inpatient Register                           R 01
OPD Over-5 Register (Hospitals & HCs)                 R 02
OPD Under-5 Register (Hospitals & HCs)                R 03
OPD Register (PHUs & CH)                              R 04
Pre-ART Register                                      R 05-A
ART Register                                          R 05-B
Immunization (EPI) Register                           R 06
Nutrition Register                                    R 07
Birth Spacing (BS) Register                           R 08-A
Antenatal Care (ANC) Register                         R 08-B
Labor, Delivery & Maternity Register                  R 09-A
Postnatal Care (PNC) Register                         R 09-B
Laboratory Services Register                          R 10-A
Lab Results Register                                  R 10-B
LMIS Essential Medicines Register (Hospitals & HCs)   R 11-A
LMIS Program Supplies Register (Hospitals & HCs)      R 11-B
LMIS Essential Supplies Register (PHUs)               R 12
Theater Register                                      R 13
TB Register                                           R 14
STI Register                                          R 15
VCT Register                                          R 16
Refer to the DHIS User & HF Manual for details on how to use the above tools.
3.2 DATA COLLECTION PROCEDURE
HMIS data should be collected from each Hospital, District Hospital, Health Center
(HC) and Primary Health Unit (PHU). All staff involved in the collection and
management of patient-related information must ensure that data use does not
compromise patient confidentiality:
1. Client data should initially be captured in the designated patient medical
cards / charts by the health care provider.
2. The provider should then immediately transfer the recorded data from the
medical card into the appropriate register (see Table 1).
3. The registers should be updated with each patient visit.
4. Each month should start on a new page of the register.
5. Staff should compile the page summaries at bottom of each fully completed
page of the register.
6. At end of each month, staff should add up all page totals for the relevant
month to get monthly summary.
7. District / Regional HMIS officers, MOH and IP staff should strengthen this process
through routine supportive supervision and continuous mentorship.
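Steps 5 and 6 above amount to simple addition: each completed register page yields a page total, and the monthly summary is the sum of the page totals. A minimal sketch (the data element names and page totals are hypothetical):

```python
# Hypothetical page totals per data element: one list entry per fully
# completed register page (step 5); step 6 adds them up for the month.
page_totals = {
    "OPD new visits under 5": [42, 38, 45, 17],
    "OPD follow-up visits under 5": [11, 9, 14, 6],
}

# Monthly summary value for each element, as entered on the summary form.
monthly_summary = {element: sum(totals)
                   for element, totals in page_totals.items()}
```

With the figures above, the summary form would carry 142 new visits and 40 follow-up visits for the month.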
Each register and summary sheet has a version number and version date clearly printed
on the cover page. This reduces chances of using outdated registers.
The instructions / summary instructions / tables inside the front cover of every register
and summary sheet:
1. Describe what data should go into which column or cell
2. State if the tool is to be used together with any other tool, e.g., a tally sheet.
2. The staff member completing the monthly summary sheet should
indicate his/her name and designation, and sign in the appropriate
section of the form.
3. The facility in-charge should cross-check all reports for errors before
submitting to the district / regional HMIS officer and use DQA forms and
data validation rules as a guide (Table 6).
4. If the report has no errors, the in-charge should sign the requisite section
of the report. If any errors are identified, request the staff member to
correct the form and then review it again. Ensure all expected reports are
prepared for submission.
5. Two copies of the report should then be submitted to the district /
regional HMIS officer by the 5th day of each month to enable data entry
to DHIS by the 10th of each month.
6. The facility should retain a copy of the report for their records as well as
for data quality audits (DQAs) by district/regional HMIS officers, MOH
and IPs.
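The submission deadline in step 5 can be checked with a small helper using the T / L / N codes from the annex performance templates. A sketch under this SOP's assumptions (the 5th-day deadline applies to HF reports; pass a different `deadline_day` for district or regional reports):

```python
def timeliness_code(day_submitted, deadline_day=5):
    """Return the annex reporting code for one monthly report.

    T = timely, L = late, N = no report submitted.
    day_submitted: day of the month the report arrived, or None if
    it never arrived. deadline_day defaults to the 5th, per this SOP.
    """
    if day_submitted is None:
        return "N"
    return "T" if day_submitted <= deadline_day else "L"
```

For example, a report arriving on the 4th is coded T, one arriving on the 9th is coded L, and a missing report is coded N.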
At District Level
1. The district HMIS officer should ensure that s/he has received all expected
reports from all functional health facilities in the district and update the district
monthly data log (see Annex 1 for the District Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All fields in the received health facility reports are filled.
o Stamp and date all the received reports.
3. S/he should contact any facilities whose reports are either missing or
incomplete for clarifications before compiling the district report and update
the district reporting performance report (see Annex 1b).
4. All data queries should be resolved before submitting the data to the next
level, or before entering data into DHIS.
5. The district monthly HMIS report should then be compiled using the standard
Excel template. Skip this step if using DHIS.
6. The district monthly Excel file (if used), the district data log and Annex 1b should
be submitted to the regional HMIS officer by the 7th day of each month.
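The district data log bookkeeping in steps 1-3 can be sketched with the Yes = 1 / No = 0 convention of Annex 1 (facility names and values below are hypothetical):

```python
# Hypothetical monthly district log: 1 = report received, 0 = missing.
received = {
    "HF A": 1,
    "HF B": 1,
    "HF C": 0,
}

expected = len(received)                 # all functional HFs report
got = sum(received.values())             # reports actually received
missing = [hf for hf, r in received.items() if r == 0]

# Reporting rate for the district performance report (Annex 1b).
reporting_rate = 100 * got / expected
```

Here two of three facilities reported, giving a reporting rate of about 66.7%, and "HF C" would be contacted for clarification before the district report is compiled.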
At Regional Level
1. The regional HMIS officer should ensure that s/he has received all expected
reports from all districts in the region and update the regional data log (see
Annex 2 for the Regional Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All fields in the received monthly reports have been completed.
o Stamp and date all the received reports.
3. S/he should contact any districts whose reports are either missing or incomplete
for clarifications before compiling the regional report.
4. All data queries should be resolved before submitting the data to the next
level, or before entering data into DHIS.
5. The regional HMIS report should then be compiled using the standard Excel
template. Skip this step if using DHIS.
6. The regional monthly Excel file, copies of district & regional data logs and
Annex 2b should then be submitted to the national HMIS office by the 10th day
of each month.
At National Level
1. The National HMIS coordinator or his/her designee should ensure that s/he has
received all expected reports from all the regions and update the national
monthly data log (see Annex 3 for the National Data Log Template).
2. The HMIS officer should ensure that:
o All expected reports have been received.
o All the fields in the received monthly reports have been completed.
o Stamp and date all the received reports.
3. S/he should contact any regions whose reports are either missing or incomplete
for clarifications before compiling the national HMIS report.
4. All data queries should be resolved before commencing data analysis.
5. A national HMIS report should then be compiled and shared with various
stakeholders and feedback provided to regions and districts. The analysis is
done using DHIS visualization features, such as pivot tables and maps.
Any queries identified after running DHIS validation rules should be discussed with the
respective facility/ facilities and resolved:
- HFs should recheck registers and – if required - make necessary adjustments on
summary forms.
- Any changes should be documented both on the summary sheet (hard copy) and DHIS
(soft copy) for future reference. In DHIS this is done by double-clicking the value that
is being modified.
4.0 DATA QUALITY AUDITS
4.1 OVERVIEW
Routine monitoring and supportive supervision activities generate important data for
patient care and programme improvement. Data quality assessments help to identify
errors and provide information on each facility's mentorship needs to strengthen
good data practices.
This SOP contains detailed instructions for conducting HF data quality audits (DQAs).
It looks at completeness of data records and compares register records with
compiled data on the monthly summary sheets and DHIS reports.
The DQA process verifies key data characteristics and facilitates the process of
evaluating data standards. Table 4 summarizes the various dimensions of data quality.
Validity Measures the ability of data to reflect actual events / outcomes. For
example, screening a child's nutritional status by means other than
MUAC and Weight/Height is not valid.
Reliability The extent to which the source data can be relied upon; this evaluates
whether data show similar results when indicators are measured more
than once under similar conditions.
4.3 DQA DATA MANAGEMENT
As DHIS data is aggregated from different registers, the DQA needs to compare reported
data against all sources. Due to the time-consuming nature of DQAs (4 hours on
average), the DQA team ought to focus on selected data elements for each
program area. A sample list of DQA indicators and their source documents is shown
in Table 5.
Note: Registers R-04 & R-12 and Summary Sheets MF-04 & MF-12 are only applicable for PHUs.
Table 6: Data Validation Rules (by monthly summary form)

MF-02 / MF-03   OPD curative child new (0-59m) + OPD curative child follow-up (0-59m) = Total OPD Visits Child (0-59m)
MF-03           Malaria confirmed & treated (ACT / Primaquine) < Fever cases tested for Malaria (RDT / Microscopy)
MF-04           Malaria confirmed & treated (ACT / Primaquine) < Fever cases tested for Malaria (RDT)
MF-04           Live birth in community <= Delivery in community
MF-05           Antenatal client HIV positive <= Antenatal client HIV tested
MF-05           Postnatal care HIV positive <= Postnatal care HIV tested
MF-06           FACILITY Measles (0-11m) < FACILITY BCG Immunizations (0-11m)
MF-06           FACILITY Penta-3 (0-11m) < FACILITY Penta-1 Immunizations (0-11m)
MF-06           Total BCG Immunizations during the month <= BCG used this month
MF-07           MUAC Red + Yellow + Green = Child screened MUAC
MF-08           ANC-4 Visits <= ANC-1 Visits
MF-09           Delivery assisted vaginal / Delivery Caesarean < Delivery normal
MF-09           Live birth weight < 2.5 kg < Live birth in facility
MF-09           Maternal Death during Pregnancy <= Maternal Death Total
MF-10           Positive RDT Tests <= Total RDT Tests done
MF-10           Positive Microscopy Slides <= Total Slide Microscopy done
MF-11           Total RDT done = RDT used this month
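Validation rules of this kind can be expressed as simple checks over a facility's monthly summary. A minimal Python sketch, not the actual DHIS implementation (the element keys and figures below are hypothetical):

```python
# Each rule returns True when the monthly summary s is internally
# consistent. Element names are illustrative, not exact DHIS names.
RULES = {
    "Penta-3 <= Penta-1":
        lambda s: s["penta3"] <= s["penta1"],
    "Positive RDTs <= RDTs done":
        lambda s: s["rdt_positive"] <= s["rdt_done"],
    "New + follow-up = total OPD child visits":
        lambda s: s["opd_new"] + s["opd_follow_up"] == s["opd_total"],
}

def run_validation(summary):
    """Return the names of rules that the summary fails."""
    return [name for name, rule in RULES.items() if not rule(summary)]

# Hypothetical monthly summary with one inconsistency (130 + 40 != 171).
summary = {"penta1": 60, "penta3": 48, "rdt_positive": 12,
           "rdt_done": 90, "opd_new": 130, "opd_follow_up": 40,
           "opd_total": 171}
failures = run_validation(summary)
```

Any failing rule would then be raised with the facility and resolved against the source registers before the data moves to the next level.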
Identify any missing reports / data prior to the planned HF visit. Copies of any missing
reports should be delivered by the HF during the DQA visit.
Conducting DQA
1. Brief HF & IP staff about purpose and steps of conducting DQA.
2. Open the DQA tool and enter the facility name, DQA date, and names of the
staff conducting the DQA into the appropriate fields.
3. Obtain copies of relevant registers and summary sheets.
4. Start the DQA process addressing all DQA areas documented in Table 5 & 6.
Allow time for discussions on issues that may arise during the DQA visit.
5. Complete the DQA supportive supervision tool.
6. At the end of the DQA, hold a joint meeting with the facility staff to discuss the
results and agree on how to resolve the identified gaps or inconsistencies.
7. A copy of the DQA report with action plan should remain with the HF and IP for
reference in preparation for tailored mentorship.
8. The HF should register the visit in their monitoring log which should contain the
following information: Date of DQA / DQA score / action points including
timeframe when action items should have been addressed.
Following the DQA visit, the team should also summarize their findings through graphs
and tables which should be shared with the HF and IP.
These illustrations should also be made available to HF managers who should use these
sheets during their presentation of the DQA findings to health facility / clinical staff.
Ideally, these pages should be printed out for the facility staff to review and post on their
walls.
DQA Timelines
1. DQA visits should ideally be conducted at each facility every 3 months and at
a minimum every 6 months as this provides an opportunity to continuously
review gaps and weaknesses and their eventual resolution. The following are
some of the criteria that ought to be applied:
2. Conduct DQA every 3-6 months (or earlier if an immediate follow-up DQA is
required).
3. Data for the current month should not be included in the DQA process since it
is likely to be incomplete and therefore not representative.
4. Do however check register entries of the current month for completeness and
accuracy.
5. DQA follow-up & action points need to be agreed immediately after
completing the DQA exercise.
6. During each DQA exercise, the team should review monthly reports for previous
3 months against source documents, as well as DHIS entries.
District / Regional HMIS officers, program managers, IPs and MOH staff should support
and mentor facilities on proper recording and reporting to ensure that all DHIS reports
are complete, accurate, reliable, and timely.
• A written final DQA report should be shared with facility staff within 7 days after
the DQA.
• Include DQA reports in quarterly data review meetings and ensure that 2-3
slides in the quarterly data review presentation are dedicated to DQA. The
slides should summarize district/regional DQA results and should always include
action points.
• After the facility DQA report is shared and discussed with facility staff,
mentorship on implementation of corrective activities should be provided by
MOH and IP staff. Follow-up and continued mentorship are critical for
improving data as well as service provision.
Health Facility Staff and MOH Managers
• Allocate a date and time for the DQA exercise every quarter and make sure
all required data tools are readily available (all registers in use and the monthly
summary forms from the past 3 months).
• Notify all HF staff about the exercise and ensure clinical services are not
disrupted.
• Ensure as many HF staff members as possible actively participate in
conducting DQA.
• Review results and identify measures for improving services/ data.
• Facilitate feedback sessions to share DQA results with all facility staff.
• Maintain a regularly updated log showing DQA dates as well as the respective
scores, and the dates feedback was provided to HF staff.
5.0 DATA DEMAND & USE
Descriptive epidemiology covers Time, Place, and Person by using the 5 Ws:
What = Health issue of concern
Who = Persons affected
Where = Place or geographical location
When = Time
Why / How = Causes, risk factors, modes of transmission
Health data should be presented in a way that enables decision makers to answer these 5
questions. Data analysis and appropriate presentation are key.
Figure 1 is an illustration of data demand and use (DDU) capacity improvement. The
goal of DDU capacity-enhancement efforts is increased data use for decision-
making, planning and advocacy, resulting in improved health outcomes.
Source: MEval2
Figure 1: An approach to enhance DDU capacity to improve data demand and use.
The following procedures are recommended for improving DDU capacity at various levels:
5.1 DATA UTILIZATION AT FACILITY LEVEL
1. HF staff involved in data management should be trained on basic data
interpretation/ analysis and presentation skills.
2. Each HF should identify 5 priority data elements and/or indicators (from the
national HMIS indicator list) for tracking and visualization (Tables 5 & 8).
3. Each HF should hold monthly data review meetings to review the 5 selected
data elements / indicators, prior to submission of their monthly summary sheets.
4. HF data analysis should at a minimum include trend analysis which will also assist
HF management in assessing the HF’s progress towards set targets.
5. District / Regional HMIS officers should support each of the HFs in their areas to
generate graphs showing monthly trends, to encourage the use of data for
decision making. If the required hardware / software / Wi-Fi is available at HF level,
this should also include capacity building on how to access DHIS and generate
HF dashboards and graphics.
6. MOH and IP staff should facilitate quarterly forums where HF staff can present
their analysis, exchange lessons learnt and receive feedback.
5.2 DATA ANALYSIS AT DISTRICT / REGIONAL / NATIONAL LEVEL
5.2.1. Overview
• The goal is that HF data is not only used to gain information but is also
analyzed, with feedback provided to facilities at regular (at least quarterly)
intervals. It is recommended that DHIS information products such as data
dashboards are used during feedback sessions to generate talking points and
identify success stories / areas for improvement, as this will improve
understanding and action.
• Prior to data analysis, data should however always be checked for quality /
accuracy / consistency. These quality checks should include checking for
missing data, missing reports, numbers outside the expected range (outliers)
and any other inconsistencies. Any errors identified should be corrected – IN
COOPERATION WITH THE RESPECTIVE HEALTH FACILITY – using the source
documents (= registers).
• Data analysis should also always include a
o Trend analysis (as this will most likely highlight seasonal variations).
o Performance comparison across geographical regions.
o Possible causes of the observed trends (simple logistic regressions should
be performed to test the strength of any associations).
• Analyses should be shared with relevant stakeholders through meetings, print
media and appropriate information products.
• An annual HMIS report should be prepared and shared with relevant
stakeholders.
• Every 3-5 years, trend reports should be prepared based on comprehensive
statistical analysis of HMIS and other relevant data, with the objective to:
o Show patterns / trends / new characteristics.
o Demonstrate correlations between different variables.
o Identify any predictors of disease burdens.
o Provide data analysis / visualizations in a regional, national, and global
context.
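The outlier screen mentioned above can be sketched as a simple median-based check (a deliberately crude rule of thumb; the factor of 3 and the monthly series below are hypothetical):

```python
from statistics import median

def flag_outliers(values, factor=3.0):
    """Crude screen for 'numbers outside the expected range': flag any
    value more than `factor` times the series median, or less than
    median / factor. Flagged values should be confirmed with the
    facility against the source registers, never silently changed."""
    m = median(values)
    if m == 0:
        return [v for v in values if v != 0]
    return [v for v in values if v > factor * m or v < m / factor]

# Hypothetical monthly OPD counts: the last value is a likely data
# entry error and gets flagged for follow-up with the facility.
monthly_opd = [100, 110, 95, 105, 100, 1000]
suspects = flag_outliers(monthly_opd)
```

A median-based rule is used here rather than mean and standard deviation because a single extreme value inflates the standard deviation enough to mask itself in short monthly series.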
5.2.2 Basic Epidemiological Concepts
MOH and HMIS officers should have a basic understanding of epidemiological
concepts and skills as this will help them to analyze and interpret health data. The
table below provides an overview / a brief introduction to the basic, relevant terms.
Table 9: Relevant Epidemiology Terms
Dropout Rate
The dropout rate refers to the proportion of people who fail to
complete a service as recommended.
Example
If 10 children U1 receive Penta-1 and only 5 receive Penta-3
vaccinations, then the dropout rate would be (10-5)/10 = 5/10 = [0.5 * 100] = 50%.
Similarly, if 20 PW access ANC-1 services but only 5 attend ANC-4
services, the dropout rate is (20-5)/20 = 15/20 = [0.75 * 100] = 75%.
A drop-out rate of ≥ 10% suggests a problem with the respective service.
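The calculation above generalizes to any first-contact / last-contact pair (Penta-1 vs Penta-3, ANC-1 vs ANC-4). A minimal sketch:

```python
def dropout_rate(first_contact, last_contact):
    """Dropout rate (%) between the first and last contact of a
    service, e.g. Penta-1 vs Penta-3 or ANC-1 vs ANC-4."""
    if first_contact == 0:
        return 0.0
    return 100 * (first_contact - last_contact) / first_contact

penta_dropout = dropout_rate(10, 5)   # Penta example above
anc_dropout = dropout_rate(20, 5)     # ANC example above
```

Both worked examples from the text are reproduced: a Penta dropout rate of 50% and an ANC dropout rate of 75%, both well above the 10% threshold that suggests a problem.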
Mode
Mode is the value that occurs most often in a data set.
Example
The weights of 5 children in OPD are 12, 15, 16, 12, 20 kilograms respectively.
The modal weight is 12 kilograms.
Proportion (= Fraction)
See description of Fraction.
Performance
Performance is the level of service uptake relative to the target population.
Example
Village B has a population of 12,000 children U5. If only 2,000 children are
immunized, the Village B performance in terms of uptake of
immunization services would be 2,000 / 12,000 = [0.167 * 100] = 16.7%.
Trend
Trend is the general direction in which something is developing or
changing over time.
Example
If 50 children U1 received Penta-1 vaccines in January and 80 in
February 2022, this is an upward or positive trend.
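The worked examples in this table can be reproduced with a few lines of Python (the helper function `trend` is illustrative, not a standard library routine):

```python
from statistics import mode

# Mode: the most frequent value in a data set (OPD weights example).
weights_kg = [12, 15, 16, 12, 20]
modal_weight = mode(weights_kg)          # 12 kg

# Performance: uptake as a percentage of the target population
# (Village B example: 2,000 of 12,000 children U5 immunized).
coverage = 100 * 2_000 / 12_000

def trend(earlier, later):
    """Direction of change between two reporting periods."""
    if later > earlier:
        return "upward"
    if later < earlier:
        return "downward"
    return "flat"

penta_trend = trend(50, 80)              # January vs February 2022
```

Running this gives the same answers as the table: a modal weight of 12 kg, coverage of about 16.7%, and an upward Penta-1 trend.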
5.3 DATA ANALYSIS AND PRESENTATION
The following steps are recommended when preparing for the quarterly data review
meetings:
Which errors should the HMIS officer correct, and which must be referred to HFs
for confirmation?
- Questionable results and extreme outliers must not be changed without written
confirmation from the HF managers.
- Corrections should be made not only in DHIS but also on the respective monthly summary
forms at the HF level.
- If ≥ 65% of entries from an HF do not pass validation rules, all data from that HF should be
rejected / not included in the analysis.
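The 65% rejection rule above can be sketched as a one-line check (the entry counts are hypothetical):

```python
def reject_facility_data(entries_checked, entries_failing,
                         threshold=0.65):
    """Apply the >= 65% rule: if that share of a facility's entries
    fails validation, exclude the facility's data from the analysis
    and refer the data back to the HF for correction."""
    if entries_checked == 0:
        return False
    return entries_failing / entries_checked >= threshold

reject_facility_data(40, 28)   # 70% failing: exclude from analysis
reject_facility_data(40, 10)   # 25% failing: keep, but resolve queries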
Table 10: Example HMIS Data Analysis Plan

HMIS
- HMIS scorecard
Where & How: Generated by DQA exercise; provides analysis / evidence on data accuracy

Service Uptake
OPD
- Average monthly OPD consultations last quarter
- OPD workload trend last 12 months
- Leading causes of OPD visits & contribution to case load
Inpatient
- Average monthly admissions last quarter
- Admissions trend last 12 months
- Leading causes of admissions
- Leading cause of death in hospitals
EPI
- BCG coverage (incl. missed opportunities – Birth / Polio)
- Penta dropout rate
- Measles coverage
- Vaccine wastage rate

Child Health
Diarrhea
- Average monthly diarrhea cases and 6-monthly trend
- Proportion of OPD visits related to diarrhea
- Percentage of U5s with diarrhea treated with Zinc / ORS
Pneumonia
- Average monthly pneumonia cases and 6-monthly trend
- Proportion of OPD visits related to pneumonia
- Percentage of U5s with pneumonia treated with antibiotics
Nutrition
- Number / Percentage of U5s screened for nutrition status
- Percentage of MUAC Red / Yellow / Green
- Percentage of SAM / MAM referred for management
- Percentage of children U5 provided with Vitamin A
- Percentage of children U5 dewormed

Maternal Health
ANC
- ANC coverage
- ANC 4+ completion rate
- ANC dropout rate
Delivery
- Delivery coverage in HFs
- Skilled birth attendant delivery rate
- Percentage of births monitored with partograph
- Low birth weight rate
- Post-delivery uterotonic use rate
PNC
- PNC-1 (0-48 hours) coverage
- Breastfeeding within one hour rate
FP
- Percentage of WCBAs counselled on modern BS
- Modern BS new / repeat user rate
- Percentage of HFs with stock-out of any FP commodities

Individual Diseases
HIV
- Clients counselled on HIV incl. VCT, OPD, ANC, Delivery
- HIV tests performed
- HIV test positivity rate
- ART coverage
TB
- TB case rate per 100,000 population
- TB new pulmonary cases bacteriologically confirmed
- TB treatment success rate
- TB drug-resistant cases per 100,000 population
Malaria
- Malaria incidence rate (confirmed) per 100,000 population
- Malaria suspect / fever test rate
- Malaria RDT testing rate
- Malaria test positivity rate
- Malaria ACT treatment rate
- Malaria ANC LLIN distribution rate

Miscellaneous
- Progress towards program targets: monthly progress, progress trend, cumulative progress

Where & How (all program areas except HMIS): Available from dashboards and dataset reports
6.0 RESPONSIBILITIES FOR INDIVIDUAL STAFF CATEGORIES
2. Electronic patient data capture systems should reflect the layout and design
of the hard copy documents (registers) to minimize data entry errors.
3. Electronic systems should include data validation, range checks and
consistency checks to ensure good data quality.
4. The system should be secure to prevent unauthorized access to the data.
5. The system should have automated levels of access and privileges allowing the
customization of relevant functions for individual users.
6. The system should automatically record user IDs and date / time stamps at the
time of data entry to enable data audits.
8.0 ANNEXES
ANNEX 8.1: DISTRICT DATA LOG
Name of District: ____________          Month & Year of report: ____________

One row per health facility (include all HFs in the district), with the
following columns:

Health Facility Name
Expected Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Received Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Complete Reports (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Totals: Received / Missing / Complete / Incomplete
ANNEX 8.1 b: DISTRICT PERFORMANCE
Name of District: ____________          Name of Region: ____________
Number of registered health facilities in district: ____
Number of operational health facilities in the district: ____

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a facility did not submit any reports
ANNEX 8.2: REGIONAL DATA LOG
Name of Region: ____________            Month & Year of report: ____________
Number of functional health facilities: ____
Number of HFs that submitted reports: ____
Number of health facilities with missing reports: ____

One row per district (include all districts in the region), with the
following columns:

District Name
Expected Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Received Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Complete Reports (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Totals: Received / Missing / Complete / Incomplete
ANNEX 8.2 b: REGIONAL PERFORMANCE
Name of Region: ____________
Number of Districts in Region: ____

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a district did not submit any reports
ANNEX 8.3: NATIONAL DATA LOG
Month & Year of report: ____________
Number of registered health facilities: ____
Number of functional health facilities: ____
Number of HFs that submitted reports: ____
Number of regions / districts / HFs with missing reports: ____
Number of health facilities with incomplete reports: ____

One row per region (include all regions), with the following columns:

Region Name
Expected Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Received Reports for calendar month (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Complete Reports (Yes = 1 / No = 0): MF-01 to MF-12, QF-14, QF-15
Totals: Received / Missing / Complete / Incomplete
ANNEX 8.3 b: NATIONAL PERFORMANCE
Number of Regions: ____

INSTRUCTIONS: Please type T for timely reports / L for late reports / N if a region did not submit any reports
ANNEX 8.4: DQA TEMPLATE
NAME OF HEALTH FACILITY
DATE OF DQA
Please complete the table below using values obtained from HF registers, monthly
summary forms and the DHIS. Indicate the analysis (%) in the last 2 columns if values
from all 3 sources are available.
Timeliness
Reporting Rate
HMIS
Validation Passes (DHIS validation rules)
DQA Score supportive supervision visits
SUMMARY
Total indicators selected
In the absence of a complete DQA, DHIS validation rules and the reporting & timeliness rates
should be used to assess data quality. The score is then the average of the two values.
During the HF DQA, please visually inspect all the forms submitted, identify the proportion of
missing data elements, and check for delays in data entry based on DHIS records and the
dates recorded on the monthly summary forms.
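The fallback scoring rule above, the average of the two available rates, can be sketched as follows (the percentages are hypothetical):

```python
def fallback_dqa_score(validation_pass_rate, reporting_timeliness_rate):
    """Fallback DQA score when a full DQA could not be done: the
    average of the DHIS validation pass rate and the reporting &
    timeliness rate, both expressed as percentages."""
    return (validation_pass_rate + reporting_timeliness_rate) / 2

score = fallback_dqa_score(90.0, 80.0)
```

With a 90% validation pass rate and an 80% reporting & timeliness rate, the fallback score is 85%.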
ANNEX 8.5: DQA LOG
This log should be maintained for every HF / District / Region, and an overall one for national
level DQAs. It should be updated every time a DQA is conducted so that quality trends can
be tracked at all levels. This log should be completed and reviewed together with the DQA
template.
HF / DISTRICT / REGION
ANNEX 8.6: SCORECARD
Enter % values that were available and / or matching for every program area, and the % of
validation rules passed per program area.
The final score is the average of all entries in Column I and should be coded Red, Yellow, or
Green as shown in the table at the bottom of the page.
Note: This scorecard is used to assess the status / quality of HMIS data; however,
scorecards can also be used to assess the quality of clinical services and the overall
performance of a program.
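The averaging and traffic-light coding described above can be sketched as follows. The cut-offs used here (>= 80 Green, >= 60 Yellow, else Red) are illustrative assumptions; apply the thresholds printed at the bottom of the scorecard itself:

```python
def scorecard_colour(score):
    """Traffic-light coding for the final scorecard score (%).
    Cut-offs are illustrative: use the thresholds printed on the
    scorecard template in practice."""
    if score >= 80:
        return "Green"
    if score >= 60:
        return "Yellow"
    return "Red"

# Hypothetical Column I entries, one per program area.
entries = [85, 70, 90, 65]
final_score = sum(entries) / len(entries)   # average of Column I
colour = scorecard_colour(final_score)
```

With these entries the final score is 77.5%, which codes Yellow under the illustrative thresholds.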
ANNEX 8.7: SUPPORTIVE SUPERVISION TOOL
ANNEX 8.8: DATA VISUALISATION
ANNEX 8.9: MENTORSHIP GUIDELINES
BACKGROUND
The Somali health sector is committed to the provision of high-quality health services. The
latter depends on good-quality monitoring systems, including adequate
documentation tools, trained medical / HF staff and resources for monitoring and
supportive supervision of health programs. Ideally, supportive supervision should also
include a mentorship program which fosters the capacity of health staff to respond to
the health needs of their catchment populations.
This section is meant to serve as a first step and guide towards a mentorship
program for staff members who are part of health systems monitoring and
evaluation.
Supportive supervision and mentorship are a joint effort / partnership between the
mentor and his/her mentee to improve motivation, knowledge, skills, and
performance. Although mentorship and supportive supervision have several
commonalities, the former is generally less hierarchical and more hands-on: it builds
the skills of health workers and thereby significantly improves the capacity of the
health worker – and hence the health facility – to achieve their aims. Mentorship is a
relationship between a mentor and mentee that brings about an exchange of learning
and drives development. Required systems and skills include data collection & collation,
data validation, analysis & interpretation as well as data visualization and
dissemination.
GOALS AND OBJECTIVES OF MENTORSHIP
The objective of mentorship is to improve the knowledge & skills of health workers on:
1. Assessing data systems at health facilities.
2. The importance of data and information.
3. Simple data analysis to provide information for decision making.
4. Data visualization & presentation.
MENTOR AND MENTEE
• Contributes to the development of the organization's talent.
• Helps new staff members adjust quickly to a new role and organizational culture.
• Promotes diversity.
• Provides a broader perspective on the challenges facing staff at all levels.
• Creates a greater sense of involvement.
• Supports an innovative working environment.
Effective mentors
• Serve as a role model for effective organizational behaviors and attitudes.
• Give actionable advice and feedback.
• Resist the temptation to solve the problems of their mentees.
• Challenge the people they mentor to develop a plan for success.
• Create a foundation of support.
• Suspend judgment.
Characteristics of a Mentee
• Able to accept constructive criticism.
• Transparent and sincere.
• Asks relevant questions to improve his/her knowledge, skills, and performance.
• Keeps his/her work supervisor informed on progress.
• Documents progress made during the mentorship program.
MENTORSHIP PROCESS
Preparing for a Mentorship
• A schedule of regular meetings should be agreed between mentor and mentee,
with the approval of the facility manager in charge.
• Prepare for the topic of the planned visit e.g., hand-outs, exercises, tools.
How to mentor
• Listen & communicate effectively.
• Acknowledge achievements.
• Identify areas of strengths and weaknesses.
• Identify challenges and opportunities in the work environment.
• Identify appropriate resources and give tips on how to utilize these resources.
HMIS MENTORSHIP
How to mentor health workers on data monitoring
Data system mentorship fosters continuous capacity building of health workers on the
upkeep of high-quality data records and reports as well as their dissemination and
use.