Analysis Standard 4: Evaluation

Systematic and impartial evaluations improve education response activities and enhance accountability

Key Actions

Regular evaluations of education response activities produce credible and transparent data and inform future education activities


All stakeholders, including representatives of the affected community and education authorities, are involved in evaluation activities


Lessons and good practices are widely shared and inform future advocacy, programmes and policies


Guidance Notes
1. Distinction between monitoring and evaluation

Monitoring and evaluation are key to achieving the goals and objectives of education programmes. Monitoring is an ongoing process that regularly measures progress towards those goals and objectives. It allows programme staff to make changes during the programme or project cycle so that activities stay on track to achieve them.

Evaluation is less frequent. It is usually conducted at the mid-point or end of a programme or project cycle and is carried out by external or independent actors. It measures outcomes and assesses whether expected results have been achieved. Evaluations can also examine whether activities were relevant to stated priorities, policies and legal instruments, and whether programmes were implemented efficiently.

2. Evaluations of education response activities

Evaluations of education response activities should use approaches and methods that produce timely and credible evidence of programme outcomes and impacts that can inform future action. ‘Impact’ is the measurable change that the programme has caused in people’s lives. Both qualitative and quantitative data disaggregated by sex and age are important. Quantitative data are about things that can be counted. They measure outcomes such as enrolment, attendance, drop-out and achievement. Qualitative data are about things that cannot be measured with numbers. They help to understand processes and explain results. Examples of qualitative data include information on what happens in schools or other learning spaces, and the reasons behind enrolment, attendance and drop-out rates.

3. Capacity building through evaluation

The evaluation budget should cover capacity-building workshops for relevant stakeholders, including education authorities, community representatives and learners. These can introduce and explain the evaluation, develop evaluation plans in a participatory and transparent way, and allow stakeholders to review and interpret findings together. Learners, teachers and other education personnel should be involved in the evaluation process to improve the accuracy of data collection and support the development of recommendations that can realistically be implemented. For example, teachers and other education personnel may add insights into practical difficulties resulting from proposed recommendations.

4. Sharing evaluation findings and lessons learned

Key findings in evaluation reports, particularly recommendations and lessons learned, should be shared in a form that is understandable to all, including community members, and should inform future work. Sensitive data need to be handled carefully so that they do not contribute to the emergency or conflict or put at risk informants who provided anonymous or sensitive information.


Supporting Resources

Manual/Handbook/Guide

Accelerated Education Programme Monitoring & Evaluation Toolkit

Published by
Accelerated Education Working Group (AEWG)

This toolkit supports the design and implementation of M&E frameworks for specific accelerated education programmes in order to strengthen learning and accountability.

Arabic
English
French
Spanish
Report

Data Collection on Education Systems: Definitions, Explanations and Instructions

Published by
Organization for Economic Co-operation and Development (OECD)
United Nations Educational, Scientific and Cultural Organization (UNESCO)

The objective of the UIS/OECD/EUROSTAT data collection on education statistics is to provide internationally comparable data on key aspects of education systems, specifically on the participation and completion of education programmes, as well as the cost and type of resources dedicated to education.

English
Framework

INEE Minimum Standards Indicator Framework

Published by
Inter-agency Network for Education in Emergencies (INEE)

This framework provides a way for education in emergencies (EiE) stakeholders to demonstrate alignment with and progress towards the INEE Minimum Standards.

Arabic
English
French
Portuguese
Spanish
Manual/Handbook/Guide

Learning to live together: Design, monitoring and evaluation of education for life skills, citizenship, peace and human rights

Published by
United Nations Educational, Scientific and Cultural Organization (UNESCO), Deutsche Gesellschaft für Technische Zusammenarbeit (GTZ)

The guide focuses on the theme of ‘learning to live together’, which is one of four competencies identified as important by the International Commission on Education for the Twenty-first Century.

English

Minimum Standards Applied to the Evaluation of Coaching Classes for Refugee Students

Published by
RET International

This document is an example of how the Refugee Education Trust used the INEE Minimum Standards to evaluate project implementation in Pakistan. The table provides a useful framework for analysis, dialogue, planning and recommendations for the future, and can be adapted for other programmes.

English

Indicators

Each indicator is listed below with its measurement fields; percentage indicators are calculated as the numerator divided by the denominator.

INEE Domain: Foundational Standards (Community Participation)
INEE Standard: Participation (FDN/Community Participation Std 1). Community members participate actively, transparently, and without discrimination in analysis, planning, design, implementation, monitoring, and evaluation of education responses.

1.1 Percentage of parents actively participating in the conception and implementation of education in emergencies services
Numerator: Number of parents consulted. Denominator: Number of parents. Target: To be defined by program. Disaggregation: Gender.
Source of indicator: Based on OCHA Indicator Registry. Source of data: Program documentation. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: All stages.

1.2 Percentage of parents satisfied with the quality and appropriateness of response at the end of the project
Numerator: Number of parents satisfied with the quality and appropriateness of response at the end of the project. Denominator: Number of parents. Target: 100%. Disaggregation: NA.
Source of indicator: Based on OCHA Indicator Registry. Source of data: Program documentation. Available tool: Tool required. Crisis phase: All stages.

INEE Standard: Resources (FDN/Community Participation Std 2). Community resources are identified, mobilized and used to implement age-appropriate learning opportunities.

1.3 Analysis of opportunity to use local resources is carried out and acted on (scale 1-5; 1 = low, 5 = high)
Target: 5. Disaggregation: NA.
Source of indicator: New. Source of data: Program/procurement documentation. Available tool: Tool required. Crisis phase: All stages.

INEE Domain: Foundational Standards (Coordination)
INEE Standard: Coordination (FDN/Coordination Std 1). Coordination mechanisms for education are in place to support stakeholders working to ensure access to and continuity of quality education.

1.4 Percentage of regular relevant coordination mechanism (i.e., Education Cluster, EiEWG, LEGs) meetings attended by program team
Numerator: Number of regular relevant coordination mechanism (i.e., Education Cluster, EiE Working Group (WG), Local Education Group (LEG)) meetings attended by program team. Denominator: Number of regular relevant coordination mechanism (i.e., Education Cluster, EiEWG, LEGs) meetings held during organizational presence. Target: 100%. Disaggregation: NA.
Source of indicator: New. Source of data: Meeting records. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: All stages.

INEE Domain: Foundational Standards (Analysis)
INEE Standard: Assessment (FDN/Analysis Std 1). Timely education assessments of the emergency situation are conducted in a holistic, transparent, and participatory manner.

1.5 Percentage of education needs assessments, carried out by the relevant coordinating body, that the program has participated in
Clarification: These include initial rapid and ongoing/rolling assessments. Numerator: Number of assessments organization contributed to. Denominator: Number of possible assessments organization could have contributed to. Target: 100%. Disaggregation: NA.
Source of indicator: New. Source of data: Assessment records. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: All stages.

INEE Standard: Response Strategies (FDN/Analysis Std 2). Inclusive education response strategies include a clear description of the context, barriers to the right to education, and strategies to overcome those barriers.

1.6 Strength of analysis of context, of barriers to the right to education, and of strategies to overcome those barriers (scale 1-5; 1 = low, 5 = high)
Target: 5. Disaggregation: NA.
Source of indicator: New. Source of data: Program documentation. Available tool: Tool required. Crisis phase: All stages.

INEE Standard: Monitoring (FDN/Analysis Std 3). Regular monitoring of education response activities and the evolving learning needs of the affected population is carried out.

1.7 Percentage of education needs assessments carried out in defined time period
Clarification: Frequency to be defined by organization. Monitoring measures should be relevant to the desired program outcomes. Numerator: Number of education needs assessments carried out per year. Denominator: Number of education needs assessments required per year. Target: 100%. Disaggregation: NA.
Source of indicator: New. Source of data: M&E plans and results. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: During program implementation.

INEE Standard: Evaluation (FDN/Analysis Std 4). Systematic and impartial evaluations improve education response activities and enhance accountability.

1.8 Number of evaluations carried out
Numerator: Number of evaluations carried out. Target: NA. Disaggregation: NA.
Source of indicator: New. Source of data: M&E plans and results. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: Program completion.

1.9 Percentage of evaluations shared with parents
Numerator: Number of evaluations shared with parents. Denominator: Number of evaluations. Target: 100%. Disaggregation: NA.
Source of indicator: New. Source of data: M&E plans and results. Available tool: No tool required; INEE MS and indicator definitions sufficient. Crisis phase: Program completion.