Analysis Standard 1: Assessment

Timely education assessments of the emergency situation are conducted in a holistic, transparent and participatory manner.

Key Actions

An initial rapid education assessment is undertaken as soon as possible, taking into account security and safety

The assessment collects disaggregated data that identify local perceptions of the purpose and relevance of education, barriers to access to education and priority educational needs and activities

Local capacities, resources and strategies for learning and education are identified, prior to and during the emergency

Context analysis is conducted to ensure that education responses are appropriate, relevant and sensitive to the potential for risks and conflict

Representatives of the affected population participate in the design and implementation of data collection

A comprehensive assessment of education needs and resources for the different levels and types of education is undertaken with the participation of key stakeholders

An inter-agency coordination committee coordinates assessments with other sectors and relevant stakeholders, to avoid duplication of efforts

Guidance Notes
Timing of initial assessments

The timing of initial assessments should take into consideration the security and safety of the assessment team and the affected population. The assessment should take place as soon as possible after an emergency and should assess all types of education at all affected locations, if feasible. Following the initial assessment, the data should be updated regularly through monitoring and evaluation. This includes a review of programme achievements, constraints and unmet needs. When overall assessments cannot be conducted immediately, partial initial assessments can gather information to inform immediate action.

Disaggregated data

Assessments should collect disaggregated data to inform the education response and assess continuing risk from conflict or disaster. ‘Disaggregated’ means that the information is separated into its component parts, in this case analysed by sex and age group. The data identify educational capacities, resources, vulnerabilities, gaps and challenges to upholding the right to education for all affected groups. Assessments and field visits by education and other emergency response providers should be coordinated to avoid the inefficient use of resources and over-assessment of certain affected populations or issues (see also Community participation standard 2 and Coordination standard 1).
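As a rough illustration of what disaggregation by sex and age group can look like in practice, here is a minimal Python sketch. The record fields and the age bands used are assumptions made for the example, not part of any standard assessment tool.

```python
# Minimal sketch of disaggregating assessment records by sex and age group.
# The record fields and the age bands here are illustrative assumptions,
# not part of any standard assessment tool.
from collections import Counter

def disaggregate(records, age_bands=((5, 11), (12, 17), (18, 24))):
    """Count records per (sex, age band) so that gaps affecting specific
    groups become visible in the analysis."""
    counts = Counter()
    for record in records:
        band = next((f"{lo}-{hi}" for lo, hi in age_bands
                     if lo <= record["age"] <= hi), "other")
        counts[(record["sex"], band)] += 1
    return counts

records = [
    {"sex": "F", "age": 9}, {"sex": "M", "age": 9},
    {"sex": "F", "age": 14}, {"sex": "F", "age": 15},
]
counts = disaggregate(records)
print(counts[("F", "12-17")])  # 2 girls aged 12-17 in the sample
```

Tabulating counts per group in this way makes it immediately visible when, for example, adolescent girls are missing from enrolment figures.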

Assessments should make maximum use of existing sources of information. Primary data collection should be limited to what is required to fill gaps in knowledge and inform education stakeholders’ critical decisions. Where access is restricted, alternative strategies for collecting information can be explored. These may include contacting local leaders and community networks, and gathering secondary data from other sectors or pre-crisis databases. Pre-crisis data will also provide a measure against which to compare the emergency situation.

Data collection tools should be standardised in-country to facilitate the coordination of projects and to minimise the demands on people providing information. Where possible, assessment tools should be developed and agreed upon by all stakeholders prior to an emergency as part of preparedness planning. The tools should provide space for additional information deemed important by local respondents.

Assessment teams should include members of the affected community. They should be gender-balanced in order to capture more effectively the experiences, needs, concerns and capacities of male and female learners, teachers and other educational personnel, and parents and guardians. Appropriate authorities should be consulted.

Ethical considerations, including the basic principles of respect and non-discrimination, should underpin assessment. Collecting information can put people at risk because of the sensitivity of the information or simply because they have participated in the process (see also ‘Participants in assessments’ below). Those collecting information have a responsibility to protect participants and must inform them of the following:

  • the purpose of collecting the data;
  • the right not to participate in the data collection process, or to withdraw at any time without negative effects;
  • the right to confidentiality and anonymity.
Analysis of the context

Analysis of the context, including disaster risk and conflict analysis, helps to ensure that education responses are appropriate, relevant and sensitive to the potential for conflict and disaster.

Risk analysis considers all aspects of the context that affect the health, security, and safety of learners. This helps to ensure that education is a protective measure rather than a risk factor. Risk analysis assesses risks to education, which may include:

  • insecurity, poor governance and corruption;
  • public health issues such as the prevalence of communicable diseases;
  • other social, economic, physical and environmental factors, including industrial hazards such as toxic gas releases and chemical spills;
  • risks specific to sex, age, disability, ethnic background and other factors relevant in the context.

Conflict analysis assesses the presence or risk of violent conflict to try to ensure that education interventions do not exacerbate underlying inequalities or conflict. This is necessary in both conflict and disaster situations. Conflict analysis asks questions about:

  • the actors who are directly or indirectly engaged in conflict, are affected by conflict, or are at risk of being affected;
  • the causes of actual or potential conflict and the factors that contribute to grievances;
  • the interactions between the actors, including education stakeholders, and causes of conflict.

Conflict analyses of specific regions or countries are often available from research organisations. They may need to be reconsidered from the perspective of education. If existing analyses are not available or applicable, a conflict analysis may be carried out by means of a workshop in the affected area or a desk study. Education stakeholders should advocate for appropriate agencies to undertake comprehensive conflict analyses, including education-specific information, and to share the findings with all interested parties.

A risk analysis report proposes strategies for risk management of natural and human-made hazards, including conflict. Strategies may include prevention, mitigation, preparedness, response, reconstruction and rehabilitation. For example, schools or learning spaces may be required to have contingency and security plans to prevent, mitigate and respond to emergencies. They could also prepare a risk map showing potential threats and highlighting factors that affect learners’ vulnerability and resilience.

Risk analysis is complemented by assessment of community resilience and local coping efforts, including resources and capacities. Knowledge, skills and capacities for disaster mitigation, preparedness and recovery are assessed and strengthened before as well as after an emergency, if possible, through preparedness and mitigation activities.

Data validity and methods of data analysis

Data analyses should clearly state:

  • the indicators;
  • data sources;
  • methods of collection;
  • data collectors;
  • data analysis procedures.

Where there are security risks for data collectors, the types of organisations involved in data collection and not the names of individual data collectors should be referenced. Limitations of the data collection or analysis that may affect the reliability of the findings, or their relevance to other situations, should be noted. For example, data may be made unreliable by respondents who inflate enrolment or attendance figures to maximise resource allocations or to avoid blame. It should also be noted if certain groups or issues are not addressed by programmes and monitoring systems.

In order to minimise bias, data should be drawn from several sources and compared. This technique strengthens the validity of data. The most affected groups, including male and female children and youth, should be consulted before conclusions are drawn. Local perceptions and knowledge should be central to the analysis to avoid a humanitarian response based on the perceptions and priorities of people from outside.
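One simple way to operationalise this comparison is to place the same figure from each source side by side and flag large gaps for follow-up. The sketch below is a hypothetical illustration: the source names, figures and tolerance are invented for the example, not drawn from any real assessment.

```python
# Hypothetical sketch of a cross-source consistency check: compare the same
# figure (e.g. enrolment) as reported by several sources and flag large gaps.
def flag_discrepancies(figures, tolerance=0.10):
    """Return True if the largest and smallest reported values differ by
    more than `tolerance`, expressed as a fraction of the largest value."""
    values = list(figures.values())
    lo, hi = min(values), max(values)
    return (hi - lo) / hi > tolerance

# Invented example figures for one school's enrolment.
enrolment = {"school records": 480, "household survey": 395, "EMIS": 470}
print(flag_discrepancies(enrolment))  # True: sources differ by about 18%
```

A flagged discrepancy is a prompt for follow-up with the most affected groups, not a verdict on which source is right.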

Participants in assessments

Participants in assessments should include education authorities and representatives of the affected population, including vulnerable groups. The participation of these groups in data and information collection, analysis, management and dissemination may be limited by difficult circumstances during the initial assessment. It should increase as the context becomes more stable. Assessments should facilitate communication in all languages of the community, including the use of sign language and Braille, where applicable.

Collaboration within the education sector and with other sectors

Collaboration within the education sector and with other sectors is crucial in maximising the quality, comprehensiveness and usefulness of assessments. Education stakeholders should harmonise needs assessments by conducting joint assessments or by coordinating assessments to avoid duplication by different agencies. Coordinated assessments produce stronger evidence of the impact of emergencies and facilitate coherent responses. They improve the accountability of humanitarian stakeholders by encouraging the sharing of information.

The education sector should work with other sectors to inform the education response regarding threats, risks and availability of services. This may include work with:

  • the health sector to obtain epidemiology data and information about threats of epidemics and to learn about available basic health services, including services for sexual and reproductive health and HIV prevention, treatment, care and support;
  • the protection sector to learn about the risks related to gender-based and sexual violence, orphans and other vulnerable populations within the community; the barriers to education; and the available social and psychosocial support services;
  • the nutrition sector to learn about school-based, community-based and other nutrition services;
  • the shelter and camp management sectors to coordinate safe and appropriate location, construction/re-construction of and access to learning and recreation facilities; and the provision of non-food items necessary for school facilities;
  • the water and sanitation sector to ensure that reliable water supply and appropriate sanitation are available at learning sites;
  • the logistics sector to organise procurement and delivery of books and other supplies.
Education and psychosocial needs

Disaggregated data on education and psychosocial needs and resources should be collected in general needs assessments. Assessment team members with local knowledge can support these aspects of assessments. Agencies should commit resources, staff and organisational capacity to carry them out.

Assessment findings

Assessment findings should be made available as soon as possible so that education activities can be planned. Pre-crisis data and post-crisis assessments that identify resource and education needs and/or violations or fulfilment of education rights by education authorities, NGOs, humanitarian agencies and the local community should also be shared.

Education authorities at the local or national level should coordinate the sharing of assessment findings. If such authorities lack capacity to do this, an international lead actor, such as the education sector coordination committee or the Education Cluster, can manage this process. The presentation of data in assessment findings should be standardised if possible so that the information can be used easily.

Supporting Resources

Child Rights Situational Analysis Guidelines
Save the Children, Report, 31 July 2013

These guidelines provide the tools and resources needed to develop a Child Rights Situation Analysis (CRSA). The document describes the components and sequencing that will result in a good CRSA, and includes tips on how to manage the process in particularly complex or challenging situations.

Conflict Analysis: Topic Guide
Australian Agency for International Development (AusAID), Manual/Handbook/Guide, 1 May 2017

This topic guide focuses specifically on the systematic approaches and tools for conflict analysis developed for policy and practice. It draws on reflective sections in conflict analysis toolkits and, where available, on policy, practitioner and academic texts that critique the toolkits.

Early Grade Reading Assessment (EGRA) Toolkit
US Agency for International Development (USAID) and World Bank, Toolkit, 1 March 2016

In the interest of consolidating diverse experiences and developing a reasonably standardized approach to assessing children’s early reading acquisition, this toolkit, or user manual, serves as a guide for countries beginning to work with EGRA in such areas as local adaptation of the instrument, fieldwork, and analysis of results.

Guide to Conflict Analysis
United Nations Children's Fund (UNICEF), Manual/Handbook/Guide, 1 November 2016

This guide is a tool for UNICEF staff and leadership to understand, situate and operationalize conflict analysis in UNICEF programme planning and implementation.

Humanitarian Needs Assessment: The Good Enough Guide
Norwegian Refugee Council (NRC), Manual/Handbook/Guide, 1 January 2014

The Assessment Capacities Project (ACAPS) and the Emergency Capacity Building Project (ECB) produced this guide to fill the gap that existed for a practical resource pulling together the main lessons learned from various assessment initiatives and experiences.

Matrix Analyzing Ways of Measuring Education Quality and Estimating School-age Population
UNESCO International Institute for Education Planning (UNESCO-IIEP), Manual/Handbook/Guide, 1 January 2010

This chapter provides two tools useful for monitoring: a matrix analyzing different ways of defining and measuring education quality (p. 126) and two methods for estimating the school-age population (p. 132). It is extracted from UNESCO-IIEP's Guidebook for Planning Education in Emergencies and Reconstruction.

Questionnaire Design: How to design a questionnaire for needs assessments in humanitarian emergencies
Technical Note, 1 July 2016

The brief starts with an explanation of the main purpose of a questionnaire and the principles that should be followed to meet these objectives. It then discusses the ten steps of questionnaire development, and concludes with sections on what to keep in mind when designing a questionnaire and individual questions.


Foundational Standards: Indicator Framework

Community Participation — Participation (FDN/Community Participation Std 1)
Community members participate actively, transparently, and without discrimination in analysis, planning, design, implementation, monitoring, and evaluation of education responses.

1.1 Percentage of parents actively participating in the conception and implementation of education in emergencies services
  • Numerator: Number of parents consulted
  • Denominator: Number of parents
  • Target: To be defined by program
  • Disaggregation: Gender
  • Source of indicator: Based on OCHA Indicator Registry
  • Source of data: Program documentation
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: All stages

1.2 Percentage of parents satisfied with the quality and appropriateness of response at the end of the project
  • Numerator: Number of parents satisfied with the quality and appropriateness of response at the end of the project
  • Denominator: Number of parents
  • Target: 100%
  • Disaggregation: NA
  • Source of indicator: Based on OCHA Indicator Registry
  • Source of data: Program documentation
  • Available tool: Tool required
  • Crisis phase: All stages

Community Participation — Resources (FDN/Community Participation Std 2)
Community resources are identified, mobilized and used to implement age-appropriate learning opportunities.

1.3 Analysis of opportunity to use local resources is carried out and acted on
  • Measure: Scale 1-5 (1 = low, 5 = high)
  • Target: 5
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: Program/procurement documentation
  • Available tool: Tool required
  • Crisis phase: All stages

Coordination — Coordination (FDN/Coordination Std 1)
Coordination mechanisms for education are in place to support stakeholders working to ensure access to and continuity of quality education.

1.4 Percentage of regular relevant coordination mechanism (i.e. Education Cluster, EiE Working Group (EiEWG), Local Education Group (LEG)) meetings attended by program team
  • Numerator: Number of regular relevant coordination mechanism meetings attended by program team
  • Denominator: Number of regular relevant coordination mechanism meetings held during organizational presence
  • Target: 100%
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: Meeting records
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: All stages

Analysis — Assessment (FDN/Analysis Std 1)
Timely education assessments of the emergency situation are conducted in a holistic, transparent, and participatory manner.

1.5 Percentage of education needs assessments, carried out by the relevant coordinating body, that the program has participated in
  • Clarification: These include initial rapid and ongoing/rolling assessments
  • Numerator: Number of assessments the organization contributed to
  • Denominator: Number of possible assessments the organization could have contributed to
  • Target: 100%
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: Assessment records
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: All stages

Analysis — Response Strategies (FDN/Analysis Std 2)
Inclusive education response strategies include a clear description of the context, barriers to the right to education, and strategies to overcome those barriers.

1.6 Strength of analysis of context, of barriers to the right to education, and of strategies to overcome those barriers
  • Measure: Scale 1-5 (1 = low, 5 = high)
  • Target: 5
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: Program documentation
  • Available tool: Tool required
  • Crisis phase: All stages

Analysis — Monitoring (FDN/Analysis Std 3)
Regular monitoring of education response activities and the evolving learning needs of the affected population is carried out.

1.7 Percentage of education needs assessments carried out in a defined time period
  • Clarification: Frequency to be defined by the organization; monitoring measures should be relevant to the desired program outcomes
  • Numerator: Number of education needs assessments carried out per year
  • Denominator: Number of education needs assessments required per year
  • Target: 100%
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: M&E plans and results
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: During program implementation

Analysis — Evaluation (FDN/Analysis Std 4)
Systematic and impartial evaluations improve education response activities and enhance accountability.

1.8 Number of evaluations carried out
  • Target: NA
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: M&E plans and results
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: Program completion

1.9 Percentage of evaluations shared with parents
  • Numerator: Number of evaluations shared with parents
  • Denominator: Number of evaluations
  • Target: 100%
  • Disaggregation: NA
  • Source of indicator: New
  • Source of data: M&E plans and results
  • Available tool: No tool required; INEE MS and indicator definitions sufficient
  • Crisis phase: Program completion
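The percentage indicators above all follow the same numerator/denominator pattern. As a minimal sketch of how such an indicator can be computed, assuming hypothetical figures and a function name that is not part of the INEE framework:

```python
# Illustrative sketch of computing a percentage indicator such as 1.4
# (coordination meetings attended). The function name and the figures are
# hypothetical examples, not part of the INEE indicator framework itself.
from typing import Optional

def percentage_indicator(numerator: int, denominator: int) -> Optional[float]:
    """Return the indicator value in percent, or None when it is undefined."""
    if denominator == 0:
        return None  # e.g. no coordination meetings were held during presence
    return 100.0 * numerator / denominator

# Indicator 1.4: meetings attended by the program team vs. meetings held.
attended, held = 9, 12
value = percentage_indicator(attended, held)
print(f"{value:.0f}% of meetings attended (target: 100%)")  # 75% ...
```

Guarding the zero-denominator case matters in practice: several of the indicators above (e.g. 1.5) have denominators that can legitimately be zero early in a response.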