Standard 6: Monitoring
There is regular monitoring of education response activities and of the evolving learning needs of the people affected.
1. Regular monitoring: Establish effective systems to regularly monitor education response activities from the onset of an emergency through to recovery.
2. Safety and security: Monitor education response activities to ensure the safety and security of all learners and teachers and other education personnel.
3. Participation in monitoring: Regularly consult and involve representatives of the community affected in monitoring activities and provide opportunities for training in data collection methodologies, as requested.
4. Education management information system: Maintain education management information systems so it is possible to compile disaggregated education data and use it to guide the education response.
5. Learning outcomes: Monitor learning outcomes regularly and use them to guide program design.
6. Monitoring findings: Analyze and share education data regularly with all stakeholders, especially the communities affected, and marginalized and vulnerable groups.
Monitoring is an ongoing process that regularly measures progress toward the goals and objectives of education programs. It also helps to determine whether programs are meeting the changing education needs of the people affected and responding to the changing context. Monitoring can happen at the response or program level. Stakeholders at each level can monitor inputs, outputs, activities, outcomes, and, where possible, impacts. Monitoring can be used to do the following:
- Track the coverage and quality of programs
- Identify possible areas for improvement
- Determine information flow and related roles and responsibilities
- Contribute to knowledge sharing and learning between organizations and programs
- Promote accountability at different levels, such as when gathering feedback from the people affected or reporting the responsible use of donor funds
For monitoring at the response level, the inter-agency coordination mechanism should coordinate, collect, and share information on the collective education response. Monitoring the quality and coverage of education responses should be a multi-sectoral, inter-agency effort. Education authorities and humanitarian and development actors should collaborate when developing monitoring plans, tools, and indicators. The 4Ws/5Ws (who is doing what, where, when, and for whom) are typically used to monitor a response at the cluster or sector level, but agencies should also have their own internal monitoring plans and tools. During the acute or early stages of a response, regular monitoring can be challenging. In the early stages, it may be necessary to adopt a light and flexible approach to monitoring that focuses on priority information. This may include the changing emergency context and needs, the quantity of goods and services delivered, or how they are being used. Alternative strategies might include collecting data remotely instead of in person, or using paper-based collection tools instead of electronic ones. As the situation becomes more stable, monitoring teams can shift to a more structured approach, including real-time monitoring that focuses on program outcomes or objectives.
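As an illustration only, the sketch below shows one way a coordination team might structure 4W/5W records so they can be filtered and summarized; the field names and example values are assumptions, not part of any cluster's official reporting template.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class FiveWRecord:
    """One 4W/5W row: who is doing what, where, when, and for whom (illustrative fields)."""
    who: str        # implementing organization
    what: str       # activity, e.g. "temporary learning spaces"
    where: str      # admin area or site name
    when: str       # reporting month, e.g. "2024-03"
    for_whom: str   # target group, e.g. "IDP children 6-11"
    reached: int    # people reached this period

def summarize_by_area(records: list[FiveWRecord]) -> Counter:
    """Total people reached per admin area, a typical coverage view."""
    totals: Counter = Counter()
    for r in records:
        totals[r.where] += r.reached
    return totals

# Hypothetical partner submissions for one reporting month
records = [
    FiveWRecord("NGO A", "temporary learning spaces", "District 1", "2024-03", "IDP children 6-11", 420),
    FiveWRecord("NGO B", "teacher training", "District 2", "2024-03", "volunteer teachers", 35),
]
print(summarize_by_area(records))  # Counter({'District 1': 420, 'District 2': 35})
```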
Monitoring at the program level can show what planned and unplanned effects education programs have. This is important information, as it can prevent a program from unintentionally increasing marginalization, discrimination, conflict, or the impact of natural and climate change-induced hazards. It can be useful to collect various types of information from schools and other education programs on a sample basis, for example, disaggregated data on school enrollment and dropout. Using both announced and unannounced monitoring visits can improve the validity of the data and give a quick overview of any needs and challenges.
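As a minimal sketch of the kind of disaggregation described above, the example below computes dropout rates by sex from a small sample of school records; the school names and figures are hypothetical.

```python
# Hypothetical sample records from monitored schools: (school, sex, enrolled, dropped_out)
sample = [
    ("School A", "female", 120, 18),
    ("School A", "male", 130, 9),
    ("School B", "female", 95, 20),
    ("School B", "male", 110, 12),
]

# Aggregate enrollment and dropout by sex, then report a dropout rate per group
totals = {}
for _school, sex, enrolled, dropped in sample:
    e, d = totals.get(sex, (0, 0))
    totals[sex] = (e + enrolled, d + dropped)

for sex, (enrolled, dropped) in totals.items():
    rate = 100 * dropped / enrolled
    print(f"{sex}: {enrolled} enrolled, {dropped} dropped out ({rate:.1f}%)")
```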
In cases where cash and voucher assistance (CVA) is being used, the following information can be collected:
- Whether the modality and payment mechanisms are still appropriate
- Any protection concerns linked to the program
- The prices of key education-related items in local marketplaces, which help to confirm that the value of assistance is still appropriate (a simple check is sketched after this list)
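As a hedged illustration of the price-monitoring point above, a simple check compares the current cost of a basic set of education items in local markets with the value of the cash transfer; the item list, prices, and transfer value are all hypothetical.

```python
# Hypothetical local market prices (in local currency) for key education items
market_basket = {
    "exercise books (x5)": 2500,
    "pens and pencils": 800,
    "school bag": 4200,
    "uniform": 9500,
}

transfer_value = 15000  # current per-learner cash grant (hypothetical)

basket_cost = sum(market_basket.values())
coverage = 100 * transfer_value / basket_cost

print(f"Basket cost: {basket_cost}; transfer covers {coverage:.0f}% of it")
if transfer_value < basket_cost:
    print("Flag for review: the transfer value may no longer be appropriate")
```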
Education stakeholders can monitor out-of-school children and young people through visits to a random sample of households. Household data can indicate why children do not enroll in or attend school. It can also be gathered in collaboration with child protection programs to monitor child protection risks, especially those associated with financial constraints, such as child labor. Data on ethnicity or other characteristics may be too sensitive or difficult to collect on a comprehensive basis. In such cases, sample surveys and qualitative feedback, gathered for example through informal conversations, can highlight the challenges specific groups face.
Monitoring and reporting systems can also be valuable accountability mechanisms. They can make it possible to report violations of the safety and wellbeing of learners and of teachers and other education personnel. They also make it possible to evaluate the condition of the education infrastructure. This is particularly important if there is a risk of schools being used for military purposes, being targeted for armed attack, abduction, or child recruitment into armed forces and armed groups, or being threatened by disasters associated with natural hazards, or if infrastructure increases the risk of gender-based violence. For this aspect of monitoring, education stakeholders may need to work with local and national authorities, or with UN agencies or NGOs responsible for security, justice, protection, and human rights. It is important to consider the sensitivity of the information reported and to share it in accordance with data protection policies and information sharing protocols (ISPs) (for more guidance, see Minimum Standards for Child Protection, Standard 6).
Those involved in monitoring should be able to collect information from all people affected in a culturally sensitive way. The team should be fluent in local languages, culturally aware, and include enumerators and data collectors of all genders. They should be able to accommodate the needs of different groups, including women, persons with disabilities, and members of marginalized communities, so that these groups are able to participate meaningfully. This may include providing family escorts, translation support, or transportation. The team will also need training on ethical and responsible data collection, including safeguarding and specific skills and principles related to working with children or persons with disabilities.
Representatives of the affected community, including young people and youth-led organizations, should be involved as early as possible in monitoring the effectiveness of the education programs that affect them directly. This is particularly important with non-formal education programs for specific groups, such as adolescent girls, gender diverse people, or learners with disabilities. Communities can also identify monitoring mechanisms that are already in place before new ones are created. It is important to involve members of the affected community in monitoring, but it is also important not to overburden them with reporting responsibilities or to duplicate efforts. Multiple partners may ask community members to participate in assessment and monitoring activities, so stakeholders should keep requests to a minimum. Stakeholders should coordinate their monitoring work with communities through the inter-agency coordination mechanism.
An education management information system (EMIS) is usually managed by national authorities. An EMIS makes it possible to store, aggregate, and analyze education data at all levels of an education system. It may track data at the learner or school level. It can be a useful entry point for strengthening the humanitarian-development-peacebuilding nexus and for promoting alignment, collaboration, and long-term planning. An EMIS that is operational and adapted to crises and risks should have the capability to track trends and collect comparable system-wide data that is useful for emergency preparedness, response, and recovery. A capable and relevant EMIS should be supported by and, where possible, interoperable with other data management systems, including those used by national statistics offices and line ministries.
An EMIS may be disrupted by an emergency or unable to be adapted to the data demands related to the emergency response and recovery. To develop, upgrade, or adapt a national EMIS or the equivalent, it is important for education authorities and humanitarian actors to collaborate across the data lifecycle. They should jointly identify and address needs at a national, regional, and local level, which includes the capacity to do the following:
- Define the crisis- and risk-related outcomes the EMIS is tied to, and adapt indicators so they can be measured systematically to track trends and progress over time
- Strengthen and institutionalize data management roles and processes to drive longer-term commitment, limit the fragmentation and duplication of data among stakeholders, and optimize existing capacities
- Adapt data collection tools and infrastructure according to in-country data needs and priorities, while prioritizing sustainable systems that are adapted to the context (e.g., low tech)
- Support actively using, sharing, and re-using data, including feedback mechanisms, to drive evidence-based decision-making
Start as early as possible so that a functioning EMIS is in place by the recovery phase, ideally one within a government body. When certain areas cannot be reached or EMIS coverage is incomplete, a national or regional bureau for statistics or partner agency may be able to fill the gaps and provide support. Partners should collaborate with education authorities at all levels to produce, manage, and share data, including during education needs assessments. This will provide an opportunity to align with any existing EMIS. It will also help education authorities define the priorities and opportunities for adapting the EMIS to meet data needs during crises and for managing future risk.
In some contexts, an EMIS does not capture education data on the displacement or protection status of learners, those with disabilities, or certain geographic areas. This can make it challenging to collect accurate data on whether crisis-affected learners are participating in education. Inter-agency coordination mechanisms should have monitoring systems in place to ensure that data on crisis-affected learners is being collected. This can supplement the EMIS by collecting data more frequently or by collecting data on populations that the EMIS often does not cover, such as out-of-school children and young people. It is important to consider compatibility with the EMIS so that data the coordination mechanism’s partners collect can eventually be included or used to update a national EMIS, if appropriate. In contexts where crisis-affected learners are not recognized in a national EMIS, humanitarian and development actors can advocate for and work with education authorities to strengthen inclusion.
An EMIS requires standard policies and procedures for collecting, managing, and using data in line with national or regional data protection laws. Compatible software and hardware are essential. National and local education offices and other education sub-sectors, such as national training institutes, should have compatible equipment, protocols, and standards so they can exchange information. Mobile phones with dedicated data collection software can improve data collection, but a lack of technology should not prevent data collection in under-resourced areas. It is important that EMIS-related data processes are purpose-driven, with a clear rationale for how the data is intended to benefit learners' education. This will help avoid unnecessary data production, which increases the risks of overburdening teachers, wasting school leaders' time, and exposing data subjects to risks without a clear benefit.
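The sketch below is one illustrative way, not a prescribed INEE or national format, to export school records to an agreed, documented column layout so offices using different software can still exchange and aggregate data; the field names and values are assumptions.

```python
import csv

# Agreed column layout shared across offices (hypothetical field names)
FIELDNAMES = ["school_id", "admin_area", "year", "sex", "displacement_status", "enrolled"]

records = [
    {"school_id": "SCH-001", "admin_area": "District 1", "year": 2024,
     "sex": "female", "displacement_status": "IDP", "enrolled": 64},
    {"school_id": "SCH-001", "admin_area": "District 1", "year": 2024,
     "sex": "male", "displacement_status": "host community", "enrolled": 71},
]

# Writing to a plain CSV keeps the exchange low tech and software-agnostic
with open("emis_export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerows(records)
```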
It is important to monitor learners while they are learning and after they complete or leave a program. The information collected should be comparable for learners at different stages of a program, which requires systematic planning. Quantitative and qualitative assessments can measure the following:
- Gross and fine motor skills, cognitive development, and social and emotional development in young children
- The literacy and numeracy skills of primary-age learners
- Children’s and young people’s awareness and use of key social and emotional skills
- Increases in levels of attainment through formative and summative assessments
- Learners’ access to post-literacy reading materials
- The rates at which learners continue their education (e.g., secondary or higher education)
- Employment rates for learners after completing TVET or higher education
Monitoring learners post-program provides valuable program design feedback for all types and levels of education (for more guidance, see Minimum Economic Recovery Standards, Employment Standards).
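As a hypothetical illustration of comparable measurement across program stages (the learner records and scores are invented), the sketch below computes an average attainment gain from baseline to endline assessments and a continuation rate for learners who moved on to secondary education.

```python
# Hypothetical learner records: baseline and endline literacy scores (0-100),
# plus whether the learner continued to secondary education after the program
learners = [
    {"id": "L01", "baseline": 35, "endline": 58, "continued": True},
    {"id": "L02", "baseline": 50, "endline": 61, "continued": False},
    {"id": "L03", "baseline": 28, "endline": 47, "continued": True},
]

avg_gain = sum(l["endline"] - l["baseline"] for l in learners) / len(learners)
continuation_rate = 100 * sum(l["continued"] for l in learners) / len(learners)

print(f"Average attainment gain: {avg_gain:.1f} points")
print(f"Continuation rate: {continuation_rate:.0f}%")
```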
Findings from monitoring a program or response should be shared regularly among all stakeholders involved, especially the communities affected and vulnerable groups. This should be done in an accessible and meaningful way. Education stakeholders can use feedback from the populations affected to guide changes in the response activities. Partners often have their own organizational feedback mechanisms, but education authorities and coordination mechanisms should encourage their partners to harmonize how feedback is collected and analyzed. This will in turn guide changes made to the collective education response. It is critical to determine how to communicate these changes and decisions most effectively to the communities. This is an essential aspect of accountability.
Indicators
| INEE Domain | INEE Standard | Indicator/Program Requirements | Clarification | Numerator | Denominator | Target | Disaggregation | Source of Indicator | Source of Data | Available Tool | Crisis Phase |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Foundational Standards | Community Participation: Participation (FDN/Community Participation Std 1). Community members participate actively, transparently, and without discrimination in analysis, planning, design, implementation, monitoring, and evaluation of education responses. | | | | | | | | | | |
| | | 1.1 Percentage of parents actively participating in the conception and implementation of education in emergencies services | | Number of parents consulted | Number of parents | To be defined by program | Gender | Based on OCHA Indicator Registry | Program documentation | No tool required; INEE MS and indicator definitions sufficient | All stages |
| | | 1.2 Percentage of parents satisfied with the quality and appropriateness of response at the end of the project | | Number of parents satisfied with the quality and appropriateness of response at the end of the project | Number of parents | 100% | NA | Based on OCHA Indicator Registry | Program documentation | Tool required | All stages |
| Foundational Standards | Community Participation: Resources (FDN/Community Participation Std 2). Community resources are identified, mobilized, and used to implement age-appropriate learning opportunities. | | | | | | | | | | |
| | | 1.3 Analysis of opportunity to use local resources is carried out and acted on | Scale 1-5 (1 = low, 5 = high) | | | 5 | NA | New | Program/procurement documentation | Tool required | All stages |
| Foundational Standards | Coordination: Coordination (FDN/Coordination Std 1). Coordination mechanisms for education are in place to support stakeholders working to ensure access to and continuity of quality education. | | | | | | | | | | |
| | | 1.4 Percentage of regular relevant coordination mechanism (i.e., Education Cluster, EiEWG, LEGs) meetings attended by program team | | Number of regular relevant coordination mechanism (i.e., Education Cluster, EiE Working Group (EiEWG), Local Education Group (LEG)) meetings attended by program team | Number of regular relevant coordination mechanism (i.e., Education Cluster, EiEWG, LEGs) meetings held during organizational presence | 100% | NA | New | Meeting records | No tool required; INEE MS and indicator definitions sufficient | All stages |
| Foundational Standards | Analysis: Assessment (FDN/Analysis Std 1). Timely education assessments of the emergency situation are conducted in a holistic, transparent, and participatory manner. | | | | | | | | | | |
| | | 1.5 Percentage of education needs assessments carried out by the relevant coordinating body that the program has participated in | These include initial rapid and ongoing/rolling assessments | Number of assessments the organization contributed to | Number of possible assessments the organization could have contributed to | 100% | NA | New | Assessment records | No tool required; INEE MS and indicator definitions sufficient | All stages |
| Foundational Standards | Analysis: Response Strategies (FDN/Analysis Std 2). Inclusive education response strategies include a clear description of the context, barriers to the right to education, and strategies to overcome those barriers. | | | | | | | | | | |
| | | 1.6 Strength of analysis of context, of barriers to the right to education, and of strategies to overcome those barriers | Scale 1-5 (1 = low, 5 = high) | | | 5 | NA | New | Program documentation | Tool required | All stages |
| Foundational Standards | Analysis: Monitoring (FDN/Analysis Std 3). Regular monitoring of education response activities and the evolving learning needs of the affected population is carried out. | | | | | | | | | | |
| | | 1.7 Percentage of education needs assessments carried out in a defined time period | Frequency to be defined by the organization. Monitoring measures should be relevant to the desired program outcomes | Number of education needs assessments carried out per year | Number of education needs assessments required per year | 100% | NA | New | M&E plans and results | No tool required; INEE MS and indicator definitions sufficient | During program implementation |
| Foundational Standards | Analysis: Evaluation (FDN/Analysis Std 4). Systematic and impartial evaluations improve education response activities and enhance accountability. | | | | | | | | | | |
| | | 1.8 Number of evaluations carried out | | Number of evaluations carried out | | NA | NA | New | M&E plans and results | No tool required; INEE MS and indicator definitions sufficient | Program completion |
| | | 1.9 Percentage of evaluations shared with parents | | Number of evaluations shared with parents | Number of evaluations | 100% | NA | New | M&E plans and results | No tool required; INEE MS and indicator definitions sufficient | Program completion |
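A minimal sketch of how a percentage indicator in the table above, such as 1.2, resolves to a value: divide the numerator by the denominator, express it as a percentage, and compare it against the target. The counts below are invented for illustration only.

```python
def indicator_value(numerator: int, denominator: int) -> float:
    """Percentage indicator = numerator / denominator * 100."""
    return 100 * numerator / denominator

# Indicator 1.2: parents satisfied with the quality and appropriateness of the response
parents_satisfied = 180   # hypothetical count from an end-of-project survey
parents_total = 240       # hypothetical denominator
target = 100.0            # target from the indicator table

value = indicator_value(parents_satisfied, parents_total)
print(f"Indicator 1.2 = {value:.0f}% (target {target:.0f}%)")
if value < target:
    print("Below target: review with the community and adjust the response")
```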