Measure Review, Revision, and Resubmission Processes
The INEE Measurement Library (ML) peer review process upholds fairness, transparency, and professionalism in evaluating measures submitted for consideration in the library. To ensure the continued publication of high-quality measures and their accompanying materials, the INEE ML outlines criteria that inform the revision and resubmission process.
Measure Review
All submissions are initially reviewed by the INEE Secretariat to ensure that they are complete and adhere to submission requirements before being sent out to reviewers. If required information is missing, the INEE Secretariat may reach out to the developers to request it. Incomplete submissions will not be reviewed without the timely resubmission of the required materials. Next, INEE shares submitted measures with members of the INEE ML Reference Group (MLRG) for peer evaluation. The INEE MLRG consists of independent measurement experts who volunteer their time biannually to support a rigorous peer-review process. Each tool is evaluated by four reviewers: two psychometric experts, who evaluate the psychometric properties of the measure, and two field experts, who evaluate the usability and adaptability of the accompanying training materials in other contexts/settings.
The Measure Review Criteria outline the three ratings by which the INEE ML classifies submitted measures based on the quality and types of validity and reliability evidence available: the seedling, the young tree, and the fully-grown tree.
These ratings will be made in consideration of the proposed purpose(s) for which the measure was developed and evaluated. The ML Psychometric Rubric is a guideline for both the developers and reviewers for determining the types of psychometric evidence fit for a variety of purposes.
The MLRG reviewers ensure adherence to these requirements when assessing each submission. For instance, a tool submitted for program evaluation should demonstrate the types of validity and reliability evidence acceptable for that purpose. Where the evidence would support the measure at a higher rating under a different purpose, the reviewers and MLRG Co-Chairs may offer the developers two options: acceptance at a lower rating for the proposed purpose (e.g., seedling for national monitoring purposes), or acceptance at a higher rating for a different purpose (e.g., young tree for formative assessment purposes), along with feedback and recommendations for meeting the criteria for each purpose.
Communication of Review Decision
Upon completion of the MLRG review, INEE will consolidate the feedback and communicate to the developers the MLRG's decisions on acceptance, rating, and the purposes for which the measure's evidence is adequate.
Revision and Resubmission: Accepted Measures
Before a measure is fully accepted for publication, the developer is expected to address the comments and concerns raised through the peer review process and communicated to them by INEE. Depending on the nature of the feedback, the review report will outline either or both of the following options for addressing it:
1. Revise the measure to be published within the tool rating given by the MLRG
The first section of the review report outlines the minimum mandatory revisions to be undertaken before the measure is fully accepted by the INEE ML for publication. If the developer chooses to proceed with the revisions for the rating and purpose(s) recommended by the reviewers, they will have 14 days to address the reviewers' comments. These comments may consist of:
- Additional information and explanation about the measure and/or the study and evidence already presented in the submitted tool and evidence report.
- Revisions to the submitted training materials that would facilitate use and adaptation of the training in future contexts.
INEE and the MLRG Co-Chairs will review the revised materials and proceed to publish the measure if the revisions are satisfactory.
2. Re-submit the measure for re-evaluation to be considered for a higher rating or for a different purpose
If the MLRG considers that the measure and its evidence have the potential to be accepted at a higher rating and/or used for a different purpose if additional evidence and/or information is provided, the MLRG will offer optional recommendations and feedback for the developers to consider. These recommendations may include (but are not limited to):
- Additional evidence of either the reliability or validity of the measure, or both.
- Additional training materials that would align with additional evidence and other purpose(s).
If the developer wishes their tool to be reconsidered for a higher rating, they may choose to resubmit it for re-evaluation in the next round of the review process. While the developer is not obliged to address the comments in this section for the measure to be published at the given rating, choosing to do so will require resubmission for re-evaluation through the peer review process. Resubmissions for re-evaluation are accepted in the subsequent call for ML submissions.
It is also acceptable for the developer to proceed with the first option, revising the measure for the rating and purpose(s) recommended by the reviewers so that the tool is published on the ML, and later resubmit the measure with additional evidence and training materials for re-evaluation at a higher rating and/or for different purposes.