Reviewer Resources 2023
- Reviewer Guide
- Best Practices for a High Quality Peer Review Process Webinar
- Best Practices for a High Quality Peer Review Process Resources
- Examples of constructive and reliable & unconstructive and unreliable reviews
Reviewer Guide
The Reviewer Guide contains instructions on how to navigate the system as well as a variety of best practices for completing a quality review:
- Ensuring a Quality Review
- Notes about Reviewing Proposals
- Volunteering and Assignments as a Reviewer
- ASHE Conflict of Interest Policy
- Proposal Types
  - Individual Presentations
  - Session Submission
- Review Criteria
- Dates and Timeline
- Notes about using the ASHE Conference Portal
  - Logging in to Review Proposals
  - Accessing your Assigned Reviews
  - Completing Your Reviews
  - Navigating the Reviewer Assignment Page
  - Reading the Proposal
  - Completing the Reviewer Worksheet
  - Submitting Your Review
Best Practices for a High Quality Peer Review Process Webinar
Tuesday, May 9, 2023
11:00am - 12:00pm Central/Minneapolis Time
As members of a scholarly association, one of the most important things we do is support each other’s scholarship through peer review and feedback. In this one-hour webinar, we will accomplish four objectives: 1) Share best practices in providing quality, reliable, and constructive feedback to colleagues through the peer review process for the ASHE annual conference; 2) Review key features of the ASHE proposal evaluation criteria and rating process; 3) Present and discuss examples of both exemplary and problematic reviews; 4) Provide a 15-minute 'how-to' session on navigating the Conference Portal. Whether it's your first time serving as a reviewer and you want to learn more about the process or your 30th time and you want to enhance your reviewing skills, we hope you'll join us. The webinar will be recorded and available on this webpage afterward.
- Angela Boatman, Boston College, ASHE 2023 Program Committee Chair
- Gerardo Blanco, Boston College, ASHE 2023 Program Committee Chair
Best Practices for a High Quality Peer Review Process Resources
To support an equitable, reliable, and high-quality peer review process for the ASHE conference, we synthesized recommendations on peer review based on existing research.
Reviewers’ practices are a deeply important part of creating a quality, equitable, and reliable peer review process. By agreeing to review for ASHE, reviewers commit to the practices below.
2023 ASHE Reviewers will:
- Skim all proposals as soon as you receive them to ensure that you have the topical, methodological, or generalist expertise to review each proposal you have been assigned. Return proposals to ASHE if you do not have the appropriate expertise to review them. You can do so by completing this form: https://forms.gle/5GtKX5rbNN4yUpiX9. Note: If you have expertise in the topic AND/OR the method, you are an appropriate reviewer for the proposal.
- Attend or listen to our ASHE Reviewer Webinar. The webinar will occur on 5/9 from 11:00am-12:00pm Central/Minneapolis Time and will be available by recording. This webinar is for all 2023 ASHE reviewers (and other members who want to join). We will discuss best practices in peer review, describe the ASHE proposal rubric and the review process in detail, and learn from review examples.
- Read the evaluation criteria for each proposal prior to reading the proposal itself. Reading the rating criteria first helps your review focus on the key details in the evaluation rubric. Taking notes can help!
- First fill out the numerical ratings and the comments, and only then the final “Accept/Reject” categorical rating. This allows your “Accept/Reject” decision to build on the evaluation criteria in your numerical ratings and the specific feedback in your comments.
- Provide quality comments on every proposal, noting both positive aspects and areas for improvement. Ensure that your comments have a constructive and helpful tone. Be concrete about strengths, and offer substantive, constructive, and specific comments toward improving the proposal and supporting the author’s learning. For example, if there are concerns with the proposal, it may be helpful not only to point out what can be improved, but also to suggest how the submitter(s) might go about improving the work. Comments should be at least 50 words long, but quality matters most.
- Review for consistency across the three forms of evaluation (numerical ratings, comments, and the “Accept/Reject” categorical rating). For example, proposals you mark “Accept” should receive strong, specific positive comments and high numerical ratings. If you believe a proposal is worthy of “Accept,” ensure your numerical ratings average 4 or above. If you rate some criteria lower than others, explain why in your comments.
- Use “Comments to the Association” for two reasons: 1) to clarify your opinion on proposals where you are unsure whether they are worthy of acceptance; 2) to flag proposals that should be considered for a different section.
The ASHE Staff and Program Committee will:
- Ensure reviewers are prepared via a webinar, clear instructions, and communications. We will highlight best practices to reviewers.
- Calibrate reviewer expectations by reviewing the rubric in the webinar and providing examples of good and bad reviews to norm expectations for the type, length, and tone of feedback to provide.
- Assign a reasonable number of reviews to each reviewer, and assign each proposal to multiple reviewers.
- To the degree possible, ensure each proposal is reviewed by someone who has expertise in the method or topic.
- Allow reviewers to “return” proposals they do not feel qualified to rate to the section chair for re-assignment, as long as it is more than 7 days before the deadline.
- Require three forms of assessment of each proposal: 1) categorical response (Accept/Reject); 2) numeric ratings of proposal criteria; and 3) comments on reviews for the author. Structure the online rubric to encourage quality comments.
- Facilitate a second level of review on proposals with large differences in ratings across reviewers (e.g., 5, 5, 1).
- Review all comments prior to sending reviews and decisions to authors.
References and Supplemental Reading on Peer Review Processes
Association of American University Presses, Acquisitions Editorial Committee. (2016). AAUP Handbook: Best Practices for Peer Review.
Bornmann, L., & Daniel, H. D. (2005). Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees’ decisions. Scientometrics, 63, 297–320.
Bridges, D. (2009). Research quality assessment in education: impossible science, possible art?. British Educational Research Journal, 35(4), 497-517.
Capaccioni, A., & Spina, G. (2018). Guidelines for Peer Review. A Survey of International Practices. In The Evaluation of Research in Social Sciences and Humanities (pp. 55-69). Springer.
Cole, S., Cole, J. R., & Rubin, L. (1978). Peer review in the National Science Foundation: Phase one of a study (prepared for the Committee on Science and Public Policy of the National Academy of Sciences). Washington, DC: The Academy. Retrieved from https://www.nationalacademies.org/includes/SciMat.pdf
Davis, W. E., Giner-Sorolla, R., Lindsay, D. S., Lougheed, J. P., Makel, M. C., Meier, M. E., ... & Zelenski, J. M. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. Advances in Methods and Practices in Psychological Science, 1(4), 556-573.
Finn, C. E. (2002). The limits of peer review. Education Week, 21, 30-34.
Hackett, E. J., & Chubin, D. E. (2003). Peer review for the 21st century: Applications to education research. Paper presented at the conference Peer Review of Education Research Grant Applications: Implications, Considerations, and Future Directions, Washington, DC, USA.
Hodgson, C. (1997). How reliable is peer review? An examination of operating grant proposals simultaneously submitted to two similar peer review systems. Journal of Clinical Epidemiology, 50, 1189-1195.
Jayasinghe, U. W., Marsh, H. W., & Bond, N. (2003). A multilevel cross-classified modelling approach to peer review of grant proposals: The effects of assessor and researcher attributes on assessor ratings. Journal of the Royal Statistical Society: Series A (Statistics in Society), 166, 279-300.
Klahr, D. (1985). Insiders, outsiders, and efficiency in a National Science Foundation panel. American Psychologist, 40, 148-154.
Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13, 51-62.
Lubienski, S. T. (2020). How to Review Conference Proposals (and Why You Should Bother). Educational Researcher, 49(1), 64-67.
Marsh, H. W., Jayasinghe, U. W., & Bond, N. W. (2008). Improving the peer-review process for grant applications: Reliability, validity, bias, and generalizability. American Psychologist, 63(3), 160.
Nylenna, M., Riis, P., & Karlsson, Y. (1994). Multiple blinded reviews of the same two manuscripts: Effects of referee characteristics and publication language. Journal of the American Medical Association, 272, 149-151.
Schröter, D. C., Coryn, C. L., & Montrosse, B. E. (2007). Peer review of abstracts submitted to the Graduate Student and New Evaluators Topical Interest Group for the 2006 American Evaluation Association conference. Journal of MultiDisciplinary Evaluation, 5(9), 25-40.
Sciullo, N. J., & Duncan, M. (2019). Professionalizing Peer Review: Suggestions for a More Ethical and Pedagogical Review Process. Journal of Scholarly Publishing, 50(4), 248-264.
Silbiger, N. J., & Stubler, A. D. (2019). Unprofessional peer reviews disproportionately harm underrepresented groups in STEM. PeerJ, 7, e8247.
Wenneras, C., & Wold, A. (1997). Nepotism and sexism in peer-review. Nature, 387, 341-343.