Reviewer Resources

As a scholarly association, one of the most important things we do as members is to support each other’s scholarship through peer review and feedback. Through the strategic planning process and preparations for the 2020 conference, we realized there are several ways to enhance the conference peer review and feedback process through full participation of our ASHE community.
 
As part of the 2020 proposal submission and review process, we are excited to share these resources with you:
  1. Reviewer Guide
  2. Best Practices for a High Quality Peer Review Process Webinar
  3. Best Practices for a High Quality Peer Review Process Resources
  4. Examples of constructive, reliable reviews and unconstructive, unreliable reviews
If you have any questions about these reviewer resources or about the review process, please contact ASHE 2020 Program Committee Co-Chair Corbin Campbell, ASHE Conference Coordinator James Hines, or any member of the Program Committee.
*Note: Reviews will be assigned around May 15.

Reviewer Guide

The Reviewer Guide contains instructions on how to navigate the system as well as a variety of best practices for completing a quality review:
  • Ensuring a Quality Review
  • Notes about Reviewing Proposals
    • Volunteering and Assignments as a Reviewer
    • ASHE Conflict of Interest Policy
  • Proposal Types
    • Individual Presentations
    • Session Submission
  • Dates and Timeline
  • Notes about using the ASHE Conference Portal
    • Logging in to Review Proposals
    • Accessing your Assigned Reviews
  • Completing Your Reviews
    • Navigating The Reviewer Assignment Page
    • Reading the Proposal
    • Completing the Reviewer Worksheet
    • Submitting Your Review

Best Practices for a High Quality Peer Review Process Webinar

In this one-hour webinar, we will accomplish three objectives: 1) Share best practices in providing quality, reliable, and constructive feedback to colleagues through the peer review process for the ASHE annual conference; 2) Review key features of the ASHE proposal evaluation criteria and rating process; 3) Provide and discuss aspects of both exemplar and problematic reviews. Whether it's your first time serving as a reviewer and you want to learn more about the process or your 30th time and you want to enhance your reviewing skills, we hope you'll join us. The webinar will be recorded and available on this webpage afterward.

Live Captioning Transcript


Best Practices for a High Quality Peer Review Process Resources

To support an equitable, reliable, and quality peer review process for the ASHE conference, we synthesized recommendations on peer review based on existing research.

Reviewers’ practices are a deeply important part of creating a quality, equitable, and reliable peer review process. By agreeing to review for ASHE, we ask reviewers to commit to the practices below.

2020 ASHE Reviewers will:

  1. Skim all proposals as soon as you receive them to ensure that you have the topical, methodological, or generalist expertise to review each proposal you have been assigned. Return proposals to the ASHE co-chairs if you do not have the appropriate expertise. Note: If you have expertise in the topic and/or the method, you are an appropriate reviewer for the proposal.
  2. Attend or listen to our ASHE Reviewer Webinar. The webinar will take place on 5/15 from 3-4pm EST and will also be available as a recording. This webinar is for all 2020 ASHE reviewers (and other members who want to join). We will discuss best practices in peer review, describe the ASHE proposal rubric and the review process in detail, and learn from review examples.
  3. Read the evaluation criteria for each proposal prior to reading the proposal.  Be sure to tune your eye to the criteria for rating so that your read of the proposal will focus on key details related to the evaluation rubric.  Taking notes can help!
  4. First, fill out the numerical ratings and the comments, and then finally the “Accept/Reject” categorical rating. This will allow your final “Accept/Reject” decision to build on the evaluative criteria in your numerical ratings and the specific feedback in your comments.
  5. Provide quality comments, including both positive aspects of the proposal and areas for improvement for all proposals.  Ensure that your comments have a constructive and helpful tone.  Comments should be at least 50 words long, but quality matters most. Try to be concrete about strengths and offer substantive, constructive, and specific comments toward improvement of the manuscript and learning of the author.
  6. Review for consistency across the three forms of evaluation (numerical ratings, comments, and “Accept/Reject” categorical rating). For example, proposals that reviewers indicate as “accept” should have strong and specific positive comments and also receive high numerical ratings. If you believe a proposal is worthy of “Accept,” ensure your numerical ratings reflect an average of 4 or above. If you rate any criteria lower than others, try to explain why in your comments. (Note: while an average of 4 is useful for “tuning” your ratings, exact scores for accepted proposals will vary across sections and from year to year.)
  7. Use “Comments to the Association” for two reasons: 1) To clarify your opinion for proposals where you are unsure if it is worthy of acceptance; 2) For proposals that should be considered for an alternative format (e.g., roundtable/poster) or different section. 

Below, we also outline what the ASHE 2020 Program Committee and the ASHE office will do to support a strong peer review process.
The ASHE Office and 2020 Planning Committee will:

  1. Ensure reviewers are prepared via a webinar, clear instructions, and communications. We will highlight best practices to reviewers.
  2. Go over the evaluation rubric with reviewers in detail via webinar. Provide examples of good and bad reviews to norm expectations for the types of feedback to provide, length, and tone. Discuss ratings together to calibrate reviewer expectations.
  3. Assign a reasonable number of reviews to each reviewer, and assign each proposal to multiple reviewers.
  4. To the degree possible, ensure each proposal is reviewed by someone who has expertise in the method or topic.
  5. Allow reviewers to “return” proposals they do not feel qualified to rate to the section chair for re-assignment, as long as the return occurs more than 7 days before the deadline.
  6. Require three forms of assessment of each proposal: 1) categorical response (Accept/Reject); 2) numeric ratings of proposal criteria; and 3) comments on reviews for the author. Structure the online rubric to encourage quality comments.
  7. Facilitate a second level of review on proposals with large differences in ratings across reviewers (e.g., 5, 5, 1).
  8. Review all comments prior to sending reviews and decisions to authors. PC Chairs will review to ensure comments are professional in tone.

References and Supplemental Reading on Peer Review Processes
 
Association of American University Presses, Acquisitions Editorial Committee. (2016). AAUP Handbook: Best Practices for Peer Review.

Bornmann, L., & Daniel, H. D. (2005). Selection of research fellowship recipients by committee peer review: Reliability, fairness and predictive validity of Board of Trustees’ decisions. Scientometrics, 63, 297-320.

Bridges, D. (2009). Research quality assessment in education: impossible science, possible art?. British Educational Research Journal, 35(4), 497-517.

Capaccioni, A., & Spina, G. (2018). Guidelines for Peer Review. A Survey of International Practices. In The Evaluation of Research in Social Sciences and Humanities (pp. 55-69). Springer.

Cole, S., Cole, J. R., Rubin, L., & National Academy of Sciences (U.S.), Committee on Science and Public Policy. (1978). Peer review in the National Science Foundation: Phase one of a study, prepared for the Committee on Science and Public Policy of the National Academy of Sciences. Washington, DC: The Academy. Retrieved from https://www.nationalacademies.org/includes/SciMat.pdf

Davis, W. E., Giner-Sorolla, R., Lindsay, D. S., Lougheed, J. P., Makel, M. C., Meier, M. E., ... & Zelenski, J. M. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. Advances in Methods and Practices in Psychological Science, 1(4), 556-573.

Finn, C. E. (2002). The limits of peer review. Education Week, 21, 30-34.

Hackett, E. J., & Chubin, D. E. (2003). Peer review for the 21st century: Applications to education research. Paper presented at the conference Peer Review of Education Research Grant Applications: Implications, Considerations, and Future Directions, Washington, DC, USA.

Hodgson, C. (1997). How reliable is peer review? An examination of operating grant proposals simultaneously submitted to two similar peer review systems. Journal of Clinical Epidemiology, 50, 1189-1195.

Jayasinghe, U. W., Marsh, H. W., & Bond, N. (2003). A multilevel cross-classified modelling approach to peer review of grant proposals: The effects of assessor and researcher attributes on assessor ratings. Journal of the Royal Statistical Society: Statistics in Society, 166, 279-300.

Klahr, D. (1985). Insiders, outsiders, and efficiency in a National Science Foundation panel. American Psychologist, 40, 148-154.

Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13, 51-62.

Lubienski, S. T. (2020). How to Review Conference Proposals (and Why You Should Bother). Educational Researcher, 49(1), 64-67.

Marsh, H. W., Jayasinghe, U. W., & Bond, N. W. (2008). Improving the peer-review process for grant applications: Reliability, validity, bias, and generalizability. American Psychologist, 63(3), 160.

Nylenna, M., Riis, P., & Karlsson, Y. (1994). Multiple blinded reviews of the same two manuscripts: Effects of referee characteristics and publication language. Journal of the American Medical Association, 272, 149-151.

Schröter, D. C., Coryn, C. L., & Montrosse, B. E. (2007). Peer review of abstracts submitted to the Graduate Student and New Evaluators Topical Interest Group for the 2006 American Evaluation Association conference. Journal of MultiDisciplinary Evaluation, 5(9), 25-40.

Sciullo, N. J., & Duncan, M. (2019). Professionalizing Peer Review: Suggestions for a More Ethical and Pedagogical Review Process. Journal of Scholarly Publishing, 50(4), 248-264.

Silbiger, N. J., & Stubler, A. D. (2019). Unprofessional peer reviews disproportionately harm underrepresented groups in STEM. PeerJ, 7, e8247.

Wenneras, C., & Wold, A. (1997). Nepotism and sexism in peer-review. Nature, 387, 341-343.


Examples

Click here for examples of constructive, reliable reviews and unconstructive, unreliable reviews. These examples are not real reviews.