Reviewer Resources



As a scholarly association, one of the most important things we do as members is support each other’s scholarship through peer review and feedback. As part of the 2024 proposal submission and review process, we are excited to share these resources with you:

  1. Reviewer Guide
  2. Critique with Care: Best Practices for a High Quality Peer Review Session​
  3. Best Practices for a High Quality Peer Review Process Resources
  4. Examples of Constructive, Reliable Reviews and Unconstructive, Unreliable Reviews

If you have any questions about these reviewer resources or about the review process, please contact ASHE 2024 Program Committee Co-Chairs Dr. Jonathan Pryor (jpryor@csufresno.edu) or Dr. Rosemary (Rosie) Perez (perezrj@umich.edu), or any member of the Program Committee.

Reviews will be due on Tuesday, May 28, at 4:00 p.m. Central (Minneapolis) time.


Reviewer Guide

The Reviewer Guide contains instructions on how to navigate the system as well as a variety of best practices for completing a quality review:

  • Ensuring a Quality Review
  • Notes about Reviewing Proposals
    • Volunteering and Assignments as a Reviewer
    • ASHE Conflict of Interest Policy
  • Proposal Types
    • Individual Presentations
    • Session Submissions
  • Review Criteria
  • Dates and Timeline
  • Notes about using the ASHE Conference Portal
  • Logging in to Review Proposals
  • Accessing your Assigned Reviews
  • Completing Your Reviews
    • Navigating The Reviewer Assignment Page
    • Reading the Proposal
    • Completing the Reviewer Worksheet
    • Submitting Your Review

Critique with Care: Best Practices for a High Quality Peer Review

As a scholarly association, one of the most important things we do as members is to support each other’s scholarship through peer review and feedback. In this one-hour session, we will accomplish four objectives:

  1. Share best practices in providing quality, reliable, and constructive feedback to colleagues through the peer review process for the ASHE annual conference
  2. Review key features of the ASHE proposal evaluation criteria and rating process
  3. Discuss aspects of both exemplar and problematic reviews
  4. Provide a 15-minute 'how-to' session on navigating the Conference Portal

Whether it's your first time serving as a reviewer and you want to learn more about the process or your 30th time and you want to enhance your reviewing skills, we hope you'll join us live or review the recording of the session that will be posted to this page afterward.

Moderators:

Rosemary (Rosie) Perez, PhD
(she/her)
Program Committee Co-Chair
Associate Professor, University of Michigan


Jonathan Pryor, PhD
(he/him/his)
Program Committee Co-Chair
Associate Professor, California State University, Fresno


Alicia Castillo Shrestha, MEd
(she/her)
ASHE Assistant Director, Conference and Events

 

Panelists:

Crystal E. Garcia, PhD
(she/her/hers)
Associate Professor, University of Nebraska-Lincoln


Christen Priddie, PhD
(she/hers)
Assistant Research Scientist, Indiana University Bloomington


Stephen John Quaye, PhD
(he/him/his)
Associate Dean for Excellence in Graduate and Postdoctoral Training and Professor, The Ohio State University

 

Best Practices for a High Quality Peer Review Process Resources

PDF Document Resource: How to Review Conference Proposals (and Why You Should Bother) by Sarah Theule Lubienski

To support an equitable, reliable, and quality peer review process for the ASHE conference, we synthesized recommendations on peer review based on existing research.

References and Supplemental Reading on Peer Review Processes

Association of American University Presses (2015-16) Acquisitions Editorial Committee. (2016). AAUP Handbook: Best Practices for Peer Review. Association of American University Presses.

Bornmann, L., & Daniel, H. D. (2005). Selection of research fellowship recipients by committee peer review: Reliability, fairness and predictive validity of Board of Trustees' decisions. Scientometrics, 63, 297-320.

Bridges, D. (2009). Research quality assessment in education: Impossible science, possible art? British Educational Research Journal, 35(4), 497-517.

Capaccioni, A., & Spina, G. (2018). Guidelines for Peer Review. A Survey of International Practices. In The Evaluation of Research in Social Sciences and Humanities (pp. 55-69). Springer.

Cole, S., Cole, J. R., Rubin, L., & National Academy of Sciences (U.S.) Committee on Science and Public Policy. (1978). Peer review in the National Science Foundation: Phase one of a study, prepared for the Committee on Science and Public Policy of the National Academy of Sciences. Washington, DC: The Academy. Retrieved from https://www.nationalacademies.org/includes/SciMat.pdf

Davis, W. E., Giner-Sorolla, R., Lindsay, D. S., Lougheed, J. P., Makel, M. C., Meier, M. E., ... & Zelenski, J. M. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. Advances in Methods and Practices in Psychological Science, 1(4), 556-573.

Finn, C. E. (2002). The limits of peer review. Education Week, 21, 30-34.

Hackett, E. J., & Chubin, D. E. (2003). Peer review for the 21st century: Applications to education research. Paper presented at the conference Peer Review of Education Research Grant Applications: Implications, Considerations, and Future Directions, Washington, DC, USA.

Hodgson, C. (1997). How reliable is peer review? An examination of operating grant proposals simultaneously submitted to two similar peer review systems. Journal of Clinical Epidemiology, 50, 1189-1195.

Jayasinghe, U. W., Marsh, H. W., & Bond, N. (2003). A multilevel cross-classified modelling approach to peer review of grant proposals: The effects of assessor and researcher attributes on assessor ratings. Journal of the Royal Statistical Society: Series A (Statistics in Society), 166, 279-300.

Klahr, D. (1985). Insiders, outsiders, and efficiency in a National Science Foundation panel. American Psychologist, 40, 148-154.

Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13, 51-62.

Lubienski, S. T. (2020). How to Review Conference Proposals (and Why You Should Bother). Educational Researcher, 49(1), 64-67.

Marsh, H. W., Jayasinghe, U. W., & Bond, N. W. (2008). Improving the peer-review process for grant applications: Reliability, validity, bias, and generalizability. American Psychologist, 63(3), 160.

Nylenna, M., Riis, P., & Karlsson, Y. (1994). Multiple blinded reviews of the same two manuscripts: Effects of referee characteristics and publication language. Journal of the American Medical Association, 272, 149-151.

Schröter, D. C., Coryn, C. L., & Montrosse, B. E. (2007). Peer review of abstracts submitted to the Graduate Student and New Evaluators Topical Interest Group for the 2006 American Evaluation Association conference. Journal of MultiDisciplinary Evaluation, 5(9), 25-40.

Sciullo, N. J., & Duncan, M. (2019). Professionalizing peer review: Suggestions for a more ethical and pedagogical review process. Journal of Scholarly Publishing, 50(4), 248-264.

Silbiger, N. J., & Stubler, A. D. (2019). Unprofessional peer reviews disproportionately harm underrepresented groups in STEM. PeerJ, 7, e8247.

Wenneras, C., & Wold, A. (1997). Nepotism and sexism in peer-review. Nature, 387, 341-343.


Examples

ASHE provides a resource document here with examples of constructive, reliable reviews and of unconstructive, unreliable reviews. These examples are not real reviews.

PDF Document Resource: ASHE Examples for Reviewers