The focal article by Köhler et al. (2020) provides a useful framework for promoting robust, rigorous, and reliable reviewing by developing a competency framework for reviewers. We applaud important efforts such as this one, aimed at enhancing the peer review process and creating common expectations for peer review. It is also useful that Köhler et al. tackled this complex issue from the vantage point of multiple stakeholders (e.g., academia, organizations, granting agencies).

In this commentary, we focus specifically on the peer review process in academic journals, given our reviewing and editorial experience. The author team's experience includes 157 collective years of reviewing for peer-reviewed academic journals, membership on 33 different editorial boards, service as associate editors for 15 different journals, and leadership of 7 journals as editor-in-chief, guest editor, or special issue editor.

As outlined by Köhler et al. (2020), clear standards for what constitutes an effective review have potential benefits for authors (e.g., more developmental and actionable feedback), reviewers (e.g., clearer reviewing expectations; improved reviewing knowledge and skills), and journal editors and associate editors (henceforth referred to as "editors"; e.g., greater efficiency in processing reviews and reaching decisions), as well as the field as a whole (e.g., improved scientific rigor, reliability, and reproducibility).

One of the approaches proposed in the focal article was training on the competency framework (1) in academic classes that discuss peer reviewing, (2) when providing feedback to PhD students who prepare reviews, and (3) at conferences and through the Consortium for the Advancement of Research Methods and Analysis (CARMA). As important and useful as training-based approaches are, such efforts tend to have a limited reach relative to the number of authors, reviewers, and editors who would benefit from such training.
More specifically, although there is overlap in membership, the Society for Industrial and Organizational Psychology has over 10,000 members, the Human Resource Management and Organizational Behavior Divisions of the Academy of Management have over 9,000 members, and the European Association of Work and Organizational Psychology has approximately 2,000 members. Thus, providing formal training for industrial and organizational (I-O) psychologist reviewers at such a large and growing scale seems formidable, or perhaps even prohibitive. Reviewing is also a volunteer activity, so it may be unrealistic to expect individuals to invest a great deal of time in reviewer training given their many competing obligations (e.g., for faculty, these may include teaching, conducting research, mentoring students, and applying for grants).

Author note: The third through ninth authors contributed equally and are listed alphabetically. The ideas presented in this commentary do not reflect formal policy at any journal (e.g., Journal of Applied Psychology) or professional association (e.g., American Psychological Association).