Peer Review Software and Tools: ScholarOne, Editorial Manager, and More

The infrastructure behind peer review is largely invisible to readers, but for authors, editors, and reviewers, the software platforms managing manuscript submissions shape nearly every step of the process. Understanding how these tools work — and what distinguishes one from another — helps researchers navigate submissions more effectively, ask better questions when problems arise, and recognize the limits of what any platform can solve.


What Peer Review Management Systems Actually Do

Peer review management software handles the administrative workflow that journals once managed through paper, postal mail, and individual correspondence. At their core, these systems track manuscript submissions, route files to editors, facilitate reviewer invitations and assignments, collect structured review forms, and communicate decisions back to authors.

The dominant platforms in academic publishing — ScholarOne Manuscripts (owned by Clarivate), Editorial Manager (developed by Aries Systems, now an Elsevier company), and eJournalPress — are not interchangeable products. Each has a distinct feature set, pricing model, and relationship with specific publishers or disciplinary communities. A researcher submitting to a Nature Portfolio journal will encounter a different submission interface than one submitting to a journal hosted on the Public Knowledge Project's Open Journal Systems (OJS), a free, open-source alternative widely used by smaller and society-published journals.

The distinction between the platform and the editorial process is important. Software can enforce deadlines, send automated reminders, and flag incomplete submissions, but it cannot assess reviewer qualifications, catch conflicts of interest, or compensate for editorial under-resourcing. Understanding the full peer review process helps clarify where technology ends and human judgment begins.


Major Platforms: ScholarOne, Editorial Manager, and Open Journal Systems

ScholarOne Manuscripts is among the most widely deployed platforms in life sciences publishing. It is used by journals affiliated with major publishers including Wiley, the American Chemical Society, and the Institute of Electrical and Electronics Engineers (IEEE). Its workflow tools support complex routing logic, customizable review forms, and integration with researcher identification systems such as ORCID. Authors submitting via ScholarOne will typically create an account tied to their institutional email or ORCID iD, upload manuscript files in specified formats, and complete metadata forms that may include suggested reviewers, funding sources, and conflict-of-interest disclosures.

Editorial Manager, developed by Aries Systems, is similarly entrenched in biomedical and life sciences publishing. It is used by journals affiliated with Springer Nature, Elsevier, and many professional society publishers. Editorial Manager includes a companion tool called ProduXion Manager for production workflow, and it supports integration with iThenticate for similarity checking. The platform's author interface has historically drawn criticism for complexity, particularly around file format requirements and figure submission specifications.

Open Journal Systems (OJS), maintained by the Public Knowledge Project at Simon Fraser University, occupies a different segment of the market. It is open-source, freely available, and used extensively by university-based journals, regional science publishers, and open-access titles in lower-resource settings. OJS does not have the same institutional sales infrastructure as ScholarOne or Editorial Manager, but its transparency and customizability make it a significant presence in global scientific publishing.

Other platforms worth knowing about include Manuscript Central (an earlier name for ScholarOne that still appears in older documentation), Bench>Press (used by Rockefeller University Press), and Submittable, which is more common in humanities and creative fields. The bioRxiv and medRxiv preprint servers operate separate submission infrastructure that interacts with some journal platforms through direct transfer agreements.


The Role of ORCID, CrossRef, and Persistent Identifiers

Peer review software does not operate in isolation. It connects to a broader infrastructure of scholarly identity and metadata standards. ORCID (Open Researcher and Contributor ID) is a nonprofit organization that issues persistent digital identifiers for researchers. Many journal platforms now require or strongly encourage ORCID integration during submission, because it reduces ambiguity about authorship, supports funding attribution, and enables record-keeping across name changes or institutional affiliations.
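The ambiguity reduction is partly mechanical: the final character of a 16-character ORCID iD is a checksum computed with the ISO 7064 MOD 11-2 algorithm, so a submission form can reject a mistyped iD before it ever reaches an editor. A minimal sketch in Python (the function name is ours, not part of any platform's API):

```python
def validate_orcid(orcid: str) -> bool:
    """Check the ISO 7064 MOD 11-2 checksum of a 16-character ORCID iD."""
    chars = orcid.replace("-", "")
    if len(chars) != 16:
        return False
    total = 0
    for ch in chars[:15]:          # the first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    check = "X" if result == 10 else str(result)  # 10 is written as 'X'
    return chars[15].upper() == check

# validate_orcid("0000-0002-1825-0097")  -> True  (checksum digit is 7)
# validate_orcid("0000-0002-1825-0098")  -> False (one digit off)
```

A check like this catches transcription errors, not identity fraud; confirming that an iD actually belongs to the submitting author still requires ORCID's authenticated sign-in flow.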

CrossRef, a nonprofit membership organization operated by Publishers International Linking Association (PILA), manages the digital object identifier (DOI) system for scholarly content. When a manuscript is accepted and published, the platform typically generates or registers a DOI through CrossRef. Authors and reviewers interacting with peer review software are, often without realizing it, engaging with this broader standards infrastructure.
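That infrastructure is publicly queryable: any registered DOI's metadata can be retrieved through Crossref's REST API at `api.crossref.org/works/{doi}`. A hedged sketch of the lookup (the helper name is ours; the validation regex follows a pattern Crossref has recommended for matching modern DOIs):

```python
import re
from urllib.parse import quote

# One widely used pattern for well-formed modern DOIs (case-insensitive).
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/[-._;()/:a-z0-9]+$", re.IGNORECASE)

def crossref_metadata_url(doi: str) -> str:
    """Build the Crossref REST API URL that returns a DOI's registered metadata."""
    if not DOI_PATTERN.match(doi):
        raise ValueError(f"not a well-formed DOI: {doi!r}")
    # Percent-encode the DOI so its slash survives inside the URL path.
    return "https://api.crossref.org/works/" + quote(doi, safe="")

# An HTTP GET on the returned URL yields JSON metadata for the work.
```

Submission platforms perform essentially this lookup in reverse at publication time, depositing the new article's metadata with Crossref so the DOI resolves.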

The Committee on Publication Ethics (COPE), while not a software vendor, publishes detailed guidelines that many platforms operationalize in their review forms and editorial decision menus. COPE's flowcharts for editors — covering situations like alleged plagiarism, reviewer misconduct, or authorship disputes — are referenced by platforms in their documentation and used by journals as a basis for configuring editorial workflows.


What These Platforms Cannot Do

A persistent misconception is that sophisticated peer review software produces rigorous peer review. The platform structures the process; it does not guarantee the quality of the evaluation. Reviewer selection, the depth of review invited, the number of reviewers assigned, the turnaround expectations enforced — these are editorial decisions that software can facilitate but cannot replace.

Similarity-checking tools like iThenticate and Turnitin (both products of Turnitin, LLC) are often embedded in submission platforms and automatically generate overlap reports when manuscripts are uploaded. These reports flag textual similarity with previously published content, but they do not detect fabricated data, misrepresented methods, or citation manipulation. The confusion between plagiarism detection and research integrity assessment has caused real harm, both in cases where authors have been penalized for acceptable self-citation and in cases where more serious integrity failures were missed because an algorithm found no textual overlap.

For a deeper look at the ethical dimensions that software cannot solve, see the site's coverage of ethics in peer review.


Choosing the Right Tool: What Editors and Publishers Should Evaluate

For journals in the process of selecting or switching platforms, several considerations carry practical weight. First, the platform's compatibility with the journal's publishing model matters significantly — an open-access journal operating under Plan S compliance requirements, for example, needs submission software that can capture and report rights retention and licensing data accurately.

Second, support for different types of peer review — including double-blind, open, and post-publication review — varies across platforms. Not all systems support anonymizing author metadata cleanly across all file types, and some open review features (such as publishing referee reports alongside articles) require custom configuration or third-party integrations.
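One concrete way anonymization fails is that a Word file's embedded core properties still name the author even after the title page is scrubbed. A small Python sketch of the kind of check an editorial office might run (the function name and the choice of fields are ours; `docProps/core.xml` is where OOXML files such as .docx store these properties):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# XML namespaces used by the OOXML core-properties part.
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def identifying_metadata(docx_file):
    """Return author-identifying fields left in a .docx file's core properties."""
    found = {}
    with zipfile.ZipFile(docx_file) as zf:
        if "docProps/core.xml" not in zf.namelist():
            return found  # no core-properties part at all
        root = ET.fromstring(zf.read("docProps/core.xml"))
        for tag in ("dc:creator", "cp:lastModifiedBy"):
            el = root.find(tag, NS)
            if el is not None and el.text:
                found[tag] = el.text
    return found
```

A non-empty result means the "anonymized" manuscript still carries a name in its file metadata, which is exactly the gap that platform-level anonymization does not always close.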

Third, accessibility and global usability deserve more attention than they typically receive. Authors submitting from institutions in lower-bandwidth environments or using non-Western language keyboards have documented difficulties with platforms optimized for North American and European institutional infrastructure. The Directory of Open Access Journals (DOAJ) and the African Journals Online (AJOL) platform have highlighted these friction points in their documentation for prospective member journals.


When to Seek Technical or Editorial Help

Authors experiencing submission problems — failed uploads, missing review invitations, or incorrect decision communications — should contact the journal's editorial office directly rather than the platform's technical support, unless the problem is clearly a system error (such as a portal outage). The editorial office has privileged access to the submission record and can intervene in ways that general technical support cannot.

Reviewers who have been invited through a platform but encounter access issues, deadline discrepancies, or concerns about the appropriateness of a review assignment should similarly contact the handling editor. Platform-level support can reset passwords and resolve account issues, but questions about the substance and fairness of a review assignment are editorial matters.

For guidance on the actual content of a review — not the software used to submit it — see how to write a peer review. For questions about the broader process, the site's frequently asked questions page addresses common points of confusion across submission stages.

Understanding the tools means understanding their limits. Peer review software streamlines administration; the quality of science communication depends on what happens inside that workflow, not the interface surrounding it.
