Our Mission

Building Trust Through Engineered Objectivity

In today’s complex regulatory landscape — where the credibility of sponsored science faces increasing scrutiny — SciPinion was established to build trust and maintain credibility through our Certified Peer Reviews.

Our goal is to bring clarity from the expert community to the world's toughest science problems, instilling trust in science.

SciPinion Certified Peer Reviews — six trust standards surrounding the Trust shield
EPA Recognized
Meets or exceeds SAP standards · Report No. 22-E-0053
Reduce bias — systematic methodology that eliminates sources of influence
Increase trust — process transparency that regulators can verify
Foster engagement — connecting experts to problems that matter
The Process

A Structured Path to Defensible Conclusions

Every SciPinion engagement follows a defined methodology — from expert recruitment through final reporting. This structure is what makes findings reproducible and conclusions defensible under regulatory scrutiny.

Phase 01 · Pre-SciPi: Preparation & Expert Selection
Phase 02 · SciPi: Triple-Blinded Deliberation
Phase 03 · Post-SciPi: Reporting & Deliverables
SciPinion peer review process flowchart — Pre-SciPi, SciPi, and Post-SciPi phases

SciPinion’s end-to-end process, from expert recruitment to final deliverables. Each phase is detailed below.

Design Features

Three Phases of Every Engagement

Each panel engagement — called a SciPi (a collection of Scientific oPinions) — involves three distinct phases, with design options tailored to project objectives.

Pool of Ideal Reviewers

Four intersecting criteria define expert eligibility

Venn diagram showing four overlapping criteria for ideal reviewers

The ideal reviewer sits at the intersection of four criteria. Our recruitment process identifies experts who meet all requirements simultaneously.

Phase 01 · Pre-SciPi: Preparation & Expert Selection

Collaborative planning between SciPinion and the sponsor ensures the engagement meets its objectives.

The sponsor can choose to assemble the review package or have SciPinion prepare it. For controversial topics, a third-party auditor can help vet the expertise of volunteers.
Charge questions can be prepared by the sponsor or SciPinion. A third-party editor can review questions to verify no leading or biased language is used.
Phase 02 · SciPi: Triple-Blinded Deliberation

Triple-blinded process: The expert panel is blinded to sponsors, sponsors are blinded to the panel, and panelists are blinded to each other — eliminating groupthink and conformity pressures.

Round 1: Experts independently review materials and answer charge questions. They cannot see other panelists’ responses.
Round 2: Debate round where experts review aggregated results and engage in structured discussion. Experts can vote on fellow panelists’ contributions.
Round 3: Follow-up questions and opportunity for experts to revise their Round 1 answers based on insights from the debate.
Phase 03 · Post-SciPi: Reporting & Deliverables

Flexible deliverables matched to project objectives. Report format and depth are determined upon completion of the SciPi.

PDF results directly from the SciPinion application for quick internal reference.
Analytical reports with outlier analysis and consensus metrics for panels of 10+ experts.
Formal reports suitable for regulatory submissions or peer-reviewed manuscript publication.
Expert testimony arrangements for panelists to present findings to regulatory bodies.
Our Rationale

Why We Do Things The Way We Do

Panel Selection

When we select a panel of experts for a peer review, we use an objective, quantitative metric of expertise, and a model picks the panel. The process is transparent and reproducible.

We make no prejudgments about what any particular expert's opinions will be. If a sponsor desires or requires certain demographic diversity (region of residence, gender, sector of employment), the model can account for those requirements without compromising objectivity.

100%
Quantitative, model-driven selection — no subjective judgment in panel composition

Panel Engagement

All engagements with experts occur online through the SciPinion web app. We do this to eliminate the negative influences that occur with face-to-face meetings. While working online, each expert’s identity is not disclosed — they are labeled sequentially as Expert 1, Expert 2, and so on. This eliminates undue influence from recognizing peers and affords greater psychological safety.

Upon completion, expert identities are handled in one of three ways: experts are named but a key linking them to their responses is not provided (preferred); experts are named and a key is provided; or, in rare cases involving highly controversial topics, experts remain anonymous in perpetuity.

Identity Disclosure Options
Named, no key (preferred) · Named with key · Permanent anonymity
Managing Bias

Why Traditional Panels Fall Short

Face-to-face deliberations suffer from well-documented cognitive biases. Our methodology is designed to counter these failure points.

1. Groupthink: Social pressure drives conformity around incorrect positions.
2. Deference: Panelists defer to perceived authority regardless of argument quality.
3. Amplification: Early opinions get reinforced, drowning out alternatives.
4. Overbearing Members: Dominant personalities silence quieter experts.

The Chilling Effect

In today’s adversarial environment, experts have been attacked for speaking on controversial topics. Many stay on the sidelines despite being highly knowledgeable. Our anonymous methodology provides the psychological safety necessary for genuine scientific assessment.

Can any process truly eliminate bias?

Everyone carries bias, including sources unknown even to the expert. Rather than claiming elimination, we minimize its impact through structural safeguards: triple-blinding, anonymous deliberation, and panels large enough to capture the true distribution of expert opinion.

Why not “balance” viewpoints on the panel?

“Balancing” assumes you know experts’ opinions in advance — itself a form of bias. Worse, if true consensus is 95/5, a 50/50 panel produces fundamentally unrepresentative findings that tell you nothing about the field’s actual stance.
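The arithmetic behind this point can be sketched in a short simulation. Everything here is illustrative, not a SciPinion parameter: the field size (100 experts), the 95/5 split, the panel size of 10, and the random seed are all hypothetical assumptions.

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

# Hypothetical field of 100 qualified experts: 95 hold position A, 5 hold B.
field = ["A"] * 95 + ["B"] * 5

def share_a(panel):
    """Fraction of a panel holding position A."""
    return panel.count("A") / len(panel)

# Representative approach: draw a 10-person panel at random from the field.
representative = random.sample(field, 10)

# "Balanced" approach: force 5 holders of each position onto the panel.
balanced = ["A"] * 5 + ["B"] * 5

print(f"Field consensus:      {share_a(field):.0%} hold A")    # 95%
print(f"Representative panel: {share_a(representative):.0%} hold A")
print(f"'Balanced' panel:     {share_a(balanced):.0%} hold A")  # 50%
```

A randomly drawn panel will usually mirror the field's 95/5 split, while the forced 50/50 panel always reports an even divide, regardless of where the field actually stands.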

How does SciPinion’s approach differ?

We select based on expertise, objectivity, availability, and willingness — not predicted opinions. Sponsors cannot influence panel composition. Findings reflect the actual distribution of expert opinion, whether that supports the sponsor’s position or not.

Proven Reproducibility

Independent Panels, Consistent Results

When two independent panels review the same materials using our methodology, their findings converge. The accompanying graph compares favorability scores from Panel 1 and Panel 2 across 18 charge questions; the alignment demonstrates that our structured approach yields reproducible outcomes regardless of which specific experts participate.

To our knowledge, no other company or government body has tested whether its approach to assembling and conducting peer review panels yields reproducible results. This is a fundamental validation that most peer review processes lack.

Only organization to have demonstrated panel reproducibility across independent expert groups

Reproducibility Analysis
Panel 1 vs. Panel 2 — 18 Charge Questions
Favorability score comparison: Panel 1 vs Panel 2 across 18 charge questions
Regulatory Validation

Trusted by Government Agencies

SciPinion is trusted by government agencies including Health Canada, the Centers for Disease Control and Prevention (CDC), and the U.S. EPA for rigorous, transparent scientific evaluations.

U.S. EPA
CDC
Health Canada

“The U.S. Environmental Protection Agency (EPA) acknowledges that SciPinion’s peer review process meets or exceeds the EPA’s FIFRA Science Advisory Panel (SAP) process in every point of comparison.”

— EPA Report No. 22-E-0053
Full Comparison: SciPinion vs. FIFRA SAP

Criteria | SciPinion | FIFRA SAP
Time Frame | ~4 to 5 months | ~9 months
Standard Operating Procedures | Internal SOP (published in Kirman 2019) | Agency SOP
Panel Members | 14 finalists (4 former EPA) | 7 tier-1 + 10–12 ad hoc
Candidates Considered | 1,491 applicants | 20–100 nominations
Panel Selection Analysis | Quantitative | Qualitative / semi-quantitative
Expert Restrictions | US & international | US citizens (with exceptions)
Meeting Format | Virtual & private | Hybrid & public
Quantitative Consensus Analysis | Yes | No (seeks consensus informally)
Faster Timeline · 15× Larger Candidate Pool · 100% Quantitative Analysis · Global Expert Access
Get Started

Let’s Discuss Your Science

Tell us about your scientific question and we’ll help you determine the right approach.