Our Approach

Engineered for Objectivity

We don’t assemble panels to validate a predetermined conclusion. We design a process where independence is structural — built into how we recruit experts, how they interact, and how findings emerge.

Triple-blinded methodology. Quantitative consensus analysis. Results that regulators trust because the process earns it.

Our Mission

Building Trust Through Engineered Objectivity

Amid today’s complex regulatory landscape and activist attacks on science, SciPinion was established to build trust and maintain credibility through our Certified Peer Reviews.

Our goal is to bring the expert community’s clarity and certainty to the world’s toughest science problems, instilling universal trust in science.

Reduce bias — systematic methodology that minimizes sources of influence
Increase trust — process transparency that regulators can verify
Foster engagement — connecting experts to problems that matter
SciPinion Certified Peer Reviews - Six standards surrounding Trust shield: Independent expert selection, Sufficient panel size, Modified Delphi format, Question independence, Sponsor blinding, Expert anonymity
The Process

A Structured Path to Defensible Conclusions

Every SciPinion engagement follows a defined methodology — from expert recruitment through final reporting. The structure isn’t bureaucracy; it’s what makes the findings reproducible and the conclusions defensible.

SciPinion peer review process flowchart showing three phases: Pre-SciPi (expert recruitment and charge question development), SciPi (multi-round review with debate), and Post-SciPi (reporting and verification)
Proven Reproducibility

Independent Panels, Consistent Results

When two independent panels review the same materials using our methodology, their findings align. The graph shows favorability scores from Panel 1 and Panel 2 across 18 charge questions — the convergence demonstrates that our approach yields reproducible outcomes.

No other company or government body has tested, let alone proven, that its approach to assembling and conducting peer review panels yields reproducible results.

Favorability score comparison showing Panel 1, Panel 2, and Combined results across 18 questions, demonstrating reproducible findings between independent expert panels
Design Features

Three Phases of Every Engagement

A SciPi is defined as a collection of Scientific oPinions. Each panel engagement involves three phases, with design options tailored to project objectives.

Pool of Ideal Reviewers

Venn diagram showing four overlapping criteria for ideal reviewers: People with relevant expertise, People that are objective, People that are available to participate, and People that are willing to participate. The center intersection represents the Pool of Ideal Reviewers.

The ideal reviewer sits at the intersection of four criteria. Our recruitment process identifies experts who meet all requirements.

1. Pre-SciPi Preparation & Expert Selection

Collaborative planning between SciPinion and sponsor ensures the engagement meets required objectives. During this phase:

The sponsor can choose to assemble the review package or have SciPinion prepare it. For controversial topics, a third-party auditor can help vet the expertise of volunteers.

Charge questions can be prepared by the sponsor or SciPinion. A third-party editor can review questions to verify no leading or biased language is used.

2. SciPi Triple-Blinded Deliberation

Triple-blinded process: The expert panel is blinded to sponsors, sponsors are blinded to the panel, and panelists are blinded to each other. This guards against groupthink and conformity pressures.

Round 1: Experts independently review materials and answer charge questions. They cannot see answers from other panelists.

Round 2: Debate round where experts review aggregated results and engage in structured discussion. Experts can vote on fellow panelists’ contributions.

Round 3: Follow-up questions and opportunity for experts to revise Round 1 answers based on debate insights.
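As a sketch of how the between-round aggregation might work, the following flags split questions for the Round 2 debate. The 1-to-5 favorability scale and the spread threshold are illustrative assumptions, not SciPinion’s actual metrics:

```python
# Sketch of the between-round aggregation step in a Delphi-style review.
# The 1-5 scale and the spread threshold are illustrative assumptions.
from statistics import median, pstdev

# Round 1: each anonymous expert scores each charge question,
# e.g. 1 (unfavorable) to 5 (favorable)
round_1 = {
    "Q1": [5, 4, 5, 4, 5, 4],
    "Q2": [1, 5, 2, 5, 1, 4],  # split opinion: prime candidate for debate
}

for q, scores in round_1.items():
    spread = pstdev(scores)  # population standard deviation as a dispersion measure
    flag = "debate in Round 2" if spread > 1.0 else "near consensus"
    print(f"{q}: median={median(scores)}, spread={spread:.2f} -> {flag}")
```

Questions with low spread can be reported as consensus findings, while high-spread questions are routed into the structured debate and revision rounds.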

3. Post-SciPi Reporting & Deliverables

Flexible deliverables matched to project objectives. Report format and depth can be determined upon completion of the SciPi:

PDF results directly from the SciPinion application for quick internal reference.

Analytical reports with outlier analysis and consensus metrics for panels of 10+ experts.

Formal reports suitable for regulatory submissions or peer-reviewed manuscript publication.

Expert testimony arrangements for panelists to present findings to regulatory bodies.

Managing Bias

Why Traditional Panels Fail

Face-to-face deliberations suffer from well-documented cognitive biases. Our methodology addresses these failure points by design.

1. Groupthink: Social pressure drives conformity around incorrect positions.

2. Deference: Panelists defer to perceived authority regardless of argument quality.

3. Amplification: Early opinions get reinforced, drowning out alternatives.

4. Overbearing Members: Dominant personalities silence quieter experts.

The Chilling Effect: In today’s adversarial environment, experts have been attacked for speaking on controversial topics. Many stay on the sidelines despite being highly knowledgeable. Our anonymous methodology provides psychological safety for genuine assessment.
Can any process truly eliminate bias?

Everyone has bias—some sources unknown even to the expert. Rather than claiming elimination, we minimize impact through structural safeguards: triple-blinding, anonymous deliberation, and panels large enough to capture true opinion distribution.

Why not “balance” viewpoints on the panel?

“Balancing” assumes you know experts’ opinions in advance—itself a form of bias. Worse, if true consensus is 95/5, a 50/50 panel produces fundamentally unrepresentative findings that tell you nothing about the field’s actual stance.

How does SciPinion’s approach differ?

We select based on expertise, objectivity, availability, and willingness—not predicted opinions. Sponsors can’t influence composition. Findings reflect actual expert distribution, whether that supports the sponsor’s position or not.

Our Rationale

Why We Do Things The Way We Do

Panel Selection

When we select a panel of experts for a peer review, a model picks the panel using a quantitative metric of expertise. The process is objective, transparent, and reproducible.

We do not assume or make any prejudgments about what the opinions of any particular expert will be. If a sponsor desires or requires certain diversity in demographics (region of residence, gender, sector of employment), the model can account for those requirements without compromising objectivity.
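A model-driven selection of this kind might look like the following sketch. The expertise scores, the region constraint, and the constraint-repair step are hypothetical stand-ins for SciPinion’s actual model:

```python
# Sketch of quantitative, reproducible panel selection.
# The scores, regions, and constraint logic are hypothetical stand-ins.

candidates = [
    # (anonymous id, expertise score, region)
    ("E1", 92.0, "NA"), ("E2", 88.5, "EU"), ("E3", 86.0, "NA"),
    ("E4", 81.0, "APAC"), ("E5", 79.5, "EU"), ("E6", 75.0, "NA"),
]

def select_panel(pool, size, min_per_region=1):
    """Pick the highest-scoring experts, then swap in the top expert
    from any under-represented region (a simple constraint repair)."""
    ranked = sorted(pool, key=lambda c: c[1], reverse=True)
    panel = ranked[:size]
    for region in {c[2] for c in pool}:
        if sum(1 for c in panel if c[2] == region) < min_per_region:
            best = next(c for c in ranked if c[2] == region and c not in panel)
            panel[-1] = best  # replace the lowest-scoring member
            panel.sort(key=lambda c: c[1], reverse=True)
    return panel

panel = select_panel(candidates, size=4)
print([c[0] for c in panel])
```

Because the ranking is purely score-based, the same candidate pool and constraints always yield the same panel, which is what makes the selection auditable and reproducible.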

Panel Engagement

All engagements with experts occur online through the SciPinion web app. We do this to eliminate the negative influences that occur with face-to-face meetings. While working online, each expert’s identity is not disclosed—they are labeled sequentially as Expert 1, Expert 2, and so on.

This eliminates undue influence from recognizing peers and affords greater psychological safety.

Upon completion, expert identities can be disclosed via three options: experts are identified but a key is not provided (preferred), experts are identified with a key provided, or—in rare cases involving highly controversial topics—experts may remain anonymous in perpetuity.
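The sequential labeling and optional disclosure key could be implemented along these lines; the names are hypothetical and the real mapping lives inside the SciPinion app:

```python
# Sketch: anonymizing panelists as "Expert 1..N" with a disclosure key.
# Names are hypothetical placeholders.
import random

names = ["Dr. A", "Dr. B", "Dr. C"]
order = random.sample(names, k=len(names))  # shuffle so label order reveals nothing
key = {f"Expert {i + 1}": name for i, name in enumerate(order)}

# Deliberation uses only the labels; the key is disclosed or withheld
# according to the option chosen for the engagement.
print(sorted(key))
```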

Regulatory Validation

Trusted by Government Agencies

SciPinion is trusted by government agencies including Health Canada, the Centers for Disease Control and Prevention (CDC), and the U.S. EPA for rigorous, transparent scientific evaluations.

“The U.S. Environmental Protection Agency (EPA) acknowledges that SciPinion’s peer review process meets or exceeds the EPA’s FIFRA Science Advisory Panel (SAP) process in every point of comparison.”

— EPA Report No. 22-E-0053

Full Comparison: SciPinion vs. FIFRA SAP
Criteria                        | SciPinion                               | FIFRA SAP
Time Frame                      | ~4 to 5 months                          | ~9 months
Standard Operating Procedures   | Internal SOP (published in Kirman 2019) | Agency SOP
Panel Members                   | 14 finalists (4 former EPA)             | 7 tier-1 + 10-12 ad hoc
Candidates Considered           | 1,491 applicants                        | 20-100 nominations
Panel Selection Analysis        | Quantitative                            | Qualitative / semi-quantitative
Expert Restrictions             | US & international                      | US citizens (with exceptions)
Meeting Format                  | Virtual & private                       | Hybrid & public
Quantitative Consensus Analysis | Yes                                     | No (seeks consensus informally)

Let’s Discuss Your Science

Tell us about your scientific question and we’ll help you determine the right approach — whether that’s a certified peer review panel, expert survey, or targeted consultation.