
@swayson
Created October 26, 2025 07:31
Claude Code Command Fact Check.

Fact Check

Verify claims and statements from websites, papers, or text by cross-referencing multiple sources.

Workflow

1. Gather Input

Determine input type:

  • URL provided: /fact-check https://example.com/article
  • Text provided: /fact-check "specific claim to verify"
  • File path provided: a local document (PDF, MD, or TXT) to read and verify
  • No input: Prompt user for content to fact-check

If URL provided:

# Fetch content from URL
WebFetch tool with URL
Extract main claims and assertions

If file path provided:

# Read document (supports PDF, MD, TXT)
Read tool for text files
Extract claims for verification
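
The routing above amounts to: classify the argument, then hand it to the matching tool. A minimal sketch of that classification in Python (the function name and the file heuristic are illustrative assumptions, not part of the command):

# Classify a /fact-check argument as "url", "file", "text", or "prompt"
from urllib.parse import urlparse
from pathlib import Path

def classify_input(arg: str | None) -> str:
    if not arg or not arg.strip():
        return "prompt"   # no input: ask the user what to fact-check
    if urlparse(arg).scheme in ("http", "https"):
        return "url"      # fetch with WebFetch, then extract claims
    if Path(arg).suffix.lower() in {".pdf", ".md", ".txt"} or Path(arg).exists():
        return "file"     # read with the Read tool
    return "text"         # treat the argument itself as the claim(s)

print(classify_input("https://example.com/article"))         # url
print(classify_input("The Earth is 4.5 billion years old"))  # text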

2. Extract Claims

Identify verifiable statements:

  • Break down content into discrete claims
  • Separate factual assertions from opinions
  • Note quantitative claims (numbers, dates, statistics)
  • Flag statements that blend subjective judgment with checkable facts

Present claims to user:

  • List all extracted claims
  • Ask which to verify (all or specific ones)
  • Check if user wants to specify preferred sources
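
Before presenting, the extracted claims can be held in one small record each, so the user's selections and preferred sources stay attached to the claim. A sketch in Python (the field names are assumptions for illustration):

# One record per extracted claim
from dataclasses import dataclass, field

@dataclass
class Claim:
    text: str                    # the claim as stated in the source
    kind: str                    # "factual", "opinion", or "mixed"
    quantitative: bool = False   # carries numbers, dates, or statistics
    selected: bool = True        # user chose to verify this claim
    preferred_sources: list[str] = field(default_factory=list)

claims = [
    Claim("Coffee reduces risk of Alzheimer's by 65%", kind="factual", quantitative=True),
    Claim("Coffee is the best part of the morning", kind="opinion", selected=False),
]
for i, c in enumerate(claims, 1):
    print(f"{i}. [{c.kind}] {c.text}")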

3. Verify Each Claim

For each claim:

Cross-reference search:

# Use WebSearch to find supporting/contradicting evidence
Search for claim + key terms
Search for contradicting evidence
Search academic/authoritative sources if applicable
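
A sketch of how the query variants for one claim could be generated (the templates are assumptions; the actual lookups go through WebSearch):

# Build direct, supporting, contradicting, and scholarly query variants
def build_queries(claim: str, key_terms: list[str]) -> list[str]:
    terms = " ".join(key_terms)
    return [
        claim,                                  # direct lookup of the claim
        f"{terms} evidence study",              # supporting / primary evidence
        f"{terms} debunked OR false OR myth",   # deliberately seek contradiction
        f"{terms} peer-reviewed review",        # academic and authoritative sources
    ]

for q in build_queries("Coffee reduces risk of Alzheimer's by 65%",
                       ["coffee", "Alzheimer's", "risk reduction"]):
    print(q)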

If user specified sources:

  • Prioritize those sources
  • Still check for contradicting evidence elsewhere

Evaluate evidence:

  • Check source credibility
  • Look for primary vs secondary sources
  • Note publication dates
  • Identify potential bias
  • Find consensus vs outlier positions

4. Assess Confidence

Use probabilistic thinking (Annie Duke method):

  • 90-100%: Multiple high-quality primary sources agree, no credible contradictions
  • 70-89%: Strong evidence from credible sources, minor contradictions explained
  • 50-69%: Mixed evidence, some credible sources support, some contradict
  • 30-49%: Weak support, significant contradictions, or low-quality sources
  • 10-29%: Strong evidence against claim, most sources contradict
  • 0-9%: Claim definitively false based on high-quality evidence

Consider:

  • Source quality and expertise
  • Recency of information
  • Consensus among experts
  • Base rates and prior probability
  • What we don't know (epistemic humility)
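
These bands map directly onto the labels used in the report summary (Verified at 70%+, Uncertain at 30-69%, False/Misleading below 30%). A minimal sketch of that mapping:

# Map a confidence percentage onto the report's verification status
def confidence_label(pct: int) -> str:
    if pct >= 70:
        return "✓ Verified"          # strong or overwhelming supporting evidence
    if pct >= 30:
        return "⚠ Uncertain"         # mixed or weak evidence either way
    return "✗ False / ~ Misleading"  # most or all credible sources contradict

print(confidence_label(85))  # ✓ Verified
print(confidence_label(45))  # ⚠ Uncertain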

5. Generate Report

Create markdown file: fact-check-report-{timestamp}.md

Report structure:

# Fact Check Report
Generated: {timestamp}
Source: {URL or "User provided text"}

## Summary
- Total claims verified: X
- High confidence (70%+): X
- Medium confidence (30-69%): X
- Low confidence/False (<30%): X

## Detailed Findings

### Claim 1: "{claim text}"

**Verification Status**: ✓ Verified / ⚠ Uncertain / ✗ False / ~ Misleading

**Confidence**: X% (reasoning)

**Supporting Evidence**:
- [Source name](URL) - {excerpt or summary}
- [Source name](URL) - {excerpt or summary}

**Contradicting Evidence**:
- [Source name](URL) - {excerpt or summary}

**Analysis**:
{Nuanced explanation of why confidence is at this level}

**Key Uncertainties**:
- What we don't know
- Limitations of available evidence

---

{Repeat for each claim}

## Overall Assessment

{Summary of document/claim credibility}

## Sources Consulted
1. [Source](URL)
2. [Source](URL)
...

## Methodology Notes
- Search terms used
- Source selection criteria
- Limitations of this fact-check
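
A sketch of how the file could be named and seeded, assuming the timestamp format shown in the Output example (YYYYMMDD-HHMMSS) and leaving the per-claim findings to be filled in afterwards:

# Create fact-check-report-{timestamp}.md with the header and summary stub
from datetime import datetime
from pathlib import Path

def write_report_skeleton(source: str, claims: list[str]) -> Path:
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    path = Path(f"fact-check-report-{stamp}.md")
    lines = [
        "# Fact Check Report",
        f"Generated: {stamp}",
        f"Source: {source}",
        "",
        "## Summary",
        f"- Total claims verified: {len(claims)}",
        "",
        "## Detailed Findings",
    ]
    lines += [f'\n### Claim {i}: "{c}"\n' for i, c in enumerate(claims, 1)]
    path.write_text("\n".join(lines), encoding="utf-8")
    return path

print(write_report_skeleton("User provided text", ["X happened in 2020"]))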

6. Present Results

Show user:

  • Summary of findings
  • Path to detailed report
  • Offer to investigate specific claims further
  • Ask if user wants to fact-check related claims

Rules

Source Quality:

  • Prioritize primary sources over secondary
  • Academic/peer-reviewed > news > blogs
  • Check source funding/bias
  • Note conflicts of interest
  • Prefer recent sources for time-sensitive claims

Epistemic Humility:

  • Be explicit about uncertainty
  • Don't force binary true/false on complex claims
  • Note when evidence is insufficient
  • Distinguish "no evidence found" from "evidence of absence"

Handling Opinions:

  • Mark subjective statements clearly
  • Don't fact-check opinions as true/false
  • Can verify attributed quotes
  • Can check if opinion is widely held

Safety:

  • Don't fact-check personal/private information
  • Warn if claim requires specialized expertise
  • Note when topic is actively debated
  • Avoid political bias in source selection

Input Arguments

URL mode:

/fact-check https://example.com/article

Text mode:

/fact-check "The Earth is 4.5 billion years old"

With source specification:

/fact-check "claim" --sources nature.com,science.org

Interactive mode:

/fact-check
# Prompts for input
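
A sketch of how these invocation forms could be parsed, assuming the arguments arrive as one string and that --sources is split on commas as shown above:

# Split a /fact-check invocation into its target and optional --sources list
import shlex

def parse_args(raw: str) -> dict:
    tokens = shlex.split(raw)        # respects quoted claims
    target, sources = None, []
    i = 0
    while i < len(tokens):
        if tokens[i] == "--sources" and i + 1 < len(tokens):
            sources = tokens[i + 1].split(",")
            i += 2
        else:
            target = tokens[i]
            i += 1
    return {"target": target, "sources": sources}   # target None => interactive mode

print(parse_args('"Coffee reduces risk of Alzheimer\'s by 65%" --sources nature.com,science.org'))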

Usage Examples

Example 1 - URL:

/fact-check https://example.com/health-article

Example 2 - Inline claim:

/fact-check "Coffee reduces risk of Alzheimer's by 65%"

Example 3 - Interactive:

/fact-check
> What would you like me to fact-check?
[paste article text or URL]
> Should I verify all claims or specific ones?
All
> Any preferred sources?
None, use your judgment

Output

Analyzing content...
Extracted 12 verifiable claims

Verifying claim 1/12: "X happened in 2020"
  - Searching for evidence...
  - Found 5 sources
  - Confidence: 85%

...

Report generated: fact-check-report-20251026-143022.md

Summary:
✓ Verified (70%+): 8 claims
⚠ Uncertain (30-69%): 3 claims
✗ False/Misleading (<30%): 1 claim

Would you like me to investigate any claims further?