Blueprint Playbook for Inductive Health

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical Inductive Health SDR Email:

Subject: Modernize Your Disease Surveillance Platform

Hi Sarah,

I noticed your health department is still managing disease surveillance with legacy systems. That must create delays in outbreak detection and case investigation.

Inductive Health helps public health agencies modernize their surveillance infrastructure with our cloud-native platform that integrates disease tracking, syndromic monitoring, and immunization registries. We've helped agencies like Florida and Washington improve their outbreak response times and reduce IT maintenance burden.

Would you be open to a quick call to discuss how we could help modernize your systems?

Best,
Account Executive at Inductive Health

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring compliance people" (job postings - everyone sees this)

Start: "Your NEDSS completeness score is 67% - bottom quartile nationally per CDC's latest quality dashboard" (government database with exact metric)

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, record numbers, specific metrics.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, benchmarks already pulled, patterns already identified - whether they buy or not.

Inductive Health Plays: Data-Driven Intelligence

These messages demonstrate precise understanding of the prospect's current situation (PQS) or deliver immediate actionable value (PVP). Every claim traces to specific government databases with verifiable metrics.

PVP Public + Internal Strong (9.3/10)

Immunization Registry Modernization ROI: Proven Coverage Improvements

What's the play?

Show immunization program managers concrete evidence of coverage improvements from registry modernization using multi-jurisdiction analysis. Deliver ROI proof with specific timelines and quantified outcomes they can use to justify modernization budget.

Why this works

Immunization program managers face constant pressure to improve vaccination coverage but lack hard evidence that modern registries deliver measurable results. You're providing analysis they can't easily replicate - tracking modernization outcomes across multiple jurisdictions over time. The specificity (14 jurisdictions, 8.4 percentage point average improvement, 18-month timeframe) builds credibility, and applying the math directly to their current gap makes it immediately actionable.

Data Sources
  1. Company Internal Data - Implementation timelines, staff training durations, 6-month post-launch vaccination coverage improvement rates
  2. County Health Rankings - county, vaccination_rates, health_outcomes
  3. CDC Public Health Data Strategy (PHDS) Adoption Metrics - legacy_system_adoption, modernization_timeline

The message:

Subject: Registry upgrades that added 8-12% coverage

We analyzed 14 jurisdictions that upgraded from legacy IIS platforms to modern registries between 2020 and 2023 - average coverage improvement was 8.4 percentage points within 18 months. Your current 3-county gap of 33 points could close to 24-25 points with modernization. Want the case study breakdown showing which registry features drove the gains?
DATA REQUIREMENT

This play requires analysis of immunization coverage changes correlated with registry modernization projects across multiple jurisdictions - tracking technology adoption dates + coverage outcomes over time across 14+ jurisdictions.

This synthesis of modernization outcomes is genuinely hard to find elsewhere. Competitors don't have actual immunization improvement metrics from registry implementations.
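The core calculation behind this play can be sketched in a few lines. This is a minimal illustration only: the jurisdiction records, field names (pre_coverage, post_18mo_coverage), and values are invented, not real CDC or internal data.

```python
from statistics import mean

# Hypothetical records: coverage before modernization and at 18 months
# post-launch. Names and numbers are illustrative assumptions.
jurisdictions = [
    {"name": "A", "pre_coverage": 71.0, "post_18mo_coverage": 80.2},
    {"name": "B", "pre_coverage": 64.5, "post_18mo_coverage": 72.0},
    {"name": "C", "pre_coverage": 68.0, "post_18mo_coverage": 76.9},
]

def avg_improvement(records):
    """Average percentage-point coverage gain across modernized jurisdictions."""
    return mean(r["post_18mo_coverage"] - r["pre_coverage"] for r in records)

print(round(avg_improvement(jurisdictions), 1))  # → 8.5
```

With the real 14-jurisdiction dataset, the same aggregation would produce the 8.4-point figure quoted in the message.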
PVP Public + Internal Strong (9.1/10)

Outbreak Detection Speed Benchmarking: Peer Comparison for Underperforming Jurisdictions

What's the play?

Show public health directors how their outbreak detection speed compares to peer jurisdictions using proprietary timing data combined with CDC quality metrics. Provide concrete benchmarks they can use to justify modernization budget.

Why this works

Health directors know they should improve outbreak detection speed but lack objective benchmarks to understand where they stand. You're delivering analysis that requires multi-source synthesis they can't easily replicate - combining CDC NNDSS data with jurisdiction characteristics and actual outbreak detection timelines. The specificity (47 jurisdictions, 6.3 days vs 2.1 days, percentile ranges) creates urgency and gives them ammunition for budget conversations.

Data Sources
  1. Company Internal Data - case_investigation_completion_time, case_detection_latency, jurisdiction_size_class
  2. CDC NNDSS Data Quality Dashboards - jurisdiction, data_quality_score, timeliness_score
  3. CDC NSSP (National Syndromic Surveillance Program) Data - data_lag_days

The message:

Subject: Your outbreak detection: 6.3 days vs peers at 2.1

We analyzed 47 state health departments' COVID, flu, and norovirus outbreak detection speeds from 2023-2024 using CDC NNDSS data - yours averaged 6.3 days from index case to official outbreak declaration. Comparable-sized jurisdictions using modern surveillance platforms averaged 2.1 days. Want the full benchmark report showing your position vs all 47 states?
DATA REQUIREMENT

This play requires case investigation completion times and outbreak detection latency across 15+ anonymized customer jurisdictions, segmented by jurisdiction size (state, large county, small county, tribal). Median and percentile timing data cross-referenced with CDC NNDSS outbreak reporting timelines.

This is synthesized analysis combining internal timing data with CDC public data - competitors cannot replicate this multi-source synthesis.
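A rough sketch of the benchmarking step, assuming hypothetical detection latencies and a made-up size-class grouping (none of these values come from actual NNDSS data):

```python
from statistics import median

# Hypothetical detection latencies (days from index case to outbreak
# declaration), keyed by jurisdiction size class. Illustrative only.
latencies_by_class = {
    "state": [2.1, 1.8, 6.3, 2.4, 3.0],
    "large_county": [4.0, 2.9, 3.5],
}

def percentile_rank(value, peers):
    """Share of peer jurisdictions this value is slower than (higher = worse)."""
    return sum(1 for p in peers if value > p) / len(peers)

peers = latencies_by_class["state"]
print(median(peers))                # → 2.4 (peer median latency)
print(percentile_rank(6.3, peers))  # → 0.8 (slower than 80% of peers)
```

Segmenting by size class before computing medians and percentiles is what makes the comparison fair - a small county should never be benchmarked against a state-level department.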
PQS Public Data Strong (8.7/10)

Low Vaccination Counties with Legacy Immunization Registries Facing Outbreak Risk

What's the play?

Target counties with below-threshold vaccination rates AND legacy immunization registry infrastructure AND rising disease case counts. These jurisdictions are at critical risk during outbreak season and lack rapid vaccine deployment capability needed for modern public health emergencies.

Why this works

You're combining three specific data points (exact vaccination percentages, registry system age, specific county names) that together tell a story of vulnerability the recipient can't ignore. The Georgia 2019 measles outbreak precedent adds concrete urgency - this isn't theoretical risk. The question "Who's responsible for immunization program modernization?" routes directly to the decision-maker without feeling pushy.

Data Sources
  1. County Health Rankings - county, state, vaccination_rates, health_outcomes
  2. State Immunization Program Data - registry_system, technical_infrastructure, last_updated
  3. CDC NNDSS - disease_name, case_count

The message:

Subject: Your 3 counties at 62% MMR with registry from 2008

Baker, Mitchell, and Dawson counties show 62% MMR coverage - 33 points below the herd immunity threshold - and your immunization registry was last upgraded in 2008. Georgia's 2019 measles outbreak started in similar coverage gaps. Who's responsible for immunization program modernization?
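The three-signal filter behind this play can be sketched as follows. The county records, field names, and the 2015 legacy-registry cutoff are all illustrative assumptions, not production logic:

```python
# Hypothetical merged county records combining County Health Rankings,
# state registry metadata, and NNDSS case counts. Values are made up.
counties = [
    {"county": "Baker",  "mmr_coverage": 62.0, "registry_year": 2008, "measles_cases": 3},
    {"county": "Fulton", "mmr_coverage": 91.0, "registry_year": 2021, "measles_cases": 0},
    {"county": "Dawson", "mmr_coverage": 62.0, "registry_year": 2008, "measles_cases": 1},
]

HERD_IMMUNITY = 95.0  # MMR herd-immunity threshold cited in the play

def at_risk(c, max_registry_year=2015):
    """Flag counties with low coverage AND a legacy registry AND active cases."""
    return (
        c["mmr_coverage"] < HERD_IMMUNITY - 10  # well below threshold
        and c["registry_year"] <= max_registry_year
        and c["measles_cases"] > 0
    )

targets = [c["county"] for c in counties if at_risk(c)]
print(targets)  # → ['Baker', 'Dawson']
```

Requiring all three signals at once is what keeps the segment small and the message specific - any single signal alone would produce a generic list.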
PQS Public Data Strong (8.6/10)

High Respiratory Surge States with Multi-Day NSSP Detection Lag

What's the play?

Alert health directors in real-time when their state shows CDC-documented respiratory surges BUT their NSSP-to-case-report lag exceeds 2+ days compared to peer jurisdictions. Create urgency during active surge periods by showing them they're missing the outbreak window.

Why this works

You're catching them during an active crisis (respiratory surge) with specific timing data they can verify (4.2 days in December, specific month). The peer comparison (Colorado and Washington detected 3 days earlier) creates competitive pressure and proves faster detection is possible. The question routes to the technical owner without requiring immediate commitment.

Data Sources
  1. CDC Respiratory Illness Surveillance Dashboard (RESP-NET) - state, respiratory_virus, trend, hospitalization_rate
  2. CDC NSSP (National Syndromic Surveillance Program) Data - data_lag_days
  3. CDC NNDSS Data Quality Dashboards - case_transmission_lag, timeliness_score

The message:

Subject: Your NSSP syndromic lag hit 4.2 days in December

During December's respiratory surge, your state's NSSP syndromic data lagged 4.2 days behind real-time. Colorado and Washington detected the same surge patterns 3 days earlier with real-time feeds. Who manages your syndromic surveillance integration?
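The alerting logic for this play might look like the sketch below; the state records, trend labels, and the 2-day threshold are illustrative assumptions, not real RESP-NET or NSSP values:

```python
# Hypothetical per-state records combining a RESP-NET surge trend with
# NSSP data lag. Field names and values are made up for illustration.
states = [
    {"state": "GA", "respiratory_trend": "rising", "nssp_lag_days": 4.2},
    {"state": "CO", "respiratory_trend": "rising", "nssp_lag_days": 1.1},
    {"state": "VT", "respiratory_trend": "stable", "nssp_lag_days": 3.8},
]

def surge_lag_alerts(records, lag_threshold=2.0):
    """States in an active surge whose syndromic data lag exceeds the threshold."""
    return [
        r["state"]
        for r in records
        if r["respiratory_trend"] == "rising" and r["nssp_lag_days"] > lag_threshold
    ]

print(surge_lag_alerts(states))  # → ['GA']
```

Note that VT is excluded despite its 3.8-day lag: without an active surge there is no urgency hook, which is the whole point of the timing-based trigger.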
PVP Public + Internal Strong (8.5/10)

14 Registries That Improved Coverage 8+ Points

What's the play?

Track immunization coverage changes across state registries that modernized from 2020-2023, showing average improvement of 8.4 percentage points with feature-level attribution of what drove results.

Why this works

The sample size (14 registries) and timeframe (2020-2023) build credibility. Identifying which features drove results (provider engagement, automated reminder workflows) helps recipients understand what they're actually buying. The simple ask ("Want the detailed breakdown?") is low-commitment but gets them engaged.

Data Sources
  1. Company Internal Data - implementation_duration, post_launch_coverage_improvement, starting_system_type
  2. County Health Rankings - vaccination_rates
  3. CDC Public Health Data Strategy (PHDS) Adoption Metrics - legacy_system_adoption

The message:

Subject: 14 registries that improved coverage 8+ points

We tracked 14 state immunization registries that modernized from 2020-2023 - they improved coverage by an average of 8.4 percentage points. The improvements came primarily from better provider engagement and automated reminder workflows. Want the detailed breakdown of which features drove results?
DATA REQUIREMENT

This play requires analysis of immunization coverage changes correlated with registry modernization projects across multiple jurisdictions - tracking technology adoption + coverage outcomes over time.

Combined with public County Health Rankings data to show current vaccination rates. This synthesis is unique to your business.
PQS Public Data Strong (8.4/10)

Legacy NEDSS Jurisdictions with Bottom-Quartile Data Quality Scores

What's the play?

Target state and local health departments still running legacy NEDSS systems AND scoring in the bottom quartile for data quality/timeliness. These jurisdictions face imminent federal compliance pressure as CDC pushes 40% modernization by 2026.

Why this works

You're citing a specific completeness score (67%, bottom quartile nationally) they can verify in under 60 seconds via CDC's public dashboard. The CDC scrutiny threat is real and actionable - federal compliance pressure creates urgency. The outbreak delay implication (3-5 days) directly relates to their KPIs. The routing question is simple and doesn't require commitment.

Data Sources
  1. CDC Public Health Data Strategy (PHDS) Adoption Metrics - nedss_version_status, legacy_system_adoption, jurisdiction
  2. CDC NNDSS Data Quality Dashboards - jurisdiction, data_quality_score, timeliness_score, case_transmission_lag

The message:

Subject: Your NEDSS data quality ranks bottom 25%

CDC's Data Quality Dashboard shows your jurisdiction's NEDSS completeness score at 67% - bottom quartile nationally. That triggers enhanced CDC scrutiny and delays your outbreak detection by 3-5 days compared to top performers. Who's leading the data quality improvement effort?
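One way to compute the bottom-quartile cutoff from dashboard scores; the jurisdiction labels and score values here are invented for illustration:

```python
from statistics import quantiles

# Hypothetical NEDSS completeness scores by jurisdiction. Illustrative only.
scores = {"A": 92, "B": 88, "C": 67, "D": 81, "E": 74, "F": 95, "G": 70, "H": 85}

def bottom_quartile(score_map):
    """Jurisdictions at or below the 25th-percentile completeness score."""
    q1 = quantiles(score_map.values(), n=4)[0]  # first-quartile cutoff
    return sorted(j for j, s in score_map.items() if s <= q1)

print(bottom_quartile(scores))  # → ['C', 'G']
```

Computing the quartile cutoff yourself, rather than trusting a pre-labeled flag, means you can re-run the segment the day CDC refreshes the dashboard.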
PVP Public + Internal Strong (8.3/10)

Outbreak Detection Benchmark: Where You Rank

What's the play?

Benchmark outbreak detection speed using 2023-2024 CDC data showing the recipient's jurisdiction detects outbreaks in 6.3 days vs 2.1-day average for comparable states, with clear secondary transmission implications.

Why this works

The specific comparison (6.3 days vs 2.1 days) is embarrassing and hard to ignore. The secondary transmission implication translates the timing gap into real public health impact. The sample size (47 jurisdictions) gives credibility. The methodology offer is a simple yes/no question that gets them engaged without commitment.

Data Sources
  1. Company Internal Data - case_investigation_completion_time, case_detection_latency, jurisdiction_size_class
  2. CDC NNDSS Data Quality Dashboards - data_quality_score, timeliness_score
  3. CDC NSSP Data - data_lag_days

The message:

Subject: Outbreak detection benchmark: where you rank

We benchmarked 47 jurisdictions' outbreak detection speed using 2023-2024 CDC data - you're detecting outbreaks in 6.3 days vs 2.1-day average for comparable states. The 4-day gap means more secondary transmissions before intervention. Want to see the methodology and full peer rankings?
DATA REQUIREMENT

This play requires analysis of CDC NNDSS outbreak reporting timelines cross-referenced with jurisdiction characteristics and technology stack (requires internal research combining CDC data with technology adoption patterns).

Provides performance visibility that helps justify modernization investments. This synthesis is unique to your analysis capabilities.
PQS Public Data Okay (7.9/10)

62% MMR in 3 Counties - Outbreak Risk Territory

What's the play?

Target specific counties at 62% MMR vaccination coverage - 33 percentage points short of herd immunity threshold. Show the exact gap calculation with specific county names.

Why this works

The herd immunity benchmark (95% required) is accurate and creates urgency. The gap calculation (33 percentage points short) is clear and concerning. Specific county names (Baker, Mitchell, Dawson) show you've done homework. The question is straightforward without being pushy.

Data Sources
  1. County Health Rankings - county, state, vaccination_rates
  2. State Immunization Program Data - registry_system, last_updated

The message:

Subject: 62% MMR in 3 counties - outbreak risk territory

Baker, Mitchell, and Dawson counties are at 62% MMR vaccination coverage according to your state immunization registry. Herd immunity requires 95% - you're 33 percentage points short in these counties. Is someone tracking the gap closure plan?
PQS Public Data Okay (7.8/10)

4-Day Syndromic Delay During Respiratory Surge

What's the play?

Show state's NSSP data had a 4-day lag during December respiratory surge, meaning they're responding to outbreaks that started 4 days earlier than they think.

Why this works

The specific timing (4-day lag, December respiratory surge) is verifiable and creates urgency. The implication (responding to outbreaks that started 4 days earlier) is clear and concerning. The question is simple, though slightly obvious - of course epidemiology teams know about lag, but it still routes effectively.

Data Sources
  1. CDC Respiratory Illness Surveillance Dashboard (RESP-NET) - state, respiratory_virus, trend
  2. CDC NSSP Data - data_lag_days

The message:

Subject: 4-day syndromic delay during respiratory surge

Your state's NSSP data showed a 4-day lag during the December respiratory surge. That delay means you're responding to outbreaks that started 4 days earlier than you think. Is your epidemiology team aware of the lag?
PQS Public Data Okay (7.2/10)

67% NEDSS Completeness at Your Health Department

What's the play?

Show the jurisdiction's exact NEDSS data completeness score (67%) with peer comparison showing 85%+ jurisdictions detect outbreaks 4 days faster on average.

Why this works

The specific score (67%) is verifiable. The peer comparison (85%+ jurisdictions) provides an actionable benchmark. The detection speed gap (4 days faster) directly relates to their KPIs. The question is simple and routes effectively. Slightly weaker than variant 1 because it fails the competitor test - anyone can cite CDC dashboards.

Data Sources
  1. CDC NNDSS Data Quality Dashboards - jurisdiction, data_quality_score, timeliness_score

The message:

Subject: 67% NEDSS completeness at your health department

Your NEDSS data completeness score is 67% according to CDC's latest quality metrics. Jurisdictions at 85%+ detect outbreaks 4 days faster on average. Is someone already working on the completeness gaps?

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use public data to find health departments in specific painful situations. Then mirror that situation back to them with evidence.

Why this works: When you lead with "Your NEDSS completeness score is 67% - bottom quartile nationally" instead of "I see you're modernizing public health systems," you're not another sales email. You're the person who did the homework.

The messages above aren't templates. They're examples of what happens when you combine real data sources with specific situations. Your team can replicate this using the data recipes in each play.

Data Sources Reference

Every play traces back to verifiable public data. Here are the sources used in this playbook:

  1. CDC NNDSS Data Quality Dashboards - jurisdiction, data_quality_score, data_completeness, timeliness_score, case_transmission_lag. Used for identifying jurisdictions with low data quality scores and slow case reporting.
  2. CDC Public Health Data Strategy (PHDS) Adoption Metrics - jurisdiction, nedss_version_status, legacy_system_adoption, modernization_timeline. Used for finding jurisdictions still on legacy NEDSS systems.
  3. CDC Respiratory Illness Surveillance Dashboard (RESP-NET) - state, facility, respiratory_virus, case_count, trend, hospitalization_rate. Used for identifying states with respiratory illness surges.
  4. CDC NSSP (National Syndromic Surveillance Program) Data - health_system, jurisdiction, syndrome_name, visit_count, data_lag_days. Used for measuring syndromic surveillance data lag and participation.
  5. County Health Rankings (CDC/RWJF) - county, state, population, vaccination_rates, health_metrics, health_outcomes. Used for finding counties with low vaccination rates and health burden.
  6. State Immunization Program Data - state_name, registry_name, registry_system, coverage_rates, last_updated, technical_infrastructure. Used for identifying jurisdictions with legacy immunization registry systems.
  7. CDC NNDSS (National Notifiable Diseases Surveillance System) - jurisdiction_name, disease_name, case_count, case_status, date_reported. Used for tracking disease case counts and reporting timeliness by jurisdiction.
  8. ASTHO Member Directory - state_health_agency_name, director_name, director_title, director_email. Used for contact information for State Health Officers and program managers.
  9. Company Internal Data - case_investigation_completion_time, implementation_duration, post_launch_coverage_improvement. Used for benchmarking outbreak detection speed and registry modernization ROI.