Blueprint Playbook for Viz.ai

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical Viz.ai SDR Email:

Subject: Accelerate Your Stroke Care Outcomes

Hi Dr. Smith,

I saw your hospital recently posted about expanding your neurology department - congrats! At Viz.ai, we help healthcare systems like yours accelerate time-critical diagnoses through AI-powered medical imaging analysis. We've helped 1,700+ hospitals improve door-to-intervention times and patient outcomes.

Our platform seamlessly integrates with your existing PACS/EHR systems and provides real-time specialist notifications for stroke, cardiac, and trauma cases.

Would you have 15 minutes next week to discuss how we could help your team improve diagnostic speed and clinical outcomes?

Best,
Sarah

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring radiology staff" (job postings - everyone sees this)

Start: "Your off-hours stroke cases average 89 minutes imaging-to-notification vs 42 minutes during day shift" (CMS quality data with actual performance metrics)
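
The off-hours delta in the "Start" example can be computed with the standard library alone. A minimal sketch, assuming case records of (imaging arrival, specialist notification) timestamps and an 11pm-7am off-hours window - the schema and the sample numbers are illustrative assumptions, not CMS's actual format:

```python
# Sketch: off-hours vs day-shift diagnostic delay gap from imaging timestamps.
# Field layout, shift boundaries, and values are hypothetical.
from datetime import datetime
from statistics import mean

# Hypothetical case records: (imaging arrival, specialist notification)
cases = [
    (datetime(2024, 3, 1, 10, 0), datetime(2024, 3, 1, 10, 40)),   # day shift
    (datetime(2024, 3, 1, 14, 0), datetime(2024, 3, 1, 14, 44)),   # day shift
    (datetime(2024, 3, 2, 1, 30), datetime(2024, 3, 2, 2, 58)),    # off-hours
    (datetime(2024, 3, 2, 23, 15), datetime(2024, 3, 3, 0, 45)),   # off-hours
]

def is_off_hours(ts: datetime) -> bool:
    """Treat 11pm-7am as off-hours (an assumed definition)."""
    return ts.hour >= 23 or ts.hour < 7

def delay_minutes(arrival: datetime, notified: datetime) -> float:
    return (notified - arrival).total_seconds() / 60

day = [delay_minutes(a, n) for a, n in cases if not is_off_hours(a)]
off = [delay_minutes(a, n) for a, n in cases if is_off_hours(a)]

gap = mean(off) - mean(day)
print(f"day shift: {mean(day):.0f} min, off-hours: {mean(off):.0f} min, gap: {gap:.0f} min")
```

With real CMS timestamp extracts the only change is how `cases` is loaded; the segmentation logic stays this simple.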

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, facility names, actual performance metrics.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, patterns already identified, benchmarks already calculated - whether they buy or not.

Viz.ai Overview

Company: Viz.ai

Core Problem: Healthcare providers struggle with delayed disease detection and diagnosis because medical imaging analysis is time-consuming and prone to human error, leaving patients waiting for treatment and increasing clinical workload. Viz.ai automates medical image analysis to accelerate diagnosis delivery and improve patient outcomes.

Product Type: B2B SaaS - Healthcare AI

Ideal Customer Profile

  • Multi-hospital health systems
  • Large integrated hospital networks
  • Hospitals with advanced neuroimaging departments
  • Healthcare systems with stroke centers, trauma centers, and cardiac programs
  • Enterprise-level healthcare organizations (1,700+ hospitals served; typically 200+ employees in clinical/operations roles)

Target Persona

Title: VP of Clinical Operations or Chief Medical Officer

Key Responsibilities:

  • Overseeing acute care service lines (stroke, cardiac, trauma, neurovascular)
  • Managing care coordination across multiple hospital sites
  • Improving diagnostic speed and clinical outcomes
  • Integration of AI-enabled workflows into existing systems

Key KPIs:

  • Door-to-intervention time (stroke cases: target <90 min)
  • Clinical detection and sensitivity rates
  • Inter-hospital transfer time reduction
  • Care team mobilization speed

Viz.ai Intelligence Plays

These messages demonstrate precise understanding of the prospect's situation (PQS) or deliver immediate actionable value (PVP). All insights trace back to verifiable data sources.

PVP Public + Internal Strong (9.1/10)

Site-by-Site Stroke Transfer Analysis

What's the play?

For multi-hospital integrated delivery networks, analyze transfer times between specific facility pairs to identify where care coordination breaks down most severely. Show them exactly which site combinations have the longest delays.

Why this works

Network-level analysis is sophisticated and immediately actionable. The variance between site pairs shows exactly where to prioritize process improvements. This helps clinical leaders fix system-level problems rather than guessing which facilities need attention first.

Data Sources
  1. CMS Hospital Compare Quality Data - facility affiliations, network structure
  2. Internal transfer timestamp data - origin-destination pairs, handoff times
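
A minimal sketch of the site-pair matrix this play describes, assuming transfer logs arrive as (origin, destination, minutes) records; facility names and times below are hypothetical:

```python
# Sketch: building an origin-destination transfer-time matrix across a
# hospital network. Records are hypothetical stand-ins for aggregated
# transfer logs joined with CMS affiliation data.
from collections import defaultdict
from statistics import mean

# (origin facility, destination facility, transfer minutes)
transfers = [
    ("Memorial", "Regional", 150),
    ("Memorial", "Regional", 134),
    ("Eastside", "Regional", 92),
    ("Eastside", "Regional", 86),
]

by_pair = defaultdict(list)
for origin, dest, minutes in transfers:
    by_pair[(origin, dest)].append(minutes)

matrix = {pair: mean(times) for pair, times in by_pair.items()}
worst = max(matrix, key=matrix.get)
variance = max(matrix.values()) - min(matrix.values())

print(f"slowest pair: {worst} at {matrix[worst]:.0f} min ({variance:.0f} min above fastest)")
```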

The message:

Subject: Your site-by-site stroke transfer analysis

Mapped transfers across your 3-hospital network and found Memorial-to-Regional averages 142 minutes vs Eastside-to-Regional at 89 minutes. That 53-minute variance between site pairs shows where coordination breaks down most severely. Want the complete matrix showing which facility combinations have longest delays?
DATA REQUIREMENT

This play requires aggregated inter-facility transfer data showing origin-destination pairs, timestamps, and handoff stages across hospital networks.

Combined with public CMS network affiliation data, this synthesis is unique to your operational insights.
PVP Public + Internal Strong (8.9/10)

Off-Hours Diagnostic Delay Gap Analysis

What's the play?

Show comprehensive stroke centers their hour-by-hour diagnostic performance data, revealing the specific time windows when off-hours cases experience the longest delays compared to day shift performance.

Why this works

Off-hours performance is a known blind spot for clinical operations leaders. The specificity of knowing exact time deltas and having hour-by-hour breakdowns makes this immediately actionable for staffing decisions and workflow improvements. The brain tissue urgency resonates clinically.

Data Sources
  1. CMS Hospital Outpatient Quality Reporting (OQR) Program Data - ED efficiency measures, imaging process measures
  2. Stroke Center Certification Database - certification level verification
  3. Internal performance benchmarks - aggregated off-hours vs daytime diagnostic speed data

The message:

Subject: Your 11pm-7am stroke cases - 47min delay gap

I pulled your imaging timestamps and found off-hours cases average 89 minutes vs 42 minutes during day shifts. That's a 47-minute gap when neurologist availability is lowest and every minute costs brain tissue. Want the hour-by-hour breakdown showing exactly when delays spike?
DATA REQUIREMENT

This play requires aggregated before/after implementation data showing off-hours vs daytime diagnostic speed improvements, stratified by stroke center certification level and region, from hospital customers processing stroke imaging cases 24/7.

Combined with public CMS quality data, this synthesis is proprietary to your operational insights.
PQS Public Data Strong (8.9/10)

Certification Renewal with Declining Imaging Efficiency

What's the play?

Target primary and comprehensive stroke centers with certifications expiring within 12 months that show declining imaging efficiency metrics. The combination of certification deadline plus performance decline creates urgency.

Why this works

Certification renewal is a high-stakes deadline with real consequences. Citing the 22% slowdown with exact minute increases shows you've tracked their performance trend over time. This creates immediate urgency without being pushy - it's just data.

Data Sources
  1. Stroke Center Certification Database (Joint Commission) - certification dates, levels
  2. CMS Outpatient Imaging Efficiency Data - imaging efficiency metrics over time
  3. CMS Hospital Compare Quality Data - door-to-imaging time trends
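
One way to sketch this play's qualification filter, assuming certification dates and imaging trends are already joined per facility; the benchmark value, field names, and facilities below are hypothetical:

```python
# Sketch: flag stroke centers whose certification renews within 12 months
# AND whose door-to-imaging time trends above benchmark. All data and the
# benchmark constant are illustrative assumptions.
from datetime import date

BENCHMARK_MIN = 30          # assumed door-to-imaging target for the example
TODAY = date(2024, 11, 1)   # fixed "today" so the example is reproducible

facilities = [
    {"name": "St. Luke's", "cert_renewal": date(2025, 3, 1),
     "imaging_12mo_ago": 31, "imaging_now": 38},
    {"name": "County General", "cert_renewal": date(2026, 8, 1),
     "imaging_12mo_ago": 28, "imaging_now": 27},
]

def months_until(d: date) -> int:
    return (d.year - TODAY.year) * 12 + (d.month - TODAY.month)

flagged = []
for f in facilities:
    pct_change = (f["imaging_now"] - f["imaging_12mo_ago"]) / f["imaging_12mo_ago"]
    if months_until(f["cert_renewal"]) <= 12 and f["imaging_now"] > BENCHMARK_MIN:
        flagged.append((f["name"], int(pct_change * 100)))  # 7/31 ≈ 22%

print(flagged)
```

The 22% figure in the message is exactly this arithmetic: (38 − 31) / 31, truncated to a whole percent.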

The message:

Subject: Your stroke certification renews in 4 months - imaging slowed 22%

Your comprehensive stroke center certification renews March 2025 and door-to-imaging time increased from 31 to 38 minutes over the past 12 months. That 22% slowdown puts you above the 30-minute benchmark required for recertification. Is someone already prepping the quality metrics for Joint Commission?
PVP Public Data Strong (8.8/10)

Quarterly Imaging Trends Before Recertification

What's the play?

Deliver quarter-by-quarter performance analysis showing the consistent decline pattern in imaging efficiency, helping stroke centers identify which specific process steps degraded over time before their certification audit.

Why this works

The quarterly trend shows a process degradation pattern that Joint Commission auditors will flag. Offering the breakdown of which specific steps slowed down helps them address root causes before the audit, creating clear preparation value.

Data Sources
  1. Stroke Center Certification Database - certification renewal dates
  2. CMS Outpatient Imaging Efficiency Data - quarterly imaging metrics
  3. CMS Hospital Compare Quality Data - quality measure trends

The message:

Subject: Your quarterly imaging trends before March recert

Your stroke center recertifies March 2025 and Q1 2024 imaging averaged 31 minutes, Q2 33 minutes, Q3 36 minutes, Q4 38 minutes. That consistent quarterly decline shows a process degradation pattern Joint Commission will flag during audit. Want the quarter-by-quarter analysis showing which process steps slowed down most?
PVP Public + Internal Strong (8.8/10)

Inter-Hospital Transfer Time Reduction Analysis

What's the play?

For multi-hospital networks, deliver facility-by-facility transfer time analysis showing which site pairs have the longest coordination delays, helping them prioritize which routes need process improvements first.

Why this works

Multi-site network analysis is valuable and sophisticated. Identifying exactly where coordination breaks down (during handoff between community hospital detection and comprehensive center intervention) helps fix system-level problems. Clear value even if they don't buy.

Data Sources
  1. CMS Hospital Compare Quality Data - network affiliations, facility types
  2. Internal transfer coordination data - origin-destination timestamps

The message:

Subject: Your inter-hospital transfers lose 58 minutes

Mapped your 3-site network and found stroke transfers average 118 minutes vs 60-minute target coordination time. That 58-minute gap happens during the handoff between community hospital detection and comprehensive center intervention. Want the facility-by-facility breakdown showing which site pairs have the longest delays?
DATA REQUIREMENT

This play requires aggregated inter-facility transfer data showing origin-destination pairs, timestamps, and handoff stages across hospital networks.

Combined with public network affiliation data, this network-level synthesis is unique to your operational insights.
PQS Public + Internal Strong (8.7/10)

Off-Hours Diagnostic Delay Gap

What's the play?

Target comprehensive stroke centers with significant off-hours vs daytime diagnostic speed gaps. Use CMS quality data combined with benchmarking to show them they're missing the 60-minute target window when specialist availability drops overnight.

Why this works

The 47-minute delta is extremely specific and concerning. Off-hours blind spots are real - clinical leaders don't track this well. The easy routing question makes it frictionless to respond. Credibility comes from clearly analyzing their actual timestamps.

Data Sources
  1. CMS Hospital Outpatient Quality Reporting (OQR) Program Data - imaging efficiency metrics
  2. Stroke Center Certification Database - certification level
  3. Internal performance benchmarks - aggregated off-hours performance data

The message:

Subject: Your off-hours stroke imaging takes 47 minutes longer

Between 11pm-7am, your stroke imaging-to-notification averages 89 minutes vs 42 minutes during day shift. That's missing the 60-minute target window when specialist availability drops overnight. Who's tracking your off-hours diagnostic delays?
DATA REQUIREMENT

This play requires aggregated hospital imaging timestamp data segmented by time-of-day, compared against their own daytime performance baseline and peer benchmarks.

Combined with public CMS quality data, this time-segmented analysis is unique to your operational insights.
PVP Public Data Strong (8.7/10)

Month-by-Month Performance Decline Analysis

What's the play?

Deliver 12-month door-to-imaging performance trend showing exactly when performance started declining, helping stroke centers identify root causes before their Joint Commission recertification audit.

Why this works

The urgent timeline creates real pressure. Month-by-month trend data is actionable - it helps them identify exactly when processes broke down so they can address root causes. Clear audit prep value with low commitment ask.

Data Sources
  1. Stroke Center Certification Database - certification renewal dates
  2. CMS Outpatient Imaging Efficiency Data - monthly imaging metrics
  3. CMS Hospital Compare Quality Data - quality trends over time

The message:

Subject: March 2025 recert prep - your 12-month imaging trend

Your comprehensive stroke certification renews March 2025 and I pulled 12 months of door-to-imaging performance showing 31 to 38 minute drift. That trend puts you at risk for the Joint Commission 30-minute benchmark during renewal audit. Want the month-by-month breakdown showing exactly when performance started declining?
PVP Public Data Strong (8.6/10)

Day-by-Day Trauma Volume Patterns

What's the play?

Deliver detailed trauma imaging volume analysis showing Saturday-Sunday peak patterns compared to weekday averages, helping trauma centers understand exactly when bottlenecks occur and optimize staffing.

Why this works

The specific weekend volume analysis with staffing correlation is actionable. Offering a detailed breakdown they can act on helps build the business case for weekend coverage adjustments. This is immediately useful operational intelligence.

Data Sources
  1. National Trauma Data Bank (NTDB) - imaging procedure volumes by day
  2. CMS Outpatient Imaging Efficiency Data - imaging efficiency metrics
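
A sketch of the weekend-vs-weekday comparison, assuming daily peak pending-study counts have already been extracted; the counts below are hypothetical stand-ins for NTDB volume data:

```python
# Sketch: weekend trauma imaging backlog vs weekday baseline.
# One illustrative week of (day, peak pending studies) records.
from statistics import mean

daily_peaks = [
    ("Mon", 10), ("Tue", 12), ("Wed", 11), ("Thu", 11), ("Fri", 11),
    ("Sat", 23), ("Sun", 21),
]

WEEKEND = {"Sat", "Sun"}
weekday = [n for day, n in daily_peaks if day not in WEEKEND]
weekend = [n for day, n in daily_peaks if day in WEEKEND]

weekday_avg = mean(weekday)
weekend_peak = max(weekend)
backlog = weekend_peak - weekday_avg

print(f"weekend peak {weekend_peak} vs weekday avg {weekday_avg:.0f}: +{backlog:.0f} studies")
```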

The message:

Subject: Your weekend trauma imaging - 12 study backlog

Analyzed your trauma volumes and found Saturday-Sunday peaks at 23 pending studies vs 11 weekday average. That weekend backlog correlates with 90-minute diagnostic delays when you're running half radiologist coverage. Want the day-by-day volume patterns showing exactly when bottlenecks occur?
PVP Public + Internal Strong (8.6/10)

Hourly Overnight Diagnostic Performance

What's the play?

Deliver hour-by-hour overnight stroke case performance data showing which specific hours have the worst diagnostic delays, helping comprehensive stroke centers target coverage gaps during peak fatigue periods.

Why this works

Extremely granular time analysis reveals surprising insights (early morning 2am-5am worst performance). The fatigue correlation is plausible and actionable. This helps target specific coverage gaps to improve patient outcomes.

Data Sources
  1. CMS Hospital Outpatient Quality Reporting (OQR) Program Data - imaging process measures
  2. Internal case-level performance data - hourly timestamps showing overnight performance variance
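
The hour-by-hour breakdown can be sketched the same way, assuming case-level records of (arrival hour, delay minutes); the sample delays are hypothetical:

```python
# Sketch: hour-by-hour overnight delay analysis to locate the worst
# performance window. Case records are illustrative stand-ins for
# case-level imaging timestamps.
from collections import defaultdict
from statistics import mean

# (hour of arrival, imaging-to-notification minutes)
cases = [(23, 80), (0, 84), (1, 88), (2, 100), (3, 104), (4, 105), (5, 90), (6, 85)]

by_hour = defaultdict(list)
for hour, minutes in cases:
    by_hour[hour].append(minutes)

hourly_avg = {h: mean(v) for h, v in by_hour.items()}
worst_window = sorted(hourly_avg, key=hourly_avg.get, reverse=True)[:3]

print(f"worst hours: {sorted(worst_window)}")  # the early-morning fatigue window
```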

The message:

Subject: Your overnight stroke cases - hour-by-hour delays

Pulled your imaging timestamps and found 2am-5am cases average 103 minutes vs 89 minute off-hours average and 42 minute day shift. Those early morning hours are your worst performance window when overnight radiologist fatigue peaks. Want the hourly breakdown showing exactly when overnight delays are most severe?
DATA REQUIREMENT

This play requires case-level imaging timestamps segmented by hour showing performance variance throughout overnight shift, from hospitals processing stroke imaging cases 24/7.

Combined with public CMS quality data, this hourly-level analysis is unique to your operational insights.
PQS Public Data Strong (8.6/10)

March 2025 Recertification with Performance Decline

What's the play?

Target stroke centers with certifications expiring in March 2025 that show door-to-imaging time increases year-over-year. The specific renewal date plus benchmark comparison creates urgency.

Why this works

Urgent timeline tied to actual renewal date. Specific metric decline is actionable. Benchmark comparison adds context without being pushy. Creates urgency through data, not sales pressure.

Data Sources
  1. Stroke Center Certification Database - certification renewal dates
  2. CMS Outpatient Imaging Efficiency Data - door-to-imaging time trends
  3. CMS Hospital Compare Quality Data - quality benchmarks

The message:

Subject: March 2025 recertification - your imaging time up 7 minutes

Your stroke center recertifies in March 2025 and door-to-imaging has drifted from 31 to 38 minutes year-over-year. Joint Commission's 30-minute benchmark now looks at risk with performance trending in the wrong direction. Who's leading your recertification quality prep?
PQS Public + Internal Strong (8.5/10)

Overnight Diagnostic Performance Gap

What's the play?

Target comprehensive stroke centers showing significant performance variance between overnight and day shift stroke case processing. Focus on facilities where overnight delays double their day shift averages.

Why this works

Time-of-day analysis is valuable and not commonly tracked. The doubling effect is alarming. Off-hours blind spot is real for most clinical operations teams. The easy yes/no question reduces friction.

Data Sources
  1. CMS Hospital Outpatient Quality Reporting (OQR) Program Data - imaging process measures
  2. Internal performance data - shift-time segmented diagnostic speed

The message:

Subject: 89-minute off-hours delays vs 42-minute day shift

Your overnight stroke cases average 89 minutes imaging-to-notification compared to 42 minutes during day shift - more than double, right when neurologist availability is most constrained and door-to-needle time is most critical. Is anyone tracking overnight diagnostic performance separately?
DATA REQUIREMENT

This play requires imaging timestamp data segmented by shift time showing performance variance between overnight and day operations.

Combined with public CMS quality data, this shift-segmented analysis is unique to your operational insights.
PQS Public Data Strong (8.4/10)

Trauma Centers with Weekend Imaging Backlogs

What's the play?

Target Level I/II trauma centers with high imaging case volumes that show weekend backlogs when radiologist staffing drops. Identify facilities where Saturday-Sunday pending studies peak significantly above weekday averages.

Why this works

Very specific about weekend problem. The 90-minute delay implication is serious for trauma outcomes. Staffing correlation shows deep understanding. Easy question to answer makes it frictionless.

Data Sources
  1. National Trauma Data Bank (NTDB) - imaging procedure volumes
  2. CMS Outpatient Imaging Efficiency Data - imaging efficiency metrics

The message:

Subject: Your trauma imaging queue hits 23 studies on weekends

Saturday-Sunday your trauma imaging queue peaks at 23 pending studies vs 11 on weekdays. That 12-study backlog adds 90+ minutes to critical diagnoses when radiologist staffing drops by half. Is someone tracking weekend imaging capacity gaps?
PQS Public + Internal Strong (8.4/10)

Multi-Hospital IDNs with Transfer Coordination Inefficiency

What's the play?

Target multi-hospital integrated delivery networks (3+ facilities) with above-benchmark inter-hospital transfer times. Focus on networks where stroke transfers between community hospitals and comprehensive centers exceed 60-minute coordination targets.

Why this works

Network transfer insight is specific and valuable. Handoff detail shows operational understanding. Transfer coordination is a known pain point for multi-site systems. Easy tracking question.

Data Sources
  1. CMS Hospital Compare Quality Data - network affiliations
  2. Internal transfer timestamp data - showing handoff stages between facilities

The message:

Subject: 58-minute gap in your inter-hospital transfers

Stroke transfers between your community hospitals and comprehensive center average 118 minutes vs 60-minute coordination target. That 58-minute delay happens during handoffs when community ED waits for comprehensive center bed assignment and transport coordination. Is anyone tracking transfer coordination times across facilities?
DATA REQUIREMENT

This play requires transfer timestamp data showing handoff stages between facilities in multi-hospital networks.

Combined with public CMS network affiliation data, this handoff-level analysis is unique to your operational insights.
PQS Public + Internal Strong (8.3/10)

Multi-Hospital Transfer Coordination Delays

What's the play?

Target multi-hospital networks (3+ facilities) with stroke transfers averaging significantly above 60-minute coordination benchmarks. Focus on networks where inter-facility transfers show consistent delays during patient handoffs.

Why this works

Specific to multi-site operations. The 58-minute gap is concerning for patient outcomes. Transfer coordination is a known pain point. Easy routing question.

Data Sources
  1. CMS Hospital Compare Quality Data - network structure
  2. Internal transfer data - origin-to-destination times across hospital networks

The message:

Subject: Your 3 hospitals average 118 minutes for inter-facility transfers

Across your 3 stroke-certified facilities, inter-hospital transfers average 118 minutes vs the 60-minute coordination target. That 58-minute delay means patients transferred from community hospitals miss optimal intervention windows. Who's managing care coordination between your sites?
DATA REQUIREMENT

This play requires transfer timestamp data across hospital networks showing origin-to-destination times.

Combined with public network affiliation data, this network-level synthesis is unique to your operational insights.
PQS Public Data Strong (8.3/10)

Saturday Trauma Imaging Backlogs

What's the play?

Target trauma centers where Saturday afternoon trauma imaging queues peak significantly above weekday averages, correlating with reduced radiologist coverage during high-volume periods.

Why this works

Saturday-specific insight is valuable. Staff count correlation adds credibility. Weekend coverage is known pain point. Easy routing question.

Data Sources
  1. National Trauma Data Bank (NTDB) - imaging procedure volumes by day
  2. CMS Outpatient Imaging Efficiency Data - imaging efficiency metrics

The message:

Subject: 23 pending trauma studies queue up Saturdays

Your Saturday trauma imaging queue peaks at 23 pending studies compared to 11 weekday average. That weekend backlog happens when radiologist coverage drops to 3 staff vs 6 weekday, creating 90-minute diagnosis delays. Who's managing weekend capacity planning?
PQS Public Data Strong (8.2/10)

Weekend Trauma Imaging Capacity Gaps

What's the play?

Target Level I/II trauma centers with documented weekend imaging backlogs that correlate with reduced radiologist staffing. Focus on facilities where Saturday pending studies peak at 2x weekday averages.

Why this works

Saturday-specific insight is actionable. Staff correlation shows understanding. Weekend coverage pain is real.

Data Sources
  1. National Trauma Data Bank (NTDB) - imaging volumes
  2. CMS Outpatient Imaging Efficiency Data - efficiency metrics

The message:

Subject: Your trauma backlog peaks at 23 studies Saturdays

Saturday afternoon your trauma imaging pending queue reaches 23 studies compared to 11 weekday peak. That 12-study weekend backlog correlates with 3 radiologist coverage vs 6 weekday staff during your highest volume period. Who's handling weekend staffing optimization?

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use public data to find hospitals in specific painful situations. Then mirror that situation back to them with evidence.

Why this works: When you lead with "Your off-hours stroke cases average 89 minutes vs 42 minutes day shift" instead of "I see you're expanding your neurology department," you're not another sales email. You're the person who did the homework.

The messages above aren't templates. They're examples of what happens when you combine real data sources with specific situations. Your team can replicate this using the data recipes in each play.

Data Sources Reference

Every play traces back to verifiable data. Here are the sources used in this playbook:

CMS Hospital Compare Quality Data
  • Key fields: facility_name, facility_id, quality_measures, imaging_efficiency_metrics, patient_safety_indicators
  • Used for: hospital quality benchmarking, network affiliations

CMS Outpatient Imaging Efficiency Data
  • Key fields: facility_name, imaging_measure_rates, imaging_efficiency_metrics
  • Used for: identifying diagnostic bottlenecks and efficiency gaps

CMS Hospital Outpatient Quality Reporting (OQR) Program Data
  • Key fields: ed_efficiency_measures, imaging_process_measures, ed_throughput_metrics
  • Used for: ED efficiency and imaging process performance tracking

Stroke Center Certification Database (Joint Commission)
  • Key fields: certification_level, certification_date, facility_name
  • Used for: certification renewal tracking and compliance risk identification

National Trauma Data Bank (NTDB)
  • Key fields: imaging_procedures_performed, time_to_diagnosis, facility_level
  • Used for: trauma center imaging volumes and diagnostic speed analysis

Viz.ai Internal Performance Data
  • Key fields: door_to_diagnosis_time, off_hours_performance_delta, care_coordination_metrics
  • Used for: proprietary benchmarks and before/after implementation analysis
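
These sources rarely share a common facility ID, so plays that combine them usually join on a normalized facility name. A minimal sketch of that join, with hypothetical records (real matching often needs additional keys like city or state):

```python
# Sketch: joining CMS quality data with Joint Commission certification data
# on a normalized facility-name key. Records and field names are hypothetical.
import re

cms = [{"facility_name": "St. Luke's Medical Center", "door_to_imaging_min": 38}]
certs = [{"facility_name": "ST LUKES MEDICAL CENTER", "certification_level": "Comprehensive"}]

def norm(name: str) -> str:
    """Lowercase and strip punctuation/whitespace to build a join key."""
    return re.sub(r"[^a-z0-9]", "", name.lower())

cert_by_key = {norm(c["facility_name"]): c for c in certs}

joined = []
for row in cms:
    match = cert_by_key.get(norm(row["facility_name"]))
    if match:
        joined.append({**row, "certification_level": match["certification_level"]})

print(joined)
```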