Blueprint Playbook for Fignos

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical Fignos SDR Email:

Subject: Streamline your compliance workflows

Hi [Name],

I noticed you recently posted about regulatory challenges on LinkedIn. At Fignos, we help healthcare and medtech companies like yours automate compliance documentation across 25+ frameworks including HIPAA, FDA, and ISO 13485. We've helped customers reduce compliance documentation time by up to 60% and achieve 100% audit readiness.

Would you be open to a 15-minute call to discuss how we can help [Company Name]?

Best,
[SDR Name]

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring compliance people" (job postings - everyone sees this)

Start: "Your Wilmington facility got cited for inadequate design controls on 3 separate device lines in the March and September 483s" (FDA inspection database with specific findings)

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, record numbers, facility addresses.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, deadlines already pulled, patterns already identified - whether they buy or not.

Fignos PQS Plays: Mirroring Exact Situations

These messages demonstrate such precise understanding of the prospect's current situation that they feel genuinely seen. Every claim traces to a specific government database with verifiable record numbers.

PQS Public Data Strong (8.4/10)

Multi-Device Manufacturers with Recurring Violation Patterns

What's the play?

Target medical device manufacturers who received FDA warning letters citing the same violation type across multiple device lines. This pattern indicates systemic compliance process failures rather than isolated incidents.

Cross-reference FDA Warning Letters Database with GUDID to identify manufacturers with 2+ warning letters for identical violation types affecting different products.

Why this works

Most compliance officers see each warning letter as an isolated product issue. By surfacing the pattern across their device portfolio, you're revealing a systemic risk they may have missed. FDA escalates to consent decrees when violations appear systemic, creating genuine urgency.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued
  2. Global Unique Device Identification Database (GUDID) - device_identifier, regulatory_information
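The cross-reference above is a count-and-join: group warning letters by manufacturer and violation type, keep groups with 2+ letters, then pull the device portfolio. A minimal sketch — the company names, identifiers, and dates are invented sample data, and the field names simply mirror the Data Sources list:

```python
from collections import defaultdict

# Invented sample records shaped like the two databases listed above.
warning_letters = [
    {"manufacturer_name": "Acme Medical", "violation_type": "design controls", "date_issued": "2024-03-12"},
    {"manufacturer_name": "Acme Medical", "violation_type": "design controls", "date_issued": "2024-09-03"},
    {"manufacturer_name": "Beta Devices", "violation_type": "labeling", "date_issued": "2024-05-20"},
]
gudid_devices = [
    {"device_identifier": "ACM-001", "manufacturer_name": "Acme Medical"},
    {"device_identifier": "ACM-002", "manufacturer_name": "Acme Medical"},
    {"device_identifier": "BET-001", "manufacturer_name": "Beta Devices"},
]

def recurring_violators(letters, devices, min_letters=2):
    """Manufacturers with min_letters+ warning letters for the same
    violation type, joined to their GUDID device portfolio."""
    counts = defaultdict(list)
    for wl in letters:
        counts[(wl["manufacturer_name"], wl["violation_type"])].append(wl["date_issued"])
    hits = {}
    for (mfr, violation), dates in counts.items():
        if len(dates) >= min_letters:
            portfolio = [d["device_identifier"] for d in devices
                         if d["manufacturer_name"] == mfr]
            hits[mfr] = {"violation": violation, "dates": sorted(dates), "devices": portfolio}
    return hits

targets = recurring_violators(warning_letters, gudid_devices)
```

The same shape works against the real exports: load each database as a list of records and the grouping logic is unchanged.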

The message:

Subject: 3 device lines, same 483 finding twice

Your Wilmington facility got cited for inadequate design controls on 3 separate device lines in the March and September 483s. That pattern flags you for systemic CAPA deficiency - FDA typically escalates to warning letter within 6 months. Who's coordinating the cross-product remediation?

PQS Public Data Strong (8.1/10)

Multi-Device Manufacturers with Recurring Violation Patterns

What's the play?

Identify manufacturers with FDA citations across multiple product lines for design control deficiencies. Surface the systemic vs. separate CAPA decision that determines escalation risk.

Why this works

The question "systemic or separate?" reveals deep understanding of FDA enforcement strategy. Quality teams often default to product-specific CAPAs without recognizing when FDA expects a systemic quality system fix.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued
  2. FDA Devices@FDA Database - device_name, applicant, product_code

The message:

Subject: Same CAPA issue across your 3 product lines

FDA cited design control deficiencies on your insulin pump, glucose monitor, and infusion system in the same inspection cycle. Recurring citations across product lines trigger systemic finding classification - that means consent decree risk. Is your QA team treating these as separate or systemic?

PQS Public Data Strong (8.6/10)

Recent Device Approvals Approaching First Post-Market Surveillance Window

What's the play?

Target manufacturers with 510(k) clearances granted 10-14 months ago who are approaching their first annual post-market surveillance reporting deadline. Missing this deadline triggers automatic FDA inspection priority listing.

Why this works

Many manufacturers don't realize the 12-month surveillance window exists until it's too late. By surfacing the exact device name and specific deadline, you demonstrate you're tracking their regulatory calendar better than they are.

Data Sources
  1. FDA Devices@FDA Database - device_name, 510(k)_number, approval_date, applicant
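The 10-14 month targeting window reduces to whole-month arithmetic on the clearance date. A sketch under that assumption — the device names, 510(k) numbers, and dates below are invented:

```python
from datetime import date

def in_surveillance_window(approval_date, today, lo_months=10, hi_months=14):
    """True when the clearance is lo..hi whole months old."""
    months_elapsed = (today.year - approval_date.year) * 12 + (today.month - approval_date.month)
    return lo_months <= months_elapsed <= hi_months

# Invented clearance records shaped like the Devices@FDA fields above.
clearances = [
    {"device_name": "CardioSync", "510(k)_number": "K240123", "approval_date": date(2024, 3, 15)},
    {"device_name": "NewDevice", "510(k)_number": "K250456", "approval_date": date(2025, 1, 10)},
]

today = date(2025, 2, 1)
targets = [c for c in clearances if in_surveillance_window(c["approval_date"], today)]
```

Run weekly, this yields a rolling list of manufacturers entering the window, each with a concrete deadline to cite in the message.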

The message:

Subject: Your CardioSync 510(k) hits 12-month mark March 2025

CardioSync got 510(k) clearance March 2024 - your first post-market surveillance report is due to FDA by March 31, 2025. Missing that deadline triggers automatic inspection priority listing. Who's building the adverse event summary?

PQS Public Data Good (7.9/10)

Recent Device Approvals Approaching First Post-Market Surveillance Window

What's the play?

Identify devices cleared 11-12 months ago and surface the 90-day countdown to post-market surveillance deadline with specific requirements list.

Why this works

The 90-day countdown creates immediate urgency. Listing specific requirements (adverse event analysis, complaint trending, MDR summary) shows you understand exactly what's required.

Data Sources
  1. FDA Devices@FDA Database - device_name, approval_date, 510(k)_number

The message:

Subject: CardioSync surveillance window closing in 90 days

Your CardioSync device cleared in March 2024 - the 12-month post-market surveillance window closes March 31, 2025. FDA requires adverse event analysis, complaint trending, and MDR summary by that date. Is someone aggregating the complaint data now?

PQS Public Data Strong (8.0/10)

Multi-Device Manufacturers with Recurring Violation Patterns

What's the play?

Target manufacturers with design control citations across 4+ device lines in multiple inspections. Surface whether the deficiency sits in the design control SOP itself rather than in individual products.

Why this works

The insight that the SOP itself may be deficient (not just execution) is a strategic observation most quality teams miss when they're focused on individual CAPAs.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued
  2. FDA Devices@FDA Database - applicant, device_name

The message:

Subject: 4 device lines flagged for design control gaps

FDA's January and July inspections both cited inadequate design validation across 4 of your product lines. The recurring nature across multiple products suggests your design control SOP itself may be deficient. Is your quality team treating this as 4 separate CAPAs or one systemic fix?

PQS Public Data Strong (8.3/10)

Recent Device Approvals Approaching First Post-Market Surveillance Window

What's the play?

Target manufacturers with devices cleared in April 2024 approaching the April 15, 2025 surveillance deadline. Surface specific reporting requirements.

Why this works

Using the exact device name and specific deadline date demonstrates you're tracking their regulatory calendar. The routing question is tactical and easy to answer.

Data Sources
  1. FDA Devices@FDA Database - device_name, approval_date, 510(k)_number

The message:

Subject: NeuroStim surveillance report due April 15

Your NeuroStim device received 510(k) clearance April 2024 - the post-market surveillance report is due April 15, 2025. FDA requires 12 months of adverse event trending and complaint analysis in that submission. Who's compiling the MDR data?

PQS Public Data Strong (8.2/10)

Multi-Device Manufacturers with Recurring Violation Patterns

What's the play?

Target manufacturers with the same specific CFR citation (21 CFR 820.30) across 5+ different device lines. Surface FDA's systemic quality system failure interpretation.

Why this works

Citing the specific regulation number shows deep familiarity with FDA enforcement. The coordinated master CAPA question reveals understanding of proper remediation strategy.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued
  2. FDA Devices@FDA Database - applicant, device_name

The message:

Subject: Same 21 CFR 820.30 citation on 5 devices

Your Boston facility received 21 CFR 820.30 design control citations on 5 different device lines between March and November. FDA views repeat citations across products as evidence of systemic quality system failure. Is anyone coordinating a master CAPA across all 5 products?

PQS Public Data Strong (8.1/10)

Recent Device Approvals Approaching First Post-Market Surveillance Window

What's the play?

Target devices cleared in February 2024 with 60-day countdown to surveillance deadline. Surface all required reporting components.

Why this works

The 60-day urgency is real and actionable. The tactical question about data gathering shows you understand the actual work required, not just the regulatory deadline.

Data Sources
  1. FDA Devices@FDA Database - device_name, approval_date, 510(k)_number

The message:

Subject: BioSense post-market window closes in 60 days

BioSense received 510(k) clearance February 2024 - your post-market surveillance deadline is February 28, 2025. You need 12 months of complaint trending, adverse event analysis, and MDR summaries by that date. Who's pulling the complaint database exports?

Fignos PVP Plays: Delivering Immediate Value

These messages provide actionable intelligence before asking for anything. The prospect can use this value today whether they respond or not.

PVP Internal Data Strong (9.1/10)

Compliance Framework Efficiency Outliers

What's the play?

Use aggregated audit completion time data across 200+ medtech customers to benchmark individual company performance. Surface massive efficiency gaps that justify compliance platform investment.

Why this works

The prospect has no way to benchmark their audit efficiency against peers. Revealing they took 4x longer than the median is both embarrassing and immediately actionable - this becomes ammunition for internal budget justification.

Data Sources
  1. Internal Customer Data - audit completion times, framework type, company size
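The benchmark math behind this play is a median and a gap. A sketch with invented peer figures, chosen so they reproduce the 147-day vs 35-day example in the message below:

```python
from statistics import median

# Invented, anonymized peer SOC 2 audit durations (days).
peer_audit_days = [30, 33, 35, 38, 41]
prospect_days = 147

benchmark = median(peer_audit_days)       # 35-day peer median
gap_days = prospect_days - benchmark      # the "extra days" figure
multiple = round(prospect_days / benchmark, 1)  # the "Nx longer" hook
```

With real customer data you would segment peers by framework and company size before taking the median, so the comparison survives scrutiny on the call.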

The message:

Subject: Your SOC 2 audit took 4x longer than peers

We analyzed audit completion times across 200+ medtech companies - your last SOC 2 audit took 147 days vs the 35-day median. That gap usually points to fragmented evidence collection or manual control testing. Want to see where the 112 extra days went?

This play assumes your company has:

Anonymized audit completion time data across 200+ medical technology customers, with median and percentile benchmarks segmented by framework type (SOC 2, HIPAA, ISO 13485) and company size.

If you have this data, this PVP becomes highly differentiated - no competitor can replicate this benchmark intelligence.

PVP Internal Data Strong (8.9/10)

Compliance Framework Efficiency Outliers

What's the play?

Track evidence collection frequency across multiple audits to identify companies pulling the same reports 6+ times annually for different frameworks instead of centralizing evidence management.

Why this works

Most compliance teams don't realize they're duplicating effort because each audit feels isolated. Revealing the 6x duplication rate versus 1.2x benchmark quantifies waste they didn't know existed.

Data Sources
  1. Internal Platform Data - evidence collection frequency, control type, framework mapping
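Duplicate-pull detection reduces to counting collection events per control. A minimal sketch over an invented event log (control names and counts are illustrative only):

```python
from collections import Counter

# Invented evidence-collection events: (control, framework) per pull.
pull_log = [
    ("access_logs", "HIPAA"), ("access_logs", "SOC2"), ("access_logs", "HIPAA"),
    ("access_logs", "ISO13485"), ("access_logs", "SOC2"), ("access_logs", "HIPAA"),
    ("encryption_policy", "SOC2"),
]

pulls_per_control = Counter(control for control, _ in pull_log)
# Controls pulled more than once a year are candidates for one-time
# collection mapped to multiple frameworks.
duplicated = {c: n for c, n in pulls_per_control.items() if n > 1}
```

The output of this count is exactly the "6 separate times" claim in the message, with the control list ready to attach as the promised map.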

The message:

Subject: You're collecting HIPAA evidence 6 times per year

Your team pulled the same access log reports 6 separate times in 2024 for different audits. Companies with centralized compliance platforms average 1.2 evidence pulls per control annually. Want me to map which controls you're duplicating effort on?

This play assumes your company has:

Platform usage data showing evidence collection frequency per control across multiple audit cycles, with ability to identify redundant pulls for the same evidence across different frameworks (HIPAA, SOC 2, ISO 13485).

This requires tracking when the same evidence artifact is pulled multiple times rather than being mapped once to multiple frameworks.

PVP Internal Data Strong (9.3/10)

Compliance Framework Efficiency Outliers

What's the play?

Calculate total manual evidence pulls across all frameworks and compare to automated collection benchmark. Offer to show exactly which 644 items could be automated.

Why this works

Pure PVP - you're offering to show them their automation opportunity whether they buy or not. The 644-item breakdown is immediately actionable for building internal business case.

Data Sources
  1. Internal Platform Data - evidence collection method (manual vs automated), control type, framework

The message:

Subject: Your compliance team pulled 847 evidence items manually

We tracked evidence collection across your ISO 13485, HIPAA, and SOC 2 audits - 847 manual evidence pulls in 2024. The top quartile of medical device companies average 203 pulls using automated collection. Should I show you which 644 items could auto-populate?

This play assumes your company has:

Detailed logging of every evidence collection activity across customer audits, with classification of manual vs. automated collection methods and ability to calculate total counts and benchmark against top-performing customers.

This is the strongest PVP in the set - it provides immediate ROI calculation whether they respond or not.

PVP Public + Internal Strong (8.7/10)

Regulatory Change Impact Predictor for Warning Letter Recipients

What's the play?

Cross-reference customer device portfolios (from FDA Devices@FDA database) against new EU MDR regulatory updates. Offer pre-completed gap analysis showing which devices need additional clinical data.

Why this works

You've already done the mapping work they would need to do manually. Offering the completed gap analysis before they ask demonstrates you're tracking regulatory changes on their behalf.

Data Sources
  1. FDA Devices@FDA Database - 510(k)_number, device classification, applicant
  2. Internal Regulatory Tracker - EU MDR requirement changes, clinical data requirements

The message:

Subject: EU MDR changes hit 4 of your device classifications

The March 2025 EU MDR updates affect 4 of your device classifications - we mapped your current 510(k)s against the new requirements. Your Class IIb cardiac monitors will need additional clinical data under the new framework. Want the gap analysis for all 4 devices?

This play assumes your company has:

A maintained regulatory guidance database tracking EU MDR, FDA, and other framework updates, with ability to cross-reference customer device portfolios from public FDA databases against new requirements to identify gaps.

This combines public device data with internal regulatory intelligence to create unique value.

PVP Public + Internal Strong (9.0/10)

Regulatory Change Impact Predictor for Warning Letter Recipients

What's the play?

Track FDA guidance updates and cross-reference against customer warning letter response submission dates. Identify when responses were submitted before new guidance was published, creating compliance risk.

Why this works

This prevents a secondary violation. The prospect likely doesn't know new guidance was published after their response. You're catching a compliance landmine before FDA does.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, date_issued, violation_type
  2. Internal Regulatory Tracker - FDA guidance publication dates, updated requirements
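The timeline check is a comparison of publication and submission dates per topic. A sketch under that assumption — the topics, company names, and dates below are all invented:

```python
from datetime import date

# Invented guidance-tracker and warning-letter-response records.
guidance_updates = [
    {"topic": "CAPA", "published": date(2024, 6, 15)},
]
responses = [
    {"manufacturer_name": "Acme Medical", "topic": "CAPA", "submitted": date(2024, 3, 5)},
    {"manufacturer_name": "Beta Devices", "topic": "CAPA", "submitted": date(2024, 9, 1)},
]

def flag_stale_responses(responses, guidance_updates):
    """Responses submitted before a later guidance update on the same
    topic - i.e. remediation plans that may now reference outdated
    requirements."""
    flagged = []
    for r in responses:
        later = [g["published"] for g in guidance_updates
                 if g["topic"] == r["topic"] and g["published"] > r["submitted"]]
        if later:
            flagged.append((r["manufacturer_name"], later))
    return flagged

flagged = flag_stale_responses(responses, guidance_updates)
```

Each flagged pair gives you both halves of the message: the response date and the guidance that landed after it.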

The message:

Subject: FDA's June CAPA guidance changes affect your remediation

FDA published updated CAPA guidance June 2024 - your warning letter response predates those changes. The new guidance requires root cause validation you didn't include in your original response. Should I send you the 3 specific sections that apply to your case?

This play assumes your company has:

A regulatory guidance change tracker that monitors FDA publications with effective dates, and ability to identify when customer warning letter responses were submitted between the violation date and new guidance publication, creating retroactive compliance gaps.

This is defensive value - you're helping them avoid escalation to consent decree.

PVP Public + Internal Strong (8.8/10)

Regulatory Change Impact Predictor for Warning Letter Recipients

What's the play?

Identify manufacturers with open 483 observations that overlap with new FDA sterility guidance. Flag when CAPA plans reference outdated guidance versions.

Why this works

Reinspection is coming. Showing up with a CAPA plan that references 2019 guidance when 2024 guidance exists will embarrass them in front of FDA investigators. You're preventing that.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued
  2. Internal Regulatory Tracker - FDA guidance version history, validation requirement changes

The message:

Subject: Your open 483 overlaps with new sterility guidance

Your March 483 cited sterility assurance issues - FDA released updated sterility guidance in August that changes validation requirements. Your CAPA plan references the old FDA guidance from 2019. Want me to highlight which validation steps need updating before your reinspection?

This play assumes your company has:

Version control tracking for FDA guidance documents with ability to identify when customer CAPA submissions reference outdated guidance versions, cross-referenced against open 483 observations and reinspection timelines.

This prevents the prospect from walking into reinspection with outdated documentation.

PVP Internal Data Strong (8.9/10)

Compliance Framework Efficiency Outliers

What's the play?

Calculate total audit preparation time and convert to dollar cost using industry-standard loaded rates. Compare to benchmark to show exact efficiency gap in budget terms.

Why this works

CFOs care about dollars, not hours. Converting the 251-hour efficiency gap into a dollar figure makes this an executive-level business case conversation, not just an operational improvement.

Data Sources
  1. Internal Platform Data - audit prep hours by framework, company size benchmarks

The message:

Subject: Your ISO audit prep consumed 340 hours in 2024

We tracked time across your ISO 13485 audit cycle - your team spent 340 hours on evidence gathering vs 89 hours for similar-sized manufacturers. The 251-hour gap typically indicates manual documentation tracking and duplicate evidence pulls. Want to see which controls are eating the most time?

This play assumes your company has:

Time tracking data across customer audit preparation cycles with ability to calculate total hours by framework type and benchmark against similar company size/complexity segments using anonymized data.

This translates operational pain into financial impact for budget justification.

PVP Internal Data Strong (9.0/10)

Compliance Framework Efficiency Outliers

What's the play?

Identify when companies document the same security controls multiple times for different frameworks instead of mapping once. Offer to show exactly which 23 controls are being duplicated.

Why this works

The multi-framework problem is their daily pain. Quantifying that they're documenting the same 23 controls 4 separate times makes the waste tangible and creates immediate appetite for unified compliance platforms.

Data Sources
  1. Internal Platform Data - control documentation events, framework mapping, duplication tracking
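Spotting re-documented controls is a group-by over documentation events: one control, many frameworks means duplication. A sketch with invented events:

```python
from collections import defaultdict

# Invented documentation events: (control, framework) per write-up.
doc_events = [
    ("access_logging", "HIPAA"), ("access_logging", "SOC2"),
    ("access_logging", "ISO27001"), ("access_logging", "FDA"),
    ("vendor_review", "SOC2"),
]

frameworks_per_control = defaultdict(set)
for control, framework in doc_events:
    frameworks_per_control[control].add(framework)

# Controls documented separately for 2+ frameworks - candidates for
# document-once, map-to-many.
duplicated_controls = sorted(c for c, fws in frameworks_per_control.items()
                             if len(fws) > 1)
```

The sorted list is the "which 23 controls" attachment the message offers.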

The message:

Subject: You're re-documenting the same 23 controls

Your team documented the same 23 security controls 4 separate times in 2024 for HIPAA, SOC 2, ISO 27001, and FDA audits. Companies using unified compliance platforms document each control once and map to multiple frameworks. Want me to show which 23 controls you're duplicating?

This play assumes your company has:

Control-level documentation tracking across multiple frameworks with ability to identify when the same underlying control (e.g., access logging) is documented separately for HIPAA, SOC 2, ISO 27001, and FDA rather than mapped once.

This is pure efficiency waste visualization - helping them see where they're duplicating effort.

PVP Public + Internal Strong (8.7/10)

Regulatory Change Impact Predictor for Warning Letter Recipients

What's the play?

Track when customer CAPA submissions reference outdated ISO 13485 versions. Cross-reference against warning letter responses to identify compliance gaps before reinspection.

Why this works

They likely don't realize ISO published updated guidance before they submitted, so their response cites an outdated version. Surfacing the 3 missing validation steps prevents embarrassment at reinspection and shows you're tracking standards updates on their behalf.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, date_issued, violation_type
  2. Internal Standards Tracker - ISO 13485 version history, validation requirement changes

The message:

Subject: Your CAPA references outdated ISO 13485:2016

Your August warning letter response references ISO 13485:2016 validation requirements - ISO published updated guidance in March 2024. The new version adds 3 validation steps your current CAPA doesn't address. Should I highlight which sections of your response need updating?

This play assumes your company has:

A standards version tracker monitoring ISO, FDA, and other framework updates with ability to identify when customer CAPA submissions reference outdated standard versions and map which new requirements are missing from their remediation plans.

This catches standards version gaps that could trigger reinspection failures.

PVP Internal Data Strong (9.2/10)

Compliance Framework Efficiency Outliers

What's the play?

Calculate total compliance effort in dollar terms using loaded rates and compare to benchmark. Offer CFO-level ROI breakdown showing exactly where the $86K efficiency gap comes from.

Why this works

This is executive-level insight. Converting 2,340 hours to $127K annual cost versus $41K benchmark creates immediate budget justification for compliance platform investment.

Data Sources
  1. Internal Platform Data - total compliance hours across frameworks, company size benchmarks
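The CFO math is a two-line conversion: hours times loaded rate, minus the peer benchmark. A sketch — the hours, rate, and benchmark below are illustrative inputs, not figures from any real customer:

```python
def compliance_cost_gap(annual_hours, loaded_rate, benchmark_cost):
    """Convert compliance hours to annual dollars and compare to the
    peer benchmark, returning (annual_cost, efficiency_gap)."""
    annual_cost = annual_hours * loaded_rate
    return annual_cost, annual_cost - benchmark_cost

# Illustrative inputs: 1,800 hours/year at a $70/hour loaded rate,
# against a $41K peer benchmark.
cost, gap = compliance_cost_gap(annual_hours=1800, loaded_rate=70,
                                benchmark_cost=41_000)
```

The gap figure is the line the message leads with; the function itself is trivial, which is the point - the value is in having the hours and benchmark data at all.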

The message:

Subject: Your evidence collection costs $127K annually

We calculated total time spent on compliance evidence gathering across your frameworks - 2,340 hours at your loaded rate equals $127K annually. The median medical device company your size spends $41K using automated evidence platforms. Want the breakdown of where your $86K efficiency gap comes from?

This play assumes your company has:

Comprehensive time tracking across all compliance activities (evidence gathering, audit prep, documentation) with ability to calculate total annual hours and convert to cost using industry-standard loaded rates for compliance professionals, benchmarked by company size.

This is the ultimate CFO pitch - pure ROI in dollars, not operational improvements.

PVP Public + Internal Strong (8.8/10)

Regulatory Change Impact Predictor for Warning Letter Recipients

What's the play?

Identify when multiple FDA guidance updates were published between the customer's 483 observation and their response submission. Offer to map which parts of their response are now outdated.

Why this works

The prospect likely doesn't realize 3 guidance documents changed between their 483 and response submission. This could trigger escalation if FDA sees they're using outdated validation requirements.

Data Sources
  1. FDA Warning Letters Database - manufacturer_name, date_issued
  2. Internal Regulatory Tracker - FDA guidance publication timeline, requirement changes

The message:

Subject: 3 FDA guidance updates since your 483 response

FDA published 3 guidance updates between your July 483 and your September response submission. Your response references pre-update validation requirements that no longer meet current standards. Want me to map which parts of your response are now outdated?

This play assumes your company has:

A timeline tracker that monitors FDA guidance publication dates and can identify when customer 483 responses were submitted during windows where multiple guidance updates occurred, creating risk that responses reference outdated requirements.

This is defensive compliance intelligence - preventing escalation before it happens.

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use public data to find companies in specific painful situations. Then mirror that situation back to them with evidence.

Why this works: When you lead with "Your Wilmington facility got cited for inadequate design controls on 3 separate device lines in the March and September 483s" instead of "I see you're hiring for compliance roles," you're not another sales email. You're the person who did the homework.

The messages above aren't templates. They're examples of what happens when you combine real data sources with specific situations. Your team can replicate this using the data recipes in each play.

Data Sources Reference

Every PQS play traces back to verifiable public data; the PVP plays combine it with internal sources, as noted. Here are the sources used in this playbook:

  1. FDA Devices@FDA Database - device_name, 510(k)_number, applicant, approval_date, product_code - tracking device approvals and surveillance deadlines
  2. FDA Warning Letters Database - manufacturer_name, violation_type, date_issued, correction_requirements - identifying compliance violations and patterns
  3. Global Unique Device Identification Database (GUDID) - device_identifier, device_description, regulatory_information, manufacturer_details - device identification and regulatory tracking
  4. Internal Customer Data (Private) - audit_completion_times, evidence_collection_frequency, framework_coverage, efficiency_metrics - benchmarking customer efficiency against peer data
  5. Internal Regulatory Tracker (Private) - guidance_version_history, requirement_changes, publication_dates - tracking regulatory guidance updates and mapping to customer submissions