Blueprint Playbook for PayScale

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical PayScale SDR Email:

Subject: Competitive compensation insights for Stanford

Hi Sarah, I noticed Stanford recently posted several faculty hiring positions on LinkedIn. With talent competition heating up in higher education, ensuring competitive compensation is critical. PayScale helps leading universities like yours benchmark salaries, manage pay equity, and make confident compensation decisions at scale. Our real-time market data covers 20% of the U.S. workforce. Would you be open to a 15-minute call next week to discuss how we're helping institutions reduce compensation cycle time by 80%? Best, Alex

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring compliance people" (job postings - everyone sees this)

Start: "Your 2024 IPEDS submission shows associate professor salaries at $104K - that's $18K behind Stanford and $14K behind Berkeley" (government database with specific figures)

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, record numbers, dollar amounts.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, benchmarks already pulled, patterns already identified - whether they buy or not.

PayScale's Best Plays (Sorted by Quality Score)

These plays are ranked by quality score, not data source type. The highest-scoring messages appear first - whether they use public data, internal data, or a hybrid approach.

PVP Public + Internal Strong (9.3/10)

Assistant Professor Offer Acceptance Model

What's the play?

Combine IPEDS public salary data with internal offer acceptance tracking to identify the precise salary positioning threshold that maximizes hiring success. Show universities the exact dollar gap where acceptance rates collapse.

Why this works

The $8K threshold is a specific, actionable insight tied directly to hiring outcomes the VP of HR actually cares about. The 78% vs 41% acceptance rate creates an immediate business case for addressing compensation gaps. This connects abstract market data to real recruiting pain.

Data Sources
  1. IPEDS Human Resources Survey Data - faculty salaries by rank, institution, year
  2. Internal Offer Acceptance Data - acceptance/rejection by department, salary offered, market gap

The message:

Subject: Your assistant professor offer acceptance model

Analyzed 4 years of your IPEDS data plus offer acceptance patterns - when you're within $8K of market median, acceptance rate is 78%. Beyond $8K gap, it drops to 41%. You're currently $12K behind market in 6 critical hiring departments. Want the offer optimization model for your 2025 hiring cycle?

DATA REQUIREMENT

This play requires access to internal offer acceptance data by department over 4 years to correlate with market positioning from IPEDS.

This synthesis is unique to PayScale's dataset - competitors cannot replicate this analysis.
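
The analysis behind this message reduces to bucketing offers by their gap to market median and comparing acceptance rates. A minimal sketch, with entirely invented offer records and a hypothetical $8K threshold - this is not PayScale's actual schema or findings:

```python
# Sketch: acceptance rate within vs. beyond a salary-gap threshold.
# Each record is (salary_offered, market_median, accepted) - illustrative only.
offers = [
    (96_000, 100_000, True),  (98_000, 104_000, True),
    (95_000, 101_000, True),  (93_000, 100_000, False),
    (97_000,  99_000, True),  (94_000, 100_000, True),
    (90_000, 103_000, False), (88_000, 102_000, False),
    (85_000, 100_000, False), (89_000, 101_000, True),
]

THRESHOLD = 8_000  # candidate gap threshold to test

def acceptance_rate(records):
    # Booleans sum as 0/1, so this is accepted-count / total.
    return sum(accepted for _, _, accepted in records) / len(records)

within = [o for o in offers if o[1] - o[0] <= THRESHOLD]
beyond = [o for o in offers if o[1] - o[0] > THRESHOLD]

print(f"within ${THRESHOLD:,}: {acceptance_rate(within):.0%}")  # 83%
print(f"beyond ${THRESHOLD:,}: {acceptance_rate(beyond):.0%}")  # 25%
```

In practice you would sweep THRESHOLD across a range and pick the value where the two rates diverge most sharply - that divergence point is the "$8K" in the message.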

PVP Public + Internal Strong (9.1/10)

Faculty Retention Risk Model by Salary Gap

What's the play?

Cross-reference 3 years of IPEDS salary data with internal turnover tracking to identify departments where salary gaps predict 2x turnover rates. Deliver a retention risk model with specific department examples.

Why this works

This is genuinely insightful - it connects compensation gaps to actual turnover outcomes using multi-year data. The Computer Science example with exact numbers ($24K gap, 40% turnover) makes it concrete and believable. The $20K threshold and 2x pattern are non-obvious insights worth paying for.

Data Sources
  1. IPEDS Human Resources Survey Data - faculty salaries by department, 3-year history
  2. Internal Faculty Turnover Data - departures by department, reasons, tenure

The message:

Subject: Mapped your retention risk by salary gap

Analyzed 3 years of your IPEDS data against peer institutions and identified 6 departments where you're losing faculty at 2x the rate when salary gaps exceed $20K. Your Computer Science department fits this pattern exactly - $24K behind market, 40% turnover last 2 years. Should I send the retention risk model?

DATA REQUIREMENT

This play requires internal faculty turnover data by department over 3 years to correlate with IPEDS salary positioning.

Only PayScale has both the market data and turnover patterns to surface this retention risk insight.
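
The "2x the rate" claim comes from comparing turnover in high-gap departments against a baseline built from departments paid close to market. A minimal sketch with invented department names, gaps, and turnover rates:

```python
# Sketch: flag departments where a large salary gap coincides with
# turnover at 2x+ the rate of low-gap departments. All figures invented.
departments = {
    # name: (salary_gap_vs_market, two_year_turnover_rate)
    "Computer Science": (24_000, 0.40),
    "Engineering":      (21_000, 0.35),
    "Economics":        (22_000, 0.33),
    "History":          (6_000,  0.12),
    "Biology":          (9_000,  0.15),
    "English":          (4_000,  0.10),
}

GAP_THRESHOLD = 20_000

# Baseline: average turnover among departments paid close to market.
low_gap_rates = [t for gap, t in departments.values() if gap <= GAP_THRESHOLD]
baseline = sum(low_gap_rates) / len(low_gap_rates)

at_risk = sorted(
    name for name, (gap, turnover) in departments.items()
    if gap > GAP_THRESHOLD and turnover >= 2 * baseline
)
print(at_risk)  # the three high-gap departments
```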

PVP Public + Internal Strong (8.9/10)

Executive Compensation Adjustment Scenarios

What's the play?

Use IRS Form 990 data to identify nonprofits with executive compensation at the 90th+ percentile, then deliver 3 modeled scenarios to bring them to 60th percentile while maintaining competitiveness. Include a board presentation deck.

Why this works

Specific scenarios with dollar amounts are immediately actionable. The board deck is high value - it solves the recipient's problem before they respond. Addressing both cost reduction AND retention shows strategic thinking. The Q1 timing reference demonstrates process knowledge.

Data Sources
  1. IRS Form 990 Tax Return Data (ProPublica) - executive compensation by nonprofit
  2. Internal Nonprofit Compensation Data - percentiles by revenue band, mission type

The message:

Subject: Modeled 3 scenarios to bring you to 60th percentile

Your CEO comp at $485K is 92nd percentile - I modeled 3 adjustment scenarios to bring you to 60th percentile ($380K) while maintaining competitiveness. Includes board presentation deck with peer benchmarks, cost impact, and retention risk analysis. Should I send the scenarios for your Q1 compensation review?

DATA REQUIREMENT

This play requires aggregated executive compensation data from 1,200+ nonprofit customers, segmented by revenue band and mission category.

PayScale's proprietary nonprofit benchmark database enables scenario modeling competitors cannot match.
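
The two numbers driving this message - the current percentile rank and a 60th-percentile dollar target - fall out of a peer comp set directly. A sketch using Python's `statistics.quantiles`; the peer figures are invented, only the mechanics matter:

```python
import statistics

# Sketch: percentile rank of one executive's comp against a peer set,
# plus an interpolated 60th-percentile target. All figures invented.
peer_comp = sorted([
    255_000, 270_000, 290_000, 305_000, 315_000, 330_000,
    345_000, 360_000, 381_000, 410_000, 460_000, 510_000,
])
ceo_comp = 485_000

# Percentile rank: share of peers earning less than this CEO.
rank = 100 * sum(p < ceo_comp for p in peer_comp) / len(peer_comp)

# 60th percentile of the peer set (inclusive method interpolates
# between the observed data points).
target = statistics.quantiles(peer_comp, n=100, method="inclusive")[59]

print(f"percentile rank: {rank:.0f}")             # 92
print(f"60th-percentile target: ${target:,.0f}")  # $354,000
```

The three "scenarios" would then be different paths from `ceo_comp` down toward `target` (immediate adjustment, phased over two cycles, restructured as base plus deferred comp).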

PVP Public + Internal Strong (8.8/10)

Donor Perception Risk Score for Executive Compensation

What's the play?

Cross-reference Form 990 executive compensation data against CharityNavigator, GuideStar, and BBB Wise Giving Standards thresholds. Calculate a quantified donor perception risk score and deliver mitigation strategy.

Why this works

The risk score is quantified and specific (7.2/10). Multi-platform analysis shows thoroughness. The mitigation strategy is what recipients actually need to protect donor relationships. This addresses reputational risk, not just compliance - much more valuable.

Data Sources
  1. IRS Form 990 Tax Return Data (ProPublica) - executive compensation
  2. CharityNavigator, GuideStar, BBB Wise Giving Standards - watchdog thresholds
  3. Internal Donor Behavior Patterns - correlation between comp flags and giving trends

The message:

Subject: Donor perception risk score for your exec comp

Mapped your 5 highest-paid executives against CharityNavigator, GuideStar, and BBB Wise Giving Standards thresholds - calculated a donor perception risk score of 7.2/10. 3 of your executives trigger automatic flags on at least 2 watchdog platforms. Should I send the mitigation strategy with peer comp justifications?

DATA REQUIREMENT

This play requires ability to cross-reference multiple watchdog platforms and calculate weighted risk scores based on donor behavior patterns from PayScale's nonprofit customer base.

This multi-source synthesis is unique to PayScale's data infrastructure.
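
One way to sketch a weighted multi-platform risk score. Everything below is hypothetical - the thresholds, weights, and comp figures are invented, and real watchdog criteria are more nuanced than a single compensation cutoff:

```python
# Sketch: weighted donor-perception risk score across watchdog platforms.
# Thresholds, weights, and comp figures are all hypothetical.
executives = {
    "CEO": 485_000, "CFO": 410_000, "COO": 365_000,
    "CDO": 240_000, "CMO": 190_000,
}

# Hypothetical per-platform comp thresholds and weights (weights sum to 1).
platforms = {
    "CharityNavigator": (300_000, 0.45),
    "GuideStar":        (320_000, 0.35),
    "BBB Wise Giving":  (350_000, 0.20),
}

def risk_score(execs, plats):
    """Weighted share of (executive, platform) checks that fire, scaled 0-10."""
    score = sum(
        weight * sum(comp > threshold for comp in execs.values()) / len(execs)
        for threshold, weight in plats.values()
    )
    return round(10 * score, 1)

# Count how many platforms flag each executive.
flags = {
    name: sum(comp > t for t, _ in platforms.values())
    for name, comp in executives.items()
}
multi_flagged = [name for name, n in flags.items() if n >= 2]

print("risk score:", risk_score(executives, platforms))  # 6.0
print("flagged on 2+ platforms:", multi_flagged)         # CEO, CFO, COO
```

The weights are where the internal data earns its keep: they should reflect how strongly each platform's flag actually correlates with observed giving declines.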

PVP Public Data Strong (8.7/10)

Executive Compensation Rationale Package

What's the play?

Use Form 990 data to build a peer justification analysis for nonprofits with executive compensation above 75th percentile. Deliver board-ready talking points and percentile rankings across 28+ comparable organizations.

Why this works

Board-ready deliverable is exactly what CFOs need for compensation committee meetings. 28 comparable organizations is a substantial peer set that provides credibility. Talking points solve the recipient's immediate problem whether they buy anything or not.

Data Sources
  1. IRS Form 990 Tax Return Data (ProPublica) - executive compensation, revenue, expenses
  2. GuideStar Peer Group Data - comparable organizations by mission, geography, size

The message:

Subject: Your executive comp rationale package ready

Built a peer justification analysis for your executive team compensation using 990 data from 28 comparable nonprofits in your subsector and geography. Includes percentile rankings, market comparisons, and board-ready talking points for your CEO's $485K total comp positioning. Want me to send it for your next compensation committee meeting?

PVP Public + Internal Strong (8.6/10)

Peer Faculty Salary Benchmarking Analysis

What's the play?

Build peer analysis comparing university faculty salaries against 15 R1 universities in their athletic conference. Identify departments below 30th percentile and correlate with turnover rates where market gaps exceed $15K.

Why this works

Athletic conference as peer set makes intuitive sense to university leaders. Connecting compensation to turnover is genuinely valuable - it moves beyond abstract benchmarks to business outcomes. Clear deliverable waiting with low-commitment ask.

Data Sources
  1. IPEDS Human Resources Survey Data - faculty salaries by department and rank
  2. Internal Turnover Data - faculty departures by department correlated with salary gaps

The message:

Subject: Your 2024 IPEDS benchmarking analysis ready

Built a peer analysis comparing your faculty salaries against 15 R1 universities in your athletic conference - found 9 departments below 30th percentile. Included turnover rates by department where you're trailing market by $15K+. Want me to send the full breakdown?

DATA REQUIREMENT

This play requires internal turnover data by department to correlate with IPEDS salary gaps.

Only PayScale can combine public IPEDS data with proprietary turnover patterns to surface retention risk.

PQS Public Data Strong (8.3/10)

Multi-Department Faculty Salary Lag

What's the play?

Use IPEDS data to identify universities with 10+ departments falling below 25th percentile for faculty salaries vs peer R1 institutions. Call out specific high-value departments like Computer Science and Engineering with exact dollar gaps.

Why this works

Very specific - calling out 14 departments with two critical ones named. $22K gap is actionable and concerning for competitive hiring. Percentile context gives internal framing for budget conversations. Good routing question identifies decision-maker.

Data Sources
  1. IPEDS Human Resources Survey Data - faculty salaries by department, rank, institution

The message:

Subject: 14 of your departments below market 25th percentile

Cross-referenced your IPEDS submissions against 12 peer R1 institutions - 14 of your academic departments fall below 25th percentile for faculty salaries. Computer Science and Engineering are both trailing market by $22K+ at associate level. Who owns the competitive positioning analysis for faculty comp?
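
The department screen behind this message is a per-department percentile comparison. A sketch with invented peer salaries (in $K) - the real version would pull these fields from IPEDS submissions:

```python
import statistics

# Sketch: screen departments whose salary sits below the peer 25th
# percentile. Salaries are in $K; institutions and figures are invented.
peer_salaries = {
    # department: salaries at 12 hypothetical peer R1 institutions
    "Computer Science": [128, 131, 135, 138, 140, 142, 145, 148, 150, 152, 155, 158],
    "Engineering":      [122, 125, 128, 130, 133, 135, 138, 140, 142, 145, 148, 150],
    "History":          [82, 84, 85, 87, 88, 90, 91, 92, 94, 95, 97, 99],
}
your_salaries = {"Computer Science": 126, "Engineering": 121, "History": 93}

below_p25 = []
for dept, peers in peer_salaries.items():
    # First quartile cut point of the peer distribution.
    p25 = statistics.quantiles(peers, n=4, method="inclusive")[0]
    if your_salaries[dept] < p25:
        gap_to_median = statistics.median(peers) - your_salaries[dept]
        below_p25.append((dept, round(gap_to_median, 1)))

print(below_p25)  # Computer Science and Engineering flagged, History not
```

Reporting the gap to the peer median (rather than to the 25th percentile) is what produces the headline "$22K behind market" figure in the email.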

PQS Public Data Strong (8.1/10)

Multiple Executives Above Watchdog Thresholds

What's the play?

Use Form 990 data to identify nonprofits where CEO, CFO, and COO all exceed 80th percentile compensation for their revenue band. Flag that this combination triggers enhanced scrutiny from CharityNavigator and donor watchdog groups.

Why this works

Multiple executives flagged is more concerning than one outlier. Specific watchdog groups mentioned add credibility to the reputational risk. Philosophy documentation question is smart routing to board governance. This is probably top-of-mind for the recipient already.

Data Sources
  1. IRS Form 990 Tax Return Data (ProPublica) - officer compensation, revenue
  2. GuideStar Compensation Benchmarks - percentile rankings by revenue band

The message:

Subject: 3 of your executives above watchdog thresholds

Analyzed your 990 against GuideStar compensation benchmarks - your CEO, CFO, and COO are all above 80th percentile for organizations with your revenue. That combination triggers enhanced scrutiny from CharityNavigator and donor watchdog groups. Who manages the executive compensation philosophy documentation?

PQS Public Data Okay (7.8/10)

CEO Compensation at 92nd Percentile

What's the play?

Use Form 990 data to identify nonprofits with CEO compensation above 90th percentile for their revenue band. Reference CharityNavigator's 75th percentile threshold as a donor concern trigger.

Why this works

Specific dollar amount and percentile from real data creates credibility. CharityNavigator flag represents real reputational risk. Board committee question routes to appropriate decision-maker. May feel slightly defensive to recipient.

Data Sources
  1. IRS Form 990 Tax Return Data (ProPublica) - CEO total compensation, revenue
  2. CharityNavigator Compensation Guidelines - percentile thresholds

The message:

Subject: Your CEO compensation at 92nd percentile for revenue

Your 990 filing shows CEO total comp at $485K - that's 92nd percentile for nonprofits with $25M-$50M revenue in your subsector. CharityNavigator flags compensation above 75th percentile as potential donor concern. Is your board compensation committee aware of the percentile positioning?

PQS Public Data Okay (7.4/10)

Associate Professor Salary Lag vs Top Peers

What's the play?

Use IPEDS data to identify universities where associate professor median salaries trail top peer institutions (Stanford, Berkeley) by $15K+. Provide percentile ranking among R1 universities in region.

Why this works

Specific peer comparison with exact dollar amounts creates immediate context. Percentile ranking gives internal framing for budget conversations. Easy routing question identifies decision-maker. May feel obvious to recipient - they likely already know this.

Data Sources
  1. IPEDS Human Resources Survey Data - faculty salaries by rank and institution

The message:

Subject: Your associate professor salaries trail Stanford by $18K

IPEDS data shows your associate professor median is $104K vs Stanford's $122K and Berkeley's $118K. That $18K gap puts you at 15th percentile among R1 universities in your region. Is compensation planning already addressing this for the 2025 cycle?

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use public data to find organizations in specific compensation crisis situations. Then mirror that situation back to them with evidence.

Why this works: When you lead with "Your IPEDS submission shows 14 departments below 25th percentile - Computer Science trails by $22K" instead of "I see you're hiring faculty," you're not another sales email. You're the person who did the homework.

The messages above aren't templates. They're examples of what happens when you combine real data sources with specific situations. Your team can replicate this using the data recipes in each play.

Data Sources Reference

Every play traces back to verifiable public data or PayScale's proprietary aggregated compensation database. Here are the sources used in this playbook:

IPEDS Human Resources Survey Data
  Key fields: institution_name, faculty_salary_by_rank, employees_by_position, state
  Used for: University faculty salary benchmarking, department-level gaps, peer comparisons

IRS Form 990 (ProPublica)
  Key fields: organization_name, ein, revenue, officer_compensation, employee_count
  Used for: Nonprofit executive compensation percentile analysis, watchdog threshold checks

GuideStar Peer Group Data
  Key fields: mission_category, revenue_band, geographic_region, comparable_organizations
  Used for: Nonprofit peer identification for executive compensation justifications

CharityNavigator Guidelines
  Key fields: compensation_percentile_thresholds, donor_concern_triggers
  Used for: Nonprofit executive compensation reputational risk assessment

PayScale Internal Data
  Key fields: aggregated_compensation_by_role, offer_acceptance_rates, turnover_by_salary_gap
  Used for: Proprietary benchmarks, retention models, offer optimization analysis