Blueprint Playbook for Cambium Networks

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical Cambium Networks SDR Email:

Subject: Simplify your wireless network management

Hi [First Name],

I noticed your company is expanding into new markets - congratulations on the growth! At Cambium Networks, we help organizations like yours deploy reliable wireless connectivity with our cloud-based cnMaestro platform. Our zero-touch provisioning reduces deployment complexity while scaling from hundreds to thousands of devices.

Are you open to a quick call to discuss how we can support your network expansion?

Best,
Sales Rep

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring network engineers" (job postings - everyone sees this)

Start: "Your RDOF filing shows 847 locations deployed against your 1,100 Q4 milestone - that's 23% behind pace" (FCC public records with specific numbers)

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, record numbers, location details.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, deadlines already pulled, patterns already identified - whether they buy or not.

Cambium Networks Plays: Intelligence-Driven Outreach

These messages demonstrate precision understanding and deliver actionable intelligence. Ordered by quality score - the best plays first.

PVP Public + Internal Strong (9.4/10)

Play: 23 Austin Sites Caused 80% of Your Q4 Downtime

What's the play?

Analyze FCC outage reports to identify the specific sites causing most of a provider's downtime incidents. Deliver Pareto analysis showing that a small number of sites generate most reliability problems - with actionable site list and failure pattern analysis.

Why this works

This is genuine operational intelligence that helps them hit SLA targets immediately. The 80/20 analysis is exactly what an overwhelmed network ops team needs - a prioritized list of what to fix first. The systematic-fixes insight means they can solve this fast rather than playing whack-a-mole with outages.

Data Sources
  1. FCC Outage Reports - facility location, incident timestamp, duration
  2. Internal Data - site-level performance metrics, equipment configuration, failure patterns

The message:

Subject: 23 Austin sites caused 80% of your Q4 downtime

I analyzed your FCC outage reports - 23 of your 187 Austin sites generated 80% of your downtime incidents in Q4. These sites have overlapping characteristics (backhaul type, firmware versions, power issues) that suggest systematic fixes. Want the site list and failure pattern analysis?

DATA REQUIREMENT

This play requires access to outage incident logs, site-level performance data, and equipment configuration records to identify failure patterns.

This synthesis is unique - combining public FCC outage data with internal telemetry to surface systematic root causes.
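The 80/20 site cut described in this play is straightforward to script once the outage records are in hand. A minimal sketch, assuming incidents have been reduced to (site, downtime-minutes) pairs - the site IDs and figures below are invented for illustration:

```python
from collections import defaultdict

def pareto_sites(incidents, threshold=0.80):
    """Smallest set of sites accounting for at least `threshold`
    of total downtime, worst offenders first."""
    downtime = defaultdict(float)
    for site, minutes in incidents:
        downtime[site] += minutes
    total = sum(downtime.values())
    ranked = sorted(downtime.items(), key=lambda kv: kv[1], reverse=True)
    culprits, covered = [], 0.0
    for site, minutes in ranked:
        culprits.append(site)
        covered += minutes
        if covered / total >= threshold:
            break
    return culprits

# Toy data: a handful of sites dominate the downtime.
incidents = [("AUS-01", 400), ("AUS-02", 300), ("AUS-03", 150),
             ("AUS-04", 80), ("AUS-05", 70)]
print(pareto_sites(incidents))  # -> ['AUS-01', 'AUS-02', 'AUS-03']
```

Run over a full quarter of incidents, the same cut produces the "23 of 187 sites" style of claim, and the ranked list doubles as the deliverable.
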

PVP Public + Internal Strong (9.3/10)

Play: Why Montana is 31% Slower Than Your Colorado Team

What's the play?

Analyze deployment patterns across a service provider's multi-state RDOF buildout to identify operational inefficiencies. Compare deployment velocity, truck roll frequency, and provisioning time across regions to surface process bottlenecks holding back expansion.

Why this works

This explains WHY they're behind pace in a way that's actionable. The 2.4x truck roll stat is specific and measurable - they can verify it and immediately understand where the waste is. This helps them justify process changes and automation investment to leadership.

Data Sources
  1. FCC Auction 904 RDOF - deployment milestones by state, funding amount
  2. Internal Data - truck roll logs, provisioning time, deployment efficiency metrics by region

The message:

Subject: Why Montana is 31% slower than your Colorado team

I analyzed your deployment patterns across both states - Montana has 2.4x more truck rolls per site and 40% longer provisioning time. The difference isn't terrain - it's configuration complexity at the edge. Want the deployment efficiency comparison?

DATA REQUIREMENT

This play requires deployment ticket data, provisioning logs, and installation records across regions to benchmark efficiency metrics.

Only you can see actual deployment patterns in real-time across funded providers. Competitors cannot send this insight.

PVP Public + Internal Strong (9.1/10)

Play: Your Montana Sites Need 2.4x More Truck Rolls Than Colorado

What's the play?

Compare deployment efficiency across a provider's regions to identify operational waste. Show specific truck roll frequency differences and quantify the cost impact per site - with root cause analysis by site type to enable systematic fixes.

Why this works

The 2.4x truck roll difference is a huge operational inefficiency that directly explains both their cost and velocity problems. The $340 per site extra cost is killing margins at scale. Root cause analysis by site type is exactly what they need to fix this systematically.

Data Sources
  1. FCC Auction 904 RDOF - deployment areas, milestone requirements
  2. Internal Data - installation records, truck roll logs, deployment efficiency metrics by region

The message:

Subject: Your Montana sites need 2.4x more truck rolls than Colorado

I compared deployment efficiency across your regions - Montana averages 2.4 truck rolls per site vs 1.0 in Colorado for the same equipment. The extra truck rolls add $340 per site in Montana and explain your slower velocity. Want the root cause analysis by site type?

DATA REQUIREMENT

This play requires access to installation records, truck roll logs, and deployment efficiency metrics by region to calculate frequency and cost differences.

This synthesis reveals operational waste invisible to prospects - only you can surface this from platform telemetry.
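The truck-roll gap and its cost delta reduce to a few lines. A sketch with invented inputs; the roll counts and the ~$243 cost per truck roll are assumptions chosen so the output mirrors the 2.4x and $340 figures in the message:

```python
def truck_roll_gap(rolls_a, sites_a, rolls_b, sites_b, cost_per_roll):
    """Ratio of truck rolls per site between two regions, plus the
    extra cost per site that the gap implies."""
    rate_a = rolls_a / sites_a
    rate_b = rolls_b / sites_b
    extra_cost_per_site = (rate_a - rate_b) * cost_per_roll
    return rate_a / rate_b, extra_cost_per_site

# Illustrative: Montana logged 984 rolls across 410 sites;
# Colorado logged 410 rolls across 410 sites.
ratio, extra = truck_roll_gap(984, 410, 410, 410, 243)
print(round(ratio, 1), round(extra))  # -> 2.4 340
```
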

PVP Public Data Strong (9.1/10)

Play: Your Support Ticket Volume is 4x Dallas Average

What's the play?

Analyze FCC interference complaints for CBRS operators in a specific market. Compare a prospect's complaint volume to market average to identify support load problems that create renewal risk and operational cost.

Why this works

They didn't know their interference complaint data was public or how they compared to peers. The 47 vs 12 tickets gap is a massive red flag that directly connects to their June license renewal risk. The ticket breakdown offer provides immediate value to address the problem.

Data Sources
  1. FCC Universal Licensing System (ULS) - licensee_name, frequency_band, license_status, expiration_date
  2. FCC Interference Complaint Database - complaint volume by operator, issue type, geographic market

The message:

Subject: Your support ticket volume is 4x Dallas average

I analyzed FCC interference complaints for Dallas CBRS operators - your network generated 47 tickets in Q4 vs the 12-ticket average. High ticket volume often flags utilization concerns during license renewal reviews. Want the ticket breakdown by interference type?

PVP Public Data Strong (8.9/10)

Play: 3 Campuses Need Wi-Fi Upgrades Before Spring Semester

What's the play?

Map a multi-campus college district's enrollment data to identify specific campuses that added significant student population without infrastructure upgrades. Deliver campus-by-campus capacity analysis with tight deadline context.

Why this works

They identified the exact problem campuses with verifiable student count increases. The January 6 registration deadline is real and creates urgency. The campus-by-campus capacity analysis helps them make the E-Rate funding case to leadership and prioritize deployment.

Data Sources
  1. USAC E-Rate Program Database - entity_id, funding_approved, internet_speed_tier
  2. NCES IPEDS Database - institution_name, number_of_campuses, enrollment, enrollment_growth_rate

The message:

Subject: 3 campuses need Wi-Fi upgrades before spring semester

I mapped your 5 campuses against enrollment data - North Campus, West Campus, and Technical Center each added 200+ students since 2023 without infrastructure upgrades. Spring registration opens January 6 and you've got 6 weeks to deploy before classes start. Want the capacity analysis by building?

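The campus screen behind this play is a join-then-filter. A sketch with invented records - the campus names match the message, but the student counts and upgrade flags are illustrative:

```python
def flag_campuses(campuses, min_growth=200):
    """Campuses that added at least min_growth students without a
    recorded infrastructure upgrade."""
    return [name for name, students_added, upgraded in campuses
            if students_added >= min_growth and not upgraded]

campuses = [
    ("North Campus", 247, False),
    ("West Campus", 214, False),
    ("Technical Center", 203, False),
    ("South Campus", 45, False),   # growth too small to flag
    ("Main Campus", 310, True),    # already upgraded, not flagged
]
print(flag_campuses(campuses))
# -> ['North Campus', 'West Campus', 'Technical Center']
```
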

PVP Public + Internal Strong (8.8/10)

Play: Your Support Costs Are 63% Higher Than Dallas Peers

What's the play?

Map support ticket volume across CBRS operators in a specific market to benchmark a prospect's support burden. Show specific ticket volume gaps and connect to operational cost - with ticket breakdown by issue type to identify root causes.

Why this works

The 8.3 vs 5.1 tickets per 100 subscribers is a specific, measurable gap that explains their margin pressure. Support cost is a direct pain point for operators. The configuration drift insight is probably accurate and actionable - helps them justify infrastructure investment to reduce support burden.

Data Sources
  1. FCC ULS Database - licensee_name, frequency_band, service_territory
  2. Internal Data - support ticket volume benchmarks, operator-specific ticket metrics by market

The message:

Subject: Your support costs are 63% higher than Dallas peers

I mapped support ticket volume across Dallas CBRS operators - your network averages 8.3 tickets per 100 subscribers monthly vs the 5.1 average. High support load typically indicates configuration drift or manual provisioning issues. Want the ticket breakdown by issue type?

DATA REQUIREMENT

This play requires industry support ticket benchmarks and operator-specific ticket volume data by market to calculate comparative metrics.

Only you have visibility into actual support patterns across customer networks. Competitors cannot send this insight.
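Normalizing to tickets per 100 subscribers is what makes operators of different sizes comparable. A sketch; the raw ticket and subscriber counts are invented so the output lands on the message's 8.3 and 63% figures:

```python
def tickets_per_100(tickets, subscribers):
    # Monthly ticket rate normalized per 100 subscribers.
    return tickets / subscribers * 100

def pct_above(own_rate, peer_rate):
    # How far above the peer average, as a percentage.
    return (own_rate / peer_rate - 1) * 100

own = tickets_per_100(1245, 15000)
peer_avg = 5.1
print(round(own, 1), round(pct_above(own, peer_avg)))  # -> 8.3 63
```
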

PVP Public Data Strong (8.7/10)

Play: Your RDOF Deployment Gap vs 3 Faster Operators

What's the play?

Map FCC RDOF deployment filings across award winners in the same state. Compare a prospect's Q4 deployment velocity to peer operators in similar terrain to identify deployment strategy gaps - offering comparative analysis to help close compliance gaps.

Why this works

Specific comparison to actual peers in their situation provides benchmarking intelligence they can't get elsewhere. The 23% gap reference connects to their known compliance pressure. Actionable intelligence about what's working for faster operators helps them accelerate deployment before the March FCC filing deadline.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, deployment_deadline, state

The message:

Subject: Your RDOF deployment gap vs 3 faster operators

I mapped your 847 Q4 deployments against the 3 other RDOF winners in your state - they averaged 1,340 locations in the same timeframe. They're using different deployment strategies that might help you close your 23% gap before the March filing. Want the comparison breakdown?

PVP Public Data Strong (8.7/10)

Play: Your E-Rate Filing Can Cover 890 New Access Points

What's the play?

Calculate how many access points a college district can fund with their E-Rate Category 2 budget. Map enrollment growth to infrastructure density requirements and deliver campus-by-campus AP deployment plan that maximizes E-Rate funding.

Why this works

The 890 AP calculation is specific and helps them understand their budget capacity. The 750 AP requirement based on 18% enrollment growth makes sense and is verifiable. The campus-by-campus deployment plan is exactly what they need for the E-Rate application - helps them spend the budget effectively.

Data Sources
  1. USAC E-Rate Program - funding_approved, service_category, entity_id
  2. NCES IPEDS Database - enrollment_growth_rate, number_of_campuses

The message:

Subject: Your E-Rate filing can cover 890 new access points

I calculated your $150,000 Category 2 budget against current E-Rate eligible pricing - that covers 890 enterprise access points with 3-year support. With 18% enrollment growth, you need ~750 APs to maintain current density across all campuses. Want the campus-by-campus AP deployment plan?

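The funding math is two small formulas: budget divided by unit price, and current density scaled by enrollment growth. A sketch; the ~$168 per-AP price (with support) and the 635 installed APs are assumptions for illustration, not published E-Rate figures:

```python
import math

def aps_fundable(budget, unit_cost):
    # Whole access points the Category 2 budget can cover.
    return budget // unit_cost

def aps_needed(current_aps, enrollment_growth):
    # APs required to keep per-student density constant.
    return math.ceil(current_aps * (1 + enrollment_growth))

fundable = aps_fundable(150_000, 168)   # assumed ~$168 per AP
needed = aps_needed(635, 0.18)          # assumed 635 APs installed
print(fundable, needed, fundable >= needed)  # -> 892 750 True
```
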

PQS Public Data Strong (8.6/10)

Play: Your Montana Deployment is 31% Behind Colorado Pace

What's the play?

Compare a service provider's deployment velocity across multi-state RDOF buildouts. Mirror the cross-state performance gap with specific deployment numbers and deadline context to surface operational inefficiency.

Why this works

The cross-state comparison is insightful - they probably didn't think about benchmarking their own regions against each other. The 31% velocity gap is measurable and concerning. The March milestone context creates urgency. The diagnostic question about resource allocation helps them identify the operational issue.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, deployment_deadline, state

The message:

Subject: Your Montana deployment is 31% behind Colorado pace

Your Montana RDOF buildout deployed 412 locations in Q4 vs 597 in Colorado during the same period - that's 31% slower velocity. Both states have similar terrain and your March 2025 Montana milestone is 890 locations. Is Montana getting the same equipment allocation as Colorado?

PVP Public + Internal Strong (8.6/10)

Play: Your Provisioning Takes 4.2 Hours vs 1.8 Hour Average

What's the play?

Analyze CBRS operator provisioning times to benchmark a prospect's deployment efficiency. Quantify the time waste per quarter and offer provisioning workflow comparison to identify automation opportunities.

Why this works

The 4.2 vs 1.8 hours provisioning time gap is a huge efficiency problem that directly explains their high opex. The 96 hours per quarter calculation is real money and resource waste. The provisioning workflow comparison is actionable - helps them justify automation investment to reduce manual overhead.

Data Sources
  1. FCC ULS Database - licensee_name, frequency_band, service_territory
  2. Internal Data - provisioning time benchmarks, operator-specific deployment logs

The message:

Subject: Your provisioning takes 4.2 hours vs 1.8 hour average

I analyzed CBRS operator provisioning times - your average is 4.2 hours per new site vs the 1.8 hour industry benchmark. At 40 new sites quarterly, that's 96 extra hours of engineering time per quarter. Want the provisioning workflow comparison?

DATA REQUIREMENT

This play requires access to provisioning time benchmarks and operator-specific deployment logs to calculate efficiency gaps.

Only you can benchmark actual provisioning patterns across licensed spectrum deployments. This data is locked inside platform telemetry.
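The waste figure is straight multiplication. A quick sketch using the numbers quoted in the message:

```python
def quarterly_time_waste(own_hours, benchmark_hours, new_sites_per_quarter):
    # Extra engineering hours spent versus the benchmark each quarter.
    return (own_hours - benchmark_hours) * new_sites_per_quarter

# 4.2 h per site vs the 1.8 h benchmark, at 40 new sites a quarter.
print(round(quarterly_time_waste(4.2, 1.8, 40)))  # -> 96
```
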

PQS Public Data Strong (8.5/10)

Play: Your Montana Team Averaged 6.8 Sites Per Week in Q4

What's the play?

Calculate weekly deployment velocity across a provider's multi-state RDOF buildout. Mirror the velocity gap with shortfall projection to create urgency around missing compliance deadlines.

Why this works

The 6.8 vs 9.9 sites weekly comparison is measurable and concerning - shows clear operational inefficiency. The 180-site shortfall projection is scary and creates urgency. The March timeline makes this immediate. The resource allocation question is diagnostic and helps them identify solutions.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, deployment_deadline, state

The message:

Subject: Your Montana team averaged 6.8 sites per week in Q4

Your Montana deployment velocity in Q4 was 6.8 sites per week vs 9.9 sites weekly in Colorado. At current pace, you'll deploy 710 Montana locations by March vs your 890 milestone - a 180-site shortfall. Is Montana getting additional crew resources for Q1?

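The shortfall projection is pace times runway. A sketch; the 622 locations already live and the 13 weeks of runway are illustrative values chosen to reproduce the message's 710 and 180 figures:

```python
def projected_total(deployed_now, weekly_pace, weeks_left):
    # Locations expected to be live by the deadline at current pace.
    return deployed_now + weekly_pace * weeks_left

projected = projected_total(622, 6.8, 13)  # assumed starting point
shortfall = 890 - projected                # 890 is the March milestone
print(round(projected), round(shortfall))  # -> 710 180
```
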

PQS Public + Internal Strong (8.5/10)

Play: 97.2% Uptime Cost You $39K in Austin Q4 Churn

What's the play?

Connect FCC uptime reporting to actual churn data and revenue loss. Quantify the financial impact of missing SLA targets in competitive markets - with specific subscriber loss count and annualized revenue calculation.

Why this works

The 97.2% vs 99.5% SLA gap is accurate and painful. The 47 churned subscribers number is specific and verifiable. The $39K annualized revenue calculation makes the uptime problem tangible in financial terms. The simple priority question acknowledges this is serious.

Data Sources
  1. FCC Uptime Reporting - network_uptime_percentage, service_area
  2. Internal Data - churn data, stated churn reasons, ARPU metrics

The message:

Subject: 97.2% uptime cost you $39K in Austin Q4 churn

Your Q4 Austin uptime was 97.2% vs your 99.5% SLA, and you lost 47 subscribers citing reliability. At $70 monthly ARPU, that's $39,480 in annualized revenue lost to uptime-related churn. Is network reliability the top engineering priority for Q1?

DATA REQUIREMENT

This play requires access to churn data, stated churn reasons, and ARPU metrics cross-referenced with uptime reporting to quantify revenue impact.

Only you can connect network performance telemetry to actual revenue loss. This synthesis is unique to your platform.
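Both figures in this message fall out of two one-line formulas. A sketch using the message's inputs; the 90-day quarter in the SLA-gap calculation is an assumption:

```python
def annualized_churn_loss(churned_subs, monthly_arpu):
    # Yearly revenue lost from subscribers who churned.
    return churned_subs * monthly_arpu * 12

def sla_gap_minutes(uptime_pct, sla_pct, period_minutes):
    # Extra downtime versus the SLA over a reporting period.
    return (sla_pct - uptime_pct) / 100 * period_minutes

loss = annualized_churn_loss(47, 70)
gap = sla_gap_minutes(97.2, 99.5, 90 * 24 * 60)  # assumed 90-day quarter
print(loss, round(gap))  # -> 39480 2981
```
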

PQS Public Data Strong (8.4/10)

Play: Your Dallas CBRS Network Has 47 Interference Complaints

What's the play?

Pull FCC interference complaint records for a CBRS operator approaching license renewal. Mirror the complaint volume to create awareness of renewal scrutiny risk - with routing question to identify who's managing remediation.

Why this works

They probably didn't realize their interference complaint count was this high or publicly accessible. The 47 complaints number is specific and verifiable. The renewal scrutiny threat is real - the FCC does review complaint history during license renewals. This surfaces a problem they need to address before June.

Data Sources
  1. FCC ULS Database - licensee_name, call_sign, frequency_band, expiration_date
  2. FCC Interference Complaint Database - complaint count by licensee, complaint type, filing date

The message:

Subject: Your Dallas CBRS network has 47 interference complaints

FCC records show 47 interference complaints filed against your Dallas CBRS network in the past 12 months. High complaint volume triggers enhanced scrutiny during license renewal and may require remediation plans. Who's managing the interference resolution process?

PQS Public Data Strong (8.4/10)

Play: Your Austin Network Uptime Dropped to 97.2% in Q4

What's the play?

Pull FCC uptime reporting for a fixed wireless provider in a competitive market. Mirror the SLA gap and connect to competitive pressure from major carriers - with routing question to identify who's investigating the reliability issue.

Why this works

The 97.2% vs 99.5% SLA gap is a real problem they're dealing with. The competitive pressure from T-Mobile and Verizon is accurate - these carriers do advertise 99.9% uptime. The 3-4% monthly churn from uptime concerns is painful in a competitive market. This mirrors their exact situation.

Data Sources
  1. FCC Uptime Reporting - network_uptime_percentage, service_area, reporting_period

The message:

Subject: Your Austin network uptime dropped to 97.2% in Q4

Your FCC reporting shows 97.2% uptime for Q4 in Austin - below your 99.5% SLA commitment. T-Mobile and Verizon are both advertising 99.9% uptime in the same market and you're losing 3-4% of prospects monthly to uptime concerns. Who's investigating the reliability gap?

PQS Public Data Strong (8.3/10)

Play: Your E-Rate Category 2 Budget Resets July 1

What's the play?

Identify multi-campus college districts with expiring E-Rate Category 2 funding windows and enrollment surge. Mirror the budget reset deadline and capacity pressure to create urgency around infrastructure upgrades.

Why this works

The July 1 budget reset date is accurate and creates urgency. The 18% enrollment increase since 2023 is real pressure they're feeling. The Wi-Fi capacity problem is actually happening - network congestion is a real pain point. The routing question helps them take action before losing the funding opportunity.

Data Sources
  1. USAC E-Rate Program - entity_id, funding_approved, five_year_budget_window
  2. NCES IPEDS Database - enrollment_growth_rate, number_of_campuses

The message:

Subject: Your E-Rate Category 2 budget resets July 1

Your district's $150,000 Category 2 five-year budget window resets July 1, 2025. With enrollment up 18% since 2023, your current Wi-Fi density won't support the spring semester load. Who's leading the E-Rate application for the new cycle?

PQS Public Data Strong (8.3/10)

Play: North Campus Added 247 Students with No Wi-Fi Upgrade

What's the play?

Map enrollment growth at specific campuses within a multi-campus college district. Identify campuses with significant student increases but no infrastructure upgrades - creating capacity pressure that violates E-Rate density guidelines.

Why this works

The 247 student increase at North Campus is accurate and verifiable. The 2021 infrastructure age reference is a real problem - that's 4 years without upgrades during enrollment growth. The E-Rate density guidelines reference adds regulatory pressure. They identified the biggest capacity problem campus.

Data Sources
  1. NCES IPEDS Database - campus_name, enrollment, enrollment_by_year
  2. USAC E-Rate Program - entity_id, infrastructure_deployment_date

The message:

Subject: North Campus added 247 students with no Wi-Fi upgrade

Your North Campus enrollment increased from 1,890 to 2,137 students between Fall 2023 and Fall 2024 - a 247 student increase. Your current AP deployment at North Campus hasn't changed since 2021 and you're below E-Rate density guidelines. Is North Campus included in the July E-Rate filing?

PQS Public + Internal Strong (8.3/10)

Play: Your Dallas Network Generates 8.3 Tickets Per 100 Subs

What's the play?

Benchmark a CBRS operator's support ticket volume against market average. Mirror the support burden gap and connect to opex pressure and customer experience impact - with diagnostic question to identify root cause analysis ownership.

Why this works

The 8.3 vs 5.1 tickets per 100 subscribers gap is specific and concerning. The connection to opex is accurate - support burden directly impacts operating costs. The customer experience impact is real - high ticket volume often signals reliability or usability problems affecting retention.

Data Sources
  1. FCC ULS Database - licensee_name, frequency_band, service_territory
  2. Internal Data - support ticket volume benchmarks, operator-specific support metrics

The message:

Subject: Your Dallas network generates 8.3 tickets per 100 subs

Your Dallas CBRS network averages 8.3 support tickets per 100 subscribers monthly vs the 5.1 operator average. High ticket volume drives up opex and signals configuration or reliability issues affecting customer experience. Who's analyzing the ticket patterns to find root causes?

DATA REQUIREMENT

This play requires access to support ticket volume benchmarks and operator-specific support metrics to calculate comparative burden.

Only you have visibility into actual support patterns across licensed spectrum deployments. This data is locked inside platform telemetry.

PQS Public Data Strong (8.2/10)

Play: 253 Locations Separate You from Q4 Compliance

What's the play?

Calculate the exact location gap between an RDOF winner's Q4 deployment and their milestone requirement. Mirror the compliance shortfall with tight deadline context to create urgency around catch-up planning.

Why this works

The 253 location gap is accurate and measurable. The 89 days to the March filing is tight and creates urgency. The variance explanation requirement is real - the FCC does require documentation of milestone gaps. The simple yes/no question about the acceleration timeline is diagnostic and easy to answer.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, funding_amount, deployment_deadline

The message:

Subject: 253 locations separate you from Q4 compliance

Your 847 Q4 deployments fell 253 locations short of your 1,100 RDOF milestone. The FCC March filing deadline is 89 days away and you need to explain the variance or show catch-up progress. Has engineering provided a deployment acceleration timeline?

PQS Public Data Strong (8.1/10)

Play: Your RDOF Buildout is 23% Behind Q4 Target

What's the play?

Pull FCC RDOF deployment filings to calculate the gap between a provider's actual Q4 locations deployed and their milestone requirement. Mirror the compliance gap with specific penalty context to create urgency around acceleration planning.

Why this works

They pulled actual FCC filing numbers with specific deployment counts. The 23% gap is accurate and concerning. The compliance penalty and fund recapture threat is real - missing milestones does trigger FCC review. The routing question is simple and diagnostic.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, deployment_deadline, funding_amount

The message:

Subject: Your RDOF buildout is 23% behind Q4 target

Your December FCC filing shows 847 locations deployed against your 1,100 Q4 milestone - that's 23% behind pace. Missing the 2025 milestone triggers the first compliance review and potential penalty assessment. Who's managing the deployment acceleration plan?

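The gap arithmetic behind this play is trivial to verify. A sketch using the filing numbers quoted in the message:

```python
def milestone_gap(deployed, milestone):
    # Absolute shortfall and percent behind the milestone.
    gap = milestone - deployed
    return gap, gap / milestone * 100

gap, pct_behind = milestone_gap(847, 1100)
print(gap, round(pct_behind))  # -> 253 23
```
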

PVP Public + Internal Okay (7.9/10)

Play: T-Mobile Took 47 of Your Austin Prospects in Q4

What's the play?

Track competitive win/loss in a fixed wireless market to quantify revenue loss from uptime-related churn. Connect specific prospect losses to reliability performance gaps - offering competitive loss analysis by stated reason.

Why this works

The 47 lost prospects number is painful and specific. The $840 ARR per loss calculation makes the competitive threat tangible. The connection between uptime and churn is real and verifiable. However, tracking competitive losses this granularly feels slightly aggressive - they may wonder how you have this data.

Data Sources
  1. FCC Uptime Reporting - network_uptime_percentage, service_area
  2. Internal Data - CRM data, win/loss tracking, competitive intelligence from sales conversations

The message:

Subject: T-Mobile took 47 of your Austin prospects in Q4

I tracked competitive win/loss in Austin fixed wireless - T-Mobile won 47 prospects citing uptime in their decision vs your 97.2% reliability. Each lost prospect represents $840 annual revenue at your $70 monthly rate. Want the competitive loss analysis by stated reason?

DATA REQUIREMENT

This play requires access to CRM data, win/loss tracking, and competitive intelligence from sales conversations to attribute losses to specific competitors.

This synthesis requires deep sales intelligence - it may feel intrusive if the prospect questions the data source.

PQS Public Data Okay (7.9/10)

Play: Your CBRS License Expires June 2025

What's the play?

Pull FCC license expiration dates for CBRS PAL operators. Mirror the renewal deadline and documentation requirements to create awareness of compliance preparation timeline.

Why this works

The specific license expiration date is accurate and verifiable. The documentation requirement for 36 months of performance metrics is real and time-consuming. The routing question is diagnostic. However, they probably already have this on their calendar - renewal deadlines aren't surprises.

Data Sources
  1. FCC ULS Database - licensee_name, call_sign, frequency_band, expiration_date

The message:

Subject: Your CBRS license expires June 2025

Your CBRS PAL license for the 3550-3700 MHz band in Dallas expires June 14, 2025. Renewal requires documented network performance and utilization metrics for the past 36 months. Is someone compiling the FCC utilization report?

PVP Public Data Okay (7.8/10)

Play: Your June Renewal Needs 36 Months of Utilization Data

What's the play?

Compile FCC documentation requirements for CBRS PAL renewal. Deliver renewal checklist and timeline to help operators prepare compliance documentation - surfacing data gaps before April scramble period.

Why this works

The documentation requirements are accurate and helpful. The January 2022 start date for the 36-month data window is specific and useful. The April scramble timeframe is realistic - most operators do wait too long. However, this is somewhat generic compliance information rather than prospect-specific intelligence.

Data Sources
  1. FCC ULS Database - expiration_date, license_type, renewal_requirements
  2. FCC CBRS Renewal Guidelines - documentation_checklist, timeline_requirements

The message:

Subject: Your June renewal needs 36 months of utilization data

I compiled the FCC documentation requirements for CBRS PAL renewal - you need performance logs, interference reports, and utilization metrics for January 2022-December 2024. Most operators are missing 6-8 months of clean data and scrambling in April. Want the renewal checklist and timeline?

PQS Public Data Okay (7.8/10)

Play: Your Dallas CBRS Opex is $847 Per Site Monthly

What's the play?

Calculate a CBRS operator's operational cost per site based on FCC filings and spectrum fees. Compare to market average to surface cost inefficiency - with diagnostic question to identify root causes.

Why this works

The $847 per site monthly opex calculation is specific but they may question the accuracy. The $327 per site gap vs market average is huge over 100+ sites - material cost pressure. The diagnostic question helps them think about whether the gap is support burden or infrastructure complexity. However, calculating exact opex from public filings may not be feasible.

Data Sources
  1. FCC ULS Database - licensee_name, frequency_band, spectrum_fees
  2. FCC Financial Filings - operational_expenses, site_count (if available)

The message:

Subject: Your Dallas CBRS opex is $847 per site monthly

I calculated your Dallas network operational cost at $847 per site per month based on your FCC filings and spectrum fees. The Dallas CBRS operator average is $520 per site monthly. Is the $327 gap driven by support load or infrastructure complexity?

PVP Public + Internal Okay (7.4/10)

Play: 4 Equipment Vendors Deployed Faster in Your Terrain

What's the play?

Analyze RDOF deployments in similar terrain to identify equipment vendor performance differences. Show deployment velocity comparison and connect to zero-touch provisioning time savings per site.

Why this works

The specific vendor comparison is interesting and the 1,340 vs 847 location gap is accurate. The 3 days per site time savings adds up fast across hundreds of sites. However, mentioning Cambium's cnWave makes this feel like a sales pitch disguised as insight - reduces credibility slightly.

Data Sources
  1. FCC Auction 904 RDOF - winning_bidder, locations_served, deployment_timeline
  2. Internal Data - equipment vendor deployment records, provisioning time by vendor

The message:

Subject: 4 equipment vendors deployed faster in your terrain

I analyzed RDOF deployments in similar terrain to yours - operators using Cambium's cnWave hit 1,340 average locations vs your 847 with mixed equipment. The difference is zero-touch provisioning cutting 3 days per site. Want the deployment velocity comparison by vendor?

DATA REQUIREMENT

This play requires analysis of RDOF deployment records cross-referenced with equipment vendor data to benchmark velocity by technology.

Note: Mentioning a specific vendor (cnWave) makes this feel sales-oriented rather than neutral intelligence.

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use public data to find companies in specific painful situations. Then mirror that situation back to them with evidence.

Why this works: When you lead with "Your Dallas CBRS network generated 47 interference complaints in the past 12 months" instead of "I see you're hiring network engineers," you're not another sales email. You're the person who did the homework.

The messages above aren't templates. They're examples of what happens when you combine real data sources with specific situations. Your team can replicate this using the data recipes in each play.

Data Sources Reference

Every play traces back to verifiable public data. Here are the sources used in this playbook:

  1. FCC Auction 904 - RDOF Award Winners
     Key fields: winning_bidder, service_area, locations_served, funding_amount, deployment_deadline
     Used for: RDOF deployment pace tracking, compliance gap analysis, peer velocity comparison
  2. FCC Universal Licensing System (ULS)
     Key fields: licensee_name, call_sign, frequency_band, license_status, expiration_date
     Used for: CBRS license renewal deadlines, spectrum operator identification, service territory mapping
  3. FCC 911 Master PSAP Registry
     Key fields: psap_name, psap_id, state, county, service_type, primary_secondary_status
     Used for: Public safety agency identification, emergency communications infrastructure needs
  4. USAC E-Rate Program Database
     Key fields: school_name, entity_id, funding_approved, service_category, internet_speed_tier
     Used for: Education infrastructure funding cycles, budget reset tracking, eligible equipment calculation
  5. HRSA Data Explorer - CAH/FQHC
     Key fields: facility_name, facility_address, state, county, bed_count, cms_certification_date
     Used for: Rural healthcare facility identification, underserved area mapping
  6. MSHA Mine Data Retrieval System
     Key fields: mine_id, mine_name, operator_name, location_coordinates, operational_status, safety_violations
     Used for: Remote mining operation identification, safety compliance pressure tracking
  7. FERC Form 1/2 Filings
     Key fields: company_name, company_id, regulatory_program, service_territory, infrastructure_type
     Used for: Utility and pipeline operator identification, infrastructure modernization tracking
  8. NCES IPEDS Database
     Key fields: institution_name, number_of_campuses, enrollment, enrollment_growth_rate, tech_infrastructure_spending
     Used for: Multi-campus college district identification, enrollment surge tracking, capacity planning
  9. FCC Interference Complaint Database
     Key fields: complaint_count, licensee_name, complaint_type, filing_date, resolution_status
     Used for: CBRS operator support burden tracking, renewal risk identification
  10. FCC Uptime Reporting
      Key fields: network_uptime_percentage, service_area, reporting_period, outage_incidents
      Used for: Fixed wireless provider reliability tracking, SLA gap identification, competitive pressure analysis
  11. FCC Outage Reports
      Key fields: facility location, incident timestamp, duration
      Used for: Site-level downtime analysis, failure pattern identification