Blueprint Playbook for Intergraph (Hexagon)

Who the Hell is Jordan Crawford?

Founder of Blueprint. I help companies stop sending emails nobody wants to read.

The problem with outbound isn't the message. It's the list. When you know WHO to target and WHY they need you right now, the message writes itself.

I built this system using government databases, public records, and 25 million job posts to find pain signals most companies miss. Predictable Revenue is dead. Data-driven intelligence is what works now.

The Old Way (What Everyone Does)

Your GTM team is buying lists from ZoomInfo, adding "personalization" like mentioning a LinkedIn post, then blasting generic messages about features. Here's what it actually looks like:

The Typical Intergraph (Hexagon) SDR Email:

Subject: Transform Your Infrastructure Management

Hi Sarah,

I noticed your utility company is managing critical infrastructure assets across multiple sites. Are you facing challenges with asset visibility and real-time operational monitoring? Intergraph's geospatial platform helps utilities like yours integrate data management systems and improve network design efficiency. We've worked with leading utilities to reduce planning cycle time and improve emergency response.

Would love to show you a demo. Do you have 15 minutes next week?

Best,
Alex

Why this fails: The prospect is an expert. They've seen this template 1,000 times. There's zero indication you understand their specific situation. Delete.

The New Way: Intelligence-Driven GTM

Blueprint flips the approach. Instead of interrupting prospects with pitches, you deliver insights so valuable they'd pay consulting fees to receive them.

1. Hard Data Over Soft Signals

Stop: "I see you're hiring compliance people" (job postings - everyone sees this)

Start: "Your facility at 1234 Industrial Pkwy received EPA violation #2024-XYZ on March 15th" (government database with record number)

2. Mirror Situations, Don't Pitch Solutions

PQS (Pain-Qualified Segment): Reflect their exact situation with such specificity they think "how did you know?" Use government data with dates, record numbers, facility addresses.

PVP (Permissionless Value Proposition): Deliver immediate value they can use today - analysis already done, deadlines already pulled, patterns already identified - whether they buy or not.

Company Overview: Intergraph (Hexagon)

Company URL: https://intergraph.com

Core Problem They Solve: Enterprise organizations lack integrated geospatial and infrastructure data management systems to plan, design, and operate critical infrastructure assets (utilities, transportation, government) efficiently and safely.

Product Type: Enterprise Software | Geospatial Intelligence Platform

Target ICP: Large enterprises (500+ employees) managing extensive asset networks including electric utilities, water utilities, gas utilities, telecommunications carriers, oil and gas operators, government agencies, and transportation infrastructure organizations. These companies require real-time operational visibility, regulatory compliance, emergency response capabilities, and capital asset planning across distributed geographic areas.

Primary Buyer Persona: Director of GIS Operations / Infrastructure Operations Manager / VP of Engineering Operations responsible for network design, asset lifecycle management, real-time operational visibility, regulatory compliance, and cross-departmental collaboration for capital projects.

Intergraph (Hexagon) PVP Plays: Delivering Immediate Value

These messages provide actionable intelligence before asking for anything. The prospect can use this value today whether they respond or not.

PVP Public + Internal Strong (9.3/10)

Your Storm Response 40% Slower Than Metro Area

What's the play?

Benchmark major outage response performance across utilities in the same metro region using actual storm event data. Show prospects exactly how their restoration time compares to nearby peers during recent verifiable weather events.

Why this works

Storm response is the #1 visible KPI for utility operations managers. Every minute of delay is public, embarrassing, and measured. When you tell them "you're 40% slower than the metro average during the November storm," you're hitting their most painful public performance metric with recent, verifiable data. The crew dispatch analysis offers immediate tactical value to improve their most scrutinized operational outcome.

Data Sources
  1. Internal Customer Storm Response Data - restoration times, crew dispatch patterns, asset visibility capabilities
  2. EIA Form 861 - service territory identification, customer counts by metro region
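
If your team wires this play into a data pipeline, the core stat is one calculation: how far a prospect's restoration time sits above the metro median. A minimal Python sketch, using made-up utility names and hours (real inputs would come from the aggregated internal storm response data above):

```python
from statistics import median

# Hypothetical aggregated storm-response data (hours to full restoration)
# for utilities in one metro region. Names and numbers are illustrative.
restoration_hours = {
    "Utility A": 18.0, "Utility B": 22.5, "Utility C": 14.0,
    "Utility D": 30.1, "Utility E": 16.4, "Utility F": 25.2,
}

def pct_above_median(metro_hours: dict, prospect: str) -> float:
    """How far a prospect's restoration time sits above the metro median, as a percent."""
    med = median(metro_hours.values())
    return (metro_hours[prospect] - med) / med * 100

gap = pct_above_median(restoration_hours, "Utility D")
print(f"Utility D restoration time is {gap:.0f}% above the metro median")
```

The same percent-above-median pattern drives the 22-minute dispatch-gap and 9-week planning-cycle plays below; only the metric changes.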

The message:

Subject: Your storm response 40% slower than metro area

Benchmarked major outage response across 22 utilities in your metro region - your average restoration time is 40% above area median. The 8 utilities with real-time asset visibility restored power 4.2 hours faster during the November storm. Want the storm response analysis showing crew dispatch patterns?

DATA REQUIREMENT

This play requires aggregated storm response metrics (restoration times, crew dispatch patterns) across 22+ utilities in the same metro region during a recent major outage event.

This synthesis of actual performance data from comparable utilities is proprietary - only you have visibility into how peer utilities performed during the same weather event.

PVP Public + Internal Strong (9.1/10)

Your Outage Response 22 Minutes Slower Than Peers

What's the play?

Use internal benchmarking data from regional utility customers to show prospects exactly how their outage-to-crew-dispatch time compares to peer utilities. Quantify the specific time gap and offer detailed process analysis to identify where delays occur.

Why this works

Outage response time is the single most visible KPI for infrastructure operations managers. When you tell them "you're 22 minutes slower than regional peers" with specific numbers (53 minutes vs. 31 minutes), you're exposing a performance gap they feel daily but may not have quantified. The offer of a benchmark showing dispatch process gaps provides immediate diagnostic value to improve their #1 operational metric.

Data Sources
  1. Internal Customer Performance Data - emergency response times, dispatch metrics across regional utilities
  2. EIA Form 861 - regional utility identification, service territory data

The message:

Subject: Your outage response 22 minutes slower than peers

Compared emergency response times across 38 regional utilities - your average outage-to-crew-dispatch is 22 minutes above median. The 12 utilities with integrated asset visibility average 31-minute dispatch times vs. your 53 minutes. Want the anonymized benchmark showing dispatch process gaps?

DATA REQUIREMENT

This play requires aggregated outage response metrics (outage-to-crew-dispatch times, process steps) across 38+ regional utility customers with performance measurement capabilities.

This benchmarking data is proprietary to your customer base - competitors cannot provide regional peer comparisons without similar implementation scale.

PVP Public + Internal Strong (8.8/10)

5 Utilities in Your State Finished Last Quarter

What's the play?

Use recent customer implementation data from the same state to show prospects realistic deployment timelines and ROI achievement. Provide concrete evidence that peer utilities completed full asset digitization and hit positive ROI within specific timeframes.

Why this works

State-level specificity makes this immediately relevant - these are regulatory peers facing the same compliance environment. The recent timeline (Q4 2024) proves this isn't outdated case study material. The specific ROI timing (month 8) and tangible improvements (38% faster response, reduced truck rolls) help the prospect build an internal business case using data from utilities they likely know by name. The phasing approach offers practical implementation guidance they can use today.

Data Sources
  1. Internal Customer Deployment Data - implementation timelines, ROI measurements, operational improvements from 5+ state utilities
  2. EIA Form 861 - state utility identification, service territory verification

The message:

Subject: 5 utilities in your state finished last quarter

Five utilities in your state completed full asset digitization in Q4 2024 - average deployment was 11 months from kickoff to completion. Their combined ROI hit positive at month 8 through reduced truck rolls and faster fault isolation. Want the deployment timeline comparison showing their phasing approach?

DATA REQUIREMENT

This play requires tracked deployment timelines, ROI metrics, and operational improvement measurements from 5+ utility customers in the same state.

This state-specific implementation data with measured outcomes is proprietary to your customer base - competitors cannot provide this regional peer evidence.

PVP Public + Internal Strong (8.7/10)

Your Capital Planning Cycle 9 Weeks Longer

What's the play?

Benchmark capital project approval timelines across utility customers to identify prospects with abnormally long planning-to-approval cycles. Show exactly how much time they're losing compared to peer utilities and quantify the financial impact of delayed projects.

Why this works

Capital project delivery timeline is a critical KPI for infrastructure directors. A 9-week delay in planning-to-approval directly impacts project delivery, budget utilization, and executive visibility. The $12M in accelerated projects quantifies the financial upside of process improvement. The offer of a process comparison showing "where your 9 weeks are lost" provides immediate diagnostic value to identify bureaucratic bottlenecks.

Data Sources
  1. Internal Customer Project Data - capital project approval timelines, process steps, completion rates across 29+ utilities
  2. FERC Form 1 - plant in service valuations, capital investment patterns

The message:

Subject: Your capital planning cycle 9 weeks longer

Benchmarked capital project timelines across 29 utilities - your planning-to-approval cycle averages 17 weeks vs. 8-week median. Utilities with integrated asset models reduced cycle time by 58% and accelerated $12M in deferred projects. Want the process comparison showing where your 9 weeks are lost?

DATA REQUIREMENT

This play requires aggregated capital project approval timelines and process analysis across 29+ utility customers with project tracking capabilities.

This benchmarking data showing actual planning cycle bottlenecks is proprietary to your implementation base - competitors cannot provide this without similar project visibility.

PVP Public + Internal Strong (8.6/10)

Your Engineering Team Routing 340 Data Requests Monthly

What's the play?

Analyze work order and data request patterns across utility customers to identify prospects with abnormally high internal data request volumes. Show exactly how much engineering overhead they're burning on manual data fulfillment compared to utilities with integrated platforms.

Why this works

Engineering directors feel the pain of constant data requests from field crews but may not have quantified the volume. When you tell them "340+ internal data requests monthly," you're surfacing a metric they haven't measured but immediately recognize as a massive time sink. The 78% reduction and $180K annual savings tie directly to their budget concerns. The workflow analysis showing where requests originate provides immediate tactical value to justify platform investment or headcount decisions.

Data Sources
  1. Internal Customer Work Order Data - data request volume, request patterns, engineering overhead analysis across 41+ utilities
  2. EIA Form 861 - utility size, operational complexity indicators

The message:

Subject: Your engineering team routing 340 data requests monthly

Benchmarked work order systems across 41 utilities - yours processes 340+ internal data requests monthly from field crews. Utilities with integrated asset platforms reduced those requests by 78% and cut engineering overhead by $180K annually. Want the workflow analysis showing where your requests originate?

DATA REQUIREMENT

This play requires analysis of work order ticket volume, data request patterns, and engineering overhead across 41+ utility customers with work order system integration.

This benchmarking data showing internal request patterns and efficiency gains is proprietary to your customer base - competitors cannot quantify these operational savings without similar implementation data.

PVP Public + Internal Strong (8.4/10)

3 Utilities in Your Region Finished Digitization

What's the play?

Use geographic proximity (200 miles) to make recent customer implementations feel immediately relevant. Show prospects that nearby utilities completed full asset digitization and achieved measurable operational improvements.

Why this works

Geographic specificity (200 miles, "your region") makes this feel like hyper-local intelligence rather than generic marketing. The recent timeline (past 9 months) proves this isn't outdated case study material. The 38% outage response improvement ties directly to the prospect's #1 KPI. The competitive angle ("puts you at disadvantage for storm response contracts") creates urgency by framing this as a competitive threat. The deployment timeline case study offers practical implementation guidance.

Data Sources
  1. Internal Customer Deployment Data - implementation timelines, operational improvements from 3+ regional utilities
  2. EIA Form 861 - utility location, service territory for proximity calculation
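
The 200-mile proximity filter is easy to automate once each utility has coordinates. A minimal sketch, assuming you've separately geocoded the service territories identified via EIA Form 861 (the form lists territories, not coordinates, so the geocoding step is yours; all names and coordinates below are illustrative):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# Hypothetical geocoded customer locations (lat, lon).
customers = {
    "Customer X": (39.10, -84.51),  # Cincinnati area
    "Customer Y": (41.50, -81.69),  # Cleveland area
    "Customer Z": (35.23, -80.84),  # Charlotte area
}
prospect = (40.00, -83.00)          # Columbus area

# Customers within 200 miles of the prospect, for the "3 utilities in your region" claim.
nearby = [name for name, (lat, lon) in customers.items()
          if haversine_miles(prospect[0], prospect[1], lat, lon) <= 200]
print(nearby)
```

Run this per prospect against your customer base, and the play's headline number falls out of `len(nearby)`.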

The message:

Subject: 3 utilities in your region finished digitization

Three utilities within 200 miles of you completed full asset digitization in the past 9 months - their average outage response improved by 38%. Your current paper-based records put you at competitive disadvantage for storm response contracts. Want the case study showing their deployment timelines?

DATA REQUIREMENT

This play requires tracked deployment timelines and measured outage response improvements from 3+ utility customers within the same geographic region.

This regional peer data with specific operational outcomes is proprietary to your customer base - competitors cannot provide this hyperlocal implementation evidence.

PVP Public + Internal Strong (8.1/10)

Your Digitization Lag vs. 47 Peer Utilities

What's the play?

Use aggregated asset digitization completion rates from your customer base to show prospects exactly how they compare to peer utilities. Quantify the time lag and translate it into concrete operational costs and performance impacts.

Why this works

The specificity of "47 utilities" and "18 months behind median" feels like genuine research, not marketing fluff. The quantified cost implications ($2.3M in avoidable maintenance costs, 40 minutes of extended outage response) help the prospect justify budget internally. The low-commitment ask (just send the anonymized benchmark report) makes it easy to engage. And because the synthesis requires internal benchmarking data only you hold, it passes the "how did they know that?" test.

Data Sources
  1. Internal Customer Digitization Data - asset discovery completion rates, digitization timelines across 47+ utilities
  2. EIA Form 861 - utility size, asset counts, service territory for peer comparison
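
Turning milestone data into a "months behind median" figure is one interpolation step. A minimal sketch with an invented median curve (real milestone percentages would come from your aggregated digitization data):

```python
# Hypothetical median digitization curve from aggregated customer data:
# percent of assets digitized at each month since kickoff.
median_curve = {6: 35.0, 12: 70.0, 18: 95.0}

def months_behind(prospect_pct: float, prospect_month: int) -> float:
    """Months the prospect trails the median curve at its current digitization level."""
    # Find when the median cohort hit the prospect's current percentage,
    # interpolating linearly between milestone months. Percentages outside
    # the curve's range fall through to 0.0 in this simplified sketch.
    months = sorted(median_curve)
    for lo, hi in zip(months, months[1:]):
        p_lo, p_hi = median_curve[lo], median_curve[hi]
        if p_lo <= prospect_pct <= p_hi:
            median_month = lo + (prospect_pct - p_lo) / (p_hi - p_lo) * (hi - lo)
            return prospect_month - median_month
    return 0.0

# A prospect at 40% digitized after 13 months:
print(f"{months_behind(40.0, 13):.1f} months behind median")
```

The "18 months behind" headline in the message is this calculation run against your own cohort curve.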

The message:

Subject: Your digitization lag vs. 47 peer utilities

We benchmarked 47 utilities' asset digitization rates - yours is tracking 18 months behind the median completion timeline. That gap typically adds $2.3M in avoidable maintenance costs and extends outage response by 40 minutes. Want the anonymized benchmark report showing where you stand?

DATA REQUIREMENT

This play requires aggregated asset digitization completion rates (median % of assets digitized at 6/12/18-month milestones) across 47+ utility implementations.

This benchmarking data is proprietary to your customer base - only you have visibility into actual digitization timelines across comparable utilities.

What Changes

Old way: Spray generic messages at job titles. Hope someone replies.

New way: Use proprietary benchmarking data from your customer base to show prospects exactly how they compare to peer utilities in their region.

Why this works: When you lead with "Your outage response is 22 minutes slower than 38 regional peers" instead of "I see you're hiring for operations roles," you're not another sales email. You're the person with insider data they can't get anywhere else.

The messages above aren't templates. They're examples of what happens when you combine internal performance benchmarks with specific utility situations. Your team can replicate this using aggregated customer data from your implementation base.

Data Sources Reference

Every play traces back to internal benchmarking data combined with public utility identification. Here are the sources used in this playbook:

  1. Internal Customer Storm Response Data
     Key fields: restoration times, crew dispatch patterns, asset visibility capabilities
     Used for: benchmarking storm response performance across regional utilities
  2. Internal Customer Performance Data
     Key fields: emergency response times, dispatch metrics, outage-to-crew-dispatch times
     Used for: comparing outage response performance across utilities
  3. Internal Customer Deployment Data
     Key fields: implementation timelines, ROI measurements, operational improvements, phasing approaches
     Used for: showing realistic deployment timelines and ROI achievement
  4. Internal Customer Project Data
     Key fields: capital project approval timelines, process steps, completion rates
     Used for: benchmarking capital planning cycle efficiency
  5. Internal Customer Work Order Data
     Key fields: data request volume, request patterns, engineering overhead analysis
     Used for: quantifying internal data request overhead and efficiency opportunities
  6. Internal Customer Digitization Data
     Key fields: asset discovery completion rates, digitization timelines, milestone tracking
     Used for: benchmarking asset digitization progress across utilities
  7. EIA Form 861 - Annual Electric Power Industry Report
     Key fields: utility_name, service_territory_states_counties, distribution_circuits_count, customer_counts_by_sector
     Used for: utility identification, service territory verification, peer grouping by region and size
  8. FERC Form 1 - Electric Utility Annual Report (PUDL)
     Key fields: plant_in_service_valuations, depreciation_by_function, transmission_lines_schedule_422
     Used for: capital investment patterns, asset valuation context