The worst vendor performance problems aren't the sudden ones. A vendor who disappears overnight or whose lead quality collapses visibly over two weeks gets noticed fast. The real damage comes from the slow decline — the gradual, month-over-month erosion that stays just below the threshold of obvious concern while steadily consuming budget that should be going somewhere more productive.
This scenario plays out at PI firms more often than anyone likes to admit. Here's how it typically unfolds.
Related guide: See our complete guide to evaluating PI lead vendors — the 7 metrics that define vendor quality and how to build a vendor scorecard.
How a Slow Decline Starts
Vendor performance degradation rarely has a single, identifiable cause. It's usually a confluence of factors: market saturation in a geography, increased competition for the same leads, changes in the vendor's own lead sourcing strategy, or shifts in how the vendor is managing call routing and filtering.
In month one of the decline, the numbers look roughly normal. Lead volume might be down 8%, which is within the range of normal monthly variation. The conversion rate might be off slightly, but it's within noise. You're spending $40,000 with this vendor and getting about what you expect.
In month two, volume is down another 10% and conversion is trending slightly lower. But you've had slow months before. The vendor is still sending “qualified” leads by their standards. Nothing triggers an alert.
By month four, you're spending the same $40,000 per month but your cost per signed case from this vendor has risen from $1,800 to $2,900. The signed case count is down meaningfully. But because each month looks only marginally worse than the month before — not dramatically worse — the pattern doesn't surface as a problem.
Why the Pattern Stays Hidden
Slow vendor decline is particularly hard to detect for several structural reasons.
Monthly Reporting Has Too Much Noise
A single month's performance data for any vendor contains natural variation. Holidays, seasonality, and case-specific factors can move conversion rates 15 to 20% in either direction without indicating any real performance change. When you're evaluating a vendor month by month, a 15% drop looks like noise. Compounded over four months, it's nearly a 50% degradation — but you never see the cumulative view if you're only looking at the most recent month.
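As a rough sketch of that compounding effect (the 15% monthly figure comes from the example above; the code is just illustrative arithmetic, not anyone's reporting pipeline):

```python
# How a decline that looks like noise each month compounds across four months.
monthly_drop = 0.15      # 15% drop, within normal monthly variation
conversion = 1.0         # baseline conversion rate, indexed to 1.0

for month in range(1, 5):
    conversion *= (1 - monthly_drop)
    print(f"Month {month}: {conversion:.2f} of baseline "
          f"({1 - conversion:.0%} cumulative decline)")

# Month 4 ends at 0.85**4, about 0.52 of baseline: a ~48% cumulative
# degradation, even though no single month looked worse than noise.
```

Each individual step stays inside the 15–20% "normal variation" band, which is exactly why month-over-month comparison never fires an alarm.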
Spreadsheet Reporting Doesn't Surface Trends
Most PI firms track vendor performance in a spreadsheet or a simple monthly report. Those formats show you current numbers — this month's leads, this month's cases, this month's cost per case. They rarely show you a rolling trend over the past six months in a format that makes a gradual decline visually obvious.
If you have to manually compare month-over-month data across spreadsheet tabs, you're unlikely to catch a 10% month-over-month decline before it compounds into something significant.
The Attribution Lag Hides Case Conversion Problems
Because cases take time to sign after a lead arrives — and because reporting is often done at month-end — there's a built-in delay between when lead quality declines and when that decline shows up in your signed case numbers. A vendor whose lead quality started degrading in September might not show up as having declining conversion rates until your October or November reporting, because October's signed cases still include some September leads that were in the pipeline.
Good Vendor Relationships Create Blind Spots
When you have a good working relationship with a vendor — you've worked with them for two years, they respond quickly, they occasionally make credit adjustments when you push back — there's a human tendency to give them the benefit of the doubt. “It's been a slow month.” “The market is tough right now.” “They've been solid historically.” These are all reasonable statements that can delay the recognition of a real performance problem.
What the Cost Looks Like
Let's make the numbers concrete. A vendor degrading from $1,800 to $2,900 cost per case over four months, at $40,000 per month in spend:
- Month 1 (baseline): $40,000 spend → 22 signed cases at $1,818/case
- Month 2: $40,000 spend → 19 signed cases at $2,105/case
- Month 3: $40,000 spend → 17 signed cases at $2,353/case
- Month 4: $40,000 spend → 14 signed cases at $2,857/case
Over four months, you signed 72 cases from this vendor instead of the 88 you would have received at baseline performance. At an average case value of $15,000 in attorney fees, that's approximately $240,000 in lost case value — while spending the same $160,000. You didn't lose the money all at once, which is why it didn't feel like an emergency. But it was still $240,000 in forgone revenue.
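The arithmetic above can be checked in a few lines. This is a sketch using the article's illustrative figures — the $40,000 monthly spend, the month-by-month signed case counts, and the $15,000 average attorney fee:

```python
# Recompute the four-month decline example.
monthly_spend = 40_000
signed_cases = [22, 19, 17, 14]   # months 1-4
baseline_cases = 22               # month 1 performance, held flat
avg_case_value = 15_000           # average attorney fees per signed case

total_spend = monthly_spend * len(signed_cases)
actual = sum(signed_cases)
expected = baseline_cases * len(signed_cases)
lost_cases = expected - actual
lost_value = lost_cases * avg_case_value
cost_per_case = [monthly_spend / c for c in signed_cases]

print(f"Spend: ${total_spend:,}")                      # $160,000
print(f"Cases: {actual} signed vs {expected} baseline")  # 72 vs 88
print(f"Lost: {lost_cases} cases, ${lost_value:,} in fees")  # 16, $240,000
print(f"Cost per case: ${cost_per_case[0]:,.0f} -> ${cost_per_case[-1]:,.0f}")
```

The same spend, fewer cases: the "loss" never shows up as a line item, which is why it doesn't feel like an emergency while it's happening.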
- Lost cases: 16 vs. baseline performance
- Lost case value: $240K at $15K average attorney fees
- Same budget spent: $160K over 4 months
How It Typically Gets Discovered
Most slow vendor declines surface through one of three mechanisms — none of them ideal.
The first is a quarterly or annual review where someone looks at rolling performance data and notices that a vendor's cost per case has been climbing steadily. By the time this surfaces in a quarterly review, it's been happening for three months.
The second is an intake team complaint. Frontline intake staff often notice declining lead quality before the numbers show it in reports. “We've been getting a lot of calls from Vendor X that aren't great quality lately” is a signal worth taking seriously — but it's an informal feedback loop, not a systematic measurement.
The third is an outside prompt: a contract renewal, a vendor calling to discuss the relationship, or someone deciding to audit one specific vendor for unrelated reasons. These are random triggers, not systematic monitoring.
What Would Catch It Earlier
The solution to slow vendor decline isn't complicated in concept — it's weekly or bi-weekly tracking of cost per signed case by vendor, with a trailing three-month trend view. That format makes gradual declines visible as patterns rather than noise.
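A minimal sketch of that trend check, assuming you can pull period-level spend and signed-case counts per vendor. The 20% drift threshold and the function names here are hypothetical choices for illustration, not figures from the article:

```python
def cost_per_case(spend, cases):
    """Cost per signed case for one reporting period."""
    return spend / cases if cases else float("inf")

def flag_drift(history, threshold=0.20):
    """history: oldest-to-newest list of (spend, signed_cases) periods.

    Flags when the latest period's cost per case exceeds the trailing
    average of the three prior periods by more than `threshold` (20%
    here, a hypothetical tolerance)."""
    if len(history) < 4:
        return False  # not enough periods for a trailing comparison
    trailing = [cost_per_case(s, c) for s, c in history[-4:-1]]
    baseline = sum(trailing) / len(trailing)
    latest = cost_per_case(*history[-1])
    return latest > baseline * (1 + threshold)

# The example vendor's four months: $40K/month, 22 -> 19 -> 17 -> 14 cases.
history = [(40_000, 22), (40_000, 19), (40_000, 17), (40_000, 14)]
print(flag_drift(history))  # True: month 4 sits >20% above the trailing average
```

Run weekly instead of monthly, the same comparison fires within weeks of the drift starting rather than a quarter later — the trailing window is what turns "each month looks only marginally worse" into a visible pattern.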
The implementation challenge is that most firms aren't set up to produce that level of tracking without significant manual effort. When vendor performance review is a three-hour manual process, it happens monthly at best. When it's automated and takes five minutes to review, it happens weekly — and four-month declines get caught in week four, not month four.
The firms most likely to catch slow vendor declines early are the ones that have connected their marketing spend data to their case outcome data in a system that updates regularly. Whether that's a well-built spreadsheet reviewed weekly or a purpose-built attribution platform, the underlying requirement is the same: consistent data + consistent review cadence = problems caught early.
