Most PI firms buy revenue intelligence the way they buy office furniture: someone sees a demo, likes the look of it, and signs a contract. Six months later, the platform is half-implemented, the marketing director is still pulling numbers from spreadsheets, and the managing partner is asking why the firm is paying $4,000 a month for a tool nobody uses.
A structured evaluation process prevents this. It takes more time upfront — typically 4 to 6 weeks — but it dramatically increases the odds that the platform you choose will actually deliver the 15 to 20% marketing ROI improvement that these tools promise.
Here's how to run one, step by step.
Step 1: Assemble the Right Evaluation Team
Revenue intelligence touches marketing, intake, finance, and firm leadership. If only one person evaluates it, you'll miss requirements that surface later — usually during implementation, when they're expensive to address.
Your evaluation team should include:
- Marketing director or leader — primary user, owns vendor relationships and budget allocation. This person will live in the platform daily.
- Intake manager — understands lead flow, knows where data breaks down between lead capture and case signing.
- Managing partner or COO — approves budget, needs to understand what reports will look like and how cost per case data will inform partner meetings.
- IT or operations lead (if you have one) — evaluates integration complexity, data security, and system compatibility.
Assign one person as the evaluation lead. That's usually the marketing director. Their job is to coordinate schedules, gather feedback, and keep the process moving.
Step 2: Define Your Requirements Before You See a Single Demo
This is where most firms skip ahead. They schedule demos with three vendors in the same week, watch impressive presentations, and then try to remember which one had the feature they cared about. Don't do that.
Before any demo, create a requirements document. It doesn't need to be elaborate — a shared spreadsheet works. Organize it into three tiers:
Must-Have Requirements
These are non-negotiable. If a vendor can't meet them, they're out.
- Cost per case tracking from lead to settlement
- Native integration with your case management system (e.g., LeadDocket, Filevine)
- Vendor-level performance reporting (not just channel-level)
- Settlement-cycle attribution that handles your 6- to 18-month timeline
- Data export capability — you own your data
Important Requirements
These significantly improve value but aren't dealbreakers.
- Automated alerts when vendor performance drops
- Budget vs. actual tracking across vendors
- Intake performance metrics by source
- Configurable dashboards for different stakeholders
- Multi-location reporting (if applicable)
Nice-to-Have Requirements
These differentiate vendors when two are otherwise close.
- Predictive analytics for lead volume
- Automated vendor scorecard generation
- API access for custom reporting
- Case type and geography breakdowns
| Tier | Examples | Rule |
|---|---|---|
| Must-Have | Cost-per-case tracking, CMS integration, settlement attribution | Fail = vendor eliminated |
| Important | Automated alerts, budget vs. actual, multi-location | Score 1-5, weighted 2x |
| Nice-to-Have | Predictive analytics, API access, auto scorecards | Score 1-5, tiebreaker |
Step 3: Identify Your Shortlist
Don't evaluate more than three vendors. Two is even better. The quality of your evaluation degrades with every additional option — demo fatigue is real, and your team's time is limited.
To build your shortlist, look for vendors that specifically serve PI firms. Generic legal tech platforms and repurposed SaaS analytics tools will look good in a demo but struggle with the PI-specific challenges: settlement delays, vendor grading, intake integration, and the multi-system data problem.
Ask each vendor to complete a brief capabilities questionnaire based on your must-have requirements before scheduling a demo. This eliminates vendors who can't meet your baseline — and it saves everyone's time.
Step 4: Run Structured Demos
Here's where most evaluation processes go wrong: the vendor controls the demo. They show their best features, use perfect sample data, and steer the conversation away from limitations. You need to reverse that dynamic.
Before each demo, send the vendor a list of scenarios you want to see. These should be based on your actual operations:
- “Show me the cost per case for a vendor where we spent $40,000 last month and received 85 leads, 22 of which signed, and 3 of which have settled so far.”
- “Show me what happens when a lead comes in without a source tag. How is it handled? Where does it appear in reports?”
- “Show me the report I would present to my managing partner at our monthly budget review.”
- “Show me how you handle a vendor whose cost per lead is low ($180) but whose cost per signed case is high ($4,800) because their conversion rate is poor.”
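The arithmetic behind these scenarios is worth making explicit, because it's exactly what the platform should compute for you. A minimal sketch, using the figures from the scenarios above (the spend and lead counts in the second call are illustrative assumptions chosen to match the $180 cost per lead and $4,800 cost per signed case):

```python
# Cost math behind the demo scenarios above. Dollar figures come from this
# guide's hypothetical examples, not from benchmarks.

def vendor_costs(spend, leads, signed, settled):
    """Cost per lead, per signed case, and per settled case for one vendor."""
    return {
        "cost_per_lead": round(spend / leads, 2) if leads else None,
        "cost_per_signed_case": round(spend / signed, 2) if signed else None,
        "cost_per_settled_case": round(spend / settled, 2) if settled else None,
    }

# First scenario: $40,000 spend, 85 leads, 22 signed, 3 settled.
print(vendor_costs(40_000, 85, 22, 3))
# -> cost_per_lead 470.59, cost_per_signed_case 1818.18, cost_per_settled_case 13333.33

# Last scenario: cheap leads ($180 each), but a weak signing rate
# (9 of 240, or 3.75%) pushes each signed case to $4,800.
print(vendor_costs(43_200, 240, 9, 0))
```

The second scenario is the trap the demo question is designed to expose: a vendor can look excellent on cost per lead while being your most expensive source of signed cases.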
Have every stakeholder attend the same demo for each vendor; comparing notes across separate demos is unreliable.
Step 5: Score Each Vendor Consistently
After each demo, have every team member score the vendor independently before discussing. This prevents anchoring — where the most vocal person in the room shapes everyone else's opinion.
Use a simple scoring framework:
- Must-have requirements: Pass/Fail. If any must-have fails, the vendor is eliminated regardless of other scores.
- Important requirements: Score 1 to 5. Multiply by 2 for weighted scoring.
- Nice-to-have requirements: Score 1 to 5. No multiplier.
- Ease of use: Score 1 to 5. Could your marketing director use this daily without IT support?
- PI-specific depth: Score 1 to 5. Did this feel like a platform built for PI, or a generic tool with a PI label?
- Team confidence: Score 1 to 5. How confident is your team that this vendor will deliver what they promised?
Total the weighted scores. The math won't make the decision for you, but it will surface where your team agrees and where opinions diverge.
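The framework above fits in a few lines of Python. The weights and pass/fail rule follow the scoring rules described here; the category labels and sample scores are invented for illustration:

```python
# Weighted-score sketch for Step 5. The 2x weight for "important" requirements
# and the must-have elimination rule follow the framework above; the sample
# scores below are hypothetical.

def score_vendor(must_haves_pass, important, nice_to_have, judgment):
    """Total one evaluator's scores; each list holds 1-5 ratings."""
    if not must_haves_pass:
        return None  # any failed must-have eliminates the vendor outright
    return (2 * sum(important)    # important requirements, weighted 2x
            + sum(nice_to_have)   # nice-to-haves, no multiplier
            + sum(judgment))      # ease of use, PI depth, team confidence

# One evaluator's scores for a hypothetical vendor:
total = score_vendor(
    must_haves_pass=True,
    important=[4, 5, 3, 4, 2],    # e.g., alerts, budget tracking, intake metrics
    nice_to_have=[3, 2, 4, 3],
    judgment=[4, 3, 5],           # ease of use, PI-specific depth, confidence
)
print(total)  # 60
```

Have each evaluator produce their own total before the group discussion; comparing the per-category sums, not just the totals, is what surfaces where opinions diverge.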
Step 6: Check References — the Right Way
Every vendor will give you their happiest customers as references. That's expected. Your job is to ask questions that go beyond “Do you like the platform?”
Questions that actually reveal truth:
- “How long did implementation take from contract signing to your first useful report? Was it what you expected?”
- “How many hours per week does your team spend in the platform? Is that more or less than you expected?”
- “Have you made a specific budget decision — cutting a vendor or increasing spend — based on data from this platform? What was it?”
- “What's the one thing you wish the platform did better?”
- “If you were starting over, would you choose this vendor again?”
If a reference can point to a specific dollar amount they saved or reallocated — say, “We cut $35K per month from a vendor that looked good on cost per lead but was terrible on cost per case” — that's a strong signal.
Step 7: Negotiate a Pilot Period
This is the step that protects you the most. Before committing to an annual contract, ask for a 60- to 90-day pilot. Not a free trial — a paid pilot with defined success criteria.
Success criteria for a PI revenue intelligence pilot should include:
- Integration with your case management system is live and syncing data accurately within 30 days.
- Your marketing director can pull a vendor-level cost-per-case report without help from the vendor's support team by day 45.
- At least one actionable insight — a vendor to cut, a budget to reallocate, a performance trend to investigate — surfaces by day 60.
If the vendor won't agree to a pilot, that's not automatically a disqualifier — some vendors have legitimate reasons for annual commitments, especially when implementation costs are high. But it should prompt a conversation about what happens if the platform doesn't meet expectations.
Common Mistakes in PI Revenue Intelligence Evaluations
Having watched firms go through this process, we see the same mistakes again and again:
- Evaluating features instead of outcomes. The question isn't “Does it have 47 dashboard widgets?” The question is “Will this show me which of my 8 vendors is wasting $30K per month?”
- Letting the demo drive the conversation. Every demo looks good when the vendor controls the narrative. Your scenarios, your data, your questions.
- Ignoring the settlement delay problem. If you don't specifically ask how the platform handles cases that take 12+ months to settle, you won't discover the limitation until you're already committed.
- Skipping the intake manager's input. Your intake team knows where data breaks down. They know which vendors send leads without proper tagging. Their perspective is critical.
The Timeline That Works
A well-run evaluation takes 4 to 6 weeks. Here's a realistic timeline:
- Week 1: Assemble team, document requirements, identify shortlist of 2 to 3 vendors.
- Week 2: Send capabilities questionnaires, schedule demos.
- Week 3: Run demos with full team, score independently.
- Week 4: Compare scores, conduct reference checks.
- Week 5: Negotiate terms, define pilot success criteria.
- Week 6: Final decision and contract signing.
Yes, this takes longer than signing up after a single impressive demo. But for a tool that will measure whether your $200K to $500K monthly marketing spend is actually working, six weeks of due diligence is a rounding error. The cost of choosing wrong — lost months, lost data, lost confidence from your partners — is far higher.
Related guide: See our complete guide to revenue intelligence for PI firms — the four layers, the maturity model, and what RI replaces in your current stack.
