Revenue Intelligence · 5 min read · 2026-04-01

The 5 Questions Every PI Managing Partner Should Ask About AI


Your marketing director just sent over a proposal for an AI-powered analytics tool. Your intake manager forwarded an article about “predictive lead scoring.” And last week, a vendor pitched you on “machine learning for case valuation.” The budget requests are increasing. The clarity is not.

You are not opposed to AI. You are opposed to writing checks for technology your team cannot explain in plain business terms. That is a reasonable position. The problem is that most AI pitches are designed to make you feel like saying no means falling behind — when in reality, saying yes without asking the right questions is the bigger risk.

Here are five questions that will separate a sound investment from an expensive distraction. Each one includes what a good answer sounds like and what should make you push back.

Question 1: What Specific Decisions Will This Help Us Make Better?

This is the question that kills 80% of bad AI proposals on the spot. If the tool does not connect directly to a decision your firm makes regularly — where to allocate budget, which vendors to keep, how to staff intake — it is a solution without a problem.

AI tools are not inherently valuable. They are valuable when they improve the quality or speed of a decision that has real financial consequences. A tool that helps you identify which lead vendors are delivering cases that actually settle above $150,000 — that changes a budget decision. A tool that “uses AI to visualize your data” does not change anything if you are still guessing about what the data means.

What to Listen For

Red-Flag Answers

  • "It gives you better insights into your data"
  • "It uses machine learning to find patterns"
  • "It will transform how your team works"
  • "Everyone in your space is adopting this"

Good Answers

  • "It tells you which vendors to increase budget for and which to cut, based on cost per settled case"
  • "It flags when a lead source's conversion rate drops below your threshold before you waste a full month of spend"
  • "It predicts which case types will settle highest by source so you can allocate accordingly"

The pattern is straightforward. Good answers name the decision. Red-flag answers name the technology. You are buying an outcome, not an algorithm.
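The "cost per settled case" metric that the good answers keep returning to is simple arithmetic. Here is a minimal sketch in Python of the calculation a tool (or a spreadsheet) would run; the vendor names and figures are invented for illustration:

```python
# Hypothetical example: cost per settled case by vendor.
# All vendor names, spend figures, and case counts are made up.
from collections import defaultdict

# (vendor, spend, settled_cases) over the same tracking period
records = [
    ("VendorA", 60_000, 4),
    ("VendorB", 45_000, 1),
    ("VendorC", 30_000, 3),
]

spend = defaultdict(float)
settled = defaultdict(int)
for vendor, cost, cases in records:
    spend[vendor] += cost
    settled[vendor] += cases

for vendor in spend:
    # Guard against vendors with zero settled cases
    cps = spend[vendor] / settled[vendor] if settled[vendor] else float("inf")
    print(f"{vendor}: ${cps:,.0f} per settled case")
```

Even on these made-up numbers, the ranking by cost per settled case does not match the ranking by raw spend, and that gap is exactly the budget decision this question is probing for.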

Question 2: What Data Does This Need That We Don't Currently Have?

Every AI tool has a data prerequisite. If your team cannot clearly articulate what data the tool needs, where that data lives today, and what gaps exist — the implementation will stall. You will pay for a platform that sits half-configured while your marketing director spends three months trying to get clean data into it.

This question matters more than most partners realize. A firm that tracks cost per lead in a spreadsheet but has no systematic connection between lead source and settlement outcome is not ready for AI-powered attribution. That is not a criticism. It is a sequencing issue. You need a data foundation before you build on top of it.

What to Listen For

Red-Flag Answers

  • "It works with whatever data you have"
  • "We just need to connect it to your CRM and it handles the rest"
  • "The AI figures out what data matters"

Good Answers

  • "It needs lead source, cost data, signed case status, and settlement amounts — we currently have the first three in LeadDocket but settlement data is tracked separately in our case management system"
  • "We will need 6 months of historical cost-per-vendor data and intake disposition records to baseline"
  • "There is a 30-day integration period to connect the data sources before reporting is accurate"

A vendor who tells you their tool “works with whatever you have” is either oversimplifying or selling you something that produces unreliable output. The honest answer is always specific about what is needed and what gaps need closing first.
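A specific answer to the data question can be checked almost mechanically: list the fields the tool needs, list the fields your systems actually capture, and the difference is your gap list. A minimal sketch, with hypothetical field names standing in for your actual schema:

```python
# Hypothetical data-prerequisite check. Field names are invented
# placeholders -- map them to your CRM and case management fields.
required = {"lead_source", "lead_cost", "signed_status", "settlement_amount"}
currently_tracked = {"lead_source", "lead_cost", "signed_status"}

gaps = sorted(required - currently_tracked)
if gaps:
    print("Close these data gaps before implementation:", ", ".join(gaps))
else:
    print("Data prerequisites met.")
```

This mirrors the good answer above: three of four fields exist, and settlement data is the gap to close before the tool can report accurately.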

Question 3: How Will We Measure Whether It's Working?

If you cannot define success before you buy the tool, you will not be able to evaluate it after. This is where many AI investments quietly become permanent line items — nobody can prove they are working, but nobody wants to admit the investment was a mistake, so the subscription renews indefinitely.

The success metric should connect to something your firm already measures and cares about. Cost per case. Marketing spend efficiency. Time spent on reporting. Vendor retention decisions. If the proposed metric is something you have never tracked before and cannot validate independently, you have no way to hold the tool accountable.

What to Listen For

Red-Flag Answers

  • "You'll see the value once you start using it"
  • "Our clients typically see improvement across the board"
  • "The dashboard will show you ROI metrics"

Good Answers

  • "Within 90 days you should see a measurable reduction in blended cost per case — we benchmark against your current numbers at implementation"
  • "We measure success by whether you can reallocate at least 15% of vendor spend based on data the platform surfaces"
  • "Your marketing director should go from 15 hours per week on reporting to under 2 hours — that is a trackable time savings"

The best tools commit to a specific, time-bound outcome and give you a way to verify it independently. If the vendor cannot tell you what success looks like in concrete terms, they are not confident in their own product.

Question 4: What Happens to Our Process if the Tool Goes Away?

This is the dependency question, and it is the one most partners forget to ask. If you build your marketing accountability process around a platform that owns all the logic, all the data transformations, and all the reporting — what happens if the vendor raises prices 40%? What happens if they get acquired? What happens if the tool simply does not deliver and you need to walk away?

You are not looking for zero dependency. Any useful tool creates some dependency. You are looking for whether the tool strengthens your team's capabilities or replaces them entirely. The difference matters when something changes.

What to Listen For

Red-Flag Answers

  • "You won't want to leave once you see the results"
  • "Our proprietary algorithm is what makes this work"
  • "All your data lives in our platform"

Good Answers

  • "Your data stays in your systems — we read from your CRM and case management tools, we do not replace them"
  • "If you cancel, you keep all historical reports and exports — nothing is locked behind our paywall"
  • "The process we help you build — tracking cost per case by vendor through settlement — is yours regardless of whether you use our tool or a spreadsheet"

The right tool makes your team better at a process they own. The wrong tool makes your team dependent on a process they cannot replicate.

Question 5: What Is the Realistic Payback Timeline?

You run a law firm. You understand that investments have payback periods. A new associate does not generate net revenue on day one. A marketing campaign does not produce settled cases in the first quarter. AI tools are no different — and any vendor who tells you otherwise is selling a fantasy.

The key word here is “realistic.” A 60-to-90-day payback timeline on a revenue intelligence tool is credible if the firm is spending $200,000 or more per month on lead generation, because even a small reallocation based on better data can recover the platform cost quickly. A “see ROI in 30 days” claim for any analytics tool should raise questions.

Payback Timeline Reality Check

  • Time to first usable report. Red-flag claim: Day 1. Realistic: 30–45 days after integration.
  • Time to first budget reallocation decision. Red-flag claim: Week 1. Realistic: 60–90 days with baseline data.
  • Time to measurable cost per case improvement. Red-flag claim: 30 days. Realistic: 90–180 days, depending on settlement lag.
  • Time to full ROI payback. Red-flag claim: Immediate. Realistic: 6–12 months for a complete attribution cycle.

Be skeptical of compressed timelines. Be equally skeptical of vendors who cannot give you a timeline at all. The honest answer includes dependencies — it depends on your data readiness, your team's adoption, and the settlement cycle for your case types. Anyone who ignores those variables is not doing the math.
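The $200,000-per-month claim above can be sanity-checked with back-of-envelope arithmetic. All four inputs below are assumptions chosen for illustration, not figures from any real platform:

```python
# Back-of-envelope payback estimate. Every figure here is hypothetical.
monthly_lead_spend = 200_000   # firm's monthly lead-generation budget
reallocated_share = 0.10       # share of spend moved to better vendors
efficiency_gain = 0.25         # improvement on the reallocated portion
platform_cost_monthly = 3_000  # assumed tool subscription

monthly_benefit = monthly_lead_spend * reallocated_share * efficiency_gain
net_monthly = monthly_benefit - platform_cost_monthly
print(f"Estimated monthly benefit: ${monthly_benefit:,.0f}")
print(f"Net after platform cost:   ${net_monthly:,.0f}")
```

At these assumed numbers, reallocating 10% of spend to vendors who perform 25% better yields roughly $5,000 a month, which comfortably covers a $3,000 subscription. Run the same arithmetic at a tenth of the spend and the tool no longer pays for itself, which is why payback timelines are credible only above a certain spend level.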

The Meta-Question: Is Your Data Foundation Ready for AI?

Here is the question behind all five questions. Before you evaluate any AI tool, ask whether your firm has the data infrastructure to support it.

If you cannot tell your marketing director today what your cost per signed case is for each vendor — broken down by case type, tracked over the last 12 months, with at least partial settlement data attached — then an AI tool is not your next step. Your next step is building the data foundation that makes AI useful.

This is not a sales pitch for waiting. It is a sequencing argument. The firms that get the most value from AI in marketing attribution are the ones that already have clean, connected data flowing from lead source through intake through settlement. AI accelerates what is already working. It does not fix what is broken.

Think of it this way: buying an AI analytics tool without a data foundation is like buying a car without fuel. The car is real. The engineering is sound. But you are not going anywhere until the tank is full.

Start with the data. Connect your lead sources to your intake outcomes to your settlement results. Get cost per case by vendor into a system that updates automatically — not a spreadsheet that requires 15 hours a week. Once that foundation exists, the right AI tool will be obvious, because you will know exactly what decision you need it to improve.

And when your marketing director brings you the next AI proposal, you will have five questions ready. Vendors who can answer all five clearly deserve a serious conversation. Those who cannot should come back when they can.

Related guide: See our complete Managing Partner's Guide to Marketing ROI — what to ask, what to measure, and how to know if your marketing spend is producing a return.

Related guide: For the full Revenue Intelligence framework behind this piece, read our pillar, Revenue Intelligence for PI Firms, covering Performance, Intake, Source, and Financial Intelligence, plus the maturity assessment every firm should run.


Want to see Revenue Intelligence in action?

See how RevenueScale connects your marketing spend to case outcomes — so you can cut waste, scale winners, and prove ROI to partners.