How We Picked an AI Partner for Our Practice
Eighteen months ago, our practice decided we needed help with AI. We weren’t looking for robots to replace clinicians — we wanted smarter scheduling, better data analytics, and some relief from the administrative overhead that was eating our staff alive. The question wasn’t whether to bring in an AI partner. It was how to choose one without wasting time and money.
I’m writing this because the process was harder than I expected, and I think other medical practice owners could benefit from hearing what we learned.
Step One: Defining What We Actually Needed
This sounds obvious but we almost skipped it. Our initial brief was something vague like “we want to use AI to improve efficiency.” That’s not a brief. That’s a wish.
We sat down with our practice manager and our senior billing coordinator and listed every administrative task that consumed more than two hours per week. The list was long: scheduling and rescheduling, CPAP compliance report generation, prior authorisation submissions, referral tracking, patient recall management, and billing follow-up.
Then we ranked them by two criteria: how much time they consumed and how much the current process cost us in errors or delays. Scheduling and billing rose to the top. Patient recall was close behind.
Having that ranked list changed every subsequent conversation. When vendors pitched us features we hadn’t asked for, we could redirect to our priorities. When they couldn’t address our top needs, we could disqualify them quickly.
Step Two: The Vendor Landscape
We talked to seven companies over about six weeks. The range was remarkable. Two were large healthcare IT companies adding AI modules to existing products. Three were startups focused specifically on healthcare AI. One was a general AI consultancy that worked across industries. One was a group we'd worked with previously on a smaller project and who already understood our practice context.
Here’s what we evaluated:
Healthcare expertise. We asked each vendor to describe what a typical patient journey looks like in a sleep medicine clinic. Two couldn’t do it. They were out.
Data security and privacy. Non-negotiable. We asked for data processing agreements, security certifications, and specifics about where patient data would be stored. Any vendor who responded with vague assurances rather than documentation was disqualified.
Integration capability. Could the tool connect with our practice management system, our sleep study reporting software, and our CPAP cloud platforms? Custom integrations are expensive and fragile. Native API integrations are vastly preferable.
Pricing transparency. Two vendors had pricing so complex our practice manager couldn’t model the annual cost. That’s a red flag — if you can’t predict what you’ll pay, you can’t budget for it.
References. Speaking with actual users was far more informative than any demo. One reference call revealed a vendor’s implementation took twice as long as promised.
Step Three: The Pilot
We narrowed to two finalists and ran a 60-day pilot with each, focused on scheduling optimisation. We defined success metrics upfront:
- Reduction in no-show rate
- Reduction in scheduling-related phone calls
- Staff satisfaction (measured by a simple weekly survey)
- System uptime and reliability
The pilot revealed things that demos never could. One platform had excellent AI capabilities but a clunky user interface that our reception staff found frustrating. The other was less sophisticated algorithmically but intuitive enough that the team adopted it within a week.
We went with the more usable option. Fancy AI that nobody wants to use is worthless.
The Mistakes We Made
We waited too long to involve staff. Our admin team should have been in the room from the first vendor meeting. We brought them in at the pilot stage, which was too late — we’d already wasted time evaluating tools that didn’t address their actual workflow.
We underestimated implementation time. Every vendor said "four to six weeks." Reality was closer to ten weeks. Build a buffer into your timeline.
We didn’t negotiate hard enough on the contract. Our initial contract locked us in for two years with no performance benchmarks. Insist on a shorter initial term with renewal contingent on agreed metrics.
What I’d Tell Another Practice Owner
Start with your problems, not with the technology. Don’t get seduced by impressive demos — ask whether the tool solves a problem you actually have.
Talk to references. Multiple references. Ask them specifically about implementation challenges, hidden costs, and ongoing support quality.
Run a real pilot with real data and real users. Vendor demonstrations are scripted performances. Pilots reveal reality.
Involve your front-line staff from day one. They’re not obstacles to digital transformation — they’re the people who determine whether it succeeds or fails.
And be patient. The right AI partnership won’t transform your practice overnight. But over six to twelve months, a well-chosen tool with strong support behind it will make a measurable difference to how your practice operates.
We’re about a year into our AI journey now. It hasn’t been flawless, but our scheduling is tighter, our billing is cleaner, and our staff are spending more time on patient care and less time on administrative drudgery. That was the whole point.