Listening Smarter: Building Better Patient Experience Surveys
Oct. 31, 2025

Patient experience surveys have evolved greatly over time, from yesteryear's suggestion boxes placed in hospital lobbies and informal face-to-face interviews to today's standardized, nationally mandated tools linked to CMS reimbursements, national recognition and awards, and better clinical experiences and outcomes.
Now Houston Methodist is working to take patient surveying to the next level. The idea is to help develop technology that will yield even more useful predictive analytic models.
"We're building tools that don't currently exist," says Courtenay Bruce, associate chief experience officer at Houston Methodist and the project leader. "We're beta testers, using our hospital data, implementing the new technology and then refining it. We're the products' co-creators and guinea pigs."
Houston Methodist has long prioritized listening to patients and families, well before today's data-driven surveys became standard. For example, the health system partnered with vendors to send timely email and text reminders, along with educational content, to help patients prepare for surgeries and stay engaged afterward. These efforts have led to strong patient adherence and fewer readmissions, reflecting Houston Methodist's commitment to proactive, compassionate care.
Foremost among the new innovations is a digital rounding platform that records patients' real-time feedback when clinicians check in on them. Houston Methodist has expanded and customized the platform to cover issues that align with survey topics and quality measures.
A machine-learning platform that analyzes free-text comments from patient experience surveys and organizes them into themes and subthemes has also proved invaluable at the hospital.
"Before we had access to this platform, reviewing and categorizing thousands of patient comments was a time-intensive process prone to delays and human error," says Lindsay Phend, director of Patient Experience Systems & Operations at Houston Methodist. "You'd pull everything into a spreadsheet, try to categorize the different themes, what looks like a trend. This tool does it for you in a matter of clicks."
But surveys have their limitations. Bruce yearns for a metric, for instance, to show an association between patient experience "responsiveness" scores (how patients rate staff on how quickly they respond to restroom needs and other requests) and, say, the speed at which staff respond when a patient falls getting out of their hospital bed. Currently, there isn't conclusive data to show a connection.
Bruce's projects involve predictive tools using machine learning and AI to enhance the patient's hospital experience.
Existing predictive models, based on a handful of demographic considerations, "were no better than our gut," says Bruce. So she came up with another 10 variables (how long an ER patient had to wait for a bed, how long before they could resume eating after surgery, how long they'd been in the hospital) that she thought might yield greater insights about a patient's experience.
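A minimal sketch of what such a model could look like appears below, assuming encounter-level features along the lines Bruce describes. The feature names, sample data, and model choice are hypothetical, not the system under development.

```python
# Minimal sketch, not the production model: a classifier trained on a mix of
# demographic and operational variables like the ones Bruce describes. The
# feature names, sample data, and model choice are all assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical encounter-level data: a demographic field plus operational signals
# such as ER boarding time, hours until eating resumed after surgery, and length of stay.
encounters = pd.DataFrame({
    "age": [67, 45, 82, 29, 74, 51],
    "er_bed_wait_hours": [9.0, 1.5, 12.0, 0.5, 7.5, 2.0],
    "hours_to_resume_eating": [30, 6, 40, 4, 24, 8],
    "length_of_stay_days": [8, 2, 11, 1, 6, 3],
    "experience_gap": [1, 0, 1, 0, 1, 0],  # label: patient reported a gap in their experience
})

X = encounters.drop(columns="experience_gap")
y = encounters["experience_gap"]
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Score a current inpatient so the care team can add an intentional touchpoint if needed.
current_patient = pd.DataFrame({
    "age": [70],
    "er_bed_wait_hours": [10.0],
    "hours_to_resume_eating": [28],
    "length_of_stay_days": [7],
})
print(model.predict_proba(current_patient)[0, 1])  # estimated risk of an experience gap
```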
Such insights help identify patients who may have encountered a gap in their experience. These moments, while often small, can feel significant to the patient.
"Our goal is to compassionately support patients who may be experiencing moments of uncertainty or discomfort during their stay," says Bruce. "By bringing awareness to these situations, we empower our care teams to respond with empathy and intentional communication. Often, these are simple, human-centered touches. A kind word, a timely update, or a thoughtful gesture can make a meaningful difference in how a patient feels cared for."
The difference between the two models is stark: the new one is accurate 80% of the time, the old one just 26%.
Houston Methodist will feed the new model's output into the rounding program so that clinicians know where they need more intentional touchpoints and where they can make the patient experience stronger.
The importance of such contact is already apparent with the existing rounding platform, where staff round on patients to elicit their feedback. Eighty percent of patients who felt they were rounded on gave Houston Methodist a high overall rating, compared with 70% of those who did not perceive rounding.
The other predictive tool under development anticipates patient experience three months in advance. The old model, based on historical trends involving factors like seasonality, volume and composition of surgery patients, is accurate some 99% of the time, but that's mostly because its predictions are fairly intuitive: that ER waits will be longer in February, for instance.
That doesn't exactly tell hospital officials anything they didn't already know. So Houston Methodist introduced new data sets into the model to make the AI system more sophisticated. As a result, the model is now better at identifying the factors causing problems and the actions that can be taken. It's also better at detecting patterns.
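One plausible shape for that kind of forecast is sketched below, assuming monthly history with seasonal indicators plus operational drivers. The data, column names, and modeling choice are illustrative, not Houston Methodist's actual model.

```python
# Illustrative sketch only: a three-month-ahead forecast built from monthly history.
# The data, column names, and modeling choice (month-of-year dummies plus operational
# drivers in a linear model) are assumptions, not Houston Methodist's actual model.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical monthly history: an experience score alongside surgical volume and ER waits.
history = pd.DataFrame({
    "month": pd.date_range("2023-01-01", periods=24, freq="MS"),
    "surgical_volume": [820, 790, 850, 840, 860, 830, 810, 800, 845, 870, 855, 900,
                        830, 805, 865, 850, 875, 840, 820, 815, 860, 885, 870, 915],
    "avg_er_wait_minutes": [95, 110, 80, 75, 70, 72, 78, 82, 76, 74, 85, 92,
                            98, 115, 82, 77, 71, 73, 80, 84, 78, 75, 88, 95],
    "experience_score": [71, 68, 74, 75, 76, 75, 74, 73, 75, 76, 72, 70,
                         70, 67, 73, 75, 77, 76, 74, 73, 75, 76, 71, 69],
})

# Month-of-year dummies capture the intuitive seasonal swings (e.g. tougher Februaries);
# the operational columns stand in for the "new data sets" meant to add explanatory power.
features = pd.get_dummies(history["month"].dt.month, prefix="m", dtype=float)
features[["surgical_volume", "avg_er_wait_minutes"]] = history[
    ["surgical_volume", "avg_er_wait_minutes"]]

model = LinearRegression().fit(features, history["experience_score"])

# Forecast three months ahead using planned surgical volume and expected ER waits.
future = pd.DataFrame({
    "month": pd.date_range("2025-01-01", periods=3, freq="MS"),
    "surgical_volume": [835, 810, 870],
    "avg_er_wait_minutes": [97, 112, 81],
})
f_feats = pd.get_dummies(future["month"].dt.month, prefix="m", dtype=float)
f_feats = f_feats.reindex(columns=features.columns, fill_value=0)
f_feats[["surgical_volume", "avg_er_wait_minutes"]] = future[
    ["surgical_volume", "avg_er_wait_minutes"]]
print(model.predict(f_feats))  # projected experience scores for the next three months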
"What we've done is essentially identified shouts and whispers," says Bruce. "Shouts are things the data is saying are highly visible to patients, they're screaming at us that we need to improve them. But it's also something we can probably improve pretty quickly because it's usually something that's specific to the individual and so feels more controllable. Whispers are things you want to get on top of because they're not temporary blips, they're usually systemic. If you don't address it, it's going to impact patients' experience."
Bruce says her approach to innovation is to help both clinicians and patients.
"I think we're learning and becoming more sophisticated in improving the patient experience and concentrating our time and effort to make the lives of patients and clinicians easier," says Bruce. "With each sort of iterative change we make, we get closer and closer."