Title: User Research for Healthcare PMs: Balancing Compliance and Empathy
TL;DR
Most healthcare PMs fail user research not because they lack empathy, but because they treat compliance as a barrier rather than a framework. The strongest candidates align research design with regulatory constraints from day one, not after discovery. In a recent hiring committee at a top-tier digital health company, 7 of 12 PM candidates were rejected despite strong clinical insights because their research plans violated HIPAA's Safe Harbor de-identification standard or assumed access to PHI without an IRB pathway. The ones who advanced didn't just talk to users; they designed studies that could survive an audit.
Who This Is For
This is for product managers with 3–8 years of experience transitioning into healthcare from consumer tech, fintech, or B2B SaaS — particularly those applying to roles at digital health startups, hospital systems, or health tech divisions of large tech firms. You’ve run usability tests and customer interviews before, but you’ve never had a research protocol rejected by a privacy officer or been asked to justify your data retention plan in a sprint review. If your last product required FDA Class II clearance or handled identifiable health data, this is not beginner guidance — it’s calibration.
How do healthcare PMs conduct user research without violating compliance?
User research in healthcare isn't scaled-back consumer research; it's structurally different. The moment you plan to record a conversation with a patient, you're in regulated territory. In a Q3 debrief at a telehealth unicorn, a hiring manager rejected a candidate's proposal to conduct "in-home ethnographic studies" because the candidate hadn't accounted for the physical safeguards required by HIPAA's Security Rule. The candidate assumed consent was enough. It wasn't. Consent governs use, not storage, transmission, or device security.
The difference isn’t in empathy; it’s in architecture. Not emotional depth, but data provenance.
You don’t design the research flow first and compliance second. You design the compliance skeleton — data classification, retention windows, access tiers — and then hang the research methods on it.
For example, a senior PM at Epic once shared how her team wanted to study how nurses documented patient pain levels. Instead of shadowing at the bedside (which would require logging every observer in the EHR audit trail), they used de-identified screen recordings from a sandbox environment. The data wasn’t “real” in the moment, but it was compliant, and the insights were valid.
Three structural constraints to bake into every research plan (a sketch of this skeleton follows the list):
1. Data classification: Is the information PHI, PII, or de-identified? If it's PHI, HIPAA's 18-identifier Safe Harbor rule applies.
2. Data flow: Where do audio, video, and notes go? Are they encrypted in transit and at rest? Will the vendor (e.g., Zoom, Notion) sign a BAA?
3. Retention: How long are recordings stored? One PM lost an offer because their research plan kept audio files for 6 months, 150 days longer than the company's IRB-approved window.
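To make that skeleton concrete, here is a minimal sketch of how a team might encode those three constraints as a checkable data structure before any sessions are scheduled. The class names, tier labels, and defaults are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    PHI = "phi"                  # identifiable health data: strictest handling
    PII = "pii"                  # identifiable but not health-related
    DE_IDENTIFIED = "de-id"      # stripped of HIPAA's 18 identifiers

@dataclass
class ResearchDataAsset:
    """One artifact a study will produce (recording, notes, survey export)."""
    name: str
    classification: Classification
    retention_days: int          # must fit the IRB-approved window
    access_tier: str             # e.g., "core-team", "pm-only"
    vendor_has_baa: bool = False # will the tool vendor sign a BAA?

    def validate(self, irb_max_retention_days: int) -> list[str]:
        """Return a list of problems; an empty list means the plan passes."""
        problems = []
        if self.classification is Classification.PHI and not self.vendor_has_baa:
            problems.append(f"{self.name}: PHI stored in a tool without a BAA")
        if self.retention_days > irb_max_retention_days:
            problems.append(
                f"{self.name}: retention {self.retention_days}d exceeds "
                f"IRB window of {irb_max_retention_days}d"
            )
        return problems

# Example: the 6-month audio plan from the anecdote above fails immediately.
audio = ResearchDataAsset("interview_audio", Classification.PHI,
                          retention_days=180, access_tier="core-team",
                          vendor_has_baa=True)
print(audio.validate(irb_max_retention_days=30))
```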
The strongest candidates don’t say “We’ll get a BAA.” They say, “We’ll use pseudonymized IDs, store audio in AWS with KMS encryption, and delete raw files after transcript extraction.”
Not “I want to understand the user” — but “I’ve mapped the research data lifecycle to our SOC 2 controls.”
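To see what that answer implies in practice, here is a hedged sketch of the raw-audio lifecycle described above, using boto3 against S3 with SSE-KMS. The bucket name, key alias, and transcription step are placeholders; the point is the shape: pseudonymized keys in, encrypted at rest, raw file deleted once the transcript exists.

```python
import uuid
import boto3

s3 = boto3.client("s3")
BUCKET = "research-audio-example"          # placeholder bucket name
KMS_KEY_ID = "alias/research-audio-key"    # placeholder KMS key alias

def store_session_audio(audio_bytes: bytes) -> str:
    """Upload raw audio under a pseudonymized ID, encrypted with KMS."""
    pseudonym = uuid.uuid4().hex           # no participant name in the key
    key = f"raw/{pseudonym}.wav"
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=audio_bytes,
        ServerSideEncryption="aws:kms",
        SSEKMSKeyId=KMS_KEY_ID,
    )
    return key

def finalize_session(raw_key: str, transcript: str) -> None:
    """Persist the de-identified transcript, then delete the raw audio."""
    transcript_key = raw_key.replace("raw/", "transcripts/").replace(".wav", ".txt")
    s3.put_object(Bucket=BUCKET, Key=transcript_key,
                  Body=transcript.encode("utf-8"),
                  ServerSideEncryption="aws:kms", SSEKMSKeyId=KMS_KEY_ID)
    s3.delete_object(Bucket=BUCKET, Key=raw_key)   # raw file does not linger
```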
What research methods actually work under HIPAA and FDA constraints?
Most PMs default to interviews, surveys, and usability tests — but in healthcare, those methods fail if not modified. The method isn’t the problem; the data handling is.
Take surveys: A candidate at a health insurance PM interview proposed sending a 10-question survey to members about chronic care experiences. He planned to use Typeform. The hiring manager stopped him: “Typeform isn’t BAA-capable. Even if you don’t ask for names, a combination of DOB, ZIP, and condition could re-identify someone. You’re collecting PHI by accident.”
The compliant alternative? Use a BAA-capable platform like Qualtrics, with the BAA actually in place, and design questions to avoid the 18 identifiers. Don't ask for "diagnosis date"; ask "How many months since you started treatment?" Not "provider name," but "type of provider (e.g., primary care, specialist)."
For interviews, audio recording is the tripwire. At a recent debrief for a mental health app role, a PM candidate wanted to record patient calls to analyze emotional tone. But the company’s IRB only permitted audio with explicit written consent and required deletion within 30 days. The candidate hadn’t planned for either. He was seen as naive — not malicious, but unprepared.
Better approach: Use live note-takers with standardized empathy grids. One PM at a diabetes startup used a two-person team: one led the conversation, the other captured emotional cues on a structured rubric (e.g., frustration spikes at insulin logging, relief when auto-fill reminders fired). No audio, no storage — just timestamped behavioral annotations.
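As a thought experiment, the structured rubric described above might look something like this in code. The cue codes and fields are invented for illustration; the property that matters is that nothing identifying or audible is ever captured, only timestamped behavioral codes.

```python
from dataclasses import dataclass
from enum import Enum

class Cue(Enum):
    FRUSTRATION_SPIKE = "frustration_spike"   # e.g., at insulin logging
    RELIEF = "relief"                         # e.g., when a reminder fires
    CONFUSION = "confusion"
    WORKAROUND = "workaround"

@dataclass(frozen=True)
class Annotation:
    """One observation from the second note-taker. No audio, no names."""
    session_offset_s: int    # seconds from session start, not wall-clock time
    cue: Cue
    task: str                # what the participant was doing, described generically

# Sample entries from a (hypothetical) diabetes-app session:
grid = [
    Annotation(412, Cue.FRUSTRATION_SPIKE, "manual insulin dose entry"),
    Annotation(655, Cue.RELIEF, "auto-fill reminder accepted"),
]
```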
Usability testing is safest when it’s simulated. Real EHR access for research requires authorization, audit logging, and often shadow accounts. At a hospital system PM interview, a candidate proposed having clinicians test a new order-entry flow in a live environment. The panel shut it down: “You can’t test in production without a change control board review.” The winning candidate used a mock EHR built in Figma with simulated patient data — all synthetic, none real.
The pattern: Not raw access, but structured proxies.
Not real data, but high-fidelity simulations.
Not reactive compliance, but design-led constraint.
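A quick sketch of the "all synthetic, none real" idea: generating fake patient records to seed a mock EHR prototype. The fields and value lists are invented; the point is that every record is fabricated at runtime, so there is nothing to de-identify in the first place.

```python
import random
import uuid

FIRST_NAMES = ["Alex", "Jordan", "Sam", "Riley", "Casey"]   # obviously fake
CONDITIONS = ["type 2 diabetes", "hypertension", "asthma"]

def synthetic_patient() -> dict:
    """Build one fake patient record for prototype/usability testing only."""
    return {
        "patient_id": f"TEST-{uuid.uuid4().hex[:8]}",   # clearly marked synthetic
        "name": f"{random.choice(FIRST_NAMES)} Testpatient",
        "age": random.randint(18, 90),
        "condition": random.choice(CONDITIONS),
        "last_visit_months_ago": random.randint(0, 24),
    }

# Seed a mock order-entry flow with 25 synthetic charts.
mock_charts = [synthetic_patient() for _ in range(25)]
```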
How do you demonstrate empathy without overstepping privacy?
Empathy in healthcare isn’t about how much you feel — it’s about how much you respect boundaries. The PM who asks the most personal questions isn’t the most empathetic. The one who designs a consent flow that lets patients opt into exactly what they’re comfortable with — that’s empathy in architecture.
In a hiring committee at a remote monitoring company, two candidates interviewed providers about burnout. One asked, “How many patients have you lost to suicide this year?” The other asked, “What parts of documentation feel heaviest during high-stress periods?” The first got detailed stories. The second got systemic insights — and passed the interview.
The problem wasn’t curiosity. It was power asymmetry.
Patients and clinicians know you’re building something. They assume you’ll protect them. When you ask for trauma narratives without a clear data use policy, you breach that trust — even if you mean well.
Empathy signals in compliant research:
- Let participants control recording: “Would you like this session recorded? If so, would you prefer audio-only or notes?”
- Offer tiered participation: “You can join as a speaker, a silent observer, or review a summary afterward.”
- Use third-party facilitators: At a pediatric app trial, the PM hired a clinical social worker to conduct interviews. Not to avoid work — to ensure therapeutic boundaries weren’t crossed.
One PM at a behavioral health startup built a “consent dashboard” for research participants. It showed, in real time, what data had been collected, how it was stored, and when it would be deleted. Patients could revoke access with one click. The engineering lead called it overkill. The compliance officer called it the best empathy tool they’d seen.
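No public details of that dashboard exist, but a minimal sketch of the underlying record might look like this. The "one click" in the anecdote reduces, at minimum, to a state change that every downstream job must honor before touching any stored artifact.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ConsentRecord:
    """What one participant sees on the dashboard, and can act on."""
    pseudonym: str                      # never the participant's real name
    data_collected: list[str] = field(default_factory=list)  # e.g., ["notes"]
    storage_location: str = "encrypted S3 (BAA in place)"
    delete_by: date = field(default_factory=lambda: date.today() + timedelta(days=30))
    revoked: bool = False

    def revoke(self) -> None:
        """The 'one click': flips state and moves the deletion deadline to now."""
        self.revoked = True
        self.delete_by = date.today()

def may_process(record: ConsentRecord) -> bool:
    """Every analysis job checks this gate before reading stored data."""
    return not record.revoked and date.today() <= record.delete_by
```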
Empathy isn’t verbal. It’s operational.
Not “I hear you” — but “I’ve built a system that won’t hurt you.”
How do hiring managers evaluate healthcare PM research skills?
Hiring managers don’t assess research ability by the depth of insights — they assess it by the maturity of constraints. In 30+ PM debriefs I’ve sat in on, the top signal of readiness isn’t clinical knowledge. It’s whether the candidate preemptively addresses data governance.
At a Google Health PM interview last year, a candidate described running 15 patient interviews for a diabetes coaching feature. Strong results. But when asked, “How were recordings stored?” they said, “On my laptop, encrypted.” The room went quiet. No follow-up. They didn’t advance.
Why? Because "encrypted laptop" isn't a system; it's a personal habit. Systems are shared, auditable, repeatable. A better answer: "We used a shared AWS S3 bucket with versioning and MFA Delete enabled, plus a lifecycle rule that auto-deletes raw files after 45 days."
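For reference, the lifecycle piece of that answer is a few lines of configuration. A hedged boto3 sketch follows; the bucket name and prefix are placeholders, and enabling MFA Delete is omitted because it requires root credentials and an MFA device.

```python
import boto3

s3 = boto3.client("s3")

# Auto-delete raw research files 45 days after creation.
s3.put_bucket_lifecycle_configuration(
    Bucket="research-audio-example",          # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-raw-research-files",
            "Status": "Enabled",
            "Filter": {"Prefix": "raw/"},     # only raw, pre-transcript files
            "Expiration": {"Days": 45},
        }]
    },
)
```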
Hiring managers look for three judgment signals:
1. Preemptive compliance: Did you mention BAAs, de-identification, or IRB before being asked?
2. Data minimalism: Did you collect only what you needed? One PM used synthetic nurse avatars instead of real shift logs; that impressed the panel.
3. Audit readiness: Could your research file stand up to a HIPAA audit? If not, it's a liability.
In a recent debrief for a clinical decision support role, a hiring manager said, "I don't care if they used the Jobs-to-be-Done framework. I care that they knew their user testing required a deviation log because it involved off-label use simulation."
You’re not being hired to be clever. You’re being hired to not get the company sued.
Interview Process / Timeline: What happens in a healthcare PM research evaluation?
At most health tech companies, the PM interview process has 5 stages — and research judgment is tested in 3 of them.
Take-home assignment (3–5 days)
You're given a health product problem, e.g., "Reduce missed specialist referrals in primary care." 80% of candidates submit a discovery plan with patient interviews and clinician shadowing. The top 20% include a data classification matrix, a list of BAA-covered tools, and a mock consent form. I've seen offers withdrawn because the candidate planned to use non-BAA cloud storage.
Behavioral interview (45 mins)
You'll be asked, "Tell me about a time you conducted user research." The trap? Focusing on insights. The winning answer structures the story around constraints: "We wanted to record sessions, but PHI rules meant we used live scribes. Here's how we ensured accuracy…" One candidate at a medtech firm advanced because they mentioned using DICOM anonymization tools during a radiology workflow study (a sketch of that kind of scrubbing follows this section).
Whiteboard session (60 mins)
You're asked to design a research plan for a new feature. The hiring manager will interrupt with compliance curveballs: "What if the hospital won't allow audio recording?" "What if the data is considered research-grade and needs IRB approval?" The candidates who pivot fastest, to simulations, de-identified proxies, or third-party facilitators, win.
Cross-functional review (30 mins)
You present to a privacy officer or clinical lead. This isn't about product vision; it's about risk. At a recent session, a PM candidate lost points for saying, "We'll get consent." The privacy officer replied, "Consent isn't a free pass. What's your data minimization strategy?" The candidate hadn't prepared.
Hiring committee
The final debate isn’t about feature ideas. It’s about whether your research approach introduces enterprise risk. In one case, a candidate with strong clinical insights was rejected because their plan stored patient quotes in Notion — a tool without BAA at that company.
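On the DICOM point above: the candidate didn't share their exact tooling, but a minimal anonymization pass with the pydicom library looks roughly like this. Which tags must be scrubbed depends on your privacy officer's de-identification profile; the three below are only the most obvious ones.

```python
import pydicom

def anonymize(in_path: str, out_path: str) -> None:
    """Blank direct identifiers in a DICOM file and strip private tags."""
    ds = pydicom.dcmread(in_path)
    ds.PatientName = "ANONYMIZED"
    ds.PatientID = "ANON-0001"
    ds.PatientBirthDate = ""          # dates are among HIPAA's 18 identifiers
    ds.remove_private_tags()          # vendor-specific tags can leak identity
    ds.save_as(out_path)
```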
Preparation Checklist: What healthcare PMs must do before interviewing
- Map research methods to data risk levels: High-risk (audio, video, identifiable notes) requires a BAA, encryption, and retention rules. Low-risk (synthetic data, group summaries) doesn't. Know the difference.
- Memorize the 18 HIPAA identifiers: Not just name and SSN; the list includes facial photos, device IDs, and full ZIP codes. If your research touches any of them, it's PHI. (A reference list in code follows this checklist.)
- Learn the IRB threshold: Is your study "quality improvement" or "research"? The first doesn't need IRB review; the second does. One PM got rejected for calling a patient feedback loop "QI" when it involved hypothesis testing, a red flag.
- Study BAA-capable tools: Know which platforms support BAAs (e.g., AWS, Zoom for Healthcare, Qualtrics) and which don't (Notion, Google Forms, standard Slack).
- Prepare a compliance teardown: Pick a past project and rework it for HIPAA. How would you change it? One candidate brought this as a one-pager; it became the model for the team's onboarding kit.
- Work through a structured preparation system (the PM Interview Playbook covers healthcare research design with real debrief examples from Epic, Oscar, and Verily, including how to structure a compliant discovery sprint).
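For the memorization item above, here is the Safe Harbor list as a Python constant, paired with a naive field-name screen. The screen is a toy (real re-identification risk doesn't live in field names), but it is a useful first gate when reviewing a survey or notes template.

```python
# HIPAA Safe Harbor: the 18 identifier categories (45 CFR 164.514(b)(2)).
HIPAA_IDENTIFIERS = (
    "names",
    "geographic subdivisions smaller than a state (incl. full ZIP codes)",
    "dates (except year) related to an individual; ages over 89",
    "telephone numbers",
    "fax numbers",
    "email addresses",
    "social security numbers",
    "medical record numbers",
    "health plan beneficiary numbers",
    "account numbers",
    "certificate/license numbers",
    "vehicle identifiers and serial numbers (incl. license plates)",
    "device identifiers and serial numbers",
    "web URLs",
    "IP addresses",
    "biometric identifiers (incl. finger and voice prints)",
    "full-face photographs and comparable images",
    "any other unique identifying number, characteristic, or code",
)

# Toy screen: flag survey fields whose names suggest an identifier.
RISKY_FIELD_HINTS = ("name", "dob", "birth", "zip", "email", "phone", "mrn", "ip")

def flag_fields(field_names: list[str]) -> list[str]:
    """Return fields that need a second look from your privacy officer."""
    return [f for f in field_names
            if any(hint in f.lower() for hint in RISKY_FIELD_HINTS)]

print(flag_fields(["months_since_treatment", "provider_type", "zip_code"]))
# -> ['zip_code']
```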
Mistakes to Avoid
Mistake 1: Assuming consent overrides all rules
BAD: “We got patient consent, so we can record and store freely.”
GOOD: “Consent allows use, but storage and access are governed by HIPAA and internal policies. We encrypted files, limited access to 3 team members, and auto-deleted after 30 days.”
Scene: In a debrief at a health AI startup, a candidate said, “Patients signed forms.” The compliance lead responded, “Forms don’t replace technical safeguards.” Offer withdrawn.
Mistake 2: Using consumer-grade tools for sensitive research
BAD: Storing interview notes in Google Docs or running surveys on SurveyMonkey (without enterprise BAA).
GOOD: Using Notion only if the company has a signed BAA with Notion Labs, or defaulting to HIPAA-compliant alternatives.
Scene: A PM at a mental health app used Airtable for session tracking. Didn’t realize it lacked BAA support. The security team flagged it in onboarding. Delayed start by 4 weeks.
Mistake 3: Prioritizing realism over compliance
BAD: Insisting on real EHR access or live patient observation without audit controls.
GOOD: Using Figma prototypes with synthetic data, or partnering with clinical educators to run simulations.
Scene: A candidate wanted to observe ER triage. Didn’t plan for audit logging of access. The hiring manager said, “You’re not just a PM — you’re a data steward.” Rejected.
FAQ
Is clinical experience required to run healthcare user research?
No. Clinical knowledge helps, but judgment on data governance matters more. I’ve seen non-clinical PMs design compliant studies that clinicians bungled by assuming “good intent” justified rule-breaking. The core skill is systems thinking — not medical training.
Can I use consumer research methods in healthcare?
Not directly. You can adapt them — but only after re-architecting for compliance. A standard usability test becomes compliant when it uses synthetic patients, encrypted platforms, and no PHI capture. The method survives; the implementation changes. If you’re not ready to redesign the pipeline, you’re not ready for the role.
How much compliance detail should I include in interviews?
Enough to show you’ve operationalized it. Don’t recite HIPAA law. Do say, “We classified data as PHI, used BAA-covered tools, and limited retention to 45 days.” One sentence per control. More feels robotic. Less feels negligent. The sweet spot is showing you’ve built research like a product — with requirements, constraints, and failure modes.
Related Reading
- Healthcare PM vs. Fintech PM: Which Industry Is Better in 2026?
- Healthcare PM Product Sense: Solving Real Problems at Epic and 23andMe
- Best PM Clubs and Organizations at Wharton for Career Prep
- PM Collaboration Tools 2026
The PM Interview Playbook is also available on Amazon Kindle.
Need the companion prep toolkit? The PM Interview Prep System includes frameworks, mock interview trackers, and a 30-day preparation plan.
About the Author
Johnny Mai is a Product Leader at a Fortune 500 tech company with experience shipping AI and robotics products. He has conducted 200+ PM interviews and helped hundreds of candidates land offers at top tech companies.