AI Detects Hypertension from Voice in a Two-Week Study

MAX Global: AI isn’t just reading scans; it’s learning to listen. In new research from Klick Labs, AI detects hypertension from voice using short smartphone recordings, pointing to a low-friction way to screen for chronic high blood pressure between clinic visits.

MAX Global reviews the study design, the findings, and the implications, backed by sources.

How AI detects hypertension from voice in a two-week smartphone study

Klick Labs enrolled 245 adults who recorded brief, scripted phrases on a mobile app up to six times per day for two weeks. Each clip was converted into hundreds of acoustic features—sometimes called vocal biomarkers—capturing pitch variability, energy distribution, and spectral contrasts that humans do not perceive consistently. Trained machine-learning models then predicted whether a participant had chronic high blood pressure.
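As a rough illustration of what "acoustic features" means here, the sketch below computes three simple descriptors (RMS energy, zero-crossing rate, spectral centroid) from a synthetic one-second tone. The feature choices, function names, and test signal are illustrative assumptions, not Klick's actual pipeline, which extracted hundreds of features from real speech.

```python
import numpy as np

def acoustic_features(signal, sr=16000):
    """Toy vocal-biomarker extraction (illustrative only, not Klick's method)."""
    # RMS energy: overall loudness of the clip
    rms = float(np.sqrt(np.mean(signal ** 2)))
    # Zero-crossing rate: crude proxy for pitch and noisiness
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    # Spectral centroid: where the energy sits in the frequency spectrum
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / sr)
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))
    return {"rms": rms, "zcr": zcr, "centroid": centroid}

# Synthetic 220 Hz tone standing in for a one-second voice clip
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
clip = np.sin(2 * np.pi * 220 * t)
feats = acoustic_features(clip, sr)
print(feats)
```

In the study, vectors like this (far richer, and drawn from real recordings) were the inputs to the machine-learning classifiers.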

In this cohort, the models detected hypertension from voice with reported accuracy up to 84% in women and 77% in men, according to Klick’s summary of the peer-reviewed paper in IEEE Access and independent trade-press coverage.

What those accuracy numbers actually reflect

The team evaluated two diagnostic thresholds. At a systolic/diastolic cutoff of ≥135/85 mmHg, balanced accuracy reached 84% for women and 77% for men. At the stricter ≥140/90 mmHg threshold, accuracy shifted to 63% for women and 86% for men. These numbers describe how well the classifier separated likely hypertensive from non-hypertensive participants in that sample; they do not imply a confirmed diagnosis. Read pragmatically, the voice signal is strong enough for a first-pass screen that raises suspicion and nudges people toward validated measurement with a cuff or ambulatory monitor.
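Balanced accuracy, the metric reported above, is simply the average of sensitivity (true-positive rate) and specificity (true-negative rate), which keeps an imbalanced sample from inflating the score. The confusion counts below are hypothetical, chosen only to show the arithmetic; they are not the study's data.

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Balanced accuracy = mean of sensitivity and specificity."""
    sensitivity = tp / (tp + fn)   # true positives / all actual positives
    specificity = tn / (tn + fp)   # true negatives / all actual negatives
    return (sensitivity + specificity) / 2

# Hypothetical confusion counts for illustration only
result = balanced_accuracy(tp=42, fn=8, tn=63, fp=12)
print(result)  # 0.84
```

Note that a model can score well on this metric and still be wrong for many individuals, which is why the article treats it as a screening signal rather than a diagnosis.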

Why voice can encode blood-pressure risk

Speech is shaped by respiration, vascular mechanics, and autonomic tone. When the cardiovascular system is under strain, the neuromuscular control of the larynx and the breath that powers speech can shift in subtle, measurable ways. By aggregating many short clips over days, the approach screens for hypertension without needles, cuffs, or wearables—just the phone a person already carries. That practicality explains why outlets summarized the study as AI detecting high blood pressure by listening, and why the approach could expand low-friction screening between office visits.
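One plausible way such a system could pool many short clips into a single participant-level screen is to average per-clip model probabilities and compare the mean against a threshold. The aggregation scheme, threshold, and numbers below are assumptions for illustration; the paper's actual method may differ.

```python
import statistics

def participant_risk(clip_probs, threshold=0.5):
    """Aggregate per-clip model probabilities into one participant-level
    screen (hypothetical scheme, not the study's actual aggregation)."""
    score = statistics.mean(clip_probs)
    return score, score >= threshold

# Per-clip probabilities from, say, several recordings across the two weeks
probs = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59]
score, flagged = participant_risk(probs)
print(round(score, 3), flagged)
```

Averaging across days smooths out one-off noise (a cold, a loud room), which is one reason repeated short recordings can be more informative than a single clip.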

Real-world fit, limits, and what must be proven next

The research used fixed phrases, a specific app, and controlled instructions. Generalization is the hurdle. A tool that screens for hypertension from voice must hold up across microphones, background noise, languages, accents, and comorbidities. Prospective trials should report not only accuracy but also clinical outcomes: does earlier flagging lead to faster confirmation and better control rates? Until such evidence accumulates, any positive signal from a voice screen should be confirmed with standard devices. That framing keeps expectations disciplined and patient-safe.

A related but distinct track: voice agents in BP programs

In parallel, health systems are testing voice AI agents that call patients at home, guide them to take self-measured blood pressure readings, and log results into records. Early reports presented at the American Heart Association’s Hypertension Scientific Sessions 2025 suggest these agents improve reporting accuracy and reduce program costs compared with staff calls. This line of work does not claim that AI detects hypertension from voice acoustics; rather, it shows how voice technology can already strengthen logistics around hypertension care today. Together, the two tracks—acoustic screening and automated follow-up—outline a broader voice-enabled future for cardiovascular care.

The bottom line

For now, treat the approach as an accessibility play. The two-week study of 245 adults reports encouraging accuracy, suggesting real potential for voice as a noninvasive screening aid. If larger, prospective studies reproduce these results across devices and populations, voice screening could help close gaps—especially where cuffs and clinic time are scarce. Until then, the safe workflow is simple: let the app listen, but let a validated cuff confirm.
