Researchers at MIT Lincoln Laboratory are developing technology that can predict the presence of COVID-19 by analyzing speech for changes undetectable to the human ear.
They found that ‘biomarkers derived from vocal system coordination can indicate the presence of COVID-19.’ When we speak, the parts of our vocal system work together to create the airflow and sounds that we shape into words.
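To make the idea of "coordination" between vocal features more concrete, here is a minimal toy sketch. It is not the researchers' actual method (which models coordination across articulatory, phonatory, and respiratory signals); it simply extracts two crude frame-level features from a synthetic signal, energy and zero-crossing rate, and measures how their trajectories co-vary over time. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def frame_features(signal, frame_len=400, hop=200):
    """Per-frame energy and zero-crossing rate: crude stand-ins for
    phonation- and articulation-related features (illustrative only)."""
    energies, zcrs = [], []
    for start in range(0, len(signal) - frame_len, hop):
        frame = signal[start:start + frame_len]
        energies.append(float(np.mean(frame ** 2)))
        # Count sign changes between consecutive samples.
        zcrs.append(float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2))
    return np.array(energies), np.array(zcrs)

def coordination_score(feat_a, feat_b):
    """Pearson correlation between two feature trajectories. The real
    biomarkers are far richer, using correlations across channels and
    time delays, but the underlying idea is similar."""
    return float(np.corrcoef(feat_a, feat_b)[0, 1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 16000)  # 1 second of audio at 16 kHz
    # Amplitude-modulated tone plus a little noise, standing in for speech.
    signal = np.sin(2 * np.pi * 150 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))
    signal += 0.01 * rng.standard_normal(t.size)
    energy, zcr = frame_features(signal)
    print(coordination_score(energy, zcr))
```

The score is a single number in [-1, 1]; disease-related changes in how these trajectories track each other are, roughly speaking, what a biomarker of this kind would try to capture.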
The researchers want to add this vocal screening to the How We Feel app. The app asks users daily questions about their health status and demographics, with the aim of using these data to pinpoint hotspots and estimate the percentage of people who have the disease in different regions of the country. Asking users to also submit a daily voice memo to be screened for COVID-19 biomarkers could help scientists detect an outbreak early.
“A sensing system integrated into a mobile app could pick up on infections early, before people feel sick or, especially, for these subsets of people who don’t ever feel sick or show symptoms,” says Jeffrey Palmer, who leads the research group. “This is also something the U.S. Army is interested in as part of a holistic COVID-19 monitoring system.” Even after a diagnosis, this sensing ability could help doctors remotely monitor their patients’ progress or monitor the effects of a vaccine or drug treatment.
In cyberpunk, we write about robots that read a person's biometric data and announce that they are stressed, that their heart rate has increased, or that they are angry, much to the reader's amusement and the main character's humiliation.
Yet here is an app that could help you manage your mental and physical state just by listening to your voice.
We truly are in a sci-fi age.