
Data to Knowledge

Research advances improve data capture from wearable tech
A researcher wears a pair of prototype eyeglass sensors.


Deepak Ganesan has spent the better part of a decade trying to work around noise. Not the noise that sends you racing to Amazon to buy a pair of noise-canceling headphones. The noise Ganesan contends with is the kind that clutters data – the data collected by sensors in today’s most advanced wearable devices.

Ganesan, a computer science professor at UMass Amherst, directs mobile health initiatives at the Center for Personalized Health Monitoring (CPHM), part of the university’s Institute for Applied Life Sciences (IALS). He leads the center’s work on the computer science aspects of wearable technologies for mobile health – pervasive sensing and improving data capture and analysis from both novel and standard, off-the-shelf devices.

Along with his CPHM colleagues, Ganesan is laser-focused on finding solutions to society’s most vexing health problems.

Measuring “data in the wild”

There’s an inherent challenge in the work Ganesan is doing. It’s really hard to get meaningful insights from data collected outside a controlled setting like a hospital – “data in the wild,” as he calls it.

“Wearables aren’t designed to give high-quality data,” Ganesan said. “In a hospital, you’re wearing devices because you have limited mobility and expect to be monitored a lot. For example, if you wanted high-quality ECG data, you’d wear a tight vest without complaining about it. But if it’s an everyday wearable, you’d wear loose clothing or accessories like wristwatches and spectacles that you already use. And then what you get is noisy data.”

Ganesan’s adventures in mobile health research took an exciting turn when he and colleague Benjamin Marlin were hand-picked to represent the university on an elite team of scientists from 11 universities at the Center of Excellence for Mobile Sensor Data to Knowledge (MD2K) in Memphis, Tennessee. A five-year project funded through the National Institutes of Health’s Big Data to Knowledge initiative, MD2K addresses barriers to processing complex mobile sensor data.

Ganesan’s work with MD2K focused on one question: how can we extract actionable information from wearable devices despite their noisy data?

“MD2K developed many new technologies and collected datasets on behaviors like smoking,” he said. That data and Ganesan’s leadership in the mobile health space led the team to explore technologies like smart eyeglasses.

“They say eyes are the windows to the soul,” Ganesan said. “There’s a lot of information about people’s habits and their health that manifests in the eyes. But it’s not easy to stick something in them.”

Seeing eye to eye at MD2K

To work around that constraint, the MD2K team turned to eyeglasses, exploring new ways to glean useful information about eye movement and facial gestures. But those devices weren’t challenge-free, either.

“Eyeglasses are continuously moving around, their position constantly changing,” Ganesan said. “It’s also hard to attach a big battery to them.”

The team developed low-power designs with roughly the same power consumption as a Fitbit, but rather than tracking steps, their smart eyeglasses tracked eye movement and blink patterns. Among the work’s goals was understanding how to measure eye gaze patterns, pupil dilation, and fatigue and drowsiness, all of which are extremely valuable in applications ranging from determining the impact of sleep disturbances to measuring the effect of medication on Parkinson’s patients.
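The article doesn’t describe the detection method itself, but a common way to pull blink events out of a noisy eye-signal stream is to band-pass filter the signal and look for short, sharp peaks. The sketch below is a minimal illustration of that general approach, not the MD2K implementation; the sampling rate, filter band, thresholds, and synthetic signal are all assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 250  # assumed sampling rate (Hz) for an EOG-style eye signal


def bandpass(signal, low=0.5, high=20.0, fs=FS, order=4):
    """Keep the band where blink deflections live; drop drift and high-frequency noise."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)


def detect_blinks(signal, fs=FS):
    """Return sample indices of likely blinks in a noisy eye-movement signal."""
    filtered = bandpass(signal)
    # Threshold relative to the signal's own spread, so it adapts to the noise level.
    threshold = filtered.mean() + 2.5 * filtered.std()
    # Enforce a short refractory gap so one blink isn't counted twice.
    peaks, _ = find_peaks(filtered, height=threshold, distance=int(0.25 * fs))
    return peaks


# Toy usage: 10 seconds of noise with three injected blink-like bumps.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
signal = 0.05 * rng.standard_normal(t.size)
for center in (2.0, 5.5, 8.0):
    signal += np.exp(-((t - center) ** 2) / (2 * 0.05 ** 2))

print(detect_blinks(signal) / FS)  # approximate blink times in seconds
```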

Ganesan’s team also realized that eyeglasses can provide a lot more information than just eye movements if they have electrodes at the bridge of the nose. “We can measure things about facial expression from that location, including grimacing during pain,” Ganesan said. “Raising your eyebrows, squinting – anything connected to pain, we were able to pick up and create a metric for.” They tested their smart eyeglasses on several people by slowly inducing pain, observing facial expressions and collecting data.

While the eyeglass experiments showed significant promise for studying everything from Parkinson’s to substance abuse, work still needs to be done to bring the technology to scale.

“There are lots of wearable devices on the market,” Ganesan explained, “but getting from the point where it works on a limited population to scale is tough. People are different, and making things work across the population is difficult.”

Moving the needle with machine learning

Ganesan has looked at scaling technologies from two perspectives – sensing devices and machine learning analytics. It’s a painstaking yet thorough approach to extracting insights from wearable sensors: sensing devices feed data to machine learning algorithms, which in turn extract actionable insights that can inform decisions. The machine learning components extract features from the raw data, apply models, compute metrics, and test and validate the results.
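The article describes that pipeline only at a high level. As a rough illustration of the pattern – window the raw sensor stream, summarize each window into features, fit a model, and validate it – here is a minimal sketch using scikit-learn. The window length, the particular features, the classifier choice, and the synthetic data are all assumptions made for illustration, not the CPHM code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)


def extract_features(window):
    """Summarize one window of raw sensor samples into a small feature vector."""
    return np.array([
        window.mean(),                                            # average level
        window.std(),                                             # variability
        np.abs(np.diff(window)).mean(),                           # roughness of the signal
        np.percentile(window, 90) - np.percentile(window, 10),    # dynamic range
    ])


# Synthetic stand-in for labeled sensor windows: class 1 is slightly
# noisier and more dynamic than class 0 (e.g., "active" vs. "at rest").
windows, labels = [], []
for label in (0, 1):
    for _ in range(200):
        scale = 1.0 + 0.5 * label
        windows.append(scale * rng.standard_normal(250))
        labels.append(label)

X = np.array([extract_features(w) for w in windows])
y = np.array(labels)

# Fit a classifier and validate it with cross-validation, mirroring the
# "extract features, apply models, compute metrics, validate" loop.
model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```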

In the process, Ganesan does what research scientists do: He asks questions. Lots of them. How do you obtain accurate measures despite a noisy environment? How do you even pick up the data? What kind of information do you get from wearables? And if you don’t know where on the body the noise is coming from, how do you distinguish which information is the best?

To study smart clothing more closely, Ganesan partnered with Trisha Andrew, an expert in textile electronics who heads up UMass Amherst’s Wearable Electronics Lab. Years ago, Andrew developed a coating method to transform everyday garments into movement sensors. Today, Andrew and Ganesan, with graduate students Ali Kiaghadi and S. Zohreh Homayounfar, have introduced “phyjamas,” physiological-sensing textiles that can be woven or stitched into sleep garments.

“The challenge we faced was how to obtain useful signals without changing the aesthetics or feel of the textile,” said Andrew.

“Our insight was that even though sleepwear is worn loosely, there are several parts of such a textile that are pressed against the body due to our posture and contact with external surfaces,” said Ganesan. “This includes pressure exerted by the torso against a chair or bed, pressure when the arm rests on the side of the body while sleeping, and light pressure from a blanket over the sleepwear. Such pressured regions of the textile are potential locations where we can measure ballistic movements caused by heartbeats and breathing, and these can be used to extract physiological variables,” he added.

The difficulty, according to Ganesan, is that these signals can be individually unreliable, particularly in loose-fitting clothing, but signals from many sensors placed across different parts of the body can be intelligently combined to get a more accurate composite reading.

“Dynamics are inherent in real-world signals,” Ganesan said, “but the key is to understand how to use these dynamics to our advantage to get the best data possible under these scenarios.”
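The story doesn’t spell out how those signals are combined, but one standard way to merge several unreliable channels measuring the same quantity is to weight each channel by its estimated reliability, for example inverse-variance weighting. The sketch below illustrates that general idea with assumed numbers; it is not the phyjama algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)


def fuse_estimates(estimates, noise_vars):
    """Combine per-sensor estimates of the same quantity, weighting each
    sensor by the inverse of its estimated noise variance."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(noise_vars, dtype=float)
    return np.sum(weights * estimates) / np.sum(weights)


# Toy example: six textile pressure sensors each produce a noisy estimate
# of the same breathing rate (true value 14 breaths/min). Sensors pressed
# firmly against the body are modeled as less noisy than loose ones.
true_rate = 14.0
noise_vars = np.array([0.2, 0.3, 2.0, 4.0, 0.5, 3.0])  # assumed per-sensor noise
estimates = true_rate + rng.normal(0.0, np.sqrt(noise_vars))

print("per-sensor estimates:", np.round(estimates, 1))
print("naive average:       ", round(estimates.mean(), 2))
print("weighted fusion:     ", round(fuse_estimates(estimates, noise_vars), 2))
```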

Creating solutions that work

Whether it’s data collected by smart eyeglasses or textiles, a wristwatch or a shoe, Ganesan’s work has major implications for some of society’s biggest public health issues, from autism to elder care, insomnia to opioid addiction, diabetes to heart attacks.

Projects centered on drug addiction have explored sensitive questions about what wearable data, such as electrocardiograms, can reveal. For instance, can you detect that someone is using cocaine? Is there a way to trigger just-in-time interventions? Can you help break a smoking habit by monitoring hand gestures and understanding smoking triggers? Can you detect alertness in a truck driver, or sleep stages in an insomniac?

To that end, Ganesan co-founded Lumme Inc., which focuses on wearables and machine learning for addiction treatment. Lumme’s platform is designed to act like a therapist in your pocket, offering the right help at the right time. The digital platform uses sensors on a smartphone and a smartwatch to detect, predict and prevent behavior that affects health.

Smokers download Lumme’s app and use it with a smartwatch during their quit program. Because the platform is completely automated, users are not required to manually enter or log their habits. The app passively monitors their smoking pattern and delivers therapy even before they experience cravings. Ganesan said the goal is not only to know when these things are happening, but also to predict the behavior. “We can’t force people to break their habits,” he said, “but we can intervene to help.”

“Lumme is working on a larger phase two trial with over 100 subjects. They went through the whole smoking cessation loop, and it's quite promising compared to every other technology that's out there,” said Ganesan.

Looking 10 years out

As a new school year begins, Ganesan and his peers return to their research at IALS, where every day brings them closer to their long-term goal of taming the noise in data from everyday commercial sensors.

“We still have a long way to go,” he said, “but we’re getting closer. There are big, grand societal problems that we need to solve, but our imagination of how to solve these problems is often limited by technologies that are currently available. This can lead to incremental progress. You have to go beyond and think 10 years out about radical new technologies that can be developed to make significant inroads into the problem. That’s what leads to innovation.”

And no doubt, they’ll get there. “There’s a lot of excellent faculty at UMass Amherst,” Ganesan says. “It’s the whole ecosystem, and having an umbrella like IALS to bring it together.”

University Relations