Mobile Health Sensing and Analytics Laboratory (mHealthLab)


Develops Algorithms and Processes for Large-Scale Wearable Sensor Networks

mHealthLab is designed to provide a state-of-the-art testbed for performing mobile health experiments at scale and for developing robust, personalized mHealth detectors. Currently, a chasm separates the complexity of the analytical methods needed to achieve clinically valid measures of complex health targets across varied user populations from the stringent computational and energy constraints imposed by wearable devices. mHealthLab is designed to bridge this gap and enable continuous personalization of detection models to individual users through a design methodology that accounts for the constraints and opportunities of wearable-smartphone-cloud platforms. Our goal is to design a personalized mobile healthcare system that obtains timely information from individuals to personalize detectors, and that continually re-learns how to split sensing and computation across diverse devices to provide accurate, real-time health and wellness information.
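As an illustration of the split described above, the following minimal sketch runs a cheap detector on the wearable tier and escalates a sensor window to a heavier phone-tier detector only when the on-device result is uncertain. All function names, thresholds, and the confidence heuristic are assumptions for illustration, not mHealthLab's actual API or models.

```python
# Hypothetical sketch of a wearable-smartphone detection split: each tier
# runs an increasingly expensive detector, and a simple confidence policy
# decides how far up the hierarchy a sensor window must travel.

from dataclasses import dataclass
from statistics import mean, stdev
from typing import List


@dataclass
class Decision:
    label: str        # detector output, e.g. "active" / "idle"
    tier: str         # which tier produced the answer
    confidence: float


def wearable_detector(window: List[float]) -> Decision:
    """Cheap on-device rule: threshold on the window's standard deviation."""
    s = stdev(window)
    # Windows far from the decision boundary (0.5) are treated as confident.
    conf = min(1.0, abs(s - 0.5) / 0.5)
    return Decision("active" if s > 0.5 else "idle", "wearable", conf)


def phone_detector(window: List[float]) -> Decision:
    """Stand-in for a heavier model: combines mean and variability."""
    score = 0.7 * stdev(window) + 0.3 * abs(mean(window))
    return Decision("active" if score > 0.4 else "idle", "phone", 0.9)


def classify(window: List[float], escalate_below: float = 0.6) -> Decision:
    """Escalate to the phone only when the wearable is unsure, so that
    confident windows never cost radio or phone battery."""
    d = wearable_detector(window)
    if d.confidence >= escalate_below:
        return d
    return phone_detector(window)
```

A personalization loop could then adjust `escalate_below` per user as labeled data accumulates, trading wearable energy against detection accuracy.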

mHealthLab is intended to be a transformative wireless health research testbed with more than a hundred users carrying mobile phones and wearables, providing efficient access to this userbase for the continuous design of mHealth detectors for targets such as eating, smoking, drinking, exercise, stress, and others. The software infrastructure provided by mHealthLab includes:

  • Subject recruitment tools that will solicit potential participants from the pool of users,
  • Data collection methods to continuously collect data from phones, either over the LTE network or via WiFi access points on campus,
  • Access to de-identified data for specific research purposes,
  • Access to a variety of plugins to obtain different types of sensor data from the mobile phone or specific health accessories,
  • Data storage and access methods on a private cloud,
  • Inference toolkit to extract specific high-level inferences from raw sensor data including activities, social interaction patterns, stress, sleep, eating, drinking, and other behaviors, and
  • Web-based visualization to digest multi-modal multi-user streams.
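To make the inference-toolkit item above concrete, the sketch below shows the common windowed-inference pattern for raw sensor streams: segment the stream into overlapping fixed-size windows, compute summary features per window, and map each feature vector to a coarse behavior label. The window sizes, features, and threshold are illustrative assumptions, not the toolkit's actual interface.

```python
# Illustrative windowed-inference pipeline over a raw sensor stream
# (e.g. accelerometer magnitude samples). Names and thresholds are
# hypothetical stand-ins for a real inference toolkit.

from statistics import mean, stdev
from typing import Iterator, List, Tuple


def windows(stream: List[float], size: int, step: int) -> Iterator[List[float]]:
    """Yield overlapping fixed-size windows over a raw sample stream."""
    for start in range(0, len(stream) - size + 1, step):
        yield stream[start:start + size]


def features(window: List[float]) -> Tuple[float, float]:
    """Per-window summary features: mean and standard deviation."""
    return mean(window), stdev(window)


def infer(stream: List[float], size: int = 4, step: int = 2) -> List[str]:
    """Label each window with a coarse activity level."""
    labels = []
    for w in windows(stream, size, step):
        _, sd = features(w)
        labels.append("moving" if sd > 0.5 else "still")
    return labels
```

Real detectors for targets like eating or stress replace the threshold rule with trained classifiers, but the segment-featurize-label structure stays the same.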

Equipment

User Testbed

  • 500-1000 user testbed involving wearable wristbands and mobile devices

Staff

    • Representative, Manning College of Information & Computer Sciences

    Prashant Shenoy

Location

  S354 Life Science Laboratories
  University of Massachusetts Amherst
  240 Thatcher Road
  Amherst, MA 01003