Sensor Technology in Health Care – Sensor Data and Algorithms (Part 2)

Older people often move unsteadily and can easily fall and injure themselves. To prevent this, sensors could record and analyse movement patterns and warn of risks. Researchers in the Digital Health innovation field at the Bern University of Applied Sciences are investigating this. Part 1 dealt with the opportunities and challenges of the technology; in Part 2, the research team provides a look at its practical application.

The researchers of Applied Research & Development Nursing at BFH are investigating how technologies affect care and what benefits they bring for patients and health professionals. The research group is a project partner in the Innosuisse project RAMOS and is examining how those affected and involved accept the technology with regard to:

  1. the sensor-based solution for patient monitoring
  2. the algorithms based on sensor data for mobility analysis.

In an interdisciplinary project setup, the researchers are further developing both this technology and the existing product.

Enormous amounts of data and the possibility of using them

Over the last few years, the development of data-producing systems has advanced rapidly. As a result, we now have countless data sources at our disposal, e.g. sensors in smartphones, wearables such as fitness trackers, surveillance cameras, environmental sensors, production machines and much more. New opportunities are emerging, for instance when sensor systems originally developed in other industries, such as the automotive sector, are applied in the health sector. One such application in medicine and care is the continuous recording of vital parameters (Rebsamen, 2021) or of movement patterns (Alanazi et al., 2022).

In parallel with the availability of large amounts of data, the speed at which they can be processed has also increased. This is where data science and machine learning come into play, catalysed by the broad availability of high-performance computing. Cloud offerings such as the Azure Cloud and those of other large providers not only simplify access to high-performance hardware, but also abstract and automate entire parts of the data science lifecycle (Haertel et al., 2022). Under the keyword MLOps, data is processed automatically, and machine learning models are trained, integrated into runtime environments and monitored (John et al., 2021). In addition, pre-trained models can be obtained from various platforms, such as Hugging Face, and used in accordance with their respective licences (Han et al., 2021; Bommasani et al., 2021). The combination of available data sets and the means to use them effectively opens up numerous fields of application, both in research and in practice. One such field is the product “QUMEA Care”, which is being further developed as part of the Innosuisse project RAMOS.
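
To illustrate how such pre-trained models can be obtained and used, the following minimal sketch loads a publicly available model from the Hugging Face Hub and runs a single inference. The chosen model and task are placeholders picked purely for illustration and have no connection to RAMOS.

```python
# Minimal sketch: obtaining and using a pre-trained model from the Hugging Face Hub.
# The chosen model and task are placeholders for illustration only.
from transformers import pipeline

# Download a publicly available pre-trained model and run inference locally.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("The new sensor setup works reliably."))
```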

QUMEA Care – intelligent patient monitoring

The industrial partner in the RAMOS project is QUMEA, a start-up from Solothurn that has developed the radar-based patient-monitoring solution “QUMEA Care” and is continuously refining it. The system consists of high-resolution radar sensors mounted on the ceiling of the patient’s room, giving them a good view of the area to be monitored. Thanks to the radar technology, the privacy of patients and nursing staff is well protected while their safety is enhanced: the radar sensor generates no images or sound recordings, only movement data. These are converted on the sensor into so-called point clouds (sets of points in three-dimensional space) and sent to a cloud application, which evaluates them in real time. The evaluation is a multi-layered pipeline based on various established machine learning algorithms and models. If the algorithms detect a specific event, such as a fall or a patient trying to get up, the nursing staff is notified via a smartphone app and a nurse can attend to the patient immediately.
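
As a rough illustration of the data flow described above (point-cloud frame, real-time evaluation, notification of the nursing staff), the following sketch uses a naive floor-proximity heuristic as a stand-in for the multi-layered machine learning pipeline. All names, thresholds and the heuristic itself are assumptions made for this example and are not the QUMEA implementation.

```python
# Simplified, hypothetical data flow: point-cloud frame -> event detection -> notification.
# Names, thresholds and the floor heuristic are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class PointCloudFrame:
    room_id: str
    timestamp: float
    points: np.ndarray  # shape (n_points, 3): x, y, z coordinates in metres

def detect_event(frame: PointCloudFrame) -> Optional[str]:
    """Toy stand-in for the multi-layered model stack: flag a possible fall
    when most movement points lie close to the floor."""
    if len(frame.points) == 0:
        return None
    near_floor = np.mean(frame.points[:, 2] < 0.3)  # fraction of points below 0.3 m
    return "possible_fall" if near_floor > 0.8 else None

def notify_nursing_staff(room_id: str, event: str) -> None:
    # In the real system, this step would push an alert to the smartphone app.
    print(f"ALERT: {event} detected in room {room_id}")

frame = PointCloudFrame("101", 0.0, np.random.rand(200, 3) * np.array([4.0, 4.0, 0.2]))
event = detect_event(frame)
if event:
    notify_nursing_staff(frame.room_id, event)
```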

Project RAMOS – Radar-based Mobility Analysis

The RAMOS project was introduced in Part 1 of this series of articles. It is important to mention here that an interdisciplinary project team is working together. While the biomedical engineers extract as much information as possible from the sensor data and develop algorithms to recognise more complex everyday movements and activities (so-called Activities of Daily Living, ADL), the nursing and software experts concentrate on technology acceptance by users, co-development of the algorithms and integration into the existing “QUMEA Care” solution.

Working with real data from the field

Within the RAMOS project, the sensors can be installed at practice partners: real institutions in acute and long-term care settings. After the sensors have been installed and commissioned by QUMEA, the movement data in the corresponding rooms are recorded. Unlike data collected under controlled laboratory conditions, this reality-based data collection enables the analysis not only of simple movement sequences and body postures such as walking, sitting or lying down, but also of complex ADL, up to and including interactions between two or more people, for example during a nursing intervention.

Since the RAMOS project aims to observe patients over a longer period of time, especially in long-term care settings, and to analyse their mobility behaviour, the patient dossier and qualitatively collected data are important sources of information alongside the sensor data. Combining these data is intended to make it possible to analyse, for example, the influence of medication, treatment or care measures on mobility behaviour. The qualitative data collected through interviews and focus groups will be used to assess the technology acceptance of QUMEA Care (see Part 1).
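
As a simple illustration of how sensor-derived mobility metrics might be combined with entries from the patient dossier, the following sketch joins a hypothetical mobility time series with medication events using pandas. All column names and values are assumptions made for this example.

```python
# Minimal sketch of combining sensor-derived mobility metrics with entries from
# the patient dossier (all column names and values are illustrative assumptions).
import pandas as pd

mobility = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-05-02 08:00", "2023-05-02 12:00", "2023-05-02 16:00"]),
    "patient_id": ["P1", "P1", "P1"],
    "activity_minutes": [35, 12, 8],  # e.g. minutes of movement derived from radar data
})

medication = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-05-02 11:30"]),
    "patient_id": ["P1"],
    "medication": ["sedative"],
})

# Attach the most recent preceding medication event to each mobility observation,
# so that activity before and after administration can be compared.
combined = pd.merge_asof(
    mobility.sort_values("timestamp"),
    medication.sort_values("timestamp"),
    on="timestamp", by="patient_id", direction="backward",
)
print(combined)
```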

Discrepancy between theory and practice

In the field of sensor data processing, numerous recent studies present remarkable results, such as very high accuracy in recognising people and their movement patterns while keeping computing time manageable (see for example Lee et al., 2022 or Zhao et al., 2019). However, these results have often been obtained under specific test conditions: the data sets are usually small, and certain framework conditions are known in advance, for example the movement patterns of a known number of people.

One of the major challenges in dealing with data from real-life situations is therefore to provide them with the necessary additional information, which is indispensable for supervised machine learning in particular. This is referred to as labelling, or “labelled” data. Ideally, the data set should contain information on which care measure was carried out at which time and how many people were actually in the recorded room at a given moment. Labelling involves considerable effort, especially with data from a real, i.e. non-controlled, environment, as each situation must be understood, assessed and subsequently documented. Ideally, this would be done by the nursing staff, as they know the actual situation on site. In practice, however, this is difficult to implement, because the nurses are responsible for the care of the patients and do not have the capacity for this additional work. Ways therefore need to be found to efficiently record when and where certain situations or events occurred (e.g. mobilisation in the room, stoma change, bladder training, assistance with feeding or medication administration), so that the raw sensor data can be enriched with this additional information and serve as optimal input for machine learning.
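
One lightweight way to capture such additional information could be a structured label record that is later matched to the raw sensor data by room and time window. The following sketch shows an assumed, minimal data structure for illustration; it is not the labelling scheme used in RAMOS.

```python
# Minimal sketch of a labelling record for enriching raw sensor data.
# Field names and event categories are assumptions made for illustration.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class EventLabel:
    room_id: str
    start: datetime
    end: datetime
    event_type: str       # e.g. "mobilisation", "medication_administration"
    persons_in_room: int  # ground truth needed e.g. for multi-person tracking

def labels_for(room_id: str, t: datetime, labels: List[EventLabel]) -> List[EventLabel]:
    """Return all labels whose time window contains the given sensor timestamp."""
    return [l for l in labels if l.room_id == room_id and l.start <= t <= l.end]

labels = [
    EventLabel("101", datetime(2023, 5, 2, 9, 0), datetime(2023, 5, 2, 9, 20),
               "mobilisation", persons_in_room=2),
]
print(labels_for("101", datetime(2023, 5, 2, 9, 5), labels))
```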

Converting algorithms into an applicable product

The development and integration of algorithms into the existing product “QUMEA Care” is basically done in two ways within the Innosuisse project RAMOS:

  • On the one hand, there are algorithms that are based on the sensor data recorded in the course of this study, as described above, especially with regard to the detection of ADL and the long-term mobility analysis of patients and residents. These can only be developed and integrated into the product once the data recording phase has been completed, the data has been labelled, processed and analysed, and suitable supervised machine learning models have been trained.
  • On the other hand, there is a generic part of the algorithms that is optimised with regard to the project goals. These algorithms are not based on supervised machine learning models and therefore do not depend on the recorded sensor data and the associated labels; the concepts applied here come from the field of unsupervised learning. For example, very early in the processing of the sensor data in the cloud application, the previously mentioned point clouds are grouped into related objects using situationally adapted clustering algorithms (Xu et al., 2015), which then serve as input for further algorithms. These address, for example, the removal of irrelevant movement points (so-called noise filtering, as in Mafukidze et al., 2022) and the tracking of multiple people in the room (so-called multi-person tracking, as in Zhao et al., 2019). A minimal sketch of this unsupervised approach follows below.
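
The following sketch illustrates the unsupervised idea from the second point: a synthetic point cloud is grouped into objects with DBSCAN, and unclustered points are discarded as noise. DBSCAN and its parameters are stand-ins chosen for illustration; the project itself relies on situationally adapted clustering algorithms.

```python
# Minimal sketch of clustering a radar point cloud and filtering noise.
# DBSCAN and its parameters are illustrative stand-ins, not the project's algorithms.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(42)
# Two dense groups of points (e.g. two people) plus scattered noise points.
person_a = rng.normal(loc=[1.0, 1.0, 1.0], scale=0.1, size=(50, 3))
person_b = rng.normal(loc=[3.0, 2.0, 1.2], scale=0.1, size=(50, 3))
noise = rng.uniform(low=0.0, high=4.0, size=(20, 3))
points = np.vstack([person_a, person_b, noise])

labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(points)

# DBSCAN marks sparse, unclustered points with the label -1 (treated as noise).
clusters = {k: points[labels == k] for k in set(labels) if k != -1}
print(f"{len(clusters)} object(s) found, {np.sum(labels == -1)} noise points removed")
```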

As is in the nature of an Innosuisse project, RAMOS also aims to integrate its results into a product, in this case “QUMEA Care”. This includes the developed algorithms as well as the knowledge gained about the acceptance of the technology by nursing staff and patients.

Before integration, it must be ensured that the algorithms can process data from considerably more sensors than those used in the study within an acceptable time. This concerns performance in terms of computing time and scalability in terms of the number of sensors providing data; efficient runtime behaviour is essential in order to offer commercial product enhancements based on the algorithms.

The findings on technology acceptance allow, among other things, changes to the product itself, e.g. to improve the user experience and thus acceptance. Insights into the implementation processes of “QUMEA Care” are equally important.


This article is part 2 of a multi-part series. In the next part, you will learn more about the further course of the RAMOS project.


Literature

  1. Han et al., 2021, Pre-trained models: past, present and future https://arxiv.org/abs/2106.07139
  2. Bommasani et al., 2021, On the Opportunities and Risks of Foundation Models https://arxiv.org/abs/2108.07258
  3. Xu et al., 2015, A Comprehensive Survey of Clustering Algorithms
    https://link.springer.com/article/10.1007/s40745-015-0040-1
  4. Haertel et al., 2022, Toward a Lifecycle for Data Science: A Literature Review of Data Science Process Models
    https://www.researchgate.net/publication/365223374_Toward_a_Lifecycle_for_Data_Science_A_Literature_Review_of_Data_Science_Process_Models_Completed_Research_Paper
  5. John et al., 2021, Towards MLOps: A Framework and Maturity Model
    https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9582569
  6. Rebsamen, 2021, Automated Heart Rate Monitoring using UWB Radar
    https://bfh.easydocmaker.ch/search/abstract/2814/
  7. Alanazi et al., 2022, Towards a Low-Cost Solution for Gait Analysis Using Millimeter Wave Sensor and Machine Learning
    https://pubmed.ncbi.nlm.nih.gov/35897975/
  8. Lee et al., 2022, Improving Human Activity Recognition for Sparse Radar Point Clouds: A Graph Neural Network Model with Pre-Trained 3D Human-Joint Coordinates
    https://www.mdpi.com/2076-3417/12/4/2168
  9. Zhao et al., 2019, mID: Tracking and Identifying People with Millimeter Wave Radar
    https://ieeexplore.ieee.org/document/8804831
  10. Mafukidze et al., 2022, Scattering Centers to Point Clouds: A Review of mmWave Radars for Non-Radar Engineers
    https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9908570

AUTHOR: Marco Buri

Marco Buri is an IT specialist at BFH School of Health Professions. He is working as a software engineer/architect on the RAMOS project and is developing algorithms for the early detection of health deterioration due to mobility changes, among other things.

AUTHOR: Selina Burch

Selina Burch is a research associate in the BFH School of Health Professions.

AUTHOR: Lena Bruhin

Lena Bruhin is a PhD student at the University of Bern and a member of BFH's RAMOS project team.

AUTHOR: Friederike J. S. Thilo

Prof. Dr Friederike Thilo is Head of the Innovation Field "Digital Health", aF&E Nursing, BFH Health. Her research focuses are: design of collaboration between humans and machines; technology acceptance; need-driven development, testing and evaluation of technologies in the context of health/disease; and data-based care (artificial intelligence).
