AI technique to sense posture through walls, help monitor Parkinson’s
New York, June 12 (IANS) Scientists at MIT have developed a novel technique that uses artificial intelligence (AI) to teach wireless devices to sense people’s postures and movements, even from the other side of a wall; it could be used to monitor diseases such as Parkinson’s and multiple sclerosis (MS).
The system, named “RF-Pose,” uses a neural network to analyse radio signals that bounce off people’s bodies, and can then create a dynamic stick figure that walks, stops, sits and moves its limbs as the person performs those actions.
The system could also help elderly people live more independently, while providing the added security of monitoring for falls, injuries and changes in activity patterns associated with conditions like Parkinson’s, the researchers said.
“We’ve seen that monitoring patients’ walking speed and ability to do basic activities on their own gives healthcare providers a window into their lives that they didn’t have before, which could be meaningful for a whole range of diseases,” said Dina Katabi, Professor at the US varsity.
The results will be presented at the forthcoming Conference on Computer Vision and Pattern Recognition (CVPR) in Salt Lake City, Utah.
For the study, the team collected examples using both their wireless device and a camera. They gathered thousands of images of people doing activities like walking, talking, sitting, opening doors and waiting for elevators.
They then used these images from the camera to extract the stick figures, which they showed to the neural network along with the corresponding radio signal. This combination of examples enabled the system to learn the association between the radio signal and the stick figures of the people in the scene.
Post-training, RF-Pose was able to estimate a person’s posture and movements using only the wireless reflections that bounce off people’s bodies.
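The training recipe described above is a form of cross-modal supervision: a camera-based system provides stick-figure labels, and a second model learns to predict those same labels from the paired radio signal alone. The sketch below illustrates the idea on synthetic data; the feature sizes, the 14-joint stick figure and the linear "student" model are illustrative assumptions, not the paper's actual architecture, which uses a deep neural network on raw radio heatmaps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 500 paired frames, a 16-number radio snapshot
# per frame, and a stick figure of 14 joints with (x, y) coordinates each.
n_frames, n_radio_features, n_keypoints = 500, 16, 14 * 2

# Synthetic paired data: each radio snapshot alongside the stick figure
# that the camera-based "teacher" extracted at the same moment.
radio = rng.normal(size=(n_frames, n_radio_features))
true_map = rng.normal(size=(n_radio_features, n_keypoints))
stick_figures = radio @ true_map + 0.01 * rng.normal(size=(n_frames, n_keypoints))

# Training: fit the "student" to reproduce the teacher's labels from radio
# alone (least squares stands in for gradient descent on a deep network).
weights, *_ = np.linalg.lstsq(radio, stick_figures, rcond=None)

# Post-training: estimate a pose from a new radio snapshot, no camera needed.
new_radio = rng.normal(size=(1, n_radio_features))
predicted_pose = new_radio @ weights  # one stick figure: 14 joints x (x, y)
print(predicted_pose.shape)
```

Once trained, the camera is discarded entirely, which is what lets the real system keep working when the person is behind a wall and no visual labels are possible.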
Besides sensing movement, the authors also showed that wireless signals could be used to correctly identify a person 83 per cent of the time from a line-up of 100 individuals.
“By using this combination of visual data and AI to see through walls, we can enable better scene understanding and smarter environments to live safer, more productive lives,” the researchers said.