
MAXSENSE - Maximizing the value of sensor data using human avatars

The goal of MAXSENSE is to create a portable, AI-based, marker-free 3D modelling system that generates personalized musculoskeletal models (or "avatars"). By combining these models with data from sensors on and around the user, we obtain high-value data that can be used to improve training, work situations and product design.


Illustration: Shutterstock

In an increasingly “technologified” world, we may worry whether increased use of a home-office laptop during a lockdown leads to predictable, specific, negative (and potentially avoidable) outcomes such as back or neck pain. Working out at the gym, we may worry whether an exercise actually makes us fitter or just makes us tighten up more. Body-worn sensors, as well as sensors in furniture, clothing and floors, can help measure functioning and well-being, improve product design, and create better work and exercise routines. However, without a good model of us, the users, the value of those data is limited: it is hard to say, with any degree of certainty, that sitting in a certain way is bad for your posture, or that a chair is poorly designed.

Our proposed solution, and ambitious goal, is to create a portable, AI-based, marker-free 3D modelling system that generates personalized musculoskeletal models (or avatars). We use recent advances in programming tools for hybrid physics-AI problems and combine them with fundamental knowledge about bone structure, tissue and skin to create avatars from observations of humans in motion. The aim of this integrated approach is to allow individualized avatars to be created from only minutes of 3D data recordings; in the past, creating human models has taken years. We then extract high-value situational data by fusing our avatars with data from sensors on and around the user. These data can be used to improve training and work in real-life situations such as working at a desk, texting, or doing squats – without cameras being present. Ideally, the proposed system will be deployable by coaches and by furniture and workspace designers working to make work environments safer.
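As a purely illustrative sketch of this idea (not the project's actual pipeline), the Python snippet below personalizes a toy planar two-segment "avatar" by fitting its segment lengths to positions observed during a short recorded motion, and then reuses the fitted model to interpret joint-angle readings from body-worn sensors without any camera present. All function names, parameters and numbers here are hypothetical assumptions chosen for the example.

```python
"""Hypothetical sketch: personalize a toy avatar from motion observations,
then reuse it with wearable-sensor data. Not the MAXSENSE implementation."""
import numpy as np
from scipy.optimize import least_squares

def forward_kinematics(lengths, angles):
    """Elbow and wrist positions of a planar 2-segment arm (toy avatar)."""
    l1, l2 = lengths
    shoulder_angle, elbow_angle = angles
    elbow = np.array([l1 * np.cos(shoulder_angle), l1 * np.sin(shoulder_angle)])
    wrist = elbow + np.array([
        l2 * np.cos(shoulder_angle + elbow_angle),
        l2 * np.sin(shoulder_angle + elbow_angle),
    ])
    return elbow, wrist

# Step 1: personalize the avatar.
# Synthetic "recording" of wrist positions over a short motion, standing in
# for minutes of marker-free 3D capture of the subject (lengths unknown).
true_lengths = np.array([0.31, 0.27])
observed_angles = np.stack(
    [np.linspace(0.2, 1.2, 50), np.linspace(0.1, 1.5, 50)], axis=1
)
observed_wrists = np.array(
    [forward_kinematics(true_lengths, a)[1] for a in observed_angles]
)

def residuals(lengths):
    """Mismatch between the avatar's predicted and observed wrist positions."""
    predicted = np.array(
        [forward_kinematics(lengths, a)[1] for a in observed_angles]
    )
    return (predicted - observed_wrists).ravel()

fit = least_squares(residuals, x0=np.array([0.35, 0.30]))
personal_lengths = fit.x  # the "personalized avatar" parameters

# Step 2: reuse the personalized avatar with sensor data.
# Given joint angles from body-worn sensors (no cameras), the model predicts
# where the wrist actually is for this individual.
sensor_angles = np.array([0.6, 0.9])
_, wrist_estimate = forward_kinematics(personal_lengths, sensor_angles)
print("Fitted segment lengths:", personal_lengths)
print("Wrist position estimated from sensor data:", wrist_estimate)
```

The real system replaces this toy geometry with full musculoskeletal models and hybrid physics-AI tooling, but the two-step structure is the same: fit a personal model once from brief 3D observations, then let that model give meaning to everyday sensor streams.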

Key facts

Project duration

2022 - 2026

Funding

The Research Council of Norway, ID-no.: 332848

Partners

The University of Oslo, NxTech, Nordic Semiconductor, SATS, Flokk, Norwegian School of Sport Sciences, Cornell University

Project type

KSP-S