Real-world Use Cases Normative Requirements Repository

ALMI (Ambient Assisted Living for Long-term Monitoring and Interaction)



With a rapidly ageing population, the world is facing a social care crisis (Appleby, 2013). Without a step change in the provision of social care, especially to the elderly, the increase in the budgets and resources allocated to social care will soon become unsustainable. Ambient assisted living (Blackman et al., 2016) (i.e., assisted living support provided in a person’s daily environment, with the aid of robotic and autonomous systems – RAS, Artificial Intelligence – AI, and other technologies) is widely envisaged as a key component of such a step change (Lee et al., 2018).

Given this vision, the development of assisted-living RAS and AI solutions has been the focus of intense research and industrial effort in recent years. Designed to help or even replace carers at home and in care homes, these solutions aim to support people with motor or cognitive impairments in a wide range of tasks, increasing their ability to pursue daily living activities independently. These advances have provided RAS solutions capable of assisting elderly and disabled users both in a monitoring/advisory role and with physical tasks. However, integrating the two types of assistance into a combined assistive-care RAS solution that can be used safely over a long period of time still poses significant challenges (SPARC, 2015).

In the ALMI project, we employ a TIAGo robot that uses both its speech interaction and its object manipulation capabilities to help a user with mild motor and cognitive impairments in the daily activity of preparing a meal. Specifically, the TIAGo robot (i) provides step-by-step voice instructions guiding the user through the meal preparation task; (ii) fetches and hands to the user some of the food ingredients, kitchen utensils, crockery, etc. required for these steps; (iii) reminds the user (if needed) where to find other items that are required for the task, and that the robot cannot reach or handle. Providing such support requires the robot to dynamically create, update and exploit a “knowledge store” of household item locations (over a long period of time); to track the user’s progress with the meal preparation task, so that instructions are delivered progressively and repeated if necessary; to handle disruptions safely, etc.
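The two long-lived pieces of state described above — a store of household item locations and a tracker of the user's progress through the task — can be sketched as follows. This is an illustrative sketch only, not ALMI's actual implementation; the class names, methods, and example steps are all hypothetical.

```python
class ItemStore:
    """Maps household items to their last known locations, updated over time."""
    def __init__(self):
        self._locations = {}

    def update(self, item, location):
        # Record where an item was last observed (e.g. by the robot's camera).
        self._locations[item] = location

    def locate(self, item):
        # Return the last known location, or None if the item is unknown.
        return self._locations.get(item)


class TaskTracker:
    """Delivers instructions progressively and repeats them if necessary."""
    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current_instruction(self):
        # Repeating an instruction is simply calling this method again.
        return self.steps[self.index] if self.index < len(self.steps) else None

    def advance(self):
        # Move on once the user completes the current step.
        if self.index < len(self.steps):
            self.index += 1


store = ItemStore()
store.update("saucepan", "cupboard under the hob")
tracker = TaskTracker(["Fill the saucepan with water", "Place it on the hob"])
```

The key design point is that both structures persist across task executions: the item store accumulates knowledge of the household over a long period, while a tracker instance lives only as long as one meal-preparation session.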

The safe handling of disruptions requires the robot to react to events such as task interruptions due to a phone call received by the user, or loss of vision due to the light being switched off accidentally by the user or as a result of a power cut. If such unexpected events interrupt the execution of the task, the robot mitigates the detrimental effects of the interruption (if remedial actions can be performed), or issues an alert when an unsafe situation cannot be handled using its own capabilities.
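The mitigate-or-alert policy above amounts to a lookup from disruption events to known remedial actions, with escalation as the fallback. A minimal sketch, assuming hypothetical event names and actions (none of these strings come from ALMI):

```python
# Known remedial actions for anticipated disruption events (illustrative).
REMEDIAL_ACTIONS = {
    "phone_call": "pause task and wait for the user to return",
    "light_off": "stop moving and ask the user to restore the light",
}

def handle_disruption(event):
    """Return ("mitigate", action) if a remedial action exists,
    otherwise ("alert", message) to escalate to a human carer."""
    action = REMEDIAL_ACTIONS.get(event)
    if action is not None:
        return ("mitigate", action)
    # No safe remedial action is known: the robot must not improvise.
    return ("alert", "unhandled disruption: " + event)
```

The deliberate choice here is that the robot never attempts an unlisted action: anything outside its validated repertoire is escalated rather than handled autonomously.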

TIAGo is a highly customisable mobile robotic platform with 15 degrees of freedom (DoF). The TIAGo robot comprises a mobile base with a footprint of 54cm, an adjustable-height torso enabling the robot to vary its overall height between 110–140cm, a pan-tilt head, and a 7-DoF manipulator arm with a reach of 87cm and a payload of 3kg. The mobile base is equipped with a differential drive capable of speeds of up to 1 m/s, and uses a LIDAR sensor for indoor navigation. The TIAGo control software and applications are deployed on an Intel i7 (7th generation) computer with 16 GB of RAM, 500 GB of disk space, running Ubuntu LTS 64-bit with the RT Preempt real-time framework.

Multiple ROS LTS controllers running in a real-time control loop are used to manage robot components including its torso, head and arm positions, with joint trajectory controllers used for groups of joints and a Head Action Server for controlling the robot’s gaze. The TIAGo navigation unit supports laser-based mapping and self-location, with obstacle avoidance and navigation-to-map-point capabilities. The upper-body motion engine controllers support path planning with self-collision avoidance, and come with a wide range of pre-programmed motions and facilities for defining customised motions. Particularly relevant for ALMI, TIAGo supports (i) speech-based interaction with the users through its integrated ACAPELA text-to-speech system and DeepSpeech speech-to-text module; (ii) object and people detection thanks to its ASUS XTION Pro Live 3D camera mounted on the robot’s head.
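To illustrate what a joint trajectory controller does for a group of joints, the sketch below linearly interpolates joint positions between timed waypoints. This is a self-contained, simplified approximation of the idea only; TIAGo's actual ROS controllers use richer interpolation and real-time execution, and the function and waypoint format here are assumptions.

```python
def sample_trajectory(waypoints, t):
    """Given waypoints as a list of (time, [joint positions]) pairs sorted by
    time, return the linearly interpolated joint positions at time t.
    Times before the first or after the last waypoint clamp to the endpoints."""
    if t <= waypoints[0][0]:
        return waypoints[0][1]
    for (t0, q0), (t1, q1) in zip(waypoints, waypoints[1:]):
        if t0 <= t <= t1:
            # Interpolation fraction within this trajectory segment.
            a = (t - t0) / (t1 - t0)
            return [p0 + a * (p1 - p0) for p0, p1 in zip(q0, q1)]
    return waypoints[-1][1]
```

A real controller additionally closes the loop over joint encoders at a fixed rate, but the waypoint-interpolation core is the same shape.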

Appleby, J. Spending on health and social care over the next 50 years. Why think long term? The King’s Fund, 2013.

Blackman, S., Matlo, C., Bobrovitskiy, C., Waldoch, A., Fang, M. L., Jackson, P., Mihailidis, A., Nygard, L., Astell, A., and Sixsmith, A. Ambient assisted living technologies for aging well: a scoping review. Journal of Intelligent Systems, 25(1):55–69, 2016.

Lee, H. R. and Riek, L. D. Reframing assistive robots to promote successful aging. ACM Transactions on Human-Robot Interaction (THRI), 7(1):1–23, 2018.

SPARC – The Partnership for Robotics in Europe. Robotics 2020 multi-annual roadmap for robotics in Europe, 2015.