IBM is working on a robot that will take care of the elderly, particularly those who live alone and have nobody else to look after them. Families in this situation would traditionally turn to live-in care providers, but IBM's robot could change that. IBM is partnering with Rice University to develop a set of sensors that will be embedded in the robot to serve as its sensing interface.
The United Nations estimates that the number of people aged 60 or over will grow by 56% by 2030. Developed countries, and increasingly developing countries too, face the problem of more and more elderly people living alone at home or in assisted-living facilities. In India, the age-old joint family system is giving way to the nuclear family, and career and job aspirations lead many adult children to move to other cities, leaving their parents behind. For such families, these robots could be a great help.
“Now is the time to invest in, care for, protect, and empower our aging population so they can live more independent lives,” said Arvind Krishna, Senior Vice President, IBM Research. “Our new research on ‘embodied cognition,’ which can combine real-time data generated by sensors with cognitive computing, will explore how to provide clinicians and caregivers with insights that could help them make better care decisions for their patients.”
Technology for the elderly needs to work independently. Contemporary gadgets like smartphones and smartwatches demand constant interaction from the user. A robot caring for an elderly person, by contrast, must respond to motion, smell, and sound on its own, and then take the appropriate action without being prompted.
The robot's built-in sensors can detect whether a stove burner has been left on, and whether the elderly person has fallen. Its cameras can also read and process facial expressions and other visible signs; from video of the person's face alone, the robot can estimate vital signs such as heart rate and breathing.
Running on the IBM Cloud with a Softbank Pepper robot interface, IBM MERA uses IBM Watson technologies and CameraVitals, a technology designed at Rice University that calculates vital signs by recording video of a person's face. These technologies allow IBM MERA to take fast, noninvasive readings of a patient's heart rate and breathing, multiple times per day. Combined with the IBM Watson Speech to Text and Text to Speech APIs, the system can also detect whether a fall has occurred and relay that information to caregivers.
This video briefly explains the eldercare robot IBM is building.
The Watson-powered speech recognition can also call for help when the robot determines help is needed. IBM Watson is a question-answering system that you can interact with in natural language.
The robot IBM is designing is called the IBM Multi-Purpose Eldercare Robot Assistant (IBM MERA – it sounds like a sappy Bollywood movie title). The company has been testing it in its “Ageing in Place” lab in Austin, Texas, where the home environment of an elderly person living alone has been recreated.
I’m pretty sure such robots could later also be used by persons with disabilities to help them live independent lives.
The IBM MERA robot isn’t yet available for real-world use; it still needs to gather a lot of information. In the meantime, medical alert systems that monitor the safety and health of people in their own homes are already available. If you see this as necessary for you or somebody you love, you may want to check out a guide to medical alert systems for more information. IBM plans to modularize the robot's various learning components so that new capabilities can simply be downloaded as they become available, much like new skills can be downloaded for Alexa and Google Home.