Oklahoma researchers look to robots to care for aging boomers
Remember “The Jetsons” and the family's favorite nonhuman housekeeper, Rosie?
Maybe George's flying car is still a thing of the future, but an in-home robot could be tracking your meds and beating you at gin rummy within a few years.
Why now? Aging baby boomers and a shrinking pool of caregivers.
The number of people age 60 or older is projected to grow by 56 percent worldwide by 2030, according to the United Nations. But the number of caregivers is not expected to keep up with the demand.
In 2010, there were more than seven potential caregivers for every person in the high-risk years of 80-plus, according to a 2013 report released by AARP. “By 2030, the ratio is projected to decline sharply to 4-to-1; and it is expected to further fall to less than 3-to-1 in 2050, when all boomers will be in the high-risk years of late life,” the report said.
Enter Weihua Sheng, a professor of electrical and computer engineering at Oklahoma State University.
Sheng, with fellow OSU researcher Guoliang Fan, believes companion robots will be essential to filling a potential care gap as baby boomers age.
Sheng and Fan are finishing up a National Science Foundation grant to develop a robot that can both alert caregivers to potential problems and interact with an elder to provide companionship.
“The goal of our project is to help the elderly live a better life. A high-quality life. An independent life,” Sheng said. “Because of the shortage of nurses and doctors and home health care providers — especially in Oklahoma's rural areas — many older adults are not able to access those kinds of caregiver resources. That's why we need technology to help them. This robot is (designed) to play the role of a nurse or a social worker who can constantly check on the older adults and their house.”
Currently, the team is developing two prototypes: a tabletop robot and a mobile robot.
The tabletop model is equipped with technology similar to home assistants such as Google Home, Siri or Alexa, but Sheng's robot uses artificial intelligence to be conversational and interactive. It has facial features and a “head” that swivels toward a voice. It can ask questions, play games and identify sounds, or the absence of sounds, that could signal a problem. It also can record vital statistics through wearable technology being developed at the lab.
The mobile robot contains the same artificial intelligence as the tabletop robot, but it can move through a one-story home thanks to Fan, an expert in computer vision and motion analysis.
To address privacy issues, the team has wired a model apartment used in their research with motion sensors. Installed in a home's substructure, the sensors would collect motion data and allow the robot to learn behaviors.
“We can combine robot technologies and house monitoring technologies,” Sheng said. “The robot is continually monitoring the situation. The motion sensors in the room can roughly know where you are all the time. These sensors are very inexpensive, so you can put sensors in each room.”
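The room-by-room tracking Sheng describes can be pictured in a few lines of code. This is a hypothetical sketch, not the OSU team's software: it assumes each motion sensor reports a timestamped event tagged with its room, and simply treats the most recent firing as the resident's likely location, falling back to "unknown" if the house has been quiet too long.

```python
# Hypothetical sketch: infer which room a resident occupies from a
# stream of (timestamp, room) motion-sensor events. The event format,
# function name and timeout are illustrative assumptions.
def current_room(events, now, timeout=300):
    """Return the room of the most recent event within `timeout`
    seconds of `now`, or None if the sensors have been quiet."""
    latest = None
    for ts, room in events:
        if ts <= now and (latest is None or ts > latest[0]):
            latest = (ts, room)
    if latest and now - latest[0] <= timeout:
        return latest[1]
    return None

events = [(0, "kitchen"), (120, "hallway"), (180, "bedroom")]
print(current_room(events, now=200))  # most recent sensor firing wins
print(current_room(events, now=900))  # quiet too long: location unknown
```

A long silence is itself a signal: it is exactly the "absence of sounds" cue the tabletop robot also listens for.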
Perhaps one of the robot's most advanced capabilities is fall detection. Sheng's team — which includes numerous graduate students and doctoral candidates — has spent years collecting and digitizing the sound of falls. Once the robot hears a fall, it can follow the sound to the room in which it was detected.
“So, in case something is wrong, the robot can ask you, ‘Are you OK?' If you say, ‘I'm not feeling well,' the robot can contact your doctor. The doctor may be at his office or at his home, but he can access the data and use the cameras on the robot to assess the problem,” he said.
If the health care provider wants to go even more high-tech, the mobile robot can interface with a virtual reality headset, which will control where the robot goes and what it sees.
“With a virtual reality goggle like the Oculus Rift, you can see a 3-D image. You can turn your head and the robot will turn its head and you're basically controlling the eyes of the robot,” Sheng said. “Even if you are in China, you can do this. It's called telepresence. In a certain sense, you can be in two different locations at the same time. You can hear what the robot is hearing. You can see what the robot is seeing. We can even use human brainwaves to control the robot. A few years ago, we did some preliminary tests using brainwaves to control the movement of the robot. We have special wearable devices to collect the brain signals — the EEG signals. You just think, ‘I'll go there,' and the robot will go there.”
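The head-tracking link Sheng describes reduces to a simple mapping: the headset reports the operator's head yaw, and the robot's pan servo follows it within its mechanical limits. The function name and range values below are assumptions used only to illustrate the idea.

```python
# Hypothetical sketch of the telepresence mapping: the operator's head
# yaw (degrees) drives the robot's pan servo, clamped to the servo's
# assumed mechanical range of -90 to +90 degrees.
def headset_yaw_to_pan(yaw_deg, pan_min=-90.0, pan_max=90.0):
    """Clamp head yaw to the robot's pan range."""
    return max(pan_min, min(pan_max, yaw_deg))

print(headset_yaw_to_pan(30.0))   # within range: passed through
print(headset_yaw_to_pan(150.0))  # beyond the servo limit: clamped
```

A real system would also smooth the signal and match the robot's slower servo speed, but the core of "you turn your head and the robot turns its head" is just this clamp-and-follow loop.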
Sheng believes robots like these will be on the market within the next 10 to 15 years with a fairly reasonable price tag.
“If we put these on the market, our target should be maybe $1,000 for the tabletop to keep it affordable. For the mobile version, maybe $2,000,” he said.
However, work being done on robots to help the elderly isn't necessarily confined to the home, said David Miller, a professor of robotics technology and the Wilkinson Chair of Intelligent Systems at the University of Oklahoma.
“Nationally, there's a lot of work (underway) on things like smart wheelchairs,” Miller said. “A smart wheelchair would be for someone with visual impairment or paralysis, or a person who might be on supplemental oxygen. The smart wheelchair would make it so they could safely drive through crowds of people or say, ‘Take me home,' or ‘Take me to this office,' and the wheelchair could look up the address and use tools that currently are available to find a path.”
But, he added, robots that can do household chores such as cooking and cleaning are still a ways off.
“As far as Rosie the robot from 'The Jetsons,' that's probably more than 10 years off,” Miller said. “The problem is not so much the smartness as it is the actual mechanics. Even a reasonably well-structured robot still requires maintenance. For someone who is bed-bound, that might be less of a help.”