AssistBot: Service and Companion Robot for the Elderly, Lonely, and Impaired
The Project
My project is a service and companion robot for the elderly, lonely, and impaired. My goal for this project was to create a fully functional robot that provides assistance and cognitive support to the elderly and the lonely. I got the idea for this project because my grandfather was a stroke and aphasia patient. He was often alone in the house while my parents were at work, so my father used to monitor him with a security camera. I researched the problem and found that 27.6% of all senior citizens in the U.S. live alone, according to the U.S. Census Bureau.

AssistBot has five main features. The first is teleoperation, in which the robot is controlled wirelessly through a gaming controller. The elderly and the lonely can drive the robot to themselves or into another room. The second feature is voice commands, which perform the same tasks as teleoperation but are controlled with the voice: saying "Move forward" makes the robot respond promptly and execute the move-forward command. The next and most important feature is the wellness check. During a wellness check, the robot first moves around autonomously using a map of the house built from a LiDAR. A LiDAR (Light Detection and Ranging) sensor targets an object or a surface with a laser and measures the time for the reflected light to return to the receiver. After navigating through the house, the robot looks for a face and tries to recognize it using facial recognition. If it recognizes the face as the individual it is looking for, the robot asks, "Are you doing all right?" If the person says no, it calls the emergency contact saved in the system. The fourth feature is an emergency button: the elderly can press the trigger button on the gaming controller, which calls the emergency contact using the Twilio phone-calling API. The fifth feature is Alexa Integration, done by integrating the Amazon Alexa Sample App for Developers. The elderly, lonely, and impaired can set appointments and medication reminders with the Alexa API. For cognitive support, AssistBot can play relaxation music, tell jokes, and have a meaningful conversation.

In software, I have used ROS 2 (Robot Operating System 2), a set of libraries for robotics projects and the same software used in the NASA Mars Rover; the Twilio API for calling the emergency contact saved in the system; the face_recognition library and OpenCV for facial recognition in the wellness check; the Google Text-to-Speech library and the speech_recognition library for voice commands; the Amazon Alexa API for Alexa Integration; Pygame for receiving input from the gaming controller for teleoperation; and the Adafruit MPU6050 library for reading the IMU (Inertial Measurement Unit) sensor values.

In hardware, I have used: a 12V LiPo battery as the power supply; 4 DC motors with encoders; 2 L298N motor drivers to control the motors; a breadboard strip to distribute power between the motors; terminal blocks to distribute power among all of the components; a Raspberry Pi 4B with 8GB RAM as the "brains" of the system; an auxiliary speaker and a USB microphone to communicate with the elderly, lonely, and impaired; an IMU sensor to detect the direction the robot is facing and its distance from the ground (the same kind of sensor used in airplanes); a LiDAR for mapping; a step-down 12V-to-5V converter for the Raspberry Pi; a depth camera for facial recognition; and a 10.1-inch screen as the display.
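As a rough illustration of the teleoperation feature, the sketch below shows how a gaming controller could drive a ROS 2 robot: Pygame reads the joystick axes and an rclpy node publishes Twist velocity commands. The topic name, axis numbers, and speed scales are assumptions chosen for illustration, not the exact values used in AssistBot.

# Minimal teleoperation sketch: Pygame joystick -> ROS 2 Twist messages.
# Topic name, axis numbers, and speed scales are illustrative assumptions.
import time
import pygame
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

def main():
    pygame.init()
    pygame.joystick.init()
    joystick = pygame.joystick.Joystick(0)   # first connected gamepad
    joystick.init()

    rclpy.init()
    node = Node('assistbot_teleop')
    publisher = node.create_publisher(Twist, 'cmd_vel', 10)

    try:
        while rclpy.ok():
            pygame.event.pump()               # refresh joystick state
            msg = Twist()
            # Assumed mapping: left stick drives forward/backward, right stick turns.
            msg.linear.x = -joystick.get_axis(1) * 0.5    # m/s
            msg.angular.z = -joystick.get_axis(3) * 1.0   # rad/s
            publisher.publish(msg)
            time.sleep(0.05)                  # roughly 20 commands per second
    finally:
        node.destroy_node()
        rclpy.shutdown()
        pygame.quit()

if __name__ == '__main__':
    main()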
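The voice-command feature can be sketched with the speech_recognition library for transcription and gTTS (Google Text-to-Speech) for the spoken reply. The command phrases and the mpg123 player call below are assumptions for illustration only.

# Voice-command sketch: listen, transcribe, react.
# Command words and the audio player (mpg123) are illustrative assumptions.
import os
import speech_recognition as sr
from gtts import gTTS

def speak(text):
    """Synthesize a reply with Google Text-to-Speech and play it."""
    gTTS(text).save('reply.mp3')
    os.system('mpg123 -q reply.mp3')   # any command-line MP3 player would work

def listen_for_command():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return ''   # nothing understood

if __name__ == '__main__':
    command = listen_for_command()
    if 'move forward' in command:
        speak('Moving forward')
        # here the robot would publish a forward velocity command
    elif 'stop' in command:
        speak('Stopping')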
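The face-recognition step of the wellness check can be sketched with the face_recognition library and OpenCV roughly as follows. The reference photo filename and camera index are assumptions; on the real robot the frames come from the depth camera, and a match leads to the spoken "Are you doing all right?" prompt.

# Wellness-check sketch: compare live camera frames against a saved reference face.
# The photo filename and camera index are illustrative assumptions.
import cv2
import face_recognition

# One-time setup: encode the face of the person the robot looks for
# (the reference photo is assumed to contain exactly one face).
reference_image = face_recognition.load_image_file('resident.jpg')
reference_encoding = face_recognition.face_encodings(reference_image)[0]

camera = cv2.VideoCapture(0)
found = False
while not found:
    ok, frame = camera.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # face_recognition expects RGB
    locations = face_recognition.face_locations(rgb)
    encodings = face_recognition.face_encodings(rgb, locations)
    for encoding in encodings:
        if face_recognition.compare_faces([reference_encoding], encoding)[0]:
            found = True   # recognized: ask "Are you doing all right?"
            break
camera.release()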
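Both the wellness check and the emergency trigger button end in a phone call through the Twilio API. A minimal sketch of such a call is below; the credentials, phone numbers, and spoken message are placeholders, not AssistBot's real values.

# Emergency-call sketch with the Twilio voice API.
# Credentials, numbers, and message text are placeholders.
from twilio.rest import Client

ACCOUNT_SID = 'your_account_sid'
AUTH_TOKEN = 'your_auth_token'
TWILIO_NUMBER = '+15550000000'       # Twilio-provided number
EMERGENCY_CONTACT = '+15551111111'   # contact saved in the system

def call_emergency_contact():
    client = Client(ACCOUNT_SID, AUTH_TOKEN)
    client.calls.create(
        to=EMERGENCY_CONTACT,
        from_=TWILIO_NUMBER,
        # TwiML read out to the contact when they pick up.
        twiml='<Response><Say>AssistBot alert: your family member may need help.</Say></Response>',
    )

if __name__ == '__main__':
    call_emergency_contact()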
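On the hardware side, reading the MPU6050 IMU with the Adafruit library takes only a few lines; this sketch simply prints the raw sensor values over I2C. Turning those raw values into a heading estimate would need extra filtering that is not shown here.

# IMU sketch: read raw MPU6050 values over I2C with the Adafruit library.
import time
import board
import adafruit_mpu6050

i2c = board.I2C()                      # Raspberry Pi SCL/SDA pins
imu = adafruit_mpu6050.MPU6050(i2c)

while True:
    ax, ay, az = imu.acceleration      # m/s^2
    gx, gy, gz = imu.gyro              # angular velocity
    print(f'accel=({ax:.2f}, {ay:.2f}, {az:.2f})  gyro=({gx:.2f}, {gy:.2f}, {gz:.2f})')
    time.sleep(0.5)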
Team Comments
I chose to make this project because...
My grandfather was a stroke and aphasia patient who was often alone in the house. My father used to monitor him using a security camera. I did some research and found out that 27.6% of all senior citizens in the U.S. live alone, according to the U.S. Census Bureau.
What I found difficult and how I worked it out
Initially, controlling all four motors with one driver was a challenge, so I switched to two motor drivers, one for the front and the other for the back. This change enabled the robot to move swiftly and without difficulty.
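To illustrate the two-driver setup described above, here is a minimal sketch that drives one channel on each L298N board forward from the Raspberry Pi with RPi.GPIO. The BCM pin numbers are assumptions for illustration, not AssistBot's actual wiring, and each real board controls two motors.

# Two-L298N sketch: drive one front motor and one rear motor forward.
# BCM pin numbers are illustrative assumptions, not AssistBot's real wiring.
import time
import RPi.GPIO as GPIO

# (enable/PWM pin, input A, input B) for one channel on each driver board
FRONT = (12, 5, 6)
REAR = (13, 20, 21)

GPIO.setmode(GPIO.BCM)
pwms = []
for enable, in_a, in_b in (FRONT, REAR):
    GPIO.setup([enable, in_a, in_b], GPIO.OUT)
    GPIO.output(in_a, GPIO.HIGH)   # IN_A high + IN_B low = forward
    GPIO.output(in_b, GPIO.LOW)
    pwm = GPIO.PWM(enable, 1000)   # 1 kHz PWM on the enable pin sets speed
    pwm.start(60)                  # 60% duty cycle
    pwms.append(pwm)

time.sleep(2)                      # run forward for two seconds

for pwm in pwms:
    pwm.stop()
GPIO.cleanup()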
Next time, I would...
If I had more time to do this project, I would add a robotic arm. This arm would be used to autonomously bring food, medication, water, and snacks. It would go hand in hand with the Alexa Integration, which reminds the elderly, lonely, or impaired to take their medications.