Omnidirectional LiDAR Using a Single Point ToF Sensor

The Project

In this project, I have devised an omnidirectional LiDAR using a TF Luna single-point LiDAR, LEGO turntables, and motors controlled by a Raspberry Pi and Build HAT. The LiDAR is fitted on a LEGO robot which can be controlled from a smartphone with the BlueDot app. I am developing this robot and LiDAR setup for simultaneous localization and mapping.

First, the hardware: I have assembled a LEGO robot to house the omnidirectional LiDAR. I love LEGOs - I have programmed LEGO Mindstorms EV3 robots since I was 8! In the last 3 years I have been working with Raspberry Pi based robots too, like the GoPiGo. If you have not already heard, there is some exciting news for Raspberry Pi and LEGO aficionados: Raspberry Pi and LEGO Education have teamed up to create the Raspberry Pi Build HAT. The Build HAT fits any 40-pin GPIO header and lets you control up to 4 LEGO motors and sensors - really any LEGO device with an LPF2 connector.

The robot base is made from LEGO Education SPIKE Prime and LEGO Mindstorms Robot Inventor parts. It has two drive motors for steering and driving. The Raspberry Pi and Build HAT are installed on the LEGO Maker Plate. The robot also has a Raspberry Pi camera module. The robot is powered by a 6 AA battery pack which fits snugly in the battery compartment.

The LiDAR

The omnidirectional LiDAR assembly has LEGO and non-LEGO components. The non-LEGO components are secured onto custom-cut acrylic panels with LEGO-compatible holes. The TF Luna single-point ToF sensor sits on a LEGO turntable and motor assembly, which forms the y-axis of rotation. Its wiring goes through IDC connectors to a protoboard and through a 6-wire slip ring. This assembly sits on another turntable and motor assembly, which forms the z-axis of rotation. The motor and TF Luna wiring goes through a 12-wire slip ring. The Raspberry Pi Build HAT controls the motors, and the TF Luna is connected through a serial-to-USB interface and communicates over UART. An IR beam breaker sensor keeps track of the z-axis rotation: when the LEGO plate breaks the beam, the turntable and TF Luna are aligned to zero-degree azimuth. Using the slip rings and the two turntables allows a full 3-D scan - 360 degrees of rotation about the z-axis and 180 degrees of rotation about the y-axis.

Software

The Python module currently generates 2-D scans, or 2-D point clouds; this can easily be extended to 3-D scans. To generate a scan we need samples of distance as a function of angle. For the angle, I initially wanted to use the get methods for the motor, but currently the Build HAT firmware only supports 2 Hz motor updates even though LEGO advertises 100 Hz sample rates. A callback method allows up to 10 Hz, but this requires synchronizing the motor data with the TF Luna outputs. So, for now, I am just using the IR beam breaker sensor and timestamps to compute angles. When the IR beam breaks, an interrupt is generated. The interrupt handler updates the current scan start time and scan number. Assuming uniform motor rotation, we can find the angles using the timestamp of the TF Luna data and the time elapsed between consecutive scans.
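To make the approach concrete, here is a minimal Python sketch of the 2-D scan loop. It is only a sketch under some assumptions: the Build HAT port 'A', GPIO 17 for the break-beam output, the /dev/ttyUSB0 device path, and the motor speed are illustrative choices rather than the project's actual wiring, and the buildhat, gpiozero, and pyserial libraries stand in for my module. The TF Luna is read in its standard 9-byte UART frame format.

import time
import serial                      # pyserial
from buildhat import Motor
from gpiozero import DigitalInputDevice

z_motor = Motor('A')               # z-axis turntable motor (port is an assumption)
beam = DigitalInputDevice(17)      # IR break-beam output (GPIO pin is an assumption)
tfluna = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)   # TF Luna via USB-serial

scan_start = time.monotonic()      # time of the last zero-azimuth crossing
scan_period = None                 # seconds per full z-axis revolution

def on_beam_break():
    # The LEGO plate broke the IR beam: the turntable is at zero-degree azimuth.
    global scan_start, scan_period
    now = time.monotonic()
    scan_period = now - scan_start
    scan_start = now

# Depending on the sensor wiring, when_deactivated may be the right hook instead.
beam.when_activated = on_beam_break

def read_tfluna_frame():
    # Read one 9-byte TF Luna frame (0x59 0x59 header); return distance in cm.
    frame = tfluna.read(9)
    if len(frame) == 9 and frame[0] == 0x59 and frame[1] == 0x59 \
            and (sum(frame[0:8]) & 0xFF) == frame[8]:
        return frame[2] + frame[3] * 256
    tfluna.reset_input_buffer()    # resync if the frame was misaligned
    return None

z_motor.start(20)                  # spin the z-axis turntable at a constant speed
points = []                        # (azimuth in degrees, distance in cm)
try:
    while True:
        dist = read_tfluna_frame()
        if dist is None or scan_period is None:
            continue               # need a valid frame and at least one full revolution
        elapsed = time.monotonic() - scan_start
        azimuth = (360.0 * elapsed / scan_period) % 360.0
        points.append((azimuth, dist))
except KeyboardInterrupt:
    z_motor.stop()

Once the Build HAT firmware provides faster motor position updates, the time-based azimuth could be replaced with, or checked against, the motor's reported position.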

Hardware
Art
Education

Team Comments

I chose to make this project because...

LiDARs are commonly used in self-driving cars and autonomous robots. I have been experimenting with what I can build with commonly available components. I am developing a robot with an omnidirectional LiDAR for Simultaneous Localization and Mapping, combining my passions for LEGOs and Raspberry Pi.

What I found difficult and how I worked it out

I used slip rings to solve the problem of routing the cables through the rotating parts without entangling them. The Build HAT firmware has latencies, so I used an IR break beam sensor and computed angles from elapsed time. I used acrylic panels with LEGO-compatible holes to secure the non-LEGO components.

Next time, I would...

The final goal of this project is to do simultaneous localization and mapping (SLAM). I need to implement 3 threads: a main thread for the BlueDot remote to drive and steer the robot, a second thread for drive motor data, and a third thread for LiDAR data. All of this will be processed in a particle filter.
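As a rough sketch of that plan (and only a sketch: the update rates and placeholder bodies below are assumptions, not the final design), the drive-motor and LiDAR threads could feed a shared queue that the particle filter consumes, while the BlueDot remote runs in the main thread:

import queue
import threading
import time

measurements = queue.Queue()       # shared feed into the particle filter

def remote_control_loop():
    # Main thread in the plan: handle BlueDot events and drive/steer the robot.
    while True:
        time.sleep(0.05)           # placeholder for BlueDot handling

def odometry_loop():
    # Second thread: sample the drive motors and queue odometry updates.
    while True:
        measurements.put(('odom', None))     # placeholder drive-motor sample
        time.sleep(0.1)

def lidar_loop():
    # Third thread: queue (azimuth, distance) points from the LiDAR scanner.
    while True:
        measurements.put(('scan', None))     # placeholder scan data
        time.sleep(0.1)

def particle_filter_loop():
    while True:
        kind, data = measurements.get()
        # Placeholder: predict particles from 'odom' updates, reweight them with 'scan' points.

for fn in (odometry_loop, lidar_loop, particle_filter_loop):
    threading.Thread(target=fn, daemon=True).start()
remote_control_loop()              # BlueDot remote runs in the main thread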

About the team

  • United States

Team members

  • Arnav