YDPIbot is an open-platform differential-drive mobile robot that uses a Raspberry Pi, an Arduino Mega, a YDLidar, a motor driver shield, an IMU, and wheel encoders to perform autonomous navigation and mapping tasks. It is affordable and easy to assemble, designed as an educational tool for students and hobbyists interested in robotics and autonomous systems.
To assemble YDPIbot, you will need the following components:
- Raspberry Pi 3 model B+
- YDLidar X4
- Arduino Mega
- Motor driver shield
- IMU (MPU6050)
- Optical encoders
- 2 DC motors
- 2 Wheels
- 1 Caster wheel
- Batteries with holder
- Power bank
Follow these steps to assemble YDPIbot:
- Connect the Arduino Mega to the motor driver shield and the optical encoders and MPU6050 according to the pinout diagram provided in the documentation.
- Connect the IMU to the Raspberry Pi over the I2C interface, following the instructions in the MPU6050 package.
- Mount the YDLidar on top of the robot using screws.
- Mount the Raspberry Pi and the motor driver shield on the chassis of the robot using screws and standoffs.
- Connect the DC motors to the motor driver shield and attach the wheels to the motor shafts.
- Attach the optical encoders to the motors using the mounting brackets provided in the kit.
- Attach the caster wheel to the bottom of the robot chassis.
- Install the batteries or power banks on the robot and connect them to the Arduino Mega and the motor driver shield.
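Once the wheels and encoders are wired up, the robot's odometry comes down to differential-drive kinematics: each encoder tick delta maps to a wheel distance, and the two wheel distances update the pose. The sketch below illustrates the idea only; the ticks-per-revolution, wheel radius, and wheel separation are placeholder values, not measurements from this robot.

```python
import math

# Placeholder geometry -- substitute your robot's measured values.
TICKS_PER_REV = 360      # encoder ticks per wheel revolution (hypothetical)
WHEEL_RADIUS = 0.035     # wheel radius in meters (hypothetical)
WHEEL_SEPARATION = 0.20  # distance between wheels in meters (hypothetical)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one odometry step from encoder tick deltas."""
    # Distance each wheel traveled since the last update.
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * dist_per_tick
    d_right = d_ticks_right * dist_per_tick
    # Differential drive: average forward motion plus rotation.
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    # Integrate at the midpoint heading for better accuracy.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Driving straight: equal tick deltas on both wheels, ten updates.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = update_pose(*pose, 36, 36)  # 36 ticks = 1/10 revolution each
print(pose)  # x equals one full wheel circumference, y and theta stay 0
```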
To use YDPIbot, you will need to install the following dependencies:
- Python 3
- Ubiquity Robotics image (Ubuntu 20.04 for Raspberry Pi)
- ROS Noetic (already included in the Ubiquity Robotics image)
- YDLidar package
- MPU6050 package
- Robot Pose EKF
- Hector Mapping
- Rosserial
We recommend installing Ubuntu 20.04 and ROS Noetic on the Raspberry Pi using the Ubiquity Robotics image. Follow these steps:
- Download the Ubiquity Robotics image for Raspberry Pi from the Ubiquity Robotics website.
- Flash the image to an SD card using a tool like Raspberry Pi Imager.
- Insert the SD card into the Raspberry Pi and power it on.
- Follow the on-screen instructions to complete the Ubuntu 20.04 installation process.
To install the YDLidar package, follow these steps:
- Open a terminal on your Raspberry Pi.
- Clone the YDLidar repository:

  ```
  git clone https://github.com/PinkWink/ydlidar.git
  ```

- Follow the instructions in the README file to build the YDLidar package.
- Run one of the demos to check that the lidar is working.
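Beyond running the demo, it helps to know what the driver publishes: a standard laser scan is an array of ranges swept from a start angle at a fixed increment (the `sensor_msgs/LaserScan` layout). Converting that to Cartesian points in the lidar frame is only a few lines; this sketch uses made-up beam values, not real scan data.

```python
import math

def scan_to_points(angle_min, angle_increment, ranges,
                   range_min=0.1, range_max=10.0):
    """Convert a LaserScan-style range array to (x, y) points in the lidar frame.

    Readings outside [range_min, range_max] (e.g. inf for no return) are dropped.
    """
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue  # skip invalid or out-of-range beams
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Four beams at 0, 90, 180, 270 degrees; the last beam has no return (inf).
pts = scan_to_points(0.0, math.pi / 2, [1.0, 2.0, 1.5, float('inf')])
print(pts)  # three valid points; the inf beam is dropped
```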
To install the MPU6050 package, follow these steps:
- Open a terminal on your Raspberry Pi.
- Clone the pigeon_imu_driver repository (the MPU6050 driver lives in its `mpu6050` subdirectory):

  ```
  git clone https://github.com/PigeonSensei/pigeon_imu_driver.git
  ```

- Install the wiringPi library.
- Run one of the demos to check that the IMU is working.
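To make sense of raw MPU6050 readings, a common lightweight approach (simpler than the Robot Pose EKF used later in the stack) is a complementary filter: trust the gyro over short intervals and let the accelerometer correct long-term drift. This is a minimal sketch with synthetic readings; the 0.98 blend factor is a typical starting value, not one tuned for this robot.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer's absolute angle.

    The gyro is accurate over short dt but drifts; the accelerometer is
    noisy but drift-free, so it slowly pulls the estimate to the true angle.
    """
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Robot held still at 10 degrees: gyro reads ~0, accel reads the true angle.
pitch = 0.0
true_pitch = math.radians(10.0)
for _ in range(500):  # 5 seconds of updates at 100 Hz
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_pitch=true_pitch, dt=0.01)
print(math.degrees(pitch))  # converges close to 10 degrees
```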
To install the YDPIbot package, follow these steps:
- Open a terminal on your Raspberry Pi or PC.
- Navigate to your `catkin_ws/src` directory.
- Clone the YDPIbot repository:

  ```
  git clone https://github.com/AbdallahAmrBeedo/ydpibot.git
  ```

- Navigate back to the root of your catkin_ws and build the package:

  ```
  catkin_make
  ```

- Open the Arduino IDE and connect the Arduino to your PC.
- Under Tools > Board, choose Arduino Mega 2560, then choose the port (usually /dev/ttyACM0 if you are running Ubuntu).
- Open the sketch at `~/catkin_ws/src/ydpibot/ydpibot_bringup/YDPIbot/YDPIbot.ino`.
- Upload the code and connect the Arduino to the Raspberry Pi.
To use the robot hardware, open a terminal on the Raspberry Pi and run:

```
roslaunch ydpibot_bringup robot.launch
```

Wait until you see the message "Sensors calibrated, Start!". Now you can run any other package on top of it.
To use the robot simulation, open a terminal on the PC and run:

```
roslaunch ydpibot_description robot.launch
```

Now you can run any other package on top of it.
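Any package that drives the robot ultimately converts a commanded linear velocity v and angular velocity w (as carried by a `geometry_msgs/Twist`) into left and right wheel speeds. The mapping is the inverse of the differential-drive kinematics; the wheel separation below is a placeholder, not a value measured from this robot.

```python
def twist_to_wheel_speeds(v, w, wheel_separation=0.20):
    """Convert body velocity (v in m/s, w in rad/s) to wheel speeds in m/s.

    wheel_separation is hypothetical; use your robot's measured value.
    """
    v_left = v - w * wheel_separation / 2.0
    v_right = v + w * wheel_separation / 2.0
    return v_left, v_right

# Straight line: both wheels match the body speed.
print(twist_to_wheel_speeds(0.2, 0.0))
# Turn in place: wheels spin at equal speed in opposite directions.
print(twist_to_wheel_speeds(0.0, 1.0))
```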
We welcome contributions from other developers! If you would like to contribute to YDPIbot, please follow these guidelines:
- Fork the YDPIbot repository to your own GitHub account.
- Create a new branch for your feature or bug fix.
- Make your changes and test them thoroughly.
- Submit a pull request to the main YDPIbot repository with a clear description of your changes.
YDPIbot was created by Abdallah Amr, Mostafa Osama, Tarek Shohdy, and Yomna Omar from Egypt-Japan University of Science and Technology. We would like to thank the following people for their contributions to this project:
- Abdallah Amr (abdallah.amr@ejust.edu.eg)
- Mostafa Osama (mostafa.eshra@ejust.edu.eg)
- Tarek Shohdy (tarek.shohdy@ejust.edu.eg)
- Yomna Omar (yomna.mokhtar@ejust.edu.eg)
If you have any questions or feedback about YDPIbot, please contact us at abdallah.amr@ejust.edu.eg. We would love to hear from you!
Planned improvements and known issues:
- Add documentation (video - frames.pdf)
- Apply localization to the Gazebo simulation
- Odometry (options: use the encoders as-is, or a Kalman filter fusing the IMU and encoders)
- Add a reset-odom service for the pose estimator
- Add a README for each package explaining how to use it
- Add a GUI for the bot
- Known issue: the robot model does not move in RViz while mapping