Fruit Harvester (2018)
#Robotics #Solidworks #Arduino #RaspberryPi #C++ #3D-Printing
1. Image Processing  2. Robotic Arm  3. Mobile Platform
In today’s world, robotics is a rapidly growing field and a frequent topic of discussion. As technology in this field advances, new solutions that utilize autonomous systems are being discovered every day. Agriculture is one of the many fields that have witnessed exponential growth in the adoption of robotic systems: farmers and other agricultural producers can now use robots in their fields to eliminate many simple, labor-intensive tasks, saving millions of dollars while also increasing the quality of their crops.

To identify and test new concepts, the Robotics Design Competition held annually by the American Society of Agricultural and Biological Engineers (ASABE) calls upon students of all levels and backgrounds to compete by simulating a fully autonomous solution to a common agricultural problem. For the 2018 competition, participants were tasked with designing a system to simulate harvesting apples (represented by ping pong balls) on an 8 ft. x 8 ft. playing field. With 24 apples on the board in total, the robots were required to differentiate and harvest eight mature apples (red ping pong balls), remove eight rotten apples (blue ping pong balls), and leave eight immature apples (green ping pong balls) untouched. Each robot had to fit within a 12 in. x 12 in. x 12 in. envelope at the beginning and end of the run, but could extend beyond this restriction during the challenge.

This project was accomplished in the spring of 2019 with the assistance of the HuskerBot Robotics team, which was led by me and included the following members: S. Tamkiya, C. Dempsey, D. Nguyen, J. Wynn, A. Atefi, S. Jaju, R. Prajapati, and C. Liew. The project was guided by Dr. Santosh Pitla, associate professor in the Department of Biological Systems Engineering at the University of Nebraska-Lincoln.
The design methodology adopted by the team can be summarized in three areas: image processing, the robotic arm, and the mobile platform.
To accomplish this task, a Raspberry Pi Camera was used to capture images of the board. Image-processing algorithms detected the positions of empty spots as well as the apples in each panel, as shown in the figures below. Specifically, color-based segmentation was used to discard everything in the image that was not a ping pong ball and to classify each ball as ripe (red), rotten (blue), or immature (green). Once a ball's color was determined, its position was stored in an array. The camera intrinsics were then used to estimate the extrinsic parameters with respect to the board, and from these the coordinates of the balls to be collected were computed relative to the robotic arm.
Custom designed mount for the RGB Camera
Camera's Point of View
Removal of background for color detection
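The color-based classification step can be sketched in plain C++. This is a minimal per-pixel illustration with assumed RGB thresholds, not the team's actual pipeline; a real implementation would sweep a routine like this (or an HSV range filter) over the whole frame and then extract connected components to locate each ball.

```cpp
#include <cstdint>

// The four outcomes of classifying one pixel.
enum class Apple { Ripe, Rotten, Immature, Background };

// Classify a single RGB pixel by its dominant color channel.
// kMin and kMargin are hypothetical thresholds chosen for illustration;
// the real values would be tuned against the Pi camera and field lighting.
Apple classifyPixel(uint8_t r, uint8_t g, uint8_t b) {
    const int kMin = 100;   // assumed minimum intensity for a "colored" pixel
    const int kMargin = 50; // assumed dominance margin over the other channels
    if (r > kMin && r > g + kMargin && r > b + kMargin) return Apple::Ripe;     // red
    if (b > kMin && b > r + kMargin && b > g + kMargin) return Apple::Rotten;   // blue
    if (g > kMin && g > r + kMargin && g > b + kMargin) return Apple::Immature; // green
    return Apple::Background; // everything else is discarded
}
```

Classifying per pixel like this is what lets the background (board, panel edges) fall away before the ball positions are collected into the array described above.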
To collect the desired apples, our team modified a Lynxmotion AL5D robotic arm driven by five servos. A custom-designed gripper was developed specifically to house the ping pong balls. The motors in this manipulator were controlled by a Lynxmotion servo controller, which received commands over a serial link from the Arduino. The collection process can be summarized as follows:
1. Data representing the apples in the closest area is received by the Arduino Mega from the vision system.
2. The data is then analyzed to determine which apple should be harvested or collected.
3. A kinematic model of the robot is generated and used to solve for the joint angles (servo positions) necessary to reach the apple that needs to be picked.
4. The servo positions are translated into the data format readable by the servo controller, and the picking process is performed.
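Step 3 above amounts to inverse kinematics. As a sketch, the shoulder and elbow of an AL5D-style arm can be solved as a 2-link planar chain via the law of cosines; the link lengths below are assumptions for illustration, and the real robot also had to solve for base rotation, wrist angle, and gripper state.

```cpp
#include <cmath>
#include <utility>

// Assumed link lengths in mm (illustrative, not measured from the robot).
const double L1 = 146.0; // upper arm
const double L2 = 185.0; // forearm

// Solve 2-link planar inverse kinematics for a target (x, y) in the arm's
// vertical plane. Returns {shoulder, elbow} angles in radians (one of the
// two possible elbow configurations).
std::pair<double, double> solveIK(double x, double y) {
    double d2 = x * x + y * y; // squared distance to target
    // Law of cosines gives the elbow angle directly.
    double c2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
    double elbow = std::acos(c2);
    // Shoulder angle: direction to target minus the offset the elbow introduces.
    double shoulder = std::atan2(y, x) -
                      std::atan2(L2 * std::sin(elbow), L1 + L2 * std::cos(elbow));
    return {shoulder, elbow};
}
```

A solver like this runs once per selected apple; the resulting angles are then mapped to servo pulse widths for the controller in step 4.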
The mobile platform consisted of two high-power, high-reduction geared brushed DC motors with encoders and two free-to-rotate steel casters, providing a sturdy foundation and smooth steering. Four carefully placed ultrasonic sensors allowed the platform to navigate with precision in both displacement and straightness. Combined with these sensors, encoder feedback made steering and turning straightforward through differential rotation of the two motors. The gear reduction provided a resolution of 10,884.47 encoder counts per revolution with a high output torque of 23 kg-cm.
The robot accomplishes the task by dividing each column of the board into separate zones. It moves to a chosen zone and uses the vision system to find the coordinates of the balls relative to the robotic arm. The coordinates, tagged with their colors, are sent to the Arduino microcontroller that drives the arm, which then completes the pick. This process repeats until all of the desired ping pong balls have been removed or stored.
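The overall zone-by-zone loop can be summarized as a toy simulation. The zone layout and ball placement below are made up for illustration; on the robot, each zone's ball list came from the vision system and the counting was implicit in the physical picks.

```cpp
#include <vector>

// Ball colors as classified by the vision system.
enum class Ball { Red, Blue, Green };

// Tally of actions taken over a full run.
struct Result { int harvested = 0; int removed = 0; int left = 0; };

// Visit each zone in turn and act on every detected ball:
// red (mature) is harvested, blue (rotten) is removed, green is left alone.
Result runHarvest(const std::vector<std::vector<Ball>>& zones) {
    Result r;
    for (const auto& zone : zones) {      // drive the platform to each zone
        for (Ball b : zone) {             // arm acts on each ball in the zone
            if (b == Ball::Red)       ++r.harvested;
            else if (b == Ball::Blue) ++r.removed;
            else                      ++r.left;
        }
    }
    return r;
}
```

In competition terms, a complete run over all zones should end with eight balls harvested, eight removed, and eight left on the board.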