Weeding Robot

From P2P Food Lab
Revision as of 20:43, 3 September 2017 by Kodda (Talk | contribs)


While working on CitizenSeeds and the Sensorboxes of the Starter Kit, we had the idea to put cameras not only in small vegetable gardens but also on farms, so people could remotely follow what is growing in the fields. During a discussion with a friend who is a market farmer, it became clear that a tool to control the weeds in his field would be much more useful to him than cameras. He showed me videos of advanced weeding machines that work "in-row" (video), meaning they also remove the weeds between plants within a row.

Example of an existing commercial in-row weeder for tractors.

That became the starting point of the weeding robot: design (1) a light-weight, practical tool for farmers to remove weeds that (2) also collects data, such as image maps, to document what's happening in the field. We are mostly interested in small agroecological/permaculture/bio-intensive farms that plant crops at dense spacings, which require more manual work and for which tractor-based machines are not appropriate.

We didn't choose a design with several arms like the "in-row" machines. Instead, we decided to take a CNC machine and put it on wheels. CNC machines can position a milling tool precisely on three axes and are used for cutting objects out of wood or metal. We replaced the milling head with a "soil mixer" that perturbs small and germinating weeds. The CNC machine that we use is the X-Carve. (Both FarmBot and ecoRobotix take a similar approach.)
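The X-Carve is driven by a Grbl controller that accepts G-code over a serial link, so moving the soil mixer amounts to streaming move commands. A minimal sketch of how such commands could be generated (the helper name, coordinates, and feed rate below are our own illustration, not the project's actual control code):

```python
def gcode_move(x_mm, y_mm, z_mm=None, feed_mm_min=800):
    """Format a Grbl linear-move (G1) command for the given target."""
    parts = ["G1", f"X{x_mm:.2f}", f"Y{y_mm:.2f}"]
    if z_mm is not None:
        parts.append(f"Z{z_mm:.2f}")
    parts.append(f"F{feed_mm_min}")
    return " ".join(parts)

# A hypothetical weeding pass: plunge the soil mixer, sweep along a row,
# then retract above ground.
pass_cmds = [
    gcode_move(120.0, 40.0, z_mm=-15.0),  # plunge into the soil
    gcode_move(250.0, 40.0, z_mm=-15.0),  # sweep along the row
    gcode_move(250.0, 40.0, z_mm=10.0),   # retract above ground
]
```

Each string would then be written to the controller's serial port one line at a time, waiting for Grbl's acknowledgement before sending the next.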

So now we have an interesting challenge: develop the computer vision and motion planning to control the "robot".

We will use this page to post the technical information. This is an "open" project and if you are interested in helping, please contact us.

The first trip of the robot outside of the office!
The X-Carve CNC machine

Brought to you from an office in the center of Paris:

Existing work on agribots

Dr. Simon Blackmore - Harper Adams Univ.

AgBot II - Australian Centre for Robotic Vision.

Oz - Naïo Technologies.

Dino - Naïo Technologies.

BoniRob - Deepfield Robotics / Bosch.

Ladybird - Univ. Sydney.

RIPPA - Univ. Sydney.

SwagBot - Univ. Sydney.

ecoRobotix weeding robot.

Wall-Ye.

Robotics for agriculture

In the fields, crops may have variable shapes and colors, and meteorological conditions add further variability, which makes for challenging robotics problems [1]. The classical tasks of robotics (perception, navigation) must therefore be adapted to unstructured objects in unstructured environments.

Although self-driving vehicles now handle the guidance part, many tasks are still under development, and the following sections describe those adapted to the weeding robot.

Computer vision

Monitoring and nurturing of crops is greatly helped by tools from computer vision.

In particular, the following tasks are considered:

- Image fusion: when gathering images from multiple locations and/or multiple sensors, algorithms like stitching are helpful in building a consistent representation of the data.
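As a toy illustration of image fusion, the translation between two overlapping images can be estimated with phase correlation before compositing them on a shared canvas. This is a simplified stand-in for full feature-based stitching; the function below is our sketch, not the project's code:

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (dy, dx) translation of image b relative to
    image a via phase correlation (normalized cross-power spectrum)."""
    Fa = np.fft.fft2(a)
    Fb = np.fft.fft2(b)
    cross = Fb * np.conj(Fa)
    cross /= np.abs(cross) + 1e-12     # keep only phase information
    corr = np.fft.ifft2(cross).real    # impulse at the true shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap large positive indices around to negative offsets.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)
```

Once the offsets of all frames are known, each image can be pasted at its estimated position to build the large-scale map of the bed.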

- Image segmentation: distinguishing the regions occupied by plants from those occupied by weeds and soil is critical to the operation of the weeding robot. Furthermore, we focus on algorithms that can segment at a reasonable frame rate, for real-time operation.
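A classic fast baseline for separating vegetation from soil is thresholding the Excess Green index (ExG = 2g − r − b on chromaticity-normalized channels). A minimal sketch, not the project's actual segmentation pipeline (the threshold value is an assumption):

```python
import numpy as np

def segment_plants(rgb, threshold=0.1):
    """Return a boolean vegetation mask from an HxWx3 RGB image using
    the Excess Green index on chromaticity-normalized channels."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-12          # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b                    # green pixels score high
    return exg > threshold
```

Such an index-based mask is cheap enough to run at camera frame rate and can serve as the input to finer crop/weed discrimination.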

- Image recognition: large neural networks have recently had success in building appropriate representations to detect crops. For example, deep architectures such as Inception can distinguish between various species. Another strategy matches templates to detect crops, and we are currently investigating this model-based image analysis.
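The template-matching strategy can be sketched with zero-normalized cross-correlation: slide a crop template over the image and keep the best-scoring position. This brute-force version is illustrative only; a real pipeline would use an optimized implementation such as OpenCV's matchTemplate:

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) of the best zero-normalized
    cross-correlation score of template within image (grayscale)."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum()) + 1e-12
    best, best_pos = -np.inf, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            w = image[i:i + th, j:j + tw]
            wc = w - w.mean()
            score = (wc * t).sum() / (np.sqrt((wc ** 2).sum()) * tn + 1e-12)
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos
```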

- 3D reconstruction: based on multiple views of a plant, a 3D model of the plant can be reconstructed.
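Multi-view reconstruction rests on triangulation: given the cameras' projection matrices and a pixel correspondence, the 3D point is recovered by linear least squares (the standard DLT method). A minimal two-view sketch (the calling convention is our own; a full pipeline would also estimate the camera poses):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation of one 3D point from two normalized pixel
    observations x1, x2 and their 3x4 camera projection matrices."""
    # Each observation contributes two linear constraints on X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null-space vector = homogeneous point
    return X[:3] / X[3]
```

Repeating this over many matched points across views yields the point cloud from which the plant model is built.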

Motion planning

The motion of the tool is planned to:

- cover the workspace with various strategies, or

- cover the ground area avoiding the cultivated plants using a modified boustrophedon or a self-organized map.
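A minimal grid version of the modified boustrophedon might look like this: serpentine rows over the bed, skipping the cells occupied by cultivated plants (the cell-based representation and parameter names are our assumptions, not the project's planner):

```python
def boustrophedon_path(width, height, plants, step=1):
    """Return a serpentine list of (x, y) grid cells covering a
    width x height bed, skipping cells occupied by plants."""
    plants = set(plants)
    path = []
    for y in range(0, height, step):
        xs = list(range(0, width, step))
        if (y // step) % 2 == 1:      # reverse every other row
            xs.reverse()
        for x in xs:
            if (x, y) not in plants:
                path.append((x, y))
    return path
```

In practice the plant cells would come from the segmentation mask, and each visited cell would be converted to a tool move over that patch of soil.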

Additional sensors

We also wish to integrate additional sensors, such as a potentiostat; see its documentation.

Samples of collected data

  • Large scale map of the culture bed.


  • Temporal dynamics of the growth of plants.


  • Timelapse of plants capturing their rapid motions.
  • Point Cloud from RGB+D camera


Set up notes

Details related to setting up the system are compiled here: installation notes

In-field experiments

Preliminary results about weeding and crop monitoring from the 2017 campaign are presented here: 2017 experiments


  • All the designs, images, videos and documentation are licensed under the Creative Commons Attribution + ShareAlike license (CC BY-SA v2.0).
  • All the software is licensed under the GNU GPL v3


  1. Bechar, Avital and Vigneault, Clément. Agricultural robots for field operations: Concepts and components. Biosystems Engineering, 2016.