While working on CitizenSeeds and the Sensorboxes of the Starter Kit, we had the idea to put cameras not only in small vegetable gardens but also on farms, so people could remotely follow what is growing in the fields. During a discussion with a friend who is a market farmer, it became clear that a tool to control the weeds in his field would be much more useful to him than cameras. He showed me videos of advanced weeding machines that work "in-row" (video), meaning they also remove weeds between the plants within a row.
That became the starting point of the weeding robot: design (1) a light-weight and practical tool for farmers to remove weeds that (2) also obtains data, such as image maps, to document what's happening in the field. We are interested mostly in small, agroecological/permaculture/bio-intensive farms that plant crops at dense spacings, require more manual work, and for which tractor-based machines are not appropriate.
We did not opt for a design with several arms, as in the "in-row" machines. Instead, we decided to use a CNC machine and put it on wheels. CNC machines can position a milling tool precisely on three axes and are used for cutting objects out of wood or metal. We replaced the milling tool with a "soil mixer" that perturbs small and germinating weeds. The CNC machine that we use is the X-Carve. (Both FarmBot and ecoRobotix take a similar approach.)
So now we have an interesting challenge: develop the computer vision and motion planning to control the "robot".
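To give an idea of what controlling the machine involves, here is a minimal sketch of how the gantry could be commanded. The X-Carve is typically driven by G-code sent to its motion controller; the function names, feed rate, and depths below are hypothetical placeholders, not the project's actual control code.

```python
def gcode_move(x_mm, y_mm, z_mm, feed_mm_min=800):
    """Format a linear G1 move in absolute coordinates (G90 assumed)."""
    return "G1 X{:.2f} Y{:.2f} Z{:.2f} F{:d}".format(x_mm, y_mm, z_mm, feed_mm_min)

def weed_at(x_mm, y_mm, work_depth_mm=-10.0, safe_z_mm=5.0):
    """Move above a weed, plunge the soil mixer into the top layer,
    then retract to a safe travel height."""
    return [
        gcode_move(x_mm, y_mm, safe_z_mm),      # travel to position
        gcode_move(x_mm, y_mm, work_depth_mm),  # plunge into the soil
        gcode_move(x_mm, y_mm, safe_z_mm),      # retract
    ]
```

In practice these strings would be streamed to the controller over a serial link; here they only illustrate the kind of commands the motion planner has to produce.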
We will use this page to post the technical information. This is an "open" project and if you are interested in helping, please contact us.
Existing work on agribots
Robotics for agriculture
In the field, crops may have variable shapes and colors, and meteorological conditions add further variability, which poses challenging problems for robotics. The classical tasks of robotics (perception, navigation) must therefore be adapted to unstructured objects in unstructured environments.
Although self-driving vehicles now largely address the guidance part, many tasks are still under development, and the following sections describe those adapted to the weeding robot.
Monitoring and nurturing of crops is greatly helped by tools from computer vision.
In particular, the following tasks are considered:
- Image fusion: gathering images from multiple locations and/or multiple sensors; algorithms like stitching are helpful in building a consistent representation of these data.
- Image segmentation: the distinction between regions occupied by plants and those occupied by weeds and ground is critical to the operation of the weeding robot. Furthermore, we focus on algorithms that can provide segmentation at a reasonable frame rate for real-time operation.
- Image recognition: Large neural networks have recently had success in building appropriate representations to detect the crops. For example, deep architectures (like Inception, 150 layers) can distinguish between various species. Another strategy matches templates to detect crops, and we are currently investigating this model-based image analysis.
- 3D reconstruction: Based on multiple views of a plant, a 3D model of the plant can be reconstructed.
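As an illustration of the segmentation task above, a classical and very fast plant/soil separator is the excess-green index ExG = 2g - r - b computed on chromatic coordinates; green vegetation scores high, bare soil near zero. This is a generic sketch, not the project's actual pipeline, and the threshold value is a placeholder.

```python
import numpy as np

def excess_green(rgb):
    """Excess-green index ExG = 2g - r - b on chromatic coordinates
    (each channel divided by the per-pixel channel sum)."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

def plant_mask(rgb, threshold=0.1):
    """Binary mask: True where the pixel looks like vegetation."""
    return excess_green(rgb) > threshold
```

Because it is just per-pixel arithmetic, this index easily runs at camera frame rate, which is why it is a common baseline before heavier learned segmentation.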
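The template-matching strategy mentioned under image recognition can be sketched as a brute-force normalized cross-correlation search; a library routine (e.g. OpenCV's matcher) would be used in practice, but the plain-NumPy version below shows the principle. All names are illustrative.

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) of the best normalized cross-correlation
    match of `template` inside `image` (both 2-D grayscale arrays)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum()) or 1.0
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

Normalizing both windows makes the score insensitive to brightness changes, which matters under variable field lighting; the quadratic search cost is what pushes real systems toward FFT-based or learned detectors.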
The motion of the tool is planned to:
- cover the workspace with various strategies.
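One simple coverage strategy is a boustrophedon (back-and-forth) sweep: parallel passes along the bed, spaced by the tool width, alternating direction. The sketch below generates such waypoints; it is only one possible strategy, and the parameter names are illustrative.

```python
def coverage_path(width_mm, length_mm, tool_mm):
    """Waypoints for a boustrophedon sweep of a rectangular bed:
    passes along the bed length, spaced by the tool width."""
    path, x, forward = [], tool_mm / 2.0, True
    while x <= width_mm - tool_mm / 2.0 + 1e-9:
        y_start, y_end = (0.0, length_mm) if forward else (length_mm, 0.0)
        path.append((x, y_start))
        path.append((x, y_end))
        forward = not forward  # alternate direction each pass
        x += tool_mm
    return path
```

Feeding these waypoints to the CNC gantry yields full-bed coverage; a vision-guided mode would instead target detected weed positions directly.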
We also wish to integrate additional sensors, such as a potentiostat; see its documentation.
Samples of collected data
- Large-scale map of the culture bed.
- Temporal dynamics of the growth of plants.
- Timelapse of plants capturing their rapid motions.
- Point Cloud from RGB+D camera
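The point cloud in the last item is obtained by back-projecting each depth pixel through the camera's pinhole model: X = (u - cx)·Z/fx, Y = (v - cy)·Z/fy. A minimal sketch, assuming a depth map in metres and known intrinsics (the function name and intrinsic values are placeholders):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map into an N x 3 point cloud using the
    pinhole model; drops pixels with no depth reading (Z == 0)."""
    v, u = np.indices(depth.shape)       # pixel row/column grids
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```

The intrinsics (fx, fy, cx, cy) come from the RGB+D camera's calibration; clouds from several viewpoints can then be registered to support the 3D reconstruction task described above.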
Set up notes
Details related to setting up the system are compiled here: installation notes
Preliminary results about weeding and crop monitoring from the 2017 campaign are presented here: 2017 experiments
- All the designs, images, videos and documentation are licensed under the Creative Commons Attribution + ShareAlike license (CC BY-SA v2.0).
- All the software is licensed under the GNU GPL v3
- Bechar, Avital and Vigneault, Clément, "Agricultural robots for field operations: Concepts and components", Biosystems Engineering, 2016