While working on [https://p2pfoodlab.net/CitizenSeeds/about.html CitizenSeeds] and the [[Sensorboxes]] of the Starter Kit, we had the idea of putting cameras not only in small vegetable gardens but also on farms, so people could remotely follow what is growing in the fields. During a discussion with a friend who is a market farmer, it became clear that a tool to control the weeds in his field would be much more useful to him than cameras. He showed me videos of advanced weeding machines that work "in-row" ([https://www.youtube.com/watch?v=qaxwJQ0_FwM video]), meaning that they also remove the weeds between plants within a row.

[[Image:Robocrop-InRow-The-revolutionary-Robot-weeder.jpg|thumb|center|320px|Example of an existing commercial in-row weeder for tractors.]]
That became the starting point of the weeding robot: design (1) a lightweight, practical tool for farmers to remove weeds that (2) also collects data, such as image maps, to document what is happening in the field. We are mostly interested in small agroecological/permaculture/bio-intensive farms that plant their crops at dense spacings, which require more manual work and for which tractor-based machines are not appropriate.
  
 
  
 
We did not choose a design with several arms, as in the "in-row" machines. Instead, we decided to use a CNC machine and put it on wheels. CNC machines can position a milling tool precisely along three axes and are used for cutting objects out of wood or metal. We replaced the milling tool with a "soil mixer" that disturbs small and germinating weeds. The CNC machine that we use is the X-Carve. (Both [http://farmbot.io FarmBot] and [http://www.ecorobotix.com/ ecoRobotix] take a similar approach.)

So now we have an interesting challenge: develop the computer vision and motion planning to control the "robot".

We will use this page to post the technical information. This is an "open" project and if you are interested in helping, please contact us.
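
The X-Carve is driven by an Arduino-based GRBL controller, so the soil mixer can be positioned by streaming plain G-code over a serial link. The following is only a minimal sketch of that idea, assuming Python with the pyserial package; the port name, coordinates and feed rates are placeholders, not our production values.

<syntaxhighlight lang="python">
import time
import serial  # pyserial

# Placeholder port; on Linux the X-Carve's GRBL board typically
# shows up as /dev/ttyACM0 or /dev/ttyUSB0.
PORT = "/dev/ttyACM0"

def send(ser, line):
    """Send one G-code line and wait for GRBL's 'ok'/'error' reply."""
    ser.write((line + "\n").encode("ascii"))
    reply = ser.readline().decode("ascii").strip()
    print(line, "->", reply)

ser = serial.Serial(PORT, 115200, timeout=2)
time.sleep(2)             # give GRBL time to reset after opening the port
ser.reset_input_buffer()  # discard the GRBL welcome banner

send(ser, "G21")           # work in millimetres
send(ser, "G90")           # absolute coordinates
send(ser, "G0 X100 Y50")   # rapid move above a weed (example position)
send(ser, "G1 Z-10 F300")  # lower the soil mixer into the ground
send(ser, "G1 Z0 F300")    # retract the tool
ser.close()
</syntaxhighlight>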
 
[[Image:DSC01013.JPG|thumb|center|640px|The X-Carve CNC machine]]
<center>Brought to you from an office in the center of Paris:</center>
<center><HTML5video width="640" height="360" autoplay="false" loop="false">weedingrobot-20160908-640x360</HTML5video></center>
== Existing work on agribots ==
{|style="border-collapse: separate; border-spacing: 0; border: 1px solid #000; padding: 10px"
|style="padding: 30px; vertical-align: top"|[[File:simon blackmore.jpg|300px]]
[http://www.harper-adams.ac.uk/initiatives/national-centre-precision-farming/ Dr. Simon Blackmore - Harper Adams Univ.]
|style="padding: 30px; vertical-align: top"|[[File:ag bit 2 - peter corke - maxresdefault.jpg|300px]]
[http://roboticvision.org/working-with-us/industry/target-industries/agriculture/ AgBot II - Australian Centre for Robotic Vision]
|style="padding: 30px; vertical-align: top"|[[File:naio-technologies.jpg|300px]]
[http://www.naio-technologies.com/machines-agricoles/robot-de-desherbage-oz/ Oz - Naïo Technologies]
|-
|style="padding: 30px; vertical-align: top"|[[File:IMG_20160525_154934-1024x768.jpg|300px]]
[http://www.naio-technologies.com/machines-agricoles/robot-enjambeur-legumes/ Dino - Naïo Technologies]
|style="padding: 30px; vertical-align: top"|[[File:farmbot-1640.jpg|300px]]
[http://farmbot.io FarmBot]
|style="padding: 30px; vertical-align: top"|[[File:img_bonirob_multimedia_module.png|300px]]
[https://www.deepfield-robotics.com/en/BoniRob.html BoniRob - Deepfield Robotics / Bosch]
|-
|style="padding: 30px; vertical-align: top"|[[File:ladybird.JPG|300px]]
[http://confluence.acfr.usyd.edu.au/display/AGPub/Our+Robots Ladybird - Univ. Sydney]
|style="padding: 30px; vertical-align: top"|[[File:uni-sydney-agri-food-systems-ladybird-robot-field-545x216.jpg|300px]]
[http://confluence.acfr.usyd.edu.au/display/AGPub/Our+Robots RIPPA - Univ. Sydney]
|style="padding: 30px; vertical-align: top"|[[File:IMG_6565_small2.jpg|300px]]
[http://confluence.acfr.usyd.edu.au/display/AGPub/Our+Robots SwagBot - Univ. Sydney]
|-
|style="padding: 30px; vertical-align: top"|[[File:robot_desherbage_eco_robotix_danielle.jpg|300px]]
[http://www.ecorobotix.com/ ecoRobotix]
|style="padding: 30px; vertical-align: top"|[[File:bot2karot.jpg|300px]]
[http://www.gouvernement.fr/en/eliott-sarrey-bot2karot-is-the-link-between-the-virtual-and-the-real-between-gardening-games-and Bot2Karot]
|style="padding: 30px; vertical-align: top"|[[File:article-0-153329CB000005DC-23_634x524.jpg|300px]]
[http://www.wall-ye.com/ Wall-ye]
|-
|style="padding: 30px; vertical-align: top"|[[File:wall-ye page3_img10.png|300px]]
[http://www.wall-ye.com/ Wall-ye]
|}
==Robotics for agriculture==
In the field, crops may have variable shapes and colors, and meteorological conditions add further variability, which makes for challenging robotics problems <ref name='review_robotics'>Avital Bechar and Clément Vigneault, [http://dx.doi.org/10.1016/j.biosystemseng.2016.06.014 Agricultural robots for field operations: Concepts and components], Biosystems Engineering, 2016.</ref>. The classical tasks of robotics (perception, navigation) must therefore be adapted to unstructured objects in unstructured environments.

Although self-driving vehicles now largely handle the guidance part, many tasks are still under development; the following sections describe those adapted to the weeding robot.
 
== Computer vision ==
 
  
Monitoring and nurturing crops is greatly helped by tools from computer vision.
  
In particular, the following tasks are considered:
  
- [[Image fusion]]: when gathering images from multiple locations and/or multiple sensors, algorithms such as stitching help build a consistent representation of the data.
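
As a concrete illustration, OpenCV ships a high-level stitcher that can fuse overlapping photographs of a culture bed into one map. A minimal sketch, assuming the opencv-python package; the file names are placeholders:

<syntaxhighlight lang="python">
import cv2

# Overlapping photographs taken along the culture bed (placeholder names).
images = [cv2.imread(f) for f in ["bed_000.jpg", "bed_001.jpg", "bed_002.jpg"]]

# SCANS mode suits flat scenes photographed from above (an affine model),
# as opposed to the default panorama mode meant for a rotating camera.
stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("bed_map.jpg", mosaic)
else:
    print("stitching failed with status", status)
</syntaxhighlight>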
  
- [[Image segmentation]]: distinguishing the regions occupied by plants from those occupied by weeds and bare ground is critical to the operation of the weeding robot. Furthermore, we focus on algorithms that can segment at a reasonable frame rate, so the robot can operate in real time.
[[File:improc.jpg]]
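
One fast, classical baseline for plant/soil segmentation (not necessarily what the robot runs) is to threshold the excess-green index ExG = 2g - r - b computed on normalized rgb. A sketch with numpy and OpenCV; the file name is a placeholder:

<syntaxhighlight lang="python">
import cv2
import numpy as np

def vegetation_mask(bgr):
    """Binary plant/soil mask from the excess-green index ExG = 2g - r - b."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    total = b + g + r + 1e-6                  # avoid division by zero
    exg = (2 * g - r - b) / total             # normalized excess green
    # Otsu picks the threshold automatically; scale ExG to 8 bits first.
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

image = cv2.imread("bed_000.jpg")             # placeholder file name
cv2.imwrite("mask.png", vegetation_mask(image))
</syntaxhighlight>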
- [[Image recognition]]: large neural networks have recently had success in building appropriate representations to detect crops. For example, deep architectures (such as Inception, or residual networks over 150 layers deep) can distinguish between various species. Another strategy matches templates to detect crops, and we are currently investigating this model-based image analysis.
[[File:reco_510.jpg]]
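
For the template-matching route, OpenCV's matchTemplate gives a simple starting point. A sketch only; the template image, file names and detection threshold are placeholder assumptions:

<syntaxhighlight lang="python">
import cv2
import numpy as np

scene = cv2.imread("bed_000.jpg")          # placeholder file names
template = cv2.imread("crop_template.jpg") # cropped image of one crop

# Normalized cross-correlation; peaks mark likely crop positions.
scores = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
h, w = template.shape[:2]
ys, xs = np.where(scores > 0.8)            # tunable detection threshold
for x, y in zip(xs, ys):
    cv2.rectangle(scene, (int(x), int(y)), (int(x + w), int(y + h)),
                  (0, 0, 255), 2)
cv2.imwrite("detections.jpg", scene)
</syntaxhighlight>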
- [[3D reconstruction]]: from multiple views of a plant, its 3D structure can be reconstructed.
  
 
== Motion planning ==
 
  
The motion of the tool is planned to:
  
- cover the workspace with [[various strategies]], or
  
- cover the ground area while avoiding the cultivated plants, using a [[modified boustrophedon]] or a [[self-organized map]] (see the sketch below).
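
A plain boustrophedon sweeps the workspace in parallel back-and-forth passes spaced one tool width apart. A minimal sketch of the plain version, emitting G-code waypoints for the controller; the workspace dimensions and tool width are example values:

<syntaxhighlight lang="python">
def boustrophedon(width_mm, height_mm, tool_mm):
    """Yield (x, y) waypoints covering a rectangle in back-and-forth passes,
    alternating the sweep direction on each pass (the 'ox-turning' pattern)."""
    y = 0.0
    forward = True
    while y <= height_mm:
        for x in ((0.0, width_mm) if forward else (width_mm, 0.0)):
            yield (x, y)
        forward = not forward
        y += tool_mm

# Example: 750 x 750 mm workspace, 50 mm soil mixer.
for x, y in boustrophedon(750, 750, 50):
    print(f"G1 X{x:.1f} Y{y:.1f} F1000")
</syntaxhighlight>

In the modified version, a pass would be interrupted (the tool lifted or detoured) wherever the segmentation mask marks a cultivated plant.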
== Additional sensors ==
We also wish to integrate additional sensors, such as a potentiostat; see its [https://docs.google.com/document/d/1G0vFTasQZEsVb0G0EbWwWWZxkRtcOSpB6Mi-jZrjBMc/edit?usp=sharing documentation].
== Samples of collected data ==
* Large-scale map of the culture bed.
[[File:ls_map.jpg]]
* Temporal dynamics of the growth of plants.
[[File:tempdyn.jpg]]
* Timelapse of plants capturing their rapid motions.
<center><HTML5video width="640" height="360" autoplay="false" loop="false">tlapse</HTML5video></center>
* Point cloud from an RGB-D camera (see the sketch below).
[[File:1.png]]
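
As an illustration of how such a cloud can be built from a single RGB-D frame, a minimal sketch assuming the open3d package; the file names and the default PrimeSense-style intrinsics are placeholder assumptions, not our calibration:

<syntaxhighlight lang="python">
import open3d as o3d

# Registered color and depth frames from the RGB-D camera (placeholders;
# depth is expected as a 16-bit image in millimetres).
color = o3d.io.read_image("rgb.png")
depth = o3d.io.read_image("depth.png")

rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
    color, depth, convert_rgb_to_intensity=False)
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

cloud = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsic)
o3d.io.write_point_cloud("cloud.ply", cloud)
</syntaxhighlight>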
== Setup notes ==
Details related to setting up the system are compiled here: [[installation notes]].
== In-field experiments ==
Preliminary results on weeding and crop monitoring from the 2017 campaign are presented here: [[2017 experiments]].
  
 
== Licenses ==
 
  
* All the designs, images, videos and documentation are licensed under the Creative Commons Attribution-ShareAlike license ([https://creativecommons.org/licenses/by-sa/2.0/ CC BY-SA 2.0]).
* All the software is licensed under the [https://www.gnu.org/licenses/gpl-3.0.en.html GNU GPL v3].
==References==

<references/>