When it comes to innovation and artificial intelligence, agriculture is one of the fields developers and students most often try to automate and innovate in. However, although there is no shortage of innovative ideas, few become actual products, and fewer still are realistic and sustainable. This is sometimes blamed on a lack of industry knowledge or a lack of coding skill, but surprisingly it is usually both. Agriculture may seem like an easy field to innovate in, yet not many developers or students know how to apply the concepts of computer science, machine learning, and artificial intelligence to it. The evidence is visible in the industry itself: how many artificial intelligence products are there in agriculture compared with other sectors, such as medicine? The number is significantly low precisely because, as stated above, few ideas are sustainable and realistic enough to match the industry in real time. In other words, if we compared industries by the amount of artificial intelligence innovation and products within them, using levels of light intensity, the agriculture sector would be the darkest.
But you might be wondering, "how does the bot know what seeds to plant?" Don't worry, that was my question as well when I first encountered this product. Although training such a bot might seem difficult, FarmBot took an out-of-the-box approach to user interaction: users graphically design their farm by dragging and dropping plants onto a map. When instructed, the bot plants the seeds of each crop at that specific place, using the same coordinate axes as the software. Moreover, using the bot's manual controls, users can move the bot and operate its tools and peripherals in real time whenever they want: scaring birds away while at work, taking photos, turning the lights on for a night-time harvest, and more!
When it comes to customization, FarmBot does not end there. FarmBot comes with a fairly simple editor that allows users to completely customize the way their FarmBot operates. With the recent addition of support for variables, users can spend less time configuring and more time enjoying the fruits of their FarmBot's labor. The best thing about the editor is that no coding is required, just a simple drag-and-drop command system.
But how does FarmBot do all this? Essentially, for planting and weed detection, FarmBot is given calibration parameters and image detection models, which let it determine the locations of objects in an image. Based on the known locations of desired plants (entered by the user in the software), the bot can then determine which detections are desired plants and which are weeds.
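The core of that last step can be sketched as a simple nearest-neighbour check. This is not FarmBot's actual implementation, just a minimal illustration of the idea: any detection close enough to a known plant location is assumed to be that plant, and everything else is treated as a weed. The function name and the 30 mm tolerance are my own assumptions.

```python
import math

def classify_detections(detections, known_plants, tolerance_mm=30.0):
    """Label each detected object as a desired plant or a weed.

    detections: (x, y) coordinates found by image detection.
    known_plants: (x, y) plant locations the user placed in the map.
    A detection within `tolerance_mm` of a known plant is assumed to be
    that plant; everything else is treated as a weed. (Illustrative
    sketch only, not FarmBot's real algorithm.)
    """
    plants, weeds = [], []
    for dx, dy in detections:
        nearest = min(
            (math.hypot(dx - px, dy - py) for px, py in known_plants),
            default=float("inf"),
        )
        (plants if nearest <= tolerance_mm else weeds).append((dx, dy))
    return plants, weeds
```

For example, a detection at (100, 100) with a known plant at (105, 98) is labelled a plant, while a detection far from every known plant is labelled a weed.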
However, keep in mind that the weeding tool is only a certain size and disrupts the soil within a certain area around it.
In the image, we see that the weeder might affect the lower-left plant when weeding weed 1, since its region of influence intersects the desired plant's circle. We also see that weed 2 cannot be removed without significantly disrupting the upper-right plant, while weed 3 can be weeded safely. The software takes the weeding tool's size into careful consideration with a feature called Safe Remove.
This feature adjusts the weeding location for weed 1 away from the lower-left plant, removes weed 2 from the list since it cannot be removed safely, and keeps weed 3 on the list since there are no conflicts. The weeds to remove and the weeder locations are shown with red and grey circles as before, and cyan circles mark weeds that may not be removed completely (or at all) because the action might harm a desired plant.
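The Safe Remove decisions described above can be modelled with a little circle geometry. The sketch below is my own simplification, not FarmBot's code; the tool and plant radii are assumed placeholder values. A weed inside the plant's safe circle is skipped, a weed whose tool footprint overlaps the plant gets its target nudged directly away from the plant, and a weed with no conflict is removed in place.

```python
import math

TOOL_RADIUS = 25.0   # assumed weeder "region of influence" radius, mm
PLANT_RADIUS = 50.0  # assumed safe radius around a desired plant, mm

def safe_remove(weed, plant):
    """Return an adjusted weeding location, or None if removal is unsafe.

    If the tool's region of influence at the weed location would intersect
    the plant's circle, shift the target away from the plant along the
    plant-to-weed line until it clears. If the weed itself lies inside the
    plant's circle, it cannot be removed safely at all.
    """
    wx, wy = weed
    px, py = plant
    d = math.hypot(wx - px, wy - py)
    if d <= PLANT_RADIUS:            # weed too close: skip it entirely
        return None
    clearance = PLANT_RADIUS + TOOL_RADIUS
    if d >= clearance:               # no conflict: weed in place
        return weed
    # Conflict: push the target outward so the tool circle clears the plant
    scale = clearance / d
    return (px + (wx - px) * scale, py + (wy - py) * scale)
```

With these radii, a weed 60 mm from a plant gets its target pushed out to 75 mm (still within the tool's reach of the weed), a weed 30 mm away is skipped, and a weed 100 mm away is weeded where it stands.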
When it comes to measuring the soil height, FarmBot simulates a virtual stereo camera using FarmBot's CNC camera positioning system. Stereo photography, like binocular vision, provides depth information via parallax, where subjects closer to the lens move further in the frame between lens positions than subjects farther from the lens.
OpenCV computes this disparity between detected object positions in stereo image frames.
This process is performed at multiple locations to fit the coefficients of a correlation between disparity and distance for a particular camera and environment. These coefficients are then applied to the computed disparity values (with the soil as the subject) to calculate distance, which is finally combined with camera position data from FarmBot's known coordinate system to calculate the z-axis coordinate of the soil.
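A minimal sketch of that calibration step, under my own assumptions (not FarmBot's actual routine): since disparity is inversely proportional to subject distance, measurements taken at a few known camera-to-soil distances can be fit with a line in 1/disparity, and the resulting coefficients convert any later disparity into a distance. The sample numbers and the sign convention (z measured upward from the camera origin) are illustrative only.

```python
import numpy as np

# Example calibration measurements: disparity (px) observed at known
# camera-to-soil distances (mm). Values are made up for illustration.
disparities = np.array([40.0, 20.0, 10.0])
distances = np.array([250.0, 500.0, 1000.0])

# Fit distance = a / disparity + b; a and b are the environment- and
# camera-specific coefficients mentioned in the text.
a, b = np.polyfit(1.0 / disparities, distances, 1)

def soil_z(disparity_px, camera_z_mm):
    """Estimate the soil z coordinate from a disparity measurement.

    Assumes z increases upward, so the soil sits `subject_distance`
    below the camera's z position.
    """
    subject_distance = a / disparity_px + b
    return camera_z_mm - subject_distance
```

With the sample data above, a disparity of 20 px from a camera at z = 0 places the soil at z = -500 mm.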
Hence, with the soil position mapped to FarmBot's coordinate system, FarmBot can perform actions that engage the soil surface, such as seeding and weeding.
When it comes to determining the location of soil in the image, we will assume that the most common depth map value represents the soil. In the following image, the selected soil depth is highlighted in green. Depth values for objects far from the soil level are highlighted with red, with bright red indicating objects closer to the camera and dark red indicating objects farther from the camera.
The bot then annotates the depth map using this colour coding:
Next, the bot excludes plants using HSV filtering with the provided values.
Finally, the plants are removed from the depth map so they do not interfere with soil surface selection:
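The two steps described above, masking out plant pixels by HSV range and taking the most common remaining depth value as the soil, can be sketched as follows. This is an illustrative NumPy version, not FarmBot's code, and the green HSV bounds are placeholder values rather than FarmBot's actual thresholds.

```python
import numpy as np

def soil_depth(depth_map, hsv_img, lower=(30, 40, 40), upper=(90, 255, 255)):
    """Pick the soil depth as the mode of the depth map, ignoring plants.

    depth_map: 2-D array of per-pixel depth values.
    hsv_img: H x W x 3 array of HSV pixel values.
    Pixels whose HSV values fall inside [lower, upper] (a placeholder
    "green" range here) are treated as plants and excluded before the
    most common depth value is selected.
    """
    lo, hi = np.array(lower), np.array(upper)
    plant_mask = np.all((hsv_img >= lo) & (hsv_img <= hi), axis=-1)
    soil_values = depth_map[~plant_mask]      # drop plant pixels
    values, counts = np.unique(soil_values, return_counts=True)
    return values[np.argmax(counts)]          # mode = assumed soil depth
```

In a real pipeline the HSV image and depth map would come from the camera and the stereo disparity computation; here the function only captures the selection logic.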
Of course, there is much more to FarmBot's machine learning and deep learning applications; these are just basic examples. FarmBot continues to update its product regularly with new hardware and software features, artificial intelligence models, and bug fixes. At this rate, it is almost certain that products like these will not only shape the future of the agriculture industry but also inspire other companies and developers to innovate sustainable and realistic products for it. It may not be long, then, until some light (to return to the light-intensity comparison) shines on the sector, as it already does with this product: FarmBot.
Author: Jaival Patel
Turner Fenton Secondary School at Brampton, Ontario