FarmBot: The Future of Agriculture

When it comes to innovation and artificial intelligence, agriculture is one of the fields developers and students most often try to automate. Yet despite a vast number of innovative ideas, few come through as actual products, and fewer still are realistic and sustainable. This may stem from a lack of knowledge about the industry, or from a lack of coding skill, but surprisingly it is usually both: agriculture may look like an easy field to innovate in, yet few developers or students know how to apply the concepts of computer science, machine learning, and artificial intelligence to it. A look around the industry bears this out. How many artificial intelligence products exist in agriculture compared to a sector such as medicine? The number is strikingly low because, as noted above, few ideas are sustainable, realistic, and matched to the industry's real-world needs. In other words, if we ranked industries by the amount of artificial intelligence innovation within them, expressed as levels of light intensity, the agriculture sector would be the darkest.

But there is a saying, "where there is darkness, there will always be light", and light has indeed started to shine on the sector. After years of an innovation drought, a product has arrived that has not only energized the industry but also revolutionized what it means to innovate with artificial intelligence. Meet FarmBot, an AI-powered open-source project that has redefined what it means to farm. FarmBot can do everything from checking the soil height and planning crops ahead of time to checking the health of crops, planting seeds, and much more! FarmBots come equipped with onboard cameras and computer vision software powered by Lua scripts and open-source JavaScript and Python libraries. Together, the camera and this software give FarmBot the ability to measure the soil height of your entire garden bed using stereoscopic image processing techniques. With an accurate map of your bed's soil height, FarmBot can then sow seeds more precisely, measure the soil's moisture content, and abate weeds.

But you might be wondering, "how does the bot know what seeds to plant?" Don't worry, that was my question as well when I first encountered this product. Although training a bot may sound difficult, FarmBot thought outside the box with its user interaction: users graphically design their farm by dragging and dropping plants onto a map. When instructed, the bot plants the seeds of each crop at that exact spot, using the same coordinate axes as the software. Moreover, with the bot's manual controls, users can move the bot and operate its tools and peripherals in real time whenever they want: scare birds away while at work, take photos, turn the lights on for a night-time harvest, and more!

Customization does not end there. FarmBot comes with a simple editor that lets users completely customize the way their FarmBot operates. With the recent addition of support for variables, users can spend less time configuring and more time enjoying the fruits of their FarmBot's labor. Best of all, no coding is required: everything is a drag-and-drop command system.

But how does FarmBot do all this? For planting and weed detection, FarmBot is fed calibration parameters and image detection models, which let it determine the location of objects in an image. Based on the known locations of desired plants (entered by the user in the software), the bot can then determine which detections are desired plants and which are weeds.
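The idea of separating plants from weeds by comparing detections against the user's planting map can be sketched in a few lines. This is a simplified illustration, not FarmBot's actual code; the function names, coordinates, and the 30 mm tolerance are assumptions.

```python
# Hypothetical sketch: classify detections as plants or weeds by comparing
# each detected position against the user's known plant locations.
import math

def classify_detections(detections, known_plants, tolerance_mm=30.0):
    """Label each detected (x, y) point as 'plant' if it falls within
    tolerance_mm of a user-placed plant location, else 'weed'."""
    labels = []
    for dx, dy in detections:
        nearest = min(math.hypot(dx - px, dy - py) for px, py in known_plants)
        labels.append("plant" if nearest <= tolerance_mm else "weed")
    return labels

# Two detections near known plants, plus one stray detection (a weed)
plants = [(100.0, 200.0), (300.0, 200.0)]
found = [(102.0, 198.0), (305.0, 210.0), (200.0, 350.0)]
print(classify_detections(found, plants))  # ['plant', 'plant', 'weed']
```

Anything not matched to a mapped plant within the tolerance is treated as a weed candidate.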

However, something to keep in mind is that the weeding tool is only a certain size and disrupts the soil within a certain area.

In this example, the weeder might affect the lower-left plant when weeding weed number 1, since its region of influence intersects the desired plant's circle. Weed 2 cannot be removed without significantly disrupting the upper-right plant, while weed 3 can be weeded safely. The software takes the weeding tool's size into careful consideration with a feature called Safe Remove.

This feature adjusts the weeding location for weed 1 away from the lower-left plant, removes weed 2 from the list since it can't be removed safely, and keeps weed 3 on the list since there are no conflicts. The weeds to remove and the weeder location are drawn with red and grey circles as before, with cyan circles for weeds that may not be removed completely (or at all) because the action might harm a desired plant.
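The three outcomes above (nudge, skip, keep) can be modelled with circle geometry: the tool's region of influence must not overlap a plant's circle. This is a hedged sketch of the idea, not FarmBot's published Safe Remove implementation; the radii and the nudge rule are assumptions.

```python
# Hypothetical Safe Remove sketch. A weeding target is kept in place when it
# is far enough from every plant, nudged directly away from the nearest plant
# when the tool would overlap it, and skipped when no safe position exists.
import math

def safe_remove(weeds, plants, tool_radius, plant_radius):
    """Return (targets, skipped): adjusted weeding positions and the weeds
    that cannot be removed without harming a desired plant."""
    targets, skipped = [], []
    for wx, wy in weeds:
        px, py = min(plants, key=lambda p: math.hypot(wx - p[0], wy - p[1]))
        d = math.hypot(wx - px, wy - py)
        safe = tool_radius + plant_radius  # minimum safe centre-to-centre distance
        if d >= safe:
            targets.append((wx, wy))       # no conflict: weed in place
        elif d > plant_radius:
            # Nudge the tool centre away from the plant until its region of
            # influence just clears the plant's circle.
            scale = safe / d
            targets.append((px + (wx - px) * scale, py + (wy - py) * scale))
        else:
            # The nudged tool could no longer reach the weed: skip it.
            skipped.append((wx, wy))
    return targets, skipped

targets, skipped = safe_remove(
    weeds=[(100.0, 0.0), (40.0, 0.0), (10.0, 0.0)],
    plants=[(0.0, 0.0)], tool_radius=20.0, plant_radius=30.0)
# targets: weed 1 kept in place, weed 2 nudged to (50.0, 0.0); skipped: weed 3
```

The skip condition falls out of the geometry: once a weed sits closer to the plant centre than the plant's own radius, the nudged tool centre ends up farther from the weed than the tool radius, so the weed cannot be reached safely.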

To measure soil height, FarmBot simulates a stereo camera with its CNC camera positioning system, photographing the same scene from two camera positions a known distance apart. Like binocular vision, stereo photography provides depth information via parallax: subjects closer to the lens shift farther in the frame between lens positions than subjects farther from the lens.

OpenCV computes this disparity between detected object positions in stereo image frames.

This process is performed at multiple locations to fit equation coefficients correlating disparity with distance for a particular camera and environment. These coefficients are applied to the computed disparity values (with the soil as the subject) to calculate distance, which is finally combined with camera position data from FarmBot's known coordinate system to calculate the z-axis coordinate of the soil.
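The calibration step can be sketched with the ideal stereo relation, where distance is inversely proportional to disparity (distance = baseline × focal length / disparity), so a single coefficient k relates the two. The function names and sample numbers below are illustrative assumptions, not FarmBot's actual routine.

```python
# Hedged sketch of disparity-to-distance calibration: fit one coefficient k
# (distance ~ k / disparity) from reference measurements, then use it with
# the camera's z position to recover the soil's z coordinate.

def fit_coefficient(samples):
    """samples: list of (disparity_px, known_distance_mm) pairs measured at
    several camera positions. Returns the averaged coefficient k."""
    ks = [disparity * distance for disparity, distance in samples]
    return sum(ks) / len(ks)

def soil_z(camera_z_mm, disparity_px, k):
    """Convert a disparity to camera-to-soil distance, then subtract it from
    the camera's z position in FarmBot's coordinate system."""
    distance = k / disparity_px
    return camera_z_mm - distance

# Calibrate from two reference measurements (illustrative values)
k = fit_coefficient([(40.0, 250.0), (25.0, 400.0)])
print(soil_z(camera_z_mm=0.0, disparity_px=50.0, k=k))  # -200.0
```

A real calibration would fit over many samples and account for lens distortion, but the inverse disparity-distance relationship is the core of the correlation described above.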

Hence, with the soil surface mapped into FarmBot's coordinate system, FarmBot can perform actions that engage the soil surface, such as seeding and weeding.

When it comes to determining the location of soil in the image, we will assume that the most common depth map value represents the soil. In the following image, the selected soil depth is highlighted in green. Depth values for objects far from the soil level are highlighted with red, with bright red indicating objects closer to the camera and dark red indicating objects farther from the camera.
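The "most common depth value" heuristic and the red/green flagging can be sketched directly. This is a minimal illustration under the assumptions stated in the text; the depth values, tolerance, and labels are made up for the example.

```python
# Sketch of soil-level selection: take the modal depth value in the depth map
# as the soil, then flag pixels far from that level ('near' = closer to the
# camera, drawn bright red; 'far' = farther, drawn dark red).
from collections import Counter

def select_soil_depth(depth_map):
    """depth_map: 2D list of per-pixel depth values. Return the most common
    depth, taken to represent the soil surface."""
    counts = Counter(v for row in depth_map for v in row)
    return counts.most_common(1)[0][0]

def flag_pixels(depth_map, soil, tolerance=2):
    """Label each pixel 'soil', 'near', or 'far' relative to the soil level."""
    def label(v):
        if abs(v - soil) <= tolerance:
            return "soil"
        return "near" if v < soil else "far"
    return [[label(v) for v in row] for row in depth_map]

depth = [[10, 10, 4],
         [10, 11, 10],
         [10, 18, 10]]
soil = select_soil_depth(depth)
print(soil)  # 10
```

Pixels within the tolerance of the modal depth are treated as soil (green in the annotated map), while outliers are flagged for exclusion.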

The bot then annotates the depth map using this colour-coding.

The bot also excludes plants using HSV filtering with the colour values provided.
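HSV filtering means keeping or dropping pixels based on hue, saturation, and value ranges. FarmBot's pipeline uses OpenCV for this; the pure-Python sketch below, and the green hue band it uses, are assumptions for illustration only.

```python
# Hypothetical HSV plant filter: mark pixels whose hue falls in a green band
# (with enough saturation and brightness) as plant material to be excluded
# from the depth map.
import colorsys

def is_plant(rgb, hue_range=(0.17, 0.45), min_sat=0.3, min_val=0.2):
    """True if an (r, g, b) pixel (0-255 channels) falls in the green HSV band."""
    h, s, v = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    return hue_range[0] <= h <= hue_range[1] and s >= min_sat and v >= min_val

def mask_plants(image):
    """Return a boolean mask marking plant pixels to drop from the depth map."""
    return [[is_plant(px) for px in row] for row in image]

row = [(34, 139, 34), (139, 90, 43)]   # a green leaf pixel, a brown soil pixel
print([is_plant(px) for px in row])    # [True, False]
```

In practice the thresholds would come from the user-provided calibration values mentioned above rather than hard-coded constants.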

Finally, the plants are removed from the depth map so they do not interfere with soil surface selection.

Of course, there is much more to FarmBot's machine learning and deep learning applications; these are just basic examples. FarmBot continues to update its product regularly with new hardware and software features, artificial intelligence models, and bug fixes. At this rate, it is almost certain that products like this will not only shape the future of the agriculture industry but also inspire other companies and developers to build sustainable, realistic products for it. So it may not be long until more light (to return to the light-intensity metaphor) shines on the sector, as it already does with this product: FarmBot.

Author: Jaival Patel

Turner Fenton Secondary School, Brampton, Ontario