Crop Yield Estimation
Crop yield estimation is an important task in apple orchard management. Accurate yield prediction helps growers improve fruit quality and reduce operating costs by informing decisions about the intensity of fruit thinning and the size of the harvest labor force. It also benefits the packing industry, because managers can use the estimates to optimize packing and storage capacity. Yield is typically estimated from historical data, weather conditions, and manual apple counts at a handful of sampling locations. This process is time-consuming and labor-intensive, and the limited sample size usually cannot capture the yield distribution across the orchard, especially in orchards with high spatial variability. Current yield estimation practice is therefore inaccurate and inefficient, and improving it would go a long way toward making the tree fruit industry more profitable.
To address this challenge, we are developing automated crop yield estimation systems, with an initial focus on apples. To date, we have designed and deployed two prototype systems, one by Vision Robotics Corp. (VRC) and one by Carnegie Mellon University (CMU). Both systems use computer vision to detect and count apples in orchards, as a human counter would, but their designs differ according to their specific goals.
The VRC system is designed to be cost-effective and to produce accurate estimates of both apple size and count with no human input (such as hand counting) required. The hardware is fully self-contained on a single platform that can be towed by any typical orchard vehicle. A vertical mast houses nine stereo cameras and five flashes, which afford multiple views of trees up to approximately 13 feet tall and allow both daytime and nighttime operation. A low-cost GPS unit is integrated, permitting georeferenced yield estimates. The current prototype has been field tested in commercial orchards and, between 2010 and 2012, scanned over 15 acres, including both red and green apple varieties. Fruit count accuracy depends on apple visibility: in highly trained orchard systems, relative errors are within approximately ±10%, while in less well-trained systems they are within approximately ±16%. Further refinement of the software's “automatic bias correction” algorithm, which compensates for occluded fruit, is needed for the system to perform well in less-managed plantings. The system has demonstrated promising results in the estimation of median fruit size, with relative errors within ±3%.
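The general idea behind occlusion compensation can be sketched as follows: a visibility fraction is calibrated from a few hand-counted reference trees, then used to scale machine counts up. This is a minimal illustrative sketch, not the actual VRC algorithm; the function names and the sample counts are assumptions.

```python
def calibrate_visibility(machine_counts, hand_counts):
    """Estimate the fraction of fruit visible to the cameras
    from a few hand-counted calibration trees (assumed workflow)."""
    return sum(machine_counts) / sum(hand_counts)

def correct_count(visible_count, visibility_fraction):
    """Scale a machine count up to compensate for occluded fruit."""
    return visible_count / visibility_fraction

# Hypothetical calibration: cameras saw 75 apples on trees where
# hand counts found 100, so visibility is 0.75.
vis = calibrate_visibility([45, 30], [60, 40])
print(correct_count(150, vis))  # 150 visible apples scaled to 200.0
```

When the calibration trees do not represent the rest of the block (for example, denser canopies elsewhere), the correction undershoots, which matches the underestimate seen in the sample counts below.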
Vision Robotics crop yield estimation system at a commercial orchard in Washington State.
Sample Vision Robotics crop yield estimation system apple counts (blue) alongside manual hand counts (green) for rows comprising one acre of a less well-trained commercial orchard in Washington State. Here, the system did not fully compensate for occluded fruit, leading to an underestimate of apple count.
The CMU system was designed with the end goals of accuracy, practicality, and affordability. Its compact hardware includes two low-cost off-the-shelf cameras and two ring flashes, and can be mounted to a variety of orchard tractors and vehicles for fruit-scouting. It also integrates a Global Positioning System receiver to improve yield estimation accuracy and generate yield maps.
Carnegie Mellon crop yield estimation system as deployed in orchards in Washington and Pennsylvania.
Field tests in 2011 and 2012 covered over 3 acres of apple orchards in Washington and Pennsylvania. The results show a crop yield estimation error of less than ±5% for a block of about half an acre, and accuracy improves as the area covered grows. The system works with both red and green apples and can generate yield maps.
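A yield map of the kind described above can be built by binning georeferenced counts into grid cells. The sketch below shows one simple way to do this; the coordinate convention, cell size, and data layout are assumptions for illustration, not the CMU system's actual pipeline.

```python
from collections import defaultdict

def yield_map(detections, cell_size=10.0):
    """Bin georeferenced apple counts into square grid cells
    (cell_size in meters) to form a simple yield map."""
    grid = defaultdict(int)
    for x, y, count in detections:  # (easting, northing, apples seen)
        cell = (int(x // cell_size), int(y // cell_size))
        grid[cell] += count
    return dict(grid)

# Three hypothetical camera frames; the first two fall in one 10 m cell.
frames = [(3.0, 4.0, 12), (8.0, 2.0, 9), (15.0, 4.0, 7)]
print(yield_map(frames))  # {(0, 0): 21, (1, 0): 7}
```

Aggregating cells into larger blocks also illustrates why accuracy improves with area: per-frame counting errors partially cancel when summed over many cells.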
Example of apple crop yield map generated by the CMU system.