
Grape Vine Winter Pruning

The Situation Motivates

The grape vine is a distinctive crop of the Mediterranean Basin. In Italy, viticulture has a significant economic impact in every region, and the wine industry ranks first in the agro-food chain, with over 5.5 billion euro of exports in 2017. Although mechanization has lowered management costs and precision viticulture is growing, vine selection still requires time-consuming manual operations. Even though the now widespread Vertical Shoot Positioning systems have recently shown positive features in several wine districts, modern vineyards still depend on expensive manual work for vine selection. Winter pruning accounts for 20-35% of the yearly labor demand; this share can be lowered by introducing mechanical operations when spur-pruning is adopted, but a selective manual follow-up is still required and cannot be performed without direct human cognition. Robots can drive a step change in forthcoming innovative farming systems by providing automated solutions that combine intelligent robot vision and manipulation. Although agricultural robotic solutions have been developed in recent years, robotics in viticulture is still in its infancy: only a few prototypes performing winter spur-pruning have been described, the engineering work appears to be ongoing, and interactions with grapevine physiology have never been assessed.

This project, namely VINUM-ROBOT (2018-2023), tackles the emerging challenges of a dramatic shortage of skilled labor and lower grape prices with robotic solutions. The project is researching and developing innovative robotic technologies for automating grapevine spur-pruning.

The information displayed on this page can also be found on the official website of the “VINUM” project.

The Robot Works

The main goal of this project is to develop and test innovative robotic mobile manipulation technologies for the automation of grapevine winter pruning. Deep learning and pruning skills will be merged into a robot with advanced control capabilities. A multi-modal sensing system, in particular 3D vision, will be developed and integrated into a fully torque-controlled robotic manipulator for grapevine recognition, manipulation, and pruning. This robotic arm will be mounted on a legged locomotion platform, forming a mobile manipulator prototype to be demonstrated in the vineyard and compared with hand pruning. Manipulation performance (e.g., pruning efficiency), locomotion on different terrains, and vine growth features in the following season will all be assessed (a minimal sketch of this perceive-plan-cut cycle follows the trial summaries below). The expected impact of the project embraces the still unreached goal of selective, fully automated winter pruning. Two trials have been performed, in 2018 and 2019.

Trial in 2018: A proof-of-concept experiment was demonstrated with a robot arm on a table. A simple but effective clipper was developed and tested.

Trial in 2019: A proof-of-concept experiment was demonstrated with a mobile manipulator in the lab. Navigation and new perception algorithms were tested.
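To make the architecture above concrete, here is a minimal, purely illustrative Python sketch of the perceive-plan-cut cycle, assuming hypothetical placeholder names (PruningPoint, detect_pruning_points, arm, clipper and their methods) that do not reflect the project's actual ROS interfaces or codebase.

# Minimal sketch of the perceive-plan-cut cycle described above.
# All class, function, and method names are hypothetical placeholders,
# not the VINUM project's actual interfaces.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PruningPoint:
    position: Tuple[float, float, float]   # 3D cut location in the robot base frame
    approach: Tuple[float, float, float]   # approach direction for the clipper

def detect_pruning_points(rgbd_frame) -> List[PruningPoint]:
    """Placeholder for the deep-learning perception module (see 'The Perception Guides')."""
    return []

def prune_vine(rgbd_frame, arm, clipper) -> None:
    """One perceive-plan-act cycle on a single vine."""
    for point in detect_pruning_points(rgbd_frame):
        arm.move_to(point.position, point.approach)   # motion planning + torque-controlled execution
        clipper.cut()                                  # actuate the pruning tool
    arm.retract()                                      # clear the canopy before the platform moves on

In the real system, the arm and clipper objects would wrap the torque-controlled manipulator and the clipper developed in the 2018 trial, and the cycle would run once per vine as the legged platform advances along the row.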

The Simulation Helps

Simulated environments enable faster and more efficient development of robotic technology. Despite the large initial effort needed to build a grape vine winter pruning simulation, the long-term return justifies it. For efficient development and deployment of the technology, a high-fidelity simulation environment providing reliable interaction between the robot and the fruits and leaves has been developed, thanks to the “Learn-Real” project that the APRIL lab is running. Advanced rendering and physics modelling from modern game engines and simulators, e.g. Unity3D, Unreal Engine, Nvidia Isaac, and Blender, together with robot modelling tools in ROS, have been used for this purpose. This makes it possible to train the robot's perception and manipulation skills by easily varying the environment parameters, e.g., fruit color, texture, wind, and light.
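As an example of how such an environment can be exploited, the snippet below sketches the domain-randomization idea in Python. The sim handle, parameter names, and value ranges are invented for illustration and are not the Learn-Real or game-engine API.

# Illustrative domain randomization over the simulated vineyard.
# The 'sim' interface, parameter names, and ranges are assumptions for this sketch.
import random

def randomize_scene(sim) -> None:
    sim.set("fruit_color_hue", random.uniform(0.0, 1.0))    # vary grape and leaf appearance
    sim.set("leaf_texture_id", random.randint(0, 9))
    sim.set("wind_speed_mps", random.uniform(0.0, 5.0))     # wind moves canes and leaves
    sim.set("light_intensity", random.uniform(0.3, 1.5))    # time of day / weather

def training_frames(sim, n_episodes: int = 1000):
    """Yield one randomized RGB-D frame per episode for perception training."""
    for _ in range(n_episodes):
        randomize_scene(sim)
        yield sim.render_rgbd()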

The Perception Guides

A deep neural network has been trained to detect the potential pruning region, where the spur and its shoots are located. A further perception algorithm is then applied to generate the pruning points, keeping only the best shoot with its nodes.
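The selection step can be pictured, under simplifying assumptions, with the Python sketch below. The detection format (one record per candidate shoot, carrying a spur id, a confidence score, and the node positions along the shoot) and the rule of cutting above the second retained node are illustrative assumptions, not the exact algorithm used in the project.

# Schematic post-processing: from candidate shoot detections to pruning points.
# The detection record format and the cut rule are simplifying assumptions.
from typing import Dict, List

def select_pruning_points(detections: List[Dict]) -> List[Dict]:
    """Keep the best-scoring shoot per spur and cut above its retained nodes."""
    best_per_spur: Dict[int, Dict] = {}
    for det in detections:   # det: {"spur_id", "score", "node_heights"}
        current = best_per_spur.get(det["spur_id"])
        if current is None or det["score"] > current["score"]:
            best_per_spur[det["spur_id"]] = det   # retain only the best shoot on each spur
    points = []
    for det in best_per_spur.values():
        nodes = sorted(det["node_heights"])       # node positions from the base upwards
        if not nodes:
            continue
        cut_height = nodes[1] if len(nodes) > 1 else nodes[0]
        points.append({"spur_id": det["spur_id"], "cut_above": cut_height})
    return points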

Partners

Dr. Fei Chen, Prof. Darwin Caldwell, Active Perception and Robot Interactive Learning Laboratory, ADVR, IIT

Dr. Claudio Semini, Dynamic Legged Systems, IIT

Prof. Stefano Poni, Prof. Matteo Gatti, Department of Sustainable Crop Production, UNIVERSITÀ CATTOLICA DEL SACRO CUORE, Italy

Note: Dr. Semini and Prof. Gatti are also coordinating the IIT-Unicatt Joint Lab Agri-Food Robotics Projects.

Contact

fei.chen -at- iit.it