Alex Thomasson and 2 more

An autonomous mobile ground-control point (AMGCP) was redesigned and refined for improved collaborative operation with an unmanned aerial vehicle (UAV) to enable calibration of image mosaics from multispectral (MS) and thermal cameras. The AMGCP has built-in reflectance panels and electronically controlled thermal panels that provide high and low reflectance and temperature references for calibrating reflectance measurements in MS images and temperature measurements in thermal-infrared images. The AMGCP also has an onboard temperature sensor that enables image-based temperature measurements to be compared with ambient air temperature so that canopy temperature depression (CTD) can be calculated. The collaborative robotic system consists of the AMGCP and a UAV, each carrying a real-time kinematic (RTK) global positioning system (GPS) receiver so that its precise position can be determined in real time. During a mission, the AMGCP positions itself at multiple locations under the flight path of the UAV, providing multiple instances of reflectance and temperature references in the image mosaics collected by the UAV; wireless communication between the two vehicles allows them to exchange position and other data throughout. Testing has shown that reflectance measurements can be calibrated to less than 1% reflectance error and canopy temperatures of crop plants to within 1.0 °C, enabling consistently accurate measurements to be made efficiently and without human intervention across different fields, regions, times, and dates. The system is also well suited to accurate measurement of CTD, facilitating genetic selection for stress-resilience traits such as drought tolerance.
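
To make the calibration chain concrete, the sketch below shows, in Python and with hypothetical panel values and function names rather than the authors' software, how references like the AMGCP's could be used: a two-point empirical-line calibration from digital numbers to reflectance, an analogous linear correction of thermal-image temperatures against cold and hot panels, and CTD computed as ambient air temperature minus calibrated canopy temperature.

```python
"""Minimal sketch (assumptions, not the authors' code) of the two-point
calibrations that high/low reflectance and temperature references enable,
plus canopy temperature depression (CTD). Panel values are hypothetical."""

import numpy as np


def empirical_line(dn, dn_dark_panel, dn_bright_panel,
                   refl_dark_panel=0.05, refl_bright_panel=0.60):
    """Map raw digital numbers (DN) in one MS band to reflectance using the
    low- and high-reflectance panels observed in the mosaic."""
    gain = (refl_bright_panel - refl_dark_panel) / (dn_bright_panel - dn_dark_panel)
    offset = refl_dark_panel - gain * dn_dark_panel
    return gain * np.asarray(dn, dtype=float) + offset


def calibrate_temperature(t_image, t_image_cold, t_image_hot,
                          t_ref_cold, t_ref_hot):
    """Correct thermal-image temperatures against the cold and hot thermal
    panels, whose true temperatures come from onboard sensors."""
    gain = (t_ref_hot - t_ref_cold) / (t_image_hot - t_image_cold)
    return t_ref_cold + gain * (np.asarray(t_image, dtype=float) - t_image_cold)


def canopy_temperature_depression(t_canopy, t_ambient):
    """CTD, here taken as ambient air temperature minus canopy temperature."""
    return t_ambient - t_canopy


if __name__ == "__main__":
    # Hypothetical panel and canopy readings pulled from a mosaic.
    dn = np.array([1200, 2400, 3600])
    print(empirical_line(dn, dn_dark_panel=800, dn_bright_panel=4000))

    t_canopy = calibrate_temperature(26.5, t_image_cold=14.0, t_image_hot=44.0,
                                     t_ref_cold=15.0, t_ref_hot=45.0)
    print(canopy_temperature_depression(t_canopy, t_ambient=30.0))
```

In practice the panel reflectances and temperatures would be read from the AMGCP's own sensors during the mission rather than hard-coded as above.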

Alex Thomasson and 6 more

Texas A&M University recently completed a set of Automated Precision Phenotyping (APP) Greenhouses that incorporate robotic systems for automated collection of advanced sensor-based plant phenotypes. A gantry beam transits the length of each greenhouse, and a rolling truck on the beam provides a second axis of motion. Attached to the truck is a 3.0-m-long robotic arm that can position a sensor head at virtually any location relative to any plant in the greenhouse. The arm can be programmed to execute complicated scanning patterns quickly and safely, enabling data collection on all plants in the greenhouse within a window of a few hours and thus ensuring consistent conditions during data collection. The sensor head currently includes a high-speed multispectral camera and will eventually include a Raman spectrometer. Relative to phenotyping greenhouses at other institutions, the APP Greenhouses offer maximum flexibility in the configuration of plants, in the positioning of sensors relative to the plants, and in the types of sensors used, making their research capabilities unique. Preliminary data have been collected on sorghum and maize plants: four-band multispectral images have been collected daily, scanning the side of each plant from top to bottom. Preliminary software development is directed at automated image stitching to create a full side-view image of each plant, from which consistent metrics such as plant height, stalk diameter, and leaf angle can be calculated automatically.
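
As a rough illustration of that post-processing step, the Python sketch below stacks a top-to-bottom scan of side-view tiles into one image and estimates plant height from a plant mask and a known millimeters-per-pixel scale. It is a sketch under assumptions, not the APP Greenhouse software; the fixed-overlap stitching and the brightness-based mask are stand-ins for whatever stitching and segmentation methods are ultimately used.

```python
"""Illustrative sketch (assumptions, not the APP Greenhouse pipeline) of
stitching a vertical scan of side-view tiles and estimating plant height."""

import numpy as np


def stitch_vertical(tiles, overlap_px):
    """Concatenate tiles captured top-to-bottom, trimming a fixed vertical
    overlap between consecutive frames (stand-in for feature-based stitching)."""
    pieces = [tiles[0]] + [t[overlap_px:, :] for t in tiles[1:]]
    return np.vstack(pieces)


def plant_height_mm(plant_mask, mm_per_px):
    """Height = vertical extent of mask pixels times the image scale."""
    rows = np.where(plant_mask.any(axis=1))[0]
    if rows.size == 0:
        return 0.0
    return float(rows.max() - rows.min() + 1) * mm_per_px


if __name__ == "__main__":
    # Hypothetical three-tile scan of a single band, 100 x 80 px per tile.
    rng = np.random.default_rng(0)
    tiles = [rng.random((100, 80)) for _ in range(3)]
    mosaic = stitch_vertical(tiles, overlap_px=20)

    # Hypothetical mask: treat bright pixels as plant material.
    mask = mosaic > 0.8
    print(mosaic.shape, plant_height_mm(mask, mm_per_px=0.5))
```

In a production pipeline, a feature-based stitcher and a spectral vegetation-index threshold would likely replace the fixed overlap and the brightness mask used here for brevity.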