
Greenotyper: Image-based plant phenotyping using distributed computing and deep learning

Research output: Contribution to journal › Journal article › Research › peer-review

  • Marni Tausen
  • Marc Mathias Clausen, Denmark
  • Sara Moeskjær
  • ASM Shihavuddin, Green University of Bangladesh; DTU Compute, Technical University of Denmark
  • Anders Bjorholm Dahl, DTU Compute, Technical University of Denmark
  • Luc Janss
  • Stig Uggerhøj Andersen
Image-based phenotype data with high temporal resolution offers advantages over end-point measurements in plant quantitative genetics experiments, because growth dynamics can be assessed and analysed for genotype-phenotype association. Recently, network-based camera systems have been deployed as customizable, low-cost phenotyping solutions. Here, we implemented a large, automated image-capture system based on distributed computing, using 180 networked Raspberry Pi units that simultaneously monitored 1800 white clover (Trifolium repens) plants. The camera system proved stable, with an average uptime of 96% across all 180 cameras. For analysis of the captured images, we developed the Greenotyper image analysis pipeline. It detected plant locations with a bounding box accuracy of 97.98%, and its U-net-based plant segmentation achieved an Intersection over Union (IoU) of 0.84 and a pixel accuracy of 0.95. We used Greenotyper to analyse a total of 355,027 images, which required 24-36 hours. Automated phenotyping using a large number of static cameras and plants thus proved a cost-effective alternative to systems relying on conveyor belts or mobile cameras.
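The abstract reports segmentation quality as Intersection over Union (0.84) and pixel accuracy (0.95). As a minimal sketch of how these two standard metrics are computed for binary plant/background masks (this is an illustration with NumPy, not the actual Greenotyper evaluation code; the masks below are toy data):

```python
import numpy as np

def iou(pred, truth):
    """Intersection over Union for binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    # Two empty masks agree perfectly by convention
    return intersection / union if union else 1.0

def pixel_accuracy(pred, truth):
    """Fraction of pixels classified correctly (plant vs. background)."""
    return np.mean(pred.astype(bool) == truth.astype(bool))

# Toy 4x4 masks: 1 = plant pixel, 0 = background
truth = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 1],
                  [0, 1, 1, 0],
                  [0, 0, 0, 0]])
pred  = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 1, 0, 0]])

print(iou(pred, truth))             # 6 overlapping / 8 in union = 0.75
print(pixel_accuracy(pred, truth))  # 14 of 16 pixels correct = 0.875
```

Both metrics range over [0, 1]; IoU penalizes missed and spurious plant pixels symmetrically, while pixel accuracy can look optimistic when the background dominates the image.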
Original language: English
Article number: 1181
Journal: Frontiers in Plant Science
Number of pages: 17
Publication status: Published - Aug 2020

    Research areas

  • Deep Learning, Image detection, Raspberry Pi, Plant Phenotyping, Software, Object detection and segmentation, Greenness measures


ID: 193505918