X-ray microscopy enables multiscale high-resolution 3D imaging of plant cells, tissues, and organs

February 19, 2022

Keith E Duncan (1), Kirk J Czymmek (1), Ni Jiang (1), August C Thies (1), Christopher N Topp (1)
Plant Physiology. Volume 188, Issue 2, Pages 831–845 (February 2022). DOI: https://doi.org/10.1093/plphys/kiab405


Capturing complete internal anatomies of plant organs and tissues within their relevant morphological context remains a key challenge in plant science. While plant growth and development are inherently multiscale, conventional light, fluorescence, and electron microscopy platforms are typically limited to imaging of plant microstructure from small flat samples that lack a direct spatial context to, and represent only a small portion of, the relevant plant macrostructures. We demonstrate technical advances with a lab-based X-ray microscope (XRM) that bridge the imaging gap by providing multiscale high-resolution three-dimensional (3D) volumes of intact plant samples from the cell to the whole plant level. Serial imaging of a single sample is shown to provide sub-micron 3D volumes co-registered with lower magnification scans for explicit contextual reference. High-quality 3D volume data from our enhanced methods facilitate sophisticated and effective computational segmentation. Advances in sample preparation make multimodal correlative imaging workflows possible, where a single resin-embedded plant sample is scanned via XRM to generate a 3D cell-level map, and then used to identify and zoom in on sub-cellular regions of interest for high-resolution scanning electron microscopy. In total, we present the methodologies for use of XRM in the multiscale and multimodal analysis of 3D plant features using numerous economically and scientifically important plant systems.

How Our Software Was Used

Dragonfly’s Segmentation Wizard was used to train a Sensor3D model on seven manually segmented three-phase training frames, identifying spikelets, bristles/panicle, and air from grayscale intensity and object shape. The trained model segmented the full inflorescence scan volume, and the output was then refined manually in regions where spikelets and bristles had been misclassified. Dragonfly’s Deep Learning Tool, with the same Sensor3D model, was also used to segment the correlative XRM data.
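To make the three-phase idea concrete, here is a minimal sketch of per-voxel classification of a grayscale 3D volume into air, bristle/panicle, and spikelet classes. This is a toy stand-in, not Dragonfly’s Sensor3D network: the real model is a trained deep neural network that also uses object shape, whereas this sketch uses only hypothetical intensity thresholds, and all names, labels, and threshold values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical class labels for the three phases described above.
AIR, BRISTLE, SPIKELET = 0, 1, 2

def segment_three_phase(volume, t_air=0.2, t_bristle=0.6):
    """Classify each voxel of a normalized grayscale volume by intensity.

    Voxels below t_air are air, those between t_air and t_bristle are
    bristle/panicle, and those at or above t_bristle are spikelet.
    (Thresholds are illustrative, not values from the paper.)
    """
    labels = np.full(volume.shape, AIR, dtype=np.uint8)
    labels[volume >= t_air] = BRISTLE
    labels[volume >= t_bristle] = SPIKELET
    return labels

# Synthetic 8x8x8 volume: dark background, a mid-gray rod, a bright blob.
vol = np.zeros((8, 8, 8))
vol[2:6, 3, 3] = 0.4    # rod of 4 voxels -> bristle/panicle
vol[4, 5:7, 5:7] = 0.9  # blob of 4 voxels -> spikelet

seg = segment_three_phase(vol)
counts = {c: int((seg == c).sum()) for c in (AIR, BRISTLE, SPIKELET)}
```

A learned model such as Sensor3D replaces the fixed thresholds with features inferred from the seven manually segmented training frames, which is what allows it to separate phases with overlapping grayscale ranges by shape.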

Author Affiliation

(1) Donald Danforth Plant Science Center, St. Louis, Missouri 63132, USA