Reconstructing Neural Maps from Electron Microscopy Data with Deep Learning

By Philip Laserstein and Vijay Iyer, MathWorks

Researchers in the Department of Connectomics at the Max Planck Institute (MPI) for Brain Research study neural networks in the cerebral cortex to understand how the brain processes sensory experience to detect objects in the environment. Their work involves building connectomes—maps of neuronal circuits that identify the individual connections between neurons.

Unlike the “neurons” of artificial neural networks, biological neurons are not organized into neat rows of one-dimensional layers. Instead, they are packed and connected in a dense 3D space-filling mesh that can be studied only from images of brain tissue captured with the nanometer resolution of an electron microscope (Figure 1). 

Figure 1. Dense reconstruction of approximately 500,000 cubic micrometers of mammalian cortical tissue yielding 2.7 m of neuronal cables, making up a connectome of about 400,000 synapses between 34,221 axons.
From Motta, Berning, Boergens, Staffler, Beining, Loomba, Hennig, Wissler, Helmstaedter. “Dense connectomic reconstruction in layer 4 of the somatosensory cortex.” Science. October 24, 2019. Reprinted with permission from AAAS.1

Why Study Biological Neural Networks?

Convolutional neural networks (CNNs) were inspired by biological intelligence, with the feedforward connectivity of neurons and layers in a CNN resembling that of the visual cortex of humans and other animals. Increased computing power and the availability of massive amounts of data have improved the performance and accuracy of CNNs, but when compared with the human brain, they are remarkably inefficient, in both the energy they consume and the labels they require in training. A large-scale CNN classifier deployed in a cloud computing environment consumes several orders of magnitude more power than the human brain, and while a toddler can learn to classify objects after seeing just a few dozen examples, CNNs need millions of accurately labeled images. Research teams that rely on deep learning are beginning to bump up against these limitations. By analyzing connectomes to understand how evolution solved these challenges in biological neural networks, researchers may find clues for developing next-generation artificial neural networks.

3D electron micrographs reveal the bulbous cell bodies of individual neurons and the dense and tortuous network of thin neuronal cables connecting neurons. The single cable, or axon, projecting from each neuron is an extremely thin structure. Smaller than one micron in diameter, axons connect to neighboring neurons as well as to more distant neurons, such as those in different layers of the same cortical area, or to neurons many millimeters away, even on the opposite side of the brain. Each neuron in the cerebral cortex can receive connections from thousands of other neurons on its own local tree of branching neural cables, or dendrites. These individual connection points (synapses) between the axonal cable of one neuron and the dendritic cable of another are of a sub-micrometer scale.

The challenge for researchers in the emerging field of connectomics is to develop techniques to map neuronal connections across this vast range of scales (Figure 2). The Max Planck researchers focused on dense reconstructions of connectomes, work that requires the highest possible accuracy in tracing neuronal cables and identifying synapses within electron microscopy volumes. The challenge is large scale: a single cubic millimeter of gray matter in the cerebral cortex contains kilometers of branching neuronal cables and about 1 billion synapses.

Figure 2. Scales of neural connectivity in the cerebral cortex, from nanometer-scale synapses between individual neurons to millimeter-scale connection distances. Orange = axons; blue = dendrites.

Manual dense reconstruction of connectomes from electron microscopy data typically takes tens of thousands of work hours, even for smaller sample volumes containing around 1 million synapses. To automate the more labor-intensive parts of the reconstruction process, the Department of Connectomics developed FocusEM, a workflow that combines human annotation with automation powered by convolutional neural networks created in MATLAB®. The CNN models were trained and executed using parallel processing on a high-performance computing (HPC) cluster.

FocusEM made it possible to reconstruct 0.9 meters of dendrites and about 1.8 meters of axons in the somatosensory cortex, identifying nearly 500,000 synapses, in only about 4000 human work hours, a 10- to 25-fold efficiency gain over previous approaches. This work was published in the journal Science, where the researchers showed how accurate dense reconstructions at this scale can contribute to a detailed understanding of local brain circuitry.

Connectome Reconstruction Challenges

To image the sample, a block of brain tissue is extracted and stained with heavy metal compounds. The sample is transferred into an electron microscope equipped with a custom-built microtome. Tissue imaging alternates with tissue slicing, in which thin slices of 25–30 nm are obtained with the microtome’s diamond knife. Thousands of imaging and cutting alternations produce a 3D image dataset that is hundreds of gigabytes to terabytes in size (Figure 3).

Figure 3. Serial scanning electron microscopy for brain imaging. A sample of neuronal tissue is imaged and then cut with a custom-built microtome. Alternating cutting and imaging produces a 3D image stack. Scale bar = 1 µm.

To map a connectome, researchers must trace the axon from each neuron as it winds through the 3D volume to identify where it connects to other neurons.

When two cable segments are detected near one another, researchers must closely analyze the image to determine whether they are part of the same axon, two separate pieces connected via a synapse, or unrelated segments.

Deep Learning for Large-Scale Reconstruction of Neuronal Circuits

The FocusEM workflow automates most of the time-consuming annotation and decision-making steps in the connectome reconstruction process. The workflow consists of three main stages:

  • Preprocessing steps, based on image processing algorithms and heuristics
  • Image segmentation, based on image processing algorithms and deep learning
  • Morphological reconstruction, based on machine learning combined with focused human queries

The preprocessing stage includes steps such as aligning individual 2D image slices within the 3D sample volume using a global least-squares solver, masking out easily identifiable structures such as blood vessels and nuclei, and correcting for image brightness.
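The global least-squares alignment step can be illustrated with a toy one-dimensional version: each pairwise shift estimate between two slices becomes one row of an overdetermined linear system, and solving that system yields globally consistent slice offsets. The sketch below is in Python for illustration only (the team's actual pipeline is MATLAB), and all names in it are invented here.

```python
import numpy as np

def align_slices(measurements, n_slices):
    """Solve global 1-D slice offsets from pairwise shift estimates.

    measurements: list of (i, j, shift) triples meaning x[j] - x[i] ~= shift.
    Slice 0 is pinned at offset 0 to remove the free global translation.
    """
    A = np.zeros((len(measurements) + 1, n_slices))
    b = np.zeros(len(measurements) + 1)
    for row, (i, j, shift) in enumerate(measurements):
        A[row, i], A[row, j] = -1.0, 1.0  # one equation per shift estimate
        b[row] = shift
    A[-1, 0] = 1.0  # anchor: x[0] = 0
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

Because every estimate contributes one equation, redundant long-range measurements (e.g., between slices i and i+2) average out noise in the adjacent-slice estimates instead of letting errors accumulate down the stack.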

The image segmentation stage is based on a workflow called SegEM, published by the Max Planck Department of Connectomics in the journal Neuron in 2015. SegEM uses a custom-built 3D CNN in combination with image segmentation algorithms such as watershed transforms. For the sample volume of 500,000 cubic micrometers in the current study, the SegEM stage produced 15 million distinct volume segments.
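The combination of a CNN with a watershed transform can be sketched as follows: the CNN emits a membrane (boundary) probability per voxel, confidently interior regions become watershed seeds, and flooding assigns the remaining voxels to segments. Below is a minimal 2-D Python illustration using SciPy's image foresting watershed; the actual SegEM code is a MATLAB 3D CNN pipeline, and the function names here are invented.

```python
import numpy as np
from scipy import ndimage

def segment_from_boundary_map(boundary_prob, seed_threshold=0.2):
    """Watershed over a CNN-style boundary-probability map.

    boundary_prob: float array in [0, 1]; high values mark cell membranes.
    Returns (labels, n_seeds): one segment label per pixel.
    """
    # Connected regions confidently *inside* a neural process become seeds
    seeds, n_seeds = ndimage.label(boundary_prob < seed_threshold)
    # Flood outward from the seeds; membrane pixels are claimed last
    elevation = np.round(boundary_prob * 255).astype(np.uint16)
    labels = ndimage.watershed_ift(elevation, seeds.astype(np.int16))
    return labels, n_seeds
```

In the real workflow the boundary map comes from the trained 3D CNN and the flooding runs in three dimensions over the full volume, producing the millions of volume segments mentioned above.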

The morphological reconstruction stage relies on a set of machine learning classifiers that were custom-built and trained in MATLAB to support the FocusEM workflow:

  • The ConnectEM classifier determines the likelihood that two adjacent volume segments are physically connected, that is, part of a continuous neural cable.
  • The SynEM classifier determines whether adjacent volume segments correspond to a neural connection, a synapse that occurs across a thin gap of nanometer scale (Figure 2); these can be identified via distinct image features such as clusters of synaptic vesicles.
  • Four TypeEM classifiers categorize volume segments as belonging to an axon, a dendrite, a dendritic spine head (the location of a potential synapse), or a non-neuron cell type.
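To illustrate the kind of classifier involved, the sketch below trains a minimal logistic-regression model (plain NumPy, gradient descent) to score whether two adjacent segments continue the same cable, using two invented interface features: contact area and mean boundary probability at the interface. This stands in for ConnectEM only conceptually; the actual classifiers, features, and training code are the team's MATLAB implementations.

```python
import numpy as np

def train_interface_classifier(features, labels, lr=0.5, steps=2000):
    """Logistic regression estimating P(connected | interface features).

    features: (n, d) array of per-interface features; labels: (n,) 0/1.
    """
    X = np.hstack([features, np.ones((len(features), 1))])  # bias column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
        w -= lr * X.T @ (p - labels) / len(labels)  # mean-gradient step
    return w

def connection_probability(w, features):
    """Score new interfaces with the trained weights."""
    X = np.hstack([features, np.ones((len(features), 1))])
    return 1.0 / (1.0 + np.exp(-X @ w))
```

Interfaces scored near 0 or 1 can be accepted automatically, while borderline scores are exactly the cases FocusEM routes to human annotators as directed queries.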

The FocusEM workflow uses these classifiers to automate many steps in the dense reconstruction process. Trained human annotators focus on directed queries from the classifiers to resolve complex situations such as crossings between multiple neural cables.

The result of this semi-automated workflow was a greater than tenfold reduction in work hours compared with manual approaches to dense reconstruction (Figure 4). The FocusEM workflow code is available for download from a GitLab repository.

Figure 4. Work hours required for different approaches to densely reconstruct a cubic millimeter of nerve tissue. While manual approaches are time-consuming and expensive, FocusEM allows dense reconstruction of larger brain volumes within realistic time frames and costs.

Acceleration with High-Performance Computing

In addition to minimizing the human work hours required to complete the dense reconstruction of a connectome, the Max Planck researchers sought to minimize the compute time required for the automated steps in the FocusEM workflow.

To achieve this, the researchers turned to parallel computing. The Department of Connectomics accesses a compute cluster containing 2500 CPU cores and 32 GPUs via MATLAB Parallel Server™. The team used Parallel Computing Toolbox™ to help parallelize the image preprocessing algorithms and custom CNN classifiers. Aside from a global image registration step, most compute steps in the dense reconstruction workflow were data parallel because the classifiers could be run on different portions of the sample volume simultaneously. 
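The data-parallel pattern described above — run the same analysis on different portions of the volume at once, then combine the results — can be sketched in a few lines. This Python sketch is only an illustration of the chunk-and-reduce idea; the team's setup uses MATLAB on a cluster, and a thread pool stands in here for the cluster scheduler to keep the example portable.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def count_foreground(chunk, threshold):
    """Per-chunk work: a stand-in for running a classifier on a sub-volume.
    Here it just counts voxels above an intensity threshold."""
    return int(np.sum(chunk > threshold))

def parallel_volume_stats(volume, threshold, n_chunks=4, workers=4):
    """Split the volume along z, process the chunks concurrently, reduce."""
    chunks = np.array_split(volume, n_chunks, axis=0)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(lambda c: count_foreground(c, threshold), chunks)
    return sum(partials)
```

Because each chunk is independent, the same pattern scales from a thread pool to thousands of cluster cores; only steps with global coupling, such as the image registration, need a different treatment.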

“Versatility and speed are top priorities in our development process. The ability to move from an initial idea to a highly parallelized production deployment without having to rewrite code or rethink data structures is vital to our team.”

Moritz Helmstaedter, Director, Max Planck Institute for Brain Research, Department of Connectomics

For the 500,000 cubic micrometer sample volume reconstruction, the FocusEM compute steps took approximately 100 hours of compute time. Compared with the 4000 human work hours required, the computational work was therefore not a bottleneck. Most of the FocusEM processing ran on the CPUs, using about 20% of the cluster's CPU capacity (384 cores). GPUs were used to accelerate training of the SegEM custom deep learning classifier used for image segmentation.

Analyzing the Connectome

Having completed the first dense reconstruction for 500,000 cubic micrometers of cerebral cortex, the Max Planck researchers analyzed the resulting connectivity and geometric data. Their analysis yielded valuable insights into the local properties of this living neural network:

  • Distinct classes of neurons (excitatory and inhibitory) contacted their target cells with distinct innervation patterns, confirming findings from previous experiments, this time using only connectomic data.
  • Contrary to what some prior theoretical models have proposed, geometry-based rules for how axons and dendrites fill cortical volumes do not explain the observed connectivity patterns.
  • The measured distribution of synaptic sizes in the connectome can provide insights into learning processes that may have occurred in the brain.

With a sample volume more than 300 times larger than past dense cortical reconstructions, spanning about 7000 axons and about 400,000 synapses, this study provided a level of statistical power not previously available for addressing such questions of local brain circuitry.

Plans for Further Research

Having established the feasibility and scientific value of dense cortical reconstruction at the current scale of 500,000 cubic micrometers, the Max Planck research team is now working to obtain more types of brain samples, to allow comparisons between species and between different brain states, such as diseased and healthy states.

The Department of Connectomics has also begun to tackle the further challenge of reconstructing larger cortical volumes spanning multiple brain layers and containing longer-distance neural connections. They continue to improve automation techniques such as FocusEM to lower the costs of dense reconstructions. Work is under way to analyze a petabyte-sized dataset from a cubic millimeter sample volume, which matches the scale of functional units identified in past studies of brain function. The results achieved to date using parallelized MATLAB indicate that the team can complete the FocusEM compute steps for a petabyte-sized dataset on its local cluster without computation becoming a bottleneck in the overall reconstruction effort.

1 Readers may view, browse, and/or download material for temporary copying purposes only, provided these uses are for noncommercial personal purposes. Except as provided by law, this material may not be further reproduced, distributed, transmitted, modified, adapted, performed, displayed, published, or sold in whole or in part, without prior written permission from the publisher.

Published 2020