‘Self-driving labs.’ Scientists at PNNL explore how AI can help transform research

Scientific research takes time. Experimentation can be painstakingly slow. Innovation is often more incremental than immediate.

This all may be true, but given today’s most pressing challenges in science, energy and security, we no longer have the luxury of time. Researchers at the Department of Energy’s Pacific Northwest National Laboratory are exploring the frontiers of artificial intelligence, or AI, to quickly find solutions that can transform these fields.

Their use of AI to aid research will ultimately lead to “self-driving laboratories” and “autonomous experiments” that fully integrate artificial intelligence, robotics and digital technologies. For example, the planned Microbial Molecular Phenotyping Capability, an expansion to the Environmental Molecular Sciences Laboratory, a DOE user facility at PNNL, will use AI to automate the entire experimentation process — from sample preparation to data analysis to guiding the next steps of an experiment. This revolutionary approach will give scientists new ways to accelerate biotechnological advances, such as more economical and efficient production of fuels, chemicals and materials.

Researchers at PNNL are using artificial intelligence to speed the process of scientific discovery, including enabling advancements in materials science with tools that rapidly identify patterns in electron microscope images without human guidance.

In one significant step toward self-driving laboratories, researchers at PNNL are incorporating a breakthrough in electron microscopy that uses an AI approach known as deep learning to accelerate advances in materials science. Using complex algorithms, they can train a model to find patterns of interest in the high volume of data collected by these instruments — making it possible to speed the development of new materials for more efficient catalysts and longer-lasting batteries.
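To make the idea concrete, here is a minimal sketch, written in Python with PyTorch, of how a deep learning model might be trained to flag patterns of interest in microscope image patches. The network architecture, patch size, labels and random placeholder data are illustrative assumptions, not PNNL’s actual model.

```python
# Minimal sketch: train a small convolutional network to classify
# microscope image patches as "feature of interest" vs. "background".
# The data here are random placeholders standing in for labeled patches.
import torch
import torch.nn as nn

# Tiny CNN: two convolution blocks followed by a linear classifier.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 2),  # two classes: interest vs. background
)

# Placeholder training data: 64x64 grayscale patches with binary labels.
patches = torch.randn(256, 1, 64, 64)
labels = torch.randint(0, 2, (256,))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(patches), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

In practice, a model like this would be trained on patches labeled by scientists and then applied to new images, which is the labor-intensive step the newer automated tools aim to eliminate.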

Researchers at PNNL have been applying some of these methods to electron microscopy for about a decade, but until recently the techniques still required human engagement and were labor intensive.

For instance, scientists studying radiation damage to materials used in nuclear reactors or spacecraft typically begin by manually tracing regions of interest on the images captured by the electron microscope. They then hand-label these datasets and use them to train a model to identify similar areas in other images. Besides being time-consuming, human labeling can lead to errors, especially as people review images of the same sample captured by the microscope’s different lenses.

But now, researchers have built upon a PNNL-developed platform that merges features in electron microscopy images, visualizes data in real time and identifies areas of potential interest automatically. It can do in about an hour what previously took researchers upward of 100 hours — and can now identify patterns in materials without human intervention.

These image classification tools offer such value to the research community that PNNL is working with a leading research instrument supplier to incorporate them into their commercial products.

In our labs, researchers already are applying this technique to improve understanding of damage to materials in high radiation environments. While there is no shortage of data, the missing link has been the ability to interpret it in a way that elucidates the degradation process, shows what is happening at material interfaces and informs how to engineer new, better-performing materials.

When the PNNL-developed model analyzes an image of a material from an electron microscope (left), it divides the image into “chips,” which are then sorted into a network graph of “communities” (right) based on the chips’ similarities to one another. This allows automatic classification of shared material properties and regions in the original image (left).

With the help of the deep learning model, the electron microscope image is accurately parsed into small sections that are sorted into groups based on similarity — even when they are not physically next to each other in the actual material. These different groups are then color coded to represent different levels of radiation damage, providing new insights without the researcher having to tell the model what to look for.
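For readers curious about the mechanics, the following sketch in Python illustrates the general chip-and-cluster idea described above: tile an image into chips, compare simple chip features, link similar chips in a graph and group them with community detection. The feature choice, similarity threshold and libraries are illustrative assumptions, not the PNNL implementation.

```python
# Illustrative sketch of chip-based grouping: tile an image into chips,
# link chips with similar statistics, and find "communities" in the graph.
import numpy as np
import networkx as nx
from networkx.algorithms import community

def chip_communities(image, chip=32, threshold=0.9):
    h, w = image.shape
    chips, coords = [], []
    # Cut the image into non-overlapping square chips.
    for y in range(0, h - chip + 1, chip):
        for x in range(0, w - chip + 1, chip):
            block = image[y:y + chip, x:x + chip]
            # Simple per-chip feature vector: an intensity histogram.
            hist, _ = np.histogram(block, bins=16, range=(0.0, 1.0), density=True)
            chips.append(hist)
            coords.append((y, x))

    features = np.array(chips)
    # Cosine similarity between every pair of chip feature vectors.
    unit = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    sim = unit @ unit.T

    # Build a graph: nodes are chips, edges connect similar chips,
    # regardless of where the chips sit in the image.
    graph = nx.Graph()
    graph.add_nodes_from(range(len(coords)))
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if sim[i, j] > threshold:
                graph.add_edge(i, j, weight=float(sim[i, j]))

    # Community detection groups chips that share material characteristics.
    groups = community.greedy_modularity_communities(graph)
    return [(sorted(g), [coords[i] for i in g]) for g in groups]

# Example with a synthetic image standing in for a micrograph.
rng = np.random.default_rng(0)
demo = rng.random((256, 256))
for idx, (members, locations) in enumerate(chip_communities(demo)):
    print(f"community {idx}: {len(members)} chips")
```

Each resulting community can then be color coded and mapped back to the original image, much like the groupings described above, without anyone telling the model in advance what to look for.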

Beyond this work with microscopes, PNNL has much greater aspirations for the power of AI to transform research and innovation. Through our partnership with Microsoft, we will use high-performance computing, cloud computing and artificial intelligence to accelerate scientific discovery in chemistry and materials science. Our journey toward delivering results in hours or days that would have previously taken months or years has just begun, but the prospects are exciting — and the world can’t wait for the results!

Steven Ashby is the director of the Pacific Northwest National Laboratory.