
Tess Smidt, “Atomic Architect” and 2018 Luis Alvarez Fellow

July 26, 2018

By Linda Vu

Tess Smidt is Berkeley Lab's 2018 Luis W. Alvarez Fellow in Computing Sciences.

To non-scientists, Tess Smidt describes herself as an “atomic architect.” And as Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) 2018 Luis W. Alvarez Fellow in Computing Sciences, Smidt is designing a neural network that can automatically generate novel atomic crystal structures.

“This is an immensely hard problem. My job as a theorist is to figure out what is possible, if we can wish the atoms into existence,” says Smidt. “There are various ways that atoms arrange themselves; molecules, proteins, and crystals for semiconductors are just a few examples. And while it is relatively easy to develop algorithms to suggest new small molecule arrangements, it is very difficult to develop an algorithm to do this for crystals and systems with many atoms, like large molecules or proteins. This problem is one of the ‘holy grails’ of chemistry.”

While modern computational science has many tools for predicting what properties a given material may have, or what happens when it is hit with light, Smidt notes that these methods need a starting point: researchers must first tell them where the atoms are located. By showing neural networks many examples of known stable atomic crystal structures, she hopes to teach machines to automatically generate novel stable crystals, which could then be characterized with existing computational physics and chemistry tools.
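To make that starting point concrete, here is a minimal sketch, not Smidt's actual model, of one way a periodic crystal (lattice vectors, atom positions, and species) could be packed into fixed-size arrays that a neural network could consume; the encoding and the padding size are illustrative assumptions:

```python
# Illustrative only: one possible fixed-size encoding of a periodic crystal.
import numpy as np

def encode_crystal(lattice, frac_coords, atomic_numbers, max_atoms=32):
    """Pack a crystal into one flat feature vector.

    lattice        : (3, 3) lattice vectors as rows, in Angstroms
    frac_coords    : (n, 3) fractional coordinates
    atomic_numbers : (n,) integer species labels
    """
    n = len(atomic_numbers)
    coords = np.zeros((max_atoms, 3))
    species = np.zeros(max_atoms)           # 0 = padding, "no atom"
    coords[:n] = np.mod(frac_coords, 1.0)   # wrap positions into the unit cell
    species[:n] = atomic_numbers
    return np.concatenate([lattice.ravel(), coords.ravel(), species])

# Example: the two-atom primitive cell of rock-salt NaCl (a = 5.64 Angstroms)
lattice = 2.82 * np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
x = encode_crystal(lattice,
                   np.array([[0.0, 0.0, 0.0], [0.5, 0.5, 0.5]]),
                   np.array([11, 17]))      # Na, Cl
print(x.shape)  # (137,): 9 lattice + 96 coordinate + 32 species entries
```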

Smidt’s current work is an extension of her graduate research at UC Berkeley, where she worked with Jeff Neaton’s group at Berkeley Lab’s Molecular Foundry, designing materials and calculating their quantum mechanical properties with supercomputers at the National Energy Research Scientific Computing Center (NERSC).

“For my Ph.D. I worked with a lot of experimental collaborators doing calculations on new experimentally realized materials – hot-off-the-press materials – that had interesting properties,” said Smidt. “We wanted to understand how a particular geometric arrangement of atoms affected the properties of a material. It was fascinating that we could look at nuclei and say that it totally makes sense that they’re behaving this way because of their geometries. I was hooked.”

Smidt said her interest in science and physics was ignited when she was a San Diego middle school student. She watched some Star Trek re-runs and read Brian Greene’s The Elegant Universe. From then on, she knew she wanted to go to the Massachusetts Institute of Technology (MIT) and be a particle physicist. Years later, she would attend MIT as an undergraduate, major in physics, and help design neutrino experiments under the mentorship of Professor Janet Conrad. To do this, she taught herself to program and spent summers working at Fermilab on the MicroBooNE experiment and at CERN on the Large Hadron Collider’s (LHC’s) Compact Muon Solenoid (CMS) experiment.

When she arrived at UC Berkeley for graduate school, Smidt decided that she wanted to dig into research that would have a greater impact in the short term, so she began applying her physics expertise to materials problems. But with this new research focus, Smidt found that she missed programming, which led her to join Neaton’s research group. She also decided to take a deep-learning class, the first one offered at UC Berkeley, because she wanted to develop an algorithm for predicting atomic structures, and the modules for doing this didn’t exist yet.

“My interest in computers stems from a passion for articulating ideas as quickly and efficiently as possible. There’s no ambiguity in programming; there’s a clarity of thought that comes with learning to program—if you understand it, you can usually write a program to do it,” said Smidt.

While there are problems that are notoriously difficult to program, she finds those really interesting as well. “This is where my love for deep learning comes in. There were a lot of problems that we thought were really hard, but deep learning showed us that they actually weren’t hard at all; we were just thinking about them the wrong way,” said Smidt. “Deep learning is just ‘fuzzy programming,’ a method for learning features. I can build a program representing the ‘scaffold’ of the problem as I understand it, and deep learning will fill in the parameters by seeing the data and pushing it forward and backward through the network.”
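Her “fuzzy programming” picture maps directly onto a standard training loop: the code fixes the scaffold (the network’s structure), and the forward and backward passes fill in the parameters from data. A minimal PyTorch sketch, with a toy regression task standing in for any real materials problem:

```python
# Toy illustration of "scaffold + learned parameters"; not a materials model.
import torch

model = torch.nn.Sequential(      # the "scaffold": structure we choose by hand
    torch.nn.Linear(4, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(64, 4)            # stand-in inputs
y = x.sum(dim=1, keepdim=True)    # stand-in targets: the rule to be learned

for step in range(200):
    loss = torch.nn.functional.mse_loss(model(x), y)  # forward pass
    opt.zero_grad()
    loss.backward()               # backward pass: gradients for each parameter
    opt.step()                    # parameters are "filled in" from the data
```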

As a graduate student, Smidt also spent a year as an intern on Google’s Accelerated Science Team, where she applied the company’s machine-learning tools and expertise to the problem of predicting atomic structures. Smidt notes that this work really drove home the fact that science’s needs for deep learning are very different from the problems industry is trying to tackle.

“If computer-vision researchers at Google want to know whether there is a cat in a picture, it’s a fairly simple 2D problem, and I can train my neural network with a handful of rotated examples. But if I’m dealing with 3D atomic systems, I want a guarantee that my network recognizes that molecules in different orientations are the same thing. So depending on how fine your angular resolution needs to be, you may need to show the network thousands of rotated examples, and it can be prohibitively expensive to train on that much data,” said Smidt. “Feeding your network this much data also makes it harder to interpret, and you don’t have a guarantee that it actually gets it.”
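The brute-force workaround she describes, duplicating every training example under many random rotations, might look like this sketch (the rotation count and the toy water geometry are illustrative assumptions):

```python
# Rotational data augmentation for a 3D point cloud; illustrative only.
import numpy as np
from scipy.spatial.transform import Rotation

def augment(positions, n_rotations=1000, seed=0):
    """positions: (n_atoms, 3) Cartesian coordinates.
    Returns (n_rotations, n_atoms, 3) randomly rotated copies."""
    R = Rotation.random(n_rotations, random_state=seed).as_matrix()
    return np.einsum('rij,nj->rni', R, positions)  # rotate every atom

water = np.array([[ 0.00, 0.00, 0.0],   # O   (rough geometry, Angstroms)
                  [ 0.96, 0.00, 0.0],   # H
                  [-0.24, 0.93, 0.0]])  # H
print(augment(water).shape)  # (1000, 3, 3): a 1000x blow-up per molecule
```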

But by putting a little more thought into the neural network or program’s “scaffold” at the beginning, researchers can get a lot more out of it, she adds. “Making your network understand rotation and translation symmetry from the get-go allows you to better understand what your network is learning,” says Smidt. “So in the case of the cat, Google’s computer-vision researchers don’t need to understand how the cat is being identified. But in science we have stricter requirements for interpretability; we absolutely need to understand what makes one material more stable than another. This isn’t a lack of rigor on industry’s part; science simply has different goals.”
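One simple way to see the principle she is describing: feed the network only quantities that are already invariant, such as interatomic distances, which do not change when a structure is rotated or translated. This is a deliberately crude sketch of the idea, not the machinery a production model would use:

```python
# Rotation- and translation-invariant features via pairwise distances.
import numpy as np

def invariant_features(positions):
    """Sorted pairwise distances: identical for any rotated or
    translated copy of the same structure."""
    diff = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    i, j = np.triu_indices(len(positions), k=1)
    return np.sort(dists[i, j])

pos = np.random.randn(5, 3)              # an arbitrary 5-atom arrangement
theta = 0.7                              # rotate about z, then translate
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0,              0,             1]])
moved = pos @ Rz.T + np.array([1.0, -2.0, 0.5])
print(np.allclose(invariant_features(pos), invariant_features(moved)))  # True
```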

According to Smidt, one of the most exciting aspects of being an Alvarez Fellow is the ability to spend two years in deep thought, working on a project that she’s always wanted to do. “More than that, I also get to do this work in an environment with people who have all these skills that I don’t have,” she adds. “I will get to learn from them, convince them of my vision, and get them to collaborate with me.”

As a side project, Smidt also hopes to resurrect a site that she originally created in middle school to sell t-shirts to inspire science-interested teenage girls. But this time around, she hopes to use the site to blog about issues in deep learning.


About Berkeley Lab

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.