Physicists in the US have used machine learning to determine the phase diagram of a system of 12 idealized quantum particles to a higher precision than ever before. The work was done by Eun-Ah Kim of Cornell University and colleagues, who say they are probably the first to use machine learning algorithms to uncover “information beyond conventional knowledge” of condensed matter physics.
So far, machine learning has only been used to confirm established condensed matter results in proof-of-principle demonstrations, says Roger Melko of the University of Waterloo in Canada, who was not involved in the work. Melko, for example, has used machine learning to sort various magnetic states of matter that had already been classified. Kim and colleagues, by contrast, have made new predictions about their system’s phases that are unattainable with other methods. “This is an example of machines beating prior work by humans,” says Melko.
Kim’s group studied the physics of 12 idealized electrons interacting according to the Ising model – which describes the interaction between the spins of neighbouring particles. Although their 12-particle model is simplistic compared to real-life materials, this system can just barely be simulated by supercomputers. This is because the complexity of quantum simulations grows exponentially with every additional particle.
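The article does not spell out the exact Hamiltonian the group simulated, but a minimal nearest-neighbour Ising coupling, together with a count of the quantum states involved, gives a sense of why 12 particles is already near the limit:

```latex
% Illustrative only: the fields and disorder terms actually used in the study are not given here.
H = -J \sum_{i=1}^{N-1} \sigma^{z}_{i}\,\sigma^{z}_{i+1},
\qquad
\dim \mathcal{H} = 2^{N} = 4096 \quad \text{for } N = 12 .
```

Doubling the system to 24 particles would multiply the number of basis states by another factor of 4096, which is the exponential growth that puts larger systems out of reach.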
The team was particularly interested in understanding the many-body localization (MBL) phases that can arise in quantum systems. These phases occur when particles are out of equilibrium and behave neither as a collection of non-interacting particles nor as a conventional statistical ensemble. Physicists struggle to describe MBL phases because statistical concepts such as temperature and pressure are ill-defined for them. “They challenge our understanding of quantum statistical mechanics and quantum chaos,” says Kim.
90% classification accuracy
The team taught the machine learning algorithm to draw a phase diagram that includes two different MBL phases and one conventional phase. To do this, they first generated simulated data for configurations of the 12 quantum particles that correspond to known phases. They fed each configuration to a neural network, which classified it as belonging to a particular phase. At this point in the machine-learning process, the researchers told the neural network whether its classification was correct. Given that feedback, the neural network iteratively developed an algorithm, based on matrix multiplication, that could distinguish among the phases. After being trained on 1000 different particle configurations, the network achieved 90% classification accuracy.
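This kind of supervised set-up can be sketched in a few lines of code. The following is a rough illustration only, not the team’s actual code: a classifier with a single hidden layer trained on labelled 12-spin configurations, where the layer width, the learning rate and the random stand-in data are all assumptions made for the example.

```python
# Illustrative sketch only, not the study's code: a one-hidden-layer classifier
# trained with supervised feedback on labelled spin configurations.
import numpy as np

rng = np.random.default_rng(0)

N_SPINS = 12     # 12 idealized particles, as in the article
N_PHASES = 3     # two MBL phases plus one conventional phase
N_TRAIN = 1000   # the article quotes 1000 training configurations
HIDDEN = 32      # hidden-layer width: an arbitrary choice for this sketch

# Stand-in training set: random "configurations" with random phase labels.
# In the study these came from simulations of configurations with known phases.
X = rng.choice([-1.0, 1.0], size=(N_TRAIN, N_SPINS))
y = rng.integers(0, N_PHASES, size=N_TRAIN)

# One hidden layer means the network is essentially two matrix multiplications.
W1 = rng.normal(0.0, 0.1, (N_SPINS, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_PHASES)); b2 = np.zeros(N_PHASES)

def forward(X):
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    logits = h @ W2 + b2                          # output layer
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)    # probability of each phase

lr = 0.1
for step in range(500):
    h, p = forward(X)
    # Supervised feedback: gradient of the cross-entropy loss at the output
    grad = p.copy()
    grad[np.arange(N_TRAIN), y] -= 1.0
    grad /= N_TRAIN
    # Backpropagate through both layers and nudge the weights
    dW2 = h.T @ grad
    db2 = grad.sum(axis=0)
    dh = (grad @ W2.T) * (1.0 - h**2)
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, p = forward(X)
print(f"training accuracy: {(p.argmax(axis=1) == y).mean():.2%}")
# With real labelled configurations the study reports about 90% accuracy.
```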
The next step involved using the neural network to classify particle configurations of unknown phase. By sorting these configurations, the researchers could fill in a phase diagram with boundaries that were more distinct than those of earlier diagrams made using other techniques.
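Continuing the same illustrative sketch (and still not the team’s code), the trained network can then be asked to label configurations it has never seen; in the real study, repeating this across the model’s parameter space is what fills in the phase diagram.

```python
# Classify new, unlabelled configurations with the toy network trained above.
X_new = rng.choice([-1.0, 1.0], size=(200, N_SPINS))     # configurations of unknown phase
_, p_new = forward(X_new)
predicted_phase = p_new.argmax(axis=1)                    # 0, 1 or 2 for the three phases
print(np.bincount(predicted_phase, minlength=N_PHASES))   # how often each phase is assigned
```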
How do they learn?
One important downside of using neural networks to predict new physics is that we do not have a clear understanding of how the systems learn; this is a broad area of current research known as the interpretability problem. Fortunately, Kim’s neural network is relatively simple. Many neural networks, such as those that power speech- and image-recognition algorithms, feed input data through multiple successive rounds of matrix multiplication, called “hidden layers”, before they produce an output. These hidden layers are the most opaque part of the learning process, and Kim’s network has only one of them. Her group is now trying to pick apart exactly what that hidden layer is doing. “It’s possible to look inside a simple, custom-built neural network and figure out how it’s making its decisions,” says Kim.
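Purely as an illustration of what such an inspection might involve, the toy network sketched earlier exposes its single hidden layer as an ordinary weight matrix that can be read off directly; how the Cornell group is actually dissecting its network is not described here.

```python
# Inspect the toy network's single hidden layer (illustrative only).
most_active = np.abs(W1).argmax(axis=0)           # the spin each hidden unit weights most
print("dominant input spin per hidden unit:", most_active)
print("hidden-to-output weights (first 5 units):\n", np.round(W2[:5], 2))
```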
In addition, Kim wants to see if the team can apply a more sophisticated type of machine learning, known as unsupervised learning, to condensed matter problems. Unlike supervised learning, where the algorithm is given the correct answer as feedback, an unsupervised algorithm must find patterns in the data without being told the right answers.
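As a flavour of what that could look like (the article does not say which unsupervised method the group has in mind), a generic clustering algorithm such as k-means can group unlabelled configurations without ever being told their phases:

```python
# Illustrative unsupervised sketch, assuming generic k-means clustering;
# this is not necessarily the approach Kim's group will pursue.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
configs = rng.choice([-1.0, 1.0], size=(1000, 12))    # unlabelled 12-spin configurations

# Ask for three clusters with no phase labels supplied; the hope is that the
# clusters line up with the physical phases.
cluster_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(configs)
print(np.bincount(cluster_labels))                     # how many configurations per cluster
```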
Condensed matter problems are particularly well-suited for machine learning because they involve many interacting particles, and therefore lots of data, says Melko. The field is moving fast, he says. “Just like you pick up your phone and take for granted that Siri works, in a few years I think everyone’s going to take for granted that there’s some integration of AI technology in these very complex quantum experiments,” he says.