
Dynamical mean-field theory

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Name: Dynamical mean-field theory
Field: Condensed matter physics
Developed: 1989
Founders: Walter Metzner, Dieter Vollhardt, Antoine Georges, Gabriel Kotliar, Werner Krauth
Institutions: Centre National de la Recherche Scientifique, Massachusetts Institute of Technology, École Normale Supérieure, Max Planck Institute for Solid State Research

Dynamical mean-field theory is a computational and conceptual approach in condensed matter physics that maps interacting lattice models onto quantum impurity problems embedded in a self-consistent bath, enabling a nonperturbative treatment of local correlations. It bridges model Hamiltonians, numerical many-body techniques, and experiment, using quantum impurity solvers to describe phenomena such as the Mott transition, heavy-fermion behavior, and spectral-weight redistribution. The method has influenced research across institutions and collaborations involving theorists and experimental groups worldwide.
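
The core of the method can be summarized by two coupled relations. In a minimal single-band sketch (standard textbook notation, not drawn from a specific reference: Matsubara frequency i\omega_n, chemical potential \mu, lattice dispersion \varepsilon_{\mathbf{k}}, local self-energy \Sigma), the local lattice Green's function and the Weiss field \mathcal{G}_0 of the auxiliary impurity problem satisfy

\[
G_{\mathrm{loc}}(i\omega_n) = \frac{1}{N} \sum_{\mathbf{k}} \frac{1}{i\omega_n + \mu - \varepsilon_{\mathbf{k}} - \Sigma(i\omega_n)},
\qquad
\mathcal{G}_0^{-1}(i\omega_n) = G_{\mathrm{loc}}^{-1}(i\omega_n) + \Sigma(i\omega_n).
\]

The impurity problem defined by \mathcal{G}_0 is solved for a new self-energy, and the two relations are iterated until self-consistency is reached.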

Introduction

Dynamical mean-field theory emerged from work at institutions such as the Centre National de la Recherche Scientifique, the Massachusetts Institute of Technology, and the École Normale Supérieure, with contributions from researchers associated with programs at the Max Planck Institute for Solid State Research, the Institut Laue–Langevin, and the Kavli Institute. It formalizes a mapping of lattice models such as the Hubbard model, the Anderson lattice model, and multiorbital Hamiltonians onto quantum impurity models solved in a self-consistent medium. The formulation draws on concepts developed in studies supported by the Royal Society, the National Science Foundation, and collaborative projects linked to the European Research Council and various national laboratories. Early milestones are associated with workshops and conferences at institutions such as the Aspen Center for Physics, the International Centre for Theoretical Physics, and the Simons Foundation.

Theoretical Framework

The theoretical framework builds on model Hamiltonians prominent in studies at the Max Planck Society, at American Physical Society meetings, and in Royal Institution seminars. It uses the Green's-function formalism of quantum many-body theory and relies on self-consistency conditions analogous to those of mean-field approximations used historically at laboratories such as Los Alamos National Laboratory and Bell Labs. Central mathematical structures connect to the Hubbard model, the Anderson impurity model, the Kondo model, and the periodic Anderson model, topics often addressed in lectures at Harvard University, Princeton University, and Stanford University. The formulation interfaces with diagrammatic techniques used at institutions such as the Perimeter Institute, CNRS, and the Weizmann Institute, and invokes concepts related to Fermi liquid theory discussed in seminars at ETH Zurich and the University of Cambridge.
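
As a concrete illustration of the lattice-to-impurity mapping, in its standard single-band form rather than the formulation of any particular group named above, the Hubbard Hamiltonian and the effective single-site action onto which it is mapped read

\[
H = -t \sum_{\langle ij \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_i n_{i\uparrow} n_{i\downarrow},
\]
\[
S_{\mathrm{eff}} = - \int_0^{\beta} \! d\tau \int_0^{\beta} \! d\tau' \sum_{\sigma} c^{\dagger}_{\sigma}(\tau) \, \mathcal{G}_0^{-1}(\tau - \tau') \, c_{\sigma}(\tau') + U \int_0^{\beta} \! d\tau \, n_{\uparrow}(\tau) \, n_{\downarrow}(\tau),
\]

where the Weiss field \mathcal{G}_0 plays the role of a frequency-dependent (dynamical) mean field and is fixed by the self-consistency condition quoted in the lead.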

Numerical Methods and Algorithms

Numerical implementations rely on quantum impurity solvers developed and refined across collaborative networks including those at the Flatiron Institute, Forschungszentrum Jülich, and RIKEN. Prominent algorithms include continuous-time quantum Monte Carlo methods associated with groups at the Simons Center, exact diagonalization techniques taught in courses at Columbia University and Yale University, and numerical renormalization group approaches pioneered by researchers connected to the Max Planck Institute for the Physics of Complex Systems. Matrix product state methods and density matrix renormalization group approaches from institutions like the University of California, Berkeley, and the University of Tokyo are integrated into hybrid schemes. Software ecosystems developed in laboratories such as Argonne National Laboratory, Oak Ridge National Laboratory, and Lawrence Berkeley National Laboratory facilitate large-scale computations on supercomputers funded by agencies including the Department of Energy and the European Commission.
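
To make the structure of such implementations concrete, the following is a minimal illustrative sketch of a DMFT self-consistency loop for the Bethe lattice (semicircular density of states, half-bandwidth D) on the Matsubara axis, written in plain NumPy. The impurity solver is a deliberately trivial Hartree-like placeholder so that the loop runs end to end; the parameter values and function names are not taken from any of the software ecosystems mentioned above, and a production calculation would substitute one of the solvers listed there.

import numpy as np

# Model and discretization parameters (illustrative values only)
beta, U, D = 10.0, 2.0, 1.0           # inverse temperature, interaction, half-bandwidth
mu = 0.5 * U                          # chemical potential for half filling
n_iw = 256                            # number of positive Matsubara frequencies
iw = 1j * np.pi * (2 * np.arange(n_iw) + 1) / beta

def impurity_solver(g0_iw, U):
    """Placeholder solver: static Hartree self-energy for the half-filled impurity."""
    sigma = np.full_like(g0_iw, 0.5 * U)      # Sigma = U * <n_sigma> with <n_sigma> = 1/2
    g_iw = 1.0 / (1.0 / g0_iw - sigma)        # impurity Dyson equation
    return g_iw, sigma

g_iw = 1.0 / (iw + mu)                        # initial guess: non-interacting local Green's function
for it in range(200):
    # Bethe-lattice self-consistency: hybridization Delta(iw) = (D/2)^2 * G(iw)
    g0_iw = 1.0 / (iw + mu - (0.5 * D) ** 2 * g_iw)
    g_new, sigma_iw = impurity_solver(g0_iw, U)
    if np.max(np.abs(g_new - g_iw)) < 1e-8:   # converged local Green's function
        break
    g_iw = 0.5 * g_iw + 0.5 * g_new           # linear mixing for stability

print("converged after", it + 1, "iterations")

In an actual calculation the placeholder would be replaced by a continuous-time quantum Monte Carlo, exact diagonalization, or numerical renormalization group solver, and convergence would be monitored on the self-energy as well as on the Green's function.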

Applications in Condensed Matter Physics

Applications span phenomena explored at research centers such as the National High Magnetic Field Laboratory, the European Synchrotron Radiation Facility, and CERN-associated collaborations. The method addresses the Mott metal-insulator transition investigated in studies at the Rutherford Appleton Laboratory and the Paul Scherrer Institute, heavy fermion physics probed by groups at Los Alamos National Laboratory and the University of Geneva, and orbital-selective behavior examined at institutions like the Max Planck Institute for Chemical Physics of Solids. It informs interpretation of experiments at beamlines operated by SLAC National Accelerator Laboratory, Brookhaven National Laboratory, and DESY, and relates to materials studies conducted at the Lawrence Livermore National Laboratory, IBM Research, and Samsung Advanced Institute of Technology.

Extensions and Generalizations

Extensions include cluster methods developed within collaborations involving the University of Illinois Urbana-Champaign, the University of Cambridge Cavendish Laboratory, and MIT’s Research Laboratory of Electronics, embedding spatial correlations via cluster dynamical mean-field approaches. Diagrammatic extensions such as dynamical vertex approximations and dual fermion frameworks have been advanced in research programs at the Max Planck Institute for Solid State Research, the University of Warsaw, and the University of Amsterdam. Embedding schemes combining density functional theory with many-body methods were pursued in projects at the Swiss Federal Institute of Technology (ETH Zurich), Oak Ridge National Laboratory, and Imperial College London. Multiscale approaches linking to techniques popularized at the Kavli Institute and the Simons Foundation aim to connect ab initio electronic structure to low-energy models.
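
Schematically, and as a standard cellular-DMFT textbook form rather than the specific formulation of any collaboration named above, cluster extensions promote the self-energy to a matrix \mathbf{\Sigma}_c on a small cluster of N_c sites and replace the single-site condition by a sum over the reduced Brillouin zone of the superlattice,

\[
\mathbf{G}_c(i\omega_n) = \frac{N_c}{N} \sum_{\tilde{\mathbf{k}}} \left[ (i\omega_n + \mu)\,\mathbf{1} - \mathbf{t}(\tilde{\mathbf{k}}) - \mathbf{\Sigma}_c(i\omega_n) \right]^{-1},
\]

where \mathbf{t}(\tilde{\mathbf{k}}) is the hopping matrix connecting cluster sites on the superlattice; single-site dynamical mean-field theory is recovered for N_c = 1.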

Limitations and Challenges

Known limitations and research challenges are active topics at conferences and workshops hosted by organizations such as the American Physical Society, the European Physical Society, and the International Union of Pure and Applied Physics. Finite-size and finite-temperature effects, emphasized by groups at the University of Chicago and the University of California, Santa Barbara, constrain accuracy in low-dimensional systems. Computational cost challenges affect large-scale deployments on national computing facilities, including those at the National Energy Research Scientific Computing Center and the Leibniz Supercomputing Centre. Open problems addressed in collaborative grants with the European Research Council and national science foundations include the treatment of long-range interactions, nonlocal correlations, and real-time dynamics relevant to experiments at facilities such as SLAC, DESY, and the Max Planck Institutes.

Experimental Connections and Predictions

Predictions and comparisons to experiment are pursued by interdisciplinary teams at synchrotron and neutron facilities such as the European Synchrotron Radiation Facility, the Institut Laue–Langevin, and the Spallation Neutron Source at Oak Ridge National Laboratory. These comparisons inform the interpretation of angle-resolved photoemission spectroscopy, inelastic neutron scattering, and transport measurements undertaken at institutions such as Lawrence Berkeley National Laboratory, Brookhaven National Laboratory, and the Paul Scherrer Institute. Collaborations with materials-synthesis groups at universities including the University of Tokyo, the University of Cambridge, and Tsinghua University test theoretical predictions for transition metal oxides, iron pnictides, and heavy-fermion compounds. Ongoing experimental programs funded by national agencies and foundations continue to refine quantitative comparisons and to motivate methodological advances at the intersection of theory and experiment.

Category:Condensed matter physics