Density functional theory calculations of large systems: Interplay between fragments, observables, and computational complexity
Abstract

In the past decade, developments in computational technology around density functional theory (DFT) calculations have considerably increased the system sizes which can be practically simulated. The advent of robust high-performance computing algorithms which scale linearly with system size has unlocked numerous opportunities for researchers. Computational physicists and chemists can now investigate systems of sizes comparable to those routinely considered by experimentalists, leading to collaborations with a wide range of techniques and communities. This has important consequences for the investigation paradigms which should be applied to reduce the intrinsic complexity of quantum mechanical calculations of many thousands of atoms. It becomes important to consider portions of the full system in the analysis; these portions have to be identified, analyzed, and employed as building blocks from which decomposed physico-chemical observables can be derived. After introducing the state of the art in the large-scale DFT community, we illustrate the emerging research practices in this rapidly expanding field, and the knowledge gaps which need to be bridged to face the stimulating challenge of simulating increasingly realistic systems.

This article is categorized under:
- Electronic Structure Theory > Density Functional Theory
- Software > Simulation Methods
- Structure and Mechanism > Computational Materials Science
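The idea of deriving decomposed observables from fragments can be illustrated with a toy sketch (our own, not taken from the article): a Mulliken-style partition assigns electrons to fragments by summing the diagonal of the product of the density matrix and the overlap matrix over the atomic orbitals belonging to each fragment. The matrices and fragment assignment below are made up for illustration.

```python
import numpy as np

def fragment_populations(D, S, fragments):
    """Electron count per fragment via a Mulliken-style partition.

    D : (n, n) one-particle density matrix in an atomic-orbital basis
    S : (n, n) overlap matrix in the same basis
    fragments : list of lists of orbital indices, one list per fragment
    """
    ds_diag = np.diag(D @ S)  # orbital-resolved contributions to Tr(D S)
    return [float(ds_diag[idx].sum()) for idx in fragments]

# Minimal 4-orbital example: two fragments of two orbitals each
# (hypothetical numbers, orthonormal basis so S is the identity).
D = np.array([[1.0, 0.2, 0.0, 0.0],
              [0.2, 1.0, 0.1, 0.0],
              [0.0, 0.1, 1.0, 0.3],
              [0.0, 0.0, 0.3, 1.0]])
S = np.eye(4)
pops = fragment_populations(D, S, [[0, 1], [2, 3]])

# The fragment populations sum to the total electron count Tr(D S),
# so the decomposition is exact by construction.
assert abs(sum(pops) - np.trace(D @ S)) < 1e-12
```

The key property this sketch demonstrates is that the fragment-resolved quantities sum back to the global observable, which is what makes such decompositions useful building blocks for analyzing large systems.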