On 20th and 21st July 2021 the Virtual Materials Design 2021 CECAM workshop took place online. I was excited about this workshop and the opportunity to get in touch with researchers working on high-throughput computational materials design. While I am not actively working in this field myself, the special demands posed by the sheer number of calculations in high-throughput studies have clearly been one of the main motivations for my work on DFTK, error control and black-box SCF algorithms. In advance of the workshop I therefore applied for a contributed talk to present my work to this community for the first time, which thankfully got accepted.
Due to the virtual format the workshop schedule was unfortunately rather packed, which left little time for discussion during the presentation slots. However, the organisers arranged multiple longer poster sessions in a GatherTown virtual world, which allowed for almost realistic face-to-face discussions. In these GatherTown sessions I talked to a number of scientists who conduct high-throughput studies or who design the large software infrastructures commonly used to run them. Performing millions of individual calculations in a screening study naturally places special demands on the workflow software as well, and I was curious to learn about some of the details.
With my focus on advocating a more mathematical look at screening and DFT simulations I represented a minority viewpoint at the meeting, and I was very curious about the feedback and critique from the more applied scientists on our recently proposed ideas. In general people were indeed quite interested to learn about our work on reliable SCF methods for inhomogeneous systems, but when confronted with our recent perspectives on error estimation, some had doubts whether the required effort is really worth it for DFT simulations. I certainly understand that concern. However, I think one should keep in mind the successes and the potential that error estimation techniques have unlocked in other fields, such as finite-element modelling or aerospace design. In these fields simulation methods have become more efficient thanks to the lessons learned from uncertainty quantification and error estimation, and the nowadays well-established error estimation techniques have furthermore helped to prevent accidents caused by trusting faulty simulation data (such as the collapse of the Sleipner A offshore platform). While clearly not all aspects of macroscopic modelling apply in the microscopic world, it is not hard to imagine that error bars establishing a guaranteed trustworthiness can make screening decisions more robust, thus potentially preventing the costly manufacture of less useful compounds. Furthermore I expect that carefully introducing numerical error (e.g. by lowering the floating-point precision) to balance it against the usually much larger DFT model error will allow for notable computational savings when performing on the order of millions of DFT calculations.
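To give an idea of what lowering the floating-point precision could look like in practice, below is a minimal sketch using DFTK in Julia: the same LDA calculation on silicon is run once in `Float64` and once in `Float32`, simply by casting the lattice, which propagates the reduced precision through the whole computation. The silicon setup, cutoff and tolerances are purely illustrative, and the exact names and signatures (`model_LDA`, `PlaneWaveBasis`, `self_consistent_field`, `ElementPsp`, `load_psp`) may differ between DFTK versions.

```julia
using DFTK

# Illustrative setup: silicon in a diamond lattice (lattice constant in Bohr)
a = 10.26
lattice = a / 2 * [[0 1 1.];
                   [1 0 1.];
                   [1 1 0.]]
Si = ElementPsp(:Si; psp=load_psp("hgh/lda/si-q4"))
atoms     = [Si, Si]
positions = [ones(3)/8, -ones(3)/8]

# Reference LDA calculation in double precision (Float64)
model  = model_LDA(lattice, atoms, positions)
basis  = PlaneWaveBasis(model; Ecut=15, kgrid=[4, 4, 4])
scfres = self_consistent_field(basis)

# Same calculation in single precision: casting the lattice to Float32
# propagates the reduced precision; the SCF tolerance is relaxed accordingly.
model_sp  = model_LDA(Array{Float32}(lattice), atoms, positions)
basis_sp  = PlaneWaveBasis(model_sp; Ecut=15, kgrid=[4, 4, 4])
scfres_sp = self_consistent_field(basis_sp; tol=1e-4)

# The energy difference is the numerical error introduced by the reduced
# precision, to be weighed against the (usually much larger) DFT model error.
println(scfres.energies.total - scfres_sp.energies.total)
```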
Overall I enjoyed the two afternoons of discussions with the high-throughput design community. As usual my slides are attached below.
| Link |
| --- |
| Towards error-controlled, black-box density-functional theory methods (Slides) |