

  1. LARGE SCALE ELECTRONIC STRUCTURE COMPUTATIONS FOR PLASMA DEVICES
     Presenter: Purnima Ghale
     Advisor: Professor Harley T. Johnson
     Department of Mechanical Science and Engineering, University of Illinois, Urbana-Champaign

  2. We use Blue Waters to …
     § Develop large scale electronic structure calculations
     § Investigate nano- and microscale dielectric barrier discharges
     § Technological applications in microcombustion, chemical processing
     [Figure: dielectric barrier discharge, first reported by Siemens, 1857]

  3. Atomistic resolution
     [Figure: atomistic model of the device (metal and dielectric layers with an Argon gap) under an applied field E along z; inset: density of states, DOS [electrons/eV] vs. Energy [eV], for the O-surface, Si-surface, and bulk, with the Fermi level E_F marked]

  4. Of interest:
     § A microscopic understanding of DBD devices
     § So far, qualitative agreement with experiments
     [Figures (Ghale and Johnson, Phys. Rev. B, 2019): current as a function of time (red: rate of electron emission from the dielectric; blue: AC voltage, 4000 V, f = 20 kHz); surface charge on the dielectric under AC voltage; Lissajous plot under AC voltage]

  5. Overview of methods, from most to least expensive:
     § Quantum Monte Carlo (~100 atoms)
     § DFT (~1,000 atoms)
     § Tight-binding (~100,000 atoms)
     § Classical potentials
     § Coarse-grained models
     § Continuum models
     Tight-binding is the least expensive method that can still give us quantum-mechanical information about electrons at the atomistic level (band gaps, transport, charge density).

  6. Possible computations
     § Systems for which classical molecular dynamics has already been done
     § Disordered systems
     § Systems without translational symmetry
     § Glasses and liquids
     [Example systems: carbon quasi-crystal (Ahn et al., Science, 2018); polycrystalline Ni (Swygenhoven, Science, 2002); quenched amorphous Si (Deringer et al., J. Phys. Chem. Lett., 2018); protein, lipid bilayer, water (Zuse Institute Berlin)]

  7. Background
     Many-electron Schrödinger equation:
     $$\Big[\sum_i \Big(-\frac{\nabla_i^2}{2m} + v_{\mathrm{ext}}\Big) + \frac{1}{2}\sum_{i \neq j}\frac{1}{|\mathbf{r}_i - \mathbf{r}_j|}\Big]\,\Psi(\mathbf{r}_1,\dots,\mathbf{r}_N) = E\,\Psi(\mathbf{r}_1,\dots,\mathbf{r}_N)$$
     Mean-field approximation with an effective potential:
     $$\sum_i \Big(-\frac{\nabla_i^2}{2m} + v_{\mathrm{eff},i}\Big)\,\Psi(\mathbf{x}_1,\dots,\mathbf{x}_N) = E\,\Psi(\mathbf{x}_1,\dots,\mathbf{x}_N)$$
     Separation of variables: with $\hat{H} = \hat{h}_1 + \hat{h}_2 + \cdots + \hat{h}_N$ and $\Psi$ written as a product of single-particle states $\phi_i$ (antisymmetrized to respect indistinguishability), the energy separates as $E = \epsilon_1 + \epsilon_2 + \cdots + \epsilon_N$.
     Single-particle eigenvalue problem (a minimal numerical illustration follows the slide):
     $$\hat{h}\,\phi_i(\mathbf{r}_i) = \epsilon_i\,\phi_i(\mathbf{r}_i) \quad\Longrightarrow\quad H\,\vec{\psi}_i = \epsilon_i\,\vec{\psi}_i$$
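To make the single-particle eigenvalue problem concrete, here is a minimal sketch (not from the presentation): a 1D nearest-neighbor tight-binding chain diagonalized with NumPy. The site count, on-site energy, and hopping parameter are hypothetical placeholders.

```python
import numpy as np

# Minimal 1D nearest-neighbor tight-binding chain (illustrative parameters).
n_sites = 100          # number of atomic sites (hypothetical)
eps0 = 0.0             # on-site energy [eV] (hypothetical)
t_hop = -1.0           # nearest-neighbor hopping [eV] (hypothetical)

# Build the single-particle Hamiltonian H in the atomic-orbital basis.
H = np.diag(np.full(n_sites, eps0))
H += np.diag(np.full(n_sites - 1, t_hop), k=1)
H += np.diag(np.full(n_sites - 1, t_hop), k=-1)

# Solve H psi_i = eps_i psi_i for the single-particle states.
eigvals, eigvecs = np.linalg.eigh(H)
print("lowest eigenvalues [eV]:", eigvals[:5])
```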

  8. [Flowchart: input coordinates of atoms {R_i} → computation of the Hamiltonian H(t) = H_SK + V_ext(t), with H_i = H(t) + H_Δ(P) → compute the density matrix P → if ||P_i − P_{i−1}|| ≤ ε, H and P are self-consistent; otherwise iterate → outputs: Energy = Tr[PH], density of states, charge density n(r)]
     § Given a matrix H, compute P where H ψ_i = ε_i ψ_i and P = Σ_i ψ_i ψ_i^T, so that P² = P (a minimal sketch of this step follows the slide)
     § Each rank of P represents an electron
     § Large eigenspace problem
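A minimal sketch of the density-matrix step in the flowchart, assuming a given symmetric Hamiltonian and a fixed number of occupied states at zero temperature; the random test Hamiltonian is a hypothetical stand-in, not the presenter's code.

```python
import numpy as np

def density_matrix(H, n_occupied):
    """Build P = sum_i psi_i psi_i^T over the n_occupied lowest eigenstates of H."""
    eigvals, eigvecs = np.linalg.eigh(H)      # H psi_i = eps_i psi_i
    occ = eigvecs[:, :n_occupied]             # occupied single-particle states
    return occ @ occ.T                        # projector onto the occupied subspace

# Hypothetical example: a random symmetric H standing in for a tight-binding Hamiltonian.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
H = 0.5 * (A + A.T)

P = density_matrix(H, n_occupied=10)
print("idempotency   ||P^2 - P|| =", np.linalg.norm(P @ P - P))   # ~0: P is a projector
print("electron count Tr[P]      =", np.trace(P))                 # = n_occupied
print("band energy    Tr[PH]     =", np.trace(P @ H))             # sum of occupied eps_i
```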

  9. !" #$ Algorithms Red ed een !" #% Green ue !" #& Bl Blue Violet Vi 1 Sparsity § O(N) in terms of FLOPS § Rely on localization of solution matrix § Expect P to be sparse § Sparse matrix-matrix multiplication (SpMM) Density matrix (matrix size = 7500) (Water) Bock and Challacombe SIAM J. Sci. Comput., 2013

  10. Challenge
      § Memory and communication
      § Even for a moderate threshold, the % of non-zeros grows fast
      § Sparse matrix-matrix multiplications (SpMM); a snippet illustrating the threshold/memory tradeoff follows the slide
      [Figure: increase in % of non-zeros in the density matrix vs. number of SP2 iterations, for thresholds τ = 1e-6 to 1e-3 (silica); with the threshold parameter τ = 1e-6, the memory required increases by a factor of 4; Ghale and Johnson, Comput. Phys. Comm., 2018]
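The memory/threshold tradeoff can be illustrated with a small, hypothetical snippet that thresholds a given density matrix P at several values of tau and reports the retained non-zeros; it does not reproduce the plotted silica data.

```python
import numpy as np
import scipy.sparse as sp

def sparsity_vs_threshold(P, thresholds=(1e-3, 1e-4, 1e-5, 1e-6)):
    """Show how the retained fraction of non-zeros in P, and hence the memory
    footprint, grows as the drop threshold tau is tightened."""
    for tau in thresholds:
        P_sparse = sp.csr_matrix(np.where(np.abs(P) < tau, 0.0, P))
        frac = 100.0 * P_sparse.nnz / P.size
        print(f"tau = {tau:.0e}:  {frac:5.1f}% non-zeros,  "
              f"{P_sparse.data.nbytes / 1e6:.2f} MB of stored values")

# Hypothetical usage with the density matrix P from the earlier sketch:
# sparsity_vs_threshold(P)
```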

  11. Our solution
      § Memory-aware
      § Based on sparse matrix-vector multiplications (SpMVs)
      § Construct an implicit solution (P)
      § Sampled via random vectors; a random-vector sampling sketch follows the slide
      [Figure: wall time [s] vs. number of atoms; a simulation of 3.6 million atoms on a single large-memory node takes ~6 hours; Ghale and Johnson, Comput. Phys. Comm., 2018]
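The slide states that P is constructed implicitly and sampled with random vectors. The sketch below shows one standard SpMV-only realization of that idea, a Chebyshev expansion of the Fermi operator combined with a Hutchinson-style random-vector trace estimate; it is an assumption about the family of methods, not the published algorithm. The chemical potential mu, smearing kT, spectral bounds, expansion order, and sample count are all hypothetical.

```python
import numpy as np
import scipy.sparse as sp

def chebyshev_coeffs(f, n_terms):
    """Chebyshev expansion coefficients of f on [-1, 1] (Gauss-Chebyshev quadrature)."""
    theta = np.pi * (np.arange(n_terms) + 0.5) / n_terms
    fvals = f(np.cos(theta))
    c = np.array([2.0 / n_terms * np.sum(fvals * np.cos(k * theta))
                  for k in range(n_terms)])
    c[0] *= 0.5
    return c

def stochastic_traces(H, mu, kT, eps_min, eps_max, n_terms=200, n_samples=20, seed=0):
    """Estimate Tr[P] and Tr[PH] with P = f_FD(H) applied implicitly to random vectors.
    H is a scipy.sparse matrix; only SpMVs are needed and P is never formed explicitly."""
    n = H.shape[0]
    a, b = 0.5 * (eps_max - eps_min), 0.5 * (eps_max + eps_min)    # spectrum -> [-1, 1]
    H_s = (H - b * sp.identity(n, format="csr")) / a
    fermi = lambda x: 1.0 / (1.0 + np.exp((a * x + b - mu) / kT))  # Fermi-Dirac occupation
    c = chebyshev_coeffs(fermi, n_terms)

    rng = np.random.default_rng(seed)
    tr_P, tr_PH = 0.0, 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)            # Rademacher random vector
        t_prev, t_curr = v, H_s @ v                    # T_0 v, T_1 v  (SpMVs only)
        y = c[0] * t_prev + c[1] * t_curr              # accumulates P v
        for k in range(2, n_terms):
            t_prev, t_curr = t_curr, 2.0 * (H_s @ t_curr) - t_prev
            y += c[k] * t_curr
        tr_P += v @ y / n_samples                      # Hutchinson estimate of Tr[P]
        tr_PH += v @ (H @ y) / n_samples               # and of Tr[PH] (band energy)
    return tr_P, tr_PH
```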

  12. Role of Blue Waters
      § Scale of hardware: ~10,000 nodes (some testing)
      § Optimized libraries: availability of fast, distributed, optimized SpMV kernels through PETSc; a minimal petsc4py SpMV sketch follows the slide
      § Tested systems of up to ~10^7 atoms
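PETSc's distributed SpMV kernels are accessible from Python via petsc4py; the sketch below is a hypothetical, minimal example of assembling a distributed sparse matrix (a 1D stencil standing in for a tight-binding Hamiltonian) and applying it to a vector. It is illustrative only, not the project's production setup.

```python
# Run with, e.g.:  mpiexec -n 4 python spmv_petsc.py
from petsc4py import PETSc

n = 1000                                  # global matrix size (hypothetical)
comm = PETSc.COMM_WORLD

# Distributed sparse matrix in AIJ (CSR) format; in production the rows would
# be preallocated from the known tight-binding connectivity.
A = PETSc.Mat().createAIJ([n, n], comm=comm)
A.setUp()
rstart, rend = A.getOwnershipRange()      # rows owned by this MPI rank
for i in range(rstart, rend):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

x = A.createVecRight()                    # distributed vectors matching A's layout
y = A.createVecLeft()
x.set(1.0)

A.mult(x, y)                              # y = A x : the distributed SpMV kernel
PETSc.Sys.Print("||A x|| =", y.norm())
```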

  13. Impact of Blue Waters
      § Scaling up to ~10^7 atoms
      § Time-dependent simulations
      § Essential electron emission data
      § Current allocation: bawi
      [Figure: wall time [s] vs. number of processors (×10^3) for systems of 0.5×10^6, 1.9×10^6, 4.6×10^6, and 9×10^6 atoms; scaling data from the exploratory allocation, baoq]

  14. Future work / other impacts
      ■ Technical impact within the area
        – Ability to solve large systems self-consistently
        – Rates of electron transfer for plasma devices
      ■ Outside materials science
        – Fast, implicit, accurate projection matrices
      ■ Future
        – Time-dependent Hamiltonian (AC voltage)

  15. Acknowledgements
      ■ This material is based [in part] upon work supported by the Department of Energy, National Nuclear Security Administration, under Award Number DE-NA0002374.
      ■ Blue Waters allocation, exploratory support, and ongoing technical support.
      ■ Numerous discussions within XPACC (Center for Exascale Simulation of Plasma-Coupled Combustion)
