DFTB+ Release 19.1

Download

source — Source code of the software with regression tests
executables (x86_64/Linux)

Precompiled executables for x86_64 (64 bit) architecture with Linux operating system.
Use the OMP_NUM_THREADS environment variable to control the number of threads used by the binaries.

Note: the executables only support OpenMP parallelism. Depending on your hardware, you may obtain substantial benefits from compiling with MPI parallelism.
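For example, to pin the precompiled binaries to a fixed thread count (4 here is an illustrative value; choose one that matches your CPU):

```shell
# Limit the OpenMP-parallel DFTB+ binaries to 4 threads.
# The thread count is an example; the dftb+ invocation itself is omitted.
export OMP_NUM_THREADS=4
echo "Using ${OMP_NUM_THREADS} OpenMP threads"
```

Run the binary from the same shell afterwards so the variable is inherited.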

Compilation

See the INSTALL.rst file in the source for compilation instructions.

This release has been successfully compiled and tested by the developers on the following architectures:

Machine | System | Compilers            | MPI                                        | Numerical libraries                                           | Notes
--------|--------|----------------------|--------------------------------------------|---------------------------------------------------------------|------------------
x86_64  | Linux  | Intel Fortran/C 16.0 | MPICH 1.5                                  | MKL 2016, ARPACK96                                            |
x86_64  | Linux  | Intel Fortran/C 18.0 | MPICH 3.2                                  | MKL 2018, ARPACK96                                            |
x86_64  | Linux  | GNU Fortran/C 5.3    | OpenMPI 2.1                                | ScaLAPACK 2.02, LAPACK 3.6.0, OpenBLAS 0.2.20, ARPACK96       | GNU1
x86_64  | Linux  | GNU Fortran/C 8.2    | OpenMPI 3.1                                | ScaLAPACK 2.02, LAPACK 3.6.0, OpenBLAS 0.3.6, ARPACK-NG 3.7.0 | GNU2
x86_64  | Linux  | NAG 6.2 / GCC 5.4    | MPICH 3.2                                  | ScaLAPACK 2.02, LAPACK 3.8.0, OpenBLAS 0.2.20, ARPACK96       |
x86_64  | Linux  | PGI Fortran/C 18.10  | OpenMPI 2.1 (as shipped with the compiler) | PGI ScaLAPACK, PGI LAPACK/BLAS, ARPACK96                      | PGI1, PGI2, PGI3

Notes:

[GNU1] Older GNU compilers (especially versions 4.x) are known to fail to compile this release (due to insufficient implementation of the Fortran 2003 standard).

[GNU2] Some test systems were found to be unreasonably slow when using hybrid parallelisation (MPI + OpenMP) with a binary linked against a threaded OpenBLAS. If you experience similar issues, use either only MPI or only OpenMP parallelisation.
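One way to fall back to MPI-only execution is to disable OpenMP threading before launching the MPI ranks (the launch line below is a commented-out illustration; adapt the rank count and binary path to your setup):

```shell
# Workaround sketch for the GNU2 issue: one OpenMP thread per MPI rank.
export OMP_NUM_THREADS=1
# mpirun -np 4 ./dftb+   # example MPI-only launch (binary not shipped here)
```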

[PGI1] Older PGI compilers (before 17.4) are known to produce incorrectly working binaries (due to an erroneous implementation of Fortran 2003 features).

[PGI2] If you run DFTB+ with threads, make sure the stack size limit is not set to unlimited, as PGI's diagonaliser seems to hang for certain matrices in that case. Setting the stack size explicitly to 8192 (the usual default value) seems to solve the problem.
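The workaround above can be applied with the shell's ulimit builtin before starting a threaded run (8192 kB is the value suggested above; verify it suits your jobs):

```shell
# Cap the soft stack size at 8192 kB instead of leaving it unlimited,
# then print the active limit to confirm the setting took effect.
ulimit -S -s 8192
ulimit -S -s
```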

[PGI3] Newer PGI compilers (e.g. 19.4) produce binaries that segfault for certain systems (likely due to problems in the compiler's OpenMP implementation). Make sure you are using PGI 18.10 instead.

Most relevant changes since release 18.2

Added

  • Non-equilibrium Green's function transport.
  • Onsite corrected Hamiltonian for ground state energies.
  • Ability to perform ground state MD with excitation energies.
  • Faster excited state calculation.
  • Faster MPI-parallelised solvers using the ELSI library.
  • GPU acceleration using the MAGMA library (still experimental).

Changed

  • Updated parser version to 7.

Fixed

  • Orbital-resolved projected eigenstates (shell-resolved ones were correct).

For further changes, see the Change Log file.