# sphinx build folder
_build
_templates
# OS generated files #
######################
pages:
- apt-get -y install dvipng
- pip3 install pygments --upgrade
- pip3 install Sphinx --upgrade
- pip3 install sphinx-rtd-theme --upgrade
- READTHEDOCS=True sphinx-build -nWT -b html . _build/html
- mv _build/html/ public/
- echo -e "\n\n\e[1mYou can find your build of this documentation at \n\t\e[32m${CI_PAGES_URL}\e[0m\n\n"
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
.. sidebar:: General Information

   .. contents:: :depth: 2

   * :ref:`contributing`
   * :ref:`search`
.. _readme_classical_md:

********************
Classical MD Modules
********************
Introduction
============
.. image:: ./images/lipid.jpg
   :width: 15 %
   :align: left
to enumerate -- both academics and industry will benefit greatly from having
software for these methods.
The modules listed here deal with software to perform path sampling methods,
as well as other approaches to rare events.
OpenPathSampling
================
Several modules were developed based on
`OpenPathSampling (OPS) <http://openpathsampling.org>`_. These include modules
that have been incorporated into the core of OPS, as well as some that remain
separate projects. The modules that were incorporated into the core are:
.. toctree::
The modules that are based on OPS, but remain separate, are:
.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/annotated_trajectories/readme
   ./modules/ops_piggybacker/readme
   ./modules/contact_maps/readme
   ./modules/dw_dimer_testsystem/readme
   ./modules/lammps_ops/readme
Nine of these modules were part of
`E-CAM Deliverable 1.2 <https://www.e-cam2020.eu/deliverables/>`_. Those modules
provided improvements and new features in software for trajectory sampling and
for studying the thermodynamics and kinetics of rare events.
Pilot Projects
==============

The following modules were developed specifically for the Classical MD pilot projects:

.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/contact_maps/readme
   ./modules/contact_maps_parallelization/readme
   ./modules/contact_concurrences/readme
   ./modules/PIcore/readme
   ./modules/PIhydration/readme
Extended Software Development Workshops (ESDWs)
===============================================
The first ESDW for the Classical MD workpackage was held in Traunkirchen,
Austria, in November 2016, with a follow-up to be held in Vienna in April 2017.
The following modules have been produced:
.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/OpenPathSampling/ops_maxlikelihood/readme
   ./modules/OpenPathSampling/ops_interface_optimization/readme

The second ESDW for the Classical MD workpackage was held in Leiden, the Netherlands, in
August 2017. The following modules have been produced:
.. toctree::
   :glob:
   :maxdepth: 1

The third ESDW for the Classical MD workpackage was held in Turin, Italy in July.
The following modules have been produced:

.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/pyscal/readme
   ./modules/HTC/decorators/readme
   ./modules/HTC/configuration/readme
   ./modules/HTC/mpi/readme
   ./modules/HTC/easybuild/readme
.. _E-CAM: https://www.e-cam2020.eu/
.. In ReStructured Text (ReST) indentation and spacing are very important (it is how ReST knows what to do with your
   document). For ReST to understand what you intend and to render it correctly, please keep the structure of this
   template. Make sure that any time you use ReST syntax (such as for ".. sidebar::" below), it is preceded
   and followed by white space (if you see warnings when this file is built, this is a common origin of problems).
.. Firstly, let's add technical info as a sidebar and allow text below to wrap around it. This list is a work in
   progress, please help us improve it. We use *definition lists* of ReST_ to make this readable.
.. sidebar:: Software Technical Information

   Name
      ``jobqueue_features``

   Language
      Python, YAML

   Licence
      `MIT <https://opensource.org/licenses/mit-license>`_

   Documentation Tool
      In-source documentation

   Application Documentation
      Not currently available. Example usage provided.

   Relevant Training Material
      Not currently available.

   Software Module Developed by
      Adam Włodarczyk (Wrocław Centre of Networking and Supercomputing),
      Alan O'Cais (Juelich Supercomputing Centre)
.. In the next line you have the name of how this module will be referenced in the main documentation (which you can
   reference, in this case, as ":ref:`example`"). You *MUST* change the reference below from "example" to something
   unique otherwise you will cause cross-referencing errors. The reference must come right before the heading for the
   reference to work (so don't insert a comment between).
.. _htc_yaml:
#################################
HTC Library Configuration in YAML
#################################
.. Let's add a local table of contents to help people navigate the page
.. contents:: :local:
.. Add an abstract for a *general* audience here. Write a few lines that explains the "helicopter view" of why you are
creating this module. For example, you might say that "This module is a stepping stone to incorporating XXXX effects
into YYYY process, which in turn should allow ZZZZ to be simulated. If successful, this could make it possible to
produce compound AAAA while avoiding expensive process BBBB and CCCC."
This module is the second in a sequence that will form the overall capabilities of the library (see :ref:`htc` for the
previous module). This module deals with creating a more comprehensive configuration format for the
`Dask-Jobqueue <https://jobqueue.dask.org/en/latest/>`_ Python library in YAML format.
Purpose of Module
_________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The goal is to allow numerous ``cluster`` instances (a ``cluster`` being a place where tasks are executed) to be
defined more broadly, covering all of the possibilities that the queueing system offers as well as the configurations
that are required to execute MPI/OpenMP tasks.
The implementation is generic but the specific example provided is for SLURM on the
`JURECA <http://www.fz-juelich.de/ias/jsc/EN/Expertise/Supercomputers/JURECA/JURECA_node.html>`_ system.
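Conceptually, the configuration maps named cluster types onto the keyword arguments needed to construct a cluster. The
following is a minimal, hypothetical sketch of that idea in plain Python (the names ``DEFAULTS``, ``CLUSTER_TYPES`` and
``cluster_kwargs`` are invented for illustration; the actual module delegates this merging to Dask's YAML-based
configuration system):

```python
# Hypothetical sketch: merge shared defaults with per-cluster-type settings.
DEFAULTS = {"cores": 24, "memory": "125GiB", "queue": "batch"}

CLUSTER_TYPES = {
    "mpi": {"mpi_mode": True, "launcher": "srun"},  # MPI/OpenMP tasks
    "gpu": {"queue": "gpus"},                       # a GPU partition
}

def cluster_kwargs(cluster_type):
    """Return the merged settings for one named cluster type."""
    merged = dict(DEFAULTS)
    merged.update(CLUSTER_TYPES.get(cluster_type, {}))
    return merged
```

For example, ``cluster_kwargs("gpu")`` keeps the default core count but overrides the queue.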
Background Information
______________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
This module builds upon the work described in :ref:`htc` and the mechanisms already provided by the
`Dask configuration <https://docs.dask.org/en/latest/configuration.html>`_ and
the `Dask-Jobqueue configuration <https://dask-jobqueue.readthedocs.io/en/latest/configuration-setup.html>`_.
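For orientation, Dask-Jobqueue reads YAML of roughly the following shape for a SLURM system (the values here are purely
illustrative and are not the defaults shipped with this module):

```yaml
jobqueue:
  slurm:
    cores: 24              # cores per job
    memory: 125GiB         # memory per job
    queue: batch           # SLURM partition to submit to
    interface: ib0         # network interface for worker communication
    walltime: '00:30:00'
```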
Building and Testing
____________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The library is a Python module and can be installed with

::

  python setup.py install

More details about how to install a Python package can be found at, for example, `Install Python packages on the
research computing systems at IU <https://kb.iu.edu/d/acey>`_.

To run the tests for the decorators within the library, you need the ``pytest`` Python package. You can run all the
relevant tests from the ``jobqueue_features`` directory with

::

  pytest tests/test_cluster.py
Source Code
___________
The latest version of the library is available on the `jobqueue_features GitHub repository
<https://github.com/E-CAM/jobqueue_features>`_.

The code that was originally created specifically for this module can be seen in the
`HTC/Yaml Merge Request <https://gitlab.e-cam2020.eu/adam/jobqueue_features/merge_requests/2>`_, which can be found in
the original private repository of the code.
easyblock = 'PythonBundle'

name = 'jobqueue_features'
version = '0.0.4'
versionsuffix = '-Python-%(pyver)s'

homepage = 'https://github.com/E-CAM/jobqueue_features'
description = """
A Python module that adds features to dask-jobqueue to handle MPI workloads and different clusters.
Examples of usage can be found in the examples folder of the installation ($JOBQUEUE_FEATURES_EXAMPLES)
"""

toolchain = {'name': 'intel-para', 'version': '2018b'}

dependencies = [
    ('Python', '3.6.6'),
    ('Dask', 'Nov2018Bundle', versionsuffix),
]

use_pip = True

exts_list = [
    ('typing', '3.6.6', {
        'source_urls': ['https://pypi.python.org/packages/source/t/typing/'],
        'checksums': ['4027c5f6127a6267a435201981ba156de91ad0d1d98e9ddc2aa173453453492d'],
    }),
    ('pytest-cov', '2.6.0', {
        'source_urls': ['https://pypi.python.org/packages/source/p/pytest-cov/'],
        'checksums': ['e360f048b7dae3f2f2a9a4d067b2dd6b6a015d384d1577c994a43f3f7cbad762'],
    }),
    (name, version, {
        'patches': ['jobqueue_features-%s.patch' % version],
        'source_tmpl': 'v%(version)s.tar.gz',
        'source_urls': ['https://github.com/E-CAM/jobqueue_features/archive/'],
        'checksums': [
            '0152ff89f237225656348865073f73f46bda7a17c97e3bc1de8227eea450fb09',  # v0.0.4.tar.gz
            '698204ef68f5842c82c5f04bfb614335254fae293f00ca65719559582c1fb181',  # jobqueue_features-env.patch
        ],
    }),
]

postinstallcmds = [
    'cp -r %(builddir)s/%(name)s/%(name)s-%(version)s/examples %(installdir)s/examples',
    'mkdir %(installdir)s/config && cp %(builddir)s/%(name)s/%(name)s-%(version)s/%(name)s/%(name)s.yaml %(installdir)s/config',
]

modextravars = {
    'DASK_ROOT_CONFIG': '%(installdir)s/config',
    'JOBQUEUE_FEATURES_EXAMPLES': '%(installdir)s/examples',
}

sanity_check_paths = {
    'files': ['config/jobqueue_features.yaml'],
    'dirs': ['lib/python%(pyshortver)s/site-packages', 'examples', 'config'],
}

moduleclass = 'devel'
.. In ReStructured Text (ReST) indentation and spacing are very important (it is how ReST knows what to do with your
   document). For ReST to understand what you intend and to render it correctly, please keep the structure of this
   template. Make sure that any time you use ReST syntax (such as for ".. sidebar::" below), it is preceded
   and followed by white space (if you see warnings when this file is built, this is a common origin of problems).
.. Firstly, let's add technical info as a sidebar and allow text below to wrap around it. This list is a work in
   progress, please help us improve it. We use *definition lists* of ReST_ to make this readable.
.. sidebar:: Software Technical Information

   Name
      ``jobqueue_features``

   Language
      Python, YAML

   Licence
      `MIT <https://opensource.org/licenses/mit-license>`_

   Documentation Tool
      In-source documentation

   Application Documentation
      Not currently available. Example usage provided.

   Relevant Training Material
      Not currently available.

   Software Module Developed by
      Adam Włodarczyk (Wrocław Centre of Networking and Supercomputing),
      Alan O'Cais (Juelich Supercomputing Centre)
.. In the next line you have the name of how this module will be referenced in the main documentation (which you can
   reference, in this case, as ":ref:`example`"). You *MUST* change the reference below from "example" to something
   unique otherwise you will cause cross-referencing errors. The reference must come right before the heading for the
   reference to work (so don't insert a comment between).
.. _htc_eb:
###############################
Adding HTC Library to EasyBuild
###############################
.. Let's add a local table of contents to help people navigate the page
.. contents:: :local:
.. Add an abstract for a *general* audience here. Write a few lines that explains the "helicopter view" of why you are
creating this module. For example, you might say that "This module is a stepping stone to incorporating XXXX effects
into YYYY process, which in turn should allow ZZZZ to be simulated. If successful, this could make it possible to
produce compound AAAA while avoiding expensive process BBBB and CCCC."
This module is the fourth in a sequence that will form the overall capabilities of the HTC library (see :ref:`htc_mpi`
for the previous module). This module deals with installing the software on HPC systems in a coherent manner through the
tool `EasyBuild <https://easybuild.readthedocs.io/en/latest/>`_.
Purpose of Module
_________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The HTC library requires configuration for the target system. Typically, this configuration is applicable system-wide.
If the software is provided in the main software stack of the system, this configuration can also be provided centrally.
The goal of the integration with EasyBuild is to highlight how this configuration can be made with an explicit example
of the configuration for the
`JURECA <http://www.fz-juelich.de/ias/jsc/EN/Expertise/Supercomputers/JURECA/JURECA_node.html>`_ system.
Background Information
______________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
EasyBuild is a software build and installation framework that allows you to manage (scientific) software on High
Performance Computing (HPC) systems in an efficient way. Full details can be found in the
`EasyBuild documentation <https://easybuild.readthedocs.io/en/latest/>`_.
EasyBuild already has support for Python packages; what we describe here is the specific configuration required to
install a particular version of the library within a specific software stack on JURECA.
Building and Testing
____________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
Building the software requires EasyBuild (see the
`installation instructions for EasyBuild <https://easybuild.readthedocs.io/en/latest/Installation.html>`_); the
build command is:

::

  eb jobqueue_features-0.0.4-intel-para-2018b-Python-3.6.6.eb
However, please note that this will only work "out of the box" for those with software installation rights on the JURECA
system. The provided sources (as described below) are intended as templates for those who are familiar with EasyBuild to
adapt to their own system (the only expected adaptation would be to change the ``toolchain`` to suit that system).
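For example, on a site that uses the common ``foss`` toolchain, that single change would look like the following
(illustrative only; the toolchain name and version must match what is actually available on your site):

```python
# Illustrative adaptation: replace the intel-para toolchain with one
# available on the target site before building the easyconfig.
toolchain = {'name': 'foss', 'version': '2018b'}
```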
Source Code
___________
The latest version of the library itself is available on the `jobqueue_features GitHub repository
<https://github.com/E-CAM/jobqueue_features>`_.
There is an open `Pull Request for the JURECA software stack <https://github.com/easybuilders/JSC/pull/6>`_ that
provides all necessary dependencies for the library.
The
:download:`configuration file required for the library on JURECA <./jobqueue_features-0.0.4-intel-para-2018b-Python-3.6.6.eb>`
is included below (a version for Python 2 can also be created by simply changing the Python dependency version):

.. literalinclude:: ./jobqueue_features-0.0.4-intel-para-2018b-Python-3.6.6.eb
   :language: python
   template. Make sure that any time you use ReST syntax (such as for ".. sidebar::" below), it is preceded
   and followed by white space (if you see warnings when this file is built, this is a common origin of problems).
.. We allow the template to be standalone, so that the library maintainers add it in the right place
.. Firstly, let's add technical info as a sidebar and allow text below to wrap around it. This list is a work in
   progress, please help us improve it. We use *definition lists* of ReST_ to make this readable.
.. sidebar:: Software Technical Information

   Name
      ``jobqueue_features``

   Language
      Python, YAML

   Licence
      `MIT <https://opensource.org/licenses/mit-license>`_

   Documentation Tool
      In-source documentation

   Application Documentation
      Not currently available. Example usage provided.

   Relevant Training Material
      Not currently available.

   Software Module Developed by
      Adam Włodarczyk (Wrocław Centre of Networking and Supercomputing),
      Alan O'Cais (Juelich Supercomputing Centre)
.. In the next line you have the name of how this module will be referenced in the main documentation (which you can
   reference, in this case, as ":ref:`example`"). You *MUST* change the reference below from "example" to something
   unique otherwise you will cause cross-referencing errors. The reference must come right before the heading for the
   reference to work (so don't insert a comment between).
.. _htc_mpi:
####################
HTC Multi-node Tasks
####################
.. Let's add a local table of contents to help people navigate the page
.. Add an abstract for a *general* audience here. Write a few lines that explains the "helicopter view" of why you are
   creating this module. For example, you might say that "This module is a stepping stone to incorporating XXXX effects
   into YYYY process, which in turn should allow ZZZZ to be simulated. If successful, this could make it possible to
   produce compound AAAA while avoiding expensive process BBBB and CCCC."
This module is the third in a sequence that will form the overall capabilities of the HTC library (see :ref:`htc_yaml`
for the previous module). This module deals with enabling tasks to be run over a set of nodes (specifically MPI/OpenMP
tasks).
Purpose of Module
_________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The initial goal is to allow the HTC library to control tasks that are executed via the MPI launcher command. The task
tracked by Dask is actually the process created by the launcher. The launcher is a forked process from within the
library.

The implementation is intended to be generic, but the specific example implementation provided is for the ``srun``
launcher that is used on the
`JURECA <http://www.fz-juelich.de/ias/jsc/EN/Expertise/Supercomputers/JURECA/JURECA_node.html>`_ system.
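The wrapping idea can be illustrated with a short, self-contained sketch (this is not the library's actual API; the
function names and the fixed ``--ntasks`` flag are invented for illustration):

```python
import subprocess

def build_launch_command(executable, args, ntasks, launcher="srun"):
    """Compose the command line that hands the task to the MPI launcher."""
    return [launcher, "--ntasks={}".format(ntasks), executable] + list(args)

def launch(executable, args, ntasks):
    """Fork the launcher; the process tracked by Dask is this forked process."""
    cmd = build_launch_command(executable, args, ntasks)
    return subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
```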
Background Information
______________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The utilisation of Dask within the project came about as a result of the `E-CAM High Throughput Computing ESDW <https://www.e-cam2020.eu/event/4424/?instance_id=71>`_ held in Turin in 2018 and 2019.
This module builds upon the work described in :ref:`htc_yaml`.
Building and Testing
____________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
The library is a Python module and can be installed with

::

  python setup.py install
More details about how to install a Python package can be found at, for example, `Install Python packages on the
research computing systems at IU <https://kb.iu.edu/d/acey>`_.

To run the tests for the MPI wrapper within the library, you need the ``pytest`` Python package. You can run all the
relevant tests from the ``jobqueue_features`` directory with

::

  pytest tests/test_mpi_wrapper.py
Specific examples of usage for the JURECA system are available in the ``examples`` subdirectory.
Source Code
___________
The latest version of the library is available on the `jobqueue_features GitHub repository
<https://github.com/E-CAM/jobqueue_features>`_.
The code that was originally created specifically for this module can be seen in the
`HTC/MPI Merge Request <https://gitlab.e-cam2020.eu/adam/jobqueue_features/merge_requests/5>`_, which can be found in
the original private repository of the code. Additional, more complex, examples were provided in the
`HTC/MPI examples Merge Request <https://gitlab.e-cam2020.eu/adam/jobqueue_features/merge_requests/7>`_.
.. _Particle_Insertion_core:
#######################
Particle Insertion Core
#######################
.. sidebar:: Software Technical Information

   This is the core module for the particle insertion suite of codes.

   Languages
      C, Python 2.7, LAMMPS scripting language

   Licence
      MIT; however, note that LAMMPS is now changing from GPL to LGPL, so when used together with LAMMPS the LGPL
      applies

   Documentation Tool
      All source code should be documented so please indicate what tool has been used for documentation. We can help
      you with Doxygen and ReST but if you use other tools it might be harder for us to help if there are problems.

   Application Documentation
      See the `PIcore repository <https://gitlab.e-cam2020.eu/mackernan/particle_insertion/tree/master/PIcore>`_

   Relevant Training Material
      None
.. contents:: :local:
.. Add technical info as a sidebar and allow text below to wrap around it
Purpose of the Module
_____________________
This software module computes the change in free energy associated with the insertion or deletion of Lennard-Jones
particles, in dilute or dense conditions, in a variety of thermodynamic ensembles, where statistical sampling through
molecular dynamics is performed under `LAMMPS <https://lammps.sandia.gov/>`_ (support for other molecular dynamics
engines will be added at a later date). Lennard-Jones type interactions are the key source of difficulty associated
with particle insertion or deletion, which is why this module is a core module; other interactions, including
Coulombic, bond, angle and dihedral interactions, will be added in a second module. It differs from the main community
approach used to date to compute such changes in that it does not use soft-core potentials. Its key advantages over
soft-core potentials are: (a) electrostatic interactions can in principle be performed simultaneously with particle
insertion (this and other functionalities will be added in a new module); and (b) essentially exact long-range
dispersive interactions using `dispersion Particle Mesh Ewald <https://doi.org/10.1063/1.4764089>`_ (PMME) or Ewald
summation can be selected at runtime by the user.
Background Information
______________________
Particle insertion can be used to compute the free energy associated with hydration/drying, the insertion of cavities
in fluids/crystals, changes in salt levels, changes in solvent mixtures, and alchemical changes such as the mutation
of amino acids in crystals. It can also be used to compute the free energy of solvent mixtures and the addition of
salts, which is used industrially in purification processes, for instance in the purification of active pharmaceutical
ingredients. Particle insertion can in principle also be used to compute the free energy associated with changes in
pH, that is, the proton transfer from a titratable site to the bulk, for example in water.
Our approach consists of rescaling the effective size of inserted atoms through a parameter :math:`\lambda` so that
all interactions between inserted atoms, and between inserted atoms and atoms already present in the system, are zero
when :math:`\lambda = 0`, creating at most an integrable singularity which we can safely handle. In the context of
Lennard-Jones type pair potentials, our approach at a mathematical level is similar to that of Simonson, who
investigated the mathematical conditions required to `avoid the singularity of insertion
<https://doi.org/10.1080/00268979300102371>`_. It turns out that a non-linear dependence of the interaction on
:math:`\lambda` between inserted atoms and those already present is required (i.e. a simple linear dependence on
:math:`\lambda` necessarily introduces a singularity).
This module and upcoming modules include computing the free energy changes associated with the following applications:

(a) hydration and drying;
(b) the addition of multiple molecules into a condensed environment;
(c) residue mutation and alchemy;
(d) constant pH simulations, which will also exploit modules created in E-CAM work package 3 (quantum dynamics); and
(e) free energy changes in chemical potentials associated with changes in solvent mixtures.
General Formulation
___________________
Consider a system consisting of :math:`N+M` degrees of freedom and the Hamiltonian
.. math::

   H(r,p,\lambda) = H_0 + KE_{insert} + \Delta V(r, \lambda)
where :math:`H_0` corresponds to an unperturbed Hamiltonian, and the perturbation :math:`\Delta V(r, \lambda)` depends
nonlinearly on a control parameter :math:`\lambda`. The first set of :math:`N` degrees of freedom is denoted by A and
the second set of :math:`M` degrees of freedom is denoted by B. To explore equilibrium properties of the system,
thermostats and barostats are used to sample either the NVT (canonical) ensemble or the NPT (Gibbs) ensemble. The
perturbation is devised so that when :math:`\lambda = 0`, :math:`\Delta V(r, \lambda) = 0` and B is purely virtual.
When :math:`\lambda = 1`, B corresponds to a fully physical augmentation of the original system.
In the present software module, we consider only interacting Lennard-Jones atoms:

.. math::

   \Delta V(r,\lambda) = V_{lj}(r,\lambda)

where, for each inserted atom :math:`i`,

.. math::

   \hat{\sigma}(\lambda)_i &= \lambda \sigma_i \\
   \hat{\epsilon}(\lambda)_i &= \lambda \epsilon_i
and the mixing rule for Van der Waals diameters and binding energies between different atoms uses the geometric mean.
The dependence of :math:`\sigma` on :math:`\lambda` has the consequence that the mean :math:`\sigma` between a pair of
inserted atoms scales as :math:`\lambda`, but scales as :math:`\sqrt{\lambda}` when one atom in the pair is inserted
and the other is already present. These choices of perturbation guarantee that the particle insertion and deletion
catastrophes are avoided.
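The scaling argument can be checked in a few lines of Python (the numerical values of :math:`\sigma` and
:math:`\lambda` below are arbitrary, chosen only to illustrate the two scaling regimes):

```python
import math

def mixed_sigma(sigma_i, sigma_j):
    """Geometric-mean mixing rule for the Lennard-Jones diameter."""
    return math.sqrt(sigma_i * sigma_j)

def scaled_sigma(sigma, lam):
    """Effective diameter of an inserted atom at coupling lambda."""
    return lam * sigma

lam, sigma = 0.25, 3.4  # arbitrary illustrative values

# Two inserted atoms: the mixed sigma scales as lambda.
pair_inserted = mixed_sigma(scaled_sigma(sigma, lam), scaled_sigma(sigma, lam))

# One inserted atom, one already present: the mixed sigma scales as sqrt(lambda).
pair_mixed = mixed_sigma(scaled_sigma(sigma, lam), sigma)
```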
Algorithms
__________
At the core of the PIcore module there are four functions/codes. The first, written in Python, generates the
interpolation points, which are the zeros of suitably transformed Chebyshev functions.

The second code, written in the LAMMPS scripting language, performs the simulation in user-defined ensembles at the
selected interpolation values of :math:`\lambda`, at a user-specified frequency computing two-point central difference
estimates of the derivatives of the potential energy needed for thermodynamic integration, and computing the energy
functions for all values of :math:`\lambda` in the context of MBAR. The user also specifies the locations of the
inserted particles, and whether Particle Mesh Ewald or Ewald summation should be used for dispersive interactions.

The third code, written in Python, takes the output data from LAMMPS and prepares it so that free energy differences
in the selected ensemble can be computed using MBAR, as provided by the pymbar suite of Python codes of the Chodera
group. The fourth code, also written in Python, takes the LAMMPS output and performs the thermodynamic integration.
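The first and fourth steps can be sketched generically (the exact transformation of the Chebyshev zeros used by PIcore
is defined in the repository; the formulas below are only the textbook versions):

```python
import math

def chebyshev_nodes(n):
    """Zeros of the Chebyshev polynomial T_n, mapped from (-1, 1) onto (0, 1)."""
    return [0.5 * (1.0 + math.cos((2 * k + 1) * math.pi / (2 * n)))
            for k in range(n)]

def central_difference(f, x, h=1.0e-4):
    """Two-point central-difference estimate of df/dx, of the kind used for
    the dV/dlambda derivatives in thermodynamic integration."""
    return (f(x + h) - f(x - h)) / (2.0 * h)
```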
.. image:: ./flowchart1.png
Source Code
___________
All files can be found in the ``PIcore`` subdirectory of the `particle_insertion git repository <https://gitlab.e-cam2020.eu/mackernan/particle_insertion>`_.
Compilation and Linking
_______________________
See `PIcore README <https://gitlab.e-cam2020.eu/mackernan/particle_insertion/tree/master/PIcore/README.rst>`_ for full details.
Scaling and Performance
________________________
As the module uses LAMMPS, the performance and scaling of this module should essentially be the same, provided data
for thermodynamic integration and MBAR are not generated too often, as is demonstrated below. In the case of
thermodynamic integration, this is due to the central difference approximation of derivatives, and in the case of
MBAR, it is due to the fact that many virtual moves are made, which can be extremely costly if the number of
interpolating points is large. Also, when using PMME, the initial setup cost is computationally expensive and should
therefore be done as infrequently as possible. A future module in preparation will circumvent the use of central
difference approximations of derivatives. The scaling performance of PIcore was tested on JURECA across multiple
nodes. The results for weak scaling (where the number of cores and the system size are doubled from 4 to 768 cores)
are as follows.
Weak Scaling:

================== ===========
Number of MPI Core timesteps/s
================== ===========
4 1664.793
8 1534.013
16 1458.936
24 1454.075
48 1350.257
96 1301.325
192 1263.402
384 1212.539
768 1108.306
================== ===========
and for strong scaling (where the number of cores is doubled from 4 to 384 while the system size is fixed at 768 times the original system
size considered for one core/processor in the weak scaling runs):

Strong Scaling:

================== =============
Number of MPI Core timesteps/s
================== =============
4 9.197
8 17.447
16 34.641
24 53.345
48 104.504
96 204.434
192 369.178
384 634.022
================== =============
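The parallel efficiency implied by these tables can be computed directly; the numbers below are copied from the tables above.

```python
# timesteps/s from the strong-scaling table (4 to 384 cores)
cores = [4, 8, 16, 24, 48, 96, 192, 384]
rate = [9.197, 17.447, 34.641, 53.345, 104.504, 204.434, 369.178, 634.022]

# Speedup relative to the 4-core baseline, and efficiency versus ideal scaling.
speedup = [r / rate[0] for r in rate]
efficiency = [s / (n / cores[0]) for s, n in zip(speedup, cores)]

# Weak-scaling efficiency: 768-core rate over the 4-core rate (first table).
weak_efficiency = 1108.306 / 1664.793
```

This gives roughly 72% strong-scaling efficiency at 384 cores and 67% weak-scaling efficiency at 768 cores, consistent with the claim that the module inherits LAMMPS's scaling behaviour.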
.. _Particle_Insertion_hydration:
############################
Particle Insertion Hydration
############################
.. sidebar:: Software Technical Information
This module is part of the particle insertion suite of codes
Languages
C, Python 2.7, LAMMPS Scripting language
Licence
MIT - however, note that LAMMPS is GPL, so when used together GPL applies
Documentation Tool
All source code should be documented so please indicate what tool has been used for documentation. We can help you
with Doxygen and ReST but if you use other tools it might be harder for us to help if there are problems.
Application Documentation
See `PIhydration README file <https://gitlab.e-cam2020.eu/mackernan/particle_insertion/tree/master/PIhydration>`_
Relevant Training Material
Add a link to any relevant training material.
.. contents:: :local:
.. Add technical info as a sidebar and allow text below to wrap around it
Purpose of the Module
_____________________
This software module computes the change in free energy associated with the insertion or deletion of water in dilute or dense conditions in a variety of thermodynamic ensembles, where statistical sampling through molecular dynamics is performed under `LAMMPS <https://lammps.sandia.gov/>`_ (support for other molecular dynamics engines will be added at a later date). It builds on the PIcore module by adding electrostatic, bond, and angle
:math:`\lambda`-dependent interactions, including SHAKE, to the Lennard-Jones interactions that were dealt with in PIcore. It differs from the main community approach used to date to compute such changes in that it does not use soft-core potentials. Its key advantages over soft-core potentials are: (a) electrostatic interactions
can in principle be performed simultaneously
with particle insertion (this and other functionalities will be added in a new module); and (b) essentially exact long-range dispersive interactions,
using `dispersion Particle Mesh Ewald <https://doi.org/10.1063/1.4764089>`_ (PMME) or Ewald summation, can be selected at runtime by the user.
Background Information
______________________
Particle insertion can be used to compute the free energy associated with hydration/drying, the insertion of cavities in fluids/crystals,
changes in salt levels, changes in solvent mixtures, and alchemical changes such as the mutation of amino acids in crystals. It can also be used to compute the free energy of solvent mixtures and the addition of salts, which is used industrially in purification processing, for instance in the purification of active pharmaceutical ingredients. Particle insertion can in principle also be used to compute the free energy associated with changes in pH, that is, the proton transfer from a titratable site to the bulk,
for example in water.

Our approach consists of rescaling the electrostatic charges of inserted atoms so that they converge to zero faster than the inserted Van der Waals
atoms, where the latter use the geometric mean for Lennard-Jones diameters and binding energies; bond, angle, and dihedral spring constants (and, where
necessary, also bond lengths) scale to zero in the same fashion. The effective size of inserted atoms is controlled through a parameter :math:`\lambda`, so that all interactions between inserted atoms, and between inserted atoms and atoms already present in the system, are zero when :math:`\lambda = 0`, creating at most an integrable singularity which we can safely handle. In the context of Lennard-Jones type pair potentials,
our approach at a mathematical level is similar to that of Simonson, who investigated the mathematical conditions required to `avoid the
singularity of insertion <https://doi.org/10.1080/00268979300102371>`_. It turns out that a non-linear dependence of the interaction on :math:`\lambda` between inserted
atoms and those already present is required (i.e. a simple linear dependence on :math:`\lambda` necessarily introduces a singularity).
The applications of this module in upcoming modules include computing the free energy changes associated with:

::

    (a) hydration and drying;
    (b) the addition of multiple molecules into a condensed environment;
    (c) residue mutation and alchemy;
    (d) constant pH simulations (this will also exploit modules created in E-CAM work package 3,
        quantum dynamics); and,
    (e) free energy changes in chemical potentials associated with changes in solvent mixtures.
General Formulation
___________________
Consider a system consisting of :math:`N+M` degrees of freedom and the Hamiltonian
.. math::
   H(r,p,\lambda) = H_0 + KE_{insert} + \Delta V(r, \lambda)
where :math:`H_0` corresponds to an unperturbed Hamiltonian, and the perturbation :math:`\Delta V(r, \lambda)` depends nonlinearly on a control parameter :math:`\lambda`. The first set of N degrees of freedom is denoted by A and the second set of M degrees of freedom is denoted by B. To explore equilibrium properties of the system, thermostats and barostats are used to sample either the NVT (canonical) ensemble or the NPT (Gibbs) ensemble. The perturbation is devised so that
when :math:`\lambda = 0`, :math:`\Delta V(r, \lambda) = 0` and B is purely virtual. When :math:`\lambda = 1`, B
corresponds to a fully physical augmentation of the original system.
In the present software module, we include in the perturbation interaction Lennard-Jones potentials, harmonic bond and angle interactions, and
electrostatic interactions:
.. math::
\Delta V(r,\lambda) = V_{lj}(r,\lambda) + V_{b}(r,\lambda) + V_{a}(r,\lambda) + V_{el}(r,\lambda).
where for each inserted atom i
.. math::
   \hat{\sigma}(\lambda)_i &= \lambda \sigma_i \\
   \hat{\epsilon}(\lambda)_i &= \lambda \epsilon_i \\
   \hat{q}(\lambda)_i &= \lambda^p q_i
and the mixing rule for Van der Waals diameters and binding energies between different atoms uses the geometric mean for atom pairs where one or both of the atoms is inserted, but retains the original mixing rule for atoms already present. The dependence of
:math:`\sigma` on :math:`\lambda` has the consequence that the mean
:math:`\sigma` between a pair of inserted atoms scales as :math:`\lambda`, but scales as :math:`\sqrt{\lambda}` when one atom in the pair is
inserted and the other is already present. The dependence of :math:`\epsilon` on :math:`\lambda` ensures that forces behave regularly when
:math:`\lambda` is very small. These choices of perturbation guarantee that the particle insertion and deletion catastrophes are avoided.
Regarding electrostatic interactions, the exponent p allows the electrostatic interactions to converge to zero faster than the effective diameters of the corresponding Lennard-Jones atoms, so as to ensure divergences are avoided. Currently p = 1.5. The spring constants for harmonic, angular and torsional interactions involving inserted atoms are currently simply multiplied by :math:`\lambda`. It is also possible to replace
bond, angle and torsional interactions involving only inserted atoms with SHAKE constraints. In such cases, the SHAKE constraints are continuously on. For cases where arithmetic sum rules apply to the original system, an additional :math:`\lambda`-based perturbation stage can be applied to transform the geometric-mean mixing rules for Lennard-Jones interactions into the arithmetic-mean rules governing interactions between inserted atoms, or between inserted atoms and original atoms.
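The scaling rules above can be sketched compactly. The helper functions below are illustrative only and not part of the module's API; the exponent ``P = 1.5`` and the geometric-mean mixing for pairs involving inserted atoms follow the text.

```python
import numpy as np

P = 1.5  # charge-rescaling exponent quoted in the text

def scaled_sigma(sigma, lam, inserted):
    """sigma-hat(lambda) = lambda * sigma for inserted atoms, else unchanged."""
    return lam * sigma if inserted else sigma

def pair_sigma(sigma_i, sigma_j, lam, i_inserted, j_inserted):
    """Geometric-mean mixing for any pair involving an inserted atom."""
    return np.sqrt(scaled_sigma(sigma_i, lam, i_inserted) *
                   scaled_sigma(sigma_j, lam, j_inserted))

def scaled_charge(q, lam):
    """q-hat(lambda) = lambda**P * q, so charges vanish faster than diameters."""
    return lam ** P * q
```

With :math:`\lambda = 0.25`, a pair of inserted atoms has its mixed diameter reduced by a factor 0.25, while an inserted/pre-existing pair is reduced by :math:`\sqrt{0.25} = 0.5`, reproducing the :math:`\lambda` versus :math:`\sqrt{\lambda}` behaviour noted above.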
Algorithms
__________
At the core of the PIcore module there are four functions/codes. The first, written in Python, generates the interpolation points, which are
the zeros of suitably transformed Chebyshev functions.
The second, written in the LAMMPS scripting language, performs the simulation in user-defined ensembles at the selected
interpolation values of :math:`\lambda`, at a user-specified frequency, computing two-point central difference estimates of derivatives of the
potential energy needed for thermodynamic integration, and computing the energy
functions for all values of :math:`\lambda` needed in the context of MBAR. The user also specifies the locations of the inserted particles,
and whether Particle Mesh Ewald or standard Ewald summation should be used for dispersive interactions.
The third code, written in Python, takes the output data from LAMMPS and prepares it so that free energy differences in the selected ensemble can be computed using MBAR, as provided by the pymbar suite of Python codes from the Chodera group.
The fourth code, also written in Python, takes the LAMMPS output and performs the thermodynamic integration.
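The first code's interpolation points can be generated in a few lines of numpy. The exact transformation used by the module is not spelled out here, so the affine map below from (-1, 1) onto (0, 1) is an assumption for illustration:

```python
import numpy as np

def chebyshev_lambda_nodes(n):
    """Zeros of the Chebyshev polynomial T_n, mapped from (-1, 1) onto (0, 1)."""
    k = np.arange(1, n + 1)
    x = np.cos((2 * k - 1) * np.pi / (2 * n))  # zeros of T_n
    return np.sort((1.0 + x) / 2.0)            # affine map onto (0, 1)
```

The nodes cluster near :math:`\lambda = 0` and :math:`\lambda = 1`, which is where the integrand of thermodynamic integration typically varies fastest.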
Source Code
___________
All files can be found in the ``PIhydration`` subdirectory of the `particle_insertion git repository <https://gitlab.e-cam2020.eu/mackernan/particle_insertion>`_.
Compilation and Linking
_______________________
See `PIhydration README <https://gitlab.e-cam2020.eu/mackernan/particle_insertion/tree/master/PIhydration/README.rst>`_ for full details.
Scaling and Performance
_________________________
As the module uses LAMMPS, the performance and scaling of this module should be essentially the same, provided data for thermodynamic integration and
MBAR are not generated too often. In the case of thermodynamic integration, this caveat is due to the central difference approximation of derivatives; in the case
of MBAR, it is due to the fact that many virtual moves are made which can be extremely costly if the number of interpolating points is large. Also, when using
PMME, the initial setup cost is computationally expensive, and should, therefore, be done as infrequently as possible. A future module in preparation will
circumvent the use of central difference approximations of derivatives.
.. sidebar:: Software Technical Information
Name
pyscal
Language
Python (2.7, 3.4, 3.5, 3.6)
Licence
`GNU General Public License v3.0 <https://www.gnu.org/licenses/gpl-3.0.en.html>`_
Documentation Tool
Sphinx/RST
Application Documentation
https://pyscal.readthedocs.io/en/latest/
Relevant Training Material
https://mybinder.org/v2/gh/srmnitc/pyscal/master?filepath=examples%2F
Software Module Developed by
Sarath Menon
Grisell Díaz Leines
Jutta Rogal
######
pyscal
######
.. contents:: :local:
**pyscal** is a python module for the calculation of local atomic structural environments including Steinhardt's bond orientational order parameters [1]_ during post-processing
of atomistic simulation data. The core functionality of pyscal is written in C++ with python wrappers using
`pybind11 <https://pybind11.readthedocs.io/en/stable/intro.html>`_ which allows for fast calculations and
easy extensions in python.
Purpose of Module
_________________
Steinhardt's order parameters are widely used for the identification of crystal structures [3]_. They are also used to distinguish
if an atom is in a solid or liquid environment [4]_. pyscal is inspired by the
`BondOrderAnalysis <https://homepage.univie.ac.at/wolfgang.lechner/bondorderparameter.html>`_ code,
but has since incorporated many additional features and modifications. The pyscal module includes the following functionalities:
* calculation of Steinhardt's order parameters and their averaged version [2]_.
* links with the `Voro++ <http://math.lbl.gov/voro++/>`_ code, for the calculation of Steinhardt parameters weighted using the face areas of Voronoi polyhedra [3]_.
* classification of atoms as solid or liquid [4]_.
* clustering of particles based on a user defined property.
* methods for calculating radial distribution functions, Voronoi volumes of particles, number of vertices and face area of Voronoi polyhedra, and coordination numbers.
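The central quantity here is :math:`q_l = \sqrt{\frac{4\pi}{2l+1}\sum_{m=-l}^{l}|\langle Y_{lm}\rangle|^2}`, with the average taken over the bonds of an atom. The self-contained sketch below (plain numpy/scipy, deliberately not pyscal's API) evaluates it for the six nearest neighbours of a simple-cubic site:

```python
import numpy as np
from math import factorial, pi
from scipy.special import lpmv  # associated Legendre function P_l^m

def y_lm(l, m, theta, phi):
    """Spherical harmonic Y_lm for m >= 0; theta = polar, phi = azimuthal."""
    norm = np.sqrt((2 * l + 1) / (4 * pi) * factorial(l - m) / factorial(l + m))
    return norm * lpmv(m, l, np.cos(theta)) * np.exp(1j * m * phi)

def steinhardt_q(l, bonds):
    """Steinhardt bond-orientational order parameter q_l from bond vectors."""
    b = np.asarray(bonds, dtype=float)
    r = np.linalg.norm(b, axis=1)
    theta = np.arccos(b[:, 2] / r)        # polar angle of each bond
    phi = np.arctan2(b[:, 1], b[:, 0])    # azimuthal angle of each bond
    # |q_{l,-m}| = |q_{l,m}|, so count the m = 0 term once, m > 0 terms twice.
    total = abs(y_lm(l, 0, theta, phi).mean()) ** 2
    total += 2 * sum(abs(y_lm(l, m, theta, phi).mean()) ** 2
                     for m in range(1, l + 1))
    return np.sqrt(4 * pi / (2 * l + 1) * total)

# Six nearest neighbours of a simple-cubic lattice site:
sc = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
```

This reproduces the well-known reference values :math:`q_4 \approx 0.764` and :math:`q_6 \approx 0.354` for the simple-cubic environment; in practice one would use pyscal's optimised C++ implementation rather than this sketch.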
Background Information
______________________
See the `application documentation <https://pyscal.readthedocs.io/en/latest/>`_ for full details.
The utilisation of Dask within the project came about as a result of the `E-CAM High Throughput Computing ESDW <https://www.e-cam2020.eu/event/4424/?instance_id=71>`_ held in Turin in 2018 and 2019.
Building and Testing
____________________
**Installation**
pyscal can be installed directly using `Conda <https://docs.conda.io/en/latest/>`_ with the following command:
.. code:: console
conda install -c pyscal pyscal
Alternatively, pyscal can be built from the repository:
.. code:: console
git clone https://github.com/srmnitc/pyscal.git
cd pyscal
python setup.py install --user
**Testing**
pyscal contains automated tests which
use the `pytest <https://docs.pytest.org/en/latest/>`_ python library, which can be installed by ``pip install pytest``.
The tests can be run by executing the command ``pytest tests/`` from the main code directory.
**Examples**
Examples using pyscal can be found `here <https://pyscal.readthedocs.io/en/latest/examples.html>`_.
An `interactive notebook <https://mybinder.org/v2/gh/srmnitc/pyscal/master?filepath=examples%2F>`_
using binder is also available.
Source Code
___________
The `source code <https://github.com/srmnitc/pyscal>`_ of the module can be found on GitHub.
.. [1] `Steinhardt, P. J., Nelson, D. R., & Ronchetti, M. (1983). Physical Review B, 28 <https://journals.aps.org/prb/abstract/10.1103/PhysRevB.28.784>`_.
.. [2] `Lechner, W., & Dellago, C. (2008). The Journal of Chemical Physics, 129 <https://aip.scitation.org/doi/full/10.1063/1.2977970>`_.
.. [3] `Mickel, W., Kapfer, S. C., Schröder-Turk, G. E., & Mecke, K. (2013). The Journal of Chemical Physics, 138 <https://aip.scitation.org/doi/full/10.1063/1.4774084>`_.
.. [4] `Auer, S., & Frenkel, D. (2005). Advances in Polymer Science, 173 <https://link.springer.com/chapter/10.1007/b99429>`_.
.. sidebar:: General Information
.. contents:: :depth: 2
* :ref:`contributing`
* :ref:`search`
.. _readme_electronic_structure:
****************************
Electronic Structure Modules
****************************
Introduction
============
.. figure:: ./images/wake_nova-rgb.png
.. figure:: ./images/protein-metal-cluster.png
:figwidth: 25 %
:align: left
The ESDW in Lausanne in February 2018 was the starting point for the modules below.

.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/esl-bundle/readme
   ./modules/ELPA_easyblock/readme
ESDW Dublin 2019
-----------------
The ESDW in Dublin in January 2019 was the starting point for the modules below.
.. toctree::
:glob:
:maxdepth: 1
./modules/esl-easyconfigs/readme
Other Modules
-------------
Modules not coming from ESDWs:

.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/flook/readme
   ./modules/libgridxc/readme
   ./modules/libvdwxc/readme
   ./modules/MatrixSwitchDBCSR/readme
Pilot Projects
Below is a list of the modules developed directly within the context of the pilot projects.

.. toctree::
   :glob:
   :maxdepth: 1

   ./modules/W90_solution_booklet/readme
   ./modules/FFTXlib/readme
   ./modules/W90_cube_format_non-orthogonal/readme
   ./modules/miniPWPP/readme
   ./modules/PANNA-GVECT/readme
   ./modules/PANNA-TFR/readme
   ./modules/PANNA-TRAIN/readme
   ./modules/PANNA-EVAL/readme
   ./modules/PANNA-Charges/readme
.. _E-CAM: https://www.e-cam2020.eu/
.. sidebar:: Software Technical Information
Name
EasyBuild
Language
Python
Licence
`GPL-2.0 <https://opensource.org/licenses/GPL-2.0>`_
Documentation Tool
ReST_
Application Documentation
https://easybuild.readthedocs.io
Relevant Training Material
See documentation
Software Module Developed by
Micael Oliveira
.. In the next line you have the name of how this module will be referenced in the main documentation (which you can
reference, in this case, as ":ref:`example`"). You *MUST* change the reference below from "example" to something
unique otherwise you will cause cross-referencing errors. The reference must come right before the heading for the
reference to work (so don't insert a comment between).
.. _elpa_easyblock:
###############################
Add ELPA easyblock to EasyBuild
###############################
.. Let's add a local table of contents to help people navigate the page
.. contents:: :local:
.. Add an abstract for a *general* audience here. Write a few lines that explains the "helicopter view" of why you are
creating this module. For example, you might say that "This module is a stepping stone to incorporating XXXX effects
into YYYY process, which in turn should allow ZZZZ to be simulated. If successful, this could make it possible to
produce compound AAAA while avoiding expensive process BBBB and CCCC."
EasyBuild is used by a number of large HPC sites and integrating targeted support for ELPA ensures that those sites
use optimally built versions of ELPA.
Purpose of Module
_________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
Automate the selection of appropriate configuration flags for ELPA within EasyBuild depending on the type of CPU and available features.
Include additional options as appropriate. Build single and double precision versions of ELPA and also ensure it is linked against the expected version of the linear algebra libraries.
Background Information
______________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
EasyBuild is a software build and installation framework that allows you to manage (scientific) software on High
Performance Computing (HPC) systems in an efficient way. Full details can be found in the
`EasyBuild documentation <https://easybuild.readthedocs.io/en/latest/>`_.
EasyBuild already had limited support for ELPA; this module allows for automated hardware-specific configuration and optimisations.
Building and Testing
____________________
.. Keep the helper text below around in your module by just adding ".. " in front of it, which turns it into a comment
To build the software requires EasyBuild (see
`installation instructions for EasyBuild here <https://easybuild.readthedocs.io/en/latest/Installation.html>`_) and an
example build command would be:
::
eb ELPA-2018.11.001-intel-2019a.eb
Source Code
___________
.. Notice the syntax of a URL reference below `Text <URL>`_ the backticks matter!
There are two relevant Pull Requests in the main EasyBuild repositories:
* https://github.com/easybuilders/easybuild-easyblocks/pull/1621
* https://github.com/easybuilders/easybuild-easyconfigs/pull/8360
.. Here are the URL references used (which is alternative method to the one described above)
.. _ReST: http://www.sphinx-doc.org/en/stable/rest.html
.. _Sphinx: http://www.sphinx-doc.org/en/stable/markup/index.html
##############
PANNA-Charges
##############
.. sidebar:: Software Technical Information
Language
Python 3.6.
Documentation Tool
Sphinx, ReStructuredText
Application Documentation
`Doc mirror <https://gitlab.com/PANNAdevs/panna/tree/master/doc>`_
Relevant Training Material
See usage examples in the ``doc/tutorial`` directory of the source code.
Licence
The MIT License (MIT)
.. contents:: :local:
Purpose of Module
___________________
The PANNA-Charges module demonstrates how to train a neural network to predict local atomic charges.
This network can later be used to calculate the electrostatic energy density of a crystal.
See Reference 2 for the theoretical model behind this approach.
PANNA-Charges, like the other modules within the PANNA project, uses the TensorFlow framework.
Features
__________
PANNA-Charges supports periodic and aperiodic structures, multiple species,
and a different all-to-all connected network architecture for each species.
It further supports controlling the training dynamics: e.g. freezing/unfreezing layers, weight transfer, decaying learning rates, etc.
Building and Testing
______________________________
A stable version of the module will be released in the near future,
and will be available for download using the download button on this `page <https://gitlab.com/PANNAdevs/panna>`_.
As a Python module, PANNA-Charges does not require installation, but it relies on the numpy library (version >= 1.15.0), tensorflow (version >= 1.13.0), and
tensorboard (version >= 1.13.0). Note that with version 2.0.0 the tensorflow libraries underwent substantial changes in structure; the 1.1X.X
family supports the equally valid previous structure and is still being maintained. PANNA-Charges requires the tensorflow 1.1X.X family of versions.
In order to set up and test the module, run the following::
$ tar -zxvf panna-master.tar.gz
$ cd panna-master
$ python3 ./panna/test-charges-train.py
Usage
______
The PANNA-Charges main script, charges_train.py, requires a configuration file that specifies the parameters of the calculation,
such as the number of layers and nodes of each neural network layer, learning parameters, etc.
A typical command for using this module is as follows::
$ export PYTHONPATH=/path/to/panna/directory/panna
$ python3 charges_train.py --config charges_train_config.ini
A detailed tutorial about the contents of the configuration file will be released
`here <https://gitlab.com/PANNAdevs/panna/blob/master/doc/tutorial/README_tutorial_3_charges_training.md>`_.
In this comprehensive tutorial, a neural network training scenario for systems with long range interactions will be demonstrated.
Source Code
___________
PANNA-Charges source is not currently public, when it is released it will be hosted on `gitlab <https://gitlab.com/PANNAdevs/panna>`_.
Further Information
______________________
The PANNA-Charges module is developed with the contributions of Y. Shaidu, R. Lot, F. Pellegrini, E. Kucukbenli
References
____________
PANNA manuscript:
[1] R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli.
`arxiv:1907.03055 <https://arxiv.org/abs/1907.03055>`_. Submitted (2019).
[2] N. Artrith, T. Morawietz, J. Behler. PRB 83, 153101 (2011).
High-dimensional neural-network potentials for multicomponent systems: Applications to zinc oxide.
Erratum: PRB 86, 079914 (2012).
###########
PANNA-EVAL
###########
.. sidebar:: Software Technical Information
Language
Python 3.6.
Documentation Tool
Sphinx, ReStructuredText
Application Documentation
`Doc mirror <https://gitlab.com/PANNAdevs/panna/tree/master/doc>`_
Relevant Training Material
See usage examples in the ``doc/tutorial`` directory of the source code.
Licence
The MIT License (MIT)
.. contents:: :local:
Purpose of Module
___________________
The PANNA-EVAL module evaluates an all-to-all connected neural network
to predict atomistic quantities, e.g. the total energy and forces of a given crystal structure.
PANNA-EVAL can be used with the other modules of the PANNA project for neural network validation,
but it can also serve to carry the information of the trained network to other platforms, such as the
molecular dynamics code LAMMPS.
Although PANNA-EVAL does not need the advanced capabilities of the TensorFlow framework,
it uses the 'checkpoint' information to automatically test the performance of a network from training data.
Features
__________
The PANNA-EVAL module has two user-end scripts: evaluate.py and extract_weights.py.
The main script of the module, evaluate.py, can evaluate all-to-all connected networks with various sizes for each species.
It can also calculate the derivative of the target function, i.e. forces for an energy network.
This module was primarily created to validate TensorFlow networks stored during training in checkpoint format, hence it has the functionality to look for
checkpoint numbers in a training directory, and/or run several checkpoint evaluations at once.
The extract_weights.py script allows the network parameters to be saved from the TensorFlow native checkpoint format to other useful formats, such as
human-readable or LAMMPS potential formats. The latter allows neural networks that are trained and validated using PANNA modules to
be exported to LAMMPS as interatomic potentials.
Building and Testing
______________________________
A stable version of the module can be downloaded using the download button on this `page <https://gitlab.com/PANNAdevs/panna>`_.
As a Python module, PANNA-EVAL does not require installation, but it relies on the numpy library (version >= 1.15.0) and tensorflow (version >= 1.13.0).
Note that with version 2.0.0 the tensorflow libraries underwent substantial changes in structure; the 1.1X.X
family supports the equally valid previous structure and is still being maintained. PANNA-EVAL requires the tensorflow 1.1X.X family of versions.
In order to set up and test the module, run the following::
$ tar -zxvf panna-master.tar.gz
$ cd panna-master
$ python3 ./panna/test-evaluate.py
Currently this test only assesses the evaluate.py script. Another test for extract_weights.py will be released in the near future.
Usage
______
The PANNA-EVAL main script requires a configuration file that specifies the parameters of the calculation,
such as where to find the network to evaluate, which checkpoints to evaluate, etc.
A typical command for using this module is as follows::
$ export PYTHONPATH=/path/to/panna/directory/panna
$ python3 evaluate.py --config val_config.ini
A detailed tutorial about the contents of the configuration file can be found
`here <https://gitlab.com/PANNAdevs/panna/blob/master/doc/tutorial/README_tutorial_1_training.md>`_.
In this comprehensive tutorial, a neural network training scenario is demonstrated from beginning to end.
Network training and validation are two key steps in generating a predictive network,
hence the tutorial also explains how to use this module together with the PANNA-TRAIN module used in training.
Together, these two modules cover all the steps necessary to train an atomistic neural network, starting from data which specifies
the machine learning task in (input, target output) pair form.
Source Code
___________
PANNA-EVAL source is currently hosted on `gitlab <https://gitlab.com/PANNAdevs/panna>`_.
Further Information
______________________
The PANNA-EVAL module is developed with the contributions of R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli
References
____________
PANNA manuscript:
[1] R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli.
`arxiv:1907.03055 <https://arxiv.org/abs/1907.03055>`_. Submitted (2019).
[2] J. Behler and M. Parrinello, Generalized Neural-Network
Representation of High-Dimensional Potential-Energy
Surfaces, Phys. Rev. Lett. 98, 146401 (2007)
[3] Justin S. Smith, Olexandr Isayev, Adrian E. Roitberg.
ANI-1: An extensible neural network potential with DFT accuracy
at force field computational cost. Chemical Science,(2017), DOI: 10.1039/C6SC05720A
###########
PANNA-GVECT
###########
.. sidebar:: Software Technical Information
Language
Python 3.6.
Documentation Tool
Sphinx, ReStructuredText
Application Documentation
`Doc mirror <https://gitlab.com/PANNAdevs/panna/tree/master/doc>`_
Relevant Training Material
See usage examples in the ``doc/tutorial`` directory of the source code.
Licence
The MIT License (MIT)
.. contents:: :local:
Purpose of Module
___________________
The PANNA-GVECT module demonstrates how to efficiently generate Behler-Parrinello and modified Behler-Parrinello
descriptors (see References 1, 2, 3).
These descriptors can then be used in machine learning algorithms. Even though these descriptors were originally designed for
neural network models, they are equally suitable for other supervised learning schemes such as kernel methods,
or unsupervised ones such as clustering techniques.
PANNA-GVECT, unlike the other modules within the PANNA project, does not use the TensorFlow framework.
Features
__________
PANNA-GVECT supports periodic and aperiodic structures, multiple species,
and derivatives of the descriptors with respect to atomic positions.
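As a concrete illustration of the descriptors involved, the original radial Behler-Parrinello symmetry function of Reference 2, :math:`G^2_i = \sum_j e^{-\eta (r_{ij}-r_s)^2} f_c(r_{ij})`, can be written in a few lines of numpy. PANNA-GVECT's actual parameterisation and output format are described in its tutorial, so this is only a sketch:

```python
import numpy as np

def cutoff(r, rc):
    """Smooth cosine cutoff f_c(r) from the Behler-Parrinello scheme."""
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def g2(distances, eta, rs, rc):
    """Radial symmetry function G2 for one atom, from its neighbour distances."""
    r = np.asarray(distances, dtype=float)
    return np.sum(np.exp(-eta * (r - rs) ** 2) * cutoff(r, rc))
```

A descriptor vector is built by evaluating ``g2`` for a grid of :math:`(\eta, r_s)` values (and analogous angular terms), giving a fixed-length, permutation-invariant representation of each atomic environment.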
Building and Testing
______________________________
A stable version of the module can be downloaded using the download button on this `page <https://gitlab.com/PANNAdevs/panna>`_.
As a Python module, PANNA-GVECT does not require installation, but it relies on the numpy library (version >= 1.15.0).
In order to set up and test the module, run the following::
$ tar -zxvf panna-master.tar.gz
$ cd panna-master
$ python3 ./panna/test-gvect_calculator.py
Usage
______
PANNA-GVECT main script requires a configuration file that specifies the parameter of the calculation such as descriptor type, length etc.
A typical command for using this module is as follows::
$ export PYTHONPATH=/path/to/panna/directory/panna
$ python3 gvect_calculator.py --config gvect_configuration.ini
A detailed tutorial about the contents of the configuration file can be found
`here <https://gitlab.com/PANNAdevs/panna/blob/master/doc/tutorial/README_tutorial_2_data_preparation.md>`_.
In this comprehensive tutorial, how to use this module with other modules such as PANNA-TOOLS and PANNA-TFR
is also demonstrated. Together, these modules cover all the steps necessary to go from raw data to descriptors that can be
used in a machine learning workflow.
Source Code
___________
PANNA-GVECT source is currently hosted on `gitlab <https://gitlab.com/PANNAdevs/panna>`_.
Further Information
______________________
The PANNA-GVECT module is developed with the contributions of R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli
References
____________
PANNA manuscript:
[1] R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli.
`arxiv:1907.03055 <https://arxiv.org/abs/1907.03055>`_. Submitted (2019).
[2] J. Behler and M. Parrinello, Generalized Neural-Network
Representation of High-Dimensional Potential-Energy
Surfaces, Phys. Rev. Lett. 98, 146401 (2007)
[3] Justin S. Smith, Olexandr Isayev, Adrian E. Roitberg.
ANI-1: An extensible neural network potential with DFT accuracy
at force field computational cost. Chemical Science,(2017), DOI: 10.1039/C6SC05720A
##########
PANNA-TFR
##########
.. sidebar:: Software Technical Information
Language
Python 3.6.
Documentation Tool
Sphinx, ReStructuredText
Application Documentation
`Doc mirror <https://gitlab.com/PANNAdevs/panna/tree/master/doc>`_
Relevant Training Material
See usage examples in the ``doc/tutorial`` directory of the source code.
Licence
The MIT License (MIT)
.. contents:: :local:
Purpose of Module
___________________
The PANNA-TFR module demonstrates how to efficiently pack Behler-Parrinello and
modified Behler-Parrinello descriptor vectors (see References 1, 2, 3), written in binary format, into the TensorFlow data format
for efficient reading during training.
These descriptors can then be used within TensorFlow efficiently, reducing the overhead during batch creation.
PANNA-TFR is built on TensorFlow.
Features
__________
PANNA-TFR supports descriptors that change size across records, i.e. data points with different number of atoms
are stored efficiently without padding.
Building and Testing
______________________________
A stable version of the module can be downloaded using the download button on this `page <https://gitlab.com/PANNAdevs/panna>`_.
As a Python module, PANNA-TFR does not require installation, but it relies on the numpy library (version >= 1.15.0) and tensorflow (version >= 1.13.0).
In order to set up and test the module, run the following::
$ tar -zxvf panna-master.tar.gz
$ cd panna-master
$ python3 ./panna/test-tfr-packer.py
Usage
______
The PANNA-TFR main script requires a configuration file that specifies the parameters of the calculation,
such as the location of the descriptor files or how many descriptors are to be packed into a single record file.
A typical command for using this module is as follows::
$ export PYTHONPATH=/path/to/panna/directory/panna
$ python3 tfr_packer.py --config tfr_configuration.ini
A detailed tutorial about the contents of the configuration file can be found
`here <https://gitlab.com/PANNAdevs/panna/blob/master/doc/tutorial/README_tutorial_2_data_preparation.md>`_.
This comprehensive tutorial also demonstrates how to use this module with other modules such as PANNA-GVECT and PANNA-TOOLS.
Together, these modules cover all the steps necessary to go from raw data to descriptors that can be
used in a machine learning workflow.
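To make the configuration-driven workflow concrete, the sketch below parses an INI-style file of the kind ``tfr_packer.py`` expects. The section and key names here are illustrative placeholders, not the actual PANNA keywords; the authoritative keywords are documented in the linked tutorial.

```python
import configparser

# Hypothetical configuration: section/key names are illustrative only;
# see the PANNA tutorial for the real keywords.
example_cfg = """
[IO_INFORMATION]
input_dir = ./descriptors
output_dir = ./tfr_data

[CONTENT_INFORMATION]
descriptors_per_file = 1000   ; how many descriptors go into one record file
"""

cfg = configparser.ConfigParser(inline_comment_prefixes=(";",))
cfg.read_string(example_cfg)
per_file = cfg.getint("CONTENT_INFORMATION", "descriptors_per_file")
```

The script would then read descriptors from ``input_dir`` and write packed record files of ``per_file`` descriptors each into ``output_dir``.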
Source Code
___________
PANNA-TFR source is currently hosted on `gitlab <https://gitlab.com/PANNAdevs/panna>`_.
Further Information
______________________
The PANNA-TFR module is developed with the contributions of R. Lot, Y. Shaidu, F. Pellegrini, and E. Kucukbenli.
References
____________
PANNA manuscript:
[1] R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli.
`arxiv:1907.03055 <https://arxiv.org/abs/1907.03055>`_. Submitted (2019).
[2] J. Behler and M. Parrinello, Generalized Neural-Network
Representation of High-Dimensional Potential-Energy
Surfaces, Phys. Rev. Lett. 98, 146401 (2007)
[3] Justin S. Smith, Olexandr Isayev, Adrian E. Roitberg.
ANI-1: An extensible neural network potential with DFT accuracy
at force field computational cost. Chemical Science (2017). DOI: 10.1039/C6SC05720A
#############
PANNA-TRAIN
#############
.. sidebar:: Software Technical Information
Language
Python 3.6.
Documentation Tool
Sphinx, ReStructuredText
Application Documentation
`Doc mirror <https://gitlab.com/PANNAdevs/panna/tree/master/doc>`_
Relevant Training Material
See usage examples in the ``doc/tutorial`` directory of the source code.
Licence
The MIT License (MIT)
.. contents:: :local:
Purpose of Module
___________________
PANNA-TRAIN is a neural network training module for atomistic data, e.g. prediction of the total energy and forces
of a given crystal structure.
It implements a separate atomic network for each species, following the seminal work of Behler and Parrinello
(see References 1, 2, 3).
The trained networks can later be used as an interatomic potential in molecular dynamics simulations.
PANNA-TRAIN uses the TensorFlow framework as the underlying neural network training and data I/O engine.
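The per-species architecture can be sketched in plain numpy (a toy stand-in for the TensorFlow implementation; the layer sizes, descriptor length, and random weights are arbitrary): each atom's descriptor is fed through the network of its species, and the atomic contributions are summed to give the total energy.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(n_in, n_hidden):
    """One toy all-to-all connected atomic network: descriptor -> atomic energy."""
    return {"W1": rng.normal(size=(n_in, n_hidden)), "b1": np.zeros(n_hidden),
            "W2": rng.normal(size=(n_hidden, 1)),    "b2": np.zeros(1)}

def atomic_energy(net, g):
    h = np.tanh(g @ net["W1"] + net["b1"])        # hidden layer
    return (h @ net["W2"] + net["b2"])[0]         # scalar atomic contribution

# One independent network per chemical species (toy descriptor length 8)
nets = {"H": make_net(8, 16), "O": make_net(8, 16)}

# A water-like structure: (species label, descriptor vector) per atom
structure = [("O", rng.normal(size=8)),
             ("H", rng.normal(size=8)),
             ("H", rng.normal(size=8))]

# Total energy = sum of per-atom contributions from the species' network
total_energy = sum(atomic_energy(nets[s], g) for s, g in structure)
```

Because same-species atoms share one network and the atomic contributions are summed, the total energy is invariant under permutation of identical atoms, which is one of the design motivations of the Behler-Parrinello scheme.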
Features
__________
PANNA-TRAIN supports all-to-all connected networks for each species.
Networks with different numbers of nodes and layers are allowed.
It further supports control over the training dynamics: e.g. freezing/unfreezing layers, weight transfer, decaying learning rates, etc.
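Two of these controls can be sketched generically (simplified stand-ins for the options exposed through the configuration file, not PANNA's actual implementation): an exponentially decaying learning rate, and freezing a layer by excluding its weights from the update.

```python
import numpy as np

def decayed_lr(lr0, decay_rate, decay_steps, step):
    """Exponential learning-rate decay of the kind commonly used in TF training."""
    return lr0 * decay_rate ** (step / decay_steps)

# Freezing a layer = skipping its weights in the gradient-descent update
weights = {"layer1": np.ones((2, 2)), "layer2": np.ones((2, 2))}
grads   = {"layer1": np.full((2, 2), 0.5), "layer2": np.full((2, 2), 0.5)}
frozen  = {"layer1"}

lr = decayed_lr(lr0=0.01, decay_rate=0.96, decay_steps=100, step=100)
for name in weights:
    if name not in frozen:
        weights[name] -= lr * grads[name]   # only unfrozen layers move
```

Weight transfer follows the same pattern: copying a previously trained layer's weights into a new network before training, optionally freezing it afterwards.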
Building and Testing
______________________________
A stable version of the module can be downloaded using the download button on this `page <https://gitlab.com/PANNAdevs/panna>`_.
As a Python module, PANNA-TRAIN does not require installation, but it relies on the numpy library (version >= 1.15.0), tensorflow (version >= 1.13.0), and
tensorboard (version >= 1.13.0). Note that with version 2.0.0 the tensorflow libraries underwent substantial structural changes; the 1.1X.X
family supports the equally valid previous structure and is still being maintained. PANNA-TRAIN requires the tensorflow 1.1X.X family of versions.
In order to set up and test the module, run the following::
$ tar -zxvf panna-master.tar.gz
$ cd panna-master
$ python3 ./panna/test-train.py
Usage
______
The PANNA-TRAIN main script requires a configuration file that specifies the parameters of the calculation,
such as the number of layers, the number of nodes in each neural network layer, the learning parameters, etc.
A typical command for using this module is as follows::
$ export PYTHONPATH=/path/to/panna/directory/panna
$ python3 train.py --config train_configuration.ini
A detailed tutorial about the contents of the configuration file can be found
`here <https://gitlab.com/PANNAdevs/panna/blob/master/doc/tutorial/README_tutorial_1_training.md>`_.
In this comprehensive tutorial, a neural network training scenario is demonstrated from beginning to end.
Network validation is a key step in network training, hence the tutorial also explains how to use this module
together with the PANNA-EVAL module for validation.
Together, these two modules cover all the steps necessary to train an atomistic neural network, starting from data that specify
the machine learning task as (input, target output) pairs.
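A training configuration of the kind ``train.py`` expects might look like the sketch below. All section and key names here are illustrative placeholders (the real keywords are documented in the linked tutorial); the point is that per-species architecture and learning parameters are declared in one INI file.

```python
import configparser

# Hypothetical training configuration: key names are illustrative only;
# see the PANNA tutorial for the real keywords.
example_cfg = """
[DATA_INFORMATION]
atomic_sequence = H, C, N, O

[TRAINING_PARAMETERS]
batch_size = 50
learning_rate = 0.01
max_steps = 100000

[DEFAULT_NETWORK]
layer_sizes = 32, 16, 1
"""

cfg = configparser.ConfigParser()
cfg.read_string(example_cfg)
species = [s.strip() for s in
           cfg.get("DATA_INFORMATION", "atomic_sequence").split(",")]
layer_sizes = [int(n) for n in
               cfg.get("DEFAULT_NETWORK", "layer_sizes").split(",")]
```

From such a file the trainer would build one network of the declared shape per species in ``atomic_sequence`` and run the optimization with the given batch size and learning rate.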
Source Code
___________
PANNA-TRAIN source is currently hosted on `gitlab <https://gitlab.com/PANNAdevs/panna>`_.
Further Information
______________________
The PANNA-TRAIN module is developed with the contributions of R. Lot, Y. Shaidu, F. Pellegrini, and E. Kucukbenli.
References
____________
PANNA manuscript:
[1] R. Lot, Y. Shaidu, F. Pellegrini, E. Kucukbenli.
`arxiv:1907.03055 <https://arxiv.org/abs/1907.03055>`_. Submitted (2019).
[2] J. Behler and M. Parrinello, Generalized Neural-Network
Representation of High-Dimensional Potential-Energy
Surfaces, Phys. Rev. Lett. 98, 146401 (2007)
[3] Justin S. Smith, Olexandr Isayev, Adrian E. Roitberg.