pyiron.vasp.metadyn module
class pyiron.vasp.metadyn.MetadynInput
Bases: pyiron.vasp.base.Input
from_hdf(hdf)
Reads the attributes and reconstructs the object from an HDF file.

Parameters:
    hdf – The HDF5 instance
to_hdf(hdf)
Saves the object to an HDF5 file.

Parameters:
    hdf (pyiron.base.generic.hdfio.ProjectHDFio) – HDF path to which the object is to be saved
class pyiron.vasp.metadyn.MetadynOutput
Bases: pyiron.vasp.base.Output
collect(directory='/home/docs/checkouts/readthedocs.org/user_builds/pyiron/checkouts/pyiron-0.2.17/docs', sorted_indices=None)
Collects output from the working directory.

Parameters:
    directory (str) – Path to the directory
    sorted_indices (np.array/None) –
to_hdf(hdf)
Saves the object to an HDF5 file.

Parameters:
    hdf (pyiron.base.generic.hdfio.ProjectHDFio) – HDF path to which the object is to be saved
class pyiron.vasp.metadyn.VaspMetadyn(project, job_name)
Bases: pyiron.vasp.vasp.Vasp
Class to set up, run, and analyze VASP metadynamics simulations. For more details see the appropriate VASP documentation.
This class is a derivative of pyiron.objects.job.generic.GenericJob. The functions in these modules are written such that the function names and attributes are very generic (get_structure(), molecular_dynamics(), version), but they are implemented to handle VASP-specific input and output.
Parameters:
    project (pyiron.project.Project instance) – Specifies the project path among other attributes
    job_name (str) – Name of the job
Examples

Suppose you need to run a VASP simulation where you would like to control the input parameters manually, for example a static DFT run with Gaussian smearing and a Monkhorst–Pack k-point mesh of [6, 6, 6]. You could set it up as shown below:

>>> ham = Vasp(job_name="trial_job")
>>> ham.input.incar["IBRION"] = -1
>>> ham.input.incar["ISMEAR"] = 0
>>> ham.input.kpoints.set(size_of_mesh=[6, 6, 6])

However, according to pyiron's philosophy, it is recommended to avoid code-specific tags like IBRION and ISMEAR. The recommended way to set up this calculation is therefore:

>>> ham = Vasp(job_name="trial_job")
>>> ham.calc_static()
>>> ham.set_occupancy_smearing(smearing="gaussian")
>>> ham.set_kpoints(mesh=[6, 6, 6])

This sets exactly the same tags as in the first example.
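For the metadynamics subclass itself, a minimal constrained setup might look like the following sketch. This is a non-runnable illustration: it assumes an existing pyiron Project instance `pr` and a working VASP installation, and the constraint names and type strings ("bond", "linear_combination") are hypothetical placeholders that need not be members of supported_primitive_constraints or supported_complex_constraints.

```
>>> job = VaspMetadyn(project=pr, job_name="metadyn_job")
>>> job.set_primitive_constraint(name="b1", constraint_type="bond", atom_indices=[0, 1])
>>> job.set_primitive_constraint(name="b2", constraint_type="bond", atom_indices=[1, 2])
>>> job.set_complex_constraint(name="diff", constraint_type="linear_combination",
...                            coefficient_dict={"b1": 1.0, "b2": -1.0}, biased=True)
>>> job.run()
```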
from_hdf(hdf=None, group_name=None)
Recreates the instance from the HDF5 file.

Parameters:
    hdf (pyiron.base.generic.hdfio.ProjectHDFio) – The HDF file/path to read the data from
    group_name (str) – The name of the group from which the data is to be read
set_complex_constraint(name, constraint_type, coefficient_dict, biased=False, increment=0.0)
Sets complex constraints based on previously defined primitive constraints.

Parameters:
    name (str) – Name of the complex constraint
    constraint_type (str) – Type of the constraint (has to be part of supported_complex_constraints)
    coefficient_dict (dict) – Dictionary with the primitive constraint names as keys and the constraint coefficients as values
    biased (bool) – True if a potential bias is to be applied (biased MD and metadynamics calculations)
    increment (float) – Increment in the constraint variable at every time step
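The role of coefficient_dict can be illustrated with a small self-contained sketch: a complex constraint is conceptually a linear combination of primitive constraint values, weighted by the coefficients. The constraint names and values below are illustrative, not taken from pyiron itself.

```python
# Hypothetical primitive-constraint names mapped to their coefficients,
# mirroring the documented shape of coefficient_dict.
coefficient_dict = {"bond_a": 1.0, "bond_b": -1.0}

# Hypothetical current values of those primitive constraints.
primitive_values = {"bond_a": 2.5, "bond_b": 1.5}

# The complex constraint value is the coefficient-weighted sum.
complex_value = sum(coeff * primitive_values[name]
                    for name, coeff in coefficient_dict.items())
print(complex_value)  # 1.0
```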
set_primitive_constraint(name, constraint_type, atom_indices, biased=False, increment=0.0)
Sets primitive geometric constraints in VASP.

Parameters:
    name (str) – Name of the constraint
    constraint_type (str) – Type of the constraint (has to be part of supported_primitive_constraints)
    atom_indices (int/list/numpy.ndarray) – Indices of the atoms to which the constraint should be applied
    biased (bool) – True if a potential bias is to be applied (biased MD and metadynamics calculations)
    increment (float) – Increment in the constraint variable at every time step
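Since atom_indices accepts an int, a list, or a numpy.ndarray, callers may want a single canonical form. The helper below is a hypothetical sketch (not part of pyiron) of how such an argument can be normalized to a flat integer array:

```python
import numpy as np

def normalize_atom_indices(atom_indices):
    """Coerce an int, list, or ndarray of atom indices to a 1-D int array."""
    return np.atleast_1d(np.asarray(atom_indices, dtype=int))

print(normalize_atom_indices(3))          # [3]
print(normalize_atom_indices([0, 1, 4]))  # [0 1 4]
```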
to_hdf(hdf=None, group_name=None)
Stores the instance attributes into the HDF5 file.

Parameters:
    hdf (pyiron.base.generic.hdfio.ProjectHDFio) – The HDF file/path to write the data to
    group_name (str) – The name of the group under which the data must be stored