pyiron.base.master.generic module
class pyiron.base.master.generic.GenericMaster(project, job_name)[source]
Bases: pyiron.base.job.generic.GenericJob

The GenericMaster is the template class for all meta jobs, i.e. all jobs which contain multiple other jobs. It defines the functionality shared by the different kinds of job series.

- Parameters
project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in
job_name (str) – name of the job, which has to be unique within the project
-
.. attribute:: job_name
name of the job, which has to be unique within the project
-
.. attribute:: status
execution status of the job; can be one of: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]
-
.. attribute:: job_id
unique id to identify the job in the pyiron database
-
.. attribute:: parent_id
job id of the predecessor job - the job which was executed before the current one in the current job series
-
.. attribute:: master_id
job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.
-
.. attribute:: child_ids
list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master
-
.. attribute:: project
Project instance the job is located in
-
.. attribute:: project_hdf5
ProjectHDFio instance which points to the HDF5 file the job is stored in
-
.. attribute:: job_info_str
short string describing the job by its job_name and job ID - mainly used for logging
-
.. attribute:: working_directory
working directory the job is executed in - outside the HDF5 file
-
.. attribute:: path
path to the job as a combination of absolute file system path and path within the HDF5 file.
-
.. attribute:: version
Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.
-
.. attribute:: executable
Executable used to run the job - usually the path to an external executable.
-
.. attribute:: library_activated
For job types which offer a Python library, pyiron can use the Python library instead of an external executable.
-
.. attribute:: server
Server object to handle the execution environment for the job.
-
.. attribute:: queue_id
the ID returned from the queuing system - it is most likely not the same as the job ID.
-
.. attribute:: logger
logger object to monitor the external execution and internal pyiron warnings.
-
.. attribute:: restart_file_list
list of files used to restart the calculation.
-
.. attribute:: job_type
Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]
-
.. attribute:: child_names
Dictionary matching the child ID to the child job name.
-
append(job)[source]
Append a job to the GenericMaster - just like you would append an element to a list.
- Parameters
job (GenericJob) – job to append
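The list-like append semantics can be illustrated with a minimal stand-in class; MiniMaster and MiniJob below are hypothetical names used only for this sketch and are not part of pyiron.

```python
# Hypothetical stand-in illustrating the list-like append semantics of a
# GenericMaster; not the actual pyiron implementation.
class MiniJob:
    def __init__(self, job_name):
        self.job_name = job_name


class MiniMaster:
    def __init__(self):
        # mirrors the job_object_dict cache documented below
        self._job_object_dict = {}

    def append(self, job):
        # just like appending an element to a list
        self._job_object_dict[job.job_name] = job


master = MiniMaster()
master.append(MiniJob("child_0"))
master.append(MiniJob("child_1"))
print(list(master._job_object_dict))  # ['child_0', 'child_1']
```

In the real class the appended jobs additionally list the master as their master_id once executed.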
-
property child_ids
list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master
- Returns
list of child job ids
- Return type
list
-
property child_names
Dictionary matching the child ID to the child job name.
- Returns
{child_id: child job name }
- Return type
dict
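The shape of the returned mapping can be sketched with plain Python data; the IDs and names below are hypothetical.

```python
# Hypothetical {child_id: child job name} mapping, shaped like the
# dictionary child_names returns.
child_names = {101: "job_a", 102: "job_b"}

# look up the name of the child with the smallest ID
first_name = child_names[min(child_names)]
print(first_name)  # job_a
```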
-
collect_output()[source]
Collect the output files of the external executable and store the information in the HDF5 file. This method has to be implemented in the individual hamiltonians.
-
copy_to(project=None, new_job_name=None, input_only=False, new_database_entry=True)[source]
Copy the content of the job including the HDF5 file to a new location.
- Parameters
project (ProjectHDFio) – project to copy the job to
new_job_name (str) – to duplicate the job within the same project it is necessary to modify the job name - optional
input_only (bool) – [True/False] to copy only the input - default False
new_database_entry (bool) – [True/False] to create a new database entry - default True
- Returns
GenericJob object pointing to the new location.
- Return type
GenericJob
-
first_child_name()[source]
Get the name of the first child job.
- Returns
name of the first child job
- Return type
str
-
from_hdf(hdf=None, group_name=None)[source]
Restore the GenericMaster from an HDF5 file.
- Parameters
hdf (ProjectHDFio) – HDF5 group object - optional
group_name (str) – HDF5 subgroup name - optional
-
get_child_cores()[source]
Calculate the currently active number of cores by summing over all children which are neither finished nor aborted.
- Returns
number of cores used
- Return type
int
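The bookkeeping this method performs can be sketched as a sum over child records; the status strings match the status attribute above, while the records and core counts here are hypothetical.

```python
# Hypothetical child records: each child exposes a status string and a
# core count; get_child_cores sums the cores of children that are
# neither finished nor aborted.
children = [
    {"status": "running",   "cores": 4},
    {"status": "finished",  "cores": 4},
    {"status": "submitted", "cores": 2},
    {"status": "aborted",   "cores": 8},
]

active_cores = sum(
    c["cores"] for c in children if c["status"] not in ("finished", "aborted")
)
print(active_cores)  # 6
```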
-
interactive_flush(path='generic', include_last_step=True)[source]
Interactive flush is not implemented for meta jobs.
-
property job_object_dict
Internal cache of currently loaded jobs.
- Returns
Dictionary of currently loaded jobs
- Return type
dict
-
move_to(project)[source]
Move the content of the job including the HDF5 file to a new location.
- Parameters
project (ProjectHDFio) – project to move the job to
- Returns
JobCore object pointing to the new location.
- Return type
JobCore
-
pop(i=-1)[source]
Pop a job from the GenericMaster - just like you would pop an element from a list.
- Parameters
i (int) – position of the job. (Default is last element, -1.)
- Returns
job
- Return type
GenericJob
-
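The list-like pop semantics described above can be sketched with a plain Python list standing in for the master's job cache; the job names are hypothetical.

```python
# A plain list standing in for the master's ordered job cache
# (hypothetical job names).
jobs = ["child_0", "child_1", "child_2"]

last = jobs.pop(-1)   # default: remove and return the last job
first = jobs.pop(0)   # any position works, as with list.pop
print(last, first, jobs)  # child_2 child_0 ['child_1']
```
-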
run_if_interactive()[source]
For jobs whose executables are available as a Python library, the job can be executed with a library call instead of calling an external executable. This is usually faster than a single-core Python job.
-
run_if_interactive_non_modal()[source]
Run if interactive non modal is not implemented for meta jobs.
-
run_if_refresh()[source]
Internal helper function; run_if_refresh() is called when the job status is ‘refresh’. If the job was suspended previously, it is started again to be continued.
-
set_child_id_func(child_id_func)[source]
Add an external function to derive the list of child IDs - experimental feature.
- Parameters
child_id_func (Function) – Python function which returns the list of child IDs
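The shape of such a hook can be sketched as a plain callable; the function body and the IDs below are hypothetical, since a real implementation would query the pyiron database.

```python
# Hypothetical child-ID function of the kind set_child_id_func accepts:
# a callable that returns the list of child job IDs for a master job.
def child_id_func(master_job):
    # a real implementation would look up jobs whose master_id equals
    # master_job.job_id in the database; fixed IDs used here for illustration
    return [201, 202, 203]


# emulate storing the hook and invoking it later
stored_func = child_id_func
child_ids = stored_func(None)
print(child_ids)  # [201, 202, 203]
```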
-
to_hdf(hdf=None, group_name=None)[source]
Store the GenericMaster in an HDF5 file.
- Parameters
hdf (ProjectHDFio) – HDF5 group object - optional
group_name (str) – HDF5 subgroup name - optional