pyiron.base.job.path module

class pyiron.base.job.path.JobPath(db, job_id=None, db_entry=None, user=None)[source]

Bases: pyiron.base.job.path.JobPathBase

The JobPath class is derived from the JobCore and is used as a lean version of the GenericJob class. Instead of loading the full pyiron object, the JobPath class only provides access to the HDF5 file, which should be sufficient for most analyses.

Parameters
  • db (DatabaseAccess) – database object

  • job_id (int) – Job ID - optional, but either a job ID or a database entry db_entry has to be provided.

  • db_entry (dict) – database entry {"job":, "subjob":, "projectpath":, "project":, "hamilton":, "hamversion":, "status":} and optional entries are {"id":, "masterid":, "parentid":}

  • user (str) – current unix/linux/windows user who is running pyiron
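
A minimal usage sketch, assuming an existing DatabaseAccess instance db and a valid job ID in that database; both names and the ID are illustrative placeholders::

    from pyiron.base.job.path import JobPath

    # db is assumed to be a connected DatabaseAccess object and 123 a valid
    # job ID in that database - both are placeholders for illustration.
    job = JobPath(db=db, job_id=123)

    # The lean job object exposes the stored HDF5 content without loading
    # the full GenericJob, e.g. its name and the stored groups/nodes.
    print(job.job_name)
    print(job.keys())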

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job; can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: is_root

boolean if the HDF5 object is located at the root level of the HDF5 file

.. attribute:: is_open

boolean if the HDF5 file is currently opened - if an active file handler exists

.. attribute:: is_empty

boolean if the HDF5 file is empty

.. attribute:: base_name

name of the HDF5 file but without any file extension

.. attribute:: file_path

directory where the HDF5 file is located

.. attribute:: h5_path

path inside the HDF5 file - also stored as absolute path

class pyiron.base.job.path.JobPathBase(job_path)[source]

Bases: pyiron.base.job.core.JobCore

The JobPath class is derived from the JobCore and is used as a lean version of the GenericJob class. Instead of loading the full pyiron object, the JobPath class only provides access to the HDF5 file, which should be sufficient for most analyses.

Parameters
  • db (DatabaseAccess) – database object

  • job_id (int) – Job ID - optional, but either a job ID or a database entry db_entry has to be provided.

  • db_entry (dict) – database entry {"job":, "subjob":, "projectpath":, "project":, "hamilton":, "hamversion":, "status":} and optional entries are {"id":, "masterid":, "parentid":}

  • user (str) – current unix/linux/windows user who is running pyiron
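
A minimal sketch for opening a job directly from its combined path, assuming job_path is the absolute file system path of the HDF5 file plus the path inside the file (the format described by the path attribute below); the concrete path is an illustrative placeholder::

    from pyiron.base.job.path import JobPathBase

    # Placeholder path: HDF5 file location plus the job group inside the file.
    job = JobPathBase(job_path="/home/user/my_project/my_job.h5/my_job")

    # Inspect what is stored for this job without loading the full pyiron object.
    print(job.keys())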

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job; can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: is_root

boolean if the HDF5 object is located at the root level of the HDF5 file

.. attribute:: is_open

boolean if the HDF5 file is currently opened - if an active file handler exists

.. attribute:: is_empty

boolean if the HDF5 file is empty

.. attribute:: base_name

name of the HDF5 file but without any file extension

.. attribute:: file_path

directory where the HDF5 file is located

.. attribute:: h5_path

path inside the HDF5 file - also stored as absolute path

property base_name

Name of the HDF5 file - but without the file extension .h5

Returns

file name without the file extension

Return type

str

close()[source]

Close the current HDF5 path and return to the path before the last open

create_group(name)[source]

Create an HDF5 group - similar to a folder in the filesystem - the HDF5 groups allow the users to structure their data.

Parameters

name (str) – name of the HDF5 group

Returns

FileHDFio object pointing to the new group

Return type

FileHDFio
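
A short sketch of structuring additional data inside the job's HDF5 file, assuming a JobPathBase instance job as above and that the returned FileHDFio object provides the same put() interface documented further below; the group name and key are illustrative::

    # Create - or enter, if it already exists - a custom group next to the
    # job's standard groups.
    analysis = job.create_group("my_analysis")

    # Store a value inside that group (put() on the returned FileHDFio object
    # is assumed to behave like the put() method documented below).
    analysis.put("energy_mean", 1.234)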

property file_path

Path where the HDF5 file is located - posixpath.dirname()

Returns

HDF5 file location

Return type

str

groups()[source]

Filter HDF5 file by groups

Returns

an HDF5 file which is filtered by groups

Return type

FileHDFio

property h5_path

Get the path in the HDF5 file starting from the root group - meaning this path starts with '/'

Returns

HDF5 path

Return type

str

property is_empty

Check if the HDF5 file is empty

Returns

[True/False]

Return type

bool

property is_root

Check if the current h5_path is pointing to the HDF5 root group.

Returns

[True/False]

Return type

bool

items()[source]

List all keys and values as items of all groups and nodes of the HDF5 file

Returns

list of tuples (key, value)

Return type

list

keys()[source]

List all groups and nodes of the HDF5 file - where groups are equivalent to directories and nodes to files.

Returns

all groups and nodes

Return type

list
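
A sketch of inspecting the stored content, assuming a JobPathBase instance job as above::

    # List the names of all groups and nodes stored for this job.
    print(job.keys())

    # Iterate over (key, value) pairs, e.g. to check the type of every entry.
    for key, value in job.items():
        print(key, type(value))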

list_dirs()[source]

equivalent to os.listdir (consider groups as equivalent to dirs)

Returns

list of groups in pytables for the path self.h5_path

Return type

list

listdirs()[source]

equivalent to os.listdir (consider groups as equivalent to dirs)

Returns

list of groups in pytables for the path self.h5_path

Return type

list

nodes()[source]

Filter HDF5 file by nodes

Returns

an HDF5 file which is filtered by nodes

Return type

FileHDFio
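
A sketch distinguishing directory-like groups from data nodes, assuming a JobPathBase instance job; that keys() on the filtered views lists only the matching entries is an assumption based on the descriptions above::

    # groups() and nodes() return views of the HDF5 file restricted to
    # groups and nodes, respectively (assumed filtering behavior).
    groups_view = job.groups()
    nodes_view = job.nodes()
    print(groups_view.keys())
    print(nodes_view.keys())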

open(h5_rel_path)[source]

Create an HDF5 group and enter this specific group. If the group already exists in the HDF5 file, only the h5_path is set accordingly; otherwise the group is created first.

Parameters

h5_rel_path (str) – relative path from the current HDF5 path - h5_path - to the new group

Returns

FileHDFio object pointing to the new group

Return type

FileHDFio
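
A sketch of navigating into a sub-group and back, assuming a JobPathBase instance job as above; the group name is illustrative::

    # Enter the group, creating it first if it does not exist yet; open()
    # returns an object pointing to that group.
    hdf_group = job.open("custom_output")

    # ... read or store data relative to the new h5_path ...

    # Step back to the HDF5 path that was active before the last open().
    hdf_group.close()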

put(key, value)[source]

Store data inside the HDF5 file

Parameters
  • key (str) – key to store the data

  • value (pandas.DataFrame, pandas.Series, dict, list, float, int) – basically any kind of data is supported
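
A minimal sketch of storing data, assuming a JobPathBase instance job as above; the keys and values are illustrative::

    import pandas as pd

    # Any of the supported types can be stored under a string key.
    job.put("temperatures", [300, 400, 500])
    job.put("summary", pd.DataFrame({"step": [0, 1], "energy": [-1.0, -1.2]}))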

remove_file()[source]

Remove the HDF5 file with all the related content

values()[source]

List all values for all groups and nodes of the HDF5 file

Returns

list of all values

Return type

list