pyiron.table.datamining module

class pyiron.table.datamining.FunctionContainer[source]

Bases: object

Class that can append, store, and retrieve a set of functions.
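
A minimal usage sketch, assuming the container is exposed as the add attribute of a pyiron table (documented below) and supports item assignment; the column names and the HDF5 output path are illustrative assumptions, not verbatim API:

# `table` is a PyironTable/TableJob instance; each registered function
# becomes one column of the resulting dataframe.
def get_energy_tot(job):
    # assumed output path, adjust to the quantity of interest
    return job["output/generic/energy_tot"][-1]

table.add["energy_tot"] = get_energy_tot
table.add["job_name"] = lambda job: job.job_name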

class pyiron.table.datamining.JobFilters[source]

Bases: object

A collection of predefined job filters.

static job_name_contains(job_name_segment)[source]
static job_type(job_type)[source]
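
A hedged sketch of how these filters are typically used, assuming each static method returns a callable suitable for the filter_function property of PyironTable (see below); the job type and name segment are illustrative values:

# `table` is a PyironTable instance; `table.filter` returns a JobFilters object
table.filter_function = table.filter.job_type("Vasp")
# or keep only jobs whose name contains a given segment
table.filter_function = table.filter.job_name_contains("relax")
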
class pyiron.table.datamining.PyironTable(project, name=None)[source]

Bases: object

Class for easy, efficient, and pythonic analysis of data from pyiron projects

Parameters
  • project (pyiron.project.Project/None) – The project to analyze

  • name (str) – Name of the pyiron table
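
A minimal end-to-end sketch, assuming an existing pyiron Project `pr`; the add attribute, the registered function, and the HDF5 path are illustrative assumptions rather than verbatim API:

from pyiron.table.datamining import PyironTable

table = PyironTable(project=pr, name="my_table")
# one function per desired column (see FunctionContainer above)
table.add["energy_tot"] = lambda job: job["output/generic/energy_tot"][-1]
# restrict to finished jobs before the expensive per-job functions run
table.filter_function = lambda job: job.status == "finished"
table.create_table()
df = table.get_dataframe()  # pandas.DataFrame with one row per job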

col_to_value(col_name)[source]
convert_dict(input_dict)[source]
create_table(enforce_update=False, level=3, file=None, job_status_list=None)[source]
property db_filter_function

Function to filter the project database table before job-specific functions are applied.

The function must take a pyiron project table in pandas.DataFrame format (as returned by project.job_table()) and return a boolean pandas.Series with the same number of rows as the project table.

Example

def function(df):
    return (df["chemicalformula"] == "H2") & (df["hamilton"] == "Vasp")
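
Assuming a PyironTable instance `table`, such a function would then be assigned directly to this property (a usage sketch, not taken from the source):

table.db_filter_function = function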

property filter

Object containing pre-defined filter functions

Returns

The object containing the filters

Return type

pyiron.table.datamining.JobFilters

property filter_function

Function to filter each job before more expensive functions are applied
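
A hedged example, assuming the function receives a single job object and returns a boolean; the status check and the name convention are illustrative assumptions:

def finished_relaxations(job):
    # keep only finished jobs whose name marks them as relaxations
    return job.status == "finished" and "relax" in job.job_name

table.filter_function = finished_relaxations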

from_hdf()[source]
get_dataframe()[source]
list_groups()[source]
list_nodes()[source]
load(name=None)[source]
property name

Name of the table; defaults to the project name if not specified.

Returns

Name of the table

Return type

str

refill_dict(diff_dict_lst)[source]
save(name=None)[source]
static str_to_value(input_val)[source]
to_hdf()[source]
static total_lst_of_keys(diff_dict_lst)[source]
class pyiron.table.datamining.TableJob(project, job_name)[source]

Bases: pyiron.base.job.generic.GenericJob

property add
property analysis_project
property convert_to_object
property db_filter_function
property enforce_update
property filter
property filter_function
from_hdf(hdf=None, group_name=None)[source]

Restore pyiron table job from HDF5

Parameters
  • hdf

  • group_name

get_dataframe()[source]
Returns

pandas.DataFrame

property project_level
property pyiron_table
property ref_project
run_static()[source]

The run_static() function is called by run() to execute the simulation.

to_hdf(hdf=None, group_name=None)[source]

Store pyiron table job in HDF5

Parameters
  • hdf

  • group_name

update_table(job_status_list=None)[source]

Update the pyiron table object: add new columns if new functions were added and add new rows for new jobs.

Parameters

job_status_list (list/None) – List of job statuses to include in the table; defaults to ["finished"]
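
A hedged call sketch, assuming a loaded TableJob instance `table_job`; the additional status value is illustrative:

table_job.update_table(job_status_list=["finished", "aborted"])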

validate_ready_to_run()[source]

Validate that the calculation is ready to be executed. By default no generic checks are performed, but one could check that the input information is complete or validate the consistency of the input at this point.

write_input()[source]

Write the input files for the external executable. This method has to be implemented in the individual hamiltonians.

pyiron.table.datamining.always_true(_)[source]

A function that always returns True, regardless of its argument.

Returns

True

Return type

bool

pyiron.table.datamining.always_true_pandas(job_table)[source]

A function that returns a pandas Series of all True values, with the same number of rows as the input dataframe

Parameters

job_table (pandas.DataFrame) – Input dataframe

Returns

A series of True values

Return type

pandas.Series