leaspy.io.settings.algorithm_settings module
- class AlgorithmSettings(name: str, **kwargs)
Bases: object
Used to set the algorithms’ settings.
All parameters, except the choice of the algorithm, have default values; the user can overwrite any of them.
- Parameters:
- name : str
- The algorithm’s name. Must be in:
- For fit algorithms:
'mcmc_saem'
'lme_fit' (for LME model only)
- For personalize algorithms:
'scipy_minimize'
'mean_real'
'mode_real'
'constant_prediction' (for constant model only)
'lme_personalize' (for LME model only)
- For simulate algorithms:
'simulation'
- **kwargs : any
- Depending on the algorithm you are setting up, various parameters are possible (not exhaustive):
- seed : int, optional, default None
Used for stochastic algorithms.
- model_initialization_method : str, optional
For fit algorithms only: the model initialization method, among those available in initialize_parameters().
- algo_initialization_method : str, optional
Customize the algorithm initialization method, among those available for the given algorithm (refer to its documentation in leaspy.algo).
- n_iter : int, optional
Number of iterations. There is no stopping criterion for the MCMC SAEM algorithms.
- n_burn_in_iter : int, optional
Number of iterations in the burn-in phase, used for the MCMC SAEM algorithms.
- use_jacobian : bool, optional, default True
Used in the scipy_minimize algorithm to perform an L-BFGS minimization instead of Powell's algorithm.
- n_jobs : int, optional, default 1
Used in the scipy_minimize algorithm to accelerate calculation with parallel derivation using joblib.
- progress_barbool, optional, default True
Used to display a progress bar during computation.
- device : str or torch.device, optional
Specifies on which device the algorithm will run. Only 'cpu' and 'cuda' are supported for this argument. Only the 'mcmc_saem', 'mean_real' and 'mode_real' algorithms support this setting.
For the complete list of the available parameters for a given algorithm, please directly refer to its documentation.
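The default-overwriting behaviour described above can be sketched with plain dictionaries. This is a simplified illustration, not leaspy's actual implementation; the default values shown are the documented ones for 'scipy_minimize', and build_settings is a hypothetical helper name:

```python
# Simplified sketch: per-algorithm defaults merged with the user's keyword
# arguments. Anything the user does not set keeps its default value.
DEFAULTS = {
    'scipy_minimize': {'use_jacobian': True, 'n_jobs': 1, 'progress_bar': True},
}

def build_settings(name, **kwargs):
    """Hypothetical helper: start from the algorithm's defaults, let kwargs win."""
    parameters = dict(DEFAULTS.get(name, {}))
    parameters.update(kwargs)
    return {'name': name, 'parameters': parameters}

settings = build_settings('scipy_minimize', n_jobs=4, seed=42)
# n_jobs is overwritten; use_jacobian and progress_bar keep their defaults
```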
Notes
For developers: use
_dynamic_default_parameters
to dynamically set some default parameters depending on other parameters that were set, when these dynamic parameters were not explicitly given.
Example: you could set burn-in iterations or annealing iterations as fractions of a non-default number of iterations.
Format:
{algo_name: [ (functional_condition_to_trigger_dynamic_setting(kwargs), { nested_keys_of_dynamic_setting: dynamic_value(kwargs) }) ]}
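To make this format concrete, here is a hedged, stdlib-only sketch. The 90% burn-in ratio and the single-level nesting are illustrative assumptions, not leaspy's actual rules, and apply_dynamic_defaults is a hypothetical helper name:

```python
# Illustrative rule: when the user sets a non-default n_iter for 'mcmc_saem'
# without also setting n_burn_in_iter, derive the burn-in as 90% of n_iter.
_dynamic_default_parameters = {
    'mcmc_saem': [
        (
            lambda kw: 'n_iter' in kw and 'n_burn_in_iter' not in kw,
            {('n_burn_in_iter',): lambda kw: int(0.9 * kw['n_iter'])},
        )
    ]
}

def apply_dynamic_defaults(algo_name, kwargs):
    """Fill in dynamic defaults triggered by the parameters the user did set."""
    out = dict(kwargs)
    for condition, settings in _dynamic_default_parameters.get(algo_name, []):
        if condition(kwargs):
            for nested_keys, value_fn in settings.items():
                # single-level nesting shown here for simplicity
                out[nested_keys[-1]] = value_fn(kwargs)
    return out

apply_dynamic_defaults('mcmc_saem', {'n_iter': 2000})
# -> {'n_iter': 2000, 'n_burn_in_iter': 1800}
```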
- Attributes:
- name : str
The algorithm's name.
- model_initialization_method : str, optional
For fit algorithms: the model initialization method, among those available in initialize_parameters().
- algo_initialization_method : str, optional
The algorithm initialization method, among those available for the given algorithm (refer to its documentation in leaspy.algo).
- seed : int, optional, default None
Used for stochastic algorithms.
- parameters : dict
Contains the other parameters: n_iter, n_burn_in_iter, use_jacobian, n_jobs & progress_bar.
- logs : OutputsSettings, optional
Used to create a logs file during a model calibration containing convergence information.
- device : str (or torch.device), optional, default 'cpu'
Used to specify on which device the algorithm will run. This should be either 'cpu' or 'cuda' and is only supported by specific algorithms (those inheriting AlgoWithDeviceMixin). Note that specifying an indexed CUDA device (such as 'cuda:1') is not supported; to select a precise CUDA device index, use the CUDA_VISIBLE_DEVICES environment variable.
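Since indexed CUDA devices are not supported through the device attribute, selecting a specific GPU goes through the environment. This is an illustration only; the variable must be set before CUDA is initialized by torch:

```python
import os

# Make only physical GPU 1 visible to the process; device='cuda' will then
# run on that GPU. This must happen before torch initializes CUDA.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"
# ...then e.g. AlgorithmSettings('mcmc_saem', device='cuda')
```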
Methods
check_consistency()
Check internal consistency of algorithm settings and warn or raise a LeaspyAlgoInputError if not.
load(path_to_algorithm_settings)
Instantiate an AlgorithmSettings object from a json file.
save(path, **kwargs)
Save an AlgorithmSettings object in a json file.
set_logs([path])
Use this method to monitor the convergence of a model calibration.
- property algo_class
Class of the algorithm derived from its name (shorthand).
- check_consistency() → None
Check internal consistency of algorithm settings, and warn or raise a LeaspyAlgoInputError if inconsistent.
- classmethod load(path_to_algorithm_settings: str)
Instantiate an AlgorithmSettings object from a json file.
- Parameters:
- path_to_algorithm_settingsstr
Path of the json file.
- Returns:
AlgorithmSettings
An instance of AlgorithmSettings with the specified parameters.
- Raises:
LeaspyAlgoInputError
if anything is invalid in algo settings
Examples
>>> from leaspy import AlgorithmSettings
>>> leaspy_univariate = AlgorithmSettings.load('outputs/leaspy-univariate_model-settings.json')
- save(path: str, **kwargs)
Save an AlgorithmSettings object in a json file.
TODO? save leaspy version as well for retro/future-compatibility issues?
- Parameters:
- path : str
Path to store the AlgorithmSettings.
- **kwargs
Keyword arguments for json.dump method. Default: dict(indent=2)
Examples
>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings('scipy_minimize', seed=42)
>>> settings.save('outputs/scipy_minimize-settings.json')
- set_logs(path: str | None = None, **kwargs)
Use this method to monitor the convergence of a model calibration.
It creates graphs and csv files of the values of the population parameters (fixed effects) during the calibration.
- Parameters:
- path : str, optional
The path of the folder in which to store the graphs and csv files. If it is None, no data will be saved, and save_periodicity and plot_periodicity are ignored.
- **kwargs
- console_print_periodicity: int, optional, default 100
Display logs in the console/terminal every N iterations.
- save_periodicity: int, optional, default 50
Saves the values in csv files every N iterations.
- plot_periodicity: int, optional, default 1000
Generates plots from saved values every N iterations. Note that:
it should be a multiple of save_periodicity
setting too low a value (too frequent) will seriously slow down your calibration
- overwrite_logs_folder: bool, optional, default False
Set it to True to overwrite the content of the folder in path.
- Raises:
LeaspyAlgoInputError
If the folder given in path already exists and overwrite_logs_folder is set to False.
Notes
By default, if the folder given in path already exists, the method will raise an error. To overwrite the content of the folder, set overwrite_logs_folder to True.
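As a sketch of the periodicity interplay described above (logging_iterations is a hypothetical helper; the defaults are the documented values), every plot iteration should also be a save iteration, which is why plot_periodicity should be a multiple of save_periodicity:

```python
# Sketch: which iterations trigger csv saves and plot generation, under the
# documented defaults (save_periodicity=50, plot_periodicity=1000).
def logging_iterations(n_iter, save_periodicity=50, plot_periodicity=1000):
    assert plot_periodicity % save_periodicity == 0, \
        "plot_periodicity should be a multiple of save_periodicity"
    saves = [it for it in range(1, n_iter + 1) if it % save_periodicity == 0]
    plots = [it for it in range(1, n_iter + 1) if it % plot_periodicity == 0]
    return saves, plots

saves, plots = logging_iterations(2000)
# every plot iteration coincides with a save, so plots use fresh values
```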