leaspy.algo.personalize.scipy_minimize.ScipyMinimize
- class ScipyMinimize(settings)
Bases: AbstractPersonalizeAlgo
Gradient descent based algorithm to compute individual parameters, i.e. personalize a model to a given set of subjects.
- Parameters
- settings
AlgorithmSettings
Settings of the algorithm. In particular, the parameter custom_scipy_minimize_params may contain keyword arguments passed to scipy.optimize.minimize().
- Attributes
- scipy_minimize_params : dict
Keyword arguments to be passed to scipy.optimize.minimize(). A default setting, depending on whether the jacobian is used or not, is applied (cf. ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN and ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN).
You may customize it by setting the custom_scipy_minimize_params algorithm parameter.
- format_convergence_issues : str
Formatting of convergence issues. It should be a formattable string using any of these variables:
patient_id: str
optimization_result_pformat: str
(optimization_result_obj: dict-like)
cf. ScipyMinimize.DEFAULT_FORMAT_CONVERGENCE_ISSUES for the default format. You may customize it by setting the custom_format_convergence_issues algorithm parameter.
- logger : None or callable str -> None
The function used to display convergence issues returned by scipy.optimize.minimize(). By default, convergence issues are printed if and only if the optimization method is not BFGS. You can customize it at initialization by defining a logger attribute on your AlgorithmSettings instance.
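To make the role of scipy_minimize_params concrete, here is a minimal sketch of how such a dictionary can be forwarded to scipy.optimize.minimize(). The parameter values below are illustrative assumptions mirroring the shape of the defaults, not Leaspy's actual defaults, and the objective is a toy stand-in for the per-patient loss.

```python
from scipy.optimize import minimize

# Hypothetical keyword arguments, mirroring the shape of the
# DEFAULT_SCIPY_MINIMIZE_PARAMS_* dictionaries (exact defaults may differ):
scipy_minimize_params = {
    "method": "Powell",
    "options": {"xtol": 1e-4, "ftol": 1e-4, "maxiter": 200},
}

# A toy quadratic objective standing in for the per-patient loss:
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

# The algorithm forwards the custom params to scipy.optimize.minimize() like so:
res = minimize(objective, x0=[0.0, 0.0], **scipy_minimize_params)
print(res.success, res.x)
```

Any keyword accepted by scipy.optimize.minimize() (e.g. method, options, tol) can be supplied this way.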
Methods
is_jacobian_implemented(model)
Check that the jacobian of the model is implemented.
load_parameters(parameters)
Update the algorithm's parameters by the ones in the given dictionary.
obj(x, *args)
Objective loss function to minimize in order to get the patient's individual parameters.
run(model, *args[, return_noise])
Main method, run the algorithm.
run_impl(model, dataset)
Main personalize function, wraps the abstract _get_individual_parameters() method.
set_output_manager(output_settings)
Set a FitOutputManager object for the run of the algorithm.
- load_parameters(parameters: dict)
Update the algorithm's parameters by the ones in the given dictionary. Keys that do not belong to the algorithm's parameters are ignored.
- Parameters
- parameters : dict
Contains the (key, value) pairs of the wanted parameters.
Examples
>>> settings = leaspy.io.settings.algorithm_settings.AlgorithmSettings("mcmc_saem")
>>> my_algo = leaspy.algo.fit.tensor_mcmcsaem.TensorMCMCSAEM(settings)
>>> my_algo.algo_parameters
{'n_iter': 10000, 'n_burn_in_iter': 9000, 'eps': 0.001, 'L': 10, 'sampler_ind': 'Gibbs', 'sampler_pop': 'Gibbs', 'annealing': {'do_annealing': False, 'initial_temperature': 10, 'n_plateau': 10, 'n_iter': 200}}
>>> parameters = {'n_iter': 5000, 'n_burn_in_iter': 4000}
>>> my_algo.load_parameters(parameters)
>>> my_algo.algo_parameters
{'n_iter': 5000, 'n_burn_in_iter': 4000, 'eps': 0.001, 'L': 10, 'sampler_ind': 'Gibbs', 'sampler_pop': 'Gibbs', 'annealing': {'do_annealing': False, 'initial_temperature': 10, 'n_plateau': 10, 'n_iter': 200}}
- property log_noise_fmt
Getter
- Returns
- format : str
The format used to print the loss.
- obj(x, *args)
Objective loss function to minimize in order to get the patient's individual parameters.
- Parameters
- x : array-like [float]
Individual standardized parameters. At initialization,
x = [xi_mean/xi_std, tau_mean/tau_std] (+ [0.] * n_sources if multivariate model)
- *args
- model : AbstractModel
Model used to compute the group average parameters.
- timepoints : torch.Tensor [1, n_tpts]
Contains the individual ages corresponding to the given values.
- values : torch.Tensor [n_tpts, n_fts [, extra_dim_for_ordinal_model]]
Contains the individual true scores corresponding to the given times, with nans.
- with_gradient : bool
If True: return (objective, gradient_objective).
Else: simply return objective.
- Returns
- objective : float
Value of the loss function (opposite of log-likelihood).
- if with_gradient is True:
2-tuple (objective, gradient), as expected by scipy.optimize.minimize() when jac=True:
objective : float
gradient : array-like [float] of length n_dims_params
- Raises
LeaspyAlgoInputError
If the noise model is not currently supported by the algorithm. TODO: everything that is not generic here concerning the noise structure should be handled by the model/NoiseModel directly.
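As a sketch of the with_gradient=True contract (with a toy objective, not Leaspy's actual loss): the callable returns a (loss, gradient) 2-tuple, which scipy.optimize.minimize() consumes when jac=True.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for obj(x, *args) with with_gradient=True: return the loss
# and its analytical gradient as a 2-tuple, as expected when jac=True.
def obj_with_grad(x):
    loss = float(np.sum(x ** 2))  # opposite of a (toy) log-likelihood
    grad = 2.0 * x                # gradient, length n_dims_params
    return loss, grad

res = minimize(obj_with_grad, x0=np.array([3.0, -1.5]), jac=True, method="BFGS")
print(res.x)
```

Providing the gradient this way avoids scipy's finite-difference approximation of the jacobian.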
- run(model: AbstractModel, *args, return_noise: bool = False, **extra_kwargs) -> Any
Main method, run the algorithm.
TODO fix proper abstract class method: input depends on algorithm… (esp. simulate != from others…)
- Parameters
- model : AbstractModel
The used model.
- dataset : Dataset
Contains all the subjects' observations with corresponding timepoints, in torch format to speed up computations.
- return_noise : bool (default False), keyword only
Should the algorithm return the main output and the optional noise output as a 2-tuple?
- Returns
- Depends on algorithm class: TODO change?
- run_impl(model, dataset)
Main personalize function, wraps the abstract _get_individual_parameters() method.
- Parameters
- model : AbstractModel
A subclass object of leaspy AbstractModel.
- dataset : Dataset
Dataset object built with leaspy class objects Data, algo & model.
- Returns
- individual_parameters : IndividualParameters
Contains individual parameters.
- noise_std : float or torch.FloatTensor
The estimated noise (a tensor if model.noise_model is 'gaussian_diagonal'):
\sigma = \frac{1}{n_{visits} \times n_{dim}} \sqrt{\sum_{i, j \in [1, n_{visits}] \times [1, n_{dim}]} \varepsilon_{i,j}}
where \varepsilon_{i,j} = \left( f(\theta, (z_{i,j}), (t_{i,j})) - y_{i,j} \right)^2, \theta are the model's fixed effects, (z_{i,j}) the model's random effects, (t_{i,j}) the time-points and f the model's estimator.
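A minimal numeric sketch of such a noise estimate, assuming the convention sigma = sqrt(sum of eps_ij) / (n_visits * n_dim), where eps_ij are the squared residuals between predictions and observations and NaN observations are ignored. estimate_noise_std is a hypothetical helper for illustration, not part of Leaspy.

```python
import numpy as np

# Hypothetical helper: RMS-style noise estimate over a [n_visits, n_dim]
# grid of predictions and (possibly NaN-holed) observations.
def estimate_noise_std(predictions: np.ndarray, observations: np.ndarray) -> float:
    eps = (predictions - observations) ** 2   # squared residuals eps_ij
    mask = ~np.isnan(observations)            # ignore missing observations
    n_visits, n_dim = observations.shape
    return float(np.sqrt(np.sum(eps[mask])) / (n_visits * n_dim))
```

For a perfect fit the estimate is 0; for the 'gaussian_diagonal' noise model the same reduction would instead be applied per feature, yielding a tensor.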
- set_output_manager(output_settings)
Set a FitOutputManager object for the run of the algorithm.
- Parameters
- output_settings : OutputsSettings
Contains the logs settings for the computation run (console print periodicity, plot periodicity …)
Examples
>>> from leaspy import AlgorithmSettings
>>> from leaspy.io.settings.outputs_settings import OutputsSettings
>>> from leaspy.algo.fit.tensor_mcmcsaem import TensorMCMCSAEM
>>> algo_settings = AlgorithmSettings("mcmc_saem")
>>> my_algo = TensorMCMCSAEM(algo_settings)
>>> settings = {'path': 'brouillons',
...             'console_print_periodicity': 50,
...             'plot_periodicity': 100,
...             'save_periodicity': 50}
>>> my_algo.set_output_manager(OutputsSettings(settings))