leaspy.algo.personalize.scipy_minimize module
- class ScipyMinimize(settings)
Bases: AbstractPersonalizeAlgo
Gradient descent based algorithm to compute individual parameters, i.e. personalize a model to a given set of subjects.
- Parameters:
- settings : AlgorithmSettings
Settings of the algorithm. In particular, the parameter custom_scipy_minimize_params may contain keyword arguments passed to scipy.optimize.minimize().
- Attributes:
- scipy_minimize_params : dict
Keyword arguments to be passed to scipy.optimize.minimize(). A default setting is applied depending on whether the jacobian is used or not (cf. ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN and ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN). You may customize it by setting the custom_scipy_minimize_params algorithm parameter (see the sketch after this list).
- format_convergence_issues : str
Format used to display convergence issues. It should be a format string that may use any of the following variables:
patient_id: str
optimization_result_pformat: str
(optimization_result_obj: dict-like)
Cf. ScipyMinimize.DEFAULT_FORMAT_CONVERGENCE_ISSUES for the default format. You may customize it by setting the custom_format_convergence_issues algorithm parameter.
- logger : None or callable str -> None
The function used to display convergence issues returned by scipy.optimize.minimize(). By default, convergence issues are printed if and only if the BFGS optimization method is not used. You can customize it at initialization by defining a logger attribute on your AlgorithmSettings instance.
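A minimal customization sketch for these parameters. It assumes that AlgorithmSettings forwards extra keyword arguments to the algorithm's parameters (as it does for other algorithm parameters); check this behaviour, and the exact parameter names, against your Leaspy version. Parameters may also be provided through a settings file loaded with AlgorithmSettings.load.
>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings(
...     "scipy_minimize",
...     use_jacobian=True,
...     # passed verbatim to scipy.optimize.minimize()
...     custom_scipy_minimize_params={"method": "BFGS", "options": {"gtol": 1e-3, "maxiter": 500}},
...     # format string using the variables listed above
...     custom_format_convergence_issues="<!> {patient_id}:\n{optimization_result_pformat}",
... )
>>> # Optional: route convergence messages somewhere other than stdout
>>> settings.logger = lambda msg: print(msg)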
Methods

is_jacobian_implemented(model)
    Check that the jacobian of the model is implemented.
load_parameters(parameters)
    Update the algorithm's parameters with the ones in the given dictionary.
obj(x, model, dataset, with_gradient)
    Objective loss function to minimize in order to get the patient's individual parameters.
run(model, *args[, return_loss])
    Main method, run the algorithm.
run_impl(model, dataset)
    Main personalize function, wraps the abstract _get_individual_parameters() method.
set_output_manager(output_settings)
    Set a FitOutputManager object for the run of the algorithm.

- DEFAULT_FORMAT_CONVERGENCE_ISSUES = '<!> {patient_id}:\n{optimization_result_pformat}'
- DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN = {'method': 'Powell', 'options': {'ftol': 0.0001, 'maxiter': 200, 'xtol': 0.0001}}
- DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN = {'method': 'BFGS', 'options': {'gtol': 0.01, 'maxiter': 200}}
- is_jacobian_implemented(model: AbstractModel) → bool
Check that the jacobian of model is implemented.
- load_parameters(parameters: dict)
Update the algorithm's parameters with the ones in the given dictionary. Keys that do not belong to the algorithm's parameters are ignored.
- Parameters:
- parametersdict
Contains the pairs (key, value) of the wanted parameters
Examples
>>> settings = leaspy.io.settings.algorithm_settings.AlgorithmSettings("mcmc_saem")
>>> my_algo = leaspy.algo.fit.tensor_mcmcsaem.TensorMCMCSAEM(settings)
>>> my_algo.algo_parameters
{'n_iter': 10000, 'n_burn_in_iter': 9000, 'eps': 0.001, 'L': 10, 'sampler_ind': 'Gibbs', 'sampler_pop': 'Gibbs', 'annealing': {'do_annealing': False, 'initial_temperature': 10, 'n_plateau': 10, 'n_iter': 200}}
>>> parameters = {'n_iter': 5000, 'n_burn_in_iter': 4000}
>>> my_algo.load_parameters(parameters)
>>> my_algo.algo_parameters
{'n_iter': 5000, 'n_burn_in_iter': 4000, 'eps': 0.001, 'L': 10, 'sampler_ind': 'Gibbs', 'sampler_pop': 'Gibbs', 'annealing': {'do_annealing': False, 'initial_temperature': 10, 'n_plateau': 10, 'n_iter': 200}}
- obj(x: list, model: AbstractModel, dataset: Dataset, with_gradient: bool)
Objective loss function to minimize in order to get the patient's individual parameters.
- Parameters:
- x : list[torch.Tensor]
Individual standardized parameters. At initialization,
x = [xi_mean/xi_std, tau_mean/tau_std] (+ [0.] * n_sources if the model is multivariate)
- model
AbstractModel
Model used to compute the group average parameters.
- dataset
Dataset
A dataset instance for the single patient being optimized.
- with_gradient : bool
If True, return the 2-tuple (objective, gradient_objective); otherwise, return only the objective.
- Returns:
- objective : float
Value of the loss function (negative log-likelihood).
- if with_gradient is True:
2-tuple (objective, gradient), as expected by scipy.optimize.minimize() when jac=True (see the toy sketch below):
objective : float
gradient : array-like[float] of length n_dims_params
- Raises:
LeaspyAlgoInputError
If the noise model is not currently supported by the algorithm.
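The (objective, gradient) return convention above matches what scipy.optimize.minimize() expects when jac=True. A self-contained toy sketch (not Leaspy code) illustrating that convention, using the same default BFGS options as DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN:
>>> import numpy as np
>>> from scipy.optimize import minimize
>>> def toy_obj(x):
...     # Same convention as obj() with with_gradient=True: return (objective, gradient)
...     return float(np.sum(x ** 2)), 2 * x
>>> res = minimize(toy_obj, x0=np.ones(3), jac=True, method='BFGS', options={'gtol': 0.01, 'maxiter': 200})
>>> bool(res.success)
True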
- output_manager: FitOutputManager | None
- run(model: AbstractModel, *args, return_loss: bool = False, **extra_kwargs) → Any
Main method, run the algorithm.
TODO fix proper abstract class method: input depends on algorithm… (esp. simulate != from others…)
- Parameters:
- model
AbstractModel
The used model.
- dataset
Dataset
Contains all the subjects’ observations with corresponding timepoints, in torch format to speed up computations.
- return_loss : bool (default False), keyword only
Should the algorithm return main output and optional loss output as a 2-tuple?
- Returns:
- Depends on algorithm class: TODO change?
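In practice, this algorithm is usually not instantiated by hand; it is selected through the high-level Leaspy API. A hedged sketch of typical usage, where leaspy_obj (a fitted Leaspy instance) and data (a leaspy Data object) are hypothetical names, and the use_jacobian keyword should be checked against your Leaspy version:
>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings('scipy_minimize', use_jacobian=True)
>>> individual_parameters = leaspy_obj.personalize(data, settings)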
- run_impl(model: AbstractModel, dataset: Dataset) → Tuple[IndividualParameters, Tensor]
Main personalize function, wraps the abstract _get_individual_parameters() method.
- Parameters:
- model
AbstractModel
A subclass object of leaspy AbstractModel.
- dataset
Dataset
Dataset object built with the Leaspy class objects Data, algo & model.
- Returns:
- individual_parameters : IndividualParameters
Contains individual parameters.
- noise_std : float or torch.FloatTensor
The estimated noise (a tensor if model.noise_model is 'gaussian_diagonal'):
noise_std = \sqrt{ \frac{1}{n_{visits} \times n_{dim}} \sum_{i,j} \varepsilon_{i,j} }
where \varepsilon_{i,j} = \left( f(\theta, (z_{i,j}), (t_{i,j})) - y_{i,j} \right)^2, \theta are the model's fixed effects, (z_{i,j}) the model's random effects, (t_{i,j}) the time-points and f the model's estimator.
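A toy numerical illustration (not Leaspy code) of the root-mean-square definition of noise_std reconstructed above, using hypothetical predictions f(\theta, z, t) and observations y:
>>> import torch
>>> predictions = torch.tensor([[0.20, 0.50], [0.40, 0.70]])    # model estimates f(theta, z, t)
>>> observations = torch.tensor([[0.25, 0.45], [0.35, 0.75]])   # observed values y
>>> noise_std = torch.sqrt(torch.mean((predictions - observations) ** 2))
>>> round(float(noise_std), 2)
0.05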
- set_output_manager(output_settings: OutputsSettings) → None
Set a FitOutputManager object for the run of the algorithm.
- Parameters:
- output_settings
OutputsSettings
Contains the log settings for the computation run (console print periodicity, plot periodicity, …).
Examples
>>> from leaspy import AlgorithmSettings
>>> from leaspy.io.settings.outputs_settings import OutputsSettings
>>> from leaspy.algo.fit.tensor_mcmcsaem import TensorMCMCSAEM
>>> algo_settings = AlgorithmSettings("mcmc_saem")
>>> my_algo = TensorMCMCSAEM(algo_settings)
>>> settings = {'path': 'brouillons', 'console_print_periodicity': 50, 'plot_periodicity': 100, 'save_periodicity': 50}
>>> my_algo.set_output_manager(OutputsSettings(settings))