leaspy.algo.personalize.scipy_minimize module

class ScipyMinimize(settings)

Bases: AbstractPersonalizeAlgo

Gradient-descent-based algorithm to compute individual parameters, i.e. to personalize a model to a given set of subjects.

Parameters:
settings : AlgorithmSettings

Settings of the algorithm. In particular, the parameter custom_scipy_minimize_params may contain keyword arguments passed to scipy.optimize.minimize().

Attributes:
scipy_minimize_params : dict

Keyword arguments to be passed to scipy.optimize.minimize(). A default setting is applied depending on whether the jacobian is used or not (cf. ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN and ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN).

You may customize it by setting the custom_scipy_minimize_params algorithm parameter.
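
For instance, a minimal sketch switching to the Nelder-Mead method; this assumes algorithm parameters can be passed as keyword arguments to AlgorithmSettings, and the option names follow scipy.optimize.minimize() (the values here are purely illustrative):

>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings(
...     'scipy_minimize',
...     custom_scipy_minimize_params={
...         'method': 'Nelder-Mead',
...         'options': {'xatol': 1e-4, 'maxiter': 500},  # hypothetical values
...     },
... )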

format_convergence_issues : str

Formatting of convergence issues. It should be a format string that may use any of these variables:

  • patient_id: str

  • optimization_result_pformat: str

  • (optimization_result_obj: dict-like)

cf. ScipyMinimize.DEFAULT_FORMAT_CONVERGENCE_ISSUES for the default format. You may customize it by setting the custom_format_convergence_issues algorithm parameter.
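
For example, a terser single-line format using the same variables (again assuming the parameter can be passed through AlgorithmSettings):

>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings(
...     'scipy_minimize',
...     custom_format_convergence_issues='{patient_id}: {optimization_result_pformat}',
... )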

logger : None or callable str -> None

The function used to display convergence issues returned by scipy.optimize.minimize(). By default, convergence issues are printed if and only if the BFGS optimization method is not used. You can customize it at initialization by defining a logger attribute on your AlgorithmSettings instance.
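
A sketch of a custom logger that routes issues to the warnings module instead of printing, set as an attribute on the settings instance as described above:

>>> import warnings
>>> from leaspy import AlgorithmSettings
>>> settings = AlgorithmSettings('scipy_minimize')
>>> settings.logger = lambda msg: warnings.warn(msg)  # callable str -> None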

Methods

is_jacobian_implemented(model)

Check whether the jacobian of the model is implemented.

load_parameters(parameters)

Update the algorithm's parameters with the ones in the given dictionary.

obj(x, model, dataset, with_gradient)

Objective loss function to minimize in order to get the patient's individual parameters.

run(model, *args[, return_loss])

Main method; runs the algorithm.

run_impl(model, dataset)

Main personalize function, wrapping the abstract _get_individual_parameters() method.

set_output_manager(output_settings)

Set a FitOutputManager object for the run of the algorithm.

DEFAULT_FORMAT_CONVERGENCE_ISSUES = '<!> {patient_id}:\n{optimization_result_pformat}'
DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN = {'method': 'Powell', 'options': {'ftol': 0.0001, 'maxiter': 200, 'xtol': 0.0001}}
DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN = {'method': 'BFGS', 'options': {'gtol': 0.01, 'maxiter': 200}}
deterministic: bool = False
family: str = 'personalize'
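
To make the without-jacobian defaults above concrete, here is the equivalent direct scipy.optimize.minimize() call on a purely illustrative quadratic objective:

>>> import numpy as np
>>> from scipy.optimize import minimize
>>> result = minimize(lambda x: np.sum((x - 1.0) ** 2), x0=np.zeros(2),
...                   method='Powell',
...                   options={'ftol': 0.0001, 'maxiter': 200, 'xtol': 0.0001})
>>> bool(result.success)
True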
is_jacobian_implemented(model: AbstractModel) → bool

Check whether the jacobian of the model is implemented.

load_parameters(parameters: dict)

Update the algorithm's parameters with the ones in the given dictionary. Keys that do not belong to the algorithm's parameter keys are ignored.

Parameters:
parameters : dict

Contains the (key, value) pairs of the desired parameters.

Examples

>>> import leaspy
>>> settings = leaspy.io.settings.algorithm_settings.AlgorithmSettings("mcmc_saem")
>>> my_algo = leaspy.algo.fit.tensor_mcmcsaem.TensorMCMCSAEM(settings)
>>> my_algo.algo_parameters
{'n_iter': 10000,
 'n_burn_in_iter': 9000,
 'eps': 0.001,
 'L': 10,
 'sampler_ind': 'Gibbs',
 'sampler_pop': 'Gibbs',
 'annealing': {'do_annealing': False,
  'initial_temperature': 10,
  'n_plateau': 10,
  'n_iter': 200}}
>>> parameters = {'n_iter': 5000, 'n_burn_in_iter': 4000}
>>> my_algo.load_parameters(parameters)
>>> my_algo.algo_parameters
{'n_iter': 5000,
 'n_burn_in_iter': 4000,
 'eps': 0.001,
 'L': 10,
 'sampler_ind': 'Gibbs',
 'sampler_pop': 'Gibbs',
 'annealing': {'do_annealing': False,
  'initial_temperature': 10,
  'n_plateau': 10,
  'n_iter': 200}}
name: str = 'scipy_minimize'
obj(x: list, model: AbstractModel, dataset: Dataset, with_gradient: bool)

Objective loss function to minimize in order to get the patient's individual parameters.

Parameters:
x : list[torch.Tensor]

Individual standardized parameters. At initialization, x = [xi_mean/xi_std, tau_mean/tau_std] (plus [0.] * n_sources for a multivariate model).

model : AbstractModel

Model used to compute the group average parameters.

dataset : Dataset

A dataset instance for the single patient being optimized.

with_gradient : bool

If True, return the 2-tuple (objective, gradient_objective); otherwise, return only the objective.

Returns:
objective : float

Value of the loss function (negative log-likelihood).

If with_gradient is True, a 2-tuple is returned instead (as expected by scipy.optimize.minimize() when jac=True):

  • objective : float

  • gradient : array-like[float] of length n_dims_params

Raises:
LeaspyAlgoInputError

If the noise model is not currently supported by the algorithm.
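
The with_gradient=True convention mirrors what scipy.optimize.minimize() expects when jac=True: a callable returning a (value, gradient) pair. An illustrative toy objective (not the leaspy internals):

>>> import numpy as np
>>> from scipy.optimize import minimize
>>> def fun(x):
...     # returns (objective, gradient), as obj() does with with_gradient=True
...     return float(np.sum(x ** 2)), 2.0 * x
>>> res = minimize(fun, x0=np.ones(3), jac=True,
...                method='BFGS', options={'gtol': 0.01, 'maxiter': 200})
>>> res.x.shape
(3,)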

output_manager: FitOutputManager | None
run(model: AbstractModel, *args, return_loss: bool = False, **extra_kwargs) → Any

Main method; runs the algorithm.

TODO fix proper abstract class method: input depends on algorithm… (esp. simulate != from others…)

Parameters:
model : AbstractModel

The used model.

dataset : Dataset

Contains all the subjects’ observations with corresponding timepoints, in torch format to speed up computations.

return_loss : bool (default False), keyword only

Should the algorithm return main output and optional loss output as a 2-tuple?

Returns:
Depends on algorithm class: TODO change?
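
For this class, a personalization run could look as follows; model and dataset are hypothetical placeholders standing in for an already-calibrated model and the subjects' data:

>>> from leaspy import AlgorithmSettings
>>> from leaspy.algo.personalize.scipy_minimize import ScipyMinimize
>>> algo = ScipyMinimize(AlgorithmSettings('scipy_minimize'))
>>> individual_parameters, loss = algo.run(model, dataset, return_loss=True)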
run_impl(model: AbstractModel, dataset: Dataset) → Tuple[IndividualParameters, Tensor]

Main personalize function, wrapping the abstract _get_individual_parameters() method.

Parameters:
model : AbstractModel

A subclass object of leaspy AbstractModel.

dataset : Dataset

Dataset object built from leaspy Data, algorithm, and model objects.

Returns:
individual_parameters : IndividualParameters

Contains individual parameters.

noise_std : float or torch.FloatTensor

The estimated noise (a tensor if model.noise_model is 'gaussian_diagonal'):

noise\_std = \frac{1}{n_{visits} \times n_{dim}} \sqrt{\sum_{i, j \in [1, n_{visits}] \times [1, n_{dim}]} \varepsilon_{i,j}}

where \varepsilon_{i,j} = \left( f(\theta, (z_{i,j}), (t_{i,j})) - y_{i,j} \right)^2, \theta are the model's fixed effects, (z_{i,j}) the model's random effects, (t_{i,j}) the time-points and f the model's estimator.
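
A direct transcription of this formula in torch, with random tensors standing in for the predictions f and the observations y:

>>> import torch
>>> _ = torch.manual_seed(0)
>>> n_visits, n_dim = 5, 3
>>> predictions = torch.rand(n_visits, n_dim)   # f(theta, (z_ij), (t_ij))
>>> observations = torch.rand(n_visits, n_dim)  # y_ij
>>> eps = (predictions - observations) ** 2     # epsilon_ij
>>> noise_std = eps.sum().sqrt() / (n_visits * n_dim)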

set_output_manager(output_settings: OutputsSettings) → None

Set a FitOutputManager object for the run of the algorithm.

Parameters:
output_settings : OutputsSettings

Contains the log settings for the computation run (console print periodicity, plot periodicity, …).

Examples

>>> from leaspy import AlgorithmSettings
>>> from leaspy.io.settings.outputs_settings import OutputsSettings
>>> from leaspy.algo.fit.tensor_mcmcsaem import TensorMCMCSAEM
>>> algo_settings = AlgorithmSettings("mcmc_saem")
>>> my_algo = TensorMCMCSAEM(algo_settings)
>>> settings = {'path': 'brouillons',
...             'console_print_periodicity': 50,
...             'plot_periodicity': 100,
...             'save_periodicity': 50}
>>> my_algo.set_output_manager(OutputsSettings(settings))