Optimize
inspeqtor.optimize
inspeqtor.optimize.get_default_optimizer
get_default_optimizer(
n_iterations: int,
) -> GradientTransformation
Generate the default optimizer from the number of training iterations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| n_iterations | int | Number of training iterations | required |
Returns:

| Type | Description |
|---|---|
| GradientTransformation | optax.GradientTransformation: Optax optimizer. |
Source code in src/inspeqtor/v1/optimize.py
inspeqtor.optimize.minimize
minimize(
params: ArrayTree,
func: Callable[[ndarray], tuple[ndarray, Any]],
optimizer: GradientTransformation,
lower: ArrayTree | None = None,
upper: ArrayTree | None = None,
maxiter: int = 1000,
callbacks: list[Callable] = [],
) -> tuple[ArrayTree, list[Any]]
Optimize the loss function with bounded parameters.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| params | ArrayTree | Initial parameters to be optimized | required |
| lower | ArrayTree | Lower bound of the parameters | None |
| upper | ArrayTree | Upper bound of the parameters | None |
| func | Callable[[ndarray], tuple[ndarray, Any]] | Loss function | required |
| optimizer | GradientTransformation | Instance of optax optimizer | required |
| maxiter | int | Number of optimization steps. Defaults to 1000. | 1000 |
Returns:

| Type | Description |
|---|---|
| tuple[ArrayTree, list[Any]] | tuple[chex.ArrayTree, list[typing.Any]]: Tuple of parameters and optimization history |
Source code in src/inspeqtor/v1/optimize.py
inspeqtor.optimize.stochastic_minimize
stochastic_minimize(
key: ndarray,
params: ArrayTree,
func: Callable[[ndarray, ndarray], tuple[ndarray, Any]],
optimizer: GradientTransformation,
lower: ArrayTree | None = None,
upper: ArrayTree | None = None,
maxiter: int = 1000,
callbacks: list[Callable] = [],
) -> tuple[ArrayTree, list[Any]]
Optimize the loss function with bounded parameters, where the loss additionally consumes a PRNG key at each step.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| key | ndarray | The jax random key | required |
| params | ArrayTree | Initial parameters to be optimized | required |
| lower | ArrayTree | Lower bound of the parameters | None |
| upper | ArrayTree | Upper bound of the parameters | None |
| func | Callable[[ndarray, ndarray], tuple[ndarray, Any]] | Loss function taking a PRNG key and the parameters | required |
| optimizer | GradientTransformation | Instance of optax optimizer | required |
| maxiter | int | Number of optimization steps. Defaults to 1000. | 1000 |
Returns:

| Type | Description |
|---|---|
| tuple[ArrayTree, list[Any]] | tuple[chex.ArrayTree, list[typing.Any]]: Tuple of parameters and optimization history |
Source code in src/inspeqtor/v1/optimize.py
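Judging from the signature, the difference from `minimize` is that the loss receives a PRNG key, so it can be noisy or draw minibatches. A hypothetical loss of that shape (the noise model is purely illustrative):

```python
import jax
import jax.numpy as jnp

def noisy_loss(key, x):
    # A loss matching stochastic_minimize's expected signature: it consumes a
    # PRNG key and returns a (loss, aux) pair.
    noise = 0.1 * jax.random.normal(key, x.shape)
    loss = jnp.sum((x - 1.0) ** 2) + jnp.sum(noise * x)
    return loss, {"noise_scale": 0.1}

loss, aux = noisy_loss(jax.random.PRNGKey(0), jnp.zeros(3))
```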
inspeqtor.optimize.fit_gaussian_process
fit_gaussian_process(D: Dataset)
Fit the Gaussian process given an instance of Dataset.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| D | Dataset | The dataset to fit | required |
Returns:

| Type | Description |
|---|---|
|  | tuple[]: description |
Source code in src/inspeqtor/v2/optimize.py
inspeqtor.optimize.predict_with_gaussian_process
predict_with_gaussian_process(
x, posterior: ConjugatePosterior, D: Dataset
) -> tuple[ndarray, ndarray]
Source code in src/inspeqtor/v2/optimize.py
inspeqtor.optimize.predict_mean_and_std
predict_mean_and_std(
x: ndarray, D: Dataset
) -> tuple[ndarray, ndarray]
Predict a Gaussian distribution at the given points x using the dataset D.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | ndarray | The array of points at which to evaluate the Gaussian process. | required |
| D | Dataset | The dataset containing observations from the real process. | required |
Returns:

| Type | Description |
|---|---|
| tuple[ndarray, ndarray] | tuple[jnp.ndarray, jnp.ndarray]: The arrays of mean and standard deviation of the Gaussian process at the given points |
Source code in src/inspeqtor/v2/optimize.py
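For intuition, here is a from-scratch sketch of a GP posterior mean/std with a fixed squared-exponential kernel. inspeqtor builds on a fitted `ConjugatePosterior` (via `fit_gaussian_process`), so the kernel, lengthscale, and noise level below are assumptions, not the library's values.

```python
import jax.numpy as jnp

def rbf_kernel(x1, x2, lengthscale=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    d = x1[:, None] - x2[None, :]
    return jnp.exp(-0.5 * (d / lengthscale) ** 2)

def predict_mean_and_std(x, x_obs, y_obs, noise=1e-6):
    # Exact GP posterior with a fixed kernel; inspeqtor's version works from a
    # fitted posterior and a Dataset, so treat this as a sketch only.
    K = rbf_kernel(x_obs, x_obs) + noise * jnp.eye(x_obs.shape[0])
    Kq = rbf_kernel(x, x_obs)
    mean = Kq @ jnp.linalg.solve(K, y_obs)
    var = 1.0 - jnp.sum(Kq * jnp.linalg.solve(K, Kq.T).T, axis=1)
    return mean, jnp.sqrt(jnp.clip(var, 1e-12))

x_obs = jnp.array([0.0, 0.5, 1.0])
y_obs = jnp.sin(x_obs)
mean, std = predict_mean_and_std(jnp.array([0.5]), x_obs, y_obs)
```

At an observed point the posterior mean reproduces the observation and the standard deviation collapses toward zero, which is the behavior the Bayesian optimization routines below rely on.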
inspeqtor.optimize.expected_improvement
expected_improvement(
y_best: ndarray,
posterior_mean: ndarray,
posterior_var: ndarray,
exploration_factor: float,
) -> ndarray
The expected improvement calculated using the posterior mean and variance of the Gaussian process. The exploration factor can be adjusted to balance exploration against exploitation.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| y_best | ndarray | The current maximum value of y | required |
| posterior_mean | ndarray | The posterior mean of the Gaussian process | required |
| posterior_var | ndarray | The posterior variance of the Gaussian process | required |
| exploration_factor | float | The factor that balances exploration against exploitation. Set to 0.0 to maximize exploitation. | required |
Returns:

| Type | Description |
|---|---|
| ndarray | jnp.ndarray: The expected improvement corresponding to the points given by the posterior arrays. |
Source code in src/inspeqtor/v2/optimize.py
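The standard closed-form expected improvement for maximization is EI = d·Φ(z) + σ·φ(z) with z = d/σ and d = μ − y_best − ξ, where ξ is the exploration factor. Whether inspeqtor uses exactly this parameterization is an assumption based on the docstring; a sketch:

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

def expected_improvement(y_best, posterior_mean, posterior_var, exploration_factor):
    # Closed-form EI for maximization. A larger exploration_factor discounts
    # the mean, pushing the acquisition toward high-variance regions.
    sigma = jnp.sqrt(posterior_var)
    d = posterior_mean - y_best - exploration_factor
    z = d / sigma
    return d * norm.cdf(z) + sigma * norm.pdf(z)

ei = expected_improvement(0.0, jnp.array([0.0, 1.0]), jnp.array([1.0, 1.0]), 0.0)
```

With μ = y_best and σ = 1 the formula reduces to φ(0) ≈ 0.399, and EI grows as the posterior mean rises above the incumbent.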
inspeqtor.optimize.BayesOptState
The dataclass holding the optimization state for the Gaussian process.
Source code in src/inspeqtor/v2/optimize.py
inspeqtor.optimize.init_opt_state
init_opt_state(x, y, control) -> BayesOptState
Function to initialize the optimizer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | ndarray | The input arguments | required |
| y | ndarray | The observations corresponding to the inputs | required |
| control | _type_ | The instance of the control sequence. | required |
Returns:

| Name | Type | Description |
|---|---|---|
| BayesOptState | BayesOptState | The state of the optimizer. |
Source code in src/inspeqtor/v2/optimize.py
inspeqtor.optimize.suggest_next_candidates
suggest_next_candidates(
key: ndarray,
opt_state: BayesOptState,
sample_size: int = 1000,
num_suggest: int = 1,
exploration_factor: float = 0.0,
) -> ndarray
Sample new candidates for the next experiment using expected improvement.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| key | ndarray | The jax random key | required |
| opt_state | BayesOptState | The current optimizer state | required |
| sample_size | int | The internal number of samples. Defaults to 1000. | 1000 |
| num_suggest | int | The number of suggestions for the next experiment. Defaults to 1. | 1 |
| exploration_factor | float | The factor that balances exploration against exploitation. Set to 0.0 to maximize exploitation. Defaults to 0.0. | 0.0 |
Returns:

| Type | Description |
|---|---|
| ndarray | jnp.ndarray: The suggested data points to evaluate in the experiment. |
Source code in src/inspeqtor/v2/optimize.py
inspeqtor.optimize.add_observations
add_observations(
opt_state: BayesOptState, x, y
) -> BayesOptState
Function to update the optimization state using new data points x and y
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| opt_state | BayesOptState | The current optimization state | required |
| x | ndarray | The input arguments | required |
| y | ndarray | The observations corresponding to the inputs | required |
Returns:

| Name | Type | Description |
|---|---|---|
| BayesOptState | BayesOptState | The updated optimization state. |
Source code in src/inspeqtor/v2/optimize.py
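Taken together, `init_opt_state` → `suggest_next_candidates` → `add_observations` form a standard Bayesian optimization loop: fit a posterior to the data seen so far, maximize expected improvement over candidates, observe the suggestion, repeat. The self-contained sketch below mirrors that cycle with a fixed-kernel GP standing in for the fitted posterior; the kernel, grid-based candidate pool, and toy objective are all illustrative assumptions.

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

def rbf(x1, x2, lengthscale=0.2):
    d = x1[:, None] - x2[None, :]
    return jnp.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(x_obs, y_obs, x_query, noise=1e-4):
    # Exact GP posterior mean/std with a fixed kernel (inspeqtor instead fits
    # the posterior via fit_gaussian_process).
    K = rbf(x_obs, x_obs) + noise * jnp.eye(x_obs.shape[0])
    Kq = rbf(x_query, x_obs)
    mean = Kq @ jnp.linalg.solve(K, y_obs)
    var = 1.0 - jnp.sum(Kq * jnp.linalg.solve(K, Kq.T).T, axis=1)
    return mean, jnp.sqrt(jnp.clip(var, 1e-12))

def expected_improvement(y_best, mean, std):
    d = mean - y_best
    z = d / std
    return d * norm.cdf(z) + std * norm.pdf(z)

def objective(x):
    return -((x - 0.7) ** 2)  # stands in for the experiment; maximum at x = 0.7

# init_opt_state: seed the state with a few observed points.
xs = jnp.array([0.1, 0.5, 0.9])
ys = objective(xs)
candidates = jnp.linspace(0.0, 1.0, 201)  # plays the role of sample_size
for _ in range(10):
    mean, std = gp_predict(xs, ys, candidates)
    ei = expected_improvement(ys.max(), mean, std)
    x_next = candidates[jnp.argmax(ei)]      # suggest_next_candidates
    xs = jnp.append(xs, x_next)              # run the experiment, then
    ys = jnp.append(ys, objective(x_next))   # add_observations
```

After a handful of iterations the best observed point concentrates near the true maximum, which is the behavior the inspeqtor state-update cycle is designed to produce.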