multilayer_optimizer

qubit_approximant.core.optimizer.multilayer_optimizer

Incremental optimizer

Classes

MultilayerOptimizer: Base class for the optimization of circuits with multiple layers.

NonIncrementalOptimizer: This optimizer creates new initial parameters for the optimization of a circuit with an additional layer.

IncrementalOptimizer: This optimizer uses the parameters of an optimized L-layer circuit as input for the optimization of an (L+1)-layer circuit.

IncrementalOptimizer(min_layer, max_layer, optimizer, new_layer_coef, new_layer_position)

Bases: MultilayerOptimizer

This optimizer uses the parameters of an optimized L-layer circuit as input for the optimization of an (L+1)-layer circuit.

Attributes:

Name Type Description
new_layer_position str

The position where to add the parameters of the new layer. For example, it may be the initial or final layer of our circuit.

Parameters:

Name Type Description Default
min_layer int

Starting number of layers to optimize.

required
max_layer int

Final number of layers to optimize.

required
optimizer Optimizer

The optimizer used to find the optimum parameters.

required
new_layer_coef float

The coefficient that multiplies the normal distribution of the new parameters in the additional layer.

required
new_layer_position str

The position where to add the parameters of the new layer. For example, it may be the initial or final layer of our circuit.

required
Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
def __init__(
    self,
    min_layer,
    max_layer,
    optimizer: Optimizer,
    new_layer_coef: float,
    new_layer_position: str,
) -> None:
    """
    Initialize an incremental multilayer optimizer.

    Parameters
    ----------
    min_layer : int
        Starting number of layers to optimize.
    max_layer : int
        Final number of layers to optimize.
    optimizer : Optimizer
        The optimizer used to find the optimum parameters.
    new_layer_coef : float
        The coefficient that multiplies the normal distribution of the
        new parameters in the additional layer.
    new_layer_position : str
        The position where to add the parameters of the new layer. For
        example, it may be the initial or final layer of our circuit.
    """
    if new_layer_position in IncrementalOptimizer.layer_positions:
        self.new_layer_position = new_layer_position
    else:
        raise ValueError(
            f"new_layer_position = {new_layer_position} is not supported. "
            "Try 'initial', 'middle', 'final' or 'random'"
        )
    super().__init__(min_layer, max_layer, optimizer, new_layer_coef)
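
Example (an illustrative sketch, not taken from the package documentation): my_optimizer stands for any Optimizer instance from qubit_approximant, and cost, grad_cost and params_per_layer are placeholders for the cost function, its gradient and the number of parameters each circuit layer uses.

import numpy as np

incremental = IncrementalOptimizer(
    min_layer=3,
    max_layer=6,
    optimizer=my_optimizer,        # any Optimizer instance (placeholder)
    new_layer_coef=0.3,
    new_layer_position="final",    # new parameters are appended as the last layer
)

# Initial guess for the 3-layer circuit (params_per_layer is a placeholder).
init_params = np.random.default_rng(0).standard_normal(3 * params_per_layer)

# One optimized parameter vector per layer count: 3, 4, 5 and 6 layers.
params_list = incremental(cost, grad_cost, init_params)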

inital_params_diff: tuple[list[float], list[float]] property

Returns the means and standard deviations of the differences between the optimum parameters of the i-th layer circuit and those of the (i+1)-th layer circuit. (The additional parameters added with the new layer are excluded.)

Returns:

Type Description
tuple[list[float], list[float]]

Mean and standard deviation of the parameter differences.

Raises:

Type Description
ValueError

Parameter difference only supported for new initial and final layers.
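
A minimal sketch of how this property might be inspected (it assumes the incremental instance from the example above has already been called, so its parameter lists are populated):

means, stds = incremental.inital_params_diff  # property name as spelled in the source
for n_layer, (mean, std) in enumerate(zip(means, stds), start=incremental.min_layer):
    print(f"{n_layer} -> {n_layer + 1} layers: mean diff {mean:.3e}, std {std:.3e}")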

__call__(cost, grad_cost, init_params)

Calculate the optimized parameters for each number of layers.

Parameters:

Name Type Description Default
cost Callable

Cost function to be minimized.

required
grad_cost Callable

Gradient of the cost function.

required
init_params NDArray

Initial parameter guess for the cost function; used to initialize the optimizer.

required

Returns:

Type Description
list[NDArray]

The optimum parameters for each number of layers.

Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
def __call__(self, cost: Callable, grad_cost: Callable, init_params: NDArray) -> list[NDArray]:
    """Calculate the optimized parameters for each number of layers.

    Parameters
    ----------
    cost : Callable
        Cost function to be minimized.
    grad_cost : Callable
        Gradient of the cost function.
    init_params : NDArray
        Initial parameter guess for the cost function; used to initialize the optimizer.

    Returns
    -------
    list[NDArray]
        The optimum parameters for each number of layers.
    """
    self.params_layer = init_params.size // self.min_layer
    params = init_params
    self.params_list = []

    for layer in range(self.min_layer, self.max_layer + 1):
        params = self.optimizer(cost, grad_cost, params)
        self.params_list.append(params)
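        # Seed the next optimization: keep the current optimum and add scaled
        # random parameters for the new layer (handled by _new_initial_params).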
        params = self._new_initial_params(params, layer)
    return self.params_list

MultilayerOptimizer(min_layer, max_layer, optimizer, new_layer_coef=0.3)

Bases: ABC

Base class for the optimization of circuits with multiple layers.

Attributes:

Name Type Description
min_layer int

Starting number of layers to optimize.

max_layer int

Final number of layers to optimize.

optimizer Optimizer

The optimizer used to find the optimum parameters.

new_layer_coef float

The coefficient that multiplies the normal distribution of the new parameters in the additional layer.

Parameters:

Name Type Description Default
min_layer int

Starting number of layers to optimize.

required
max_layer int

Final number of layers to optimize.

required
optimizer Optimizer

The optimizer used to find the optimum parameters.

required
new_layer_coef float

The coefficient that multiplies the normal distribution of the new parameters in the additional layer.

0.3
Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
def __init__(self, min_layer, max_layer, optimizer: Optimizer, new_layer_coef: float = 0.3):
    """
    Initialize a multilayer optimizer.

    Parameters
    ----------
    min_layer : int
        Starting number of layers to optimize.
    max_layer : int
        Final number of layers to optimize.
    optimizer : Optimizer
        The optimizer used to find the optimum parameters.
    new_layer_coef : float
        The coefficient that multiplies the normal distribution of the
        new parameters in the additional layer.
    """
    self.min_layer = min_layer
    self.max_layer = max_layer
    self.optimizer = optimizer
    self.new_layer_coef = new_layer_coef

__call__(cost, grad_cost, init_params) abstractmethod

Calculate the optimized parameters for each number of layers.

Parameters:

Name Type Description Default
cost Callable

Cost function to be minimized.

required
grad_cost Callable

Gradient of the cost function.

required
init_params NDArray

Initial parameter guess for the cost function; used to initialize the optimizer.

required

Returns:

Type Description
list of NDArray

The optimum parameters for each number of layers.

Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
@abstractmethod
def __call__(self, cost: Callable, grad_cost: Callable, init_params: NDArray) -> list[NDArray]:
    """
    Calculate the optimized parameters for each number of layers.

    Parameters
    ----------
    cost : Callable
        Cost function to be minimized.
    grad_cost : Callable
        Gradient of the cost function.
    init_params : NDArray
        Initial parameter guess for the cost function; used to initialize the optimizer.

    Returns
    -------
    list of NDArray
        The optimum parameters for each number of layers.
    """
    ...
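
Since MultilayerOptimizer is abstract, a concrete subclass only needs to implement __call__ and return one optimized parameter vector per layer count. The sketch below is hypothetical (the class name AppendRandomOptimizer is not part of the package); it essentially reproduces the behaviour of IncrementalOptimizer with a final new-layer position.

import numpy as np

class AppendRandomOptimizer(MultilayerOptimizer):
    """Toy subclass: grows the circuit by appending scaled random parameters."""

    def __call__(self, cost, grad_cost, init_params):
        params_layer = init_params.size // self.min_layer  # parameters per layer
        rng = np.random.default_rng()
        self.params_list = []
        params = init_params
        for _ in range(self.min_layer, self.max_layer + 1):
            params = self.optimizer(cost, grad_cost, params)
            self.params_list.append(params)
            # Extend the optimum with a new layer's worth of random parameters.
            new_layer = self.new_layer_coef * rng.standard_normal(params_layer)
            params = np.concatenate((params, new_layer))
        return self.params_list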

NonIncrementalOptimizer(min_layer, max_layer, optimizer, new_layer_coef)

Bases: MultilayerOptimizer

This optimizer creates new initial parameters for the optimization of a circuit with an additional layer.

Parameters:

Name Type Description Default
min_layer int

Starting number of layers to optimize.

required
max_layer int

Final number of layers to optimize.

required
optimizer Optimizer

The optimizer used to find the optimum parameters.

required
new_layer_coef float

The coefficient that multiplies the normal distribution of the new parameters in the additional layer.

required
Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
def __init__(self, min_layer, max_layer, optimizer: Optimizer, new_layer_coef: float):
    """
    Initialize a non-incremental optimizer.

    Parameters
    ----------
    min_layer : int
        Starting number of layers to optimize.
    max_layer : int
        Final number of layers to optimize.
    optimizer : Optimizer
        The optimizer used to find the optimum parameters.
    new_layer_coef : float
        The coefficient that multiplies the normal distribution of the
        new parameters in the additional layer.
    """
    super().__init__(min_layer, max_layer, optimizer, new_layer_coef)

__call__(cost, grad_cost, init_params)

Calculate the optimized parameters for each number of layers.

Parameters:

Name Type Description Default
cost Callable

Cost function to be minimized.

required
grad_cost Callable

Gradient of the cost function.

required
init_params NDArray

Initial parameter guess for the cost function; used to initialize the optimizer.

required

Returns:

Type Description
list[NDArray]

The optimum parameters for each number of layers.

Source code in qubit_approximant/core/optimizer/multilayer_optimizer.py
def __call__(self, cost: Callable, grad_cost: Callable, init_params: NDArray) -> list[NDArray]:
    """
    Calculate the optimized parameters for each number of layers.

    Parameters
    ----------
    cost : Callable
        Cost function to be minimized.
    grad_cost : Callable
        Gradient of the cost function.
    init_params : NDArray
        Initial parameter guess for the cost function; used to initialize the optimizer.

    Returns
    -------
    list[NDArray]
        The optimum parameters for each number of layers.
    """
    self.params_layer = init_params.size // self.min_layer
    self.params_list = []
    params = init_params
    rng = np.random.default_rng()

    for layer in range(self.min_layer, self.max_layer + 1):
        params = self.optimizer(cost, grad_cost, params)
        self.params_list.append(params)
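        # Unlike IncrementalOptimizer, the previous optimum is discarded: the next,
        # larger circuit starts again from fresh scaled random parameters.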
        params = self.new_layer_coef * rng.standard_normal((layer + 1) * self.params_layer)
    return self.params_list
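
Usage mirrors IncrementalOptimizer, minus the new_layer_position argument (again a sketch; my_optimizer, cost, grad_cost and init_params are the same placeholders as in the example above):

non_incremental = NonIncrementalOptimizer(
    min_layer=3, max_layer=6, optimizer=my_optimizer, new_layer_coef=0.3
)
params_list = non_incremental(cost, grad_cost, init_params)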