model.rp_optimizer module

RosenPy: An Open Source Python Framework for Complex-Valued Neural Networks. Copyright © A. A. Cruz, K. S. Mayer, D. S. Arantes.

License

This file is part of RosenPy. RosenPy is an open source framework distributed under the terms of the GNU General Public License, as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. For additional information on license terms, please open the Readme.md file.

RosenPy is distributed in the hope that it will be useful to every user, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with RosenPy. If not, see <http://www.gnu.org/licenses/>.

class model.rp_optimizer.AMSGrad(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

AMSGrad optimizer.

This class implements the AMSGrad optimization algorithm, a variant of Adam that improves convergence in certain cases by keeping track of the maximum past squared gradient.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the AMSGrad optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
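
AMSGrad keeps a running maximum of the second-moment estimate so the effective per-parameter step size never grows. A minimal NumPy sketch of the standard update rule (illustrative only, not RosenPy's exact implementation; `amsgrad_step` is a hypothetical helper):

```python
import numpy as np

def amsgrad_step(theta, grad, m, v, vhat, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad step (hypothetical helper, not RosenPy's API)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    vhat = np.maximum(vhat, v)                # maximum past second moment
    theta = theta - lr * m / (np.sqrt(vhat) + eps)
    return theta, m, v, vhat
```

The only difference from Adam is the `np.maximum` line, which makes the denominator non-decreasing over time.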

class model.rp_optimizer.AdaGrad(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

AdaGrad optimizer.

This class implements the AdaGrad optimization algorithm, which adapts the learning rate of each parameter based on the accumulated squared gradients.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the AdaGrad optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
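
AdaGrad divides each gradient by the square root of the sum of all past squared gradients, so frequently updated parameters get smaller steps. A hedged NumPy sketch of the standard rule (`adagrad_step` is a hypothetical helper, not RosenPy's API):

```python
import numpy as np

def adagrad_step(theta, grad, accum, lr=0.01, eps=1e-8):
    """One AdaGrad step (illustrative only)."""
    accum = accum + grad ** 2                          # accumulate squared gradients
    theta = theta - lr * grad / (np.sqrt(accum) + eps)
    return theta, accum
```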

class model.rp_optimizer.Adam(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Adam optimizer.

This class implements the Adam optimization algorithm, which is an adaptive learning rate optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the Adam optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
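
Adam combines bias-corrected first and second moment estimates of the gradient. A minimal NumPy sketch of the standard update (illustrative only; `adam_step` is a hypothetical helper, not RosenPy's API):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step at iteration t >= 1 (illustrative only)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```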

class model.rp_optimizer.Adamax(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Adamax optimizer.

This class implements the Adamax optimization algorithm, a variant of Adam based on the infinity norm of the gradients.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the Adamax optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
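
Adamax replaces Adam's second moment with an exponentially weighted infinity norm of the gradients, which removes the need for bias correction on that term. A hedged NumPy sketch (`adamax_step` is a hypothetical helper, not RosenPy's API):

```python
import numpy as np

def adamax_step(theta, grad, m, u, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adamax step at iteration t >= 1 (illustrative only)."""
    m = beta1 * m + (1 - beta1) * grad           # first moment estimate
    u = np.maximum(beta2 * u, np.abs(grad))      # infinity-norm-based term
    theta = theta - (lr / (1 - beta1 ** t)) * m / (u + eps)
    return theta, m, u
```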

class model.rp_optimizer.CVAMSGrad(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Complex-Valued AMSGrad optimizer.

This class implements the complex-valued version of the AMSGrad optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued AMSGrad optimizer.

Parameters:

parameters : tuple

The parameters of the neural network.

gradients : tuple

The gradients of the loss function with respect to the parameters.

learning_rate : tuple

The learning rates for updating the parameters.

epoch : int

The current epoch number.

mt : tuple

The first moment estimates.

vt : tuple

The second moment estimates.

ut : tuple

The third moment estimates.

Returns:

tuple

The updated parameters along with the updated moment estimates.

class model.rp_optimizer.CVAdaGrad(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Complex-Valued AdaGrad optimizer.

This class implements the complex-valued version of the AdaGrad optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued AdaGrad optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.

class model.rp_optimizer.CVAdam(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Complex-Valued Adam optimizer.

This class implements the complex-valued version of the Adam optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued Adam optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
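
Complex-valued Adam variants commonly keep a complex first moment but accumulate the squared magnitude of the gradient for the (real-valued) second moment, so the denominator stays real and positive. A hedged sketch under that assumption (`cv_adam_step` is a hypothetical helper, not RosenPy's exact code):

```python
import numpy as np

def cv_adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One complex-valued Adam step (illustrative only)."""
    m = beta1 * m + (1 - beta1) * grad                  # complex first moment
    v = beta2 * v + (1 - beta2) * np.abs(grad) ** 2     # real second moment |g|^2
    m_hat = m / (1 - beta1 ** t)                        # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```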

class model.rp_optimizer.CVAdamax(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Complex-Valued Adamax optimizer.

This class implements the complex-valued version of the Adamax optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued Adamax optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.

class model.rp_optimizer.CVNadam(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Nadam

Complex-Valued Nadam optimizer.

This class implements the complex-valued version of the Nadam optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued Nadam optimizer.

Parameters:

parameters : list of arrays

The parameters of the neural network.

gradients : list of arrays

The gradients of the loss function with respect to the parameters.

learning_rate : float

The learning rate for updating the parameters.

epoch : int

The current epoch number.

mt : list of arrays

The first moment estimates.

vt : list of arrays

The second moment estimates.

ut : list of arrays

The third moment estimates.

Returns:

tuple

The updated parameters along with the updated moment estimates.

class model.rp_optimizer.CVRMSprop(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Complex-Valued RMSprop optimizer.

This class implements the complex-valued version of the RMSprop optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the complex-valued RMSprop optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.

class model.rp_optimizer.GradientDescent(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Gradient Descent optimizer.

This class implements the standard gradient descent optimization algorithm.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the gradient descent optimizer.

Parameters:

parameters : tuple

The parameters of the neural network.

gradients : tuple

The gradients of the loss function with respect to the parameters.

learning_rate : tuple

The learning rates for updating the parameters.

epoch : int

The current epoch number.

mt : tuple

The first moment estimates (not used in this optimizer).

vt : tuple

The second moment estimates (not used in this optimizer).

ut : tuple

The third moment estimates (not used in this optimizer).

Returns:

tuple

The updated parameters.
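
Vanilla gradient descent moves each parameter a fixed fraction of its gradient; the moment arguments are simply ignored. A one-line sketch (`gradient_descent_step` is a hypothetical helper, not RosenPy's API):

```python
def gradient_descent_step(theta, grad, lr):
    """One vanilla gradient descent step: theta <- theta - lr * grad."""
    return theta - lr * grad
```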

class model.rp_optimizer.Nadam(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

Nadam optimizer.

This class implements the Nadam optimization algorithm, which combines Adam with Nesterov momentum.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the Nadam optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
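
Nadam is Adam with a Nesterov-style lookahead applied to the first moment. A hedged NumPy sketch of the commonly used formulation (`nadam_step` is a hypothetical helper, not RosenPy's exact implementation):

```python
import numpy as np

def nadam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam step at iteration t >= 1 (illustrative only)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Nesterov-style lookahead mixes the corrected moment with the raw gradient
    m_nes = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * m_nes / (np.sqrt(v_hat) + eps)
    return theta, m, v
```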

class model.rp_optimizer.Optimizer(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: object

Base class for all optimizers used in the neural network.

This class defines common parameters and methods that can be used by all derived optimizers.

set_module(xp)[source]

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters of the neural network based on the gradients.

This is a placeholder method that should be implemented by subclasses.

Parameters:

parameters : tuple

The parameters of the neural network.

gradients : tuple

The gradients of the loss function with respect to the parameters.

learning_rate : tuple

The learning rates for updating the parameters.

epoch : int

The current epoch number.

mt : tuple

The first moment estimates.

vt : tuple

The second moment estimates.

ut : tuple

The third moment estimates.

Returns:

tuple

The updated parameters along with the updated moment estimates.
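
The pattern implied by this base class is that every concrete optimizer overrides update_parameters and returns the updated parameters together with the (possibly updated) moment estimates. A minimal stand-in sketch under that assumption (both classes here are illustrative, not RosenPy's exact code; the subclass uses a scalar learning rate for simplicity):

```python
import numpy as np

class Optimizer:
    """Minimal stand-in for the base class (illustrative only)."""

    def __init__(self, beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08):
        self.beta, self.beta1, self.beta2, self.epsilon = beta, beta1, beta2, epsilon
        self.xp = np  # default backend

    def set_module(self, xp):
        # Switch matrix operations between NumPy and CuPy.
        self.xp = xp

    def update_parameters(self, parameters, gradients, learning_rate, epoch, mt, vt, ut):
        # Placeholder: concrete optimizers implement the update rule.
        raise NotImplementedError


class PlainGradientDescent(Optimizer):
    """Hypothetical subclass showing the expected return shape."""

    def update_parameters(self, parameters, gradients, learning_rate, epoch, mt, vt, ut):
        updated = tuple(p - learning_rate * g for p, g in zip(parameters, gradients))
        # Moment estimates are passed through unchanged by this optimizer.
        return updated, mt, vt, ut
```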

class model.rp_optimizer.RMSprop(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

RMSprop optimizer.

This class implements the RMSprop optimization algorithm, which scales the learning rate by a running average of recent squared gradients.

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the RMSprop optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.
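
RMSprop keeps an exponential moving average of squared gradients and divides each step by its square root. A hedged NumPy sketch of the standard rule (`rmsprop_step` is a hypothetical helper, not RosenPy's API):

```python
import numpy as np

def rmsprop_step(theta, grad, v, lr=0.001, beta=0.9, eps=1e-8):
    """One RMSprop step (illustrative only)."""
    v = beta * v + (1 - beta) * grad ** 2        # running average of g^2
    theta = theta - lr * grad / (np.sqrt(v) + eps)
    return theta, v
```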

class model.rp_optimizer.SAMSGrad(beta=100, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]

Bases: Optimizer

set_module(xp)

Sets the backend module (NumPy or CuPy) for matrix operations.

Parameters:

xp : module

The backend module (NumPy or CuPy).

update_parameters(parameters, gradients, learning_rate, epoch, mt, vt, ut)[source]

Updates the parameters using the SAMSGrad optimizer.

Parameters:

Same as the parent class.

Returns:

tuple

The updated parameters along with the updated moment estimates.