Newton's Method: A Nonlinear Optimization Study (#1160)

Date of Conference

July 17-19, 2024

Published In

"Sustainable Engineering for a Diverse, Equitable, and Inclusive Future at the Service of Education, Research, and Industry for a Society 5.0."

Location of Conference

Costa Rica

Authors

Caytuiro Tapia, Cesar

Pastro, Cristian Roberto

Abstract

Neural networks are currently the subject of intense study and research, serving as tools for much academic work. For a neural network to function properly, it must undergo a process called training, which essentially consists of minimizing the network's error. One of the most widely used algorithms is gradient descent, known as a first-order method because it uses the gradient vector of the function (the vector of first derivatives). Although functional, this method may take a long time to converge, in many cases requiring second-order methods such as Newton's method, which uses the Hessian matrix (the matrix of second derivatives). This work presents a comparison of the convergence of the two methods, exploring both the effectiveness and the speed of the optimization.
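To make the first-order versus second-order contrast concrete, here is a minimal sketch (an illustration, not the paper's implementation) that minimizes an ill-conditioned quadratic with both methods; the objective, learning rate, and step counts are assumptions chosen for demonstration. Gradient descent shrinks the poorly conditioned direction slowly, while Newton's method, by solving against the Hessian, reaches the minimizer of a quadratic in a single step.

```python
import numpy as np

# Illustrative example (not the paper's implementation): minimize the
# ill-conditioned quadratic f(x) = 0.5 * x^T A x. Its gradient is A x
# (first derivatives) and its Hessian is the constant matrix A
# (second derivatives).

A = np.diag([1.0, 100.0])  # condition number 100

def grad(x):
    return A @ x  # gradient vector

def hessian(x):
    return A  # Hessian matrix

def gradient_descent(x, lr=0.009, steps=100):
    # First-order method: repeatedly step along the negative gradient.
    # lr must stay below 2 / largest eigenvalue (here 2/100) for stability.
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def newton(x, steps=1):
    # Second-order method: solve H d = g, then step by -d.
    for _ in range(steps):
        x = x - np.linalg.solve(hessian(x), grad(x))
    return x

x0 = np.array([1.0, 1.0])
print(gradient_descent(x0))  # ~[0.40, 0.00]: the flat direction decays slowly
print(newton(x0))            # [0.0, 0.0]: exact minimizer in one Newton step
```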
