


IDLmloptGradientDescent


Optimization algorithms are used by neural networks to help minimize an error function by modifying the model’s internal learnable parameters. IDLmloptGradientDescent is a commonly used optimization algorithm in machine learning. It looks for a local minimum of a function by taking steps proportional to the negative of the gradient of the function at the current point. In this way, the algorithm follows the direction of the slope downhill until it reaches a local minimum or a saddle point.
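The update rule described above can be sketched outside of IDL. The following Python snippet is an illustration of the general gradient-descent technique, not the IDLmloptGradientDescent implementation; it minimizes f(x) = x², whose gradient is f'(x) = 2x, by repeatedly stepping against the gradient:

```python
# Minimal gradient-descent sketch (illustration only, not the IDL implementation).
# Repeatedly step against the gradient: x <- x - learning_rate * f'(x).
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)  # move downhill along the slope
    return x

# Minimize f(x) = x^2; its gradient is f'(x) = 2x, so the minimum is at x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
```

With each step, x shrinks by a constant factor (1 − 2·learning_rate), so the iterate converges toward the minimum at x = 0.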

Example


compile_opt idl2
; Create a gradient-descent optimizer with a learning rate of 0.1
Optimizer = IDLmloptGradientDescent(0.1)
; Invoke the optimizer as a standalone object and print the result
Print, Optimizer(0.1, 0.1, 0.1)

Note: While an optimizer can be used as a standalone object as shown above, it is more commonly passed to the IDLmlFeedForwardNeuralNetwork::Train() method via the OPTIMIZER keyword.

Syntax


Optimizer = IDLmloptGradientDescent(LearningRate)

Result = Optimizer(LearningRate)

Arguments


LearningRate

Specify the initial learning rate. If this value is too small, convergence may be slow; if it is too large, the algorithm may overshoot and miss the optimal solution.
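The sensitivity to the learning rate can be seen on a simple quadratic. The Python sketch below (an illustration, not IDL code) applies the gradient-descent update to f(x) = x² with a small and a deliberately too-large learning rate:

```python
# Effect of the learning rate on gradient descent for f(x) = x^2 (gradient 2x).
def descend(learning_rate, x0=1.0, steps=50):
    x = x0
    for _ in range(steps):
        x = x - learning_rate * 2 * x  # update: x <- x - lr * f'(x)
    return x

small = descend(0.1)  # each step scales x by 0.8: converges toward 0
large = descend(1.1)  # each step scales x by -1.2: |x| grows, diverges
```

With learning rate 0.1 the iterate decays toward the minimum at 0; with 1.1 each step overshoots the minimum and the magnitude of x grows without bound.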

Keywords


None

Version History


8.7.1

Introduced

See Also


IDLmloptAdam, IDLmloptMomentum, IDLmloptQuickProp, IDLmloptRMSProp



© 2019 Harris Geospatial Solutions, Inc. |  Legal