Introduction to Gradient Descent Algorithm

Gradient descent is an optimization algorithm used to minimise a function. The function to be minimised is called the objective function; for machine learning, the objective function is also termed the cost function or loss function. The loss function, simply speaking, measures the squared difference between actual values and predictions. Minimising the objective function means finding the most optimal values of its parameters from a large or infinite parameter space, and gradient descent is used to find the parameter / weight values that minimise the loss function. Before getting into an example of gradient descent, let's understand in detail what gradient descent is and how to use it.

What is Gradient Descent?

The gradient of a function at any point is the direction of steepest increase, or ascent, of the function at that point. For illustration, look at the following diagram and identify the arrow that represents the gradient.

Fig 1. (diagram)

Is it not the direction of arrow A that represents the gradient?
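The idea above can be sketched in a few lines of code: repeatedly compute the gradient of a squared-error loss and step the parameter in the opposite direction. This is a minimal illustration, not the article's own code; the data, the single-weight linear model y = w * x, and the learning rate are all assumptions made for the example.

```python
import numpy as np

# Hypothetical data for a simple linear model y = w * x (assumed for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])  # generated with the true weight w = 2

w = 0.0    # initial parameter (weight)
lr = 0.01  # learning rate (step size), chosen arbitrarily for this sketch

for _ in range(1000):
    pred = w * x
    # Loss: mean squared difference between actual values and predictions.
    # Its gradient with respect to w: d/dw mean((w*x - y)^2) = mean(2*x*(w*x - y))
    grad = np.mean(2 * x * (pred - y))
    # Move against the gradient: the gradient points uphill (steepest ascent),
    # so subtracting it steps toward the minimum of the loss.
    w -= lr * grad

print(round(w, 3))  # w converges close to the true value 2.0
```

Because the gradient is the direction of steepest ascent, the minus sign in the update is what makes this "descent": each step reduces the loss until the weight settles near the value that best fits the data.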