What exactly is convergence rate referring to in machine learning?
My understanding of the term Convergence Rate is as follows:
The rate at which the maximum/minimum of a function is reached; so in logistic regression, the rate at which gradient descent reaches the global minimum.
So by convergence rate I am guessing it is a measure of one of the following:
- the time measured from the start of gradient descent until it reaches the global minimum;
- the average distance the model moves downhill (I do not know the technical term) per iteration.
Can someone verify whether one of my guesses is true, and if not, explain what it means?
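For context, here is roughly how I picture measuring it: a minimal sketch in plain NumPy, where the data, learning rate, and stopping tolerance are all made up for illustration.

```python
# Minimal sketch: watch how quickly gradient descent drives down the logistic loss.
# The data, learning rate, and tolerance below are made up for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data: 200 samples, 2 features.
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -3.0])
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    # Mean negative log-likelihood (logistic loss), clipped for numerical safety.
    p = np.clip(sigmoid(X @ w), 1e-12, 1 - 1e-12)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

w = np.zeros(2)
lr = 0.1      # learning rate (step size)
tol = 1e-6    # stop when the loss improves by less than this
losses = [loss(w)]

for it in range(1, 10001):
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)  # gradient of the mean logistic loss
    w -= lr * grad
    losses.append(loss(w))
    if losses[-2] - losses[-1] < tol:           # loss barely changed this iteration
        break

print(f"stopped after {it} iterations, final loss {losses[-1]:.4f}")
# The sequence of per-iteration losses (how quickly it shrinks) is what I am
# trying to pin down as the "convergence rate".
```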
Topic convergence logistic-regression python machine-learning
Category Data Science