Stochastic gradient descent (SGD)

The objective function is

$$J(\theta) = \left[ \frac{1}{n} \sum_{i=1}^{n} \text{Loss}_h\!\left( y^{(i)} \, (\theta \cdot x^{(i)}) \right) \right] + \frac{\lambda}{2} \|\theta\|^2$$

where $\text{Loss}_h(z) = \max\{0,\, 1 - z\}$ is the hinge loss function, $(x^{(i)}, y^{(i)})$ for $i = 1, \dots, n$ are the training examples, and $y^{(i)} \in \{1, -1\}$ is the label for the feature vector $x^{(i)}$.
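
For concreteness, here is a minimal NumPy sketch of this objective; the variable names (`X`, `y`, `lam`, `theta`) are my own and not part of the problem statement:

```python
import numpy as np

def hinge_loss(z):
    # Loss_h(z) = max{0, 1 - z}, applied elementwise
    return np.maximum(0.0, 1.0 - z)

def objective(theta, X, y, lam):
    # J(theta) = (1/n) * sum_i Loss_h(y_i * (theta . x_i)) + (lam / 2) * ||theta||^2
    margins = y * (X @ theta)  # y^(i) * (theta . x^(i)) for every example
    return hinge_loss(margins).mean() + 0.5 * lam * np.dot(theta, theta)
```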

How do I find the stochastic gradient of this objective with respect to $\theta$? When $y^{(i)} (\theta \cdot x^{(i)}) \le 1$, is it $y^{(i)} x^{(i)}$, and is it $0$ when $y^{(i)} (\theta \cdot x^{(i)}) > 1$?
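
Roughly: the hinge loss is not differentiable exactly at the margin, so SGD uses a subgradient. For a single example, the loss term contributes $-y^{(i)} x^{(i)}$ to the subgradient when $y^{(i)}(\theta \cdot x^{(i)}) \le 1$ and $0$ otherwise, while the regularizer contributes $\lambda \theta$ in both cases; stepping against that subgradient then adds $\eta \, y^{(i)} x^{(i)}$ to $\theta$ in the first case. A minimal sketch of one update, with an illustrative learning rate `eta` that is not part of the problem statement:

```python
import numpy as np

def sgd_step(theta, x_i, y_i, lam, eta):
    # Subgradient of J at one example: the hinge term contributes
    # -y_i * x_i when the margin y_i * (theta . x_i) <= 1, and 0 otherwise;
    # the regularizer contributes lam * theta either way.
    if y_i * np.dot(theta, x_i) <= 1:
        grad = -y_i * x_i + lam * theta
    else:
        grad = lam * theta
    return theta - eta * grad  # step against the (sub)gradient
```

So a margin-violating example pushes $\theta$ in the direction $y^{(i)} x^{(i)}$, matching the perceptron-like intuition in the question, while a correctly classified example with margin greater than $1$ only feels the regularizer.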
