I am trying to implement the cost function of the unconstrained form of the binary soft-margin SVM optimization problem, which is given by g(θ) = f0(θ) + Σ_{j=1..n} fj(θ). The regularizer is f0(θ) = 0.5*||w||^2, and the per-example hinge terms are fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n. Their subgradients are ∇θ f0(θ) = (w; 0) (the bias b is not regularized), and ∇θ fj(θ) = −C*yj*xj if yj*θ'*xj < 1 and 0 otherwise.
I cannot figure out how to implement fj(θ) = C*max(0, 1 − yj*θ'*xj), j = 1, . . . , n, because I don't know how to compute the maximum. Is there a built-in function for finding the max?
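For what it's worth, here is a minimal sketch of the max operation, assuming Python with NumPy (the post does not name a language; in MATLAB the analogous call would be the elementwise `max(0, ...)`). The key point is that `np.maximum` takes the *elementwise* max of two arrays (broadcasting the scalar 0 against every margin), whereas `np.max` reduces a single array to one number:

```python
import numpy as np

# Illustrative margin values y_j * theta' * x_j (made-up numbers).
margins = np.array([1.5, 0.2, -0.7])

# Hinge: elementwise max(0, 1 - margin). np.maximum broadcasts the
# scalar 0 against each entry; np.max would collapse to a single value.
hinge = np.maximum(0.0, 1.0 - margins)
# → array([0. , 0.8, 1.7])
```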
y is a vector of size 105 by 1 containing the labels yj.
X is a matrix of size 105 by 3 whose rows xj are the feature vectors of the training data.
θ is a 3 by 1 vector, θ = (w b)', holding the parameters of the soft-margin binary SVM classifier.
C is just a scalar penalty parameter.
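Putting the pieces above together, here is a hedged sketch of the whole cost and subgradient, again assuming NumPy and assuming the third column of X is all ones so that the last entry of θ plays the role of b (that matches θ = (w b)' and the 105 by 3 shape, but the column ordering is my guess). The function name `svm_cost_and_subgrad` is made up for illustration:

```python
import numpy as np

def svm_cost_and_subgrad(theta, X, y, C):
    """Cost g(theta) of the unconstrained soft-margin SVM and one subgradient.

    theta : (3,) parameter vector, assumed ordered as (w1, w2, b)
    X     : (n, 3) feature matrix, last column assumed to be all ones
    y     : (n,) labels in {-1, +1}
    C     : scalar penalty parameter
    """
    w = theta[:2]                        # regularize w only, not the bias b
    margins = y * (X @ theta)            # (n,) vector of y_j * theta' * x_j
    hinge = np.maximum(0.0, 1.0 - margins)   # elementwise max with 0
    cost = 0.5 * np.dot(w, w) + C * np.sum(hinge)

    # Subgradient: -C * y_j * x_j for each violated margin (y_j theta' x_j < 1),
    # 0 otherwise, plus the gradient (w; 0) of the regularizer.
    active = margins < 1.0
    subgrad = np.concatenate([w, [0.0]]) \
        - C * (y[active, None] * X[active]).sum(axis=0)
    return cost, subgrad
```

With this vectorized form there is no loop over j = 1, . . . , n: all 105 margins are computed in one matrix-vector product, and the `active` boolean mask picks out exactly the examples whose subgradient term is nonzero.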
If there are any tips and tricks you could share, I would really appreciate it.