fitclinear appears to use sgd solver even when sparsa is specified
I'm training some SVMs on moderate-dimensional data (a few thousand observations by at most a few hundred features) using the following core function call:
model = fitclinear(X_subset(train_idx,:),...
In order to assess the degree to which performance depends on the features jointly, rather than individually, I add them in one by one and re-run the classifier. The accuracy curve that I get looks like this (red and blue are two different experimental replicates):
As you can see, once the number of features passes 100, accuracy drops dramatically and also becomes much more stochastic. I presume this is because, on the back end, MATLAB is switching the solver from sparsa to sgd, as hinted at by the docs but not explicitly stated. For a second set of data that I have, where the overall accuracy is higher (~80%), the effect is still present but not as dramatic.
Is there a way to prevent MATLAB from switching to sgd? I will try passing in the coefficients from the nfeatures=100 model as a warm start to the subsequent models, but even if that fixes my specific problem, this feels like a bug more generally worth reporting.
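For concreteness, this is roughly what I mean by the warm start (the label vector y and the 100/101-feature split are placeholders for my actual data; 'Beta' and 'Bias' are the initial-estimate options as I understand them from the fitclinear docs):
% Fit the first 100 features, pinning the solver explicitly
model100 = fitclinear(X_subset(train_idx,1:100), y(train_idx), ...
    'Learner','svm', 'Regularization','lasso', 'Solver','sparsa');
% Warm-start the 101-feature fit from the 100-feature solution,
% padding Beta with a zero for the newly added feature
model101 = fitclinear(X_subset(train_idx,1:101), y(train_idx), ...
    'Learner','svm', 'Regularization','lasso', 'Solver','sparsa', ...
    'Beta',[model100.Beta; 0], 'Bias',model100.Bias);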
Answers (1)
the cyclist on 6 Mar 2023
This is interesting. I'd be surprised if MATLAB made that transition when you have explicitly specified the Solver, but I agree with you that the mention of 100 features in the documentation is a tantalizing hint that it might be happening.
Can you upload the data? I'd be pretty interested to investigate. (I could also create a simulated dataset; this probably doesn't depend on the exact data.)
Here would be my approach to trying to confirm your hypothesis: use the debugger to pause execution inside fitclinear, then step through to see where the Solver is actually set. You could then see whether MATLAB is actively ignoring the name-value input.
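For example (a sketch; y is a placeholder for your labels, and the FitInfo field names are from my reading of the documentation, so verify them in your release):
% Stop at the first executable line of fitclinear, then step through
% to see where the solver is resolved
dbstop in fitclinear
model = fitclinear(X_subset(train_idx,:), y(train_idx), ...
    'Learner','svm', 'Regularization','lasso', 'Solver','sparsa');
dbclear in fitclinear
% Independently, the second output should report which solver was actually used
[model, fitInfo] = fitclinear(X_subset(train_idx,:), y(train_idx), ...
    'Learner','svm', 'Regularization','lasso', 'Solver','sparsa');
fitInfo.Solver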
You might be able to make a copy of the MATLAB code (and put it in your path), then adapt that code to do what you want.
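Something along these lines (again a sketch; myfitclinear is just a placeholder name for your copy):
% Locate the shipping source, copy it onto your path under a new name, and edit the copy
srcFile = which('fitclinear');
copyfile(srcFile, fullfile(pwd,'myfitclinear.m'));
edit myfitclinear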
I will mention that I think it is also possible that you are just seeing some phenomenon where the lasso fails to regularize (or finds a local minimum instead of the global one), but the coincidence with 100 features seems too extraordinary for that.