Negative D2 score on training data after lassoglm fit
How can the deviance of the null model (i.e. all betas equal to zero) be lower than the deviance of the full model? Surely lassoglm should have chosen all-zero betas in that case?
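For reference, these are the definitions I'm working from (Poisson deviance, with the $y_i \log y_i$ term taken as 0 when $y_i = 0$, and deviance explained):

$$D(y,\hat\mu) = 2\sum_i \left[\, y_i \log\frac{y_i}{\hat\mu_i} - (y_i - \hat\mu_i) \right], \qquad D^2 = 1 - \frac{D_{\text{model}}}{D_{\text{null}}}$$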
From the code below, my d2Train is -0.0808.
% Fit a Poisson elastic-net model at the previously selected lambda/alpha
[B, FitInfo] = lassoglm(table2array(indat.params.trainDataX), indat.params.trainDataY(:, minInd), ...
    'poisson', 'Lambda', indat.combTable.bestLambdas(minInd), 'Alpha', indat.combTable.bestAlphas(minInd));
% Fitted rates and deviance for the penalized model (eps avoids log(0) later)
predCountsTrain = calculateRates(table2array(indat.params.trainDataX), B, FitInfo.Intercept) + eps;
predDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd), predCountsTrain);
% "Null" rates and deviance: all betas set to zero, keeping the fitted intercept
nullCountsTrain = calculateRates(table2array(indat.params.trainDataX), zeros(size(B)), FitInfo.Intercept) + eps;
nullDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd), nullCountsTrain);
% Deviance explained (D^2)
d2Train = 1 - (predDevianceTrain ./ nullDevianceTrain);
function rates = calculateRates(x, b, intercept)
% Poisson mean: exp(X*beta + intercept)
rates = exp((x * b) + intercept);
end
function dev = calculateDeviance(observed, predicted)
% Poisson deviance: 2 * sum( y.*log(y./mu) - (y - mu) )
scaledLogRatio = log(observed ./ predicted) .* observed;
rawDifference = observed - predicted;
diffOfTerms = scaledLogRatio - rawDifference;
dev = nansum(diffOfTerms) * 2;   % nansum drops the NaN terms arising from observed == 0
end
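In case it helps, here is a minimal self-contained sketch of the same computation on synthetic Poisson data. The sizes, coefficients, lambda and alpha values below are made up and it won't necessarily reproduce the negative D², but it shows exactly how I'm comparing the two deviances:

rng(0);                                                % reproducibility
X = randn(200, 5);                                     % made-up design matrix
y = poissrnd(exp(X * [0.5; -0.3; 0; 0; 0.2] + 0.1));   % synthetic Poisson counts
[B, FitInfo] = lassoglm(X, y, 'poisson', 'Lambda', 0.05, 'Alpha', 0.5);
predCounts = exp(X * B + FitInfo.Intercept) + eps;               % fitted rates
nullCounts = exp(X * zeros(size(B)) + FitInfo.Intercept) + eps;  % zero-beta rates
predDev = 2 * nansum(y .* log(y ./ predCounts) - (y - predCounts));
nullDev = 2 * nansum(y .* log(y ./ nullCounts) - (y - nullCounts));
d2 = 1 - predDev / nullDev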