How to reduce the time required for training a logistic regression classifier

4 views (last 30 days)
Memo Remo on 17 Jul 2021
Commented: Memo Remo on 8 Aug 2021
Hi everyone,
I am trying to train and use a logistic regression classifier with the stepwiseglm function. The regression model may include polynomial terms of each predictor up to the fourth degree, including their interactions, and the AIC criterion is used to decide whether each term should be added or removed (roughly the call sketched below).
The training set is a 100K-by-4 matrix, and the problem is that training takes a very long time. Is there any way to speed it up, for instance with GPU parallel processing? Since this is an image processing task, could a convolutional neural network (CNN) be trained in a shorter time? (I am not familiar with CNNs.)
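Roughly, the call I am using looks like this (X and y are placeholder names for my 100K-by-4 predictor matrix and binary class labels):

mdl = stepwiseglm(X, y, 'constant', ...
    'Distribution', 'binomial', ...   % logistic regression (logit link)
    'Upper', 'poly4444', ...          % up to fourth-degree terms (and their products) in each of the 4 predictors
    'Criterion', 'aic');              % add or remove terms based on AIC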
Many thanks in advance.

Accepted Answer

Prateek Rai on 30 Jul 2021
To my understanding, you want to speed up your image classification task. A CNN is a good fit for this kind of problem.
You can refer to the introduction-to-convolutional-neural-networks MathWorks documentation page to learn how to use a CNN for image classification in MATLAB.
You can also refer to the deep-learning-with-big-data-on-gpus-and-in-parallel MathWorks documentation page to learn how to speed up training with GPU parallel processing.
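As a rough sketch only, a small CNN trained on the GPU could look like the following. The datastore folder, image size, and layer choices are placeholders to adapt to your data, and 'ExecutionEnvironment','gpu' requires Parallel Computing Toolbox and a supported GPU:

% Hypothetical image data: the folder name and 28-by-28 grayscale size are placeholders.
imds = imageDatastore('trainingImages', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(3, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(2)        % two output classes for a binary task
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 256, ...
    'ExecutionEnvironment', 'gpu', ...   % use 'auto' to fall back to the CPU if no GPU is available
    'Verbose', true);

net = trainNetwork(imds, layers, options);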

More Answers (0)
