Variations in LSTM Accuracy Due to Shuffled Feature Columns

Hamza on 11 Nov 2023
Edited: Hamza on 24 Nov 2023
Hello everyone, I applied an LSTM to speech emotion recognition and achieved an accuracy of 42.1709%. However, when I shuffled the feature columns, the accuracy changed to 42.4925%. This difference is unexpected because I used the same data with only the columns reordered. I tried fixing the random seeds with gpurng and rng to keep the accuracy the same, without success. Could someone please assist me? The code used is attached below. To shuffle the matrix, uncomment the lines: appp = appp(:, t(1:end)); testt = testt(:, t(1:end)).
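For reference, this is not the attached script, just a minimal sketch of the seeding and column-shuffling steps, assuming the variable names from the question (appp = training features, testt = test features, t = permutation vector):

% Minimal reproducibility sketch, assuming appp, testt and t exist as above
rng(0, 'twister');              % fix the CPU random stream (weight init, mini-batch shuffling)
if canUseGPU
    gpurng(0);                  % fix the GPU random stream as well
end
t = randperm(size(appp, 2));    % one fixed permutation of the columns
appp  = appp(:, t);             % apply the same permutation to the training data ...
testt = testt(:, t);            % ... and to the test data so the columns stay aligned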
10 Comments
Hamza on 13 Nov 2023
@Walter Roberson I totally agree with you. Is there a way to extract the weights and then shuffle them in the same way as the shuffled data? I think that would solve the issue!
Sam Schumacher on 13 Nov 2023
I think this slight change in accuracy is to be expected.
When you keep the initial weights fixed but shuffle the features relative to those weights, the optimisation process starts by activating the layers with different values. The weights and data are mapped to each other differently once the columns are shuffled.
If possible, re-order the initial weights after creating the neural network according to the way you shuffled the columns. That way the initial weights multiply the same values in the dataset before backpropagation begins (see the sketch below).
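A minimal sketch of that re-ordering idea, with placeholder sizes and a single lstmLayer assumed. In Deep Learning Toolbox the InputWeights of an lstmLayer is a 4*numHiddenUnits-by-inputSize matrix, so its columns can be permuted with the same vector t used on the feature columns:

% Hedged sketch, not the original script: give both runs the same initial
% input weights, permuted to follow the feature permutation t.
rng(0);
numFeatures    = 39;     % placeholder, use your actual feature count
numHiddenUnits = 100;    % placeholder

W0 = 0.01 * randn(4*numHiddenUnits, numFeatures);   % one fixed initialisation

lstmOrig = lstmLayer(numHiddenUnits, 'OutputMode', 'last');
lstmOrig.InputWeights = W0;                % run on the un-shuffled data

lstmShuf = lstmLayer(numHiddenUnits, 'OutputMode', 'last');
lstmShuf.InputWeights = W0(:, t);          % columns follow the same permutation t

Note that even with matched initial weights, GPU training is generally not bit-reproducible, so small differences in the final accuracy can remain.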


Answers (0)
