When the network output is compared with the desired output and an error is found, the weight vector w(k) associated with the ith processing unit at time instant k is corrected (adjusted) as

w(k+1) = w(k) + D[w(k)]

where D[w(k)] is the change in the weight vector; its explicit form depends on the learning rule being used.
The perceptron learning rule is given by:

w(k+1) = w(k) + eta*[ y(k) - sgn(w'(k)*x(k)) ]*x(k)

where eta is the learning rate, x(k) is the input vector, y(k) is the desired output, and sgn(w'(k)*x(k)) is the actual output of the unit.
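The rule above can be sketched as follows. This is not the submitted MATLAB code, just a minimal Python illustration of the same update; folding the bias into the input vector and the ±1 sign convention are implementation choices assumed here, not taken from the description.

```python
import numpy as np

def perceptron_train(X, d, eta=1.0, epochs=100):
    """Iterate the perceptron rule w(k+1) = w(k) + eta*(y - sgn(w'x))*x.

    X: (n_samples, n_features) inputs; d: desired outputs in {-1, +1}.
    A constant 1 is appended to each input so the bias is learned as
    one more weight (an assumption, not part of the original text)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    w = np.zeros(Xb.shape[1])
    sgn = lambda v: 1 if v >= 0 else -1
    for _ in range(epochs):
        mistakes = 0
        for x, y in zip(Xb, d):
            delta = eta * (y - sgn(w @ x))  # 0 when output is correct
            if delta != 0:
                w += delta * x              # correct the weight vector
                mistakes += 1
        if mistakes == 0:                   # converged: all samples correct
            break
    return w

def perceptron_predict(w, X):
    """Classify rows of X with the learned weights (bias appended)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.where(Xb @ w >= 0, 1, -1)
```

For a linearly separable problem such as the logical AND (with ±1 labels), the loop is guaranteed to stop once an epoch passes with no mistakes.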
Cite As
Bhartendu (2025). Perceptron Learning (https://jp.mathworks.com/matlabcentral/fileexchange/63046-perceptron-learning), MATLAB Central File Exchange.
MATLAB Release Compatibility
Created with R2016a. Compatible with any release.
Platform Compatibility
Windows, macOS, Linux
Version: 1.0.0.0