About RL Custom Agent/ LQRCustomAgent example
hieu nguyen
21 Apr 2023
Commented: hieu nguyen
24 Apr 2023
I am trying to create my own RL custom agent, and I consulted the LQRCustomAgent example provided by MATLAB. Here is the link to the example:
In the learnImpl function, there are two inputs, obj and exp. However, I can't see where exp is defined in this example. How can they still use it? In other words, where does the exp input come from? Is there a MATLAB file or abstract class that defines it?
Here is another MATLAB example: https://www.mathworks.com/help/reinforcement-learning/ug/create-agent-for-custom-reinforcement-learning-algorithm.html
It also uses a learnImpl function with the same two inputs, obj and exp. However, the way each element of exp is indexed differs from the previous example. What does each indexing style mean?
I hope you can help answer my question. Thank you very much!
Accepted Answer
Emmanouil Tzorakoleftherakis
24 Apr 2023
Actually, exp is being indexed in exactly the same way; the first example just does it in one line, while the second does it over two lines. When you index into the cell array with curly braces, you get back the specific element, which is no longer a cell array. That's why you see the '{1}'.
Also, the exp cell array is created automatically in the background, so you don't need to construct it yourself.
Hope this helps
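To make the equivalence concrete, here is a minimal sketch. The contents of exp below are illustrative placeholders, not the exact values the toolbox passes; they only mimic the nested cell layout (observation, action, reward, next observation, is-done) so the two indexing styles can be compared:

```matlab
% Hypothetical experience cell array, mimicking what the toolbox
% passes into learnImpl: {Obs, Action, Reward, NextObs, IsDone}.
% Observations are wrapped in a cell because an environment can
% have multiple observation channels.
exp = { {[1; 2]}, {0.5}, 10, {[3; 4]}, false };

% One-line indexing (LQRCustomAgent style):
obs1 = exp{1}{1};        % numeric array [1; 2]

% Equivalent two-line indexing (custom-agent doc style):
obsCell = exp{1};        % still a 1x1 cell array
obs2    = obsCell{1};    % now the numeric array [1; 2]

isequal(obs1, obs2)      % returns true, both styles give the same data
```

The first curly-brace index pulls the observation channel group out of exp; the second pulls the numeric array out of that group. Splitting it over two lines changes nothing except readability.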