Environment for Reinforcement Learning Project
5 views (last 30 days)
GCats
21 Jul 2020
Commented: Alberto Tellaeche
20 Feb 2023
Hi everyone!
I'm currently looking to work on a small Reinforcement Learning project. Friends have recommended OpenAI Gym (https://gym.openai.com/envs/#classic_control), which provides many classical/non-classical control environments where one can apply reinforcement learning rules. However, these are based on Python. Being a MATLAB user myself, I was wondering whether anyone knew of something like OpenAI Gym where I can download an environment (I'm interested in the Lunar Lander env, but it's not a strong preference) and apply RL rules easily.
I'd appreciate any tips!
0 Comments
Accepted Answer
Emmanouil Tzorakoleftherakis
21 Jul 2020
Edited: Emmanouil Tzorakoleftherakis
21 Jul 2020
Hello,
We are working on providing an interface between OpenAI Gym and Reinforcement Learning Toolbox, but this will take some more time. In the meantime, you could use community posts like this one to get an idea of how this could be accomplished. I have not personally tried the code in the link above, but it seems to be along the lines of what you are looking for.
Hope that helps.
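For reference, here is a rough, untested sketch of the general idea behind that community post: driving a Gym environment directly through MATLAB's Python interface. It assumes Python and gym are installed and visible to MATLAB (via pyenv); the environment name, the random actions, and the conversions are illustrative only, not a tested solution.

% Untested sketch: drive an OpenAI Gym environment from MATLAB via the Python interface
env = py.gym.make('CartPole-v0');                            % create the Gym environment
obs = env.reset();                                           % initial observation (NumPy array)
obs = double(py.array.array('d', py.numpy.nditer(obs)));     % convert to a MATLAB vector
done = false;
totalReward = 0;
while ~done
    action = int16(randi([0 1]));                            % random discrete action (0 or 1)
    result = env.step(action);                               % returns (observation, reward, done, info)
    obs    = double(py.array.array('d', py.numpy.nditer(result{1})));
    totalReward = totalReward + double(result{2});           % accumulate reward
    done   = logical(result{3});                             % episode-termination flag
end
env.close();

The same pattern should carry over to other environments such as LunarLander-v2, which additionally requires the Box2D dependency on the Python side.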
2 Comments
John Adams
29 Nov 2021
Hi Emmanouil,
When will this interface be ready?
I am currently trying to interface using the link you posted above, and it works fine for discrete action problems, as in the example in the link, using "this.open_env.step(int16(Action));" for the discrete cart pole problem. However, for the continuous cart problem I get the following error when calling the step function [this.open_env.step(double(Action));]:
Python Error: TypeError: 'float' object is not subscriptable
How can this problem be avoided?
Thx!
Alberto Tellaeche
20 Feb 2023
Same problem here... when actions are continuous, the "object is not subscriptable" error appears; no matter whether you use a 'float' or cast the data to 'single', the error remains the same.
Thank you,
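A possible workaround, assuming the error comes from Gym's continuous classic-control environments indexing into the action (e.g. action[0] inside step) and therefore expecting an array-like value rather than a bare Python float: wrap the MATLAB scalar so it arrives in Python as a 1-element sequence / 1-D NumPy array. Untested sketch, reusing the variable names from the comments above:

% Untested: pass the action as a 1-element NumPy array instead of a bare scalar,
% so that Gym's step() can index into it without raising
% "'float' object is not subscriptable".
pyAction = py.numpy.array({double(Action)});   % MATLAB cell -> Python sequence -> 1-D array
result   = this.open_env.step(pyAction);       % step now receives an indexable action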
More Answers (0)