Load a pretrained neural network object in rlNeuralNetworkEnvironment

Vasu Sharma, 21 November 2023
Hi,
I want to train an RL MBPO agent that samples from a model. The model is a DL object that was already trained in MATLAB. I am wondering how I can load its weights inside the environment object. The examples for rlNeuralNetworkEnvironment show how to define a network structure, but how can I use my pretrained weights instead?
Best Regards,
Vasu

Answers (1)

Emmanouil Tzorakoleftherakis, 21 December 2023
Hi Vasu,
You can use a pretrained environment model with an MBPO agent as follows:
1) Create an rlContinuousDeterministicTransitionFunction from the trained dlnetwork if the model is deterministic, or an rlContinuousGaussianTransitionFunction if it is stochastic (mean and standard-deviation heads).
2) Create an rlNeuralNetworkEnvironment using the transition function from step 1 (together with your reward and is-done functions).
3) Create the MBPO agent.
4) Set LearnRate = 0 in the TransitionOptimizerOptions of rlMBPOAgentOptions so the pretrained model is not updated during training.
A minimal sketch of these steps is shown below.
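This is an untested sketch, not a drop-in solution. It assumes a deterministic pretrained dlnetwork (trainedNet), existing obsInfo/actInfo specification objects, hypothetical custom reward and is-done functions (myRewardFcn, myIsDoneFcn), and placeholder layer names ("state", "action", "nextState"); replace these with the names and functions from your own model. A SAC agent is used here as the base off-policy agent purely for illustration.

% --- Minimal sketch (assumed names: trainedNet, obsInfo, actInfo,
%     myRewardFcn, myIsDoneFcn, and the layer names below) ---

% 1) Wrap the pretrained network in a deterministic transition function
tsnFcn = rlContinuousDeterministicTransitionFunction(trainedNet, obsInfo, actInfo, ...
    ObservationInputNames="state", ...
    ActionInputNames="action", ...
    NextObservationOutputNames="nextState");

% 2) Build the neural-network environment around the pretrained transition model,
%    supplying handles to your own reward and is-done functions
env = rlNeuralNetworkEnvironment(obsInfo, actInfo, tsnFcn, @myRewardFcn, @myIsDoneFcn);

% 3) Create a base off-policy agent (SAC here) and the MBPO agent options
baseAgent = rlSACAgent(obsInfo, actInfo);
mbpoOpts  = rlMBPOAgentOptions;

% 4) Freeze the pretrained transition model by setting its learning rate to zero
mbpoOpts.TransitionOptimizerOptions = rlOptimizerOptions(LearnRate=0);

% Assemble the MBPO agent, which samples synthetic experience from env
agent = rlMBPOAgent(baseAgent, env, mbpoOpts);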
Hope this helps

