How to visualize episode behaviour with the reinforcement learning toolbox?
Jan de Priester
on 5 Jun 2019
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
How can I create a visualization for a custom environment that shows the behaviour of the system during a training episode? I cannot find code examples, or explanations of code, anywhere on MathWorks that visualize a system's behaviour during training episodes. I would like to achieve a visualization that looks something like the cart-pole visualizer shown on this page: https://nl.mathworks.com/help/reinforcement-learning/ug/train-pg-agent-to-balance-cart-pole-system.html?searchHighlight=cart%20pole&s_tid=doc_srchtitle.
P.S. I am trying to solve the continuous mountain car problem with a DDPG agent using the Reinforcement Learning Toolbox.
Accepted Answer
Emmanouil Tzorakoleftherakis
on 7 Jun 2019
Hello,
To create a custom MATLAB environment, use the template that pops up after running
rlCreateEnvTemplate('myenv')
This template includes two methods you can use for visualization: "plot" and "envUpdatedCallback" (the latter is called from within "plot"). Use "plot" to create the basic stationary parts of your visualization, and "envUpdatedCallback" to update the coordinates of the moving parts based on the current states.
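To make this concrete, here is a minimal sketch of what those two methods might look like for a mountain-car style environment. This is not MathWorks code: the "Figure" property and "State" layout follow the defaults that rlCreateEnvTemplate generates, the hill profile sin(3*x), the position range, and the 'car' tag are assumptions borrowed from the classic mountain-car formulation. In the generated class, both functions sit inside the class's methods blocks.

```matlab
function plot(this)
    % Create the figure and draw the stationary parts once
    this.Figure = figure('Visible','on','HandleVisibility','off');
    ha = gca(this.Figure);
    hold(ha,'on');
    x = linspace(-1.2, 0.6, 100);
    plot(ha, x, sin(3*x), 'k');   % hill profile (stationary part)
    % Draw the moving parts for the current state
    envUpdatedCallback(this)
end

function envUpdatedCallback(this)
    % Redraw the moving parts (the car) from the current state
    if ~isempty(this.Figure) && isvalid(this.Figure)
        ha = gca(this.Figure);
        % Remove the previous car marker, if any ('car' tag is an assumption)
        delete(findobj(ha, 'Tag', 'car'));
        pos = this.State(1);   % assumes State = [position; velocity]
        plot(ha, pos, sin(3*pos), 'ro', 'MarkerFaceColor', 'r', 'Tag', 'car');
        drawnow();
    end
end
```

With this in place, calling plot(env) once before training opens the figure, and the environment redraws the car whenever the state is updated.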
5 Comments
Prashanth Chivkula
on 15 Sep 2020
Another question: where do I define a reward function in the template?
Emmanouil Tzorakoleftherakis
on 15 Sep 2020
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
The error sounds self-explanatory - make sure whatever you are plotting makes sense.
In this template there is no separate function for rewards; if you go through the generated code, you will see that the reward is computed inside 'step'. You could also factor it out into a separate function if you prefer.
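As an illustration, here is a sketch of where the reward lives in the template's generated 'step' method. The function signature matches what rlCreateEnvTemplate produces; the dynamics are elided, and the reward formula and the 0.45 goal position are placeholder assumptions for a mountain-car task, not toolbox defaults.

```matlab
function [Observation, Reward, IsDone, LoggedSignals] = step(this, Action)
    % ... update this.State from Action (environment dynamics) ...
    Observation = this.State;

    % The reward is computed right here -- no separate method is required
    % (example shaping: -1 per step, +100 on reaching the assumed goal)
    Reward = -1 + 100*double(this.State(1) >= 0.45);

    IsDone = this.State(1) >= 0.45;
    this.IsDone = IsDone;
    LoggedSignals = [];

    % Notify the visualization that the state changed
    notifyEnvUpdated(this);
end
```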
In the future, please create a separate question if it is not related to the original one. Thanks!