SomeMatlabUser
Followers: 0 Following: 0
Statistics

MATLAB Answers
Rank: 39,396 of 295,467
Reputation: 1
Contributions: 5 Questions, 1 Answer
Answer Acceptance: 20.0%
Votes Received: 1

File Exchange
Rank: N/A (of 20,234)
Reputation: N/A
Average Rating: 0.00
Contributions: 0 Files
Downloads: 0
All-Time Downloads: 0

Cody
Rank: N/A (of 153,912)
Contributions: 0 Problems, 0 Solutions
Score: 0
Number of Badges: 0

Discussions
Contributions: 0 Posts

ThingSpeak
Contributions: 0 Public Channels

Highlights
Contributions: 0 Highlights
Feeds
Question
Noise model in RL for large action signal
I want to train a model with a DDPG agent. The model requires a 10-element vector action signal with bound values of -1.5...+1...
3 years ago | 0 answers | 0
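For background on the question above: DDPG agents commonly explore by adding Ornstein-Uhlenbeck (OU) noise to the actor's output and clipping the result to the action bounds. A minimal sketch in Python (illustrative only; `OUNoise`, its parameter values, and the clipping bounds are assumptions, not the Reinforcement Learning Toolbox API, and the question's upper bound is truncated in the teaser):

```python
import numpy as np

class OUNoise:
    """Ornstein-Uhlenbeck process for temporally correlated exploration noise."""
    def __init__(self, dim, mu=0.0, theta=0.15, sigma=0.2, dt=1e-2, seed=0):
        self.mu = mu * np.ones(dim)   # long-run mean the process reverts to
        self.theta = theta            # mean-reversion rate
        self.sigma = sigma            # noise scale
        self.dt = dt                  # time step
        self.rng = np.random.default_rng(seed)
        self.x = self.mu.copy()       # current state of the process

    def sample(self):
        # dx = theta*(mu - x)*dt + sigma*sqrt(dt)*N(0, I)
        dx = (self.theta * (self.mu - self.x) * self.dt
              + self.sigma * np.sqrt(self.dt) * self.rng.standard_normal(self.x.shape))
        self.x = self.x + dx
        return self.x

# Usage: perturb a 10-element actor output, then clip to the action bounds.
noise = OUNoise(dim=10)
actor_output = np.zeros(10)  # stand-in for the actor network's output
action = np.clip(actor_output + noise.sample(), -1.5, 1.5)  # bounds are illustrative
```

Because OU noise is mean-reverting, successive samples are correlated, which tends to give smoother exploration than independent Gaussian noise on a vector-valued action signal.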
Question
Deploy trained policy to simulink model
I am trying to deploy a trained policy from the Reinforcement Learning Toolbox to a Simulink model. This model has to be compatibl...
5 years ago | 1 answer | 0
Answered
Reinforcement Learning - How to use a 'trained policy' as a 'controller' block in SIMULINK
Is there any update (maybe from MathWorks itself) that actually solves this problem? I am trying to get my model working with code ...
5 years ago | 0
Question
Rapid Accelerator does not launch
I switched from R2016a to R2016b. Now, when I try to run my model in Rapid Accelerator mode in R2016b, I get the following error: ...
5 years ago | 0 answers | 1
Question
How to write a TLC file for a C-MEX S-function
I am trying to run my simulation in Rapid Accelerator mode. This does not work because my Simulink model contains a C-MEX S-function. A...
7 years ago | 0 answers | 0
Question
How to write a simple TLC file for a *.m S-function
I have a really simple M-file S-function with a start and an output function. I need to use S-functions because I need to have a "i...
7 years ago | 0 answers | 0