How to use a neural network that takes scalars as input and gives a time series as output

Hello there, everyone.
I have a mind-boggling question about neural networks. We normally train on equally sampled data (for instance, we give 5 input features, each with 100 samples, and get an output feature with 100 samples). How about training a network that, for example, takes 2 features, each with only 5 samples, and gives out a whole time series?
Suppose we have the following differential equation:
dy/dt = a*t/b
where a and b are constants. We solve this equation five times for different values of a and b and get a time series each time (so we have 5 time series). How can I train a neural network in MATLAB so that, by entering arbitrary values of a and b, I get the corresponding time series? Thanks in advance for your time devoted to this question.
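One way this might be approached (a minimal sketch, not a verified solution): since every series is sampled on the same time grid, the network can map the 2-element input [a; b] directly to the whole sampled series as a fixed-length target vector. The sketch below assumes feedforwardnet from the Deep Learning Toolbox and uses the closed-form solution y = a*t^2/(2*b) (taking y(0) = 0) to generate training data; the network size, sample counts, and parameter ranges are arbitrary choices.

% Minimal sketch: learn the map (a, b) -> sampled y(t).
% Assumes the Deep Learning Toolbox; all names and ranges are illustrative.
t = linspace(0, 1, 100)';          % 100 time samples per series (column)
nSeries = 500;                     % five examples are too few; generate more
A = 2 * rand(1, nSeries);          % random a values in (0, 2)
B = 0.5 + 2 * rand(1, nSeries);    % random b values kept away from zero

% Analytic solution of dy/dt = a*t/b with y(0) = 0: y = a*t.^2/(2*b).
% For an ODE without a closed form, ode45 could generate these targets.
Y = zeros(numel(t), nSeries);
for k = 1:nSeries
    Y(:, k) = A(k) * t.^2 / (2 * B(k));
end

X = [A; B];                        % inputs: 2 x nSeries
net = feedforwardnet(20);          % one hidden layer with 20 neurons
net = train(net, X, Y);            % targets: 100 x nSeries

% Evaluate at an arbitrary (a, b) pair and compare with the analytic curve
yPred = net([1.3; 0.8]);
plot(t, yPred, t, 1.3 * t.^2 / (2 * 0.8), '--');
legend('network output', 'analytic solution');

With only the 5 (a, b) pairs mentioned in the question the network would almost certainly overfit, which is why the sketch draws several hundred random pairs instead.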
