How do I interpolate one time series onto the coarser time resolution of another?
Hello all,
I've got two time vectors and the corresponding data vectors.
1. So, I've got time_observed = [2003-01-03 15:30; 2003-01-21 15:20;......;2008-12-18], which is a text file with 100 values.
2. Also, I've got the corresponding vector (data_observed), which is a vector of 100 values.
****************
3. Then, I've got the timeseries, time_simulated = [1998-01-01 12:00; 1998-01-02 12:00;....;2007-12-29 12:00], which is a text file with 3650 values.
4. Finally, I've got the corresponding data (data_simulated), which is a vector of 3650 values.
****************
I need to interpolate the (data_simulated) vector to the coarser resolution, so that the simulated values line up as closely as possible with the (time_observed) stamps.
At the end I need a new data vector (say data_new) with the same length as (time_observed) and (data_observed) (100 x 1).
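For reference, a minimal sketch of how the two text files might be read in. The file names (observed.txt, simulated.txt) and the line layout (a 'yyyy-mm-dd HH:MM' stamp followed by a value) are only assumptions, so adjust the format strings to whatever the files actually contain:
% Hypothetical file names; assumed line layout: 'yyyy-mm-dd HH:MM value'
fid = fopen('observed.txt');
obs = textscan(fid, '%s %s %f');                 % date, time, value
fclose(fid);
time_observed = strcat(obs{1}, {' '}, obs{2});   % cell array of date strings
data_observed = obs{3};                          % 100 x 1 numeric vector
fid = fopen('simulated.txt');
sim = textscan(fid, '%s %s %f');
fclose(fid);
time_simulated = strcat(sim{1}, {' '}, sim{2});
data_simulated = sim{3};                         % 3650 x 1 numeric vector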
Answers (1)
Walter Roberson
on 11 Jun 2015
Let T_simulated be time_simulated converted to date numbers (e.g. with datenum), and T_observed be time_observed converted to date numbers. Then
predicted_data = interp1(T_simulated, data_simulated, T_observed);
and now you can do things like
plot(T_observed, predicted_data - data_observed)
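As a concrete sketch of that approach, assuming the time stamps are cell arrays of strings in the 'yyyy-mm-dd HH:MM' format quoted in the question (on newer releases, datetime values work the same way with interp1):
% The format string is an assumption based on the time stamps quoted above
T_observed  = datenum(time_observed,  'yyyy-mm-dd HH:MM');
T_simulated = datenum(time_simulated, 'yyyy-mm-dd HH:MM');
% Interpolate the daily simulated series onto the coarser observed times;
% observed times outside the simulated range come back as NaN by default
data_new = interp1(T_simulated, data_simulated, T_observed);
% Compare simulated and observed values at the observed time stamps
plot(T_observed, data_new - data_observed, 'o-')
datetick('x', 'yyyy-mm')
ylabel('simulated - observed')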