How does closing the loop work in a NAR neural network?

Hello,
I'm quite new to neural networks, but I'm currently using them to forecast crude oil prices. At the moment I'm working with only the Brent time series as data, so I'm using a NAR model for forecasting. I've seen that for multistep prediction I have to close the loop after having trained the net, as:
netc = closeloop(net);
[x,xi,ai,t]=preparets(netc,{},{},T);
yc = netc(x,xi,ai);
I don't understand why I have to give the target T. If I'm doing prediction, how can I know the target price? I'm new to NNs, so the answer is probably very easy. Thanks in advance if someone can clarify this for me.

 Accepted Answer

You have to know the target in order to test netc and, if needed, to continue training it.
Just closing the loop is, in general, insufficient, because netc feeds back its own output, errors included, instead of the target. Using the target to complete the design of netc may be necessary to reduce that fed-back error.
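A minimal sketch of how the target T is used after closing the loop (assuming the standard narnet workflow; variable names here are illustrative):

```matlab
% Sketch: T is the cell array of known targets that trained net.
netc = closeloop(net);                     % convert the trained open-loop net
[Xc,Xic,Aic,Tc] = preparets(netc,{},{},T); % T aligns inputs/initial states
Yc = netc(Xc,Xic,Aic);                     % closed-loop output on known data
perfc = perform(netc,Tc,Yc)                % T is needed for this comparison
% If closed-loop performance is poor, T is also needed to continue training:
netc = train(netc,Xc,Tc,Xic,Aic);
```

So the target is not used to "cheat" during prediction; it is needed to shift and align the data, to measure how well the closed-loop net reproduces known values, and, if necessary, to retrain it.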
Hope this helps.
Thank you for formally accepting my answer
Greg

6 Comments

Thank you Greg for your answer. So how can I continue the prediction after having tested the net? I've tried using the open loop up to the end of the known output series, then converting to closed loop for 5 timesteps, but the results are not so good. The code is:
[x1,xio,aio,t]=preparets(net,{},{},T); % known data with open loop
[y1,xfo,afo]=net(x1,xio,aio);
[netc,xic,aic]=closeloop(net,xfo,afo); % closed loop from the last data
[y2,xfc,afc]=netc(cell(0,5),xic,aic); % predict 5 timesteps ahead
I would like to have a year-ahead prediction using the net out of sample; how can I manage it? Thanks in advance for the help.
The code you have posted is woefully incomplete. Post the full code (sans defaults) which can be cut and pasted into the command line. Use the MATLAB data in the help documentation.
For reference, I have posted a number of examples. Search on NEWSGROUP and if needed, on ANSWERS.
greg narnet netc
Hope this helps.
Greg
Thank you Greg for the answer. Here is the code. I've used the oil_dataset, taking only the oil prices instead of both oil and gas (I'm new to MATLAB, so some of the script could probably be better). Here I've closed the loop for a 40-step-ahead prediction. I've divided the data into two sets: one for the open loop and the other to compare with the closed loop. I would like to know if this code is correct for time series prediction.
close all, clear all, clc;
load oil_dataset
%I've used only oil price as input
%Get the oil prices
Oil = oil_dataset;
OG=seq2con(Oil);
OilGas=OG{1};
Oilseries=OilGas(1,:);
Oilseries=Oilseries';
OilSeries=Oilseries(1:140,:);
TestSeries= Oilseries(141:end,:);
%Set the data for the net
T = tonndata(OilSeries,false,false);
X2= tonndata(TestSeries,false,false);
%Create the net
feedbackDelays = 1:4;
hiddenLayerSize = 15;
net = narnet(feedbackDelays,hiddenLayerSize);
%Pre/Post-Processing
net.input.processFcns = {'removeconstantrows','mapminmax'};
%Preapere the net
[x,xi,ai,t] = preparets(net,{},{},T);
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Training Function
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
net.performFcn = 'mse';
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
[y1,xfo,afo] = net(x,xi,ai);
e = gsubtract(t,y1);
performance = perform(net,t,y1)
%Clooseloop for Multi-step prediction
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(cell(0,40),xic,aic);
%Plot the real price with open loop prices
figure
plot(OilSeries,'b')
hold on
Y1=cell2mat(y1);
plot(Y1,'r')
%Plot the real price with predicted prices
% %Plot price
figure
plot(TestSeries,'b')
hold on
Y2=cell2mat(y2);
plot(Y2,'r')
My real data is daily prices of Brent. There are 3200 observations, and the same code doesn't work well in the multi-step-ahead closed loop (however, it is good in open loop). Thanks in advance for any help.
1. Easier to understand if you use
a. Upper case for cells
b. Lower case for doubles
c. Subscript "o" for openloop variables
d. Subscript "c" for closeloop variables
2. Erroneously used GAS instead of OIL
3. There is no attempt to optimize FD via significant peaks in the autocorrelation function.
4. There is no attempt to minimize H via trial and error.
5. If you use DIVIDEBLOCK, there is absolutely no reason to have an additional test set because the first test set is not used in any way for design.
6. There is no reason to explicitly assign DEFAULT parameters; e.g. processFcns, divideFcn, divideMode, tr/val/tst ratios, trainFcn, performFcn
7. Use preparets to "prepare t"ime "s"eries variables for OL AND CL
8. Initialize the RNG before training so that the designs can be duplicated
9. Post and plot NORMALIZED (w.r.t. target variance) OL performance to make sure it is satisfactory
10. Closeloop on neto and obtain initial states
11. Use preparets to obtain closeloop variables
12. Compare netc and neto performances to decide if the closeloop
step is successful on known data. If not, either
a. Design a better neto and try again
b. Train netc initialized with the weights from neto to predict
the known target data.
13. Obtain the final states of netc from known data in order to predict future performance
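The steps above can be sketched as follows (a hedged outline, assuming a narnet neto has already been created and Ts is the cell array of known targets; the names and the 40-step horizon are illustrative, not from the original posts):

```matlab
rng(0)                                          % 8. seed the RNG for repeatable designs
[Xo,Xio,Aio,To] = preparets(neto,{},{},Ts);     % 7. OL time series variables
[neto,tro] = train(neto,Xo,To,Xio,Aio);
Yo = neto(Xo,Xio,Aio);
NMSEo = perform(neto,To,Yo)/var(cell2mat(To),1) % 9. normalized OL performance
netc = closeloop(neto);                         % 10. close the loop on neto
[Xc,Xic,Aic,Tc] = preparets(netc,{},{},Ts);     % 11. CL variables via preparets
Yc = netc(Xc,Xic,Aic);
NMSEc = perform(netc,Tc,Yc)/var(cell2mat(Tc),1) % 12. compare with NMSEo
% 12b. if NMSEc is unsatisfactory, retrain netc (its weights come from neto)
netc = train(netc,Xc,Tc,Xic,Aic);
[Yc,Xfc,Afc] = netc(Xc,Xic,Aic);                % 13. final states from known data
Yf = netc(cell(0,40),Xfc,Afc);                  % predict 40 steps beyond the data
```

Note that closeloop carries the open-loop weights into netc, so step 12b is a continuation of training rather than a fresh design.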
There are examples in my previous posts.
Greg
Hello Greg, thank you for your answer. I've tried your suggestion: after training the OL net I prepared the CL net with preparets, but it doesn't work well. I've also tried using the weights and biases from neto, but the CL net still doesn't work. Is it possible to have a net that works well in OL and badly in CL, or am I probably making a mistake? With a single time series, do narnet, timedelaynet and newfftd give the same result? I'm a little bit confused about that. Thanks in advance for any suggestion.
Reread the edited step 12b.


More Answers (0)


Asked: Ugo on 3 Feb 2015
Commented: on 10 Feb 2015
