I've built the attached signal-processing model, which operates on four physiological signals via two parallel processing pathways.
It consists of a data-import and pre-processing chain that operates on contiguous, non-overlapping frames of user-defined length (set as a mask parameter in the first block). The pre-processing itself is a gain stage followed by a forward-backward "filtfilt" (zero-phase) filtering operation, with both operations' parameters also exposed as mask parameters. The idea is that all subsequent processing and analysis is performed within these non-overlapping frames.
The model runs fine. However, I noticed that the pre-processing output shows discontinuities at the frame boundaries that are not present in the pre-processing input (i.e. after the signal has already been "framed"). E.g. with a 10-second frame, the pre-processed signals show some weird glitch or discontinuity every 10 seconds:
The example above is from the TOP pre-processing branch (the ECG), but the same phenomenon is observed on the bottom branch, which processes the three SCG signals (x, y and z) simultaneously.
I suppose this must be a consequence of the pre-processing block itself, because the INPUT of that block is already framed yet shows no discontinuities. Could it be related to the finite amount of time required to actually perform the pre-processing? What is actually happening here, and is there a way to prevent it?
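To make the symptom concrete, here is a minimal sketch of what I believe the model is doing, written in Python/SciPy rather than Simulink purely for illustration (the signal, sample rate, filter band and frame length below are made-up stand-ins, not the actual model parameters): zero-phase filtering applied independently to each non-overlapping frame disagrees with filtering the whole record, and the disagreement is concentrated near the frame boundaries:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical stand-in for the ECG channel (all parameters are illustrative).
fs = 500.0                               # sample rate, Hz
t = np.arange(0, 30, 1 / fs)             # 30 s record
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)

# Assumed band-pass filter; the model's actual coefficients come from mask parameters.
b, a = butter(4, [0.5, 40.0], btype="bandpass", fs=fs)

# Reference: zero-phase (forward-backward) filtering of the whole record at once.
y_whole = filtfilt(b, a, x)

# Frame-wise: filtfilt applied independently to contiguous 10 s frames,
# which is what per-frame pre-processing effectively does.
frame = int(10 * fs)
y_framed = np.concatenate(
    [filtfilt(b, a, x[i:i + frame]) for i in range(0, x.size, frame)]
)

# Each frame is filtered with no knowledge of its neighbours, so the samples
# near the frame boundaries deviate from the whole-record result even though
# the frame interiors agree closely.
err = np.abs(y_framed - y_whole)
print("max deviation near a frame boundary:", err[frame - 25:frame + 25].max())
print("max deviation mid-frame            :", err[frame // 2 - 25:frame // 2 + 25].max())
```

The mid-frame deviation is essentially zero while the boundary deviation is not, which matches the periodic glitches I see at the frame boundaries in the scope.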
I've also attached the MATLAB variables (including test signals) required to run the model.
Thanks, and please let me know if anything is unclear!