Why is the derivative of a function or signal one sample shorter than the original?

Hi All
Taking the derivative of y = x.^2 using diff(y), with x between 0 and 100, gives a signal with one less sample: instead of a length of 101, it is 100 samples. So every derivative gets one sample shorter, and if I have to integrate again, it will presumably need those samples back. What should I do?

Accepted Answer

Star Strider on 11 Apr 2020
Use the gradient function to calculate the numerical derivatives. The output will be the same length as the input. (It is also more accurate, since it uses central differences in the interior rather than one-sided differences.) The function assumes regularly sampled data; if the sampling intervals are not constant, a work-around is:
dydx = gradient(y) ./ gradient(x);
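A minimal sketch of the length difference, using the y = x.^2 example from the question:
x = 0:100;                        % 101 samples
y = x.^2;
d1 = diff(y) ./ diff(x);          % forward differences: 100 samples
d2 = gradient(y) ./ gradient(x);  % central differences (one-sided at the ends): 101 samples
disp([numel(d1) numel(d2)])       % prints   100   101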

More Answers (1)

Cris LaPierre on 11 Apr 2020
diff takes the difference between adjacent data points. Because each difference requires two points, n points produce n-1 differences, so the result is one data point shorter.
For a simple array x=[3 4 5], diff(x)=[4-3 5-4] = [1 1].
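On the integration point raised in the question: the "lost" sample is exactly the information a cumulative sum needs as an initial value. A minimal sketch, assuming unit sample spacing:
y  = [3 4 5 9];                      % original signal
dy = diff(y);                        % one sample shorter: [1 1 4]
y_rec = [y(1), y(1) + cumsum(dy)];   % restore, using the first sample as the initial value
isequal(y_rec, y)                    % returns logical 1 (true)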
  2 Comments
farzad on 11 Apr 2020
I see. So how would you handle it? Should the derivative of a signal always be shorter?
farzad on 11 Apr 2020
If the signal is over time, is it meaningful or meaningless to have a shorter signal for the velocity or acceleration of that signal?
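One common convention, if you stick with diff, is to attach each difference to the midpoint of its two time samples, so the derivative and its time vector stay the same length as each other. A minimal sketch with a hypothetical time vector t and signal y:
t  = 0:0.1:1;                        % hypothetical time vector, 11 samples
y  = sin(2*pi*t);                    % hypothetical signal
v  = diff(y) ./ diff(t);             % velocity estimate: 10 samples
tm = (t(1:end-1) + t(2:end)) / 2;    % midpoint times: also 10 samples
plot(tm, v)                          % velocity plotted at the midpoints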

