The problem here is that you appear to have a fundamental misconception about how discrete data work. In your example, you are plotting this:
and claiming the "real" line segment under the threshold is 0.2 units long. However, what you are actually inputting as data looks more like this:
in which case you can clearly see why MATLAB thinks the line segment is 1 unit long. You are treating the graphical artifact of MATLAB connecting the dots with straight lines as if it were the actual nature of the data.
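To make this concrete, here is a small sketch. The sample values and the threshold of 0.5 are hypothetical (they are not given in your question), but with samples 1 unit apart and only the middle point below the threshold, the straight connecting lines cross the threshold exactly halfway between samples:

```matlab
% Hypothetical data: samples only at integer x, middle point below threshold.
x = 4:1:6;
y = [1 0 1];
thr = 0.5;

% Where the straight connecting lines cross the threshold:
xc1 = interp1(y(1:2), x(1:2), thr);   % 4.5, curve going down
xc2 = interp1(y(2:3), x(2:3), thr);   % 5.5, curve going up
segLen = xc2 - xc1;                   % 1 unit, exactly what MATLAB "sees"
```

Between the samples there simply is no data; the 1-unit segment is a property of the linear interpolation, not of whatever underlying curve produced the samples.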
I am assuming you are here because finding an analytical definition of your actual problem and integrating it is a headache, or outright impossible. I would suggest one of the following:
1) If you have a rough idea about the underlying nature of the surface, you can try to fit an analytical form to it and then find the area above the threshold by integration. For this you should be able to get away with `cftool` (the Curve Fitting app).
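If you prefer to do this programmatically, the `fit` function (from the same Curve Fitting Toolbox that `cftool` fronts) returns a fit object you can evaluate and integrate. A minimal sketch with made-up data and a `poly2` model, neither of which comes from your question:

```matlab
% Hypothetical noisy samples of a roughly quadratic curve.
x = linspace(4, 6, 21)';
y = (x - 5).^2 + 0.05*randn(size(x));

f = fit(x, y, 'poly2');           % same model family cftool offers
thr = 0.5;                        % hypothetical threshold

% Length of the domain where the fitted curve exceeds the threshold,
% estimated by integrating an indicator function on a dense grid:
t = linspace(4, 6, 1001)';
L = trapz(t, double(f(t) > thr));
```

For 2-D surface fits the same idea applies with a dense grid in both coordinates, and the integral gives an area instead of a length.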
2) If your data is just plain ugly, you can use an interpolant to estimate the area. Assuming your data is gridded like your example, use `griddedInterpolant` (`scatteredInterpolant` for ungridded data) with an interpolation scheme appropriate for your case. You have treated the example above as if the data points were connected by straight lines, so `'linear'` (the default) matches that assumption. Now sample your data much more densely; for the example above, `Denser = F(4:0.01:6)`, where `F` is the interpolant and the step `0.01` can be adjusted to your accuracy needs and computational budget. Finally, threshold `Denser` and use `trapz`, a simple pixel count, a triangulation method, or something like marching squares/cubes to get the area.
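Putting those steps together for the 1-D example, with hypothetical sample values and threshold (neither appears in your question):

```matlab
% Hypothetical 1-D data on an integer grid, as in the example above.
x = 4:1:6;
y = [1 0 1];                          % hypothetical sample values
thr = 0.5;                            % hypothetical threshold

F = griddedInterpolant(x, y, 'linear');
step = 0.01;                          % tune for accuracy vs. cost
xd = 4:step:6;
Denser = F(xd);

% Length of the region below the threshold, by simple "pixel" count:
below = Denser < thr;
segLen = nnz(below) * step;
```

In 2-D the same pattern applies: build the interpolant on the grid, evaluate it on a dense `ndgrid`, and multiply the count of below-threshold pixels by the pixel area (or hand the mask to a marching-squares routine for a smoother boundary).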