Given two points x and y placed at opposite corners of a rectangle, find the minimal Euclidean distance between a third point z and the rectangle, i.e. the minimum over all points inside or on the rectangle.
For example, the two points
x = [-1,-1]; y = [1,1];
define a square centered at the origin. The distance between the point
z = [4,5];
and this square is
d = 5;
(the closest point in the square is at [1,1])
The distance between the point z = [0,2] and this same square is d = 1 (closest point at [0,1])
The distance between the point z = [0,0] and this same square is d = 0 (inside the square)
Notes:
For the n-dimensional case, it is more precise to say that x and y lie on opposite vertices of an n-dimensional hypercuboid whose edges are all parallel to the coordinate axes.
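One standard approach (a sketch only, not necessarily the intended Cody solution): the closest point of an axis-aligned box to z is obtained by clamping each coordinate of z into the box's range along that axis; the answer is then the Euclidean distance to that clamped point. The logic is shown here in Python for portability; the function name rect_point_dist is chosen for illustration.

```python
import math

def rect_point_dist(x, y, z):
    # x, y: opposite corners of an axis-aligned box (any dimension).
    # z: the query point. Returns 0 when z lies inside the box.
    d2 = 0.0
    for xi, yi, zi in zip(x, y, z):
        lo, hi = min(xi, yi), max(xi, yi)
        # Clamp z's coordinate into [lo, hi]; the clamped point is the
        # nearest point of the box to z along this axis.
        c = min(max(zi, lo), hi)
        d2 += (zi - c) ** 2
    return math.sqrt(d2)

print(rect_point_dist([-1, -1], [1, 1], [4, 5]))  # 5.0
print(rect_point_dist([-1, -1], [1, 1], [0, 2]))  # 1.0
print(rect_point_dist([-1, -1], [1, 1], [0, 0]))  # 0.0
```

The same clamp-then-measure idea works unchanged in MATLAB with min/max and norm, and generalizes to any number of dimensions.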