From what I understand from the tutorial, to find the distance between a line segment and a point, you need to consider the case where C (the point) lies outside the line segment AB. To determine whether this special case occurs, you use the sign of the dot product of the vectors formed by the three points A, B and C.
The tutorial says...
"First, check to see if the nearest point on the line AB is beyond B (as in the example above) by taking AB · BC. If this value is greater than 0, it means that the angle between AB and BC is between -90 and 90, exclusive, and therefore the nearest point on the segment AB will be B"
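To put concrete numbers on the quoted check (the coordinates here are my own made-up example, assuming 2D points), here is a quick sketch:

```python
def dot(u, v):
    """Dot product of two 2D vectors."""
    return u[0] * v[0] + u[1] * v[1]

# Hypothetical layout where C sits past the B end of the segment:
a, b, c = (0, 0), (2, 0), (3, 1)
ab = (b[0] - a[0], b[1] - a[1])  # vector from A to B
bc = (c[0] - b[0], c[1] - b[1])  # vector from B to C (note: measured from B, not from A)
print(dot(ab, bc))  # prints 2, which is > 0
```

Note that the tutorial's test uses BC, the vector starting at B, so the sign tells you which side of B the point C falls on.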
But how come it isn't...
"If this value is greater than 0, it means that the angle between AB and BC is between -90 and 90, exclusive, and therefore the nearest point to C is within segment AB (it may or may not be A or B, the endpoints)"
So what I'm thinking is: if the angle between AB and BC, theta, is between 0 and 90, then cos(theta) is positive, so the nearest point to C should lie within the segment AB. In that case, we can drop a perpendicular from C to AB and calculate its length as |cross(AC, AB)| / |AB|. Otherwise, if theta is greater than 90 degrees (and at most 180, since it's the angle between two vectors), then cos(theta) is negative, so C lies beyond the line segment AB; the closest point must then be one of the endpoints A or B, and the distance we're looking for is the length of AC or BC.
I'm unfamiliar with vector arithmetic but I know high school trigonometry. Does anyone know what the right answer here is :c?