I can't quite figure out how to calculate the intersection of two vectors mathematically. (The closest I could find was this, but it uses the brute-force method, which I'd prefer not to do. It has a good description of the problem, however, so you should read it in addition to my description, which is probably hard to understand.)
Here's a description of the problem. An object (let's call it T, for target) is moving some distance away from you. You have another object which moves at a fixed speed (Y, for you). You have to aim this object so that it will collide with T (at the point of intersection, I). (Basically you need to figure out the angle at which to fire Y.) You know only: the distance between Y and T (YT); the velocity (speed (delta TI) and direction) of T; and the speed at which Y will travel (delta YI).
Code:
          ^ I
<---------\-- . T
           \  |
            \ |
             \. Y
I managed to solve it for the restricted case where T is travelling perpendicular to Y:
Code:
TI = TY/sqrt((delta YI/delta TI)^2 - 1)
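To sanity-check that formula, here's a small sketch (the numbers are made up, just for illustration). Since the triangle YTI has its right angle at T, the ratio YI/TI should come out equal to the speed ratio:

```python
import math

# Hypothetical numbers for the perpendicular case.
TY = 100.0          # initial distance between Y and T
speed_T = 3.0       # T's speed (delta TI per unit time)
speed_Y = 5.0       # Y's speed (delta YI per unit time)

# TI = TY / sqrt((delta YI / delta TI)^2 - 1)
TI = TY / math.sqrt((speed_Y / speed_T) ** 2 - 1)

# Cross-check with Pythagoras: Y covers the hypotenuse YI in the same
# time T covers the leg TI, so YI/TI must equal speed_Y/speed_T.
YI = math.hypot(TY, TI)
print(TI)
print(YI / TI, speed_Y / speed_T)   # these two should match
```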
However, the above formula relies on Pythagoras' Theorem, which doesn't hold for non-right triangles.
The other two cases I have been able to solve are when T is travelling directly away from Y and when T is travelling directly towards Y. This leaves two cases which aren't handled: T is travelling at an acute angle towards Y (which has the potential for two intersections), and T is travelling at an obtuse angle away from Y. And of course, it is possible that Y cannot intersect with T at all, if it can't go fast enough.
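For what it's worth, one standard way to cover all of these cases at once (not taken from the post above, so treat it as a sketch) is to work with vectors instead of angles: put Y at the origin, let T start at position p with velocity v, and solve |p + v*t| = (delta YI)*t for the intercept time t. Squaring both sides gives a quadratic in t, whose discriminant going negative is exactly the "Y can't go fast enough" case, and whose two positive roots are the two intersections in the acute-approach case:

```python
import math

def intercept_time(px, py, vx, vy, s):
    """With Y at the origin, T at (px, py) moving with velocity (vx, vy),
    and Y travelling at speed s, solve |T + v*t| = s*t for t.
    Returns the smallest positive intercept time, or None if impossible."""
    a = vx * vx + vy * vy - s * s
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-12:                  # equal speeds: equation degenerates to linear
        if abs(b) < 1e-12:
            return None
        t = -c / b
        return t if t > 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:                        # Y can't reach T at all
        return None
    r = math.sqrt(disc)
    roots = sorted(((-b - r) / (2 * a), (-b + r) / (2 * a)))
    for t in roots:                     # pick the earliest positive root
        if t > 0:
            return t
    return None

# Perpendicular case from above as a check: T at (100, 0) moving at
# speed 3 across the line of sight, Y firing at speed 5.
t = intercept_time(100.0, 0.0, 0.0, 3.0, 5.0)
ix, iy = 100.0, 3.0 * t                 # the intersection point I
print(t, ix, iy)
```

The aim direction is then just the vector from Y to I, so no trigonometry is needed at all.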
Can anyone come up with the general formula for this?