I have a table with 2 columns; here are two typical examples:

Code:
   TABLE 1           TABLE 2
  X       Y         X       Y
-46.3    16.0     -50.3    71.2     
-40.1   -28.1     -43.6   117.7
-34.0  -154.0     -36.9   165.7
-27.8  -171.8     -30.2   176.9
-21.6   178.0     -23.5  -179.2
-15.4   166.2     -16.8  -173.3
 -9.3   120.1     -10.1  -149.3
 -3.1    -2.0      -3.4   -86.2
  3.1   -28.6       3.4   -67.0
  9.3   -80.7      10.1   -72.8
 15.4  -147.7      16.8   -93.5
 21.6  -175.5      23.5  -151.7
 27.8   162.6      30.2   112.9
 34.0   120.2      36.9    80.4
 40.1    49.4      43.6    67.7
 46.3    15.4      50.3    71.4
In table 1, Y decreases as X increases, while in table 2 Y increases. But notice the ambiguity: in table 1 the value Y = 150 is crossed by more than one pair of consecutive rows, and the same happens in table 2 for Y = 70.
I generate one such table at runtime (I plan to use between 20 and 50 rows). The Y column is an angle; I use radians from -pi to pi stored as double precision, but here I used degrees for the sake of simplicity.
The program generates an angle from -pi to pi and I need to find every pair of consecutive X values whose Y values bracket that angle. For example, if the angle is 150, for table 1 the function should find [-15.4, -9.3] and [27.8, 34.0].
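
To make the expected behaviour concrete, here is a minimal C sketch of one way the lookup could work (the names wrap_pi and find_brackets are just placeholders I made up, not from any library). It walks the consecutive rows, measures the signed step between the two row angles along the shorter arc, and reports the interval whenever the target angle falls inside that step. It assumes consecutive Y values never differ by more than pi, which should hold for 20 to 50 reasonably spaced rows.

Code:
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Wrap an angle difference into (-pi, pi]. */
static double wrap_pi(double a)
{
    while (a <= -M_PI) a += 2.0 * M_PI;
    while (a >   M_PI) a -= 2.0 * M_PI;
    return a;
}

/* Report every interval [x[i], x[i+1]] whose angles y[i], y[i+1]
   bracket 'target', measuring along the shorter arc between the
   two row angles (assumes consecutive rows differ by less than pi). */
static void find_brackets(const double x[], const double y[], int n, double target)
{
    for (int i = 0; i + 1 < n; ++i) {
        double span = wrap_pi(y[i + 1] - y[i]);  /* signed step between rows    */
        double off  = wrap_pi(target - y[i]);    /* signed offset of the target */

        /* Crossed if the target lies between 0 and 'span': same sign,
           and no farther from y[i] than y[i+1] is. */
        if (off * span >= 0.0 && fabs(off) <= fabs(span))
            printf("bracket: [%g, %g]\n", x[i], x[i + 1]);
    }
}

int main(void)
{
    /* Table 1 from above, Y converted from degrees to radians. */
    const double xs[] = { -46.3, -40.1, -34.0, -27.8, -21.6, -15.4, -9.3, -3.1,
                            3.1,   9.3,  15.4,  21.6,  27.8,  34.0, 40.1, 46.3 };
    const double ys_deg[] = {  16.0,  -28.1, -154.0, -171.8, 178.0, 166.2, 120.1,  -2.0,
                              -28.6,  -80.7, -147.7, -175.5, 162.6, 120.2,  49.4,  15.4 };
    double ys[16];
    for (int i = 0; i < 16; ++i)
        ys[i] = ys_deg[i] * M_PI / 180.0;

    /* Angle of 150 degrees: should print [-15.4, -9.3] and [27.8, 34.0]. */
    find_brackets(xs, ys, 16, 150.0 * M_PI / 180.0);
    return 0;
}

Is a linear scan like this the right idea, or is there a better/standard way to handle the wraparound at ±pi?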