Fredman’s Trick

OK, sorry, I was being silly.

We wanted to compute the MSP of two {n \times w} matrices {A} and {B}. I.e., we wanted {C := A \odot B^\intercal}, say. (Notice that I’ve transposed {B}.) In words, this is

\displaystyle  C_{ij} = \min_{k \in [w]} \{ A_{ik} + B^\intercal_{kj} \} = \min_{k \in [w]} \{ A_{ik} + B_{jk}\}.

If {k^*} is the minimizer of this expression, then

\displaystyle  A_{ik^*} + B_{jk^*} \leq A_{ik} + B_{jk} \quad \forall k \in [w],

which is equivalent to

\displaystyle  A_{ik^*} - A_{ik} \leq - (B_{jk^*} - B_{jk}) \quad \forall k \in [w].

Indeed, the signs are opposite for the {A} and {B} entries. No worries.
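Before the trick itself, a minimal baseline sketch for reference (in Python, with names of my own choosing, not from the lecture): computing this product directly takes {O(n^2 w)} additions and comparisons.

```python
# Naive baseline (not Fredman's trick): C[i][j] = min_k A[i][k] + B[j][k],
# where A and B are both n-by-w lists of lists. Uses O(n^2 w) additions
# and comparisons.
def min_plus_naive(A, B):
    n, w = len(A), len(A[0])
    return [[min(A[i][k] + B[j][k] for k in range(w)) for j in range(n)]
            for i in range(n)]
```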

Now, for each possible {p,q \in [w]}, do the following. Take the {2n} numbers

\displaystyle  A_{1p} - A_{1q}, \ldots, A_{np} - A_{nq}, -(B_{1p} - B_{1q}), \ldots, -(B_{np} - B_{nq})

and sort them with {O(n \log n)} comparisons.
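Here is a sketch of this sorting phase. The dictionaries rankA and rankB and the tie-breaking rule are my own additions (not in the post): recording each entry's position in its sorted list is one concrete way to remember the outcome of the sort, and breaking ties in favour of the {A}-entries matches the non-strict inequality above.

```python
from itertools import product

# Sorting phase: for every pair (p, q) sort the 2n numbers
#   A[i][p] - A[i][q]   (tagged 0)   and   -(B[j][p] - B[j][q])   (tagged 1)
# and record each entry's position. Tag 0 sorts before tag 1 on ties, so
# "position of A-entry < position of B-entry" holds exactly when
# A[i][p] - A[i][q] <= -(B[j][p] - B[j][q]).
def sort_phase(A, B):
    n, w = len(A), len(A[0])
    rankA, rankB = {}, {}  # keys (p, q, i) / (p, q, j) -> position in sorted order
    for p, q in product(range(w), repeat=2):
        tagged = [(A[i][p] - A[i][q], 0, i) for i in range(n)] \
               + [(-(B[j][p] - B[j][q]), 1, j) for j in range(n)]
        tagged.sort()
        for pos, (_, tag, idx) in enumerate(tagged):
            (rankA if tag == 0 else rankB)[(p, q, idx)] = pos
    return rankA, rankB
```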

How many comparisons have we done overall? We enumerated over {w^2} pairs {p,q}, and for each we did {O(n \log n)} comparisons, for a total of {O(w^2 n \log n)}. We claim we don’t need any more: we already have all the information needed to compute the MSP.

How? Take {i,j}. Find some index {k^* \in [w]} such that for every {k \in [w]}, the entry {A_{ik^*} - A_{ik}} precedes the entry {- (B_{jk^*} - B_{jk})} in the sorted list for the pair {p = k^*, q = k}. Such a {k^*} always exists (the true minimizer is one), and finding it requires only looking up positions in the sorted lists, not comparing any new matrix entries. Now you can set {C_{ij} = A_{ik^*} + B_{jk^*}}. Done.
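Continuing the same hypothetical sketch, the recovery step compares only the stored positions, never the matrix entries themselves:

```python
# Recovery phase: for each (i, j), find a k* that beats every k according to
# the stored positions, and read off C[i][j] = A[i][k*] + B[j][k*]. Only
# positions are compared here; no new comparisons between matrix entries.
def recover_phase(A, B, rankA, rankB):
    n, w = len(A), len(A[0])
    C = [[None] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for kstar in range(w):
                if all(rankA[(kstar, k, i)] < rankB[(kstar, k, j)]
                       for k in range(w)):
                    C[i][j] = A[i][kstar] + B[j][kstar]
                    break
    return C
```

Together, sort_phase and recover_phase return the same answer as min_plus_naive, but all comparisons between matrix entries happen inside sort_phase.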

Let’s also complete the analysis for Fredman’s trick. We’d broken the {n \times n} MSP into {n/w} different {n \times w} mini-MSPs, which together take {(n/w) \cdot (w^2 n \log n) = n^2 w \log n} comparisons. Now we can combine them by taking the component-wise minima. Each minimum takes {(n/w)-1} comparisons since there are {n/w} mini-MSPs, and there are {n^2} entries, so a total of about {n^3/w} comparisons. Balancing the two terms by setting {w = \sqrt{n/\log n}} gives us {O(n^2 \sqrt{n \log n})} comparisons.
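For completeness, here is a sketch of how the mini-MSPs would be stitched together (again with hypothetical names; I assume {w} divides {n}, and mini_msp could be, say, a wrapper around the two phases above).

```python
# Combine phase: split the inner index into n/w blocks of size w, solve each
# n-by-w mini-MSP, and take component-wise minima of the results.
def msp(X, Y, w, mini_msp):
    # X, Y are n-by-n; returns C with C[i][j] = min_k X[i][k] + Y[k][j].
    n = len(X)
    C = [[float('inf')] * n for _ in range(n)]
    for start in range(0, n, w):
        Ablk = [row[start:start + w] for row in X]                      # n-by-w block of X
        Bblk = [[Y[start + k][j] for k in range(w)] for j in range(n)]  # n-by-w transposed block of Y
        Cblk = mini_msp(Ablk, Bblk)
        for i in range(n):
            for j in range(n):
                if Cblk[i][j] < C[i][j]:
                    C[i][j] = Cblk[i][j]
    return C
```

For example, one could call msp(X, Y, w, lambda A, B: recover_phase(A, B, *sort_phase(A, B))).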

Fredman’s paper gives a better analysis to get rid of the log term, and uses it to get an algorithm with runtime better than {n^3} by a factor of {(\log n/\log \log n)^{1/3}}. But we’ll see a larger improvement in the next lecture.
