## Lec #17: Ellipsoid and Interior-Point

Hi all: since we covered most of the short-step method today, it may be easiest if you look over the proof yourselves. We can discuss it in office hours on Tuesday if you’d like. I’d like to start with concentration bounds on Monday.

Here are some other notes about the Ellipsoid and interior-point methods:

• We talked about how a separation oracle for a convex body ${K}$ gives (via Ellipsoid) an algorithm for optimization over ${K}$. In fact, the two problems of separation and optimization are equivalent! So, given an algorithm for max-weight perfect matching in general graphs, you also get an algorithm for finding minimum odd cuts. You will give a direct algorithm for this problem on the upcoming HW.
• In fact, something even more surprising is known. Suppose we are given a point ${x_0}$ in the body ${K}$, and values ${r, R}$ such that ${B(x_0,r) \subseteq K \subseteq B(x_0, R)}$. And we are also given a membership oracle, which, given a point ${x}$, just outputs whether or not ${x \in K}$. Then we can still optimize over ${K}$. The idea is to use the membership oracle to sample nearly-uniform points from ${K}$ and thereby simulate a (weak) separation oracle.

(It is important that you be given a point inside ${K}$; otherwise you have no chance, even if you know a bounding ball ${B(0,R)}$ and that ${K}$ contains some non-trivial ball. There is too much volume in ${B(0,R)}$ for you to go searching with just a membership oracle: you’ll need roughly ${(R/r)^n}$ time to find a point inside ${K}$.)
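A quick back-of-the-envelope check of that ${(R/r)^n}$ blowup (numbers here are illustrative): a uniform random point in ${B(0,R)}$ lands inside a fixed ${r}$-ball with probability ${(r/R)^n}$, so blind search with only a membership oracle needs about ${(R/r)^n}$ queries in expectation.

```python
# Probability that a uniform point in B(0, R) hits a fixed r-ball
# scales as (r/R)^n, so blind search needs ~ (R/r)^n queries.
# Illustrative parameters:
n, R, r = 50, 2.0, 1.0
hit_probability = (r / R) ** n       # astronomically small already at n = 50
expected_queries = (R / r) ** n      # astronomically large
print(hit_probability, expected_queries)
```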

Of course, this three-way equivalence between membership, separation, and optimization requires care to make precise and to prove. See the Grötschel–Lovász–Schrijver book for all the details.
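To make the separation side of this equivalence concrete, here is a minimal sketch of a separation oracle for an explicit polytope ${\{x : Ax \le b\}}$, together with one central-cut Ellipsoid update. (The function names and tolerance are mine; this is an illustration, not a full implementation.)

```python
import numpy as np

def separation_oracle(A, b, x, tol=1e-9):
    """For K = {x : Ax <= b}: return None if x is in K, else a violated
    row (a_i, b_i), i.e. a hyperplane separating x from K."""
    violations = A @ x - b
    i = int(np.argmax(violations))
    if violations[i] <= tol:
        return None
    return A[i], b[i]

def ellipsoid_step(center, P, a):
    """One central-cut update: replace the ellipsoid
    {x : (x - center)^T P^{-1} (x - center) <= 1} by the smallest
    ellipsoid containing its intersection with the half-space
    {x : a . (x - center) <= 0}.  Requires dimension n >= 2."""
    n = len(center)
    g = P @ a / np.sqrt(a @ P @ a)
    new_center = center - g / (n + 1)
    new_P = (n * n / (n * n - 1.0)) * (P - (2.0 / (n + 1)) * np.outer(g, g))
    return new_center, new_P
```

Invoking the update whenever the oracle reports a violation shrinks the ellipsoid’s volume by a factor of roughly ${e^{-1/(2n)}}$ per step, which is where the polynomial bound comes from.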

• There are many different ways to implement interior-point methods. We saw a primal-dual analysis that was pretty much from first principles (with a little bit swept under the rug; even those details can be found in the Matoušek–Gärtner book). Matoušek and Gärtner also give a way to find a starting point ${x_0}$ that differs from the one we outlined in lecture.
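For readers who want to see the skeleton of one such implementation in code, here is a bare-bones log-barrier path-following sketch for an LP ${\min c^\top x}$ subject to ${Ax \le b}$. This is the plain barrier method rather than the primal-dual analysis from lecture, and all names and parameters (e.g. ${\mu = 10}$) are illustrative.

```python
import numpy as np

def barrier_lp(c, A, b, x0, t=1.0, mu=10.0, tol=1e-8):
    """Log-barrier sketch for min c.x s.t. Ax <= b, from a strictly
    feasible x0: minimize t*(c.x) - sum(log(b - Ax)) by damped Newton
    steps, then increase t.  m/t bounds the duality gap."""
    x = np.asarray(x0, dtype=float)
    m = len(b)

    def phi(x, t):
        s = b - A @ x
        return np.inf if np.min(s) <= 0 else t * (c @ x) - np.sum(np.log(s))

    while True:
        for _ in range(50):                       # Newton centering at this t
            s = b - A @ x                         # slacks, must stay positive
            grad = t * c + A.T @ (1.0 / s)
            H = A.T @ ((1.0 / s**2)[:, None] * A)
            dx = np.linalg.solve(H, -grad)
            step = 1.0                            # backtrack: stay feasible,
            while step > 1e-14 and phi(x + step * dx, t) > phi(x, t):
                step *= 0.5                       # and don't increase phi
            if phi(x + step * dx, t) <= phi(x, t):
                x = x + step * dx
        if m / t < tol:                           # duality-gap bound m/t
            return x
        t *= mu
```

On the toy LP ${\min x_1 + x_2}$ over the unit box this converges to (a point exponentially close to) the optimum at the origin.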

Sadly, we did not talk about Newton’s method, or self-concordance, or local norms and Dikin ellipsoids, which form the basis for the “modern” treatment of interior-point methods. If your interest is piqued, you should check out a dedicated optimization course (Tepper and MLD both offer one), or have a look at one of the books listed on the course webpage.
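Just to give a taste of what those ideas look like: in the self-concordance analysis, the natural step length is ${1/(1+\lambda)}$, where ${\lambda}$ is the Newton decrement, the length of the Newton step in the local norm whose unit ball is the Dikin ellipsoid. A generic sketch (the function names are mine):

```python
import numpy as np

def damped_newton(grad, hess, x, steps=25):
    """Damped Newton: x <- x - dx/(1 + lam), where dx = H^{-1} grad(x)
    and lam = sqrt(dx . H . dx) is the Newton decrement -- the step's
    length in the local (Dikin-ellipsoid) norm.  Since lam/(1+lam) < 1,
    each step stays strictly inside the Dikin ellipsoid, hence inside
    the domain of a self-concordant function."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        H = hess(x)
        dx = np.linalg.solve(H, grad(x))
        lam = np.sqrt(dx @ H @ dx)
        x = x - dx / (1.0 + lam)
    return x
```

For instance, on the self-concordant function ${f(x) = \sum_i (x_i - \log x_i)}$ this converges to the minimizer ${x = \mathbf{1}}$ from any positive starting point.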

Another interior-point algorithm that is very approachable (and has a short, self-contained exposition) is Renegar’s algorithm: here are notes by Ryan from our LP/SDP course.