# Sum of Squares Optimization

Pavan Dhareshwar and Theja Kanumury
University of Colorado, Boulder CO 80309, USA
[email protected], [email protected]

**Abstract.** Our project looked at using Sum of Squares (SOS) relaxations to obtain problems that can be solved in polynomial time. In this report, we first look at the theory behind the approach before applying this technique to popular NP-hard problems such as MAX-CUT and Lyapunov function search.

**Keywords:** sum of squares, semidefinite matrices, semidefinite programming, NP-hard

## 1 Introduction

## 2 Theory

## Applications

In order to apply the SOS optimization technique, we used a MATLAB toolbox called SOSTOOLS [1]. Combined with the semidefinite programming solver toolbox SeDuMi [2], SOSTOOLS can formulate and solve SOS programs using a simple, flexible, and intuitive high-level syntax. We then applied this technique to the following popular NP-hard problems so that their relaxations can be solved in polynomial time:

- Sum of Squares Test
- MAX-CUT
- Lyapunov Function Search
- Global and Constrained Optimization
- Matrix Copositivity
- Chebyshev Polynomials
- Bounds in Probability
- SOS Matrix Decomposition
- Set Containment

In the following sections, we define each problem, explain how it can be relaxed to an SOS problem, and run each problem through the SOSTOOLS toolbox.

### Sum of Squares Test

As explained in earlier sections, the problem of proving that a polynomial p(x) is nonnegative is NP-hard. If the problem is relaxed to checking whether p(x) is an SOS, it can be solved in polynomial time. While showing that p(x) is an SOS proves it is nonnegative, the converse is not always true: if a polynomial cannot be expressed as an SOS, it does not follow that it is negative somewhere.

We can use SOSTOOLS to solve the feasibility problem: given a polynomial p(x), determine whether

p(x) = Z^T(x) Q Z(x),   (1)

where Q is a positive semidefinite coefficient matrix and Z(x) is a vector of monomials. Taking a test polynomial, we ran the SOSTest.m code, found in the source code submitted along with this document.
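To make the decomposition in Equation 1 concrete, the following is a minimal numerical illustration in Python. The polynomial, monomial vector, and Gram matrix here are our own example, not the one used in SOSTest.m: for p(x, y) = x⁴ + 2x²y² + y⁴ = (x² + y²)², the choice Z = (x², xy, y²) admits a positive semidefinite Q, which certifies that p is a sum of squares.

```python
import numpy as np

# Hypothetical example: p(x, y) = x^4 + 2 x^2 y^2 + y^4 = (x^2 + y^2)^2.
# With the monomial vector Z(x, y) = [x^2, x*y, y^2], one valid Gram
# matrix Q satisfying p = Z^T Q Z is the diagonal matrix below.
Q = np.diag([1.0, 2.0, 1.0])

def p(x, y):
    return x**4 + 2 * x**2 * y**2 + y**4

def z(x, y):
    return np.array([x**2, x * y, y**2])

# Q is positive semidefinite: its smallest eigenvalue is nonnegative.
assert np.min(np.linalg.eigvalsh(Q)) >= 0

# The identity p = Z^T Q Z holds at randomly sampled points.
rng = np.random.default_rng(0)
for x, y in rng.normal(size=(100, 2)):
    assert np.isclose(p(x, y), z(x, y) @ Q @ z(x, y))
print("SOS certificate verified")
```

SOSTOOLS performs the reverse task: it searches over Gram matrices Q for a given Z(x), which is a semidefinite feasibility problem.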
Since the code returned a non-empty Q and Z(x), p(x) can be expressed as a sum of squares and hence p(x) ≥ 0. The result of SOSTest.m is shown in Figure 1(a). Running SOSTest.m on a second test polynomial returns an empty Q and Z(x), which means that the polynomial cannot be decomposed into a sum of squares and hence may be negative. The output is shown in Figure 1(b).

Fig. 1. (a) Output of the code for the first polynomial. (b) Output of the code for the second polynomial.

### MAX-CUT

The goal of the MAX-CUT algorithm is to divide an undirected graph G into two sets G1 and G2 so as to maximize the number of edges that have one node in G1 and the other in G2. This can be expressed as the constrained optimization

maximize f(x) over x ∈ {−1, 1}^n, with f(x) = (1/2) Σ_{i<j} w_ij (1 − x_i x_j),   (2)

where w_ij is the weight of the edge connecting nodes i and j, f(x) counts the number of cut edges, and x_i = 1 if node i belongs to G1, else x_i = −1.

Given f(x) and a bound γ, f(x) ≤ γ for all x ∈ {−1, 1}^n if there exist an SOS p1(x) and polynomials p2(x), …, p_{n+1}(x), where n is the number of nodes in G, such that

p1(x)(γ − f(x)) + Σ_{i=1}^{n} p_{i+1}(x)(x_i² − 1) − (γ − f(x))² ≥ 0.   (3)

Equation 3 cannot hold if f(x) > γ for some x ∈ {−1, 1}^n: at such a point the first term is nonpositive and the last term is strictly negative, while the middle term is 0. Therefore we need to set γ to the lowest value that satisfies the equation.

In the code, we examined a graph with 5 nodes whose edges form a closed chain. The number of cuts is given by

f(x) = 2.5 − 0.5x1x2 − 0.5x2x3 − 0.5x3x4 − 0.5x4x5 − 0.5x5x1.   (4)

The results of running MAXCUT.m with different values of γ can be seen in Figure 2. When γ = 3 we get a "Primal infeasible" warning and the feasibility ratio feasratio = −1.0899 < 0, which means Equation 3 is not satisfied. When γ = 4 no such warning occurs and feasratio = 1.0226 > 0, which means four is the maximum cut that can be obtained from the graph we are working with.

Fig. 2. (a) γ set to 3. (b) γ set to 4.

### Lyapunov Function Search

Lyapunov's theorem states that for a system ẋ = f(x) with an equilibrium at the origin to be stable, there must exist a function V(x) such that

V(0) = 0, V(x) > 0 for x ≠ 0, and V̇(x) ≤ 0.   (5)

For the system to be asymptotically stable, the stronger condition

V̇(x) < 0   (6)

must hold for x ≠ 0. We examined the system given in Equation 7, which has an equilibrium at the origin. The eigenvalue of its linearization cannot be used to determine local stability, as it is 0.

(7)

Upon running Lyapunov.m we get the Lyapunov function shown in Equation 8 and Figure 3.

(8)

Fig. 3. Output of Lyapunov.m.

### Global and Constrained Optimization

The SOS-based approach can be used to find a lower bound for the global minimum of a function f(x). The problem can be relaxed to minimizing −γ such that

f(x) − γ ≥ 0.   (9)

In the ConstrainedOptimization.m code we used the Goldstein–Price test function [3], given by Equation 10. The output of the code gave γ_opt = 3, as shown in Figure 4.

f(x) = [1 + (x1 + x2 + 1)²(19 − 14x1 + 3x1² − 14x2 + 6x1x2 + 3x2²)] × [30 + (2x1 − 3x2)²(18 − 32x1 + 12x1² + 48x2 − 36x1x2 + 27x2²)]   (10)

Fig. 4. Output of ConstrainedOptimization.m.

### Matrix Copositivity

A matrix A is copositive if it satisfies Equation 11 for every vector y ≥ 0:

y^T A y ≥ 0.   (11)

Equation 11 is very similar to what we stated earlier in Equation 1. We initially ran MatrixCopositivity.m taking A as stated in Equation 12; the output is shown in Figure 5(a). We then inverted the sign of one of the elements of A, setting A1,3 = −1, and ran the code with the new matrix A′. The output is shown in Figure 5(b).

    A = [  1  −1   1   1  −1
          −1   1  −1   1   1
           1  −1   1  −1   1      (12)
           1   1  −1   1  −1
          −1   1   1  −1   1 ]

Similar to the output obtained for MAXCUT.m, when we ran MatrixCopositivity.m for matrix A we got a feasibility ratio feasratio = 1.0433 with no warning statements, which means that A is copositive.
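As a quick numerical sanity check of Equation 11 (an illustration we add here, not part of MatrixCopositivity.m), copositivity of A can be probed by sampling nonnegative vectors. Sampling can only disprove copositivity by finding a negative value, never prove it, but for the matrix A of Equation 12 every sampled value of y^T A y is indeed nonnegative.

```python
import numpy as np

# Matrix A from Equation 12.
A = np.array([[ 1, -1,  1,  1, -1],
              [-1,  1, -1,  1,  1],
              [ 1, -1,  1, -1,  1],
              [ 1,  1, -1,  1, -1],
              [-1,  1,  1, -1,  1]], dtype=float)

# Sample many nonnegative vectors y and evaluate y^T A y (Equation 11).
# A single negative value would disprove copositivity; none appears.
rng = np.random.default_rng(0)
Y = rng.uniform(0, 1, size=(100_000, 5))
vals = np.einsum('ij,jk,ik->i', Y, A, Y)
print("min sampled y^T A y:", vals.min())
assert vals.min() >= -1e-12
```

A useful observation: for this A the quadratic form simplifies to y^T A y = (Σ y_i)² − 4(y1y2 + y2y3 + y3y4 + y4y5 + y5y1), so e.g. y = (1, 1, 0, 0, 0) gives exactly 0, showing the bound in Equation 11 is tight.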
When the code was run for matrix A′ we got a "Primal infeasible" warning and the feasibility ratio feasratio = −0.9901, which means the matrix may not be copositive.

Fig. 5. (a) Output of MatrixCopositivity.m with A. (b) Output of MatrixCopositivity.m with A′.

### Chebyshev Polynomials

Let pn(x) be a polynomial in one variable with degree n, and let γ be the coefficient of x^n. We maximize γ subject to

|pn(x)| ≤ 1, ∀x ∈ [−1, 1].   (13)

The maximizer is the Chebyshev polynomial of degree n, whose leading coefficient is 2^(n−1). We set the degree in Chebyshev.m and obtain the results shown in Figure 6.

Fig. 6. (a) Output of Chebyshev.m with n = 8, γ = 128. (b) Output of Chebyshev.m with n = 5, γ = 16.

### Bounds in Probability

Using SOS optimization we can find a bound on the worst-case probability of an event, given some information about the distribution. If we have an arbitrary probability distribution on x ∈ [0, 5] with mean µ = 1 and standard deviation σ = 1/2, then we can find the worst-case probability of x ≥ 4 by solving the following optimization problem:

minimize a·m0 + b·m1 + c·m2, subject to
a + bx + cx² ≥ 0, ∀x ∈ [0, 5],   (14)
a + bx + cx² ≥ 1, ∀x ∈ [4, 5],

where m0 = 1, m1 = µ, and m2 = µ² + σ². The output of BoundsInProbability.m with the above description and Equation 14 is given in Figure 7.

Fig. 7. Bound on the worst-case probability of the event = 0.02703.

### SOS Matrix Decomposition

We can also check whether an n × n polynomial matrix P(x) is an SOS matrix using SOSTOOLS. If P(x) is an SOS matrix, we can also find a matrix H(x) such that P(x) = H^T(x)H(x). We ran MatrixSOS.m for the matrix given by Equation 15, and the output is shown in Figure 8.

(15)

### Set Containment

Apart from checking whether a polynomial matrix P(x) is an SOS matrix, we can also use SOSTOOLS to compute entries of P(x) such that it is an SOS matrix. Given p(x), g0(x), and two scalar parameters, the tool will find a polynomial g1(x) and a sum of squares s(x) such that Equation 16 is a sum of squares matrix.

(16)

For the code SetContainment.m, we used a parameter value of 1 and g0 = 2x1, and obtained the g1 and s(x) given in Equation 17.
The output is in Figure 9.

g1 = −1.58x1³ + 2.67x1²x2 + 1.317e−7·x1² − 0.8713x1x2² − 8.628e−7·x1x2 − 2.317e−6·x2³ + 3.647e−7·x2²

s(x) = 3.89x1⁴ − 7.29e−6·x1³x2 − 5.542e−6·x1³ + 1.815x1²x2² − 1.956e−5·x1²x2 − 2.841x1² + 3.901e−6·x1x2³ + 1.056e−6·x1x2² − 1.086e−5·x1x2 + 5.98e−6·x1 + 1.436x2⁴ − 7.025e−7·x2³ + 0.5117x2² + 1.968e−6·x2 − 0.004719   (17)

Fig. 8. (a) Output of MatrixSOS.m with P(x). (b) Output of MatrixSOS.m when P(x) is not an SOS matrix.

Fig. 9. Output of SetContainment.m.

## Conclusion

In our project, we looked at the SOS optimization technique as a way of relaxing certain NP-hard problems so that the relaxations can be solved in polynomial time. We used the SOSTOOLS toolbox for MATLAB and went over these problems in the previous sections. One point to note is that for certain problems, such as MAX-CUT and matrix copositivity, the SDP formulations obtained by the tool are not the most efficient ones. This is why MAX-CUT is solved by choosing a γ and checking whether it is the right solution, similar to how solutions to NP problems are verified. The developers of the tool have mentioned that the special structure of the resulting polynomials will be exploited in the next release.

## References

1. Papachristodoulou, A., Anderson, J., Valmorbida, G., Prajna, S., Seiler, P., Parrilo, P.A.: SOSTOOLS: Sum of squares optimization toolbox for MATLAB. Available from http://www.eng.ox.ac.uk/control/sostools, http://www.cds.caltech.edu/sostools and http://www.mit.edu/~parrilo/sostools (2013)
2. Sturm, J.F.: Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones. Optimization Methods and Software 11, 625–653 (1999)
3. Goldstein, A.A., Price, J.F.: On descent from local minima. Mathematics of Computation 25, 569–574 (1971)