# Projects

## Permutation groups and the complexity of Boolean functions

László Babai

Graph properties can be defined as the class of those Boolean functions in ${n \choose 2}$ variables that are invariant under the induced action of the symmetric group $S_n$ on the ${n \choose 2}$ pairs. This is but one example to illustrate how interesting classes of Boolean functions arise by postulating a group action.
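The invariance condition above can be checked by brute force for tiny $n$. The sketch below (an illustrative helper, not part of the project statement) tests whether a Boolean function on the ${n \choose 2}$ edge variables is unchanged under every relabeling of the vertices:

```python
from itertools import combinations, permutations

def is_graph_property(f, n):
    """Check that a Boolean function f on the C(n,2) edge variables is
    invariant under the induced action of S_n on vertex pairs."""
    pairs = list(combinations(range(n), 2))      # the C(n,2) variables
    index = {p: i for i, p in enumerate(pairs)}
    for x in range(2 ** len(pairs)):             # every edge assignment
        bits = [(x >> i) & 1 for i in range(len(pairs))]
        for sigma in permutations(range(n)):     # every vertex relabeling
            permuted = [bits[index[tuple(sorted((sigma[u], sigma[v])))]]
                        for (u, v) in pairs]
            if f(permuted) != f(bits):
                return False
    return True

# "contains at least one edge" is a graph property on n = 3 vertices ...
print(is_graph_property(lambda b: int(any(b)), 3))   # True
# ... but "edge {0,1} is present" is not invariant under relabeling.
print(is_graph_property(lambda b: b[0], 3))          # False
```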

We shall study open questions regarding various complexity measures of Boolean functions (decision-tree complexity, sensitivity, property-testing complexity, etc.) in the context of groups of symmetries admitted by these functions.
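As a concrete instance of one measure mentioned above, sensitivity can be computed exhaustively for small functions. A minimal sketch:

```python
def sensitivity(f, n):
    """Maximum over inputs x of the number of coordinates whose flip
    changes f(x) -- the (maximum) sensitivity of f on n bits."""
    best = 0
    for x in range(2 ** n):
        s = sum(1 for i in range(n) if f(x ^ (1 << i)) != f(x))
        best = max(best, s)
    return best

# OR on 3 bits: flipping any bit of the all-zero input changes the
# value, so the sensitivity is 3.
print(sensitivity(lambda x: int(x != 0), 3))   # 3
```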

### Prerequisites:

Basic group theory (including Sylow's theorems, the Jordan-Hölder theorem, solvability), discrete probability, and a course in discrete mathematics or equivalent.

## Learning Polynomials, Graphs, and Densities

John Lafferty

Real algebraic geometry and functional analysis provide characterizations of polynomials that are positive on a semi-algebraic set. Recent work has shed new light on the computational aspects of positivity and convexity in polynomial optimization. While convexity is generally hard to verify for real polynomials, computationally tractable subfamilies of convex polynomials are based on semi-definite programming.

This project will explore the use of these ideas in machine learning. The goal is to develop algorithms for estimation of probability densities, regression functions, and graphical models in high dimensions, using semi-definite programming techniques for convex polynomials. The work will also involve studying the statistical properties of the algorithms, and experimenting with them on real data.
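The semidefinite-programming connection can be illustrated on a toy sum-of-squares certificate. In practice an SDP solver searches for the Gram matrix; the sketch below merely verifies a hand-picked one (the polynomial and matrix are illustrative, not taken from the references):

```python
import numpy as np

# p(x) = x^4 + 2x^2 + 1 written as z^T Q z with z = (1, x, x^2).
# A PSD Gram matrix Q certifies that p is a sum of squares, hence
# nonnegative; an SDP solver would search for such a Q automatically.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

eigenvalues = np.linalg.eigvalsh(Q)
print(eigenvalues >= -1e-9)   # all True: Q is PSD

# The eigendecomposition of Q recovers the SOS decomposition:
# here p(x) = (1 + x^2)^2, matching the single nonzero eigenpair.
```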

### References:

• Jean Bernard Lasserre, "Moments, Positive Polynomials and Their Applications," Imperial College Press, 2010.
• Amir Ali Ahmadi, "Algebraic Relaxations and Hardness Results in Polynomial Optimization and Lyapunov Analysis," MIT EECS Ph.D. Thesis, 2011.
• H. Liu, J. Lafferty, and L. Wasserman, "The nonparanormal: Semiparametric estimation of high dimensional undirected graphs," Journal of Machine Learning Research (JMLR), 2009.
• M. Cule, R. Samworth, and M. Stewart, "Maximum likelihood estimation of a multi-dimensional log-concave density," J. Roy. Statist. Soc., Ser. B., 2010.

## Approximation algorithms and hardness results for Max k-CSP with very large domain size

Yury Makarychev

This project studies the Maximum Constraint Satisfaction Problem. In the Maximum $k$-CSP($d$) problem, we are given a set of variables and constraints. Each variable takes values in a domain of size $d$; each constraint depends on at most $k$ variables. Our goal is to find an assignment to the variables that maximizes the number of satisfied constraints.
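On tiny instances the optimum can be found by brute force, which makes the problem definition concrete. A minimal sketch (the encoding of constraints as scope/predicate pairs is my own choice, not from the references):

```python
import itertools

def max_csp_value(n, d, constraints):
    """Brute-force optimum of a Max k-CSP(d) instance; each constraint
    is a pair (scope, predicate) with scope a tuple of variable
    indices."""
    best = 0
    for assignment in itertools.product(range(d), repeat=n):
        sat = sum(pred(*(assignment[v] for v in scope))
                  for scope, pred in constraints)
        best = max(best, sat)
    return best

# Toy instance: n = 3 variables over a domain of size d = 3, arity k = 2.
constraints = [((0, 1), lambda a, b: a != b),
               ((1, 2), lambda a, b: a != b),
               ((0, 2), lambda a, b: (a + b) % 3 == 0)]
print(max_csp_value(3, 3, constraints))   # 3, e.g. by x = (0, 1, 0)
```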

The Boolean and non-Boolean variants of this problem have been extensively studied in the literature, and the approximability of the problem is well understood when $k \geq \Omega(d)$. On the one hand, the problem admits an $\Omega(kd/d^k)$-approximation when $k \geq \Omega(\log d)$ [MM]. On the other hand, it is NP-hard to approximate better than $O(kd/d^k)$ when $k \geq d$ [Chan]. The known upper and lower bounds for the case $d \gg k$, however, are much weaker. This project will study upper and lower bounds for the problem in this regime.

### References:

• [Chan] S. O. Chan. Approximation resistance from pairwise independent subgroups. In Proc. of Symposium on Theory of Computing, 2013, pp. 447–456.
• [CMM] M. Charikar, K. Makarychev, and Y. Makarychev. Near-optimal algorithms for maximum constraint satisfaction problems. ACM Trans. Algorithms 5, 3, Article 32, July 2009.
• [MM] K. Makarychev and Y. Makarychev. Approximation Algorithm for Non-boolean MAX k-CSP. In Proc. of APPROX, 2012, pp. 254–265.

## Applications of deep convolutional neural networks

Greg Shakhnarovich

Recently, deep convolutional neural networks have emerged as a promising tool for solving a variety of computer vision tasks, in particular image classification and object detection. We would like to investigate their utility in settings not previously explored, and to analyze, empirically and theoretically, the effect of architecture parameters on performance. This is a broadly defined project, with multiple possible directions depending on the background and interests of the student.

### Prerequisites:

Probability, linear algebra, basic programming skills (Python, Matlab, C++)

## Generic framework for parameter optimization in machine learning

Greg Shakhnarovich

Many machine learning methods include optimization of some "meta-parameters" that control various tradeoffs in learning algorithms. Common examples include setting the value of the regularization coefficient in shrinkage methods, setting the depth of hierarchical architectures such as deep neural networks, and setting the number of target dimensions in a dimension-reduction framework. Often this requires the design of an ad-hoc parameter-search framework. The goal of this project is to design and implement a generic system for efficient multi-scale search over the parameter space, as well as to design an API that would allow plugging a learning algorithm into such a system.
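One possible shape for such a multi-scale search is a coarse-to-fine zoom on a log scale. The sketch below is a hypothetical starting point (the name `refine_search` and its interface are my own, not a specification of the project's API):

```python
import math

def refine_search(score, lo, hi, rounds=4, grid=5):
    """Multi-scale search for one meta-parameter on a log scale:
    evaluate a coarse grid over [lo, hi], then repeatedly zoom in
    around the best point.  `score` is any callable mapping a
    parameter value to a validation score (higher is better)."""
    for _ in range(rounds):
        candidates = [lo * (hi / lo) ** (i / (grid - 1)) for i in range(grid)]
        best = max(candidates, key=score)
        step = (hi / lo) ** (1 / (grid - 1))
        lo, hi = best / step, best * step   # zoom into the winning cell
    return best

# Toy objective peaked at lambda = 0.01 (e.g. a regularization
# coefficient); the search homes in on it in a few rounds.
print(round(refine_search(lambda lam: -abs(math.log10(lam) + 2),
                          1e-6, 1e2), 4))   # 0.01
```

A real system would plug a cross-validated learning algorithm in as `score` and extend the zoom loop to several parameters at once.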

### Prerequisites:

Basic machine learning/statistics, programming skills.

## Finding New Classes of Approximation Resistant Predicates

For a given Boolean predicate $P$ on (say) $k$ Boolean variables, an instance of CSP($P$) with $n$ variables and $m$ constraints is described by specifying $m$ $k$-tuples of literals (variables or their negations). For a tuple $(l_1, \ldots, l_k)$ we consider the constraint $P(l_1, \ldots, l_k) = 1$, and the goal is to find an assignment to all the variables satisfying the maximum number of constraints.

$P$ is said to be approximation resistant if it is NP-hard to approximate CSP($P$) significantly better than by simply using a random assignment to the variables. Recently, Khot et al. gave a complete characterization of approximation-resistant predicates in terms of their Fourier structure. The goal of this project is to find explicit examples of predicates that satisfy this characterization. It would also be interesting to study whether one can obtain a simpler characterization for a special class of predicates.
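The random-assignment baseline is easy to make concrete: since each literal tuple relabels the inputs bijectively, a uniformly random assignment satisfies each constraint with probability equal to the fraction of inputs $P$ accepts. A minimal sketch:

```python
import itertools

def random_assignment_ratio(P, k):
    """Fraction of the 2^k inputs accepted by P -- the expected
    fraction of constraints a uniformly random assignment satisfies,
    since each k-tuple of literals is a bijective relabeling of the
    inputs."""
    accepted = sum(P(x) for x in itertools.product([0, 1], repeat=k))
    return accepted / 2 ** k

# 3-XOR: P(x1, x2, x3) = x1 + x2 + x3 mod 2, a classic
# approximation-resistant predicate (Hastad).  A random assignment
# satisfies half the constraints in expectation.
print(random_assignment_ratio(lambda x: sum(x) % 2, 3))   # 0.5
```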

## Proving New Hardness Results for Semidefinite Hierarchies

For constraint satisfaction problems, Benabbas et al. proved that for a very general class of predicates $P$, CSP($P$) is hard to approximate using LP hierarchies (even after adding some basic semidefinite constraints). Their result also gives a candidate approach for proving such results in the Lasserre SDP hierarchy, the technical part of which reduces to proving that a certain family of matrices is positive semidefinite. The project could start by computationally verifying this for small matrices and then move on to developing general techniques for proving it.
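The computational starting point can be sketched directly: given any parametrized family of candidate matrices, check each one for positive semidefiniteness numerically. The family below is a hypothetical stand-in chosen only to exercise the check, not the family arising from the hierarchy:

```python
import numpy as np

def family_is_psd(matrix_of, params, tol=1e-9):
    """Check every matrix in a parametrized family for positive
    semidefiniteness via its smallest eigenvalue."""
    return all(np.linalg.eigvalsh(matrix_of(t)).min() >= -tol
               for t in params)

# Hypothetical stand-in family: M(t) = (1 - t) I + t J on 3x3
# matrices, which is PSD exactly for t in [-1/2, 1].
M = lambda t: (1 - t) * np.eye(3) + t * np.ones((3, 3))
print(family_is_psd(M, np.linspace(-0.5, 1.0, 16)))   # True
print(family_is_psd(M, [-0.6]))                       # False
```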