Development of a Theory of "Smoothed Analysis" for Discrete Problems and the Application of "Smoothed Analysis" to Other Concepts such as Approximability
Final Report Abstract
The project had four main goals:

Development of perturbation models: While for geometric problems perturbation models are naturally given by the distance function, there are often no natural perturbation models for discrete problems. We therefore devoted a lot of time to finding general perturbation models. One very general model is sub-sampling: given an instance of a concrete problem, say TSP, we obtain a perturbed instance by randomly dropping each city with some probability p. A natural interpretation of this model is that a truck delivers goods to stores and a store needs new goods with probability 1 - p. However, we could show that it is rather unlikely that problems that are hard in the worst case have polynomial smoothed complexity under sub-sampling, and we gave further evidence for this. We therefore investigated more specific perturbation models. One promising model is random shortest path metrics, which are obtained by perturbing the entries of a distance matrix and then taking the shortest path closure of the perturbed matrix. We investigated this model. More generally, one can look at the metric polytope: the set of all matrices that describe a metric forms a polytope defined by the triangle inequalities. As a perturbation model, one can consider neighbourhoods (with respect to the Euclidean metric, for instance) of a point in this polytope. However, further investigation is needed here, since the behaviour of this perturbation is hard to understand. (Illustrative sketches of these perturbation models are given at the end of this report.)

Smoothed analysis of discrete problems: Since we could only show negative results for sub-sampling, we had to focus more on traditional models for smoothed analysis, in particular on Euclidean problems (see the last sketch below). Most notably, we developed general frameworks that allow the analysis of many different algorithms with one general technique, for instance for TSP, Steiner trees, and the k-means method. This framework even gives a guideline for designing algorithms with good smoothed performance. Furthermore, we investigated various discrete problems, though mostly under the aspect of sub-sampling, which only yielded impossibility results.

Applications of smoothed analysis to other resources: Besides running time, the next natural resource to analyse is approximability. The aforementioned general framework for Euclidean problems allows an analysis not only of the running times but also of the approximation performance of the algorithms. Often, these algorithms turn out to be asymptotically optimal, depending on the parameters of the perturbation distribution. Furthermore, we looked at Christofides' algorithm and could solve an open question by Frieze and Yukich.

Development of a general complexity theory for smoothed analysis: We developed a general complexity theory for smoothed analysis. We formally defined perturbation models in a complexity-theoretic sense and then defined corresponding classes capturing the easy and the hard problems. While the reductions we defined have all the properties they should have, they are defined on certain extended problems, which is to some extent unsatisfactory. We are currently investigating different ways to define reductions.

In summary, on the one hand we obtained some very good results that led to publications at leading conferences. On the other hand, there were some problems that probably do not admit a nice solution, or for which we could not find a satisfying solution so far. A problem of the first category is the fact that sub-sampled instances are resilient to smoothed analysis.
A problem of the second category is the difficulty of defining reductions in a satisfactory way. We are currently investigating ways to overcome these problems.
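To make the sub-sampling model concrete, the following minimal Python sketch perturbs a TSP instance by dropping each city independently with probability p. The function name and the representation of cities as an arbitrary list are our own illustrative choices, not part of the project's formal definitions.

```python
import random

def subsample_instance(cities, p, rng=random.Random(0)):
    """Sub-sampling perturbation: keep each city independently with
    probability 1 - p (a store that needs new goods) and drop it with
    probability p. `cities` may be any list of city descriptors."""
    return [city for city in cities if rng.random() >= p]

# Example: perturb a 5-city instance with drop probability 0.3.
perturbed = subsample_instance(["A", "B", "C", "D", "E"], p=0.3)
```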
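A random shortest path metric can be sketched as follows. The report does not fix the noise distribution, so the additive uniform noise below is an assumption made purely for illustration; the shortest path closure is computed with the standard Floyd-Warshall recurrence.

```python
import random

def random_shortest_path_metric(d, eps, rng=random.Random(0)):
    """Perturb each off-diagonal entry of the symmetric distance matrix d
    by additive noise (here: uniform on [0, eps], an illustrative choice),
    then take the shortest path closure so that the result satisfies the
    triangle inequality, i.e. is a metric again."""
    n = len(d)
    m = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            m[i][j] = m[j][i] = d[i][j] + rng.uniform(0.0, eps)
    # Shortest path closure: d(i, j) <- min over k of d(i, k) + d(k, j).
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if m[i][k] + m[k][j] < m[i][j]:
                    m[i][j] = m[i][k] + m[k][j]
    return m
```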
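For the metric polytope model, membership is exactly the system of triangle inequalities. Since the report leaves the precise neighbourhood perturbation open, the rejection sampling below is only one naive way to realise it (and may retry for a long time near the boundary of the polytope); it is an assumption on our part, not the project's definition.

```python
import random

def is_metric(d, tol=1e-12):
    """Test the inequalities defining the metric polytope: zero diagonal,
    symmetry, non-negativity, and all triangle inequalities."""
    n = len(d)
    for i in range(n):
        if abs(d[i][i]) > tol:
            return False
        for j in range(n):
            if d[i][j] < -tol or abs(d[i][j] - d[j][i]) > tol:
                return False
            for k in range(n):
                if d[i][j] > d[i][k] + d[k][j] + tol:
                    return False
    return True

def perturb_in_metric_polytope(d, eps, rng=random.Random(0)):
    """Sample uniform noise on the off-diagonal entries and retry until the
    perturbed matrix is again a metric, i.e. until the sample lies in the
    intersection of the eps-neighbourhood of d with the metric polytope."""
    n = len(d)
    while True:
        m = [row[:] for row in d]
        for i in range(n):
            for j in range(i + 1, n):
                m[i][j] = m[j][i] = d[i][j] + rng.uniform(-eps, eps)
        if is_metric(m):
            return m
```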
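Finally, the traditional perturbation model for Euclidean problems referred to above is noise on the input points; in the smoothed analysis literature this is typically independent Gaussian noise, and the standard deviation sigma is the parameter on which the smoothed bounds depend. A minimal sketch, with Gaussian noise assumed for illustration:

```python
import random

def gaussian_perturbation(points, sigma, rng=random.Random(0)):
    """Classical smoothed-analysis model for Euclidean problems: add
    independent Gaussian noise with standard deviation sigma to every
    coordinate of every input point (e.g. TSP cities or k-means data)."""
    return [tuple(x + rng.gauss(0.0, sigma) for x in point)
            for point in points]

# Example: smooth a planar 3-point instance with sigma = 0.1.
smoothed = gaussian_perturbation([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)], 0.1)
```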