
First-Order Methods in Large-Scale Semidefinite Optimization

Language: English
Book: Paperback
Book: First-Order Methods in Large-Scale Semidefinite Optimization, Michael Bürgisser
Libristo code: 12828288
Publisher: Cuvillier, June 2012
Price: 982 Kč (98 loyalty points)
In stock at the supplier. Ships within 14-18 days.

30 days to return the goods


You might also be interested in


Annual Report of the Comptroller of the Currency, United States: Office of the Comptroller of the Currency / Paperback, 921 Kč
Gustav Stickley, David Cathers / Hardcover, 1 754 Kč
Der Waldbruder, Jakob Michael Reinhold Lenz / Hardcover, 875 Kč
SEAMOS RAROS, ESTEMOS JUNTOS, BARKER / Paperback, 334 Kč
Gendered Transactions, Indrani Sen / Hardcover, 3 807 Kč

Semidefinite Optimization has attracted the attention of many researchers over the last twenty years. Nowadays it has a huge variety of applications in fields as different as Control, Structural Design, and Statistics, as well as in the relaxation of hard combinatorial problems. In this thesis, we focus on the practical tractability of large-scale semidefinite optimization problems. From a theoretical point of view, these problems can be solved approximately by polynomial-time Interior-Point methods. The complexity estimate of Interior-Point methods grows logarithmically in the inverse of the solution accuracy, but with order 3.5 in both the matrix size and the number of constraints. The latter property prohibits the resolution of large-scale problems in practice.

In this thesis, we present new approaches based on advanced First-Order methods, such as Smoothing Techniques and Mirror-Prox algorithms, for solving structured large-scale semidefinite optimization problems up to a moderate accuracy. These methods require a very specific problem format, and generic semidefinite optimization problems do not comply with these requirements. In a preliminary step, we therefore recast slightly structured semidefinite optimization problems in an alternative form to which these methods are applicable, namely as matrix saddle-point problems. The final methods have a complexity result that depends linearly on both the number of constraints and the inverse of the target accuracy.

Smoothing Techniques constitute a two-stage procedure: we first derive a smooth approximation of the objective function and then apply an optimal First-Order method to the adapted problem. We present a refined version of this optimal First-Order method in this thesis. The worst-case complexity result for the modified scheme is of the same order as for the original method; however, numerical results show that this alternative scheme needs far fewer iterations than its original counterpart to find an approximate solution in practice. Using this refined version of the optimal First-Order method in Smoothing Techniques, we are able to solve randomly generated matrix saddle-point problems involving a hundred matrices of size 12,800 x 12,800 up to an absolute accuracy of 0.0012 in about four hours.

Smoothing Techniques and Mirror-Prox methods require the computation of one or two matrix exponentials at every iteration when applied to the matrix saddle-point problems obtained from the above transformation step. Using standard techniques, the cost of exponentiating a symmetric matrix grows cubically in the size of the matrix, which clearly limits the class of problems that can be solved by Smoothing Techniques and Mirror-Prox methods in practice. We present a randomized Mirror-Prox method in which the exact matrix exponential is replaced by a stochastic approximation. This randomized method outperforms all its competitors with respect to the theoretical complexity estimate on a significant class of large-scale matrix saddle-point problems. Furthermore, we show numerical results where the randomized method needs only about 58% of the CPU time of its deterministic counterpart to approximately solve randomly generated matrix saddle-point problems with a hundred matrices of size 800 x 800.

As a side result of this thesis, we show that the Hedge algorithm, a method that is heavily used in Theoretical Computer Science, can be interpreted as a Dual Averaging scheme.
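To make this interpretation concrete, the sketch below (an illustration added here, not code from the thesis) writes the Hedge update as a Dual Averaging step: the distribution over experts is the softmax of the negated, scaled accumulated losses. The step size eta, the number of experts, and the random losses are illustrative assumptions.

```python
# A minimal sketch of Hedge viewed as Dual Averaging (illustration only, not thesis code):
# the played distribution is the gradient of a smoothed (entropic) max, evaluated at the
# sum of all past losses. Loss values and parameters below are assumed for demonstration.
import numpy as np

def hedge_weights(cumulative_losses, eta):
    """Dual Averaging view of Hedge: softmax of -eta * accumulated losses."""
    z = -eta * cumulative_losses
    z -= z.max()                      # shift for numerical stability
    w = np.exp(z)
    return w / w.sum()

rng = np.random.default_rng(0)
n_experts, n_rounds, eta = 10, 200, 0.1
cumulative_losses = np.zeros(n_experts)
for t in range(n_rounds):
    weights = hedge_weights(cumulative_losses, eta)   # play the current distribution
    losses = rng.uniform(0.0, 1.0, size=n_experts)    # losses in [0, 1] are revealed
    cumulative_losses += losses                       # Dual Averaging: aggregate all past losses
```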
The embedding of the Hedge algorithm in the framework of Dual Averaging schemes allows us to derive three new versions of this algorithm. The efficiency guarantees of these modified Hedge algorithms are at least as good as, and sometimes even better than, the complexity estimates of the original method. We present numerical experiments in which the refined methods significantly outperform their vanilla counterpart.
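Returning to the per-iteration bottleneck described above, the following sketch (again an added illustration, not the thesis implementation) shows the matrix analogue of the Hedge update, the matrix-entropy prox-mapping onto the spectahedron: a normalized matrix exponential of the accumulated gradient. The use of scipy.linalg.expm, the problem size, and the random test matrix are assumptions; the dense matrix exponential is the cubic-cost operation that the randomized Mirror-Prox method replaces with a stochastic approximation.

```python
# A minimal sketch of the matrix-entropy prox-mapping onto the spectahedron
# {X >= 0, trace(X) = 1} (illustration only): X proportional to exp(-eta * G).
# scipy.linalg.expm and the random symmetric "gradient" G are assumptions.
import numpy as np
from scipy.linalg import expm

def spectahedron_prox(accumulated_grad, eta):
    """Return X proportional to exp(-eta * G), normalized to unit trace."""
    G = 0.5 * (accumulated_grad + accumulated_grad.T)   # symmetrize for safety
    # Shift the exponent by its largest eigenvalue to avoid overflow; the
    # constant factor exp(-shift) cancels in the trace normalization.
    shift = np.max(np.linalg.eigvalsh(-eta * G))
    E = expm(-eta * G - shift * np.eye(G.shape[0]))     # the O(n^3) bottleneck step
    return E / np.trace(E)

n = 200
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
G = A + A.T                                             # random symmetric test matrix
X = spectahedron_prox(G, eta=0.5)
assert abs(np.trace(X) - 1.0) < 1e-8                    # X lies on the spectahedron
```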

Book information

Full title: First-Order Methods in Large-Scale Semidefinite Optimization
Language: English
Binding: Paperback
Publication date: 2012
Number of pages: 204
EAN: 9783954041329
ISBN: 3954041324
Libristo code: 12828288
Publisher: Cuvillier
Weight: 249 g
Dimensions: 148 x 210 x 11 mm
Give this book as a gift today
It's easy
1. Add the book to your cart and choose delivery as a gift.
2. We will send you a gift voucher right away.
3. The book will arrive at the recipient's address.
