The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods frequently used to solve large-scale problems. First-order methods exploit information on the values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large- or even huge-scale optimization problems, there has been a revived interest in simple methods with low per-iteration cost and low memory requirements.
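To make the "gradients/subgradients but not Hessians" point concrete, here is a minimal sketch (our illustration, not code from the book) of one projected subgradient step of the kind analyzed in Chapter 8; each iteration needs only a subgradient evaluation and a projection, with no second-order information:

```python
import numpy as np

def projected_subgradient_step(x, subgrad, project, t):
    """One step of the projected subgradient method:
    x+ = P_C(x - t * g), where g is any subgradient at x.
    Only first-order information is used; no Hessian is formed."""
    return project(x - t * subgrad(x))

# Toy example: minimize f(x) = ||x - b||_1 over the unit Euclidean ball.
b = np.array([2.0, -1.5, 0.5])
subgrad = lambda x: np.sign(x - b)                   # a subgradient of ||x - b||_1
project = lambda y: y / max(1.0, np.linalg.norm(y))  # projection onto the unit ball
x = np.zeros(3)
for k in range(1, 1001):
    # Classic diminishing step size t_k = 1/sqrt(k)
    x = projected_subgradient_step(x, subgrad, project, t=1.0 / np.sqrt(k))
print(x)  # an approximate minimizer on the boundary of the ball
```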
The author has gathered, reorganized, and synthesized in a unified manner many results that are currently scattered throughout the literature, many of which typically cannot be found in optimization books.
First-Order Methods in Optimization offers a comprehensive study of first-order methods together with their theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable decomposition and functional decomposition methods.
Audience: This book is intended primarily for researchers and graduate students in mathematics, computer science, and electrical and other engineering departments. Readers with a background in advanced calculus and linear algebra, as well as prior knowledge of the fundamentals of optimization (some convex analysis, optimality conditions, and duality), will be best prepared for the material.
Contents: Chapter 1: Vector Spaces; Chapter 2: Extended Real-Valued Functions; Chapter 3: Subgradients; Chapter 4: Conjugate Functions; Chapter 5: Smoothness and Strong Convexity; Chapter 6: The Proximal Operator; Chapter 7: Spectral Functions; Chapter 8: Primal and Dual Projected Subgradient Methods; Chapter 9: Mirror Descent; Chapter 10: The Proximal Gradient Method; Chapter 11: The Block Proximal Gradient Method; Chapter 12: Dual-Based Proximal Gradient Methods; Chapter 13: The Generalized Conditional Gradient Method; Chapter 14: Alternating Minimization; Chapter 15: ADMM; Appendix A: Strong Duality and Optimality Conditions; Appendix B: Tables; Appendix C: Symbols and Notation; Appendix D: Bibliographic Notes.
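For a taste of the material in Chapters 6 and 10, the sketch below (again ours, not the book's) applies the proximal gradient method to l1-regularized least squares; the proximal operator of the l1 norm is the well-known soft-thresholding operator:

```python
import numpy as np

def soft_threshold(y, tau):
    """Prox of tau * ||.||_1: componentwise shrinkage toward zero."""
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    """Minimize (1/2)||Ax - b||^2 + lam * ||x||_1 with fixed step 1/L,
    where L = largest eigenvalue of A^T A (Lipschitz constant of the gradient)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                   # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the nonsmooth part
    return x

# Tiny usage example with synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
print(proximal_gradient(A, b, lam=0.1))  # sparse estimate close to x_true
```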
"synopsis" may belong to another edition of this title.
Amir Beck is a Professor at the School of Mathematical Sciences, Tel Aviv University. His research interests are in continuous optimization, including theory, algorithmic analysis, and applications. He has published numerous papers and has given invited lectures at international conferences. He serves on the editorial boards of several journals. His research has been supported by various funding agencies, including the Israel Science Foundation, the German-Israeli Foundation, the United States-Israel Binational Science Foundation, the Israeli Science and Energy ministries, and the European Community.
"About this title" may belong to another edition of this title.
Shipping: FREE within U.S.A.; US$ 13.17 from United Kingdom to U.S.A.
Seller: BooksRun, Philadelphia, PA, U.S.A.
Paperback. Condition: Good. Ships within 24 hours. Satisfaction 100% guaranteed. APO/FPO addresses supported. Seller Inventory # 1611974984-11-1
Quantity: 1 available
Seller: SecondSale, Montgomery, IL, U.S.A.
Condition: Very Good. Item in very good condition! Textbooks may not include supplemental items, e.g., CDs, access codes, etc. Seller Inventory # 00085834703
Quantity: 1 available
Seller: SecondSale, Montgomery, IL, U.S.A.
Condition: Good. Item in good condition. Textbooks may not include supplemental items, e.g., CDs, access codes, etc. Seller Inventory # 00085834712
Quantity: 1 available
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 484 pages. 10.12 x 7.01 x 1.10 inches. In stock. Seller Inventory # 1611974984
Quantity: 1 available