A Coordinate Gradient Descent Method for Structured Nonsmooth Optimization: Theory and Applications

 

Nonsmooth optimization problems are generally considered more difficult than smooth problems. Yet there is an important class of nonsmooth problems that lies in between. In this book, we consider the problem of minimizing the sum of a smooth function and a (block-separable) convex function, with or without linear constraints. This problem includes as special cases bound-constrained optimization, smooth optimization with L_1-regularization, and linearly constrained smooth optimization, such as the large-scale quadratic programming problems arising in the training of support vector machines. We propose a block coordinate gradient descent method for solving this class of structured nonsmooth problems. The method is simple, highly parallelizable, and suited to large-scale applications in signal/image denoising, regression, and data mining/classification. We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. Our numerical experiments suggest that the method is effective in practice. This book will be helpful to readers interested in solving large-scale optimization problems.
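To make the setting concrete: the problem class above covers L_1-regularized least squares, min_x 0.5*||Ax - b||^2 + lam*||x||_1, whose nonsmooth term is coordinate-separable. The sketch below is a minimal illustration of a coordinate gradient descent for this instance, not the book's exact algorithm; the function names, cyclic coordinate order, and step rule (a gradient step on the smooth term followed by the soft-threshold proximal map of the L_1 term) are assumptions made for this example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coord_gradient_descent(A, b, lam, sweeps=100):
    """Illustrative coordinate gradient descent for
    min_x 0.5*||A x - b||^2 + lam*||x||_1.
    Each coordinate takes a gradient step on the smooth term,
    then applies the separable L1 proximal (soft-threshold) map.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                    # running residual A x - b
    col_norms = (A * A).sum(axis=0)  # per-coordinate Lipschitz constants
    for _ in range(sweeps):
        for j in range(n):
            if col_norms[j] == 0.0:
                continue
            g = A[:, j] @ r          # partial gradient of the smooth term
            step = x[j] - g / col_norms[j]
            x_new = soft_threshold(step, lam / col_norms[j])
            r += A[:, j] * (x_new - x[j])   # keep the residual current
            x[j] = x_new
    return x
```

When A is orthonormal, a single sweep already returns the soft-thresholded solution soft_threshold(A^T b, lam); in general, the separability of the L_1 term is what makes each coordinate subproblem solvable in closed form.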


About the Author:

Sangwoon Yun: PhD in Mathematics from the University of Washington. Research interests: convex and nonsmooth optimization, variational analysis. Research Fellow at the National University of Singapore.


List Price: US$ 62.00
Buy New: US$ 55.70
Shipping: US$ 2.23 (from Germany to U.S.A.)

Top Search Results from the AbeBooks Marketplace

1.

Yun, Sangwoon
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Quantity Available: 1
Book Description: Book Condition: New. Publisher: VDM Verlag Dr. Müller | Theory and Applications | Format: Paperback | Language: English | 160 g | 220x150x6 mm | 112 pp. Bookseller Inventory # K9783836478601

Buy New: US$ 55.70
Shipping: US$ 2.23 (from Germany to U.S.A.)

2.

Sangwoon Yun
Published by VDM Verlag Dr. Müller E.K. Nov 2012 (2012)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 2
Seller: Agrios-Buch (Bergisch Gladbach, Germany)

Book Description: VDM Verlag Dr. Müller E.K., Nov 2012. Paperback. Book Condition: New. 220x150x7 mm. New stock. 112 pp. English. Bookseller Inventory # 9783836478601

Buy New: US$ 56.49
Shipping: US$ 19.17 (from Germany to U.S.A.)

3.

Sangwoon Yun
Published by VDM Verlag Dr. Müller E.K. Nov 2012 (2012)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 2
Seller: Rheinberg-Buch (Bergisch Gladbach, Germany)

Book Description: VDM Verlag Dr. Müller E.K., Nov 2012. Paperback. Book Condition: New. 220x150x7 mm. New stock. 112 pp. English. Bookseller Inventory # 9783836478601

Buy New: US$ 56.49
Shipping: US$ 19.17 (from Germany to U.S.A.)

4.

Sangwoon Yun
Published by VDM Verlag, Germany (2012)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 1
Seller: The Book Depository EURO (London, United Kingdom)

Book Description: VDM Verlag, Germany, 2012. Paperback. Book Condition: New. 220 x 150 mm. Language: English. Brand New Book. Bookseller Inventory # KNV9783836478601

Buy New: US$ 84.71
Shipping: US$ 3.82 (from United Kingdom to U.S.A.)

5.

Sangwoon Yun
Published by VDM Verlag Dr. Müller (2010)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 1
Print on Demand
Seller: Ergodebooks (Richmond, TX, U.S.A.)

Book Description VDM Verlag Dr. Müller, 2010. Paperback. Book Condition: New. This item is printed on demand. Bookseller Inventory # DADAX3836478609

Buy New: US$ 85.44
Shipping: US$ 3.99 (within U.S.A.)

6.

Sangwoon Yun
Published by VDM Verlag Dr. Müller E.K. Nov 2012 (2012)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 1
Print on Demand
Seller: AHA-BUCH GmbH (Einbeck, Germany)

Book Description: VDM Verlag Dr. Müller E.K., Nov 2012. Paperback. Book Condition: New. 220x150x7 mm. This item is printed on demand. New stock. 112 pp. English. Bookseller Inventory # 9783836478601

Buy New: US$ 56.49
Shipping: US$ 33.02 (from Germany to U.S.A.)

7.

Yun, Sangwoon
Published by VDM Verlag Dr. Müller (2010)
ISBN 10: 3836478609 ISBN 13: 9783836478601
New Paperback Quantity Available: 1
Seller: Irish Booksellers (Rumford, ME, U.S.A.)

Book Description: VDM Verlag Dr. Müller, 2010. Paperback. Book Condition: New. Bookseller Inventory # 3836478609

Buy New: US$ 91.33
Shipping: FREE (within U.S.A.)