Nonsmooth optimization problems are generally considered more difficult than smooth problems. Yet there is an important class of nonsmooth problems that lies in between. In this book, we consider the problem of minimizing the sum of a smooth function and a (block-separable) convex function, with or without linear constraints. This problem includes as special cases bound-constrained optimization, smooth optimization with L_1-regularization, and linearly constrained smooth optimization, such as the large-scale quadratic programming problems arising in the training of support vector machines. We propose a block coordinate gradient descent method for solving this class of structured nonsmooth problems. The method is simple, highly parallelizable, and suited to large-scale applications in signal/image denoising, regression, and data mining/classification. We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. Our numerical experience suggests that the method is effective in practice. This book will be helpful to readers interested in solving large-scale optimization problems.
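To illustrate the kind of method the synopsis describes, here is a minimal sketch of coordinate gradient descent applied to the L_1-regularized least-squares problem (a special case named above). This is an illustrative sketch only, not the book's exact algorithm: the cyclic update order, blocks of size one, and the per-coordinate step sizes are my assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.|: shrink z toward zero by t
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coord_gradient_descent(A, b, lam, n_iters=200):
    """Cyclic coordinate gradient descent for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    Each step minimizes a quadratic model of the smooth part
    plus the separable l1 term in a single coordinate."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)   # per-coordinate Lipschitz constants ||A_j||^2
    r = A @ x - b                   # residual, kept up to date incrementally
    for _ in range(n_iters):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r         # partial gradient of the smooth part
            x_new = soft_threshold(x[j] - g / col_sq[j], lam / col_sq[j])
            if x_new != x[j]:
                r += A[:, j] * (x_new - x[j])
                x[j] = x_new
    return x
```

Because the nonsmooth term is separable across coordinates, each coordinate subproblem has the closed-form soft-thresholding solution, which is what makes block-structured methods of this kind cheap per iteration.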


Sangwoon Yun holds a PhD in Mathematics from the University of Washington. His research interests are convex and nonsmooth optimization and variational analysis. He is a Research Fellow at the National University of Singapore.


ISBN 10: 3836478609
ISBN 13: 9783836478601

New
Quantity Available: 1


**Book Description** Book Condition: New. Publisher: VDM Verlag Dr. Müller | Theory and Applications | Format: Paperback | Language: English | 160 g | 220x150x6 mm | 112 pp. Bookseller Inventory # K9783836478601


Published by VDM Verlag Dr. Müller E.K., Nov 2012


New
Paperback
Quantity Available: 2


**Book Description** VDM Verlag Dr. Müller E.K., Nov 2012. Paperback. Book Condition: New. 220x150x7 mm. New stock. 112 pp. English. Bookseller Inventory # 9783836478601



Published by VDM Verlag, Germany (2012)


New
Paperback
Quantity Available: 1


**Book Description** VDM Verlag, Germany, 2012. Paperback. Book Condition: New. 220 x 150 mm. Language: English. Brand new book. Bookseller Inventory # KNV9783836478601


Published by VDM Verlag Dr. Müller E.K., Nov 2012


New
Paperback
Quantity Available: 1


**Book Description** VDM Verlag Dr. Müller E.K., Nov 2012. Paperback. Book Condition: New. 220x150x7 mm. This item is printed on demand. New stock. 112 pp. English. Bookseller Inventory # 9783836478601


Published by VDM Verlag Dr. Müller (2010)


New
Paperback
Quantity Available: 1


**Book Description** VDM Verlag Dr. Müller, 2010. Paperback. Book Condition: New. This item is printed on demand. Bookseller Inventory # DADAX3836478609


Published by VDM Verlag Dr. Müller (2010)


New
Paperback
Quantity Available: 1


**Book Description** VDM Verlag Dr. Müller, 2010. Paperback. Book Condition: New. Bookseller Inventory # 3836478609
