Large-Scale Convex Optimization: Algorithms & Analyses via Monotone Operators

Hardback

Main Details

Title: Large-Scale Convex Optimization: Algorithms & Analyses via Monotone Operators
Authors and Contributors:
By (author) Ernest K. Ryu
By (author) Wotao Yin

Physical Properties
Format: Hardback
Pages: 400
Dimensions (mm): Height 254, Width 178

Category/Genre: Geometry
ISBN/Barcode: 9781009160858
Classifications: Dewey 516.08
Audience: Undergraduate
Illustrations: Worked examples or exercises

Publishing Details

Publisher Cambridge University Press
Imprint Cambridge University Press
Publication Date 1 December 2022
Publication Country United Kingdom

Description

Starting from where a first course in convex optimization leaves off, this text presents a unified analysis of first-order optimization methods - including parallel-distributed algorithms - through the abstraction of monotone operators. With the increased computational power and availability of big data over the past decade, applied disciplines have demanded that larger and larger optimization problems be solved. This text covers the first-order convex optimization methods that are uniquely effective at solving these large-scale optimization problems. Readers will have the opportunity to construct and analyze many well-known classical and modern algorithms using monotone operators, and walk away with a solid understanding of these diverse optimization algorithms. Graduate students and researchers in mathematical optimization, operations research, electrical engineering, statistics, and computer science will appreciate this concise introduction to the theory of convex optimization algorithms.
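To illustrate the abstraction the description refers to, here is a minimal sketch (not taken from the book) of the monotone-operator viewpoint: for convex f, the gradient A = ∇f is a monotone operator, and gradient descent is the fixed-point iteration x ← (I − αA)x. The quadratic f below and all variable names are illustrative choices, not the authors' examples.

```python
import numpy as np

# Illustrative convex objective: f(x) = 0.5 * ||x - b||^2.
# Its gradient A(x) = x - b is a monotone operator, and the
# minimizer is the zero of A, i.e. the fixed point of I - alpha*A.

def grad_f(x, b):
    return x - b  # gradient of 0.5*||x - b||^2; monotone since f is convex

b = np.array([1.0, -2.0, 3.0])
x = np.zeros(3)
alpha = 0.5
for _ in range(100):
    x = x - alpha * grad_f(x, b)  # forward step on the monotone operator

# The iterates converge to the fixed point where grad_f(x, b) = 0, i.e. x = b.
```

The same fixed-point template, with the forward step replaced by resolvents and operator splittings, is what unifies the book's catalog of algorithms.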

Author Biography

Ernest K. Ryu is Assistant Professor of Mathematical Sciences at Seoul National University. He previously served as Assistant Adjunct Professor in the Department of Mathematics at the University of California, Los Angeles from 2016 to 2019, before joining Seoul National University in 2020. He received a BS with distinction in physics and electrical engineering from the California Institute of Technology in 2010, followed by an MS in statistics and a PhD in computational mathematics (receiving the Gene Golub Best Thesis Award) from Stanford University in 2016. His current research focuses on mathematical optimization and machine learning. Wotao Yin is Director of the Decision Intelligence Lab at the DAMO Academy, Alibaba Group (US), and a former Professor of Mathematics at the University of California, Los Angeles. He received his PhD in operations research from Columbia University in 2006. His numerous accolades include an NSF CAREER Award in 2008, an Alfred P. Sloan Research Fellowship in 2009, a Morningside Gold Medal in 2016, and a Damo Award and Egon Balas Prize in 2021. He invented fast algorithms for sparse optimization, image processing, and large-scale distributed optimization problems, and is among the top 1 percent of cited researchers as ranked by Clarivate Analytics. His research interests include computational optimization and its applications in signal processing, machine learning, and other data science problems.

Reviews

'Ryu and Yin's Large-Scale Convex Optimization does a great job of covering a field with a long history and much current interest. The book describes dozens of algorithms, from classic ones developed in the 1970s to some very recent ones, in unified and consistent notation, all organized around the basic concept and unifying theme of a monotone operator. I strongly recommend it to any mathematician, researcher, or engineer who uses, or has an interest in, convex optimization.' Stephen Boyd, Stanford University

'This is an absolute must-read research monograph for signal processing, communications, and networking engineers, as well as researchers who wish to choose, design, and analyze splitting-based convex optimization methods best suited for their perplexed and challenging engineering tasks.' Georgios B. Giannakis, University of Minnesota

'This is a very timely book. Monotone operator theory is fundamental to the development of modern algorithms for large-scale convex optimization. Ryu and Yin provide optimization students and researchers with a self-contained introduction to the elegant mathematical theory of monotone operators, and take their readers on a tour of cutting-edge applications, demonstrating the power and range of these essential tools.' Lieven Vandenberghe, University of California, Los Angeles

'First-order methods are the mainstream optimization algorithms in the era of big data. This monograph provides a unique perspective on various first-order convex optimization algorithms via the monotone operator theory, with which the seemingly different and unrelated algorithms are actually deeply connected, and many proofs can be significantly simplified. The book is a beautiful example of the power of abstraction. Those who are interested in convex optimization theory should not miss this book.' Zhouchen Lin, Peking University

'The book covers topics from the basics of optimization to modern techniques such as operator splitting, parallel and distributed optimization, and stochastic algorithms. It is the natural next step after Boyd and Vandenberghe's Convex Optimization for students studying optimization and machine learning. The authors are experts in this kind of optimization. Some of my graduate students took the course based on this book when Wotao Yin was at UCLA. They liked the course and found the materials very useful in their research.' Stanley Osher, University of California, Los Angeles