Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e., over-simplified) model to predict, estimate, or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
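To give a flavor of how entropy and Gibbs measures interact in this setting, here is a generic PAC-Bayesian inequality of the kind studied in this literature; it is an illustrative sketch, not a statement taken from the book, and the notation ($R$, $r_n$, $\ell$, $\pi$, $\rho$, $\beta$, $\delta$) is introduced only for this example. Let $R(\theta)=\mathbb{E}[\ell(\theta,Z)]$ be the risk of a parameter $\theta$ for a loss $\ell$ with values in $[0,1]$, and $r_n(\theta)=\frac{1}{n}\sum_{i=1}^n \ell(\theta,Z_i)$ its empirical counterpart on an i.i.d. sample. For any prior $\pi$ and any confidence level $\delta\in(0,1)$, with probability at least $1-\delta$, simultaneously for all posterior distributions $\rho$,

\[
\mathbb{E}_{\theta\sim\rho}\, R(\theta) \;\le\; \mathbb{E}_{\theta\sim\rho}\, r_n(\theta) \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi)+\ln\!\bigl(2\sqrt{n}/\delta\bigr)}{2n}}\,.
\]

The linearized form of the right-hand side, $\beta\,\mathbb{E}_{\theta\sim\rho}\, r_n(\theta)+\mathrm{KL}(\rho\,\|\,\pi)$, is minimized over all $\rho$ by the Gibbs measure

\[
\rho_\beta(d\theta) \;\propto\; \exp\bigl(-\beta\, r_n(\theta)\bigr)\,\pi(d\theta),
\]

which is one way entropy (through the Kullback-Leibler term) and Gibbs measures enter the analysis. Such bounds hold whether or not the model containing $\theta$ is well specified, which makes precise why a "wrong" model can still come with a controlled prediction error.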
"synopsis" may belong to another edition of this title.
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, to use as is often done in practice a notoriously "wrong'' (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools, that will stimulate further studies and results.
"About this title" may belong to another edition of this title.