Published by Chapman and Hall/CRC, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: HPB-Red, Dallas, TX, U.S.A.
Hardcover. Condition: Good. Used textbooks may not include companion materials such as access codes; may have some wear or writing/highlighting.
Published by Chapman and Hall/CRC, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Lucky's Textbooks, Dallas, TX, U.S.A.
US$ 127.25
Quantity: Over 20 available
Condition: New.
Published by Taylor & Francis Group, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Majestic Books, Hounslow, United Kingdom
US$ 134.57
Quantity: 3 available
Condition: New. pp. 388.
Published by Chapman and Hall/CRC, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Ria Christie Collections, Uxbridge, United Kingdom
US$ 134.93
Quantity: Over 20 available
Condition: New.
Published by Taylor & Francis Group, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Books Puddle, New York, NY, U.S.A.
Condition: New. pp. 388.
Published by Taylor & Francis Ltd, London, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Grand Eagle Retail, Fairfield, OH, U.S.A.
Hardcover. Condition: New. Tree-based Methods for Statistical Learning in R provides a thorough introduction to both individual decision tree algorithms (Part I) and ensembles thereof (Part II). Part I brings several different tree algorithms into focus, both conventional and contemporary. Building a strong foundation in how individual decision trees work helps readers understand tree-based ensembles, which lie at the cutting edge of modern statistical and machine learning methodology. The book follows up most ideas and mathematical concepts with code-based examples in the R statistical language, with an emphasis on using as few external packages as possible. For example, readers are shown how to write their own random forest and gradient tree boosting functions using simple for loops and basic tree-fitting software (like rpart and party/partykit). The core chapters also end with a detailed section on relevant software in both R and other open-source alternatives (e.g., Python, Spark, and Julia), with example usage on real data sets. While the book mostly uses R, it is meant to be equally accessible and useful to non-R programmers. Readers will gain a solid foundation in (and appreciation for) tree-based methods and how they can be used to solve the practical problems and challenges data scientists often face in applied work.
Features:
- Thorough coverage, from the ground up, of tree-based methods (e.g., CART, conditional inference trees, bagging, boosting, and random forests).
- A companion website containing additional supplementary material and the code to reproduce every example and figure in the book.
- A companion R package, called treemisc, which contains several data sets and functions used throughout the book (e.g., there's an implementation of gradient tree boosting with LAD loss that shows how to perform the line search step by updating the terminal node estimates of a fitted rpart tree).
- Interesting examples that are of practical use; for example, how to construct partial dependence plots from a fitted model in Spark MLlib (using only Spark operations), or post-processing tree ensembles via the LASSO to reduce the number of trees while maintaining, or even improving, performance.
Shipping may be from multiple locations in the US or from the UK, depending on stock availability.
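The description mentions writing a random forest with nothing more than a for loop and a basic tree fitter. The book does this in R with rpart; as a rough, hypothetical analogue of that idea in Python (using scikit-learn's DecisionTreeRegressor as the base tree fitter, not the book's own code), a bagged ensemble might be sketched like this:

```python
# Minimal bagged-trees sketch: bootstrap the rows, fit one tree per
# replicate, and average the predictions. This is an illustration of the
# general technique the book describes, not the book's implementation.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def simple_forest(X, y, n_trees=50, seed=0):
    """Fit a bagged ensemble of regression trees with a plain for loop."""
    rng = np.random.default_rng(seed)
    n = len(y)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)  # bootstrap sample (with replacement)
        # max_features="sqrt" randomizes split candidates, random-forest style
        tree = DecisionTreeRegressor(max_features="sqrt", random_state=0)
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def forest_predict(trees, X):
    """Average the individual tree predictions."""
    return np.mean([t.predict(X) for t in trees], axis=0)
```

Averaging many deep trees grown on bootstrap samples reduces variance relative to any single tree, which is the core point the book builds on before introducing boosting.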
Published by Taylor & Francis Ltd, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Kennys Bookshop and Art Galleries Ltd., Galway, GY, Ireland
First Edition
US$ 149.49
Quantity: 1 available
Condition: New. 2022. 1st Edition. Hardcover.
Published by Taylor & Francis Group, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Biblios, Frankfurt am main, HESSE, Germany
US$ 159.88
Quantity: 3 available
Condition: New. pp. 388.
Published by Taylor & Francis Ltd Jun 2022, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: AHA-BUCH GmbH, Einbeck, Germany
US$ 144.56
Quantity: 1 available
Book. Condition: New. New stock. This book provides a thorough introduction to both individual decision tree algorithms (Part I) and ensembles thereof (Part II). Part I of the book brings several different tree algorithms into focus, both conventional and contemporary.
Published by Taylor & Francis Ltd, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: Kennys Bookstore, Olney, MD, U.S.A.
Condition: New. 2022. 1st Edition. Hardcover. Books ship from the US and Ireland.
Published by Taylor & Francis Ltd, London, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: CitiRetail, Stevenage, United Kingdom
US$ 159.62
Quantity: 1 available
Hardcover. Condition: New. Shipping may be from our UK warehouse or from our Australian or US warehouses, depending on stock availability.
Published by Taylor & Francis Ltd, London, 2022
ISBN 10: 0367532468 ISBN 13: 9780367532468
Language: English
Seller: AussieBookSeller, Truganina, VIC, Australia
US$ 186.22
Quantity: 1 available
Hardcover. Condition: New. Shipping may be from our Sydney, NSW warehouse or from our UK or US warehouse, depending on stock availability.
Seller: PBShop.store US, Wood Dale, IL, U.S.A.
Hardcover. Condition: New. New book. Shipped from the UK. This book is printed on demand. Established seller since 2000.
Seller: PBShop.store UK, Fairford, GLOS, United Kingdom
US$ 152.60
Quantity: Over 20 available
Hardcover. Condition: New. New book. Delivered from our UK warehouse in 4 to 14 business days. This book is printed on demand. Established seller since 2000.