Seller: Books From California, Simi Valley, CA, U.S.A.
Paperback. Condition: Very Good.
Condition: New.
Seller: Ria Christie Collections, Uxbridge, United Kingdom
US$ 60.65
Quantity: Over 20 available
Condition: New.
Seller: Books Puddle, New York, NY, U.S.A.
Condition: New.
Seller: Revaluation Books, Exeter, United Kingdom
US$ 87.56
Quantity: 2 available
Paperback. Condition: Brand New. 1st edition. 59 pages. 9.25x6.25x0.50 inches. In Stock.
Language: English
Published by Springer Nature Singapore, Mar 2025
ISBN 10: 981978879X ISBN 13: 9789819788798
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. New item. This book introduces the gamma-divergence, a measure of distance between probability distributions that was proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been extensively explored to provide robust estimation when the power index γ is positive. The gamma-divergence can be defined even when the power index γ is negative, as long as the condition of integrability is satisfied. Thus, the authors consider the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence with a negative γ. In particular, the authors call the gamma-divergence with γ equal to -1 the geometric-mean (GM) divergence. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 128 pp. English.
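For orientation, the gamma-divergence described in this listing has a standard closed form in the literature (Fujisawa and Eguchi, 2008). For discrete distributions p and q and a power index γ other than 0 and -1, it can be written as follows (notation ours, not part of the seller's text):

\[
D_\gamma(p, q) = \frac{1}{\gamma(1+\gamma)} \log \sum_i p_i^{\,1+\gamma}
\;-\; \frac{1}{\gamma} \log \sum_i p_i\, q_i^{\,\gamma}
\;+\; \frac{1}{1+\gamma} \log \sum_i q_i^{\,1+\gamma},
\]

which vanishes exactly when p = q. For negative γ the sums must stay finite (the integrability condition mentioned above), and the geometric-mean (GM) divergence at γ = -1 arises as a limit of this expression.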
Language: English
Published by Springer Nature Singapore, 2025
ISBN 10: 981978879X ISBN 13: 9789819788798
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand, new item - printed after ordering - This book introduces the gamma-divergence, a measure of distance between probability distributions that was proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been extensively explored to provide robust estimation when the power index γ is positive. The gamma-divergence can be defined even when the power index γ is negative, as long as the condition of integrability is satisfied. Thus, the authors consider the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence with a negative γ. In particular, the authors call the gamma-divergence with γ equal to -1 the geometric-mean (GM) divergence. The book begins by providing an overview of the gamma-divergence and its properties. It then goes on to discuss applications of the gamma-divergence in various areas, including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are discussed as typical examples. Furthermore, regression models that explicitly or implicitly assume these distributions for the dependent variable in generalized linear models are discussed in order to apply the minimum gamma-divergence method. In ensemble learning, AdaBoost is derived from the exponential loss function in a weighted-majority-vote manner. It is pointed out that the exponential loss function is deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation has to resort to approximation methods such as mean-field approximation because of the intractable computation of the partition function. However, by considering the GM divergence and the exponential loss, it is shown that the calculation of the partition function is not necessary, and estimation can be carried out without variational inference.
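As a concrete illustration of the quantity these listings describe, here is a minimal Python sketch of the discrete gamma-divergence in the standard Fujisawa-Eguchi form (our own illustrative code, not taken from the book):

import numpy as np

def gamma_divergence(p, q, gamma):
    # Discrete gamma-divergence in the standard Fujisawa-Eguchi form.
    # Requires gamma not in {0, -1}; for negative gamma, q must be strictly
    # positive wherever p is (the integrability condition in the blurb).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    term_p = np.log(np.sum(p ** (1.0 + gamma))) / (gamma * (1.0 + gamma))
    cross = -np.log(np.sum(p * q ** gamma)) / gamma
    term_q = np.log(np.sum(q ** (1.0 + gamma))) / (1.0 + gamma)
    return term_p + cross + term_q

p = [0.2, 0.5, 0.3]
q = [0.25, 0.45, 0.30]
print(gamma_divergence(p, q, 0.5))    # small positive value
print(gamma_divergence(p, p, 0.5))    # 0.0 up to rounding: vanishes when p == q
print(gamma_divergence(p, q, -0.9))   # a negative power index also works here

Letting gamma tend to -1 in this expression gives the logarithm of the arithmetic mean of the ratios p_i/q_i minus the logarithm of their geometric mean (our derivation), which is consistent with the mean-of-ratios connection and the GM divergence mentioned in the description; the book's exact normalization may differ.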
Language: English
Published by Springer Japan, 2019
ISBN 10: 4431555692 ISBN 13: 9784431555698
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand, new item - printed after ordering - This book presents a fresh, new approach in that it provides a comprehensive recent review of challenging problems caused by imbalanced data in prediction and classification, and also in that it introduces several of the latest statistical methods for dealing with these problems. The book discusses the imbalance of data from two points of view. The first is quantitative imbalance, meaning that the sample size in one population highly outnumbers that in another population. It includes presence-only data as an extreme case, where the presence of a species is confirmed whereas the information on its absence is uncertain, which is especially common in ecology when predicting habitat distribution. The second is qualitative imbalance, meaning that the data distribution of one population can be well specified whereas that of the other shows a highly heterogeneous property. A typical case is the existence of outliers, commonly observed in gene expression data; another is the heterogeneous characteristics often observed in the case group of case-control studies. The extension of the logistic regression model, maxent, and AdaBoost to imbalanced data is discussed, providing a new framework for improving prediction, classification, and the performance of variable selection. Weight functions introduced in these methods play an important role in alleviating the imbalance of data. The book also furnishes a new perspective on these problems and shows applications of the recently developed statistical methods to real data sets.
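To make the role of a weight function concrete, here is a minimal, generic sketch of class-weighted logistic regression fit by gradient descent. This is our illustration of the weighting idea only; the extensions of logistic regression, maxent, and AdaBoost developed in the book are more elaborate:

import numpy as np

def weighted_logistic_fit(X, y, w_pos, w_neg, lr=0.5, n_iter=2000):
    # Logistic regression with per-class weights, fit by plain gradient
    # descent on the weighted log loss. Up-weighting the rare class via
    # w_pos counteracts quantitative imbalance.
    n, d = X.shape
    beta = np.zeros(d)
    w = np.where(y == 1, w_pos, w_neg)        # per-sample weight by class
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted P(y = 1 | x)
        grad = X.T @ (w * (p - y)) / n        # gradient of the weighted log loss
        beta -= lr * grad
    return beta

# Toy imbalanced data: roughly ten percent positives.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=400) > 1.4).astype(float)
beta = weighted_logistic_fit(X, y, w_pos=1.0 / max(y.mean(), 1e-9), w_neg=1.0)
print(beta)   # larger coefficient on the first, informative feature

Setting w_pos to the reciprocal of the positive-class share is one common heuristic; the weight functions in the book are chosen more carefully, per method.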
Seller: Mispah books, Redhill, Surrey, United Kingdom
US$ 129.79
Quantity: 1 available
Paperback. Condition: New. New book.
Seller: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New.
Seller: Ria Christie Collections, Uxbridge, United Kingdom
US$ 161.25
Quantity: Over 20 available
Condition: New.
Condition: New.
Seller: Buchpark, Trebbin, Germany
Condition: Excellent. | Language: English | Product type: Books | This book explores minimum divergence methods of statistical machine learning for estimation, regression, prediction, and so forth, in which we engage information geometry to elucidate the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, in which the estimator is given by minimizing the sum of squares between a response vector and a vector in the linear subspace spanned by the explanatory vectors. This is extended to Fisher's maximum likelihood estimator (MLE) for an exponential model, in which the estimator is obtained by minimizing an empirical analogue of the Kullback-Leibler (KL) divergence between a data distribution and a parametric distribution of the exponential model. Thus, we envisage a geometric interpretation of such minimization procedures, in which a right triangle satisfies a Pythagorean identity in the sense of the KL divergence. This understanding reveals a dualistic interplay between statistical estimation and the model, which requires dual geodesic paths, called m-geodesic and e-geodesic paths, in the framework of information geometry. We extend this dualistic structure of the MLE and the exponential model to that of the minimum divergence estimator and the maximum entropy model, which is applied to robust statistics, maximum entropy, density estimation, principal component analysis, independent component analysis, regression analysis, manifold learning, boosting algorithms, clustering, dynamic treatment regimes, and so forth. We consider a variety of information divergence measures, typically including the KL divergence, to express the departure of one probability distribution from another. An information divergence is decomposed into the cross-entropy and the (diagonal) entropy, in which the entropy is associated with a generative model as a family of maximum entropy distributions, while the cross-entropy is associated with a statistical estimation method via minimization of its empirical analogue based on given data. Thus any statistical divergence intrinsically pairs a generative model with an estimation method. Typically, the KL divergence leads to the exponential model and maximum likelihood estimation. It is shown that any information divergence induces a Riemannian metric and a pair of linear connections in the framework of information geometry. We focus on a class of information divergences generated by an increasing and convex function U, called the U-divergence. Any generator function U generates a U-entropy and a U-divergence, between which there is a dualistic structure linking the U-divergence method and the maximum U-entropy model. We observe that a specific choice of U leads to a robust statistical procedure via the minimum U-divergence method. If U is selected as an exponential function, then the corresponding U-entropy and U-divergence reduce to the Boltzmann-Shannon entropy and the KL divergence, and the minimum U-divergence estimator is equivalent to the MLE. For robust supervised learning to predict a class label, we observe that the U-boosting algorithm performs well under contamination by mislabeled examples if U is appropriately selected.
We present such maximum U-entropy and minimum U-divergence methods, in particular selecting a power function as U to provide flexible performance in statistical machine learning.
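The decomposition mentioned in this description is easy to state explicitly in the KL case. The following identities are standard (added here for orientation; the notation is ours, not the seller's):

\[
D_{\mathrm{KL}}(p, q) = \sum_i p_i \log \frac{p_i}{q_i}
= \Big(-\sum_i p_i \log q_i\Big) - \Big(-\sum_i p_i \log p_i\Big)
= C(p, q) - H(p),
\]

where C(p, q) is the cross-entropy and H(p) = C(p, p) is the (diagonal) entropy. Since H(p) does not depend on q, minimizing the empirical cross-entropy over a parametric family q_\theta is equivalent to minimizing the KL divergence from the data distribution; for an exponential family this reproduces the MLE, which is exactly the pairing of the exponential model with maximum likelihood estimation that the description attributes to the KL divergence.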
Seller: Books Puddle, New York, NY, U.S.A.
Condition: New. 1st ed. 2022 edition.
Language: English
Published by Springer Nature Singapore, 2025
ISBN 10: 981978879X ISBN 13: 9789819788798
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand - it takes 3-4 days longer - New item - This book introduces the gamma-divergence, a measure of distance between probability distributions that was proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been extensively explored to provide robust estimation when the power index γ is positive. The gamma-divergence can be defined even when the power index γ is negative, as long as the condition of integrability is satisfied. Thus, the authors consider the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence with a negative γ. In particular, the authors call the gamma-divergence with γ equal to -1 the geometric-mean (GM) divergence. The book begins by providing an overview of the gamma-divergence and its properties. It then goes on to discuss applications of the gamma-divergence in various areas, including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are discussed as typical examples. Furthermore, regression models that explicitly or implicitly assume these distributions for the dependent variable in generalized linear models are discussed in order to apply the minimum gamma-divergence method. In ensemble learning, AdaBoost is derived from the exponential loss function in a weighted-majority-vote manner. It is pointed out that the exponential loss function is deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation has to resort to approximation methods such as mean-field approximation because of the intractable computation of the partition function. However, by considering the GM divergence and the exponential loss, it is shown that the calculation of the partition function is not necessary, and estimation can be carried out without variational inference. 118 pp. English.
Seller: Majestic Books, Hounslow, United Kingdom
Condition: New. Print on Demand.
Language: English
Published by Springer Japan, Jul 2019
ISBN 10: 4431555692 ISBN 13: 9784431555698
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand - it takes 3-4 days longer - New item - This book presents a fresh, new approach in that it provides a comprehensive recent review of challenging problems caused by imbalanced data in prediction and classification, and also in that it introduces several of the latest statistical methods for dealing with these problems. The book discusses the imbalance of data from two points of view. The first is quantitative imbalance, meaning that the sample size in one population highly outnumbers that in another population. It includes presence-only data as an extreme case, where the presence of a species is confirmed whereas the information on its absence is uncertain, which is especially common in ecology when predicting habitat distribution. The second is qualitative imbalance, meaning that the data distribution of one population can be well specified whereas that of the other shows a highly heterogeneous property. A typical case is the existence of outliers, commonly observed in gene expression data; another is the heterogeneous characteristics often observed in the case group of case-control studies. The extension of the logistic regression model, maxent, and AdaBoost to imbalanced data is discussed, providing a new framework for improving prediction, classification, and the performance of variable selection. Weight functions introduced in these methods play an important role in alleviating the imbalance of data. The book also furnishes a new perspective on these problems and shows applications of the recently developed statistical methods to real data sets. 68 pp. English.
Seller: Biblios, Frankfurt am Main, Hesse, Germany
Condition: New. PRINT ON DEMAND.
Language: English
Published by Springer Verlag GmbH, 2025
ISBN 10: 981978879X ISBN 13: 9789819788798
Seller: moluna, Greven, Germany
Condition: New. This item is a print-on-demand title and will be printed for you after you place your order.
Seller: moluna, Greven, Germany
Condition: New. This item is a print-on-demand title and will be printed for you after you place your order. Osamu Komori, The Institute of Statistical Mathematics; Shinto Eguchi, The Institute of Statistical Mathematics. This book presents a fresh, new approach in that it provides a comprehensive recent review of challenging problems caused by imbalanced data ...
Language: English
Published by Springer Verlag, Singapore, 2025
ISBN 10: 981978879X ISBN 13: 9789819788798
Seller: CitiRetail, Stevenage, United Kingdom
US$ 71.22
Quantity: 1 available
Paperback. Condition: New. This book introduces the gamma-divergence, a measure of distance between probability distributions that was proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been extensively explored to provide robust estimation when the power index γ is positive. The gamma-divergence can be defined even when the power index γ is negative, as long as the condition of integrability is satisfied. Thus, the authors consider the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence with a negative γ. In particular, the authors call the gamma-divergence with γ equal to -1 the geometric-mean (GM) divergence. The book begins by providing an overview of the gamma-divergence and its properties. It then goes on to discuss applications of the gamma-divergence in various areas, including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are discussed as typical examples. Furthermore, regression models that explicitly or implicitly assume these distributions for the dependent variable in generalized linear models are discussed in order to apply the minimum gamma-divergence method. In ensemble learning, AdaBoost is derived from the exponential loss function in a weighted-majority-vote manner. It is pointed out that the exponential loss function is deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation has to resort to approximation methods such as mean-field approximation because of the intractable computation of the partition function. However, by considering the GM divergence and the exponential loss, it is shown that the calculation of the partition function is not necessary, and estimation can be carried out without variational inference. This item is printed on demand. Shipping may be from our UK warehouse or from our Australian or US warehouses, depending on stock availability.
Language: English
Published by Springer Japan, Jul 2019
ISBN 10: 4431555692 ISBN 13: 9784431555698
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. This item is printed on demand - print-on-demand title. New item - This book presents a fresh, new approach in that it provides a comprehensive recent review of challenging problems caused by imbalanced data in prediction and classification, and also in that it introduces several of the latest statistical methods for dealing with these problems. The book discusses the imbalance of data from two points of view. The first is quantitative imbalance, meaning that the sample size in one population highly outnumbers that in another population. It includes presence-only data as an extreme case, where the presence of a species is confirmed whereas the information on its absence is uncertain, which is especially common in ecology when predicting habitat distribution. The second is qualitative imbalance, meaning that the data distribution of one population can be well specified whereas that of the other shows a highly heterogeneous property. A typical case is the existence of outliers, commonly observed in gene expression data; another is the heterogeneous characteristics often observed in the case group of case-control studies. The extension of the logistic regression model, maxent, and AdaBoost to imbalanced data is discussed, providing a new framework for improving prediction, classification, and the performance of variable selection. Weight functions introduced in these methods play an important role in alleviating the imbalance of data. The book also furnishes a new perspective on these problems and shows applications of the recently developed statistical methods to real data sets. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 68 pp. English.
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. Minimum Gamma-Divergence for Regression and Classification Problems | Shinto Eguchi | Paperback | viii | English | 2025 | Springer | EAN 9789819788798 | Responsible person for the EU: Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg, juergen[dot]hartmann[at]springer[dot]com | Seller: preigu | Print on Demand.
Seller: moluna, Greven, Germany
US$ 155.04
Quantity: Over 20 available
Condition: New. This item is a print-on-demand title and will be printed for you after you place your order. Provides various applications, including boosting and kernel methods in machine learning, from a geometric-invariance viewpoint. Facilitates a deeper understanding of the complementary relation between statistical models and estimation in the context ...
Seller: Majestic Books, Hounslow, United Kingdom
US$ 227.57
Quantity: 4 available
Condition: New. Print on Demand.
Seller: preigu, Osnabrück, Germany
Book. Condition: New. Minimum Divergence Methods in Statistical Machine Learning | From an Information Geometric Viewpoint | Shinto Eguchi (et al.) | Book | x | English | 2022 | Springer | EAN 9784431569206 | Responsible person for the EU: Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg, juergen[dot]hartmann[at]springer[dot]com | Seller: preigu | Print on Demand.
Seller: Biblios, Frankfurt am Main, Hesse, Germany
Condition: New. PRINT ON DEMAND.