About This Book

This book is about training methods - in particular, fast second-order training methods - for multi-layer perceptrons (MLPs). MLPs (also known as feed-forward neural networks) are the most widely used class of neural network. Over the past decade MLPs have achieved increasing popularity among scientists, engineers and other professionals as tools for tackling a wide variety of information processing tasks. In common with all neural networks, MLPs are trained (rather than programmed) to carry out the chosen information processing function. Unfortunately, the 'traditional' method for training MLPs - the well-known backpropagation method - is notoriously slow and unreliable when applied to many practical tasks. The development of fast and reliable training algorithms for MLPs is one of the most important areas of research within the entire field of neural computing. The main purpose of this book is to bring to a wider audience a range of alternative methods for training MLPs, methods which have proved orders of magnitude faster than backpropagation when applied to many training tasks. The book also addresses the well-known 'local minima' problem, and explains ways in which fast training methods can be combined with strategies for avoiding (or escaping from) local minima. All the methods described in this book have a strong theoretical foundation, drawing on such diverse mathematical fields as classical optimisation theory, homotopic theory and stochastic approximation theory.
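As a rough illustration of what "second-order" means here, the sketch below (not taken from the book; the network size, toy data, learning rate eta and damping factor mu are arbitrary illustrative choices) contrasts a plain gradient-descent update of the kind backpropagation performs with a damped Gauss-Newton (Levenberg-Marquardt) update, one representative second-order method, on a tiny one-hidden-layer MLP.

```python
# Minimal sketch: first-order vs. second-order update on a toy MLP.
# Assumptions: 2 inputs, 5 tanh hidden units, 1 linear output, squared-error loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))                # 20 training inputs, 2 features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]         # toy regression targets

n_hidden = 5
w = rng.normal(scale=0.5, size=(4 * n_hidden,))   # flat weights: W1 (2x5), b1 (5), W2 (5)

def unpack(w):
    W1 = w[:2 * n_hidden].reshape(2, n_hidden)
    b1 = w[2 * n_hidden:3 * n_hidden]
    W2 = w[3 * n_hidden:]
    return W1, b1, W2

def forward(w, X):
    W1, b1, W2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                # hidden layer activations
    return h @ W2                           # single linear output

def residuals(w):
    return forward(w, X) - y                # e_i = f(x_i; w) - y_i

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian of the residuals, J[i, j] = d e_i / d w_j.
    r0 = residuals(w)
    J = np.zeros((r0.size, w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (residuals(wp) - r0) / eps
    return J

e = residuals(w)
J = jacobian(w)

# First-order ("backpropagation-style") gradient-descent step: w <- w - eta * J^T e
eta = 0.01
w_gd = w - eta * (J.T @ e)

# Second-order Gauss-Newton / Levenberg-Marquardt step:
#   w <- w - (J^T J + mu I)^{-1} J^T e
mu = 1e-3
w_gn = w - np.linalg.solve(J.T @ J + mu * np.eye(w.size), J.T @ e)

sse = lambda v: 0.5 * np.sum(residuals(v) ** 2)
print(f"initial SSE:                 {sse(w):.4f}")
print(f"after one gradient step:     {sse(w_gd):.4f}")
print(f"after one Gauss-Newton step: {sse(w_gn):.4f}")
```

The second-order step rescales the gradient J^T e by the damped curvature matrix J^T J + mu I, which is what typically buys the large speed-ups over plain backpropagation on small and medium-sized networks.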
"synopsis" may belong to another edition of this title.
Shipping: US$ 3.75 within U.S.A.
Seller: HPB-Red, Dallas, TX, U.S.A.
paperback. Condition: Good. Connecting readers with great books since 1972! Used textbooks may not include companion materials such as access codes, etc. May have some wear or writing/highlighting. We ship orders daily and Customer Service is our top priority! Seller Inventory # S_386356711
Quantity: 1 available
Seller: Anybook.com, Lincoln, United Kingdom
Condition: Good. This is an ex-library book and may have the usual library/used-book markings inside. This book has soft covers. In good all round condition. Please note the image in this listing is a stock photo and may not match the covers of the actual item. 350 grams, ISBN: 9783540761006. Seller Inventory # 9078869
Quantity: 1 available
Seller: Hard To Find Editions, Bristol, AVON, United Kingdom
Soft cover. Condition: Good. 1st Edition. This book has been read. However, all pages are intact, with no highlighting or writing. The spine remains undamaged with tight binding and the book is generally in good condition. Fast dispatch within and from the UK. 100% money back guarantee. Seller Inventory # ABE-1677501140607
Quantity: 1 available
Seller: California Books, Miami, FL, U.S.A.
Condition: New. Seller Inventory # I-9783540761006
Quantity: Over 20 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand - new item, printed after ordering. Seller Inventory # 9783540761006
Quantity: 1 available
Seller: Buchpark, Trebbin, Germany
Condition: Good. Good - signs of use and storage. | Pages: 160 | Language: English | Product type: Books. Seller Inventory # 18741/3
Quantity: 1 available
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 1997 edition. 145 pages. 9.50x6.25x0.50 inches. In Stock. Seller Inventory # x-3540761004
Quantity: 2 available
Seller: moluna, Greven, Germany
Paperback / softcover. Condition: New. This is a print-on-demand item and will be printed for you after your order is placed. Seller Inventory # 4900366
Quantity: Over 20 available
Seller: Chiron Media, Wallingford, United Kingdom
PF. Condition: New. Seller Inventory # 6666-IUK-9783540761006
Quantity: 10 available
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand - it takes 3-4 days longer - new item. 160 pp. English. Seller Inventory # 9783540761006
Quantity: 2 available