Regularized Radial Basis Function Networks: Theory and Applications / Edition 1

ISBN-10:
0471353493
ISBN-13:
9780471353492
Pub. Date:
04/16/2001
Publisher:
Wiley
Hardcover

$184.95

Overview

* Simon Haykin is a well-known author of books on neural networks.
* An authoritative book dealing with cutting-edge technology.
* This book has no competition.

Product Details

ISBN-13: 9780471353492
Publisher: Wiley
Publication date: 04/16/2001
Series: Adaptive and Cognitive Dynamic Systems: Signal Processing, Learning, Communications and Control, #20
Pages: 208
Product dimensions: 6.34(w) x 9.39(h) x 0.75(d)

About the Author

Paul V. Yee and Simon Haykin are the authors of Regularized Radial Basis Function Networks: Theory and Applications, published by Wiley.

Table of Contents

Preface.

Notations.

Introduction.

Basic Tools.

Probability Estimation and Pattern Classification.

Nonlinear Time-Series Prediction.

Nonlinear State Estimation.

Dynamic Reconstruction of Chaotic Processes.

Discussion.

Appendix of Notes to the Text.

References.

Index.

Preface

This book is intended to serve as a bridge between the two areas of nonparametric estimation and artificial neural networks (ANNs). The growing importance and richness of ideas in both areas cannot be ignored and, to that end, this book examines their interplay under the overarching principle of regularization. The specific vehicle for this study is the regularized strict interpolation radial basis function network (RBFN) estimate. Aside from its practical importance as one of the better known kernel-based methods for estimation and function approximation, the regularized strict interpolation RBFN is chosen for its straightforward structure, which is simple enough to admit theoretical analysis yet sufficiently powerful for nontrivial applications.

The organization of this book is therefore guided by these two facets. Chapter 1 provides a theoretical understanding of the strict interpolation RBFN, specifically with regard to how its mean-square (MS) consistency can be related to that of the Nadaraya-Watson regression estimate (NWRE), a fundamental kernel-based nonparametric estimate. Regularization in the form of cross-validation and asymptotically optimal regularization parameter sequences plays a pivotal role in linking these two estimates. Once this background is set, only minor modifications and extensions are required to explain the application of the regularized strict interpolation RBFN to a variety of challenging tasks. As illustrative examples of such tasks, the book discusses probability estimation (and, hence, pattern classification), nonlinear time-series prediction and state estimation, and the dynamic reconstruction of chaotic processes. Chapter 2 indicates briefly how the MS consistency results of Chapter 1 can be used to prove the Bayes risk consistency of the approximate Bayes decision rules formed from regularized strict interpolation RBFN posterior probability estimates. A similar extension is made in Chapter 3 to establish the MS consistency of regularized strict interpolation RBFNs for nonlinear autoregressive time-series prediction, along with corresponding experimental results on speech prediction. Nonlinear time series continue to be the theme for Chapters 4 and 5, where regularized strict interpolation RBFNs are applied to the problems of nonlinear state estimation and dynamic reconstruction of chaotic processes, respectively. Overall, it is hoped that this selection of topics can give the reader an appreciation for the capabilities of the regularized strict interpolation RBFN in a broad range of real-world applications.
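As a point of comparison, the Nadaraya-Watson regression estimate mentioned above is a kernel-weighted average of the training targets, and leave-one-out cross-validation is one standard way to select its smoothing parameter. The sketch below uses a Gaussian kernel and a simple grid search; these are generic illustrations of the two ideas, not the particular estimators or parameter sequences analyzed in the book:

```python
import numpy as np

def nadaraya_watson(X_train, y_train, X_new, h=0.5):
    # fhat(x) = sum_i K_h(x, x_i) y_i / sum_i K_h(x, x_i)
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * h ** 2))
    return (K @ y_train) / K.sum(axis=1)

def loo_bandwidth(X, y, candidates):
    # Pick the bandwidth minimizing leave-one-out squared prediction error.
    best_h, best_err = None, np.inf
    for h in candidates:
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-d2 / (2.0 * h ** 2))
        np.fill_diagonal(K, 0.0)  # exclude each point's own kernel weight
        pred = (K @ y) / K.sum(axis=1)
        err = np.mean((pred - y) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h
```

Because the estimate is a convex combination of the targets, a constant target function is reproduced exactly, which makes for a quick sanity check.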

The choice to present theory before practice undoubtedly reflects my bias towards a unified logical development of the material. Although I believe that it is easiest to understand the book's material when presented in this order, I also recognize and encourage those readers with more practical inclinations to browse the book, beginning with the Introduction and skipping between the desired application areas in Chapters 2 to 5. In this mode, the theoretically oriented Chapter 1 can act as a reference when the reader desires a more thorough understanding. To assist the reader's navigation in either mode, this book introduces specialized or otherwise less well-known concepts on a "just-in-time" basis with brief text digressions and footnotes.

The target audience for this book comprises researchers, practitioners, and graduate students in engineering and the sciences who are interested in nonparametric estimation and ANNs. Given the objectives of this book and its perspective, it is unavoidable that certain mathematical tools, possibly outside the common realm of graduate engineering disciplines, are necessary for the book's development. Thus, the reader is assumed to have some exposure to elementary measure, integration, and probability theory. Beyond these topics, care has been taken to introduce only the minimum set of such auxiliary mathematical concepts required to properly explain the results in this book. For these concepts, fuller background can be found in the appendices. As is often stated in graduate engineering texts, a degree of "mathematical maturity," that is, the ability to follow mathematical arguments, can allow the reader to grasp the main ideas being discussed despite a lack of formal background.

As with any book, certain topics of interest are omitted. For example, optimizing the regularized strict interpolation RBFN architecture by using only a subset of the available training input data to define the RBFN centers is not discussed in any detail, nor is adaptation of other RBFN parameters such as center location and basis function width during overall network cost function minimization. On the one hand, such omissions acknowledge the availability of existing works on these aspects of RBFN design and, on the other, it could be argued that including these additional details, while important in their own right, does not further the stated purposes of this book. In the end, the bridge between nonparametric estimation and ANNs that this book builds represents but a small step in the path towards a more comprehensive understanding of the regularized strict interpolation RBFN as a principled design choice for RBF neural networks.

Acknowledgements

This work grew out of my doctoral studies completed in 1998 at the Communications Research Laboratory (CRL) at McMaster University, Ontario, Canada, under the able guidance of Dr. Simon Haykin. The speech data for the experiments in Chapter 3 were kindly provided by Dean McArthur. Chapters 4 and 5 arose from research performed jointly with Dr. Eric Derbez and Dr. Sadasivan Puthusserypady, respectively. To them and the many others at the CRL (and elsewhere) with whom I've had the pleasure of collaboration, I owe thanks. The same extends to my colleagues Kevin Hung and Jeremy Benson at PMC-Sierra, Inc., for their help in proofreading the page proofs, and Lisa Van Horn and others at John Wiley and Sons, Inc., for their patient assistance during the production process. Not least of all, I thank my wife Susan for her love and understanding during the writing of this book.

Paul V. Yee
Vancouver, Canada
January 2001
