Information-Theoretic Aspects of Neural Networks

Information theory, as applied to neural networks, encompasses the parametric entities and conceptual bases pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies only sparsely cover the entropy and cybernetic aspects of neural information.
Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:

  • Shannon information and information dynamics
  • neural complexity as an information processing system
  • memory and information storage in the interconnected neural web
  • extremum (maximum and minimum) information entropy
  • neural network training
  • non-conventional, statistical distance-measures for neural network optimizations
  • symmetric and asymmetric characteristics of information-theoretic error-metrics
  • algorithmic complexity based representation of neural information-theoretic parameters
  • genetic algorithms versus neural information
  • dynamics of neurocybernetics viewed in the information-theoretic plane
  • nonlinear, information-theoretic transfer function of the neural cellular units
  • statistical mechanics, neural networks, and information theory
  • semiotic framework of neural information processing and neural information flow
  • fuzzy information and neural networks
  • neural dynamics conceived through fuzzy information parameters
  • neural information flow dynamics
  • informatics of neural stochastic resonance
Information-Theoretic Aspects of Neural Networks is a resource for engineers, scientists, and computer scientists working with artificial neural networks, as well as for biologists applying the concepts of communication theory and protocols to the functioning of the brain. The book explores new avenues in the field and creates a common platform for analyzing both the neural complex and artificial neural networks.
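Two of the topics listed above — Shannon information and the asymmetry of information-theoretic error metrics — can be illustrated with a minimal sketch. The function names below are illustrative, not drawn from the book:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), a statistical distance
    measure usable as a neural-network training cost. Note that it is
    asymmetric: D(p || q) != D(q || p) in general."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A uniform source over four symbols carries 2 bits of entropy --
# the maximum-entropy case for a four-symbol alphabet.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))        # 2.0 bits
print(kl_divergence(skewed, uniform))  # positive
print(kl_divergence(uniform, skewed))  # differs from the line above
```

The asymmetry of the KL divergence is exactly the kind of property the book's discussion of symmetric versus asymmetric error metrics addresses; symmetrized variants exist when a true distance is required.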
  • 1101595083
    Information-Theoretic Aspects of Neural Networks

    Information theoretics vis-a-vis neural networks generally embodies parametric entities and conceptual bases pertinent to memory considerations and information storage, information-theoretic based cost-functions, and neurocybernetics and self-organization. Existing studies only sparsely cover the entropy and/or cybernetic aspects of neural information.
    Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:

  • Shannon information and information dynamics
  • neural complexity as an information processing system
  • memory and information storage in the interconnected neural web
  • extremum (maximum and minimum) information entropy
  • neural network training
  • non-conventional, statistical distance-measures for neural network optimizations
  • symmetric and asymmetric characteristics of information-theoretic error-metrics
  • algorithmic complexity based representation of neural information-theoretic parameters
  • genetic algorithms versus neural information
  • dynamics of neurocybernetics viewed in the information-theoretic plane
  • nonlinear, information-theoretic transfer function of the neural cellular units
  • statistical mechanics, neural networks, and information theory
  • semiotic framework of neural information processing and neural information flow
  • fuzzy information and neural networks
  • neural dynamics conceived through fuzzy information parameters
  • neural information flow dynamics
  • informatics of neural stochastic resonance
    Information-Theoretic Aspects of Neural Networks acts as an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks as well as biologists applying the concepts of communication theory and protocols to the functioning of the brain. The information in this book explores new avenues in the field and creates a common platform for analyzing the neural complex as well as artificial neural networks.
  • 172.99 In Stock
    Information-Theoretic Aspects of Neural Networks

    Information-Theoretic Aspects of Neural Networks

    by P. S. Neelakanta
    Information-Theoretic Aspects of Neural Networks

    Information-Theoretic Aspects of Neural Networks

    by P. S. Neelakanta

    eBook

    $172.99  $230.00 Save 25% Current price is $172.99, Original price is $230. You Save 25%.

    Available on Compatible NOOK devices, the free NOOK App and in My Digital Library.
    WANT A NOOK?  Explore Now

    Related collections and offers


  • Product Details

    ISBN-13: 9781000141252
    Publisher: CRC Press
    Publication date: 09/23/2020
    Sold by: Barnes & Noble
    Format: eBook
    Pages: 416
    File size: 9 MB

    About the Author

    P. S. Neelakanta

    Table of Contents

  • Introduction
  • Neural Complex: A Nonlinear CI System?
  • Neural Complex vis-a-vis Statistical Mechanics
  • Entropy, Thermodynamics and Information Theory
  • Neural Communication and Control in Information-Theoretic Plane
  • Neural Complexity: An Algorithmic Representation
  • Neural Information Dynamics
  • Semiotic Framework of Neural Information Processing
  • Genetic Algorithmic Based Depiction of Neural Information
  • Epilogue
  • Appendix