Search Results for computation-in-neurons-and-neural-systems

Author: Frank H. Eeckman

Publisher: Springer Science & Business Media

ISBN: 9781461527145

Category: Technology & Engineering

Page: 319

Computation in Neurons and Neural Systems contains the collected papers of the 1993 Conference on Computation and Neural Systems, which was held July 31--August 7, 1993, in Washington, DC. These papers represent a cross-section of state-of-the-art research in computational neuroscience, including analysis and modeling work as well as the results of new biological experimentation.
2012-12-06 By Frank H. Eeckman

Author: Richard Durbin

Publisher: Addison-Wesley

ISBN: UOM:39015017965396

Category: Computers

Page: 417

This book provides an overview of recent developments in biological neuroscience and artificial neural networks. Based on a selection of work presented to a meeting on The Neuron as a Computational Unit held in Cambridge in 1988, it brings together the work of leading scientists in neurophysiology and computational neuroscience.
1989 By Richard Durbin

Author: Hermann Cuntz

Publisher: Springer Science & Business Media

ISBN: 9781461480945

Category: Medical

Page: 510

Neuronal dendritic trees are complex structures that endow the cell with powerful computing capabilities and allow for high neural interconnectivity. Studying the function of dendritic structures has a long tradition in theoretical neuroscience, starting with the pioneering work by Wilfrid Rall in the 1950s. Recent advances in experimental techniques allow us to study dendrites with a new perspective and in greater detail. The goal of this volume is to provide a résumé of the state-of-the-art in experimental, computational, and mathematical investigations into the functions of dendrites in a variety of neural systems. The book first looks at morphological properties of dendrites and summarizes the approaches to measure dendrite morphology quantitatively and to actually generate synthetic dendrite morphologies in computer models. This morphological characterization ranges from the study of fractal principles to describe dendrite topologies, to the consequences of optimization principles for dendrite shape. Individual approaches are collected to study the aspects of dendrite shape that relate directly to underlying circuit constraints and computation. The second main theme focuses on how dendrites contribute to the computations that neurons perform. What role do dendritic morphology and the distributions of synapses and membrane properties over the dendritic tree have in determining the output of a neuron in response to its input? A wide range of studies is brought together, with topics ranging from general to system-specific phenomena—some having a strong experimental component, and others being fully theoretical. The studies come from many different neural systems and animal species ranging from invertebrates to mammals. With this broad focus, an overview is given of the diversity of mechanisms that dendrites can employ to shape neural computations.
2013-11-23 By Hermann Cuntz

Author: Michael A. Arbib

Publisher: MIT Press

ISBN: 9780262034968

Category: Science

Page: 808

A comprehensive, integrated, and accessible textbook presenting core neuroscientific topics from a computational perspective, tracing a path from cells and circuits to behavior and cognition. This textbook presents a wide range of subjects in neuroscience from a computational perspective. It offers a comprehensive, integrated introduction to core topics, using computational tools to trace a path from neurons and circuits to behavior and cognition. Moreover, the chapters show how computational neuroscience—methods for modeling the causal interactions underlying neural systems—complements empirical research in advancing the understanding of brain and behavior. The chapters—all by leaders in the field, and carefully integrated by the editors—cover such subjects as action and motor control; neuroplasticity, neuromodulation, and reinforcement learning; vision; and language—the core of human cognition. The book can be used for advanced undergraduate or graduate level courses. It presents all necessary background in neuroscience beyond basic facts about neurons and synapses and general ideas about the structure and function of the human brain. Students should be familiar with differential equations and probability theory, and be able to pick up the basics of programming in MATLAB and/or Python. Slides, exercises, and other ancillary materials are freely available online, and many of the models described in the chapters are documented in the brain operation database, BODB (which is also described in a book chapter). Contributors Michael A. Arbib, Joseph Ayers, James Bednar, Andrej Bicanski, James J. Bonaiuto, Nicolas Brunel, Jean-Marie Cabelguen, Carmen Canavier, Angelo Cangelosi, Richard P. Cooper, Carlos R. Cortes, Nathaniel Daw, Paul Dean, Peter Ford Dominey, Pierre Enel, Jean-Marc Fellous, Stefano Fusi, Wulfram Gerstner, Frank Grasso, Jacqueline A. Griego, Ziad M. Hafed, Michael E. Hasselmo, Auke Ijspeert, Stephanie Jones, Daniel Kersten, Jeremie Knuesel, Owen Lewis, William W. Lytton, Tomaso Poggio, John Porrill, Tony J. Prescott, John Rinzel, Edmund Rolls, Jonathan Rubin, Nicolas Schweighofer, Mohamed A. Sherif, Malle A. Tagamets, Paul F. M. J. Verschure, Nathan Vierling-Claasen, Xiao-Jing Wang, Christopher Williams, Ransom Winder, Alan L. Yuille
2016-11-11 By Michael A. Arbib

Author:

Publisher:

ISBN: UCAL:B5036864

Category: Neural computers

Page:

2005 By

Author: Frank H. Eeckman

Publisher: Springer Science & Business Media

ISBN: 9781461532545

Category: Computers

Page: 539

Computational neuroscience is best defined by its focus on understanding the nervous system as a computational device rather than by a particular experimental technique. Accordingly, while the majority of the papers in this book describe analysis and modeling efforts, other papers describe the results of new biological experiments explicitly placed in the context of computational issues. The distribution of subjects in Computation and Neural Systems reflects the current state of the field. In addition to the scientific results presented here, numerous papers also describe the ongoing technical developments that are critical for the continued growth of computational neuroscience. Computation and Neural Systems includes papers presented at the First Annual Computation and Neural Systems meeting held in San Francisco, CA, July 26--29, 1992.
2012-12-06 By Frank H. Eeckman

Author: James M. Bower

Publisher: Springer Science & Business Media

ISBN: 9781461522355

Category: Computers

Page: 444

This volume includes papers presented at the Third Annual Computation and Neural Systems meeting (CNS*94) held in Monterey, California, July 21--26, 1994. This collection includes 71 of the more than 100 papers presented at this year's meeting. Acceptance for meeting presentation was based on the peer review of preliminary papers by at least two referees. The papers in this volume were submitted in final form after the meeting. As represented by this volume, CNS meetings continue to expand in quality, size, and breadth of focus as increasing numbers of neuroscientists are taking a computational approach to understanding nervous system function. The CNS meetings are intended to showcase the best of current research in computational neuroscience. As such, the meeting is fundamentally focused on understanding the relationship between the structure of nervous systems and their function. What is clear from the continued expansion of the CNS meetings is that computational approaches are increasingly being applied at all levels of neurobiological analysis, in an ever-growing number of experimental preparations and neural subsystems. Thus, experimental subjects range from crickets to primates; sensory systems range from vision to electroreception; experimental approaches range from realistic models of ion channels to the analysis of the information content of spike trains. For this reason, the CNS meetings represent an opportunity for computational neurobiologists to consider their research results in a much broader context than is usually possible.
2012-12-06 By James M. Bower

This book will appeal to a wide audience consisting of computer scientists and electrical engineers interested in brain-like computational mechanisms, computer architects exploring the development of high-performance computing systems to ...

Author: A. Ravishankar Rao

Publisher: Springer Science & Business Media

ISBN: 1461407249

Category: Medical

Page: 226

A significant amount of effort in neural modeling is directed towards understanding the representation of information in various parts of the brain, such as cortical maps [6], and the paths along which sensory information is processed. Though the time domain is an integral aspect of the functioning of biological systems, it has proven very challenging to incorporate the time domain effectively in neural network models. A promising path that is being explored is to study the importance of synchronization in biological systems. Synchronization plays a critical role in the interactions between neurons in the brain, giving rise to perceptual phenomena, and explaining multiple effects such as visual contour integration and the separation of superposed inputs. The purpose of this book is to provide a unified view of how the time domain can be effectively employed in neural network models. A first direction to consider is to deploy oscillators that model temporal firing patterns of a neuron or a group of neurons. There is a growing body of research on the use of oscillatory neural networks, and their ability to synchronize under the right conditions. Such networks of synchronizing elements have been shown to be effective in image processing and segmentation tasks, and also in solving the binding problem, which is of great significance in the field of neuroscience. The oscillatory neural models can be employed at multiple scales of abstraction, ranging from individual neurons, to groups of neurons using Wilson-Cowan modeling techniques, and eventually to the behavior of entire brain regions as revealed in oscillations observed in EEG recordings. A second interesting direction to consider is to understand the effect of different neural network topologies on their ability to create the desired synchronization. A third direction of interest is the extraction of temporal signaling patterns from brain imaging data such as EEG and fMRI. Hence this area is of emerging interest in the brain sciences, as imaging techniques are able to resolve sufficient temporal detail to provide an insight into how the time domain is deployed in cognitive function. The following broad topics are covered in the book: synchronization, phase-locking behavior, image processing, image segmentation, temporal pattern analysis, EEG analysis, fMRI analysis, network topology and synchronizability, cortical interactions involving synchronization, and oscillatory neural networks. This book will benefit readers interested in the topics of computational neuroscience, applying neural network models to understand brain function, extracting temporal information from brain imaging data, and emerging techniques for image segmentation using oscillatory networks.
2011-09-18 By A. Ravishankar Rao
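
The synchronization behaviour this blurb describes can be illustrated with a toy model. The sketch below is my own illustration, not taken from the book: it uses the standard Kuramoto phase-oscillator model as a stand-in, and the network size, coupling strength, and all other parameter values are invented. The book's own models (e.g. Wilson-Cowan units) are considerably richer.

```python
# Minimal sketch (not from the book): a Kuramoto-style phase-oscillator
# network, used only to illustrate how coupled "neural" oscillators can
# synchronize. All parameter values below are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 100, 2.0, 0.01, 5000        # oscillators, coupling, time step
omega = rng.normal(1.0, 0.1, N)               # intrinsic frequencies
theta = rng.uniform(0, 2 * np.pi, N)          # initial phases

for _ in range(steps):
    # each oscillator is pulled toward the phases of all the others
    coupling = (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta += dt * (omega + coupling)

# order parameter r in [0, 1]: r -> 1 means the population has synchronized
r = np.abs(np.mean(np.exp(1j * theta)))
print(f"phase coherence r = {r:.3f}")
```

With the coupling K well above the synchronization threshold for this frequency spread, the printed coherence approaches 1; lowering K leaves the phases incoherent, which is the basic effect the oscillatory-network literature builds on.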

Author: John Robert Burger

Publisher: Springer Science & Business Media

ISBN: 9781461464129

Category: Medical

Page: 227

This book models an idealized neuron as being driven by basic electrical elements, the goal being to systematically characterize the logical properties of neural pulses. In order to constitute a system, neurons as pulsating devices may be represented using novel circuit elements as delineated in this book. A plausible brain system is implied by the delineated elements and logically follows from known and likely properties of a neuron. New to electrical science are novel pulse-related circuit elements involving recursive neurons. A recursive neuron, when properly excited, produces a self-sustaining pulse train that, when sampled, provides a true output with a specified probability, and a false output with complementary probability. Because of its similarity to the qubits of quantum mechanics, the recursive pulsating neuron is termed a simulated qubit. Recursive neurons easily function as controlled toggle devices and so are capable of massively parallel calculations, this being a new dimension in brain functioning as described in this book. Simulated qubits and their possibilities are compared to the qubits of quantum physics. Included in the book are suggested neural circuits for associative memory search via a randomized process of cue selection, and neural circuits for priority calculations. These serve to select returns from long term memory, which in turn determines one's next conscious thought or action based on past memorized experiences. The book reports on proposals involving electron tunneling between synapses, and quantum computations within neurons. Although not a textbook, there are easy exercises at the ends of chapters, and in the appendix there are twelve simulation experiments concerning neurons.
2013-05-31 By John Robert Burger
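
As a rough, hedged illustration of the sampling idea in this blurb (a self-sustaining pulse train read as true with a specified probability), here is a toy sketch. The pulse-train abstraction, the make_pulse_train helper, and the period and probability values are my own inventions for the example, not the book's circuit elements.

```python
# Toy model (not the book's circuit): a "recursive neuron" abstracted as a
# free-running pulse train whose duty cycle sets the probability that a
# randomly timed sample reads as True -- the behavior the blurb compares to
# sampling a qubit. Period and probability values are invented.
import random

def make_pulse_train(p_true, period=100):
    """Return a sampler for a self-sustaining pulse train.

    The train is high for p_true * period ticks of every period, so sampling
    it at a uniformly random tick returns True with probability ~p_true.
    """
    high_ticks = int(p_true * period)
    def sample():
        tick = random.randrange(period)   # random sampling instant
        return tick < high_ticks          # True while the pulse is high
    return sample

qubit_like = make_pulse_train(p_true=0.7)
samples = [qubit_like() for _ in range(10_000)]
print("fraction True:", sum(samples) / len(samples))   # ~0.70
```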

Author: James E. Smith

Publisher: Morgan & Claypool Publishers

ISBN: 9781627058902

Category: Computers

Page: 215

Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to give both background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author. As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In these paradigms, time is used as a freely available resource for both communication and computation. Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.
2017-05-18 By James E. Smith
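
To give a concrete feel for the spike-timing computation this description gestures at, here is a minimal leaky integrate-and-fire neuron driven by a small spike pattern. This is an illustrative stand-in under my own assumptions (the neuron model, weights, and time constants are invented), not the neuron model or column architecture developed in the book.

```python
# Minimal sketch: a leaky integrate-and-fire neuron driven by spike times on
# three input lines. Nearly coincident input spikes push the membrane over
# threshold; dispersed spikes leak away. All constants are illustrative.
import numpy as np

dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0     # ms, ms, arb. units
weights = np.array([0.4, 0.3, 0.5])                  # one weight per input line
pattern = [[10.0, 12.0], [11.0], [12.0, 13.0]]       # input spike times (ms)

v, out_spikes = 0.0, []
for t in np.arange(0.0, 50.0, dt):
    v += dt * (-v / tau)                              # passive leak
    for i, times in enumerate(pattern):               # synaptic input
        if any(abs(t - s) < dt / 2 for s in times):
            v += weights[i]
    if v >= v_thresh:                                  # threshold crossing
        out_spikes.append(float(t))
        v = v_reset
print("output spike times (ms):", out_spikes)
```

Running this, the neuron fires once, when the spikes on lines 0 and 2 arrive together at 12 ms; spreading the same spikes out in time produces no output, which is the sense in which timing itself carries information.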

Author: Thomas M. McKenna

Publisher: Academic Press

ISBN: 9781483296067

Category: Computers

Page: 644

This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real neurons is essential to the design of enhanced processor elements for use in the next generation of ANNs. The book covers computation in dendrites and spines, computational aspects of ion channels, synapses, patterned discharge and multistate neurons, and stochastic models of neuron dynamics. It is the most up-to-date presentation of biophysical and computational methods.
2014-05-19 By Thomas M. McKenna

Author: Mona E. Zaghloul

Publisher: Springer Science & Business Media

ISBN: 9781461526803

Category: Technology & Engineering

Page: 292

When confronted with the hows and whys of nature's computational engines, some prefer to focus upon neural function: addressing issues of neural system behavior and its relation to natural intelligence. Then there are those who prefer the study of the "mechanics" of neural systems: the nuts and bolts of the "wetware": the neurons and synapses. Those who investigate pulse coded implementations of artificial neural networks know what it means to stand at the boundary which lies between these two worlds: not just asking why natural neural systems behave as they do, but also how they achieve their marvelous feats. The research results presented in this book not only address more conventional abstract notions of neural-like processing, but also the more specific details of neural-like processors. It has been established for some time that natural neural systems perform a great deal of information processing via electrochemical pulses. Accordingly, pulse coded neural network concepts are receiving increased attention in artificial neural network research. This increased interest is compounded by continuing advances in the field of VLSI circuit design. This is the first time in history in which it is practical to construct networks of neuron-like circuits of reasonable complexity that can be applied to real problems. We believe that the pioneering work in artificial neural systems presented in this book will lead to further advances that will not only be useful in some practical sense, but may also provide some additional insight into the operation of their natural counterparts.
2012-12-06 By Mona E. Zaghloul

Author: Nergis Tomen

Publisher: Springer

ISBN: 9783030209650

Category: Medical

Page: 287

This book offers a timely overview of theories and methods developed by an authoritative group of researchers to understand the link between criticality and brain functioning. Cortical information processing in particular and brain function in general rely heavily on the collective dynamics of neurons and networks distributed over many brain areas. A key concept for characterizing and understanding brain dynamics is the idea that networks operate near a critical state, which offers several potential benefits for computation and information processing. However, there is still a large gap between research on criticality and understanding brain function. For example, cortical networks are not homogeneous but highly structured, they are not in a state of spontaneous activation but strongly driven by changing external stimuli, and they process information with respect to behavioral goals. So far, the questions of how critical dynamics may support computation in this complex setting, and whether they can outperform other information processing schemes, remain open. Based on the workshop "Dynamical Network States, Criticality and Cortical Function", held in March 2017 at the Hanse Institute for Advanced Studies (HWK) in Delmenhorst, Germany, the book provides readers with extensive information on these topics, as well as tools and ideas to answer the above-mentioned questions. It is meant for physicists, computational and systems neuroscientists, and biologists.
2019-07-23 By Nergis Tomen
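
For readers new to the criticality idea mentioned in this blurb, the toy sketch below simulates a simple branching process, a common caricature of near-critical cascade dynamics in the criticality literature. It is my own illustration under stated assumptions (Poisson offspring, invented parameters), not a model from the book.

```python
# Toy sketch of criticality (not from the book): each active unit activates a
# Poisson(sigma) number of units at the next step. sigma < 1 -> cascades die
# quickly; sigma = 1 -> critical, heavy-tailed avalanches; sigma > 1 ->
# runaway activity (not simulated here).
import numpy as np

def avalanche_size(sigma, rng, max_steps=1000, cap=1_000_000):
    """Total number of activations triggered by one seed unit."""
    active, total = 1, 1
    for _ in range(max_steps):
        if active == 0 or total > cap:       # died out, or clip runaway growth
            break
        active = rng.poisson(sigma * active)
        total += active
    return total

rng = np.random.default_rng(1)
for sigma in (0.8, 0.95, 1.0):
    sizes = [avalanche_size(sigma, rng) for _ in range(2000)]
    # subcritical theory: mean size = 1 / (1 - sigma); it diverges as sigma -> 1
    print(f"sigma={sigma}: mean avalanche size ~ {np.mean(sizes):.1f}")
```

The mean avalanche size grows sharply as sigma approaches 1, which is the kind of boundary behaviour the "operating near a critical state" hypothesis refers to.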

Author: Christof Koch

Publisher: Oxford University Press

ISBN: 9780195181999

Category: Medical

Page: 562

Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes. Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation. Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
2004-10-28 By Christof Koch
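
For orientation, the "linear cable equation" named in this description has the standard textbook form shown below. This is generic cable theory, not the book's own notation, and the symbols follow the usual per-unit-length conventions.

```latex
% Standard passive (linear) cable equation -- generic textbook form
\lambda^{2}\,\frac{\partial^{2} V}{\partial x^{2}}
    = \tau_{m}\,\frac{\partial V}{\partial t} + V,
\qquad
\lambda = \sqrt{\frac{r_{m}}{r_{i}}},
\qquad
\tau_{m} = r_{m}\,c_{m}
```

Here V(x, t) is the membrane potential relative to rest, \lambda the space constant, \tau_m the membrane time constant, and r_m, r_i, c_m the standard membrane-resistance, axial-resistance, and membrane-capacitance parameters of one-dimensional cable theory.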

Author: Taras Kowaliw

Publisher: Springer

ISBN: 9783642553370

Category: Technology & Engineering

Page: 261

The pursuit of artificial intelligence has been a highly active domain of research for decades, yielding exciting scientific insights and productive new technologies. In terms of generating intelligence, however, this pursuit has yielded only limited success. This book explores the hypothesis that adaptive growth is a means of moving forward. By emulating the biological process of development, we can incorporate desirable characteristics of natural neural systems into engineered designs and thus move closer towards the creation of brain-like systems. The particular focus is on how to design artificial neural networks for engineering tasks. The book consists of contributions from 18 researchers, ranging from detailed reviews of recent domains by senior scientists, to exciting new contributions representing the state of the art in machine learning research. The book begins with broad overviews of artificial neurogenesis and bio-inspired machine learning, suitable both as an introduction to the domains and as a reference for experts. Several contributions provide perspectives and future hypotheses on recent highly successful trains of research, including deep learning, the HyperNEAT model of developmental neural network design, and a simulation of the visual cortex. Other contributions cover recent advances in the design of bio-inspired artificial neural networks, including the creation of machines for classification, the behavioural control of virtual agents, the design of virtual multi-component robots and morphologies, and the creation of flexible intelligence. Throughout, the contributors share their vast expertise on the means and benefits of creating brain-like machines. This book is appropriate for advanced students and practitioners of artificial intelligence and machine learning.
2014-06-04 By Taras Kowaliw

Author: Benny Elley Lautrup

Publisher: World Scientific

ISBN: 9789814507325

Category: Science

Page: 180

Both specialists and laymen will enjoy reading this book. Using a lively, non-technical style and images from everyday life, the authors present the basic principles behind computing and computers. The focus is on those aspects of computation that concern networks of numerous small computational units, whether biological neural networks or artificial electronic devices.
1990-05-01 By Benny Elley Lautrup

Author: Rolf Eckmiller

Publisher: North Holland

ISBN: 0444883908

Category: Computers

Page: 626

The 119 contributions in this book cover a range of topics, including parallel computing, parallel processing in biological neural systems, simulators for artificial neural networks, neural networks for visual and auditory pattern recognition as well as for motor control, AI, and examples of optical and molecular computing. The book may be regarded as a state-of-the-art report and at the same time as an 'Interdisciplinary Reference Source' for parallel processing. It should catalyze international and interdisciplinary cooperation among computer scientists, neuroscientists, physicists, and engineers in the attempt to: 1) decipher parallel information processes in biology, physics, and chemistry, and 2) design conceptually similar technical parallel information processors.
1990 By Rolf Eckmiller

Author: Artur S. d'Avila Garcez

Publisher: Springer Science & Business Media

ISBN: 1852335122

Category: Computers

Page: 271

Computing Science and Artificial Intelligence are concerned with producing devices that help and/or replace human beings in their daily activities. To be successful, adequate modelling of these activities needs to be carried out, and this has accelerated the development of both old and new disciplines, including Logic and Computation, Neural Networks, Genetic Algorithms and Probabilistic/Causal Networks. This book looks at how these techniques could complement each other and how, by understanding the role of each in a particular application, we can pave the way towards the development of more effective intelligent systems.
2002-08-06 By Artur S. d'Avila Garcez

Author: James A. Freeman

Publisher: Addison Wesley Publishing Company

ISBN: UOM:39015034222953

Category: Computers

Page: 401

Computing Methodologies -- Artificial Intelligence.

This is a comprehensive introduction to neural networks and neural information processing.

Author: Helge Ritter

Publisher: Addison Wesley Publishing Company

ISBN: UCAL:B4336215

Category: Computers

Page: 306

1992 By Helge Ritter
