Foundations and tools for neural modeling : International Work-Conference on Artificial and Natural Neural Networks, IWANN'99, Alicante, Spain, June 2-4, 1999 : proceedings. Volume I / José Mira, Juan V. Sánchez-Andrés (eds.)
Material type: Text
Series: Lecture notes in computer science ; 1606
Publication details: Berlin ; Heidelberg : Springer-Verlag, 1999
Description: 1 online resource (xxiii, 865 pages) : illustrations
Content type: text | Media type: computer | Carrier type: online resource
ISBN: 9783540487715; 3540487719
Subject(s): Neural networks (Neurobiology) | Neural networks (Computer science) | Computer science | Neurosciences | Computer network architectures | Artificial intelligence | Physiology
Genre/Form: Electronic books
Additional physical formats: Print version: Foundations and tools for neural modeling
DDC classification: 573.8
LOC classification: QA76.87
Includes bibliographical references.
This book constitutes, together with its companion LNCS 1607, the refereed proceedings of the International Work-Conference on Artificial and Natural Neural Networks, IWANN'99, held in Alicante, Spain, in June 1999. The 89 revised papers presented were carefully reviewed and selected for inclusion in the book. This volume is devoted to foundational issues of neural computation and to tools for neural modeling. The papers are organized in parts on neural modeling: biophysical and structural models; plasticity phenomena: maturing, learning, and memory; and artificial intelligence and cognitive neuroscience.
Self-assembly of oscillatory neurons and networks -- Reverberating loops of information as a dynamic mode of functional organization of the N.S.: A working conjecture -- Reconstruction of brain networks by algorithmic amplification of morphometry data -- Slow learning and fast evolution: An approach to cytoarchitectonic parcellation -- Dendritic [Ca2+] dynamics in the presence of immobile buffers and of dyes -- Development of directionally selective microcircuits in striate cortex -- Neural circuitry and plasticity in the adult vertebrate inner retina -- Modelling the circuitry of the cuneate nucleus -- Filtering capability of neural networks from the developing mammalian hippocampus -- Spatial inversion and facilitation in J. Gonzalo's research on the sensorial cortex: Integrative aspects -- A self-organizing model for the development of ocular dominance and orientation columns in the visual cortex -- Gaze control with neural networks: A unified approach for saccades and smooth pursuit -- The neural net of Hydra and the modulation of its periodic activity -- A biophysical model of intestinal motility: Application in pharmacological studies -- Model of the neuronal net for detection of single bars and cross-like figures -- Connected cortical recurrent networks -- Inter-spike interval statistics of cortical neurons -- A new cochlear model based on adaptive gain mechanism -- Structure of lateral inhibition in an olfactory bulb model -- Effects of correlation and degree of balance in random synaptic inputs on the output of the Hodgkin-Huxley model -- Oscillations in the lower stations of the somatosensory pathway -- Effects of the ganglion cell response nonlinear mapping on visual system's noise filtering characteristics -- Paradoxical relationship between output and input regularity for the FitzHugh-Nagumo model -- Synchronisation in a network of FHN units with synaptic-like coupling -- Two-compartment stochastic model of a neuron with periodic input -- 
Stochastic model of the place cell discharge -- Integrate-and-fire model with correlated inputs -- Noise modulation by stochastic neurons of the integrate-and-fire type -- Bayesian modelling of neural networks -- Neural networks of the Hopfield type -- Stability properties of BSB models -- Storage capacity of the exponential correlation associative memory -- A new input-output function for binary Hopfield neural networks -- On the three layer neural networks using sigmoidal functions -- The capacity and attractor basins of associative memory models -- A modular attractor model of semantic access -- Priming an artificial associative memory -- What does a peak in the landscape of a Hopfield associative memory look like? -- Periodic and synchronic firing in an ensemble of identical stochastic units: Structural stability -- Driving neuromodules into synchronous chaos -- Aging and Lévy distributions in sandpiles -- Finite size effects in neural networks -- On the computational power of limited precision weights neural networks in classification problems: How to calculate the weight range so that a solution will exist -- Estimating exact form of generalisation errors -- A network model for the emergence of orientation maps and local lateral circuits -- A neural network model for the self-organization of cortical grating cells -- Extended nonlinear Hebbian learning for developing sparse-distributed representation -- Cascade error projection: A learning algorithm for hardware implementation -- Unification of supervised and unsupervised training -- On-line optimization of radial basis function networks with orthogonal techniques -- A fast orthogonalized FIR adaptive filter structure using recurrent Hopfield-like network -- Using temporal neighborhoods to adapt function approximators in reinforcement learning -- Autonomous clustering for machine learning -- Bioinspired framework for general-purpose learning -- Learning efficient rulesets from fuzzy data with a genetic algorithm -- Self-organizing cases to find paradigms -- Training higher order Gaussian synapses -- On-line gradient learning algorithms for K-nearest neighbor classifiers -- Structure adaptation in artificial neural networks through adaptive clustering and through growth in state space -- Sensitivity analysis of radial basis function networks for fault tolerance purposes -- Association with multi-dendritic radial basis units -- A Boolean neural network controlling task sequences in a noisy environment -- SOAN: Self organizing with adaptive neighborhood neural network -- Topology preservation in SOFM: A Euclidean versus Manhattan distance comparison -- Supervised VQ learning based on temporal inhibition -- Improving the LBG algorithm -- Sequential learning algorithm for PG-RBF network using regression weights for time series prediction -- Parallel fuzzy learning -- Classification and feature selection by a self-organizing neural network -- SA-prop: Optimization of multilayer perceptron parameters using simulated annealing -- Mobile robot path planning using genetic algorithms -- Do plants optimize? 
-- Heuristic generation of the initial population in solving job shop problems by evolutionary strategies -- Randomness in heuristics: An experimental investigation for the maximum satisfiability problem -- Solving the packing and strip-packing problems with genetic algorithms -- Multichannel pattern recognition neural network -- A biologically plausible maturation of an ART network -- Adaptive resonance theory microchips -- Application of ART2-A as a pseudo-supervised paradigm to nuclear reactor diagnostics -- Supervised ART-I: A new neural network architecture for learning and classifying multi-valued input patterns -- Conscious and intentional access to unconscious decision-making module in ambiguous visual perception -- A psychophysical approach to the mechanism of human stereovision -- Neuronal coding and color sensations -- Neurocomputational models of visualisation: A preliminary report -- Self-organization of shift-invariant receptive fields -- Pattern recognition system with top-down process of mental rotation -- Segmentation of occluded objects using a hybrid of selective attention and symbolic knowledge -- Hypercolumn model: A modified model of neocognitron using hierarchical self-organizing maps -- Attentional strategies for object recognition.