
Advances in neural information processing systems 12: Proceedings of the 1999 conference. Edited by Sara A. Solla, Todd K. Leen and Klaus-Robert Müller. The MIT Press, Cambridge, MA (2000). 1080 pages. $65.00.


Publisher
Elsevier Science
Year
2001
Language
English
File size
362 KB
Volume
41
Category
Article
ISSN
0898-1221


✦ Synopsis


Preface. NIPS committees. Reviewers.

Part I. Cognitive science. Recognizing evoked potentials in a virtual environment (Jessica D. Bayliss and Dana H. Ballard). A neurodynamical approach to visual attention (Gustavo Deco and Josef Zihl). Effects of spatial and temporal contiguity on the acquisition of spatial information (Thea B. Ghiselli-Crippa and Paul W. Munro). Acquisition in autoshaping (Sham Kakade and Peter Dayan). Robust recognition of noisy and superimposed patterns via selective attention (Soo-Young Lee and Michael C. Mozer). Perceptual organization based on temporal dynamics (Xiuwen Liu and DeLiang L. Wang). Information factorization in connectionist models of perception (Javier R. Movellan and James L. McClelland). Graded grammaticality in prediction fractal machines (Shan Parfitt, Peter Tiňo and Georg Dorffner). Rules and similarity in concept learning (Joshua B. Tenenbaum). Evolving learnable languages (Bradley Tonkes, Alan Blair and Janet Wiles). Learning statistically neutral tasks without expert guidance (Ton Weijters, Antal van den Bosch and Eric Postma). A generative model for attractor dynamics (Richard S. Zemel and Michael C. Mozer).

Part II. Neuroscience. Recurrent cortical competition: Strengthen or weaken? (Péter Adorján, Lars Schwabe, Christian Piepenbrock and Klaus Obermayer). Effective learning requires neuronal remodeling of Hebbian synapses (Gal Chechik, Isaac Meilijson and Eytan Ruppin). Wiring optimization in the brain (Dmitri B. Chklovskii and Charles F. Stevens). Optimal sizes of dendritic and axonal arbors (Dmitri B. Chklovskii). Neural representation of multi-dimensional stimuli (Christian W. Eurich, Stefan D. Wilke and Helmut Schwegler). Spiking Boltzmann machines (Geoffrey E. Hinton and Andrew D. Brown). Distributed synchrony of spiking neurons in a Hebbian cell assembly (David Horn, Nir Levy, Isaac Meilijson and Eytan Ruppin). Can V1 mechanisms account for figure-ground and medial axis effects? (Zhaoping Li). Channel noise in excitable neural membranes (Amit Manwani, Peter N. Steinmetz and Christof Koch). LTD facilitates learning in a noisy environment (Paul W. Munro and Gerardina Hernandez). Memory capacity of linear vs. nonlinear models of dendritic integration (Panayiota Poirazi and Bartlett W. Mel). Predictive sequence learning in recurrent neocortical circuits (Rajesh P.N. Rao and Terrence J. Sejnowski). A recurrent model of the interaction between prefrontal and inferotemporal cortex in delay tasks (Alfonso Renart, Nestor Parga and Edmund T. Rolls). Information capacity and robustness of stochastic neuron models (Elad Schneidman, Idan Segev and Naftali Tishby). An MEG study of response latency and variability in the human visual system during a visual-motor integration task (Akaysha C. Tang, Barak A. Pearlmutter, Tim A. Hely, Michael Zibulevsky and Michael P. Weisend). Population decoding based on an unfaithful model (Si Wu, Hiroyuki Nakahara, Noboru Murata and Shun-ichi Amari). Spike-based learning rules and stabilization of persistent neural activity (Xiaohui Xie and H. Sebastian Seung).

Part III. Theory. A variational Bayesian framework for graphical models (Hagai Attias). Model selection in clustering by uniform convergence bounds (Joachim M. Buhmann and Marcus Held). Uniqueness of the SVM solution (Christopher J.C. Burges and David J. Crisp). Model selection for support vector machines (Olivier Chapelle and Vladimir N. Vapnik). Dynamics of supervised learning with restricted training sets and noisy teachers (A.C.C. Coolen and C.W.H. Mace). A geometric interpretation of ν-SVM classifiers (David J. Crisp and Christopher J.C. Burges). Efficient approaches to Gaussian process classification (Lehel Csató, Ernest Fokoué, Manfred Opper, Bernhard Schottky and Ole Winther). Potential boosters? (Nigel Duffy and David Helmbold). Bayesian averaging is well-temperated (Lars Kai Hansen). Regular and irregular Gallager-type error-correcting codes (Yoshiyuki Kabashima, Tatsuto Murayama, David Saad and Renato Vicente). Mixture density estimation (Jonathan Q. Li and Andrew R. Barron). Statistical dynamics of batch learning (Song Li and K.Y. Michael Wong). Neural computation with winner-take-all as the only nonlinear operation (Wolfgang Maass). Boosting with multiway branching in decision trees (Yishay Mansour and David McAllester). Inference for the generalization error (Claude Nadeau and Yoshua Bengio). Resonance in a stochastic neuron model with delayed interaction (Toru Ohira, Yuzuru Sato and Jack D. Cowan). Understanding stepwise generalization of support vector machines: A toy model (Sebastian Risau-Gusman and Mirta B. Gordon). Lower bounds on the complexity of approximating continuous functions by sigmoidal neural networks (Michael Schmitt). Noisy neural networks and generalization (Hava T. Siegelmann, Alexander Roitershtein and Asa Ben-Hur). The entropy regularization information criterion (Alexander J. Smola, John Shawe-Taylor, Bernhard Schölkopf and Robert C. Williamson). Probabilistic methods for support vector machines (Peter Sollich). Algebraic analysis for non-regular learning machines (Sumio Watanabe). Semiparametric approach to multichannel blind deconvolution of nonminimum phase systems (L.-Q. Zhang, Shun-ichi Amari and A. Cichocki). Some theoretical results concerning the convergence of compositions of regularized linear functions (Tong Zhang).

Part IV. Algorithms and architecture. Robust full Bayesian methods for neural networks (Christophe Andrieu, João F.G. de Freitas and Arnaud Doucet). Independent factor analysis with temporally structured sources (Hagai Attias). Gaussian fields for approximate inference in layered sigmoid belief networks (David Barber and Peter Sollich). Modeling high-dimensional discrete data with multi-layer neural networks (Yoshua Bengio and Samy Bengio). Robust neural network regression for offline and online learning (Thomas Briegel and Volker Tresp). Reconstruction of sequential data with probabilistic models and continuity constraints (Miguel Á. Carreira-Perpiñán). Transductive inference for estimating values of functions (Olivier Chapelle, Vladimir N. Vapnik and Jason Weston). The nonnegative Boltzmann machine (Oliver B. Downs, David J.C. MacKay and Daniel D. Lee). Differentiating functions of the Jacobian with respect to the weights (Gary William Flake and Barak A. Pearlmutter). Local probability propagation for factor analysis (Brendan J. Frey). Variational inference for Bayesian mixtures of factor analysers (Zoubin Ghahramani and Matthew J. Beal). Bayesian transduction (Thore Graepel, Ralf Herbrich


📜 SIMILAR VOLUMES


Advances in neural information processing
📂 Article 📅 1999 🏛 Elsevier Science 🌐 English ⚖ 363 KB

Contents: Preface. NIPS committees. Reviewers. I. Cognitive science. Evidence for a forward dynamics model in human adaptive motor control (Nikhil Bhushan and Reza Shadmehr). Perceiving without learning: From spirals to inside/outside relations (Ke Chen and DeLiang L. Wang). A model for associative

Advances in neural information processing
📂 Article 📅 1997 🏛 Elsevier Science 🌐 English ⚖ 354 KB

Contents: Preface. Committees. I. Cognitive science. Learning the structure of similarity (J.B. Tenenbaum). A model of spatial representations in parietal cortex explains hemineglect (A. Pouget and T.J. Sejnowski). Human reading and the curse of dimensionality (G.L. Martin). Extracting tree-structur