Advances in neural information processing systems 8: Proceedings of the 1995 conference: Edited by David S. Touretzky, Michael C. Mozer and Michael E. Hasselmo. MIT Press, Cambridge, MA. (1996). 1098 pages. $60.00
- Publisher
- Elsevier Science
- Year
- 1997
- Language
- English
- Size
- 354 KB
- Volume
- 33
- Category
- Article
- ISSN
- 0898-1221
Synopsis
Contents: Preface. Committees. I. Cognitive science. Learning the structure of similarity (J.B. Tenenbaum). A model of spatial representations in parietal cortex explains hemineglect (A. Pouget and T.J. Sejnowski). Human reading and the curse of dimensionality (G.L. Martin). Extracting tree-structured representations of trained networks (M.W. Craven and J.W. Shavlik). Harmony networks do not work (R. Gourley). Dynamics of attention as near saddle-node bifurcation behavior (H. Nakahara and K. Doya). Rapid quality estimation of neural network input representations (K.J. Cherkauer and J.W. Shavlik). A model of auditory streaming (S.L. McCabe and M.J. Denham). II. Neuroscience. Modeling interactions of the rat's place and head direction systems (A.D. Redish and D.S. Touretzky). Correlated neuronal response: Time scales and mechanisms (W. Bair, E. Zohary and C. Koch). Information through a spiking neuron (C. Stevens and J.G. Taylor). A dynamical model of context dependencies for the vestibulo-ocular reflex (O.J.M.D. Coenen and T.J. Sejnowski). The role of activity in synaptic competition at the neuromuscular junction (S.R.H. Joseph and D.J. Willshaw). When is an integrate-and-fire neuron like a Poisson neuron? (C.F. Stevens and A. Zador). How perception guides production in birdsong learning (C.L. bS"y).
The geometry of eye rotations and Listing's law (A.A. Handzel and T. Flash). Temporal coding in the submillisecond range: Model of barn owl auditory pathway (R. Kempter, W. Gerstner, J.L. van Hemmen and H. Wagner). Cholinergic suppression of transmission may allow combined associative memory function and self-organization in the neocortex (M.E. Hasselmo and M. Cekic). A predictive switching model of cerebellar movement control (A.G. Barto, J.T. Buckingham and J.C. Houk). Independent component analysis of electroencephalographic data (S. Makeig, A.J. Bell, T.-P. Jung and T.J. Sejnowski). Simulation of a thalamocortical circuit for computing directional heading in the rat (H.T. Blair). Plasticity of center-surround opponent receptive fields in real and artificial neural systems of vision (S. Yasui, T. Furukawa, M. Yamada and T. Saito). III. Theory. Learning model bias (J. Baxter). Statistical theory of overtraining--Is cross-validation asymptotically effective? (S. Amari, N. Murata, K.-R. Müller, M. Finke and H. Yang). A bound on the error of cross validation using the approximation and estimation rates, with consequences for the training-test split (M. Kearns). Learning with ensembles: How overfitting can be useful (P. Sollich and A. Krogh). Neural networks with quadratic VC dimension (P. Koiran and E.D. Sontag). Sample complexity for learning recurrent perceptron mappings (B. Dasgupta and E.D. Sontag). On the computational power of noisy spiking neurons (W. Maass). A realized learning task which exhibits overfitting (S. Bös). Stable dynamic parameter adaptation (S.M. Rüger). Estimating the Bayes risk from sample data (R.R. Snapp and T. Xu). Recursive estimation of dynamic modular RBF networks (V. Kadirkamanathan and M. Kadirkamanathan). On neural networks with minimal weights (V. Bohossian and J. Bruck). Modern analytic techniques to solve the dynamics of recurrent neural networks (A.C.C. Coolen, S.N. Laughton and D. Sherrington).
Implementation issues in the Fourier transform algorithm (Y. Mansour and S. Sahar). Generalisation of a class of continuous neural networks (J. Shawe-Taylor and J. Zhao). Gradient and Hamiltonian dynamics applied to learning in neural networks (J.W. Howse, C.T. Abdallah and G.L. Heileman). Optimization principles for the neural code (M. DeWeese). Strong unimodality and exact learning of constant depth μ-perceptron networks (M. Marchand and S. Hadjifaradji). Active learning in multilayer perceptrons (K. Fukumizu). Dynamics of on-line gradient descent learning for multilayer neural networks (D. Saad and S.A. Solla). Worst-case loss bounds for single neurons (D.P. Helmbold, J. Kivinen and M.K. Warmuth). Exponentially many local minima for single neurons (P. Auer, M. Herbster and M.K. Warmuth). Adaptive back-propagation in on-line learning of multilayer networks (A.H.L. West and D. Saad). Optimizing cortical mappings (G.J. Goodhill, S. Finch and T.J. Sejnowski). Quadratic-type Lyapunov functions for competitive neural networks with different time-scales (A. Meyer-Bäse). Examples of learning curves from a modified VC-formalism (A. Kowalczyk, J. Szymanski, P.L. Bartlett and R.C. Williamson). Bayesian methods for mixtures of experts (S. Waterhouse, D. MacKay and T. Robinson). Some results on convergent unlearning algorithm (S.A. Semenov and I.B. Shuvalova). Geometry of early stopping in linear networks (R. Dodier). Absence of cycles in symmetric neural networks (X. Wang, A. Jagota, F. Botelho and M. Garzon). IV. Algorithms and architectures. Adaptive mixture of probabilistic transducers (Y. Singer). REMAP: Recursive estimation and maximization of a posteriori probabilities--Application to transition-based connectionist speech recognition (Y. Konig, H. Bourlard and N. Morgan). Recurrent neural networks for missing or asynchronous data (Y. Bengio and F. Gingras). Family discovery (S.M. Omohundro). Discriminant adaptive nearest neighbor classification and regression (T.
Hastie and R. Tibshirani). Clustering data through an analogy to the Potts model (M. Blatt, S. Wiseman and E. Domany). Generalized learning vector quantization (A. Sato and K. Yamada). Stochastic hillclimbing as a baseline method for evaluating genetic algorithms (A. Juels and M. Wattenberg). Symplectic nonlinear component analysis (L.C. Parra). A unified learning scheme: Bayesian-Kullback Ying-Yang machine (L. Xu). Universal approximation and learning of trajectories using oscillators (P. Baldi and K. Hornik). A smoothing regularizer for recurrent neural networks (L. Wu and J. Moody). EM optimization of latent-variable density models (C.M. Bishop, M. Svensén and C.K.I. Williams). Factorial hidden Markov models (Z. Ghahramani and M.I. Jordan). Boosting decision trees (H. Drucker and C. Cortes). Exploiting tractable substructures in intractable networks (L.K. Saul and M.I. Jordan). Hierarchical recurrent neural networks for long-term dependencies (S.E. Hihi and Y. Bengio). Discovering structure in continuous variables using Bayesian networks (R. Hofmann and V. Tresp). Using pairs of data points to define splits for decision trees (G.E. Hinton and M. Revow). Gaussian processes for regression (C.K.I. Williams and C.E. Rasmussen). Pruning with generalization based weight saliencies: γOBD, γOBS (M.W. Pedersen, L.K. Hansen and J. Larsen). Fast learning by bounding likelihoods in sigmoid type belief networks (T. Jaakkola, L.K. Saul and M.I. Jordan). Generating accurate and diverse members of a neural-network ensemble (D.W. Opitz and
Similar volumes
Contents: Preface. NIPS committees. Reviewers. I. Cognitive science. Evidence for a forward dynamics model in human adaptive motor control (Nikhil Bhushan and Reza Shadmehr). Perceiving without learning: From spirals to inside/outside relations (Ke Chen and DeLiang L. Wang). A model for associative
Preface. NIPS committees. Reviewers. Part I. Cognitive science. Recognizing evoked potentials in a virtual environment (Jessica D. Bayliss and Dana H. Ballard). A neurodynamical approach to visual attention (Gustavo Deco and Josef Zihl). Effects of spatial and temporal contiguity on the acquisition