𝔖 Scriptorium
✦   LIBER   ✦


Neural Networks with Model Compression

โœ Scribed by Baochang Zhang; Tiancheng Wang; Sheng Xu; David Doermann


Publisher
Springer Nature Singapore
Year
2024
Tongue
English
Leaves
269
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


Deep learning has achieved impressive results in image classification, computer vision, and natural language processing. To achieve better performance, ever deeper and wider networks have been designed, increasing the demand for computational resources. The number of floating-point operations (FLOPs) grows dramatically with network size, and this has become an obstacle to deploying convolutional neural networks (CNNs) on mobile and embedded devices. In this context, this book focuses on CNN compression and acceleration, which are important topics for the research community. It describes numerous methods, including parameter quantization, network pruning, low-rank decomposition, and knowledge distillation.

More recently, to reduce the burden of handcrafted architecture design, neural architecture search (NAS) has been used to build neural networks automatically by searching over a vast architecture space. The book also introduces NAS, given its state-of-the-art performance in applications such as image classification and object detection. It further describes extensive applications of compressed deep models to image classification, speech recognition, object detection, and tracking. These topics can help researchers better understand the usefulness and potential of network compression in practical applications. Interested readers should have basic knowledge of machine learning and deep learning to follow the methods described in this book.
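To give a flavor of the techniques the synopsis lists, here is a minimal, illustrative sketch (not taken from the book) of one of the simplest compression ideas: uniform symmetric 8-bit parameter quantization. Float32 weights are replaced by int8 integers plus a single per-tensor scale factor, cutting storage fourfold at a small accuracy cost. The function name and toy data are hypothetical.

```python
import numpy as np

def quantize_int8(w):
    """Uniform symmetric quantization of a weight tensor to int8.

    Returns the quantized integer tensor and the scale needed to
    dequantize (w is approximately scale * q).
    """
    max_abs = np.abs(w).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

# A toy "layer" of float32 weights.
w = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = scale * q.astype(np.float32)  # dequantized approximation

# int8 storage is 4x smaller than float32.
print(q.dtype, w.nbytes, q.nbytes)             # int8 64 16
# Round-to-nearest keeps the error within half a quantization step.
print(float(np.abs(w - w_hat).max()) <= scale)  # True
```

The methods covered in the book (mixed-precision and binary quantization, structured pruning, low-rank factorization, distillation) refine this basic accuracy-versus-footprint trade-off in far more sophisticated ways.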

✦ Table of Contents


Cover
Front Matter
1. Introduction
2. Binary Neural Networks
3. Binary Neural Architecture Search
4. Quantization of Neural Networks
5. Network Pruning
6. Applications


📜 SIMILAR VOLUMES



Modelling Perception with Artificial Neu
โœ Colin R. Tosh, Graeme D. Ruxton ๐Ÿ“‚ Library ๐Ÿ“… 2010 ๐Ÿ› Cambridge University Press ๐ŸŒ English

Studies of the evolution of animal signals and sensory behaviour have more recently shifted from considering 'extrinsic' (environmental) determinants to 'intrinsic' (physiological) ones. The drive behind this change has been the increasing availability of neural network models. With contributions fr

Neural Modeling and Neural Networks
โœ F. Ventriglia (Eds.) ๐Ÿ“‚ Library ๐Ÿ“… 1994 ๐Ÿ› Elsevier Science Pub Co ๐ŸŒ English

Research in neural networks has escalated dramatically in the last decade, acquiring along the way terms and concepts, such as learning, memory, perception, recognition, which are the basis of neuropsychology. Nevertheless, for many, neural modelling remains controversial in its purported ability to