Fine analog coding minimizes information transmission
By William R. Softky
- Book ID
- 103926594
- Publisher
- Elsevier Science
- Year
- 1996
- Language
- English
- File size
- 825 KB
- Volume
- 9
- Category
- Article
- ISSN
- 0893-6080
Synopsis
One measure of the computational power of a single unit in a network is the rate at which it can transfer information. Because that rate depends both on the unit's analog resolution and on its speed, many types of units may reach their optimum information flow when they operate very fast, with only coarse or binary resolution at each time step (this occurs because the information rate scales linearly with speed or bandwidth, but only logarithmically with analog resolution). Furthermore, calculations for a mean-square-limited channel with Gaussian noise show that the optimum information rate for the best possible analog code is only about 30% better than the optimum rate for the simplest binary code. For the biologically realistic case of an irregular neural spike train, a binary code carries about two orders of magnitude more information than an analog rate-code using the same spikes, and is equally consistent with neurophysiology. Thus, neural networks that optimally transfer information would probably use a binary (or coarse analog) code, rather than the high-resolution, real-valued codes usually explored.
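The linear-versus-logarithmic scaling argument in the synopsis matches the form of Shannon's capacity formula for a Gaussian channel, C = B log2(1 + SNR), where bandwidth B plays the role of speed and SNR the role of analog resolution. The sketch below (the function name and parameter values are illustrative, not from the paper) shows why doubling bandwidth doubles the rate, while an equivalent improvement in resolution gains far less:

```python
import math

def capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon capacity of a power-limited Gaussian channel, in bits/s."""
    return bandwidth_hz * math.log2(1.0 + snr)

base = capacity(1000, 1)   # 1 kHz bandwidth, SNR 1
fast = capacity(2000, 1)   # doubling bandwidth doubles the rate
fine = capacity(1000, 3)   # quadrupling (1 + SNR) is needed to match that
```

Here `fast` and `fine` give the same rate, but the bandwidth route needs only a factor of 2 while the resolution route needs the effective signal power term to quadruple, illustrating why fast, coarse units can outperform slow, high-resolution ones.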