Neuromimetic Semantics: Coordination, Quantification, and Collective Predicates
By Harry Howard
- Publisher
- Elsevier B.V.
- Year
- 2004
- Language
- English
- Pages
- 555
- Category
- Library
Synopsis
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations: the degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
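The correlation-based representation described above can be illustrated with a small, hypothetical sketch (this is not the book's own encoding, only an assumed one for illustration): a two-part situation is encoded as two feature vectors, and Pearson correlation distinguishes an "and"-like input (parts covary maximally) from a "nor"-like input (parts maximally anticorrelated).

```python
import math
import statistics

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# "and"-like situation: the two parts covary maximally.
print(pearson([0, 1, 0, 1], [0, 1, 0, 1]))   # -> 1.0
# "nor"-like situation: the two parts are maximally anticorrelated.
print(pearson([0, 1, 0, 1], [1, 0, 1, 0]))   # -> -1.0
```

On this (assumed) encoding, "extracting an invariant feature" amounts to reading off a single scalar in [-1, 1] from arbitrarily rich situations.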
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of biological visual processing, especially the retinocortical and ventral ('what') parvocellular pathways; computational models of neural signaling, and in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs.
experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book's website.
- The discovery of several algorithmic similarities between vision and semantics.
- The support of these claims by means of simulations, and their packaging in a coherent theoretical framework.
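The two-layer learning story the synopsis describes (a competitive, selective layer followed by a generalizing one, trained by LVQ) can be sketched in miniature. The book's simulations are in MATLAB; what follows is a hypothetical Python sketch of plain LVQ1 on invented data, not the book's code: inputs standing in for correlated ("AND"-like) and anticorrelated ("NOR"-like) situations, one prototype per class, winner-take-all competition, and attraction/repulsion updates.

```python
import math

# Hypothetical training data (not from the book): pairs of situation
# features. Correlated pairs stand in for AND/ALL; anticorrelated
# pairs stand in for NOR/NO.
train = [
    ((1.0, 1.0), "AND"), ((0.9, 1.0), "AND"), ((1.0, 0.8), "AND"),
    ((1.0, -1.0), "NOR"), ((0.9, -1.0), "NOR"), ((1.0, -0.8), "NOR"),
]

# One prototype ("codebook vector") per class -- the competitive layer.
protos = {"AND": [0.5, 0.5], "NOR": [0.5, -0.5]}

def lvq1_step(x, label, lr=0.1):
    # Competitive step: the nearest prototype wins.
    winner = min(protos, key=lambda c: math.dist(protos[c], x))
    # LVQ1 update: attract the winner if its class matches the
    # teaching label, repel it otherwise.
    sign = 1.0 if winner == label else -1.0
    protos[winner] = [p + sign * lr * (xi - p)
                      for p, xi in zip(protos[winner], x)]

for _ in range(50):                 # a few epochs over the toy data
    for x, label in train:
        lvq1_step(x, label)

def classify(x):
    """Label a novel situation by its nearest prototype."""
    return min(protos, key=lambda c: math.dist(protos[c], x))

print(classify((0.95, 0.9)))    # highly correlated input   -> AND
print(classify((0.95, -0.9)))   # highly anticorrelated input -> NOR
```

The prototypes play the role the synopsis assigns to the selective first layer: each carves out a convex (Voronoi) region of the input space, which is the sense in which quantization yields discrete categories.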
Table of Contents
Cover......Page 1
Title Page......Page 5
Modest vs. robust theories of semantics......Page 7
Single neuron modeling......Page 9
The representation of coordinator meanings......Page 10
Neuromimetic networks for coordinator meanings......Page 11
Inferences among logical operators......Page 12
The failure of subalternacy......Page 13
Three generations of cognitive science......Page 14
MATLAB code......Page 15
Acknowledgements......Page 16
Table of contents......Page 17
1.1. The problem......Page 29
1.1.2. A modest solution: counting......Page 30
1.1.3. Finite automata for the logical coordinators......Page 33
1.1.4. A generalization to the logical quantifiers......Page 35
1.1.6. Set-theoretical alternatives......Page 36
1.2. Vision as an example of natural computation......Page 37
1.2.1. The retinogeniculate pathway......Page 38
1.2.2. Primary visual cortex......Page 43
1.2.2.1. Simple V1 cells......Page 47
1.2.2.2. Complex V1 cells......Page 51
1.2.2.3. The essential V1 circuit: selection and generalization......Page 54
1.2.2.4. Recoding to eliminate redundancy......Page 56
1.2.3.1. Feedforward along the dorsal and ventral streams......Page 63
1.2.3.2.1. Generative models and Bayesian inference......Page 66
1.2.3.2.2. Context......Page 72
1.2.3.2.3. Selective attention and dendritic processing......Page 75
1.2.4. Overview of the visual system......Page 79
1.2.4.2. Mereotopological organization......Page 80
1.3.1. Amit on biological plausibility......Page 82
1.3.2. Shastri on the logical problem of intelligent computation......Page 83
1.3.3. Touretzky and Eliasmith on knowledge representation......Page 85
1.3.4. Strong vs. weak modularity......Page 86
1.4. How to evaluate competing proposals......Page 88
1.4.1.1. Marr's three levels of analysis......Page 89
1.4.1.2. Tri-level analysis in the light of computational neuroscience......Page 90
1.4.1.3. The computational environment......Page 93
1.4.1.4. Accounting for the desiderata of natural computation......Page 94
1.4.2.1. Chomsky's levels of adequacy of a grammar......Page 95
1.4.2.2. Adequacy of natural (linguistic) computation......Page 96
1.4.3. Levels of adequacy as levels of analysis......Page 98
1.4.4. Summary of five-level theory......Page 99
1.5. The competence/performance distinction......Page 100
1.5.1. Competence and tri-level theory......Page 101
1.5.2. Problems with the competence/performance distinction......Page 103
1.5.3. A nongenerative/experiential alternative......Page 104
1.6.1. The environmental causes of linguistic meaning......Page 106
1.6.2. Preprocessing to extract correlational invariances......Page 107
1.7. Where to go next......Page 110
2.1.1. The structure of the cell membrane......Page 112
2.1.2. Ion channels and chemical and electrical gradients......Page 113
2.2.1. The four-equation, Hodgkin-Huxley model......Page 115
2.2.2. Electrical and hydraulic models of the cell membrane......Page 116
2.2.2.1. The main voltage equation (at equilibrium)......Page 117
2.2.2.3. The three conductance equations......Page 120
2.2.2.4. Hodgkin-Huxley oscillations......Page 124
2.2.2.5. Simplifications and approximations......Page 126
2.2.3.1. Rate-constant interactions and the elimination of two variables......Page 127
2.2.3.2. The fast-slow system......Page 128
2.2.3.3. The FitzHugh-Nagumo model......Page 130
2.2.3.4. FitzHugh-Nagumo models of Type I neurons......Page 135
2.2.3.5. Neuron typology......Page 136
2.2.4. From two to one: The integrate-and-fire model......Page 138
2.2.4.1. Temporal or correlational coding......Page 139
2.2.5. From one to zero: Firing-rate models......Page 140
2.2.6. Summary and transition......Page 141
2.3.1. Dendrites......Page 142
2.3.2. Passive cable models of dendritic electrical function......Page 143
2.3.2.1. Equivalent cables/cylinders......Page 144
2.3.2.2. Passive cable properties and neurite typology......Page 145
2.4. Transmission of signals from cell to cell: the synapse......Page 147
2.4.1. Chemical modulation of synaptic transmission......Page 148
2.4.2. Synaptic efficacy......Page 150
2.4.3. Synaptic plasticity, long-term potentiation, and learning......Page 151
2.4.4. Models of diffusion......Page 153
2.4.5. Calcium accumulation and diffusion in spines......Page 157
2.5. Summary: the classical neuromimetic model......Page 158
2.5.1. The classical model......Page 160
2.5.2. Activation functions......Page 161
2.6. Expanded models......Page 163
2.6.1.1. Voltage-gated channels and compartmental models......Page 164
2.6.1.3. Dendritic spines as logic gates......Page 166
2.6.2. Synaptic stability......Page 167
2.6.3. The alternative of synaptic (or spinal) clustering......Page 168
2.7. Summary and transition......Page 169
3.1.1. Unsigned measures......Page 171
3.1.2. Unsigned measures in language and the problem of complementation......Page 174
3.1.3. Signed measures, signed algebras, and signed lattices......Page 175
3.1.4. Response to those who do not believe in signs......Page 178
3.1.5. Bivalent vs. trivalent logic......Page 179
3.1.6. An interim summary to introduce the notion of spiking measures......Page 181
3.1.7. The logical operators as measures......Page 183
3.2.1. Conditional cardinality......Page 184
3.2.1.1. Cardinality invariance......Page 187
3.2.2. Statistics......Page 189
3.2.2.1. Initial concepts: mean, deviation, variance......Page 190
3.2.2.2. Covariance and correlation......Page 192
3.2.3.1. Unconditional probability......Page 195
3.2.3.2. Conditional probability and the logical quantifiers......Page 197
3.2.3.3. Signed probability and the negative quantifiers......Page 198
3.2.4.1. Syntactic information......Page 199
3.2.4.2. Entropy and conditional entropy......Page 200
3.2.4.3. Semantic information......Page 202
3.2.5.1. Vectors......Page 203
3.2.5.2. Length and angle in polar space......Page 206
3.2.5.3.1. Logical operators as rays......Page 207
3.2.5.3.2. Scalar multiplication......Page 208
3.2.5.3.3. Normalization of a vector, sines and cosines......Page 209
3.2.5.4. Vector space and vector semantics......Page 210
3.2.6. Bringing statistics and vector algebra together......Page 211
3.3.1. A one-dimensional order topology......Page 213
3.3.2. A two-dimensional order topology......Page 215
3.3.3. The order-theoretic definition of a lattice......Page 216
3.4. Discreteness and convexity......Page 217
3.4.1. Voronoi tesselation......Page 218
3.4.2. Vector quantization......Page 219
3.4.3. Voronoi regions as attractor basins......Page 221
3.4.4. Tesselation and quantization: from continuous to discrete......Page 222
3.4.5. Convexity and categorization......Page 223
3.5. Semantic definitions of the logical operators......Page 224
3.5.2.1. Logical operators as edge detectors......Page 225
3.5.2.2. Logical operators as polarity detectors......Page 226
3.5.3. Summary and comparison to Horn's scale......Page 228
3.5.4.1. Vague quantifiers......Page 230
3.6.1. Negative uninformativeness......Page 231
3.6.2. Quantifying negative uninformativeness......Page 233
3.6.3. Horn on implicatures......Page 234
3.6.4. Quantifying the Q implicature......Page 235
3.6.6. Quantifying the usage of logical quantifiers......Page 236
3.7. Summary: What is logicality?......Page 239
4.1. The coordination of major categories......Page 241
4.2.1.1. Verbal predicates as patterns in a space of observations......Page 242
4.2.1.2. Coordinated names and other DPs......Page 243
4.2.1.3. A first mention of coordination and collectivity......Page 245
4.2.1.5. Coordinated common nouns......Page 246
4.2.1.6. Coordinated verbs......Page 247
4.2.1.7. Coordination beyond the monovalent predicate......Page 248
4.2.1.8. Multiple coordination and respectively......Page 249
4.2.2.1. Coordinated adjectivals......Page 250
4.3. Clausal coordination......Page 252
4.3.1. Conjunction reduction as vector addition......Page 253
4.3.2.1. Asymmetric coordination......Page 254
4.3.2.2. Kehler's coherence relations......Page 256
4.3.2.2.1. The data structure......Page 257
4.3.2.2.2. Coherence relations of Resemblance......Page 258
4.3.2.2.3. Coherence relations of Cause-Effect......Page 264
4.3.2.2.4. Coherence relations of Contiguity......Page 267
4.3.2.2.5. Summary......Page 269
4.3.2.3. Asymmetric coordination in Relevance Theory......Page 270
4.3.2.4. The Common-Topic Constraint......Page 271
4.3.3. Summary of clausal coordination......Page 272
4.4. Lexicalization of the logical operators......Page 273
4.4.2. Conversational implicature: from sixteen to three......Page 274
4.4.3. Neuromimetics: from sixteen to four......Page 275
4.5. OR versus XOR......Page 276
4.6. Summary......Page 278
5.1. A first step towards pattern-classification semantics......Page 280
5.2. Learning rules and cerebral subsystems......Page 282
5.3. Error-correction and hyperplane learning......Page 285
5.3.1. McCulloch and Pitts (1943) on the logical connectives......Page 286
5.3.2. Single-layer perceptron (SLP) networks......Page 288
5.3.2.1. SLP classification of the logical coordinators......Page 289
5.3.2.2. SLP error correction......Page 292
5.3.2.3. SLPs and unnormalized coordinators......Page 293
5.3.2.4. SLPs for the normalized logical coordinators......Page 296
5.3.2.5. Linear separability and XOR......Page 297
5.3.3.1. Multilayer perceptrons......Page 298
5.3.3.3. Learning by backpropagation of errors......Page 299
5.3.4. The implausibility of non-local learning rules......Page 300
5.4. Unsupervised learning......Page 301
5.4.1. The Hebbian learning rule......Page 302
5.4.2. Instar networks......Page 303
5.4.2.1. Introduction to the instar rule......Page 304
5.4.2.2. An instar simulation of the logical coordinators......Page 306
5.4.3. Unsupervised competitive learning......Page 307
5.4.3.1. A competitive simulation of the logical coordinators......Page 308
5.4.3.2. Quantization, Voronoi tesselation, and convexity......Page 309
5.5.1. A supervised competitive network and how it works......Page 310
5.5.2. An LVQ simulation of the logical coordinators......Page 312
5.5.3. Interim summary and comparison of LVQ to MLP......Page 313
5.5.4. LVQ in a broader perspective......Page 314
5.6. Dendritic processing......Page 315
5.6.1. From synaptic to dendritic processing......Page 316
5.6.2. Clustering of spines on a dendrite......Page 317
5.7. Summary......Page 322
6.1.2. Conjunctive vs. disjunctive contexts......Page 323
6.1.3. When coordination ≈ quantification......Page 325
6.2.1. Introduction to quantifier meanings......Page 327
6.2.3. QUANT, EXT, CONS, and the Tree of Numbers......Page 329
6.2.3.1. Quantity......Page 330
6.2.3.2. Extension......Page 331
6.2.3.3. Conservativity......Page 332
6.2.3.4. The Tree of Numbers......Page 333
6.2.4. The neuromimetic perspective......Page 337
6.2.4.2. The form of a quantified clause: Quantifier Raising......Page 338
6.2.4.3.2 Extension and normalization......Page 341
6.2.4.3.3 CONS and labeled lines......Page 342
6.2.5. The origin, presupposition failure, and non-correlation......Page 344
6.2.6. Triviality......Page 345
6.2.6.1. Triviality and object recognition......Page 347
6.2.6.2. Continuity of non-triviality and logicality......Page 348
6.2.6.3. Continuity and the order topology......Page 349
6.2.7.1. FIN, density, and approximation......Page 350
6.3. Strict vs. loose readings of universal quantifiers......Page 351
6.4. Summary......Page 352
7.1.1. Perfect data, less than perfect data, and convex decision regions......Page 354
7.1.2. Weight decay and lateral inhibition......Page 356
7.1.3. Accuracy and generalization......Page 357
7.2.1. Three-dimensional data......Page 359
7.2.2. Antiphase complementation......Page 360
7.2.3. Selective attention......Page 361
7.3. Invariant extraction in L2......Page 363
7.4. Summary......Page 365
8.1. Inferences among logical operators......Page 367
8.1.1. The Square of Opposition for quantifiers......Page 368
8.1.2. A Square of Opposition for coordinators......Page 370
8.1.3.1. Syntactic/proof-theoretic deduction......Page 373
8.1.3.2. Semantic/model-theoretic deduction and Mental Models......Page 374
8.1.3.3. Modest vs. robust deduction?......Page 375
8.2.1. Shastri on connectionist reasoning......Page 376
8.2.2. Jackendoff (2002) on the organization of a grammar......Page 377
8.2.3. Spreading Activation Grammar......Page 378
8.2.4. Interactive Competition and Activation......Page 380
8.2.4.2. The calculation of input to a unit......Page 382
8.2.4.4. The evolution of change in activation of a network......Page 383
8.2.5. Activation spreading from semantics to phonology......Page 385
8.2.5.1. The challenge of negation......Page 386
8.2.6. Activation spreading from phonology to semantics......Page 388
8.2.7. Extending the network beyond the preprocessing module......Page 389
8.3.1. Subaltern oppositions......Page 390
8.3.2. Contradictory oppositions......Page 391
8.3.3. (Sub)contrary oppositions......Page 393
8.4. NALL and temporal limits on natural operators......Page 394
8.4.1. Comparisons to other approaches......Page 395
8.5. Summary......Page 396
9.1. Constructions which block the subaltern implication......Page 397
9.1.1. Classes of collectives and symmetric predicates......Page 398
9.2.1. A logical/diagrammatic representation of reciprocity......Page 399
9.2.3. Anaphora in SAG......Page 404
9.2.4. Comments on the SAG analysis of anaphora......Page 405
9.2.5. The contextual elimination of reciprocal links......Page 407
9.2.6. The failure of reciprocal subalternacy......Page 408
9.2.7. Reflexives and reciprocals pattern together......Page 409
9.3.1. Initial characterization and paths......Page 410
9.3.2.1. Verbs of intersection......Page 412
9.3.2.2. Resultative together......Page 414
9.3.2.3. Verbs of congregation......Page 415
9.3.2.4. Summary of centrifugal constructions......Page 417
9.3.3.1. Verbs of separation......Page 418
9.3.3.2. Verbs of extraction and the ablative alternation......Page 420
9.3.3.4. Verbs of dispersion......Page 422
9.3.3.5. Summary of centripetal constructions......Page 423
9.4. Center-oriented constructions as paths......Page 424
9.4.1. Covert reciprocity......Page 426
9.4.3. Paths and gestalt locations......Page 427
9.5. Summary......Page 429
10.1.1.1. Broca's aphasia and Broca's region......Page 431
10.1.1.2. Wernicke's aphasia and Wernicke's region......Page 433
10.1.1.3. Other regions......Page 434
10.1.1.4. The Wernicke-Lichtheim-Geschwind boxological model......Page 435
10.1.1.5. Cytoarchitecture and Brodmann's areas......Page 436
10.1.1.6. Cytoarchitecture and post-mortem observations......Page 437
10.1.1.8. The advent of commissurotomy......Page 438
10.1.1.9.1. Dichotic listening......Page 439
10.1.1.9.2. An aside on the right-ear advantage......Page 440
10.1.1.10. Pop-culture lateralization and beyond......Page 441
10.1.1.11.2. MRI and fMRI......Page 444
10.1.1.11.3. Results for language......Page 446
10.1.1.12. Computational modeling......Page 447
10.1.2.1. Word comprehension and the lateralization of lexical processing......Page 448
10.1.2.2. Where are content words stored?......Page 451
10.1.2.3. Where are function words stored?......Page 452
10.1.2.4. Function-word operations and anterior/posterior computation......Page 453
10.1.2.4.1. Goertzel's dual network model......Page 454
10.1.2.5. Some evidence for the weak modularity of language circuits......Page 455
10.1.2.7. BA 44 vs. BA 47......Page 457
10.2.1.1. Quantitative memory: memory storage......Page 458
10.2.1.2. Qualitative memory: declarative vs. non-declarative......Page 460
10.2.1.3. Synthesis......Page 462
10.2.2. A network for episodic memory......Page 463
10.2.2.1. The hippocampal formation and Shastri's SMRITI......Page 465
10.2.2.2.1. The dentate gyrus......Page 467
10.2.2.2.2. An integrate-and-fire alternative......Page 468
10.2.2.2.3. The dentate gyrus and coordinator meanings......Page 469
10.2.2.3. Discussion......Page 474
10.3. Summary......Page 475
11. Three generations of Cognitive Science......Page 476
11.1.1.2. First order syntax......Page 477
11.1.1.3. First order semantics......Page 479
11.1.2. More on the ontology......Page 480
11.1.3. Classical categorization and semantic features......Page 481
11.1.4. Objectivist metaphysics......Page 482
11.1.5. An example: the spatial usage of in......Page 483
11.2.2. Problems with set-theory as an ontology......Page 484
11.3. Gen II: The Embodied and Imaginative Mind......Page 485
11.3.2. Image-schematic semantics......Page 486
11.3.3. Image-schemata and spatial in......Page 488
11.3.4. Image-schematic quantification......Page 489
11.4.1. The math phobia of image-schematic semantics......Page 490
11.5. Gen III: The Imaged and Simulated Brain......Page 492
11.5.1. The microstructure of cognition......Page 493
11.5.2.1.1. Conceptual Spaces......Page 494
11.5.2.1.2. Properties in conceptual space......Page 495
11.5.2.1.3. Prototypes and Voronoi tesselation......Page 496
11.5.2.2.2. Mereotopological notions of Eschenbach (1994)......Page 497
11.5.2.2.3. LVQ mereotopology......Page 500
11.5.3.1. Neural plausibility......Page 501
11.5.3.1.1. Interactivity......Page 502
11.5.3.2. Self-organization......Page 503
11.5.3.2.2. Approximation of the input space......Page 504
11.5.3.3.2. Content-addressability......Page 505
11.5.3.3.3. Pattern completion......Page 506
11.5.3.5. Exemplar-based categorization......Page 507
11.5.3.6. LVQ and the evolution of language......Page 508
11.6. Summary......Page 509
A......Page 511
B......Page 512
C......Page 515
D......Page 516
F......Page 518
G......Page 520
H......Page 522
I......Page 523
K......Page 524
L......Page 526
M......Page 528
N......Page 531
P......Page 532
R......Page 534
S......Page 536
T......Page 539
V......Page 540
Z......Page 541
B......Page 543
C......Page 544
D......Page 545
F......Page 546
I......Page 547
L......Page 548
M......Page 549
N......Page 550
P......Page 551
R......Page 552
S......Page 553
T......Page 554
Z......Page 555
SIMILAR VOLUMES
A central problem in the design of programming systems is to provide methods for verifying that computer code performs to specification. This book presents a rigorous foundation for defining Boolean categories, in which the relationship between specification and behaviour can be explored. …
This study is an attempt to explain coordinate conjoining as a rule-governed process of establishing specific semantic relations within and between sentences. Coordination is thus conceived of both as a basic device of linguistic complex formation and as a rather fundamental principle underlying the …