Compressed Sensing and its Applications: Second MATHEON Conference 2015

✍ Boche H. (ed.)


Publisher
Birkhäuser
Year
2015
Language
English
Pages
402
Series
Applied and Numerical Harmonic Analysis
Category
Library


✦ Table of Contents


ANHA Series Preface......Page 6
Preface......Page 9
Contents......Page 12
On the Global-Local Dichotomy in Sparsity Modeling......Page 19
1.1 The Need for a New Local-Global Sparsity Theory......Page 20
2 Local-Global Sparsity......Page 21
2.2 Globalized Local Model......Page 22
2.3 Uniqueness and Stability......Page 27
3 Pursuit Algorithms......Page 30
3.1 Global (Oracle) Projection, Local Patch Averaging (LPA) and the Local-Global Gap......Page 31
3.3 Globalized Pursuits......Page 33
3.3.2 ADMM-Inspired Approach......Page 34
4.1 Piecewise Constant (PWC) Signals......Page 36
4.2 Signature-Type Dictionaries......Page 39
4.2.1 Multi-Signature Dictionaries......Page 40
4.3 Convolutional Dictionaries......Page 42
5.1 Signature-Type Signals......Page 44
5.1.2 Noiseless Case......Page 45
5.1.3 Noisy Case......Page 46
5.2.1 Synthetic Data......Page 48
6 Discussion......Page 49
6.2 Further Extensions......Page 50
6.3 Learning Models from Data......Page 51
Appendix B: Proof of Lemma 2......Page 52
Appendix C: Proof of Theorem 6......Page 56
Appendix D: Proof of Theorem 8......Page 58
Local Support Dependencies......Page 61
Constructing "Good" Dictionaries......Page 63
Generating Signals......Page 65
Further Remarks......Page 67
References......Page 68
Fourier Phase Retrieval: Uniqueness and Algorithms......Page 72
1 Introduction......Page 73
2 Problem Formulation......Page 74
3.1 Trivial and Non-Trivial Ambiguities......Page 76
3.2 Ensuring Uniqueness in Classical Phase Retrieval......Page 78
3.2.1 Information About Some Entries of the True Signal......Page 79
3.2.2 Sparse Signals......Page 80
3.2.3 Minimum Phase Signals......Page 81
3.3 Phase Retrieval with Deterministic Masks......Page 82
3.4 Phase Retrieval from STFT Measurements......Page 85
3.5 FROG Methods......Page 87
3.6 Multidimensional Phase Retrieval......Page 88
4 Phase Retrieval Algorithms......Page 90
4.1 Alternating Projection Algorithms......Page 92
4.2 Semidefinite Relaxation Algorithms......Page 94
4.3 Additional Non-Convex Algorithms......Page 97
4.4 Algorithms for Sparse Signals......Page 99
5 Conclusion......Page 101
References......Page 102
1 Introduction......Page 109
1.1 Compressed Sensing for High-Dimensional Approximation......Page 110
1.2 Structured Sparsity......Page 111
1.4 Main Results......Page 112
1.5 Existing Literature......Page 113
2.1 Setup and Notation......Page 114
2.2 Regularity and Best k-Term Approximation......Page 115
2.3 Lower Sets and Structured Sparsity......Page 116
3.1 Exploiting Lower Set-Structured Sparsity......Page 118
3.2 Choosing the Optimization Weights: Nonuniform Recovery......Page 120
3.3 Comparison with Oracle Estimators......Page 121
3.4 Sample Complexity for Lower Sets......Page 122
3.5 Quasi-Optimal Approximation: Uniform Recovery......Page 124
3.6 Unknown Errors, Robustness, and Interpolation......Page 127
3.7 Numerical Results......Page 131
4 Conclusions and Challenges......Page 136
References......Page 137
Multisection in the Stochastic Block Model Using Semidefinite Programming......Page 141
1 Introduction......Page 142
1.1 Related Previous and Parallel Work......Page 144
1.2 Preliminaries......Page 146
2 SDP Relaxations and Main Results......Page 147
3.1 Proof of Optimality: Theorem 1......Page 151
3.2 Proof of Optimality: Theorem 2......Page 153
3.3 Proof of Theorem 3......Page 157
3.4 Proof of Theorem 4......Page 165
4 Note About the Monotone Adversary......Page 169
5 Experimental Evaluation......Page 170
6 The Multireference Alignment SDP for Clustering......Page 171
Appendix......Page 175
References......Page 176
Recovering Signals with Unknown Sparsity in Multiple Dictionaries......Page 179
1.1 2-Constrained Regularization......Page 180
1.2 Sparsity-Inducing Composite Regularizers......Page 181
1.4 Related Work......Page 182
2 The Co-L1 Algorithm......Page 184
2.1 Log-Sum MM Interpretation of Co-L1......Page 186
2.2 Convergence of Co-L1......Page 187
2.4 Bayesian MAP Interpretation of Co-L1......Page 188
2.5 Variational EM Interpretation of Co-L1......Page 189
2.6 Co-L1 for Complex-Valued x......Page 191
2.7 New Interpretations of the IRW-L1 Algorithm......Page 192
3 The Co-IRW-L1 Algorithm......Page 193
3.1 Log-Sum-Log MM Interpretation of Co-IRW-L1-δ......Page 195
3.4 Bayesian MAP Interpretation of Co-IRW-L1-δ......Page 196
3.5 Variational EM Interpretation of Co-IRW-L1-δ......Page 197
3.7 Co-IRW-L1 for Complex-Valued x......Page 199
4 Numerical Results......Page 200
4.2 Synthetic 2D Finite-Difference Signals......Page 201
4.3 Shepp-Logan and Cameraman Recovery......Page 203
4.4 Dynamic MRI......Page 204
4.5 Algorithm Runtime......Page 207
5 Conclusions......Page 208
References......Page 209
Compressive Classification and the Rare Eclipse Problem......Page 212
2 Our Model and Related Work......Page 213
3 Theoretical Results......Page 215
3.2 The Case of Two Ellipsoids......Page 217
3.3 The Case of Multiple Convex Sets......Page 221
4.1 Comparison Using Toy Examples......Page 223
4.2 Simulations with Hyperspectral Data......Page 225
5 Future Work......Page 228
6.1 Proof of Gordon's Escape Through a Mesh Theorem......Page 229
6.2 Proof of Lemma 1......Page 230
6.3 Proof of Theorem 2......Page 231
References......Page 234
1 Introduction......Page 236
2 Preliminaries......Page 237
3.1 Real Case......Page 239
4 Weak Phaseless Reconstruction......Page 243
5 Illustrative Examples......Page 246
References......Page 249
1 Introduction......Page 250
2.1 Reconstructing Sparse Distributions from Moments......Page 252
2.2 Dimension Reduction......Page 254
3.1 Moments and Spanning Sets......Page 255
3.2 Frames for Polynomial Spaces......Page 256
4.1 Frames and Cubatures on the Sphere and Beyond......Page 258
4.2 Moment Reconstruction with Cubatures in Grassmannians......Page 260
5.1 Numerical Construction of Cubatures......Page 261
5.2 Cubatures for Approximation of Integrals......Page 262
5.3 Cubatures for Function Approximation......Page 264
5.4 Cubatures as Efficient Coverings......Page 266
5.5 Cubatures for Phase Retrieval......Page 267
6 Cubatures of Varying Ranks......Page 268
References......Page 272
1 Introduction......Page 275
1.1 Tensor Product Spaces......Page 277
1.2 Tensor Contractions and Diagrammatic Notation......Page 278
2 Low-Rank Tensor Decompositions......Page 280
2.1 Tensor Train Format......Page 283
3.1 Randomized SVD for Matrices......Page 288
3.2 Randomized TT-SVD......Page 290
4 Relation to the Alternating Least Squares (ALS) Algorithm......Page 295
5 Numerical Experiments......Page 296
5.1 Approximation Quality for Nearly Low-Rank Tensors......Page 297
5.2 Approximation Quality with Respect to Oversampling......Page 298
5.3 Approximation Quality with Respect to the Order......Page 299
5.4 Computation Time......Page 300
6 Conclusions and Outlook......Page 301
References......Page 303
Versatile and Scalable Cosparse Methods for Physics-Driven Inverse Problems......Page 305
1 Introduction......Page 306
2.1 Linear PDEs......Page 307
2.2 Green's Functions......Page 308
2.3 Linear Inverse Problem......Page 309
3.1 Acoustic Source Localization from Microphone Measurements......Page 310
3.2 Brain Source Localization from EEG Measurements......Page 312
4.1 Finite-Difference Methods (FDM)......Page 313
4.2 Finite Element Methods (FEM)......Page 315
4.3 Numerical Approximations of Green's Functions......Page 317
4.4 Discretized Inverse Problem......Page 318
5.1 Optimization Problems......Page 319
5.2 Optimization Algorithm......Page 320
5.3 Computational Complexity......Page 324
6 Scalability......Page 325
6.1 Analysis vs Synthesis......Page 326
6.2 Multiscale Acceleration......Page 329
7 Versatility......Page 331
7.1 Blind Acoustic Source Localization......Page 332
7.2 Cosparse Brain Source Localization......Page 337
8 Summary and Conclusion......Page 342
References......Page 343
1 Introduction......Page 347
2.1 Sufficient Recovery Conditions......Page 350
2.2 Recovery from Gaussian Measurements......Page 352
2.3 Recovery from Haar-Incoherent Measurements......Page 353
2.4 Recovery from Subsampled Fourier Measurements......Page 355
3 TV Recovery from Subgaussian Measurements in 1D......Page 357
3.1 M* Bounds and Recovery......Page 358
3.2 The Mean Width of Gradient Sparse Vectors in 1D......Page 360
3.3 The Extension to Gradient Compressible Vectors Needs a New Approach......Page 363
3.4 Exact Recovery......Page 365
3.5 Subgaussian Measurements......Page 367
References......Page 370
Compressed Sensing in Hilbert Spaces......Page 373
1.1 Observation Model and Low-Complexity Signals......Page 374
1.2 Decoders......Page 375
1.3 The RIP: A Tool for the Study of Signal Recovery......Page 376
1.4 A General Compressed Sensing Framework......Page 377
2.1 Definition and Examples......Page 378
2.2 Structured Sparsity …......Page 379
2.3 …in Levels......Page 380
3 Dimension Reduction with Random Linear Operators......Page 381
3.1 Projection on a Finite-Dimensional Subspace......Page 382
3.2.1 Randomized Dimension Reduction......Page 383
3.2.2 Some Examples......Page 385
4.1 Convex Decoders and Atomic Norms......Page 386
4.1.2 Group Norms in Levels......Page 387
4.1.3 Atomic Norm Associated with a Union of Subspace Model......Page 388
4.2.1 Stable Recovery in the Presence of Noise......Page 389
4.2.3 Example: The Case of Sparsity in Levels......Page 390
4.3 Definition and Calculation of δΣ(f)......Page 392
5.1 A Flexible Way to Guarantee Recovery......Page 393
5.3 Extensions......Page 394
5.5 New Frontiers: Super-Resolution and Compressive Learning......Page 395
References......Page 396
Applied and Numerical Harmonic Analysis (87 Volumes)......Page 399


📜 SIMILAR VOLUMES


Compressed Sensing and its Applications:
✍ Holger Boche, Robert Calderbank, Gitta Kutyniok, Jan Vybíral (eds.) 📂 Library 📅 2015 🏛 Birkhäuser Basel 🌐 English

Since publication of the initial papers in 2006, compressed sensing has captured the imagination of the international signal processing community, and the mathematical foundations are nowadays quite well understood. Parallel to the progress in mathematics, the potential applications of compressed sensing …

Compressed Sensing and its Applications:
✍ Holger Boche, Giuseppe Caire, Robert Calderbank, Maximilian März, Gitta Kutyniok, Rud 📂 Library 📅 2017 🏛 Birkhäuser Basel 🌐 English

This contributed volume contains articles written by the plenary and invited speakers from the second international MATHEON Workshop 2015 that focus on applications of compressed sensing. Article authors address their techniques for solving the problems of compressed sensing, as well as connections …

Compressed Sensing and Its Applications:
✍ Holger Boche, Giuseppe Caire, Robert Calderbank, Gitta Kutyniok, Rudolf Mathar, 📂 Library 📅 2019 🏛 Springer International Publishing; Birkhäuser 🌐 English

The chapters in this volume highlight the state-of-the-art of compressed sensing and are based on talks given at the third international MATHEON conference on the same topic, held from December 4-8, 2017 at the Technical University in Berlin. In addition to methods in compressed sensing, chapters …