Handbook of Blind Source Separation: Independent Component Analysis and Applications
A classic foreign-press textbook on blind source separation (ISBN 978-0-12-374726-6).

Contents

About the editors
Preface
Contributors

CHAPTER 1 Introduction
1.1 Genesis of blind source separation
    1.1.1 A biological problem
    1.1.2 Contextual difficulties
    1.1.3 A few historical notes
1.2 Problem formalization
    1.2.1 Invertible mixtures
    1.2.2 Underdetermined mixtures
1.3 Source separation methods
    1.3.1 Independent component analysis
    1.3.2 Non-temporally iid sources
    1.3.3 Other approaches
1.4 Spatial whitening, noise reduction and PCA
1.5 Applications
1.6 Content of the handbook
References

CHAPTER 2 Information
2.1 Introduction
2.2 Methods based on the mutual information
    2.2.1 Mutual information between random vectors
    2.2.2 The mixing model and separation criterion
    2.2.3 Empirical criteria and entropy estimators
    2.2.4 Computation of entropy estimators
    2.2.5 Minimization of the empirical criteria
    2.2.6 Statistical performance
2.3 Methods based on the mutual information rate
    2.3.1 Entropy and mutual information rate
    2.3.2 Contrasts
    2.3.3 Estimation and separation
2.4 Conclusion and perspectives
References

CHAPTER 3 Contrasts
3.1 Introduction
    3.1.1 Model and notation
    3.1.2 Principle of contrast functions
    3.1.3 Bibliographical remarks
3.2 Cumulants
3.3 MISO contrasts
    3.3.1 MISO contrasts for static mixtures
    3.3.2 Deflation principle
    3.3.3 MISO contrasts for convolutive mixtures
3.4 MIMO contrasts for static mixtures
    3.4.1 Introduction
    3.4.2 Contrasts for MIMO static mixtures
    3.4.3 Contrasts and joint diagonalization
    3.4.4 Non-symmetric contrasts
    3.4.5 Contrasts with reference signals
3.5 MIMO contrasts for dynamic mixtures
    3.5.1 Space-time whitening
    3.5.2 Contrasts for MIMO convolutive mixtures
    3.5.3 Contrasts and joint diagonalization
    3.5.4 Non-symmetric contrasts
3.6 Constructing other contrast criteria
3.7 Conclusion
References

CHAPTER 4 Likelihood
4.1 Introduction: Models and likelihood
4.2 Transformation model and equivariance
    4.2.1 Transformation likelihood
    4.2.2 Transformation contrast
    4.2.3 Relative variations
    4.2.4 The iid Gaussian model and decorrelation
    4.2.5 Equivariant estimators and uniform performance
    4.2.6 Summary
4.3 Independence
    4.3.1 Score function and estimating equations
    4.3.2 Mutual information
    4.3.3 Mutual information, correlation and Gaussianity
    4.3.4 Summary
4.4 Identifiability, stability, performance
    4.4.1 Elements of asymptotic analysis
    4.4.2 Fisher information matrix
    4.4.3 Blind identifiability
    4.4.4 Asymptotic performance
    4.4.5 When the source model is wrong
    4.4.6 Relative gradient and natural gradient
    4.4.7 Summary
4.5 Non-Gaussian models
    4.5.1 The likelihood contrast in iid models
    4.5.2 Score functions and estimating equations
    4.5.3 Gaussianity index
    4.5.4 Cramér-Rao bound
    4.5.5 Asymptotic performance
    4.5.6 Adaptive scores
    4.5.7 Algorithms
4.6 Gaussian models
    4.6.1 …
    4.6.2 Fisher information and diversity
    4.6.3 Gaussian contrasts
    4.6.4 In practice: Localized Gaussian models
    4.6.5 Choosing the diagonalizing transform
4.7 Noisy models
    4.7.1 Source estimation from noisy mixtures
    4.7.2 Noisy likelihood for localized Gaussian models
    4.7.3 Noisy likelihood and hidden variables
4.8 Conclusion: a general view
    4.8.1 Unity
    4.8.2 Diversity
4.9 Appendix: Proofs
References

CHAPTER 5 Algebraic methods after prewhitening
5.1 Introduction
    5.1.1 Multilinear algebra
    5.1.2 Higher-order statistics
    5.1.3 Jacobi iteration
5.2 Independent component analysis
    5.2.1 Algebraic formulation
    5.2.2 Step 1: Prewhitening
    5.2.3 Step 2: Fixing the rotational degrees of freedom using the higher-order cumulant
5.3 Diagonalization in least squares sense
    5.3.1 Third-order real case
    5.3.2 Third-order complex case
    5.3.3 Fourth-order real case
    5.3.4 Fourth-order complex case
5.4 Simultaneous diagonalization of matrix slices
    5.4.1 Real case
    5.4.2 Complex case
5.5 Simultaneous diagonalization of third-order tensor slices
5.6 Maximization of the tensor trace
References

CHAPTER 6 Iterative algorithms
6.1 Introduction
6.2 Model and goal
6.3 Contrast functions for iterative BSS/ICA
    6.3.1 Information-theoretic contrasts
    6.3.2 Cumulant-based approximations
    6.3.3 Contrasts for source extraction
    6.3.4 Nonlinear function approximations
6.4 Iterative search algorithms: generalities
    6.4.1 Batch methods
    6.4.2 Stochastic optimization
    6.4.3 Batch or adaptive estimates?
6.5 Iterative whitening
6.6 Classical adaptive algorithms
    6.6.1 Hérault-Jutten algorithm
    6.6.2 Self-normalized networks
    6.6.3 Adaptive algorithms based on contrasts
    6.6.4 Adaptive algorithms based on centroids
6.7 Relative (natural) gradient techniques
    6.7.1 Relative gradient and serial updating
    6.7.2 Adaptive algorithms based on the relative gradient
    6.7.3 Likelihood maximization with the relative gradient
6.8 Adapting the nonlinearities
6.9 Iterative algorithms based on deflation
    6.9.1 Adaptive deflation algorithm by Delfosse-Loubaton
    6.9.2 Regression-based deflation
    6.9.3 Deflationary orthogonalization
6.10 The FastICA algorithm
    6.10.1 Introduction
    6.10.2 Implicit adaptation of the contrast in FastICA
    6.10.3 Derivation of FastICA as a Newton iteration
    6.10.4 Connection to gradient methods
    6.10.5 Convergence of FastICA
    6.10.6 FastICA using cumulants
    6.10.7 Variants of FastICA
6.11 Iterative algorithms with optimal step size
    6.11.1 Optimizing the step size
    6.11.2 The RobustICA algorithm
6.12 Summary, conclusions and outlook
References

CHAPTER 7 Second-order methods based on color
7.1 Introduction
7.2 WSS processes
    7.2.1 Parametric WSS processes
7.3 Problem formulation, identifiability and bounds
    7.3.1 Indeterminacies and identifiability
    7.3.2 Performance measures and bounds
7.4 Separation based on joint diagonalization
    7.4.1 On exact and approximate joint diagonalization
    7.4.2 The first JD-based method
    7.4.3 AMUSE and its modified versions
    7.4.4 SOBI, TDSEP and modified versions
7.5 Separation based on maximum likelihood
    7.5.1 The QML approach
    7.5.2 The EML approach
    7.5.3 The GMI approach
7.6 Additional issues
    7.6.1 The effect of additive noise
    7.6.2 Non-stationary sources, time-varying mixtures
    7.6.3 Complex-valued sources
References

CHAPTER 8 Convolutive mixtures
8.1 Introduction and mixture model
    8.1.1 Model and notations
    8.1.2 Chapter organization
8.2 Invertibility of convolutive MIMO mixtures
    8.2.1 General results
    8.2.2 FIR systems and polynomial matrices
8.3 Assumptions
    8.3.1 Fundamental assumptions
    8.3.2 Indeterminacies
    8.3.3 Linear and nonlinear sources
    8.3.4 Separation condition
8.4 Joint separating methods
    8.4.1 Whitening
    8.4.2 Time domain approaches
    8.4.3 Frequency domain approaches
8.5 Iterative and deflation methods
    8.5.1 Extraction of one source
    8.5.2 Deflation
8.6 Non-stationary context
    8.6.1 Context
    8.6.2 Some properties of cyclostationary time-series
    8.6.3 Direct extension of the results of Section 8.5 for the source extraction, and why it is difficult to implement
    8.6.4 A function to minimize: a contrast?
References

CHAPTER 9 Algebraic identification of under-determined mixtures
9.1 Observation model
9.2 Intrinsic identifiability
    9.2.1 Equivalent representations
    9.2.2 Main theorem
    9.2.3 Core equation
    9.2.4 Identifiability in the 2-dimensional case
9.3 Problem formulation
    9.3.1 Approach based on derivatives of the joint characteristic function
    9.3.2 Approach based on cumulants
9.4 Higher-order tensors
    9.4.1 Canonical tensor decomposition
    9.4.2 Essential uniqueness
    9.4.3 Computation
9.5 Tensor-based algorithms
    9.5.1 Vector and matrix representations
    9.5.2 The 2-dimensional case
    9.5.3 SOBIUM family
    9.5.4 FOOBI family
    9.5.5 BIOME family
    9.5.6 ALESCAF and LEMACAF
    9.5.7 Other algorithms
9.6 Appendix: Expressions of complex cumulants
References

CHAPTER 10 Sparse component analysis
10.1 Introduction
10.2 Sparse signal representations
    10.2.1 Basic principles of sparsity
    10.2.2 Dictionaries
    10.2.3 Linear transforms
    10.2.4 Adaptive representations
10.3 Joint sparse representation of mixtures
    10.3.1 Principle
    10.3.2 Linear transforms
    10.3.3 Principle of ℓτ minimization
    10.3.4 Bayesian interpretation of ℓτ criteria
    10.3.5 Effect of the chosen ℓτ criterion
    10.3.6 Optimization algorithms for ℓτ criteria
    10.3.7 Matching pursuit
    10.3.8 Summary
10.4 Estimating the mixing matrix by clustering
    10.4.1 Global clustering algorithms
    10.4.2 Scatter plot selection in multiscale representations
    10.4.3 Use of local scatter plots in the time-frequency plane
10.5 Square mixing matrix: Relative Newton method for quasi-maximum likelihood separation
    10.5.1 Relative optimization framework
    10.5.2 Newton method
    10.5.3 Gradient and Hessian evaluation
    10.5.4 Sequential optimization
    10.5.5 Numerical illustrations
    10.5.6 Extension of relative Newton: blind deconvolution
10.6 Separation with a known mixing matrix
    10.6.1 Linear separation of (over-)determined mixtures
    10.6.2 Binary masking assuming a single active source
    10.6.3 Binary masking assuming M…