This excellent book introduces the independent component analysis (ICA) method through examples from signal processing. It is addressed to beginners as well as to professionals, engineers, and scientists. The book is divided into 11 chapters, arranged into five parts.

The first part, “Independent Component Analysis and Blind Source Separation,” is introductory, and contains two overview chapters. The overview starts with an intuitive example: multiple sound signals mixed and recorded through multiple microphones. The goal of the ICA method is to reconstruct the source signals from the mixed signals. ICA is presented as part of the larger field of “blind source separation,” in which little is known about the nature of the source signals. Some of the main assumptions are explained here in simple terms; for example, mixed signals differ from source signals in being more Gaussian and more complex. The applications of ICA, mentioned only briefly in this part, are detailed in a separate chapter.

The second part, “The Geometry of Mixtures,” contains three chapters, which introduce the essential mathematical concepts. One chapter is dedicated to mathematical representations of signals: vectors, vector variables, and operations with vectors. Another presents linear transformations, inner products, matrices, geometric transformations, and orthogonal projections. The last chapter defines the basic statistical concepts used in ICA: histograms, probability density, the central limit theorem, cumulative density functions, moments, mean, variance, skewness, and kurtosis.

The third and most substantial part is called “Methods for Blind Source Separation.” Chapter 6 describes the projection pursuit version of ICA, which extracts source signals that are as non-Gaussian as possible. Chapter 7 introduces independence and entropy in statistical terms, along with the infomax version of the ICA method.
Infomax seeks to maximize the entropy of the extracted signals after they have been transformed by a specific model of a joint cumulative density function (CDF). The assumption on the joint CDF is that the marginal CDFs are independent. As a method equivalent to infomax, the author describes the use of the maximum likelihood function in the separation process. In chapter 8, “Complexity Pursuit,” the Kolmogorov approach to complexity is discussed; in contrast to projection pursuit, this version of ICA does not ignore signal structure. Chapter 9 describes the gradient ascent method, which is used to find the parameters that maximize a merit function, such as the likelihood or complexity. The last chapter of this part is dedicated to principal component analysis (PCA) and factor analysis (FA). PCA assumes that source signals are Gaussian and statistically uncorrelated, which differs from ICA’s condition of independence. FA is viewed as PCA with an additional model for noise, in which the number of signals can differ from the number of sources.

Part 4 contains only one chapter: “Applications of ICA.” Some of the applications mentioned are voice extraction, electroencephalography, functional magnetic resonance imaging, fetal heart monitoring, and learning stereo disparity.

Finally, there are the appendices; the mathematics is more concentrated here, and is accompanied by Matlab code examples. At the end of the book, readers will find a list of resources, recommended readings, and references.

This is a clearly written and well-organized book. The explained concepts are supported by essential proofs and helpful diagrams. The text strikes a balance between general statistical theory, signal theory, and worked examples of ICA. Even the repetitions, inherent in an introductory book, always clarify and unveil new facets of the concepts, making reading the book a true pleasure.
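To give a flavor of the projection pursuit idea the book develops, the following is a minimal NumPy sketch (not the book's Matlab code; the uniform sources and mixing matrix are made up for illustration): two non-Gaussian sources are mixed, the mixtures are whitened, and the rotation whose outputs are maximally non-Gaussian, as measured by excess kurtosis, recovers the sources.

```python
import numpy as np

# Two independent, non-Gaussian (uniform) sources, linearly mixed.
rng = np.random.default_rng(0)
n = 10_000
s = rng.uniform(-1, 1, size=(2, n))          # source signals
A = np.array([[1.0, 0.6], [0.5, 1.0]])       # made-up mixing matrix
x = A @ s                                    # observed mixed signals

# Whitening: decorrelate the mixtures and scale them to unit variance.
xc = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(xc @ xc.T / n)
z = np.diag(d ** -0.5) @ E.T @ xc

def excess_kurtosis(y):
    # Zero for a Gaussian; nonzero values signal non-Gaussianity.
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

def rotation(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t), np.cos(t)]])

# After whitening, only an unknown rotation remains; score each angle
# by the total non-Gaussianity of the rotated components.
angles = np.linspace(0.0, np.pi / 2, 181)
best = max(angles, key=lambda t: sum(abs(excess_kurtosis(row))
                                     for row in rotation(t) @ z))
y = rotation(best) @ z   # recovered sources (up to order, sign, scale)
```

Each row of y should correlate almost perfectly with one of the original sources; the gradient ascent method of chapter 9 replaces this brute-force angle search with an iterative maximization of such a merit function.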