Graph Signal Processing (GSP) is an emerging field that extends signal analysis to data defined over the nodes of a network or graph, and it has recently intersected with machine learning. Because many real-world quantities, from sensor readings to social-network activity, are naturally indexed by graph nodes, this perspective applies to a wide range of problems. A recent thesis on GSP makes contributions in two areas: GSP theory, and the combination of GSP with machine learning.
On the theory side, the thesis derives a novel framework for the Hilbert transform of graph signals, addressing the question of how to define amplitude and phase modulations for such signals. By generalizing Gabor's analytic signal, amplitude and phase modulations for graph signals are defined through a Hilbert transform, which makes it possible to pinpoint anomalies or singularities on a graph.
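To make the idea concrete, here is a minimal NumPy sketch of one way such a graph Hilbert transform can be built. It assumes a spectral construction analogous to the classical one: the graph Fourier transform is taken in the eigenbasis of a symmetric adjacency matrix, and the sign of each eigenvalue stands in for the sign of the frequency. The function names and the exact spectral multiplier are illustrative assumptions, not necessarily the thesis's precise definition.

```python
import numpy as np

def graph_hilbert(x, A):
    """Quadrature component of a graph signal (illustrative sketch).

    The graph Fourier transform (GFT) is taken in the eigenbasis of the
    symmetric adjacency matrix A. By analogy with the classical
    -j*sgn(omega) multiplier, each GFT coefficient is scaled by
    sgn(lambda_k). This is an assumed construction, not necessarily the
    thesis's exact definition.
    """
    lam, V = np.linalg.eigh(A)                 # real eigenpairs of symmetric A
    return V @ (np.sign(lam) * (V.T @ x))      # GFT -> multiplier -> inverse GFT

def graph_analytic(x, A):
    """Analytic-signal analogue x + j*H{x}; its modulus gives a per-node
    amplitude envelope and its angle a per-node phase."""
    return x + 1j * graph_hilbert(x, A)

# Toy check on a ring graph: a spike at one node shows up as a clear
# peak in the amplitude envelope, i.e., a localized singularity.
n = 8
A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
x = np.ones(n)
x[3] = 5.0
print(np.abs(graph_analytic(x, A)).round(2))   # largest entry at node 3
```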
On the machine-learning side, the thesis integrates popular machine-learning techniques into the GSP domain, demonstrating a mutually beneficial relationship between the two disciplines. The task is to predict vector-valued targets that are graph signals over an associated graph, using input features that need not correspond to the same physical quantities as the target. Kernel regression, multi-kernel regression, Gaussian processes, and extreme learning machines are each extended to the graph-signal setting, and the extended versions outperform their conventional counterparts when training data are limited and noisy.
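A common way to carry out such an extension, sketched below under assumed notation, is to add a graph-Laplacian smoothness penalty to kernel ridge regression, so that predictions are encouraged to vary slowly across the edges of the known output graph. The specific cost function, the RBF kernel, and the function names here are illustrative choices rather than the thesis's exact formulation; the closed-form solution follows from setting the gradient to zero, which yields a Sylvester-type equation that decouples in the eigenbasis of the Laplacian.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_graph_kernel_regression(X, Y, L, alpha=1e-2, beta=1e-1, gamma=1.0):
    """Kernel ridge regression whose vector output is a graph signal.

    Assumed cost (one common formulation, not necessarily the thesis's):
        ||Y - K A||_F^2 + alpha * tr(A^T K A) + beta * tr(K A L A^T K),
    where K is the training kernel matrix and L the output-graph
    Laplacian; the last term penalizes predictions that vary strongly
    across edges. The optimality condition is the Sylvester-type
    equation (K + alpha I) A + beta K A L = Y, which splits into
    independent linear systems in the eigenbasis of L.
    """
    N = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    lam, U = np.linalg.eigh(L)                 # L = U diag(lam) U^T
    C = Y @ U
    B = np.empty_like(C)
    for m, lm in enumerate(lam):               # one SPD solve per eigenvalue
        B[:, m] = np.linalg.solve(K + alpha * np.eye(N) + beta * lm * K,
                                  C[:, m])
    return B @ U.T, X, gamma                   # coefficients A, plus data

def predict(model, Xnew):
    A, Xtr, gamma = model
    return rbf_kernel(Xnew, Xtr, gamma) @ A    # each row is a graph signal

# usage sketch: 20 noisy training pairs, outputs on a 4-node path graph
rng = np.random.default_rng(0)
L = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])           # path-graph Laplacian
X = rng.normal(size=(20, 3))
Y = rng.normal(size=(20, 4))
model = fit_graph_kernel_regression(X, Y, L)
print(predict(model, X[:2]).shape)             # (2, 4)
```

Because the graph penalty couples the output dimensions, the prediction at each node borrows strength from its neighbors, which is precisely what helps when training data are scarce or noisy.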
The thesis also addresses the challenge of learning the graph itself from graph signals, presenting two distinct methods. First, the requirement that the learned graph be connected is recast as a convex constraint that can be added to existing graph-learning formulations. Second, a sparsity-based approach learns graphs in a hyperparameter-free and computationally efficient manner. The ability to handle inputs and outputs that are different physical quantities, and to perform well even when the graph is unknown to the user, is what sets the overall approach apart.
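For flavor, the sketch below shows a related, well-known convex formulation of smoothness-based graph learning (in the style of Kalofolias), not the thesis's own two methods. The log-barrier on node degrees is one convex device for ruling out isolated nodes, echoing the idea of connectedness as a convex constraint, while the weighted linear term concentrates weight on few, similar node pairs, echoing the sparsity-based method; unlike the thesis's second method, this sketch keeps explicit hyperparameters alpha and beta.

```python
import numpy as np
import cvxpy as cp

def learn_graph(X, alpha=1.0, beta=0.5):
    """Smoothness-based graph learning (Kalofolias-style sketch).

    X has shape (n_nodes, n_signals). Z[i, j] = ||x_i - x_j||^2 measures
    how dissimilar the observations at nodes i and j are; the objective
    puts edge weight where Z is small (smooth signals). The -log(degree)
    barrier is a convex device that forbids isolated nodes, and the
    Frobenius term spreads weight over many edges. Hyperparameters
    alpha and beta are kept explicit here for clarity.
    """
    n = X.shape[0]
    Z = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = cp.Variable((n, n), symmetric=True)
    degrees = W @ np.ones(n)
    objective = cp.Minimize(
        cp.sum(cp.multiply(Z, W))              # smoothness, prop. to tr(X^T L X)
        - alpha * cp.sum(cp.log(degrees))      # barrier against isolated nodes
        + beta * cp.sum_squares(W)             # control the edge-weight spread
    )
    cp.Problem(objective, [W >= 0, cp.diag(W) == 0]).solve()
    return W.value

# usage sketch: signals that are smooth on a hidden two-cluster graph
rng = np.random.default_rng(1)
base = np.repeat(rng.normal(size=(2, 40)), 3, axis=0)   # 2 clusters x 3 nodes
X = base + 0.1 * rng.normal(size=(6, 40))
W = learn_graph(X)
print(np.round(W, 2))   # heavier weights within clusters than across
```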
The fusion of GSP with machine learning not only improves signal analysis and prediction accuracy over graphs but also offers efficient use of data and reduced computational cost, particularly in scenarios with limited training data or noisy inputs. This synergy marks a significant step forward for both fields, paving the way for further advances in signal-processing applications across many domains.