Publications

Smooth optimization algorithms for global and locally low-rank regularizers

Published in (Under review) Society for Industrial and Applied Mathematics (SIAM) Journal on Optimization, 2025


Many inverse problems and signal processing problems involve low-rank regularizers based on the nuclear norm. Commonly, proximal gradient methods (PGM) are adopted to solve this type of non-smooth problem, as they offer fast and guaranteed convergence. However, PGM cannot be directly applied in settings where low-rank models are imposed locally on overlapping patches; therefore, heuristic approaches have been proposed that lack convergence guarantees. In this work we propose to replace the nuclear norm with a smooth approximation in which a Huber-type function is applied to each singular value. By providing a theoretical framework based on singular value function theory, we show that important properties can be established for the proposed regularizer, such as convexity, differentiability, and Lipschitz continuity of the gradient. Moreover, we provide a closed-form expression for the regularizer gradient, enabling the use of standard iterative gradient-based optimization algorithms (e.g., nonlinear conjugate gradient) that easily handle the case of overlapping patches and have well-known convergence guarantees. In addition, we provide a novel step-size selection strategy based on a quadratic majorizer of the line-search function that leverages the Huber characteristics of the proposed regularizer. Finally, we assess the proposed optimization framework with empirical results on dynamic magnetic resonance imaging (MRI) reconstruction in the context of locally low-rank models with overlapping patches.

Recommended citation: Rodrigo Lobos, Javier Salazar Cavazos, Raj Rao Nadakuditi, and Jeffrey A. Fessler, "Smooth optimization algorithms for global and locally low-rank regularizers" in (Under review) Society for Industrial and Applied Mathematics (SIAM) Journal on Optimization.
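The core idea of the abstract above can be sketched in a few lines of NumPy: apply a Huber function to each singular value, which yields a smooth surrogate for the nuclear norm whose gradient has the standard closed form for spectral functions, U diag(h'(σ)) Vᵀ. The specific Huber parameterization and the name `delta` below are illustrative assumptions; the paper's exact definition may differ.

```python
import numpy as np

def huber(s, delta):
    """Huber function, elementwise: quadratic near zero, linear beyond delta."""
    return np.where(np.abs(s) <= delta, 0.5 * s**2 / delta, np.abs(s) - 0.5 * delta)

def huber_grad(s, delta):
    """Derivative of the Huber function (a clipped identity)."""
    return np.clip(s / delta, -1.0, 1.0)

def smooth_nuclear_norm(X, delta):
    """Smooth surrogate for the nuclear norm: sum of Huber-ized singular values."""
    s = np.linalg.svd(X, compute_uv=False)
    return huber(s, delta).sum()

def smooth_nuclear_norm_grad(X, delta):
    """Closed-form gradient U diag(h'(s)) V^T, the standard expression for a
    differentiable spectral function with h'(0) = 0."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(huber_grad(s, delta)) @ Vt
```

Because the gradient is Lipschitz (with constant 1/δ under this parameterization), this surrogate can be dropped into any standard gradient-based solver, including ones that sum contributions from overlapping patches.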

ALPCAHUS: Subspace Clustering for Heteroscedastic Data

Published in (Under review) IEEE Transactions on Signal Processing (TSP), 2025


Principal component analysis (PCA) is a key tool in the field of data dimensionality reduction. Various methods, such as K-Subspaces (KSS), have been proposed to extend PCA to the union of subspaces (UoS) setting for clustering data that come from multiple subspaces. However, some applications involve heterogeneous data that vary in quality due to noise characteristics associated with each data sample. Heteroscedastic methods aim to deal with such mixed data quality. This paper develops a heteroscedastic-focused subspace clustering method, named ALPCAHUS, that can estimate the sample-wise noise variances and use this information to improve the estimate of the subspace bases associated with the low-rank structure of the data. This clustering algorithm builds on KSS principles by extending the recently proposed heteroscedastic PCA method, named LR-ALPCAH, to clusters with heteroscedastic noise in the UoS setting. Simulations and real-data experiments show the effectiveness of accounting for data heteroscedasticity compared to existing clustering algorithms. Code available at https://github.com/javiersc1/ALPCAHUS

Recommended citation: J. Salazar Cavazos, J. A. Fessler and L. Balzano, "ALPCAHUS: Subspace Clustering for Heteroscedastic Data," in (Under review) IEEE Transactions on Signal Processing. keywords: {Heteroscedastic data, heterogeneous data quality, subspace bases estimation, subspace clustering, union of subspace model, unsupervised learning} http://javiersc1.github.io/files/paper_alpcahus_draft.pdf
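For context, the K-Subspaces (KSS) scheme that ALPCAHUS builds on alternates between assigning each sample to the subspace with the smallest projection residual and refitting each subspace basis by PCA. The sketch below is plain KSS only; ALPCAHUS additionally estimates and exploits per-sample noise variances, which is not modeled here, and all names and defaults are illustrative.

```python
import numpy as np

def k_subspaces(Y, K, d, n_iters=20, seed=0):
    """Generic K-subspaces loop on a D x N data matrix Y:
    alternate cluster assignment (smallest residual) and basis refitting (SVD)."""
    rng = np.random.default_rng(seed)
    D, N = Y.shape
    # Random orthonormal initial bases, one D x d basis per cluster.
    bases = [np.linalg.qr(rng.standard_normal((D, d)))[0] for _ in range(K)]
    labels = np.zeros(N, dtype=int)
    for _ in range(n_iters):
        # Assignment step: residual norm of each sample against each subspace.
        residuals = np.stack([
            np.linalg.norm(Y - U @ (U.T @ Y), axis=0) for U in bases
        ])  # shape (K, N)
        labels = residuals.argmin(axis=0)
        # Update step: refit each basis from its currently assigned samples.
        for k in range(K):
            Yk = Y[:, labels == k]
            if Yk.shape[1] >= d:
                U, _, _ = np.linalg.svd(Yk, full_matrices=False)
                bases[k] = U[:, :d]
    return labels, bases
```

Like k-means, this alternation converges only to a local optimum and is sensitive to initialization, which is part of what heteroscedasticity-aware weighting aims to improve.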

Alzheimer’s Disease Classification in Functional MRI With 4D Joint Temporal-Spatial Kernels in Novel 4D CNN Model

Published in International Society for Magnetic Resonance in Medicine (ISMRM 2025), 2025


Previous works in the literature apply 3D spatial-only models to 4D functional MRI data, leading to possibly sub-par feature extraction for downstream tasks like classification. In this work, we develop a novel 4D convolutional network that extracts joint temporal-spatial kernels, learning not only spatial information but also temporal dynamics. We apply our approach to the ADNI dataset with data augmentations such as circular time shifting to enforce time-invariant results. Experimental results show promising performance in capturing spatial-temporal structure in functional MRI compared to 3D models. The 4D CNN model improves Alzheimer's disease diagnosis for rs-fMRI data, enabling earlier detection and better interventions. Future research could explore task-based fMRI applications and regression tasks, enhancing understanding of cognitive performance and disease progression.

Recommended citation: J. Salazar Cavazos, S. Peltier, Alzheimer's Disease Classification in Functional MRI With 4D Joint Temporal-Spatial Kernels in Novel 4D CNN Model. In ISMRM 33rd Annual Meeting, 2025. p. 3398. http://javiersc1.github.io/files/4dcnn.pdf
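The circular time-shifting augmentation mentioned above is simple to state: roll the scan along its time axis, wrapping frames around, so the classifier sees time-shifted copies of each subject. A minimal NumPy sketch, assuming an (x, y, z, t) axis order (loaders vary, so this convention is an assumption):

```python
import numpy as np

def circular_time_shift(fmri, shift):
    """Circularly shift a 4D fMRI array (x, y, z, t) along the time axis;
    frames pushed past the end wrap around to the beginning."""
    return np.roll(fmri, shift, axis=-1)

def random_time_shift(fmri, rng):
    """Draw a uniformly random circular shift, as a training-time augmentation."""
    T = fmri.shape[-1]
    return circular_time_shift(fmri, int(rng.integers(T)))
```

Because the shift is circular, no frames are dropped or padded, so every augmented copy preserves the full temporal content of the scan.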

ALPCAH: Subspace Learning for Sample-wise Heteroscedastic Data

Published in IEEE Transactions on Signal Processing (TSP), 2025


Principal component analysis (PCA) is a key tool in the field of data dimensionality reduction. However, some applications involve heterogeneous data that vary in quality due to noise characteristics associated with each data sample. Heteroscedastic methods aim to deal with such mixed data quality. This paper develops a subspace learning method, named ALPCAH, that can estimate the sample-wise noise variances and use this information to improve the estimate of the subspace basis associated with the low-rank structure of the data. Our method makes no distributional assumptions of the low-rank component and does not assume that the noise variances are known. Further, this method uses a soft rank constraint that does not require subspace dimension to be known. Additionally, this paper develops a matrix factorized version of ALPCAH, named LR-ALPCAH, that is much faster and more memory efficient at the cost of requiring subspace dimension to be known or estimated. Simulations and real data experiments show the effectiveness of accounting for data heteroscedasticity compared to existing algorithms. Code available at https://github.com/javiersc1/ALPCAH.

Recommended citation: J. Salazar Cavazos, J. A. Fessler and L. Balzano, "ALPCAH: Subspace Learning for Sample-wise Heteroscedastic Data," in IEEE Transactions on Signal Processing, doi: 10.1109/TSP.2025.3537867. keywords: {Heteroscedastic data; heterogeneous data quality; subspace basis estimation; subspace learning} http://javiersc1.github.io/files/paper_alpcah_journal.pdf
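The idea of jointly estimating per-sample noise variances and a subspace basis can be sketched with a plain alternating scheme: refit the basis by PCA of inverse-variance-weighted samples, then re-estimate each sample's variance from its residual off the subspace. This is not the paper's ALPCAH/LR-ALPCAH updates, only an inverse-variance-weighted PCA sketch in the same spirit; all names and the update rules are illustrative assumptions.

```python
import numpy as np

def heteroscedastic_lowrank(Y, d, n_iters=30):
    """Illustrative alternating scheme for a D x N matrix Y with sample-wise
    noise: estimate a rank-d basis U and per-sample noise variances v."""
    D, N = Y.shape
    v = np.ones(N)  # per-sample noise variance estimates
    for _ in range(n_iters):
        # Subspace update: PCA of inverse-variance-weighted (whitened) samples.
        W = Y / np.sqrt(v)  # scale each column by 1 / sqrt(its variance)
        U = np.linalg.svd(W, full_matrices=False)[0][:, :d]
        # Variance update: mean squared residual of each sample off the subspace.
        R = Y - U @ (U.T @ Y)
        v = np.maximum((R**2).sum(axis=0) / D, 1e-8)
    return U, v
```

The whitening step is what lets high-quality samples dominate the basis estimate instead of being drowned out by noisy ones, which is the benefit of "using all of the data versus retaining only good data" described in the abstract.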

ALPCAH: Sample-wise Heteroscedastic PCA with Tail Singular Value Regularization

Published in Sampling Theory and Applications (SampTA), 2023


Principal component analysis (PCA) is a key tool in the field of data dimensionality reduction that is useful for various data science problems. However, many applications involve heterogeneous data that vary in quality due to noise characteristics associated with different sources of the data. Methods that deal with such mixed-quality data are known as heteroscedastic methods. Current methods like HePPCAT make Gaussian assumptions about the basis coefficients that may not hold in practice. Other methods, such as Weighted PCA (WPCA), assume the noise variances are known, which may be difficult to determine in practice. This paper develops a PCA method that can estimate the sample-wise noise variances and use this information in the model to improve the estimate of the subspace basis associated with the low-rank structure of the data. This is done without distributional assumptions on the low-rank component and without assuming the noise variances are known. Simulations show the effectiveness of accounting for such heteroscedasticity in the data and the benefits of using all of the data rather than retaining only the good data; comparisons are made against other PCA methods established in the literature, such as PCA, Robust PCA (RPCA), and HePPCAT. Code available at https://github.com/javiersc1/ALPCAH

Recommended citation: J. A. S. Cavazos, J. A. Fessler, and L. Balzano. ALPCAH: Sample-wise heteroscedastic PCA with tail singular value regularization. In Fourteenth International Conference on Sampling Theory and Applications, 2023. http://javiersc1.github.io/files/paper_alpcah.pdf