Research Interests
My research falls into two main themes:
- Randomized linear algebra: tensor data analysis, dimensionality reduction, applied probability;
- Data-driven inverse problems: experimental design and PDE-constrained optimization, toward efficient and reliable scientific machine learning.
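As a small illustration of the dimensionality-reduction theme (a generic sketch of the Johnson-Lindenstrauss idea, not code from any paper listed below), the following projects a high-dimensional vector through a scaled Gaussian random matrix, which approximately preserves Euclidean norms with high probability:

```python
import math
import random

def gaussian_sketch(x, k, seed=0):
    """Project x in R^d down to R^k using i.i.d. Gaussian entries
    scaled by 1/sqrt(k), so ||sketch(x)|| ~ ||x|| with high probability
    (the Johnson-Lindenstrauss lemma)."""
    rng = random.Random(seed)
    d = len(x)
    # Each output coordinate is an inner product with a random Gaussian row.
    return [
        sum(rng.gauss(0.0, 1.0) * x[j] for j in range(d)) / math.sqrt(k)
        for _ in range(k)
    ]

# Sketch a 1000-dimensional vector down to 200 dimensions.
x = [1.0] * 1000
y = gaussian_sketch(x, 200)
norm_x = math.sqrt(sum(v * v for v in x))  # exactly sqrt(1000)
norm_y = math.sqrt(sum(v * v for v in y))  # close to norm_x
```

Structured variants of this idea (e.g. Kronecker-structured or subsampled sketches, as in the papers below) trade some randomness for much faster matrix-vector products.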
Publications and Preprints
- Continuous nonlinear adaptive experimental design with gradient flow, by R. Jin, Q. Li, S. Mussmann and S. Wright. 2024. [PDF]
- Unique reconstruction for discretized inverse problems: a random sketching approach via subsampling, by R. Jin, Q. Li, A. Nair and S. Stechmann. Submitted, 2024. [PDF]
- Optimal experimental design for linear models via gradient flow, by R. Jin, M. Guerra, Q. Li and S. Wright. Submitted, 2024. [PDF]
- Scalable symmetric Tucker tensor decomposition, by R. Jin, J. Kileel, T. G. Kolda and R. Ward. SIAM Journal on Matrix Analysis and Applications, 2024. [PDF] [DOI]
- Space-time reduced-order modeling for uncertainty quantification, by R. Jin, F. Rizzi and E. Parish. CSRI Summer Proceedings, Sandia National Laboratories, 2021. [PDF] [DOI]
- Tensor-structured sketching for constrained least squares, by K. Chen and R. Jin. SIAM Journal on Matrix Analysis and Applications, 2021. [PDF] [DOI]
- Faster Johnson-Lindenstrauss Transform via Kronecker Products, by R. Jin, T. G. Kolda and R. Ward. Information and Inference: A Journal of the IMA, 2020. [PDF] [DOI]