Gaussian Process Regression with Soft Equality Constraints
Abstract
1. Introduction
2. Gaussian Process Under Equality Constraints
2.1. Standard GP Regression Framework
2.2. Quantum-Inspired Hamiltonian Monte Carlo
3. Proposed Method
Algorithm 1 QHMC Training for GP with Equality Constraints
Input: initial point, step size, number of simulation steps L, and mass distribution parameters.
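The paper gives the full training loop; the distinguishing QHMC mechanism, drawing a fresh particle mass from a log-normal distribution before every HMC trajectory [26], can be sketched as follows (a minimal illustration with our own function and parameter names, not the paper's exact implementation):

```python
import numpy as np

def qhmc_step(theta, log_prob, grad_log_prob, eps, L, mu_m, sigma_m, rng):
    """One QHMC step: resample a random mass, then run one HMC leapfrog trajectory.

    The quantum-inspired twist is that the particle mass is drawn from a
    log-normal distribution at each iteration instead of being held fixed.
    """
    m = np.exp(rng.normal(mu_m, sigma_m))              # random mass
    p = rng.normal(0.0, np.sqrt(m), size=theta.shape)  # momentum ~ N(0, m I)
    theta_new, p_new = theta.copy(), p.copy()

    # leapfrog integration of the Hamiltonian dynamics
    p_new += 0.5 * eps * grad_log_prob(theta_new)
    for _ in range(L - 1):
        theta_new += eps * p_new / m
        p_new += eps * grad_log_prob(theta_new)
    theta_new += eps * p_new / m
    p_new += 0.5 * eps * grad_log_prob(theta_new)

    # Metropolis correction on the Hamiltonian H = -log p(theta) + |p|^2 / (2m)
    h_old = -log_prob(theta) + 0.5 * p @ p / m
    h_new = -log_prob(theta_new) + 0.5 * p_new @ p_new / m
    if np.log(rng.uniform()) < h_old - h_new:
        return theta_new
    return theta
```

With `mu_m = 0` and `sigma_m = 0` the mass is fixed at 1 and the step reduces to standard HMC, which is a convenient sanity check.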
3.1. Adaptive Learning Mechanism
Algorithm 2 GP Regression with Soft Constraints
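Algorithm 2 itself is not reproduced here; the essence of the soft-constraint formulation, adding a quadratic penalty on the violation of an equality constraint to the GP log-posterior instead of enforcing the constraint exactly, might look like the sketch below (the squared-exponential kernel, the penalty weight `lam`, and all names are illustrative assumptions, not the paper's exact objective):

```python
import numpy as np

def rbf_kernel(X1, X2, ell, sf):
    """Squared-exponential kernel, a common GP default."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def soft_constrained_neg_log_post(params, X, y, Xc, gc, lam=10.0):
    """Negative log-posterior = GP marginal likelihood + soft equality penalty.

    Xc holds locations where the equality constraint f(x) = gc should hold;
    lam controls how softly or strictly it is enforced (illustrative choice).
    """
    ell, sf, sn = np.exp(params)  # positive hyperparameters via log-transform
    K = rbf_kernel(X, X, ell, sf) + sn**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

    # standard GP negative log marginal likelihood
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

    # posterior mean at the constraint points
    mu_c = rbf_kernel(Xc, X, ell, sf) @ alpha

    # soft equality constraint: quadratic penalty on the violation
    return nll + 0.5 * lam * np.sum((mu_c - gc) ** 2)
```

An objective of this form can then be sampled with QHMC (as in Algorithm 1) rather than minimized, so that the penalty acts as a soft constraint on the posterior.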
3.2. Convergence Properties of the Probabilistic Approach
- Distance from a point to a set: $d(x, A) = \inf_{a \in A} \|x - a\|$.
- Distance between compact sets: $\mathbb{D}(A, B) = \sup_{a \in A} d(a, B)$.
- Hausdorff distance: $\mathbb{H}(A, B) = \max\{\mathbb{D}(A, B), \mathbb{D}(B, A)\}$.
- Pseudo-metric for probability distributions: finally, we define a pseudo-metric to describe the distance between two probability distributions, $P$ and $Q$, as $\mathscr{D}(P, Q) = \sup_{g \in \mathcal{G}} \left| \mathbb{E}_P[g] - \mathbb{E}_Q[g] \right|$, where $\mathcal{G}$ is a class of measurable test functions.
- There exists a weakly compact set such that and .
- with probability 1.
- , with probability 1.
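For finite point sets, the three set distances above can be computed directly; the small illustration below makes the definitions concrete (it plays no role in the convergence argument itself):

```python
import numpy as np

def dist_point_to_set(x, A):
    """d(x, A) = inf over a in A of ||x - a|| (a minimum for finite sets)."""
    return np.min(np.linalg.norm(A - x, axis=1))

def deviation(A, B):
    """One-sided distance D(A, B) = sup over a in A of d(a, B)."""
    return max(dist_point_to_set(a, B) for a in A)

def hausdorff(A, B):
    """H(A, B) = max{D(A, B), D(B, A)}: symmetric, a metric on compact sets."""
    return max(deviation(A, B), deviation(B, A))
```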
4. Numerical Examples
4.1. Poisson Equation in 2D
4.2. Poisson Equation in 10D
4.3. Heat Transfer in a Hollow Sphere
5. Discussion
- To demonstrate the robustness and effectiveness of the proposed methods, the synthetic examples were designed around two main factors: the size of the dataset and the signal-to-noise ratio (SNR). The QHMC-based algorithms were tested across varying SNR levels. The findings, illustrated in Figure 2 and Figure 4, show that the method, in both its soft-constrained and hard-constrained forms, handles noise effectively, especially when the noise remains below a moderate level. Furthermore, the algorithms performed better with larger datasets, indicating that increased data size helps mitigate the impact of noise. Together, these observations confirm the robustness of the proposed method under varying conditions.
- The numerical results for the synthetic examples also report execution times as the SNR and dataset size increase, highlighting the efficiency of the proposed algorithm. As shown in Figure 3 and Figure 5, the algorithms deliver significant time savings, with the soft-constrained versions showing particularly strong advantages. Combined with the results in Figure 2 and Figure 4, where the soft-constrained algorithms maintain high prediction accuracy, this shows that the proposed method provides efficient sampling while preserving performance. We further demonstrated the robustness of the algorithms by reporting the posterior variances of the results in Table 1 and Table 2.
- To test the ability of the algorithms to handle higher-dimensional problems while maintaining robustness and efficiency, the evaluations focused on different dimensional settings: 2 dimensions and 10 dimensions. The results confirmed that the proposed methods could consistently deliver accurate results, even as dimensionality increased, and they did so with relatively short execution times. This demonstrates the scalability of the algorithms across a range of problem complexities.
- To demonstrate the potential of the proposed method to generalize across different types of problems, we selected a real example: a 3D heat transfer problem that requires solving partial differential equations. Unlike the synthetic examples, this part uses a fixed dataset with no injected Gaussian noise. A thorough comparison of all the methods, including the step-by-step change in the relative error, is presented in Figure 6, validating the success of all the versions. Moreover, as shown in Table 3, the soft-constrained approaches attained high accuracy with notable time efficiency.
- It is also worth noting that while the numerical results demonstrate the robustness and efficiency of the current QHMC algorithm, further investigation is needed into the effects of the constraint violation probability. Our experiments were conducted with a relatively low constraint release probability, and accuracy was maintained under these conditions. However, allowing more frequent violations could introduce limitations.
- The proposed approach demonstrates strong performance in handling dimensions up to 10, but its scalability to significantly higher dimensions, such as 50 or beyond, has yet to be fully investigated. This question requires systematic study and highlights a valuable direction for extending our work. Previous studies have demonstrated the successful application of the QHMC algorithm to high-dimensional unconstrained problems [26,33], suggesting its potential as a promising method for addressing computational challenges in constrained scenarios.
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
GP | Gaussian process |
MCMC | Markov Chain Monte Carlo |
MH | Metropolis-Hastings |
HMC | Hamiltonian Monte Carlo |
QHMC | Quantum-inspired Hamiltonian Monte Carlo |
HMCad | Hard-constrained Hamiltonian Monte Carlo with adaptivity |
HMCsoftad | Soft-constrained Hamiltonian Monte Carlo with adaptivity |
HMCvar | Hard-constrained Hamiltonian Monte Carlo with variance |
HMCsoftvar | Soft-constrained Hamiltonian Monte Carlo with variance |
HMCboth | Hard-constrained Hamiltonian Monte Carlo with both adaptivity and variance |
HMCsoftboth | Soft-constrained Hamiltonian Monte Carlo with both adaptivity and variance |
QHMCad | Hard-constrained Quantum-inspired Hamiltonian Monte Carlo with adaptivity |
QHMCsoftad | Soft-constrained Quantum-inspired Hamiltonian Monte Carlo with adaptivity |
QHMCvar | Hard-constrained Quantum-inspired Hamiltonian Monte Carlo with variance |
QHMCsoftvar | Soft-constrained Quantum-inspired Hamiltonian Monte Carlo with variance |
QHMCboth | Hard-constrained Quantum-inspired Hamiltonian Monte Carlo with both adaptivity and variance |
QHMCsoftboth | Soft-constrained Quantum-inspired Hamiltonian Monte Carlo with both adaptivity and variance |
SNR | Signal-to-noise ratio |
PDE | Partial differential equation |
References
- Lange-Hegermann, M. Linearly constrained Gaussian processes with boundary conditions. In Proceedings of the International Conference on Artificial Intelligence and Statistics, PMLR, San Diego, CA, USA, 13–15 April 2021; pp. 1090–1098. [Google Scholar]
- Kuss, M.; Rasmussen, C. Gaussian processes in reinforcement learning. Adv. Neural Inf. Process. Syst. 2003, 16, 751–759. [Google Scholar]
- Liu, T.; Wei, H.; Liu, S.; Zhang, K. Industrial time series forecasting based on improved Gaussian process regression. Soft Comput. 2020, 24, 15853–15869. [Google Scholar] [CrossRef]
- Jin, B.; Xu, X. Machine learning WTI crude oil price predictions. J. Int. Commer. Econ. Policy 2024. [Google Scholar] [CrossRef]
- Zhang, H.; Liu, J. Jointly stochastic fully symmetric interpolatory rules and local approximation for scalable Gaussian process regression. Pattern Recognit. 2025, 159, 111125. [Google Scholar] [CrossRef]
- Rasmussen, C.E.; Williams, C.K. Gaussian Processes for Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006; Volume 1. [Google Scholar]
- Pensoneault, A.; Yang, X.; Zhu, X. Nonnegativity-enforced Gaussian process regression. Theor. Appl. Mech. Lett. 2020, 10, 182–187. [Google Scholar] [CrossRef]
- Swiler, L.P.; Gulian, M.; Frankel, A.L.; Safta, C.; Jakeman, J.D. A survey of constrained Gaussian process regression: Approaches and implementation challenges. J. Mach. Learn. Model. Comput. 2020, 1–2. [Google Scholar] [CrossRef]
- Maatouk, H.; Bay, X. Gaussian process emulators for computer experiments with inequality constraints. Math. Geosci. 2017, 49, 557–582. [Google Scholar] [CrossRef]
- Abrahamsen, P.; Benth, F.E. Kriging with inequality constraints. Math. Geol. 2001, 33, 719–744. [Google Scholar] [CrossRef]
- Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Machine learning of linear differential equations using Gaussian processes. J. Comput. Phys. 2017, 348, 683–693. [Google Scholar] [CrossRef]
- Salzmann, M.; Urtasun, R. Implicitly Constrained Gaussian Process Regression for Monocular Non-Rigid Pose Estimation. Available online: https://papers.nips.cc/paper_files/paper/2010/hash/115f89503138416a242f40fb7d7f338e-Abstract.html (accessed on 3 January 2025).
- Agrell, C. Gaussian processes with linear operator inequality constraints. arXiv 2019, arXiv:1901.03134. [Google Scholar]
- Da Veiga, S.; Marrel, A. Gaussian process modeling with inequality constraints. Ann. Fac. Sci. Toulouse Math. 2012, 21, 529–555. [Google Scholar] [CrossRef]
- Maatouk, H.; Roustant, O.; Richet, Y. Cross-validation estimations of hyper-parameters of Gaussian processes with inequality constraints. Procedia Environ. Sci. 2015, 27, 38–44. [Google Scholar] [CrossRef]
- López-Lopera, A.F.; Bachoc, F.; Durrande, N.; Roustant, O. Finite-dimensional Gaussian approximation with linear inequality constraints. SIAM/ASA J. Uncertain. Quantif. 2018, 6, 1224–1255. [Google Scholar] [CrossRef]
- Kochan, D.; Yang, X. Gaussian Process Regression with Soft Inequality and Monotonicity Constraints. arXiv 2024, arXiv:2404.02873. [Google Scholar]
- López-Lopera, A.; Bachoc, F.; Roustant, O. High-dimensional additive Gaussian processes under monotonicity constraints. Adv. Neural Inf. Process. Syst. 2022, 35, 8041–8053. [Google Scholar]
- Narcowich, F.J.; Ward, J.D. Generalized Hermite interpolation via matrix-valued conditionally positive definite functions. Math. Comput. 1994, 63, 661–687. [Google Scholar] [CrossRef]
- Jidling, C.; Wahlström, N.; Wills, A.; Schön, T.B. Linearly Constrained Gaussian Processes. Available online: https://papers.nips.cc/paper_files/paper/2018/hash/68b1fbe7f16e4ae3024973f12f3cb313-Abstract.html (accessed on 3 January 2025).
- Albert, C.G.; Rath, K. Gaussian process regression for data fulfilling linear differential equations with localized sources. Entropy 2020, 22, 152. [Google Scholar] [CrossRef]
- Ezati, M.; Esmaeilbeigi, M.; Kamandi, A. Novel approaches for hyper-parameter tuning of physics-informed Gaussian processes: Application to parametric PDEs. Eng. Comput. 2024, 40, 3175–3194. [Google Scholar] [CrossRef]
- Stein, M.L. Asymptotically efficient prediction of a random field with a misspecified covariance function. Ann. Stat. 1988, 16, 55–63. [Google Scholar] [CrossRef]
- Zhang, H. Inconsistent estimation and asymptotically equal interpolations in model-based geostatistics. J. Am. Stat. Assoc. 2004, 99, 250–261. [Google Scholar] [CrossRef]
- Barbu, A.; Zhu, S.C. Monte Carlo Methods; Springer: Berlin/Heidelberg, Germany, 2020; Volume 35. [Google Scholar]
- Liu, Z.; Zhang, Z. Quantum-inspired Hamiltonian Monte Carlo for Bayesian sampling. arXiv 2019, arXiv:1912.01937. [Google Scholar]
- Jensen, B.S.; Nielsen, J.B.; Larsen, J. Bounded Gaussian process regression. In Proceedings of the 2013 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), Southampton, UK, 22–25 September 2013; pp. 1–6. [Google Scholar]
- Gelman, A.; Carlin, J.B.; Stern, H.S.; Dunson, D.B.; Vehtari, A.; Rubin, D.B. Bayesian Data Analysis; Taylor & Francis Group, Inc.: New York, NY, USA, 2014. [Google Scholar]
- Guo, S.; Xu, H.; Zhang, L. Stability analysis for mathematical programs with distributionally robust chance constraint. SIAM J. Optim. 2015. Available online: https://api.semanticscholar.org/CorpusID:16378663 (accessed on 15 January 2025).
- Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Inferring solutions of differential equations using noisy multi-fidelity data. J. Comput. Phys. 2017, 335, 736–746. [Google Scholar] [CrossRef]
- Vishnoi, N.K. An introduction to Hamiltonian Monte Carlo method for sampling. arXiv 2021, arXiv:2108.12107. [Google Scholar]
- Yang, X.; Tartakovsky, G.; Tartakovsky, A.M. Physics information aided kriging using stochastic simulation models. SIAM J. Sci. Comput. 2021, 43, A3862–A3891. [Google Scholar] [CrossRef]
- Kochan, D.; Zhang, Z.; Yang, X. A Quantum-Inspired Hamiltonian Monte Carlo Method for Missing Data Imputation. In Proceedings of the Mathematical and Scientific Machine Learning, PMLR, Beijing, China, 15–17 August 2022; pp. 17–32. [Google Scholar]
Method | Err | Var | Method | Err | Var
---|---|---|---|---|---
QHMCad | 0.15 | 0.11 | HMCad | 0.19 | 0.14 |
QHMCsoftad | 0.17 | 0.10 | HMCsoftad | 0.21 | 0.12 |
QHMCvar | 0.16 | 0.09 | HMCvar | 0.21 | 0.11 |
QHMCsoftvar | 0.19 | 0.08 | HMCsoftvar | 0.22 | 0.11 |
QHMCboth | 0.14 | 0.09 | HMCboth | 0.17 | 0.12 |
QHMCsoftboth | 0.15 | 0.08 | HMCsoftboth | 0.19 | 0.11 |
Method | Err | Posterior Var | Method | Err | Posterior Var |
---|---|---|---|---|---|
QHMCad | 0.022 | 0.016 | HMCad | 0.027 | 0.019 |
QHMCsoftad | 0.025 | 0.014 | HMCsoftad | 0.030 | 0.019 |
QHMCvar | 0.023 | 0.011 | HMCvar | 0.032 | 0.016 |
QHMCsoftvar | 0.026 | 0.010 | HMCsoftvar | 0.034 | 0.015 |
QHMCboth | 0.017 | 0.012 | HMCboth | 0.025 | 0.014 |
QHMCsoftboth | 0.020 | 0.011 | HMCsoftboth | 0.029 | 0.015 |
Method | Err | Var | Time | Method | Err | Var | Time
---|---|---|---|---|---|---|---
QHMCad | 0.031 | 0.0042 | 39 s | HMCad | 0.036 | 0.0062 | 52 s |
QHMCsoftad | 0.042 | 0.0040 | 30 s | HMCsoftad | 0.051 | 0.0059 | 41 s |
QHMCvar | 0.032 | 0.0034 | 32 s | HMCvar | 0.045 | 0.0044 | 44 s |
QHMCsoftvar | 0.042 | 0.0032 | 26 s | HMCsoftvar | 0.051 | 0.0043 | 35 s |
QHMCboth | 0.027 | 0.0036 | 43 s | HMCboth | 0.036 | 0.0040 | 56 s |
QHMCsoftboth | 0.031 | 0.0035 | 28 s | HMCsoftboth | 0.049 | 0.0039 | 39 s |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kochan, D.; Yang, X. Gaussian Process Regression with Soft Equality Constraints. Mathematics 2025, 13, 353. https://doi.org/10.3390/math13030353