1.1. The Relevance of Thermal Testing in Spacecraft Instruments
In aerospace engineering, the tolerance for error is extremely limited, requiring rigorous analysis. This is particularly important in spacecraft development, which is constrained by weight, space availability, and high launch costs [1]. The opportunity to correct design flaws once a satellite is in orbit is minimal, highlighting the importance of avoiding errors at the design and manufacturing phases [2]. The primary strategy to mitigate these risks is error prevention, which is especially challenging in system elements that require high accuracy.
The importance of thermal testing in this context cannot be overstated. Given the harsh space conditions, spacecraft instruments must be designed to operate reliably within a range of thermal environments. These conditions vary widely, from the intense cold of shadowed space to the severe heat of direct solar exposure. Thermal testing, therefore, plays a pivotal role in spacecraft development. It serves as a critical check against the theoretical models used in the design phase, offering a practical assessment of how spacecraft instruments will perform under the thermal conditions they will face in space. This testing phase enables engineers to identify and address potential thermal performance issues, thereby reducing the risk of failure and the costly prospect of in-orbit adjustments.
Moreover, the evolving landscape of space exploration demands ever-increasing levels of precision in spacecraft instrumentation, especially for missions involving Earth observation, planetary exploration, and deep space scientific research. Instruments like the VINIS Earth observation telescope, developed by the Instituto de Astrofísica de Canarias (IAC), exemplify the need for such precision. These missions depend not only on the accurate collection of data but also on the reliability of instruments to operate within expected thermal parameters.
It is within this context that the integration of uncertainty analysis methodologies emerges as a key strategy for enhancing design efficiency. By conducting a thorough uncertainty analysis on the thermal model, engineers can identify critical parameters that impact thermal performance. This approach not only aids in reducing overdesign but also contributes to cost savings in future testing phases. Thus, uncertainty-based analysis is not merely a theoretical exercise; it is a practical tool that guides the design of thermal control systems, ensuring spacecraft instruments can withstand the variable conditions of space. This methodology enhances mission robustness, enables precise resource allocation, and, ultimately, secures the success of space missions.
This paper is organized as follows. Section 1 outlines the importance of thermal testing in spacecraft design due to the strict constraints in aerospace engineering and introduces uncertainty calculation as a tool for thermal analysis. Section 2 details the methods for uncertainty calculation applied to spacecraft thermal control. The results in Section 3 present insights into thermal performance and the impact of uncertainties, leading to Section 4’s discussion on the key thermal drivers. The paper concludes in Section 5, emphasizing the role of uncertainty analysis in enhancing the design and reliability of spacecraft thermal control systems.
1.2. Uncertainty Calculation
In complex system design, the use of mathematical models is essential. They are crucial for initial system trade-off analyses, design refinement, and performance evaluation, aligning with the system objectives. The accuracy and efficiency of these models in representing the physical system and predicting behavior within their application domain are critical, necessitating thorough validation [3].
The fidelity of models in mirroring reality depends on several factors. These include the unknown parameters, the limitations in capturing certain phenomena, and the challenges in accounting for all possible configurations [4,5,6,7]. Parameters, typically defined under ideal or constrained test conditions, may not always represent the real system accurately. Engineers often introduce equivalent effective parameters to manage the geometric complexities and interactions between components, a task that grows more complex with increasing system scale [8].
Recognizing and integrating the inherent uncertainties in the numerical model outcomes is essential throughout all the analysis stages. This integration significantly influences decision-making in design and project management, particularly in the early phases, where approximately 85% of the project’s lifecycle costs are determined [9]. Addressing uncertainties related to manufacturing, material properties, positioning, mounting, environmental conditions, and modelling processes is crucial for ensuring the robustness, reliability, and safety of the systems.
The framework of Uncertainty Quantification (UQ) is applicable in various fields, ranging from deterministic models based on physical laws to models for complex socio-economic phenomena (like the efficiency of sustainability policies in the European Union [10]), groundwater modelling [11], ocean dynamics [12], remote sensing [13], and aircraft design [14]. This approach increases the likelihood of mission success and assists in efficient resource utilization and risk mitigation. Despite its benefits, UQ is not widely implemented in the design of space missions. The diversity of methods presented in the specialized literature can sometimes make this analysis challenging.
Uncertainty analysis is often conducted alongside sensitivity analysis, yet these two concepts have different focuses, and their distinction is often blurred in the existing literature. Sensitivity analysis evaluates the effects of variations in input parameters on the outputs, while uncertainty analysis encompasses the assessment of all possible outcomes and their associated probabilities.
Methods for sensitivity analysis are classified based on the characteristics of the mathematical model, such as nonlinearities or the effects of simultaneous parameter variations, and how they influence the results of uncertainty analysis. These methods fall into three primary categories: screening methods, local sensitivity analysis, and global sensitivity analysis [15]. Screening methods prioritize parameters by importance without detailed quantification. Local methods focus on uncertainties close to the nominal solution but may overlook significant effects. Global methods cover the entire range of parameter validity, capturing important model characteristics, including nonlinearities and wide parameter validity intervals.
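The contrast between local and global methods can be illustrated with a minimal sketch, assuming Python with NumPy; the model and parameter ranges are purely illustrative, not taken from any thermal analysis in this work. Local sensitivities are estimated by finite differences at the nominal point, while crude first-order variance-based (global) indices are estimated by sampling the full parameter range and binning:

```python
import numpy as np

def model(x1, x2):
    """Toy nonlinear model standing in for a thermal response, y = f(parameters)."""
    return x1 + x2 ** 2

# Local sensitivity: finite-difference derivatives at a nominal point.
x1_nom, x2_nom, h = 1.0, 0.1, 1e-6
s_local_x1 = (model(x1_nom + h, x2_nom) - model(x1_nom, x2_nom)) / h
s_local_x2 = (model(x1_nom, x2_nom + h) - model(x1_nom, x2_nom)) / h

# Global sensitivity: brute-force first-order indices over the full
# validity range of each parameter, here U(0, 1) for both.
rng = np.random.default_rng(0)
n = 20_000
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
y = model(x1, x2)

def first_order_index(x, y, bins=50):
    """Estimate Var(E[y | x]) / Var(y) by binning the samples on x."""
    edges = np.linspace(0, 1, bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

S1_x1 = first_order_index(x1, y)
S1_x2 = first_order_index(x2, y)
```

Near the nominal point the derivative with respect to x2 is small (0.2 versus 1.0 for x1), yet over the whole range x2 contributes at least as much output variance as x1, which is exactly the kind of effect a purely local analysis can overlook.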
Another way to categorize these methods is based on the type of information sought from the analysis [16]. This includes screening methods, importance measurements (which compare statistical values of parameters derived from outputs to those of input parameters), and techniques for deeper exploration of sensitivities (providing comprehensive knowledge of the model’s behavior in parameter space).
Uncertainties in modeling are classified as either random, arising from the inherent unpredictability of natural phenomena, or epistemic, resulting from gaps in information during system modeling [17]. Random uncertainties are represented using probability distributions and require expert knowledge to develop representative distributions for the parameters involved [18]. Epistemic uncertainties are further divided into phenomenological uncertainties, linked to unknown information in innovative projects, and model-associated uncertainties, originating from the limitations in the precision of the mathematical model [19].
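The practical difference between the two classes can be sketched in a few lines of Python with NumPy; the bolted-joint model, parameter values, and units are hypothetical. A random (aleatory) quantity is sampled from a distribution, while an epistemic quantity, for which no distribution can be justified, is swept over an interval, yielding bounds on the output distribution rather than a single one:

```python
import numpy as np

rng = np.random.default_rng(1)

def joint_conductance(k, contact):
    """Hypothetical model: effective conductance of a bolted joint."""
    return k * contact

# Random (aleatory) uncertainty: sample-to-sample scatter of material
# conductivity, represented by a probability distribution.
k_samples = rng.normal(loc=400.0, scale=10.0, size=10_000)  # W/(m K)

# Epistemic uncertainty: the contact efficiency is simply unknown at this
# design stage, so it is represented by an interval, not a distribution.
contact_lo, contact_hi = 0.6, 0.9

# Propagating both gives a band of possible output distributions
# (lower and upper bounds) instead of a single one.
g_lo = joint_conductance(k_samples, contact_lo)
g_hi = joint_conductance(k_samples, contact_hi)
```

As more information becomes available (e.g., contact-resistance measurements on the actual hardware), the epistemic interval narrows and can eventually be replaced by a distribution.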
1.3. Application of Uncertainty Calculation in the Aerospace Domain
Uncertainty significantly influences aerospace designs, and its impact can be either advantageous or detrimental to system performance. Assessing and understanding the extent and impact of this uncertainty is essential. Therefore, uncertainty analysis is a useful tool for risk management in aerospace engineering. By quantifying uncertainties in system components and processes, engineers can more effectively predict and mitigate potential failures, enhancing the safety, reliability, and effectiveness of space missions. This approach requires a comprehensive understanding of how different uncertainties interact within the overall system [20,21].
UQ in aerospace engineering is important not only for improving design and manufacturing but also for planning and developing testing campaigns. By identifying the system elements that are most susceptible to uncertainty and have the greatest impact on performance, engineers can focus their testing strategies on these critical aspects, improving testing efficiency and the overall reliability of the system. This process represents an integration of theoretical analysis, empirical data, and practical experience, evolving continuously with new insights and contributing to the development of robust and resilient aerospace systems.
1.4. Application of Uncertainty Calculation in Spacecraft Thermal Analysis
A two-phase methodology is commonly used in the process of thermal control and design for space systems [22,23,24]. Initially, engineers establish scenarios that represent worst-case design conditions, combining the most challenging environmental factors (such as solar irradiance or planetary infrared radiation) with demanding operational conditions (like internal heat dissipation and the degradation of material and optical properties). Following the analysis of these scenarios, predetermined design margins, also known as uncertainty margins, are applied to the results. These margins are designed to address uncertainties in parameters during the initial analysis.
The most common practice in thermal design is to adhere to predefined margins, determined from statistical analyses of temperature predictions versus actual measurements from several spacecraft once in orbit. These margins, reflecting the deviation between predicted and actual temperatures, have not seen significant updates since the last revision by NASA in 1994 [25], though more recent studies [26,27,28,29,30] suggest the need for mission-specific margins, considering their unique nature.
The application of uncertainty analysis in thermal control provides an alternative to the traditional fixed-margin approach: the results of uncertainty assessments are used to establish the design margins. This methodology is broadly categorized into two types of methods: One-At-a-Time (OAT) and stochastic methods. OAT approaches, exemplified by Statistical Error Analysis (SEA), offer simplicity and speed at the cost of model simplification. In contrast, stochastic methods such as Monte Carlo Simulation (MCS), which involve hundreds of runs of the complete thermal model, yield results closer to the actual thermal performance, but at the cost of longer computation times and little insight into how each parameter individually contributes to the overall uncertainty.
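The two families of methods can be contrasted on a deliberately simple radiative-equilibrium model; the following Python/NumPy sketch uses illustrative parameter values and 1-sigma uncertainties that are not drawn from any mission data. SEA-style OAT perturbations are combined as a root sum of squares, while the Monte Carlo estimate re-solves the model for jointly sampled parameters:

```python
import numpy as np

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(Q, eps, A):
    """Radiative equilibrium temperature of an isothermal plate (toy model)."""
    return (Q / (eps * SIGMA * A)) ** 0.25

# Nominal parameters and 1-sigma uncertainties (illustrative values).
nominal = {"Q": 50.0, "eps": 0.85, "A": 0.5}   # dissipation [W], emissivity [-], area [m^2]
sigma_p = {"Q": 5.0, "eps": 0.04, "A": 0.01}

T_nom = equilibrium_temp(**nominal)

# SEA / OAT: perturb one parameter at a time by 1 sigma and combine the
# resulting temperature deviations as a root sum of squares.
deviations = []
for name in nominal:
    p = dict(nominal)
    p[name] += sigma_p[name]
    deviations.append(equilibrium_temp(**p) - T_nom)
T_unc_sea = np.sqrt(sum(d ** 2 for d in deviations))

# Monte Carlo: sample all parameters jointly and solve the model each time.
rng = np.random.default_rng(42)
n = 50_000
T_mc = equilibrium_temp(
    rng.normal(nominal["Q"], sigma_p["Q"], n),
    rng.normal(nominal["eps"], sigma_p["eps"], n),
    rng.normal(nominal["A"], sigma_p["A"], n),
)
T_unc_mc = T_mc.std()
```

For this nearly linear case the two estimates agree to within a few tenths of a kelvin, but the OAT list of per-parameter deviations comes essentially for free, whereas the Monte Carlo run captures joint and nonlinear effects at the cost of re-solving the model for every sample.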
In terms of data availability for uncertainty parameters in thermal systems, there is a significant lack of information. The probability distributions associated with these parameters, often assumed to be Gaussian, may vary significantly across different project phases. In the initial stages, distributions carrying higher uncertainty, such as uniform or triangular distributions, are more commonly assigned because parameter values are not yet defined [31,32]. However, as the project progresses and specific choices are made (e.g., selecting a particular type of paint for a radiator), the distribution may shift towards a normal distribution driven by random variations between samples.
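This phase-dependent choice of distribution can be made concrete with a short Python/NumPy sketch; the emissivity bounds and scatter below are illustrative placeholders, not values for any specific coating:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Early phase: the radiator coating is not yet selected, so only bounds
# (or bounds plus a most-likely value) can be justified for its emissivity.
eps_early_uniform = rng.uniform(0.80, 0.92, n)
eps_early_triang = rng.triangular(0.80, 0.88, 0.92, n)  # (left, mode, right)

# Later phase: a specific paint is chosen; the remaining sample-to-sample
# scatter is modelled as normal around the measured mean.
eps_late_normal = rng.normal(0.88, 0.01, n)
```

As expected, the spread shrinks as knowledge accumulates: the uniform distribution is the widest, the triangular distribution narrows once a most-likely value can be stated, and the normal distribution assigned after coating selection is the tightest.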