Journal: Scientific Reports, Nature (Springer Nature)

Volume:

Abstract:
The growing reliance on satellites highlights the need to understand, quantitatively, how space weather impacts the reliability and efficiency of power subsystems. While it is well established that space weather disturbances can trigger anomalies in satellite operations, most existing studies lack integrated, data-driven approaches capable of capturing the complex, nonlinear interactions between space weather parameters and satellite health telemetry. This study addresses this gap by introducing a novel four-stage, data-driven workflow to examine the relationship between key space weather indicators (proton flux, the AL index, galactic cosmic rays (GCR), and solar wind density) and the EgyptSat-1 power-subsystem telemetry: NN1_Voltage, TBS1_Current, and the temperatures T1BS and T3BS. The workflow comprises: (1) data preprocessing; (2) a two-stage nonlinear feature selection step to handle the high dimensionality and complexity of the data, in which an unsupervised Restricted Boltzmann Machine (RBM) first extracts a compact, structurally stable feature subset, followed by a supervised mutual information (MI) validation step to ensure maximum predictive relevance to the satellite target parameters (T1BS and T3BS); (3) six machine learning models, namely Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) networks, Random Forest, AdaBoost, Gradient Boosting, and a Voting Regressor, to capture dynamic system behaviours; and (4) anomaly detection and validation by correlating prediction residuals with space weather disturbances, using STL decomposition and Z-scores to detect GCR and P10 anomalies, and coincidence rate analysis to assess temporal alignment. The Random Forest (RF) model exhibited strong predictive performance.
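The two-stage feature selection and subsequent model fitting described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual pipeline: the array shapes, RBM hyperparameters, number of retained components, and the simulated target are all assumptions, and Random Forest stands in for the six candidate models.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.feature_selection import mutual_info_regression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-ins for space-weather inputs (proton flux, AL index, GCR, ...)
# scaled to [0, 1], and a hypothetical telemetry target.
X = rng.random((500, 8))
y = 2 * X[:, 0] + X[:, 3] + rng.normal(0, 0.1, 500)

# Stage 1 (unsupervised): RBM compresses the inputs to a compact hidden representation.
rbm = BernoulliRBM(n_components=4, learning_rate=0.05, n_iter=20, random_state=0)
H = rbm.fit_transform(X)  # hidden-unit activations, shape (500, 4)

# Stage 2 (supervised): mutual information ranks hidden features by relevance to the target.
mi = mutual_info_regression(H, y, random_state=0)
keep = np.argsort(mi)[::-1][:2]  # retain the most informative components

# Fit one candidate model (Random Forest shown) on the selected features;
# residuals feed the later anomaly-detection stage.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(H[:, keep], y)
residuals = y - rf.predict(H[:, keep])
```

The key design point mirrored here is that the RBM step is target-agnostic (structural compression), while the MI step reintroduces supervision before any model is trained.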
For NN1_Voltage, the mean squared error (MSE) was 0.00147 (95% CI: 0.00120–0.00184), the root mean squared error (RMSE) was 0.038 (95% CI: 0.0346–0.0428), the mean absolute error (MAE) was 0.028 (95% CI: 0.026–0.031), and the mean absolute percentage error (MAPE) was 0.09% (95% CI: 0.08–0.10%). For TBS1_Current, RF achieved an MSE of 0.0405 (95% CI: 0.0319–0.0489), RMSE of 0.201 (95% CI: 0.179–0.221), MAE of 0.153 (95% CI: 0.136–0.169), and MAPE of 2.4% (95% CI: 2.1–2.6%). Furthermore, analysis of detected anomalies revealed temporal coincidence rates of 31% with GCR disturbances and 27% with P10 proton events. Statistical validation using chi-squared and Fisher’s exact tests yielded significant p-values (e.g., p = 2.83 × 10⁻³ for GCR; p = 5.63 × 10⁻⁷ for P10), suggesting a potential relationship worth further investigation. This analysis is particularly relevant for assessing unexplained satellite failures such as the loss of EgyptSat-1 and contributes to improved resilience and monitoring strategies for future missions. While the proposed workflow shows strong predictive performance, its validation is currently limited to a single satellite dataset, highlighting the need for broader cross-mission testing. This study not only enhances our understanding of space weather impacts on satellite power systems but also demonstrates the potential of machine learning in improving anomaly detection and resilience of satellites operating in challenging space environments.