LTE: Lightweight Transformer Encoder for Orbit Prediction
Abstract
1. Introduction
2. Materials and Methods
2.1. Dataset
2.2. Data Preprocessing
2.3. Prediction Model
2.4. Experiments
- Support Vector Regressor (SVR) [28]: A machine learning model that applies the support vector machine (SVM) framework to regression, fitting a function that keeps prediction errors within a predefined tolerance.
- eXtreme Gradient Boosting Regressor (XGBR) [29]: An ensemble learning method based on gradient boosting, which builds a series of weak learners (decision trees) to improve predictive performance.
- Long Short-Term Memory (LSTM) [30]: A type of recurrent neural network (RNN) designed to model sequences and time series data by effectively learning long-term dependencies and preventing the vanishing gradient problem.
- Variational AutoEncoder (VAE) [31]: A generative model used for unsupervised learning that learns a probabilistic representation of the data by encoding them into a lower-dimensional space and then decoding them back into their original format.
- Gated Recurrent Unit (GRU) [32]: Another RNN variant similar to LSTM, but with a simpler architecture, using fewer gates to capture dependencies in sequential data while also addressing the vanishing gradient problem.
- Bi-directional Long Short-Term Memory (Bi-LSTM) [33]: An extension of LSTM that processes input sequences in both forward and backward directions, improving its ability to capture context from both past and future information in sequential data.
- Transformer [23]: A neural network architecture based on attention mechanisms that processes sequential data without relying on recurrence, making it highly effective for tasks such as machine translation and time series prediction.
- Transformer (encoder only): A model that keeps only the encoder of the Transformer architecture. The input first receives PE, then passes through Multi-Head Attention with a residual connection and LN, and finally through a Feed-Forward layer (a minimal code sketch of this encoder-only structure follows this list).
- Transformer (decoder only): A model that keeps only the decoder of the Transformer architecture. The input first receives PE, then passes through Multi-Head Attention with a residual connection and LN, through a second Multi-Head Attention layer with another residual connection and LN, and finally through a Feed-Forward layer.
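To make the encoder-only structure concrete, the following PyTorch sketch assembles one such block (input embedding, PE, Multi-Head Attention with a residual connection and LN, a Feed-Forward sub-layer, and a linear prediction head) for a window of six orbital elements. All dimensions, the window length, and the usage example are illustrative assumptions rather than the paper's exact configuration; the LTE model itself additionally removes PE and LN, as examined in Section 3.3.

```python
import math
import torch
import torch.nn as nn

class SinusoidalPE(nn.Module):
    """Standard sinusoidal positional encoding (Vaswani et al., 2017)."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1).float()
        div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x):                    # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]

class EncoderOnlyOrbitPredictor(nn.Module):
    """One encoder block: embed -> PE -> MHA (+residual, LN) -> FFN (+residual, LN) -> head.
    Hyperparameters below are illustrative assumptions, not the authors' settings."""
    def __init__(self, n_features: int = 6, d_model: int = 24, n_heads: int = 3, d_ff: int = 48):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.pe = SinusoidalPE(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln1 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.ln2 = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, n_features)   # next set of six orbital elements

    def forward(self, x):                     # x: (batch, window, n_features)
        h = self.pe(self.embed(x))
        a, _ = self.attn(h, h, h)             # multi-head self-attention
        h = self.ln1(h + a)                   # residual connection + layer normalization
        h = self.ln2(h + self.ffn(h))         # feed-forward sub-layer
        return self.head(h[:, -1])            # predict from the last time step

# Usage: predict the next state from a 10-step window of 6 Keplerian elements.
model = EncoderOnlyOrbitPredictor()
window = torch.randn(8, 10, 6)                # (batch, window length, elements)
pred = model(window)                          # shape: (8, 6)
```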
3. Results
3.1. Prediction Performance
3.2. Efficiency Analysis: Execution Time and Parameters
3.3. Efficiency Analysis: Removing Layer Normalization and Positional Encoding
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Space Exploration Technologies Corp. SpaceX. 2002–2023. Available online: https://www.spacex.com (accessed on 12 July 2024).
- Blue Origin Enterprises, L.P. Blue Origin. 2007–2023. Available online: https://www.blueorigin.com/ (accessed on 12 July 2024).
- Rocket Lab USA, Inc. Rocket Lab. 2006–2023. Available online: https://www.rocketlabusa.com (accessed on 12 July 2024).
- Eutelsat Oneweb. OneWeb. 2012–2022. Available online: https://www.oneweb.world (accessed on 12 July 2024).
- Allen, B. SpaceX Launches Starlink Satellites. 2024. Available online: https://earthsky.org/spaceflight/spacex-starlink-launches-june-2024/ (accessed on 13 July 2024).
- Kelecy, T.; Jah, M. Analysis of orbital prediction accuracy improvements using high fidelity physical solar radiation pressure models for tracking high area-to-mass ratio objects. In Proceedings of the Fifth European Conference on Space Debris, Darmstadt, Germany, 30 March–2 April 2009; Volume 5. [Google Scholar]
- Puente, C.; Sáenz-Nuño, M.A.; Villa-Monte, A.; Olivas, J.A. Satellite Orbit Prediction Using Big Data and Soft Computing Techniques to Avoid Space Collisions. Mathematics 2021, 9, 2040. [Google Scholar] [CrossRef]
- Peng, H.; Bai, X. Gaussian Processes for improving orbit prediction accuracy. Acta Astronaut. 2019, 161, 44–56. [Google Scholar] [CrossRef]
- Peng, H.; Bai, X. Exploring Capability of Support Vector Machine for Improving Satellite Orbit Prediction Accuracy. J. Aerosp. Inf. Syst. 2018, 15, 366–381. [Google Scholar] [CrossRef]
- Proença, P.F.; Gao, Y. Deep Learning for Spacecraft Pose Estimation from Photorealistic Rendering. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 6007–6013. [Google Scholar] [CrossRef]
- Saxena, A.; Baraha, S.; Sahoo, A.K. Improved Orbit Prediction using Gradient Boost Regression Tree. In Proceedings of the 2023 International Conference on Microwave, Optical, and Communication Engineering (ICMOCE), Bhubaneswar, India, 26–28 May 2023; pp. 1–4. [Google Scholar] [CrossRef]
- Peng, H.; Bai, X. Improving orbit prediction accuracy through supervised machine learning. Adv. Space Res. 2018, 61, 2628–2646. [Google Scholar] [CrossRef]
- Peng, H.; Bai, X. Artificial Neural Network–Based Machine Learning Approach to Improve Orbit Prediction Accuracy. J. Spacecr. Rocket. 2018, 55, 1248–1260. [Google Scholar] [CrossRef]
- Wikipedia. Two-Line Element set. 2023. Available online: https://en.wikipedia.org/wiki/Two-line_element_set (accessed on 12 July 2024).
- Qu, Z.; Wei, C. A PSO-LSTM-based Method for Spatial Target Orbit Phase Prediction. In Proceedings of the 2023 4th International Conference on Computer Vision, Image and Deep Learning (CVIDL), Zhuhai, China, 12–14 May 2023; pp. 358–362. [Google Scholar] [CrossRef]
- Chen, Y.; Wang, K. Prediction of Satellite Time Series Data Based on Long Short Term Memory-Autoregressive Integrated Moving Average Model (LSTM-ARIMA). In Proceedings of the 2019 IEEE 4th International Conference on Signal and Image Processing (ICSIP), Wuxi, China, 19–21 July 2019; pp. 308–312. [Google Scholar] [CrossRef]
- Ren, H.; Chen, X.; Guan, B.; Wang, Y.; Liu, T.; Peng, K. Research on Satellite Orbit Prediction Based on Neural Network Algorithm. In Proceedings of the 2019 3rd High Performance Computing and Cluster Technologies Conference, Guangzhou, China, 22–24 June 2019; pp. 267–273. [Google Scholar]
- Salleh, N.; Azmi, N.F.M.; Yuhaniz, S.S. An Adaptation of Deep Learning Technique In Orbit Propagation Model Using Long Short-Term Memory. In Proceedings of the 2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE), Kuala Lumpur, Malaysia, 12–13 June 2021; pp. 1–6. [Google Scholar]
- Osama, A.; Raafat, M.; Darwish, A.; Abdelghafar, S.; Hassanien, A.E. Satellite Orbit Prediction Based on Recurrent Neural Network using Two Line Elements. In Proceedings of the 2022 5th International Conference on Computing and Informatics (ICCI), Cairo, Egypt, 9–10 March 2022; pp. 298–302. [Google Scholar] [CrossRef]
- Shin, Y.; Park, E.J.; Woo, S.S.; Jung, O.; Chung, D. Selective Tensorized Multi-layer LSTM for Orbit Prediction. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, CIKM ’22, Atlanta, GA, USA, 17–21 October 2022; pp. 3495–3504. [Google Scholar] [CrossRef]
- Napoli, C.; De Magistris, G.; Ciancarelli, C.; Corallo, F.; Russo, F.; Nardi, D. Exploiting Wavelet Recurrent Neural Networks for satellite telemetry data modeling, prediction and control. Expert Syst. Appl. 2022, 206, 117831. [Google Scholar] [CrossRef]
- Yang, H.T.; Zhu, J.P.; Zhang, J. The Research of Low Earth Orbit Prediction of Satellite Based on Deep Neural Network. DEStech Trans. Comput. Sci. Eng. 2018. [Google Scholar] [CrossRef] [PubMed]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is All you Need. In Proceedings of the Advances in Neural Information Processing Systems; Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., Garnett, R., Eds.; Curran Associates, Inc.: Long Beach, CA, USA, 2017; Volume 30. [Google Scholar]
- KARI. Korea Aerospace Research Institute. 1989–2023. Available online: https://www.kari.re.kr (accessed on 13 July 2024).
- Wikipedia. Kepler’s Laws of Planetary Motion. 2023. Available online: https://en.wikipedia.org/wiki/Kepler%27s_laws_of_planetary_motion (accessed on 13 July 2024).
- NASA Science. Orbits and Kepler’s Laws. 2023. Available online: https://science.nasa.gov/resource/orbits-and-keplers-laws/ (accessed on 13 July 2024).
- NASA Science. Orbits and Kepler’s Laws. 2024. Available online: https://science.nasa.gov/solar-system/orbits-and-keplers-laws/ (accessed on 13 July 2024).
- Drucker, H.; Burges, C.J.C.; Kaufman, L.; Smola, A.; Vapnik, V. Support Vector Regression Machines. In Proceedings of the Advances in Neural Information Processing Systems; Mozer, M., Jordan, M., Petsche, T., Eds.; MIT Press: Denver, CO, USA, 1996; Volume 9. [Google Scholar]
- Zhai, N.; Yao, P.; Zhou, X. Multivariate Time Series Forecast in Industrial Process Based on XGBoost and GRU. In Proceedings of the 2020 IEEE 9th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 11–13 December 2020; Volume 9, pp. 1397–1400. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef] [PubMed]
- Kingma, D.P.; Welling, M. Auto-Encoding Variational Bayes. In Proceedings of the 2nd International Conference on Learning Representations, ICLR 2014, Banff, AB, Canada, 14–16 April 2014. Conference Track Proceedings. [Google Scholar]
- Cho, K.; van Merriënboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, 25–29 October 2014; pp. 1724–1734. [Google Scholar] [CrossRef]
- Schuster, M.; Paliwal, K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681. [Google Scholar] [CrossRef]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2021, Virtual, 2–9 February 2021; pp. 11106–11115. [Google Scholar]
- Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M4 Competition: Results, findings, conclusion and way forward. Int. J. Forecast. 2018, 34, 802–808. [Google Scholar] [CrossRef]
Element’s Name | Description | Dataset | Range
---|---|---|---
Semi-major axis (S) | The size of the orbit, expressed in kilometers; it indicates how far the satellite is from the Earth. | K3 | 7052.951∼7073.345
 | | K3A | 6892.097∼6915.121
 | | K5 | 6928.670∼6937.687
Eccentricity (E) | A value between 0 and 1 that characterizes how elliptical the orbit is. | K3 | 0.000001∼0.003911
 | | K3A | 0.000002∼0.004223
 | | K5 | 0.000410∼0.003133
Inclination (I) | The angle by which the orbital plane is tilted with respect to the equatorial plane. | K3 | 98.11845∼98.19598
 | | K3A | 97.45451∼97.59076
 | | K5 | 97.59100∼97.63480
RAAN (R) | The angle between the vernal equinox and the ascending node of the satellite’s orbit. | K3 | 0.00034∼359.99997
 | | K3A | 0.00006∼359.99992
 | | K5 | 0.00004∼359.99992
Argument of perigee (A) | The angle, measured in the orbital plane, from the ascending node to the point where the satellite is nearest to the Earth (perigee). | K3 | 0.00016∼359.99936
 | | K3A | 0.00028∼359.99987
 | | K5 | 0.00014∼359.99990
Mean anomaly (M) | The angle that specifies the satellite’s position along its orbit, measured from perigee. | K3 | 0.00079∼359.99899
 | | K3A | 0.00107∼359.99975
 | | K5 | 61.20162∼298.27813
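Because the element ranges in the table above differ by several orders of magnitude (thousands of kilometers for the semi-major axis versus near-zero eccentricity values), some per-element rescaling is needed before feeding the data to the models. The snippet below is a minimal sketch of one common choice, min-max normalization fitted on the training split; the column order and function names are illustrative assumptions, not the paper's actual preprocessing code.

```python
import numpy as np

# Hypothetical column order for the six elements listed in the table above.
ELEMENTS = ["semi_major_axis", "eccentricity", "inclination",
            "raan", "arg_perigee", "mean_anomaly"]

def fit_minmax(train: np.ndarray):
    """Per-element minimum and maximum, computed on the training split only."""
    return train.min(axis=0), train.max(axis=0)

def apply_minmax(x: np.ndarray, lo: np.ndarray, hi: np.ndarray) -> np.ndarray:
    """Scale each element into [0, 1], guarding against zero-range columns."""
    return (x - lo) / np.where(hi > lo, hi - lo, 1.0)

# Usage with a synthetic stand-in (rows = epochs, columns = elements).
train = np.random.rand(1000, len(ELEMENTS))
lo, hi = fit_minmax(train)
scaled = apply_minmax(train, lo, hi)          # values now lie in [0, 1] per element
```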
Prediction error (MSE) on the K3, K3A, and K5 datasets; ✓ marks the lowest error for each dataset.

Model | Heads | K3 | K3A | K5
---|---|---|---|---
SVR | - | 0.0033472 | 0.0015751 | 0.0010871
XGBR | - | 0.0633434 | 0.0576222 | 0.0653850
LSTM | - | 0.0156621 | 0.0038404 | 0.0004618
VAE | - | 0.0713109 | 0.0211886 | 0.0016111
GRU | - | 0.0154912 | 0.0204662 | 0.0101774
Bi-LSTM | - | 0.0086924 | 0.0041601 | 0.0003658
Transformer-h3 | 3 | 0.1071742 | 0.1492044 | 0.1120841
Transformer-h3 (encoder only) | 3 | 0.0000488 | 0.0000214 | 0.0000010
Transformer-h3 (decoder only) | 3 | 0.0009327 | 0.0000934 | 0.0000011
LTE-h3, ours | 3 | ✓0.0000241 | 0.0000107 | ✓0.0000007
Transformer-h6 | 6 | 0.0778209 | 0.1337293 | 0.1125202
Transformer-h6 (encoder only) | 6 | 0.0000538 | 0.0000125 | 0.0000018
Transformer-h6 (decoder only) | 6 | 0.0003631 | 0.0000558 | 0.0000030
LTE-h6, ours | 6 | 0.0000292 | ✓0.0000072 | 0.0000008
Improvement | - | 50.61% | 42.40% | 30.00%
Model | Execution Time (s) | Number of Parameters
---|---|---|
Transformer | 2779 | 2282 |
Transformer (encoder only) | 1863 | 1030 |
Transformer (decoder only) | 1731 | 1174 |
LTE, ours | ✓1093 | ✓1006 |
Improvement (Reduction Rate) | 36.86% | 2.33% |
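For context, the two quantities in the table above can be measured generically: trainable parameters by summing tensor sizes, and execution time with a wall-clock timer. The helper below is a hedged sketch of such a measurement, not the paper's exact protocol (hardware, number of runs, and what the timed interval covers are unspecified here).

```python
import time
import torch
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

def wall_clock(fn, *args, repeats: int = 10) -> float:
    """Average wall-clock seconds per call over several repeats."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return (time.perf_counter() - start) / repeats

# Usage with any candidate model, e.g. a tiny stand-in network:
model = nn.Sequential(nn.Linear(6, 24), nn.ReLU(), nn.Linear(24, 6))
print(count_parameters(model))                  # 318 parameters for this stand-in
print(wall_clock(model, torch.randn(8, 6)))     # seconds per forward pass
```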
Model | Heads | Prediction Error (MSE)
---|---|---
LTE-h3 + PE + LN | 3 | 0.0000499
LTE-h6 + PE + LN | 6 | 0.0000299
LTE-h3 + PE | 3 | 0.1211978
LTE-h6 + PE | 6 | 0.0836058
LTE-h3 + LN | 3 | 0.0737088
LTE-h6 + LN | 6 | 0.0712439
LTE-h3, ours | 3 | ✓0.0000238
LTE-h6, ours | 6 | 0.0000288