Automatic Modulation Classification Based on CNN-Transformer Graph Neural Network
Abstract
1. Introduction
- We preprocess the IQ signal with a sliding window and reorganize the resulting subsequences into a signal matrix. We propose the CTN, which dynamically maps the subsequence matrix into a graph structure, mining the temporal and spatial relationships in the signal data to better characterize modulation schemes;
- We integrate the CTN and GNN for end-to-end training and optimization, so the network automatically learns the most reasonable graph representation of the signals; the network structure can also be flexibly extended to other classification tasks;
- We validated our method using the publicly available datasets RML2016.10a and RML2016.10b. Comparisons with CNN, RNN, and other GNN-based models demonstrated that our CTGNet achieved state-of-the-art performance.
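The sliding-window preprocessing in the first contribution can be sketched as follows. This is a hedged illustration, not the authors' code: the window size (16) and step (8) are assumptions chosen so that a length-128 IQ sequence yields the 15 × 16 subsequence matrix reported in the network architecture table later in the paper.

```python
import numpy as np

def sliding_window(seq, win=16, step=8):
    """Split a 1-D sequence into overlapping subsequences, stacked as matrix rows."""
    n = (len(seq) - win) // step + 1
    return np.stack([seq[i * step : i * step + win] for i in range(n)])

# A toy IQ sample: 2 channels (I and Q) of 128 points each, as in RML2016.10a/b.
iq = np.random.randn(2, 128)
I_mat = sliding_window(iq[0])  # shape (15, 16)
Q_mat = sliding_window(iq[1])  # shape (15, 16)
```

Each row of `I_mat`/`Q_mat` is one subsequence; the 15 rows become the candidate graph nodes that the CTN later connects.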
2. Related Works
2.1. Transformer in Classification
2.2. GNN in Classification
2.3. Deep-Learning-Based AMC
3. The Proposed Method
3.1. IQ Data Preprocessing
3.2. Mapping the Processed Data to Graph
3.3. Classification of Graphs Using GNN
4. Dataset and Experiments
4.1. Dataset and Parameter Setting
4.2. Baseline Methods
4.3. Evaluation Metrics
- True Positive (TP): a positive sample predicted as positive
- False Positive (FP): a negative sample predicted as positive
- False Negative (FN): a positive sample predicted as negative
- True Negative (TN): a negative sample predicted as negative
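From these four counts, the metrics reported in the result tables (accuracy, recall, and F1 score) follow directly. A minimal sketch for the binary case; multi-class scores are obtained per class and then averaged (the toy counts below are illustrative):

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 score from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Toy confusion-matrix counts for a balanced binary problem.
acc, prec, rec, f1 = metrics(tp=40, fp=10, fn=10, tn=40)  # all four equal 0.8
```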
4.4. Results and Discussion
4.4.1. Ablation Study
4.4.2. Experiments on Different Sliding Window Sizes and Step Sizes
4.4.3. Comparisons with Other Baseline Methods
4.4.4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Operation | Output Dimensions |
|---|---|
| Input | 128 × 1 |
| Data preprocessing | 15 × 16 |
| Conv1D (kernel size 5) | 15 × 96 |
| Position encoding | 15 × 96 |
| TransformerEncoderLayer (num_layers 3, 8, 96, 96) | 15 × 96 |
| Multi-head attention + average | adjacency matrix: 15 × 15 |
| Concatenate subsequences of I and Q | node features: 15 × 32 |
| GraphSAGE + BN + ReLU (×6) | 15 × 96 each |
| Concatenate the outputs of the 6 layers above | 15 × 576 |
| DMoNPool | node features: 15 × 576; adjacency matrix: 15 × 15 |
| GraphSAGE + BN + ReLU (×6) | 15 × 96 each |
| Concatenate the outputs of the 6 layers above | 15 × 576 |
| DMoNPool | node features: 8 × 576; adjacency matrix: 8 × 8 |
| GraphSAGE + BN + ReLU (×6) | 8 × 96 each |
| Concatenate the outputs of the 6 layers above | 8 × 576 |
| Average | 576 |
| FC + ReLU | 288 |
| FC + ReLU | 96 |
| Concatenate the two feature vectors | 192 |
| FC + ReLU | 96 |
| FC + Softmax | number of modulation classes |
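The "Multi-Head Attention + Average" row above, which turns the 15 subsequence embeddings into a 15 × 15 adjacency matrix, can be sketched as below. This is a hedged NumPy illustration under stated assumptions: the random projection weights, the head dimension (12 = 96/8), and the row-wise softmax are illustrative choices, not the trained CTN parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_adjacency(x, wq, wk):
    """Average per-head attention maps into a single adjacency matrix.
    x: (nodes, dim); wq, wk: (heads, dim, d_head)."""
    q = np.einsum('nd,hde->hne', x, wq)            # (heads, nodes, d_head)
    k = np.einsum('nd,hde->hne', x, wk)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1).mean(axis=0)   # (nodes, nodes)

rng = np.random.default_rng(0)
x = rng.normal(size=(15, 96))                      # 15 subsequence embeddings
adj = attention_adjacency(x,
                          rng.normal(size=(8, 96, 12)),
                          rng.normal(size=(8, 96, 12)))
# adj is 15 x 15; each row sums to 1 after softmax averaging over the 8 heads
```

Averaging over heads yields a dense weighted adjacency; in practice the network can learn to concentrate attention on the node pairs whose temporal relationship is informative for the modulation scheme.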
| Dataset | SNR Range | Modulation Types |
|---|---|---|
| RML2016.10a | −20 dB to 18 dB in 2 dB steps | BPSK, QPSK, 8PSK, 16QAM, 64QAM, GFSK, CPFSK, PAM4, WBFM, AM-SSB, AM-DSB |
| RML2016.10b | −20 dB to 18 dB in 2 dB steps | BPSK, QPSK, 8PSK, 16QAM, 64QAM, GFSK, CPFSK, PAM4, WBFM, AM-DSB |
| Dataset | Model | Accuracy | F1 Score | Recall | Model Size (MB) | Training Time (s) |
|---|---|---|---|---|---|---|
| RML2016.10a | w/ preprocessing | 0.6207 | 0.6439 | 0.6207 | 4.62 | 0.0440 |
| RML2016.10a | w/o preprocessing | 0.6168 | 0.6378 | 0.6168 | 9.55 | 0.2446 |
| RML2016.10b | w/ preprocessing | 0.6433 | 0.6463 | 0.6433 | 4.62 | 0.0444 |
| RML2016.10b | w/o preprocessing | 0.6418 | 0.6393 | 0.6418 | 9.55 | 0.2436 |
Comparison with baseline methods on RML2016.10a:

| Model | Accuracy | F1 Score | Recall | Model Size (MB) | Training Time (s) |
|---|---|---|---|---|---|
| AvgNet | 0.6112 | 0.6309 | 0.6112 | 3.63 | 0.0897 |
| MCLDNN | 0.6107 | 0.6370 | 0.6107 | 1.94 | 0.0219 |
| Resnet1d | 0.5891 | 0.6037 | 0.5891 | 0.91 | 0.0072 |
| VGG | 0.5732 | 0.5867 | 0.5732 | 47.32 | 0.0730 |
| CNN2d | 0.5286 | 0.5344 | 0.5286 | 10.91 | 0.0082 |
| GRU | 0.5720 | 0.5870 | 0.5720 | 2.52 | 0.0085 |
| LSTM | 0.5614 | 0.5773 | 0.5614 | 0.97 | 0.0111 |
| GAF | 0.5309 | 0.5571 | 0.5309 | 63.15 | 0.1547 |
| ConsCNN | 0.5437 | 0.5449 | 0.5437 | 69.22 | 0.3006 |
| CTGNet | 0.6207 | 0.6439 | 0.6207 | 4.62 | 0.0440 |
Comparison with baseline methods on RML2016.10b:

| Model | Accuracy | F1 Score | Recall | Model Size (MB) | Training Time (s) |
|---|---|---|---|---|---|
| AvgNet | 0.6363 | 0.6406 | 0.6363 | 3.63 | 0.0858 |
| MCLDNN | 0.6394 | 0.6412 | 0.6394 | 1.94 | 0.0219 |
| Resnet1d | 0.6315 | 0.6298 | 0.6315 | 0.91 | 0.0078 |
| VGG | 0.6365 | 0.6350 | 0.6365 | 47.32 | 0.0709 |
| CNN2d | 0.5564 | 0.5638 | 0.5564 | 10.91 | 0.0080 |
| GRU | 0.6307 | 0.6298 | 0.6307 | 2.52 | 0.0088 |
| LSTM | 0.6329 | 0.6332 | 0.6329 | 0.94 | 0.0114 |
| GAF | 0.5598 | 0.5766 | 0.5598 | 63.15 | 0.1685 |
| ConsCNN | 0.5629 | 0.5641 | 0.5629 | 69.22 | 0.2998 |
| CTGNet | 0.6433 | 0.6463 | 0.6433 | 4.62 | 0.0444 |
Wang, D.; Lin, M.; Zhang, X.; Huang, Y.; Zhu, Y. Automatic Modulation Classification Based on CNN-Transformer Graph Neural Network. Sensors 2023, 23, 7281. https://doi.org/10.3390/s23167281