A method for Automatic Modulation Classification using the Mamba structure
A project employing the Selective State Space Model (Mamba) for Automatic Modulation Classification (AMC) in scenarios with extended signal lengths.
An increased sequence length complicates the learning process and diminishes accuracy, while also escalating memory consumption and increasing training/inference time.
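To make the memory scaling concrete, a rough activation count (illustrative only, not taken from the paper): a self-attention backbone materializes an L x L attention map per head, so its footprint grows quadratically with signal length, while a state-space backbone only stores O(L) outputs plus a fixed-size hidden state. The functions and default sizes below are hypothetical placeholders for illustration:

```python
def attention_activations(seq_len, n_heads=8):
    """Rough activation count for one self-attention layer:
    one L x L attention map per head (quadratic in sequence length)."""
    return n_heads * seq_len * seq_len

def ssm_activations(seq_len, d_model=256):
    """Rough activation count for one state-space layer:
    only the per-step outputs are stored (linear in sequence length);
    the recurrent state itself is constant-size."""
    return seq_len * d_model

# The gap widens rapidly as the signal length grows.
for L in (128, 1024, 8192):
    print(L, attention_activations(L), ssm_activations(L))
```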
If our code helped your research, please consider citing the corresponding publications:
@article{zhang2024MAMC,
title={MAMCA -- Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
author={Yezhuo Zhang and Zinan Zhou and Yichao Cao and Guangyu Li and Xuanpeng Li},
year={2024},
journal={arXiv preprint arXiv:2405.11263},
}
@article{10705364,
author={Zhang, Yezhuo and Zhou, Zinan and Cao, Yichao and Li, Guangyu and Li, Xuanpeng},
journal={IEEE Communications Letters},
title={MAMCA -- Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
year={2024},
volume={},
number={},
pages={1-1},
doi={10.1109/LCOMM.2024.3474519}
}
We utilize a denoising unit for better accuracy under noise interference, while using Mamba as the backbone for low GPU memory occupancy and short training/inference time.
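The Mamba backbone mentioned above rests on a selective state-space recurrence: the state-update parameters vary with the input at every time step, yet the scan remains linear in sequence length. The NumPy function below is a toy, non-vectorized sketch of that recurrence under simplified shapes and discretization; it is not the authors' implementation or the optimized CUDA kernel from the `mamba` repository:

```python
import numpy as np

def selective_scan(x, dt, A, B, C):
    """Toy selective state-space scan (single channel, diagonal A).
    The per-step parameters (dt, B, C) depend on the input, which is
    what makes the model "selective". Shapes (illustrative):
      x:  (L,)    input sequence
      dt: (L,)    per-step discretization step
      A:  (N,)    diagonal continuous-time state matrix
      B:  (L, N)  input projection per step
      C:  (L, N)  output projection per step
    """
    L, N = B.shape
    h = np.zeros(N)           # fixed-size hidden state: memory is O(N), not O(L)
    y = np.empty(L)
    for t in range(L):
        Abar = np.exp(dt[t] * A)      # zero-order-hold discretization of A
        Bbar = dt[t] * B[t]           # simple Euler discretization of B
        h = Abar * h + Bbar * x[t]    # linear recurrence, O(1) work per step
        y[t] = C[t] @ h
    return y
```

With A = 0 and unit dt, B, C, the recurrence degenerates to a running sum of the input, which is a convenient sanity check.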
For related AMC works, as well as source code:
- Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges
For the denoising method employed in our work, as well as source code:
- IEEE Transactions on Industrial Informatics 2020
- Deep-Residual-Shrinkage-Networks-for-intelligent-fault-diagnosis-DRSN
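The core denoising operation in deep residual shrinkage networks is soft thresholding: components whose magnitude falls below a threshold are treated as noise and zeroed, while larger components are shrunk toward zero. A minimal sketch (in a DRSN the threshold is learned per channel by a small attention sub-network; here it is a fixed constant for illustration):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft thresholding: zero out values with |x| <= tau (noise),
    shrink the rest toward zero by tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Small-magnitude entries are suppressed, large ones are shrunk.
print(soft_threshold(np.array([-1.5, -0.2, 0.1, 0.8]), 0.3))
```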
For the Mamba method employed in our work, as well as source code:
- Mamba: Linear-Time Sequence Modeling with Selective State Spaces, arXiv preprint arXiv:2312.00752
- mamba
Install the dependencies:
pip install -r requirements.txt
Then cd into code/script and run:
bash RML2016.10a.sh
If you have any problems with our code or any suggestions, including discussion on SEI, please feel free to contact:
- Yezhuo Zhang ([email protected] | [email protected])
- Zinan Zhou ([email protected])
- Xuanpeng Li ([email protected])