Modulation Format Recognition and OSNR Estimation Using CNN-Based Deep Learning
Published in: | IEEE Photonics Technology Letters, Vol. 29, No. 19, pp. 1667–1670 |
---|---|
Main Authors: | Danshi Wang, Min Zhang, Ze Li, Jin Li, Meixia Fu, Yue Cui, Xue Chen |
Format: | Journal Article |
Language: | English |
Published: | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-10-2017 |
Abstract | An intelligent eye-diagram analyzer is proposed to implement both modulation format recognition (MFR) and optical signal-to-noise ratio (OSNR) estimation using a convolutional neural network (CNN)-based deep learning technique. With its capacity for feature extraction and self-learning, the CNN can process eye diagrams in their raw form (the pixel values of an image) from an image-processing perspective, without knowledge of other eye-diagram parameters or of the original bit information. Eye-diagram images of four commonly used modulation formats over a wide OSNR range (10–25 dB) are obtained from an eye-diagram generation module in an oscilloscope combined with the simulation system. Compared with four other machine learning algorithms (decision trees, k-nearest neighbors, back-propagation artificial neural network, and support vector machine), the CNN achieves the highest accuracy: the accuracies of both OSNR estimation and MFR reach 100%. The proposed technique has the potential to be embedded in test instruments to perform intelligent signal analysis or to be applied for optical performance monitoring. |
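The record does not include the network details, so the following is only a minimal PyTorch sketch of the kind of classifier the abstract describes: raw eye-diagram pixels fed to a small CNN that is trained jointly over modulation format and OSNR value. The 28×28 grayscale input size, the two-convolution-layer architecture, and the framing of OSNR estimation as classification over 1-dB steps are illustrative assumptions, not the authors' reported design.

```python
# Minimal sketch (assumption, not the paper's reported architecture):
# a small CNN that maps raw eye-diagram pixels to a joint
# (modulation format, OSNR value) class label.
import torch
import torch.nn as nn

NUM_FORMATS = 4       # four modulation formats, as in the abstract
NUM_OSNR_STEPS = 16   # assumed 1-dB steps over the 10-25 dB range

class EyeDiagramCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_FORMATS * NUM_OSNR_STEPS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=5, padding=2),   # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 32x7x7
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 7 * 7, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),                   # joint format/OSNR label
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EyeDiagramCNN()
    dummy = torch.randn(8, 1, 28, 28)   # a batch of 8 fake grayscale eye diagrams
    logits = model(dummy)
    print(logits.shape)                  # torch.Size([8, 64])
```

Training such a model with a standard cross-entropy loss on labeled eye-diagram images would mirror the classification setup that the abstract benchmarks against decision trees, k-nearest neighbors, BP-ANN, and SVM.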
Author | Li, Ze; Wang, Danshi; Li, Jin; Fu, Meixia; Cui, Yue; Zhang, Min; Chen, Xue |
Author_xml | 1. Danshi Wang (ORCID: 0000-0001-9815-4013; email: danshi_wang@bupt.edu.cn); 2. Min Zhang; 3. Ze Li; 4. Jin Li; 5. Meixia Fu; 6. Yue Cui; 7. Xue Chen. All authors: State Key Laboratory of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Beijing, China |
CODEN | IPTLEL |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2017 |
DOI | 10.1109/LPT.2017.2742553 |
Discipline | Applied Sciences Engineering Physics |
EISSN | 1941-0174 |
EndPage | 1670 |
Genre | orig-research |
GrantInformation_xml | National Natural Science Foundation of China (NSFC), Grant 61372119 (funder ID: 10.13039/501100001809) |
ISSN | 1041-1135 |
Issue | 19 |
Language | English |
ORCID | 0000-0001-9815-4013 |
PQID | 1938670746 |
PQPubID | 85439 |
PageCount | 4 |
PublicationDate | 2017-10-01 |
PublicationPlace | New York |
PublicationTitle | IEEE photonics technology letters |
PublicationTitleAbbrev | LPT |
PublicationYear | 2017 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1667 |
SubjectTerms | Artificial neural networks; Back propagation; Back propagation networks; Computer simulation; Convolution; convolutional neural network (CNN); Deep learning; eye diagram; Feature extraction; Format; Image processing; Kernel; Machine learning; Modulation; modulation format recognition (MFR); Neural networks; Optical communication; Optical imaging; Optical noise; optical performance monitoring (OPM); optical signal-to-noise ratio (OSNR); Recognition; Signal analysis; Signal to noise ratio |
Title | Modulation Format Recognition and OSNR Estimation Using CNN-Based Deep Learning |
URI | https://ieeexplore.ieee.org/document/8013796 https://www.proquest.com/docview/1938670746 |
Volume | 29 |