A Systematic Literature Review of Deep Learning Approaches for Sketch-Based Image Retrieval: Datasets, Metrics, and Future Directions
Published in: IEEE Access, Vol. 12, pp. 14847–14869
Main Authors: Fan Yang, Nor Azman Ismail, Yee Yong Pang, Victor R. Kebande, Arafat Al-Dhaqm, Tieng Wei Koh
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
Abstract: Sketch-based image retrieval (SBIR) utilizes sketches to search for images containing similar objects or scenes. Due to the proliferation of touch-screen devices, sketching has become more accessible and has therefore received increasing attention. Deep learning has emerged as a potential tool for SBIR, allowing models to automatically extract image features and learn from large amounts of data. To the best of our knowledge, there is currently no systematic literature review (SLR) of SBIR with deep learning. Therefore, the aim of this review is to incorporate related works into a systematic study, highlighting the main contributions of individual researchers over the years, with a focus on past, present, and future trends. To achieve the purpose of this study, 90 studies from 2016 to June 2023 in 4 databases were collected and analyzed using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. The specific models, datasets, evaluation metrics, and applications of deep learning in SBIR are discussed in detail. This study found that Convolutional Neural Networks (CNN) and Generative Adversarial Networks (GAN) are the most widely used deep learning methods for SBIR. A commonly used dataset is Sketchy, especially in the recent zero-shot sketch-based image retrieval (ZS-SBIR) task. The results show that Mean Average Precision (mAP) is the most commonly used metric for quantitative evaluation of SBIR. Finally, we provide some future directions and guidance for researchers based on the results of this review.
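The abstract reports Mean Average Precision (mAP) as the most common quantitative metric for SBIR. As a minimal illustrative sketch (not code from the reviewed paper), mAP over ranked retrieval lists with binary relevance flags can be computed as follows:

```python
def average_precision(ranked_relevance):
    """AP for one query sketch.

    ranked_relevance: list of 0/1 flags ordered by the system's ranking,
    where 1 marks a relevant gallery image at that rank.
    """
    hits, precision_sum = 0, 0.0
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precision_sum += hits / rank  # precision at each relevant rank
    return precision_sum / hits if hits else 0.0


def mean_average_precision(all_queries):
    """mAP: mean of per-query AP over all query sketches."""
    return sum(average_precision(q) for q in all_queries) / len(all_queries)
```

For example, a query whose relevant images appear at ranks 1 and 3 scores AP = (1/1 + 2/3) / 2 ≈ 0.83; averaging such AP values over every query sketch gives the mAP figure typically reported in SBIR benchmarks.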
Authors:
– Fan Yang, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, Johor, Malaysia. Email: fyang@graduate.utm.my. ORCID: 0009-0003-5446-2534
– Nor Azman Ismail, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, Johor, Malaysia. ORCID: 0000-0003-1785-008X
– Yee Yong Pang, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, Johor, Malaysia
– Victor R. Kebande, Department of Computer Science (DIDA), Blekinge Institute of Technology, Karlskrona, Sweden. Email: victor.kebande@bth.se. ORCID: 0000-0003-4071-4596
– Arafat Al-Dhaqm, Computer and Information Sciences Department, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Perak, Malaysia
– Tieng Wei Koh, Computer and Information Sciences Department, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, Perak, Malaysia. ORCID: 0009-0001-1938-1275
CODEN: IAECCG
Copyright: © The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
DOI: 10.1109/ACCESS.2024.3357939
Discipline: Engineering
EISSN: 2169-3536
Genre: Original research
Funding: Blekinge Institute of Technology, Sweden, through the Grant Funded Research
ISSN: 2169-3536
Open Access: Yes
Peer Reviewed: Yes
Open Access Link: https://ieeexplore.ieee.org/document/10413476
Subject Terms: Artificial neural networks; Datasets; Deep learning; Feature extraction; Generative adversarial networks; Image processing; Image retrieval; Literature reviews; Machine learning; Measurement; Meta-analysis; Neural networks; Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA); Sketch-based image retrieval (SBIR); Sketches; Surveys; Systematic literature review (SLR); Touch screens