Graph convolutional neural networks with node transition probability-based message passing and DropNode regularization
Published in: | Expert Systems with Applications, Vol. 174, p. 114711 |
---|---|
Main Authors: | Do, Tien Huu; Nguyen, Duc Minh; Bekoulis, Giannis; Munteanu, Adrian; Deligiannis, Nikos |
Format: | Journal Article |
Language: | English |
Published: | New York: Elsevier, 15-07-2021 |
Abstract | •A new message passing formulation for graph convolutional neural networks is proposed. •An effective regularization technique to address over-fitting and over-smoothing. •The proposed regularization can be applied to different graph neural network models. •Semi-supervised and fully supervised learning settings are considered. •The proposed method is evaluated via extensive experiments on benchmark datasets.
Graph convolutional neural networks (GCNNs) have received much attention recently, owing to their capability in handling graph-structured data. Many existing GCNNs can be viewed as instances of a neural message passing motif: node features are passed around their neighbors, aggregated, and transformed to produce better node representations. Nevertheless, these methods seldom use node transition probabilities, a measure that has proven useful in exploring graphs. Furthermore, when transition probabilities are used, the transition direction is often improperly considered in the feature aggregation step, resulting in an inefficient weighting scheme. In addition, although a great number of GCNN models of increasing complexity have been introduced, GCNNs often suffer from over-fitting when trained on small graphs. Another issue is over-smoothing, which tends to make node representations indistinguishable. This work presents a new method to improve the message passing process based on node transition probabilities by properly considering the transition direction, leading to a better weighting scheme for node feature aggregation than the existing counterpart. Moreover, we propose a novel regularization method, termed DropNode, to address the over-fitting and over-smoothing issues simultaneously. DropNode randomly discards part of a graph, creating multiple deformed versions of the graph and thereby yielding a data augmentation regularization effect. Additionally, DropNode lessens the connectivity of the graph, mitigating over-smoothing in deep GCNNs. Extensive experiments on eight benchmark datasets for node and graph classification tasks demonstrate the effectiveness of the proposed methods in comparison with the state of the art. |
---|---|
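The abstract contrasts two directions in which node transition probabilities can enter feature aggregation. The following is a minimal NumPy sketch of that idea, not the paper's actual formulation: with the row-normalized random-walk matrix P = D⁻¹A, entry P[i, j] is the probability of stepping from node i to neighbor j, so weighting neighbor j of node i by P[j, i] (multiplying features by Pᵀ) uses the probability of transitioning *into* i. Function names here are illustrative assumptions.

```python
import numpy as np

def transition_matrix(adj):
    """Row-normalized random-walk matrix P = D^{-1} A.

    P[i, j] is the probability that a walker at node i moves to neighbor j
    in one step; every non-isolated row sums to 1.
    """
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # isolated nodes keep an all-zero row
    return adj / deg

def aggregate_incoming(adj, features):
    """One aggregation step weighting neighbor j of node i by P[j, i],
    i.e. by the probability of transitioning from j into i (hence the
    transpose). A high-degree neighbor spreads its feature thinly over
    its many neighbors instead of contributing full weight everywhere."""
    return transition_matrix(adj).T @ features
```

Under this weighting, the influence a node exerts on each neighbor shrinks with its degree, which is one plausible reading of the "better weighting scheme" the abstract refers to.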
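The DropNode idea as described in the abstract (randomly discard part of the graph to create deformed copies) can be sketched in NumPy as below. This is a hedged illustration, not the paper's exact procedure: the 1/(1−p) rescaling mirrors standard inverted dropout and is an assumption here, as is the dense-matrix representation.

```python
import numpy as np

def drop_node(adj, features, p, rng):
    """Drop each node independently with probability p: zero its feature
    row and delete its incident edges, yielding one deformed copy of the
    graph. Resampling at every training step gives the data-augmentation
    effect described in the abstract, while the reduced connectivity
    slows over-smoothing in deep layer stacks."""
    keep = rng.random(adj.shape[0]) >= p           # True for survivors
    adj_dropped = adj * np.outer(keep, keep)       # remove incident edges
    feats = features * keep[:, None] / (1.0 - p)   # inverted-dropout rescale
    return adj_dropped, feats
```

As with ordinary dropout, evaluation would use the intact graph with no nodes dropped.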
ArticleNumber | 114711 |
Author | Do, Tien Huu (thdo@etrovub.be); Nguyen, Duc Minh (mdnguyen@etrovub.be); Bekoulis, Giannis (gbekouli@etrovub.be); Munteanu, Adrian (acmuntea@etrovub.be); Deligiannis, Nikos (ndeligia@etrovub.be); all authors: Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels, Belgium |
ContentType | Journal Article |
Copyright | 2021 Elsevier Ltd Copyright Elsevier BV Jul 15, 2021 |
DOI | 10.1016/j.eswa.2021.114711 |
Discipline | Computer Science |
EISSN | 1873-6793 |
ISSN | 0957-4174 |
IsDoiOpenAccess | false |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | Graph convolutional neural networks; Geometric deep learning; Graph classification; Node classification |
OpenAccessLink | https://arxiv.org/pdf/2008.12578 |
SubjectTerms | Agglomeration; Artificial neural networks; Deformation effects; Geometric deep learning; Graph classification; Graph convolutional neural networks; Graph theory; Graphs; Message passing; Neural networks; Node classification; Nodes; Regularization; Regularization methods; Representations; Smoothing; Transition probabilities; Weighting |