山东大学学报 (工学版) ›› 2024, Vol. 54 ›› Issue (5): 93-100. doi: 10.6040/j.issn.1672-3961.0.2023.150

• Machine Learning and Data Mining •

Spectral graph networks based on best square approximation with Gegenbauer polynomials

LIN Zhenyu, SHAO Yingxia*

  1. School of Computer Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Published: 2024-10-18
  • About the authors: LIN Zhenyu (2000— ), male, born in Taizhou, Zhejiang, master's student; his main research interest is graph neural networks. E-mail: linzy@bupt.edu.cn. *Corresponding author: SHAO Yingxia (1988— ), male, born in Ningbo, Zhejiang, Ph.D., professor and doctoral supervisor; his main research interest is large-scale graph processing. E-mail: shaoyx@bupt.edu.cn
  • Supported by:
    National Natural Science Foundation of China (62272054, 62192784)

Abstract: To address the limitations of existing spectral graph neural network models in learning the signal frequency distribution of graph node feature matrices, a spectral graph neural network model based on the Gegenbauer orthogonal basis was proposed, which generalized well, suited real-world data, and effectively improved the accuracy of node classification tasks. The signal frequency distributions of graph node feature matrices in different real-world datasets were analyzed, and the Gegenbauer orthogonal basis was used to learn the spectral graph filter function, improving the generalization ability of the model. Theoretical analysis showed that the model could effectively learn any continuous spectral filter function on a closed interval with the best square error. Experiments on 13 datasets showed that the Gegenbauer-basis spectral graph neural network outperformed current state-of-the-art models on 8 of them, verifying the effectiveness of the model. Scalability experiments showed that the model was applicable to large-scale graph data.
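
As a rough illustration of the approach summarized above, a minimal NumPy sketch of a Gegenbauer polynomial spectral filter follows. The spectrum rescaling L_hat = L_sym - I, the coefficient vector theta, and the function names are assumptions made for this sketch only; they are not taken from the paper's implementation.

    import numpy as np

    def gegenbauer_filter(L_sym, X, theta, alpha=1.0):
        # Rescale the symmetric normalized Laplacian's spectrum from [0, 2]
        # into [-1, 1], the natural domain of the Gegenbauer polynomials
        # (an assumption of this sketch, not taken from the paper).
        L_hat = L_sym - np.eye(L_sym.shape[0])
        # Three-term recurrence: C_0(x) = 1, C_1(x) = 2*alpha*x,
        # C_k(x) = (2*(k+alpha-1)*x*C_{k-1}(x) - (k+2*alpha-2)*C_{k-2}(x)) / k,
        # applied directly to the propagated features C_k(L_hat) @ X.
        T_prev = X                          # C_0(L_hat) @ X
        T_curr = 2.0 * alpha * (L_hat @ X)  # C_1(L_hat) @ X
        out = theta[0] * T_prev + theta[1] * T_curr
        for k in range(2, len(theta)):
            T_next = (2.0 * (k + alpha - 1) * (L_hat @ T_curr)
                      - (k + 2 * alpha - 2) * T_prev) / k
            out = out + theta[k] * T_next
            T_prev, T_curr = T_curr, T_next
        return out

    # Toy usage: a 4-node path graph with random 3-dimensional node features
    # and five hypothetical filter coefficients.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    L_sym = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt
    X = np.random.default_rng(0).normal(size=(4, 3))
    theta = np.array([0.5, 0.3, 0.1, 0.05, 0.05])
    Z = gegenbauer_filter(L_sym, X, theta, alpha=1.0)
    print(Z.shape)  # (4, 3)

In such a filter the coefficients theta would be learned during training; because the Gegenbauer polynomials are orthogonal on [-1, 1], truncating the expansion yields the best square approximation of the target filter function under the corresponding weight.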

Key words: Gegenbauer orthogonal basis, spectral graph neural network, graph node feature matrix, signal frequency distribution, filter function

CLC number:

  • TP391