About Me
Hi, my name is Chengpei Wu. My research interests include complex networks and graph representation learning. Recently, my work has focused on unsupervised graph representation learning, such as graph pre-training and graph contrastive learning. I am also investigating deep learning approaches, such as graph neural networks and convolutional neural networks, for efficient and accurate robustness approximation in complex networks. In addition, I explore the integration of graph learning techniques with domain knowledge from complex network theory, and the application of graph neural networks to the analysis of complex systems. I am further interested in combining large language models (LLMs) with graph learning methods to address graph-related tasks.
News
- [06/2024] Paper accepted by IEEE Transactions on Cybernetics.
- [10/2023] Awarded the National Scholarship.
- [07/2023] Paper accepted by IEEE Transactions on Circuits and Systems I: Regular Papers.
- [06/2023] Paper accepted by IEEE International Conference on Systems, Man, and Cybernetics (SMC) 2023.
Education
- B.S. in Internet of Things Engineering, Chengdu University, Chengdu, China, 2017–2021.
- M.S. in Computer Science and Technology, Sichuan Normal University, Chengdu, China, 2021–2024.
Selected Publications
- Chengpei Wu, Yang Lou, and Junli Li, "Pyramid Pooling-Based Local Profiles for Graph Classification," IEEE International Conference on Systems, Man, and Cybernetics (SMC), October 1-4, 2023, Maui, Hawaii, USA.
- Chengpei Wu, Yang Lou, Lin Wang, Junli Li, and Guanrong Chen, "SPP-CNN: An Efficient Framework for Network Robustness Prediction," IEEE Transactions on Circuits and Systems I: Regular Papers, doi:10.1109/TCSI.2023.3296602
- Chengpei Wu, Yang Lou, Junli Li, Lin Wang, Shengli Xie, and Guanrong Chen, "A Multitask Network Robustness Analysis System Based on the Graph Isomorphism Network," IEEE Transactions on Cybernetics (TCYB).
- Yang Lou, Chengpei Wu, Junli Li, Lin Wang, and Guanrong Chen, "Network Robustness Prediction: Influence of Training Data Distributions," IEEE Transactions on Neural Networks and Learning Systems, doi:10.1109/TNNLS.2023.3269753
Conference Participation
- International Joint Conference on Neural Networks (IJCNN) July 18-23, 2022, Padua, Italy.
- IEEE International Conference on Systems, Man, and Cybernetics (SMC) October 1-4, 2023, Maui, Hawaii, USA.
Skills
- Programming: Python, C, Java, HTML, Go.
- Tools: Git/GitHub, Linux, MySQL, VS Code, PyCharm.
- Frameworks: PyTorch, Scikit-Learn, NetworkX, DGL, NumPy, Pandas, SciPy, Matplotlib.
- Language: Chinese (native), English (CET-6).
Honors and Awards
- National Scholarship, Sichuan Normal University (2023).
- Outstanding Graduate from Sichuan Normal University and Sichuan Province (2024).
Personal Projects
- MiniTorch. An autograd deep-learning Python library. MiniTorch includes the most fundamental and essential features of a deep-learning framework, such as tensor computing, an autograd mechanism, datasets (dataloaders), neural network modules, loss functions, and gradient descent optimizers (SGD, Adam, etc.).
- Complex Network Tools. An open-source Python package for the generation, analysis, and optimization of complex networks. This package implements common complex network generation models (such as Barabási–Albert (BA) and small-world (SW) models), along with algorithms for network attack simulation, network robustness optimization, and network robustness prediction.
- GNNTraining. A DGL-based library for benchmark training of Graph Neural Networks (GNNs). It supports structure-tuned GCN, GAT, and GraphSAGE models on homophilous, heterophilous, and OGB datasets. The library provides unified data processing, flexible training pipelines, and reproducible results with scripts for training and hyperparameter tuning.
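To illustrate the kind of attack simulation and robustness measurement described above, here is a minimal sketch using plain NetworkX. It is a hypothetical example, not the actual API of the Complex Network Tools package: the function name `robustness_curve` and the scalar robustness `R` (the average fraction of nodes remaining in the largest connected component over the attack sequence) are assumptions for illustration.

```python
# Hypothetical sketch of a targeted (degree-based) attack simulation;
# uses only NetworkX, not the Complex Network Tools package's real API.
import networkx as nx

def robustness_curve(G, n_remove=None):
    """Remove nodes in descending-degree order and record the fraction
    of the original nodes left in the largest connected component (LCC)."""
    G = G.copy()
    n = G.number_of_nodes()
    n_remove = n_remove or n - 1
    curve = []
    for _ in range(n_remove):
        # Recompute degrees each step (adaptive targeted attack).
        v = max(G.degree, key=lambda kv: kv[1])[0]
        G.remove_node(v)
        lcc = max((len(c) for c in nx.connected_components(G)), default=0)
        curve.append(lcc / n)
    return curve

# Generate a BA scale-free network and attack it node by node.
G = nx.barabasi_albert_graph(200, 3, seed=42)
curve = robustness_curve(G)

# Scalar robustness: the average LCC fraction over the attack sequence.
R = sum(curve) / len(curve)
print(f"Robustness R = {R:.3f}")
```

Deep models such as SPP-CNN or GIN-based systems can then be trained to predict `R` (or the whole curve) directly from the adjacency matrix, which is far cheaper than re-running the attack simulation for every candidate network.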