ML-GAT: Multi-label Node Classification Using Enhanced Graph Attention Network

  • Rajni Jindal, Ashi Gupta, Prachi Garg


Real-world graphs are ubiquitous data structures that form the backbone of a plethora of application domains, systems and phenomena, ranging from bioinformatics and protein-interaction analysis to 3D modeling and learning for vision tasks. Graph neural networks are a promising class of message-passing neural network models that learn rich representations of graph-structured data in its raw form, capturing complex entity relationships from the graph's topological structure. While node classification using state-of-the-art graph neural networks is an active research direction, multi-label node classification on graphs remains relatively unexplored. Since many real-world graph-based problems require assigning more than one label to each node in the graph, we study multi-label node classification using enhanced graph neural networks. We propose a novel architecture, Multi-Label Graph Attention Network (ML-GAT), which extends the attention-based GAT to efficient inductive semi-supervised multi-label classification by incorporating the complex inter-label and node-label dependencies implicit in the graph structure into the learning process. We compare our method against the ML-GCN [2], GAT [3] and GCN [1] baselines to examine the influence of the respective loss formulations on the models, and perform three empirical studies on the datasets for a comparative analysis of the methods. After extensive optimization, we achieve a significant performance increase on both datasets over earlier benchmarks. We believe that this study will serve as a benchmark for future research in multi-label learning.
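To make the core idea concrete, the following is a minimal NumPy sketch (not the authors' exact ML-GAT architecture) of the two ingredients the abstract combines: a single-head graph attention layer in the style of GAT, and a multi-label output head that replaces the usual softmax with independent per-label sigmoids trained under a binary cross-entropy loss. All shapes, the toy graph, and the weight initialization below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gat_layer(H, A, W, a):
    """Single-head GAT layer: attention-weighted neighborhood aggregation.
    H: (N, F) node features; A: (N, N) adjacency matrix (with self-loops);
    W: (F, Fp) shared weight matrix; a: (2*Fp,) attention vector."""
    Z = H @ W                                    # (N, Fp) transformed features
    Fp = Z.shape[1]
    src = Z @ a[:Fp]                             # (N,) source-node term of a^T[Wh_i || Wh_j]
    dst = Z @ a[Fp:]                             # (N,) destination-node term
    e = leaky_relu(src[:, None] + dst[None, :])  # raw attention logits e_ij
    e = np.where(A > 0, e, -1e9)                 # mask non-edges before softmax
    e = e - e.max(axis=1, keepdims=True)         # numerical stability
    alpha = np.exp(e)
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # softmax over each neighborhood
    return alpha @ Z, alpha                      # aggregated features, attention weights

# Toy graph: 4 nodes, 3 input features, 2 labels allowed per node (all hypothetical).
N, F, Fp, L = 4, 3, 5, 2
H = rng.normal(size=(N, F))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)       # symmetric adjacency with self-loops
W = rng.normal(size=(F, Fp))
a = rng.normal(size=(2 * Fp,))
W_out = rng.normal(size=(Fp, L))

Z, alpha = gat_layer(H, A, W, a)
probs = sigmoid(Z @ W_out)                       # independent per-label probabilities

# Multi-label targets: each node may carry several labels at once.
Y = np.array([[1, 0], [1, 1], [0, 1], [0, 0]], dtype=float)
eps = 1e-9
bce = -np.mean(Y * np.log(probs + eps) + (1 - Y) * np.log(1 - probs + eps))
print(probs.shape, float(bce))
```

The sigmoid head is what distinguishes the multi-label setting from standard single-label node classification: each label's probability is predicted independently, so a node can receive zero, one, or several labels at inference time by thresholding `probs`.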

Keywords: Graph Attention Network, Graph Convolution Network, Node embedding, Multi-label Classification

How to Cite
Rajni Jindal, Ashi Gupta, Prachi Garg. (2020). ML-GAT: Multi-label Node Classification Using Enhanced Graph Attention Network. International Journal of Advanced Science and Technology, 29(06), 6652-6662. Retrieved from