Deep Graph Library: Overview, Updates, and Future Developments [talk]

Conference

IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW) - May 18, 2020

Speakers/authors

George Karypis (professor)

Abstract

Learning from graph and relational data plays a major role in many applications, including social network analysis, marketing, e-commerce, information retrieval, knowledge modeling, medical and biological sciences, engineering, and others. In the last few years, Graph Neural Networks (GNNs) have emerged as a promising new supervised learning framework capable of bringing the power of deep representation learning to graph and relational data. This ever-growing body of research has shown that GNNs achieve state-of-the-art performance for problems such as link prediction, fraud detection, target-ligand binding activity prediction, knowledge-graph completion, and product recommendations. Deep Graph Library (DGL) is an open-source development framework for writing and training GNN-based models. It is designed to simplify the development of such models through graph-based abstractions, while achieving high computational efficiency and scalability by relying on optimized sparse-matrix operations and on standard, highly optimized deep learning frameworks (e.g., MXNet, PyTorch, and TensorFlow). This talk provides an overview of DGL, describes recent developments related to high-performance multi-GPU, multi-core, and distributed training, and outlines our future development roadmap.
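To give a flavor of the graph-based abstractions the abstract refers to, here is a minimal sketch of a two-layer graph convolutional network written with DGL on the PyTorch backend. The graph, feature dimensions, and layer sizes are illustrative choices, not material from the talk; it assumes a recent DGL release where dgl.graph and dgl.nn.GraphConv are available.

```python
# Minimal DGL sketch (PyTorch backend). Graph structure, feature sizes, and
# the two-layer GCN below are illustrative, not taken from the talk.
import torch
import torch.nn as nn
import torch.nn.functional as F
import dgl
from dgl.nn import GraphConv

# Build a small directed graph from source/destination node ID tensors.
src = torch.tensor([0, 1, 2, 3])
dst = torch.tensor([1, 2, 3, 0])
g = dgl.graph((src, dst))
g = dgl.add_self_loop(g)  # self-loops avoid zero in-degree nodes

# Attach node features (4 nodes, 8-dimensional features, chosen arbitrarily).
g.ndata["feat"] = torch.randn(g.num_nodes(), 8)

class GCN(nn.Module):
    """Two-layer graph convolutional network built from DGL's GraphConv."""
    def __init__(self, in_feats, hidden_feats, num_classes):
        super().__init__()
        self.conv1 = GraphConv(in_feats, hidden_feats)
        self.conv2 = GraphConv(hidden_feats, num_classes)

    def forward(self, graph, features):
        h = F.relu(self.conv1(graph, features))
        return self.conv2(graph, h)

model = GCN(in_feats=8, hidden_feats=16, num_classes=2)
logits = model(g, g.ndata["feat"])  # shape: (num_nodes, num_classes)
print(logits.shape)
```

The sparse message-passing inside GraphConv is handled by DGL's optimized kernels, while parameter storage and autograd come from the chosen backend framework, which is the division of labor the abstract describes.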

Keywords

deep learning, graph neural networks