
Optimizing DNN Computation Graph using Graph Substitutions

Category: Publications

Authors: Jingzhi Fang, Yanyan Shen, Yue Wang, Lei Chen

Name of Conference: International Conference on Very Large Data Bases (VLDB 2020), Aug 31 - Sept 4, 2020, Tokyo, Japan

Date of Publication: Aug 31, 2020

Abstract

Deep learning has achieved great success in various real-world applications. As deep neural networks (DNNs) grow larger, the cost of DNN inference and training increases significantly. Since one round of inference or one training iteration of a DNN is typically modeled as a computation graph, existing works propose to optimize computation graphs by performing a sequence of functionally equivalent graph substitutions, leading to higher inference and training efficiency. In this work, we formally define the Optimizing Computation Graph using Graph Substitutions (OCGGS) problem and prove it to be NP-hard and Poly-APX-complete. We develop two exact and efficient methods for the OCGGS problem: a pruning-based algorithm that eliminates the examination of redundant graph substitution sequences, and a dynamic programming with pruning algorithm that reuses previously explored graph substitutions. To further speed up the search, we propose a sampling heuristic that is effective at optimizing complex computation graphs with polynomial time and space complexity. Extensive experiments on DNN architectures of various types and sizes verify the effectiveness and efficiency of our solutions compared with existing techniques.
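To make the idea of cost-driven graph substitution concrete, here is a minimal, hypothetical Python sketch (not the authors' implementation) of the sampling-style search the abstract describes: a computation graph is repeatedly rewritten by functionally equivalent substitutions, and the cheapest graph found under an assumed cost model is kept. All names here (OP_COST, substitutions, sample_search, the toy fusion rule) are illustrative assumptions.

```python
# Hypothetical sketch of cost-driven graph substitution search (not the
# authors' code). A "graph" is simplified to a list of op names; each
# substitution rewrites a matching pattern into an equivalent form.
import random

# Assumed per-op costs for an illustrative cost model.
OP_COST = {"matmul": 10, "add": 2, "relu": 1, "fused_matmul_add": 9}

def cost(graph):
    """Total cost of a graph under the toy cost model."""
    return sum(OP_COST[op] for op in graph)

def substitutions(graph):
    """Enumerate functionally equivalent rewrites; here, fuse matmul+add."""
    for i in range(len(graph) - 1):
        if graph[i] == "matmul" and graph[i + 1] == "add":
            yield graph[:i] + ["fused_matmul_add"] + graph[i + 2:]

def sample_search(graph, steps=100, seed=0):
    """Randomly sample substitution sequences, keeping the best graph seen.
    A stand-in for the sampling heuristic described in the abstract."""
    rng = random.Random(seed)
    best = list(graph)
    for _ in range(steps):
        g = list(graph)
        while True:
            candidates = list(substitutions(g))
            if not candidates:
                break
            g = rng.choice(candidates)  # apply a random applicable rewrite
        if cost(g) < cost(best):
            best = g
    return best

if __name__ == "__main__":
    g = ["matmul", "add", "relu", "matmul", "add"]
    opt = sample_search(g)
    print(cost(g), "->", cost(opt), opt)  # e.g. 25 -> 19
```

In this toy setting every applicable rewrite lowers the cost, so sampling the order of substitutions suffices; the paper's setting is harder because substitution sequences interact, which is why the authors study exact pruning-based and dynamic programming methods alongside the sampling heuristic.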


Keywords: 2020, Big data analytics under constrained resources, VLDB, Yue Wang