PT - JOURNAL ARTICLE
AU - Sekhon, Arshdeep
AU - Wang, Beilun
AU - Qi, Yanjun
TI - Adding Extra Knowledge in Scalable Learning of Sparse Differential Gaussian Graphical Models
AID - 10.1101/716852
DP - 2019 Jan 01
TA - bioRxiv
PG - 716852
4099 - http://biorxiv.org/content/early/2019/07/28/716852.short
4100 - http://biorxiv.org/content/early/2019/07/28/716852.full
AB - We focus on integrating different types of extra knowledge (other than the observed samples) for estimating the sparse structure change between two p-dimensional Gaussian Graphical Models (i.e., differential GGMs). Previous differential GGM estimators either fail to include additional knowledge or cannot scale to high-dimensional (large p) settings. This paper proposes KDiffNet, a novel method that incorporates Additional Knowledge in identifying Differential Networks via an Elementary Estimator. We design a novel hybrid norm as a superposition of two structured norms, guided by the extra edge information and the additional node group knowledge, respectively. KDiffNet is solved through a fast parallel proximal algorithm, enabling it to work in large-scale settings. KDiffNet can incorporate various combinations of existing knowledge without re-designing the optimization. Through rigorous statistical analysis, we show that, while considering more evidence, KDiffNet achieves the same convergence rate as the state of the art. Empirically, on multiple synthetic datasets and one real-world fMRI brain dataset, KDiffNet significantly outperforms cutting-edge baselines in prediction performance, while achieving the same or lower time cost.
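
NOTE - For intuition only: the abstract describes a hybrid penalty combining an edge-knowledge term with a node-group term, solved by a proximal algorithm. The sketch below composes two standard shrinkage operators (elementwise soft-thresholding weighted by edge knowledge, and group soft-thresholding over node groups). This is an illustrative assumption, not the paper's KDiffNet solver, and composing the operators this way is not in general the exact proximal map of the summed norm. Names such as prox_hybrid, edge_weights, and groups are hypothetical.

    import numpy as np

    def soft_threshold(x, tau):
        """Elementwise soft-thresholding operator."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def group_soft_threshold(x, tau):
        """Block soft-thresholding: shrink a whole group jointly toward zero."""
        norm = np.linalg.norm(x)
        if norm <= tau:
            return np.zeros_like(x)
        return (1.0 - tau / norm) * x

    def prox_hybrid(delta, edge_weights, groups, lam_e, lam_g):
        """Illustrative shrinkage step for a hybrid penalty on the estimated
        difference matrix `delta` (p x p): an edge-weighted l1 term followed
        by a group term over node groups (hypothetical structure)."""
        # Edge-knowledge term: smaller weights shrink known edges less.
        out = soft_threshold(delta, lam_e * edge_weights)
        # Node-group term: shrink each group's submatrix jointly.
        for g in groups:
            idx = np.ix_(g, g)
            out[idx] = group_soft_threshold(out[idx], lam_g)
        return out

    if __name__ == "__main__":
        p = 6
        rng = np.random.default_rng(0)
        delta = rng.normal(size=(p, p))
        edge_weights = np.ones((p, p))      # hypothetical: uniform edge prior
        groups = [[0, 1, 2], [3, 4, 5]]     # hypothetical node groups
        print(prox_hybrid(delta, edge_weights, groups, lam_e=0.2, lam_g=0.5))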