In this work, we develop a nonparametric graphical model for multivariate random functions. Most existing graphical models are restricted by two assumptions: that the variables follow a multivariate Gaussian or copula Gaussian distribution, and that the relations among the random variables or functions on the nodes are linear. We relax both assumptions by building our graphical model on a new statistical object — the functional additive regression operator. By carrying out regression and neighborhood selection at the operator level, our method captures nonlinear relations without requiring any distributional assumptions. Moreover, the method is built using only one-dimensional kernels, avoiding the curse of dimensionality that comes with a fully nonparametric approach. This feature is particularly attractive for large-scale networks. We derive error bounds for the estimated regression operator and establish graph estimation consistency, while allowing the number of functions to diverge at an exponential rate of the sample size. We demonstrate the efficacy of our method through both simulations and the analysis of an electroencephalography dataset. (This is joint work with Lexin Li (UC Berkeley), Bing Li (Penn State), and Hongyu Zhao (Yale).)
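To give a concrete sense of the idea, the following is a minimal sketch — not the authors' actual estimator — of additive nonparametric regression built from one-dimensional Gaussian kernel smoothers, with neighborhood selection by thresholding the norm of each fitted component. All function names, the backfitting scheme, the bandwidth, and the simulated data are illustrative assumptions.

```python
import numpy as np

def kernel_smooth(x, y, bandwidth):
    # Nadaraya-Watson smoother with a one-dimensional Gaussian kernel:
    # returns fitted values of E[y | x] at the observed points x.
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)
    return (w @ y) / w.sum(axis=1)

def backfit_additive(X, y, bandwidth=0.3, n_iter=20):
    # Fit y ~ sum_j f_j(X[:, j]) by backfitting: each component is
    # estimated with a one-dimensional smoother on the partial residual,
    # so no multivariate kernel (and no curse of dimensionality) is needed.
    n, p = X.shape
    y_c = y - y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            resid = y_c - f.sum(axis=1) + f[:, j]
            f[:, j] = kernel_smooth(X[:, j], resid, bandwidth)
            f[:, j] -= f[:, j].mean()  # center each component
    return f

# Toy data (hypothetical): y depends nonlinearly on x1 only, not on x3.
rng = np.random.default_rng(0)
n = 300
x1 = rng.uniform(-2, 2, n)
x3 = rng.uniform(-2, 2, n)
y = np.sin(x1) + 0.1 * rng.standard_normal(n)
X = np.column_stack([x1, x3])

f = backfit_additive(X, y)
# Neighborhood selection: keep an edge to node j when the fitted
# component f_j has a large empirical norm.
norms = np.sqrt((f**2).mean(axis=0))
print(norms)
```

In this toy run the component norm for `x1` dominates the one for `x3`, so thresholding the norms recovers the single true edge; the paper's method performs the analogous selection at the level of regression operators between random functions.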