Institute of Statistical Science, Academia Sinica (Seminar)

Un-rectifying Non-linear Networks for Signal Representation

Abstract

    We consider deep neural networks with rectifier activations and max-pooling from a signal representation perspective. In this view, such representations mark the transition from using a single linear representation for all signals to utilizing a large collection of affine linear representations that are tailored to particular regions of the signal space. We propose a novel technique to "un-rectify" the nonlinear activations into data-dependent linear equations and constraints, from which we derive explicit expressions for the affine linear operators, their domains and ranges in terms of the network parameters. We show how increasing the depth of the network refines the domain partitioning, and we derive atomic decompositions for the corresponding affine mappings that process data belonging to the same partitioning region. In each atomic decomposition, the connections over all hidden network layers are summarized and interpreted in a single matrix. We apply the decompositions to study the Lipschitz regularity of the networks and give sufficient conditions for network-depth-independent stability of the representation, drawing a connection to compressible weight distributions. Such analyses may facilitate further theoretical insight and exchange between the signal processing and machine learning communities.
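To illustrate the core idea, here is a minimal sketch (not from the talk itself; the toy network, layer sizes, and helper names are illustrative assumptions). On the activation region containing a given input, each ReLU acts as a fixed 0/1 diagonal matrix, so the whole network "un-rectifies" into a single affine map whose matrix summarizes all layers, and whose spectral norm gives a local Lipschitz constant:

```python
# Illustrative sketch: "un-rectifying" a small ReLU network at one input.
import numpy as np

rng = np.random.default_rng(0)

# Toy network: f(x) = W3 relu(W2 relu(W1 x + b1) + b2) + b3
W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((4, 5)), rng.standard_normal(4)
W3, b3 = rng.standard_normal((2, 4)), rng.standard_normal(2)

def relu(z):
    return np.maximum(z, 0.0)

def forward(x):
    return W3 @ relu(W2 @ relu(W1 @ x + b1) + b2) + b3

def unrectify(x):
    """Replace each ReLU by the data-dependent 0/1 diagonal matrix it
    acts as at x, yielding the affine map f(y) = A y + c that is valid
    on the activation region containing x."""
    z1 = W1 @ x + b1
    D1 = np.diag((z1 > 0).astype(float))   # activation pattern, layer 1
    z2 = W2 @ relu(z1) + b2
    D2 = np.diag((z2 > 0).astype(float))   # activation pattern, layer 2
    A = W3 @ D2 @ W2 @ D1 @ W1             # one matrix summarizing all layers
    c = W3 @ D2 @ W2 @ D1 @ b1 + W3 @ D2 @ b2 + b3
    return A, c

x = rng.standard_normal(3)
A, c = unrectify(x)
assert np.allclose(forward(x), A @ x + c)  # affine map reproduces the network locally

# Local Lipschitz constant on this region, and a depth-wise bound:
# ||A||_2 <= ||W3||_2 ||W2||_2 ||W1||_2, since each ||D_i||_2 <= 1.
lip_local = np.linalg.norm(A, 2)
lip_bound = np.prod([np.linalg.norm(W, 2) for W in (W1, W2, W3)])
assert lip_local <= lip_bound + 1e-9
```

The diagonal matrices D1 and D2 change from one activation region to another, which is how a single network realizes a large collection of region-specific affine representations.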

Wed, 18 Mar 2020 14:06:47 +0800