
Seminars

Backpropagation of Pseudo-errors: Neural Networks That Are Adaptive to Heterogeneous Noise

  • 2004-08-02 (Mon.), 10:30 AM
  • Recreation Hall, 2F, Institute of Statistical Science
  • Professor Adam Ding
  • Department of Mathematics, Northeastern University, USA

Abstract

Neural networks are used as prediction models in many applications. The backpropagation algorithm used in most cases corresponds to a statistical nonlinear regression model that assumes a constant noise level. We propose a new algorithm that backpropagates pseudo-errors and can automatically adjust to varying noise levels. The algorithm is motivated by the regression transformation model of Carroll and Ruppert (1988), and it preserves the ability to train online. The pseudo-errors lead naturally to corresponding prediction intervals that improve upon existing intervals.
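
The abstract does not spell out how the pseudo-errors are constructed, so the following is only a minimal illustrative sketch of the general idea of noise-adaptive backpropagation: residuals are rescaled by a rough local estimate of the noise variance before being backpropagated, so that high-noise regions do not dominate the gradient. The data-generating process, the binned variance estimate, the network architecture, and all names are hypothetical choices for the example, not the method presented in the talk.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with heteroscedastic noise:
# the noise standard deviation grows with |x| (unknown to the model).
n = 500
x = rng.uniform(-2, 2, size=(n, 1))
true_sd = 0.1 + 0.4 * np.abs(x)
y = np.sin(2 * x) + true_sd * rng.normal(size=(n, 1))

# Small one-hidden-layer network for the mean function.
h = 20
W1 = rng.normal(scale=0.5, size=(1, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=(h, 1)); b2 = np.zeros(1)

def forward(xb):
    a = np.tanh(xb @ W1 + b1)
    return a, a @ W2 + b2

edges = np.linspace(-2, 2, 10)
lr = 0.1
for epoch in range(2000):
    a, pred = forward(x)
    resid = pred - y

    # Crude local noise estimate: average squared residual within the bin
    # each point falls into (a hypothetical stand-in for whatever variance
    # model the actual method uses).
    bin_id = np.digitize(x[:, 0], edges)
    uniq, inv = np.unique(bin_id, return_inverse=True)
    bin_var = np.array([np.mean(resid[bin_id == b] ** 2) for b in uniq])
    sd2_hat = bin_var[inv].reshape(-1, 1) + 1e-3

    # "Pseudo-error": residual downweighted where the estimated noise is
    # large, so high-noise regions do not dominate the fit.  Weights are
    # normalized to average 1 to keep the step size comparable.
    w = 1.0 / sd2_hat
    w /= w.mean()
    pseudo = resid * w

    # Backpropagate the pseudo-errors instead of the raw residuals.
    gW2 = a.T @ pseudo / n
    gb2 = pseudo.mean(axis=0)
    da = (pseudo @ W2.T) * (1 - a ** 2)
    gW1 = x.T @ da / n
    gb1 = da.mean(axis=0)

    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, fit = forward(x)
print("variance-weighted MSE:", float(np.mean((fit - y) ** 2 / true_sd ** 2)))

Because the weights rescale the same residuals that ordinary backpropagation would use, the update keeps its online, per-example character; only the relative influence of each example changes with the estimated local noise.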
