This paper describes a memory-based network that
provides estimates of continuous variables and converges to the
underlying (linear or nonlinear) regression surface. This general regression neural network (GRNN) is a one-pass learning
algorithm with a highly parallel structure. Even with sparse
data in a multidimensional measurement space, the algorithm
provides smooth transitions from one observed value to another. The algorithmic form can be used for any regression
problem in which an assumption of linearity is not justified.
The parallel network form should find use in applications such
as learning the dynamics of a plant model for prediction or
control.
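The GRNN estimate described above is a kernel-weighted average of the observed outputs, so its one-pass "training" amounts to storing the samples. A minimal sketch under that reading, using a Gaussian kernel and an arbitrarily chosen smoothing parameter `sigma` (the function and parameter names here are illustrative, not from the paper):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """Estimate y at point x as a kernel-weighted average of stored samples.

    X_train : (n, d) array of observed inputs (the "memory")
    y_train : (n,) array of observed outputs
    x       : (d,) query point
    sigma   : smoothing width; small values track samples closely,
              large values give smoother transitions between them
    """
    # Squared Euclidean distance from the query to every stored sample.
    d2 = np.sum((X_train - x) ** 2, axis=1)
    # Gaussian kernel weights: nearby samples dominate the estimate.
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Weighted average of observed outputs (normalized kernel regression).
    return np.dot(w, y_train) / np.sum(w)

# Tiny 1-D example: three stored observations.
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 4.0])

# Queries between samples blend the neighboring observed values smoothly.
between = grnn_predict(X, y, np.array([0.5]), sigma=0.5)
# With a small sigma, the estimate at a stored input recovers its output.
at_sample = grnn_predict(X, y, np.array([1.0]), sigma=0.1)
```

Because every stored sample contributes with nonzero weight, the estimate varies smoothly between observations, which is the behavior the abstract claims for sparse multidimensional data.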