Consistent identification of NARX models via regularization networks

Giuseppe De Nicolao; Giancarlo Ferrari-Trecate
1999-01-01

Abstract

Generalization networks are nonparametric estimators obtained by applying Tychonov regularization or Bayes estimation to the hypersurface reconstruction problem. Under symmetry assumptions, they are a particular type of radial basis function neural network. In this correspondence, it is shown that such networks guarantee consistent identification of a very general (infinite-dimensional) class of NARX models. The proofs rely on the theory of reproducing kernel Hilbert spaces and on the notion of frequency of time probability, by means of which it is not necessary to assume that the input is sampled from a stochastic process.
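To make the abstract concrete, the sketch below shows how a regularization network (kernel ridge regression with a Gaussian radial basis kernel) can estimate the map of a NARX model from input-output data. The system, the kernel width, and the regularization parameter are illustrative assumptions, not taken from the paper; the paper's contribution is the consistency theory, not this particular estimator code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical first-order NARX system: y(t) = f(y(t-1), u(t-1)) + e(t)
def f(y_prev, u_prev):
    return 0.5 * np.tanh(y_prev) + 0.3 * u_prev

N = 200
u = rng.uniform(-1.0, 1.0, N)          # deterministic-looking input signal
y = np.zeros(N)
for t in range(1, N):
    y[t] = f(y[t - 1], u[t - 1]) + 0.05 * rng.standard_normal()

# Regressor matrix: each row is [y(t-1), u(t-1)]; target is y(t)
X = np.column_stack([y[:-1], u[:-1]])
Y = y[1:]

def gram(A, B, width=1.0):
    """Gaussian (radial basis) kernel Gram matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Regularization network: fhat(x) = sum_i c_i K(x, x_i), where the
# coefficients solve the Tychonov-regularized system (K + gamma I) c = Y.
gamma = 1e-2                            # regularization parameter (assumed)
K = gram(X, X)
c = np.linalg.solve(K + gamma * np.eye(len(Y)), Y)

def fhat(x):
    return gram(np.atleast_2d(x), X) @ c

# One-step-ahead predictions on the training trajectory
pred = fhat(X).ravel()
rmse = np.sqrt(np.mean((pred - Y) ** 2))
```

Note that the input `u` here is just an array of numbers; consistent with the abstract's point, nothing in the estimator requires modeling it as a realization of a stochastic process.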
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/138367
Citations
  • PMC: not available
  • Scopus: 22
  • Web of Science (ISI): 21