Monthly. 288 pp. per issue, 6 x 9, illustrated.
Founded: 1989
2018 Impact Factor: 2.261
2018 Google Scholar h5-index: 34
ISSN: 0899-7667
E-ISSN: 1530-888X

Feedforward neural networks trained by error backpropagation are examples of nonparametric regression estimators. We present a tutorial on nonparametric inference and its relation to neural networks, and we use the statistical viewpoint to highlight strengths and weaknesses of neural models. We illustrate the main points with some recognition experiments involving artificial data as well as handwritten numerals. By way of conclusion, we suggest that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues. Furthermore, we suggest that the fundamental challenges in neural modeling are about representation rather than learning per se. This last point is supported by additional experiments with handwritten numerals.

Stuart Geman
Division of Applied Mathematics, Brown University, Providence, RI 02912 USA
Elie Bienenstock
ESPCI, 10 rue Vauquelin, 75005 Paris, France
René Doursat
ESPCI, 10 rue Vauquelin, 75005 Paris, France
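
The abstract's framing of backpropagation networks as nonparametric regression estimators rests on the statistical decomposition of estimation error into bias and variance. The sketch below is an editorial illustration, not code from the paper: it estimates both terms by Monte Carlo for a small one-hidden-layer network trained by error backpropagation on artificial data. The target function, network size, and training settings are arbitrary choices made for the demonstration.

import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)

def make_data(n, noise=0.3):
    """Artificial regression data: noisy samples of true_f on [0, 1]."""
    x = rng.uniform(0.0, 1.0, n)
    return x, true_f(x) + noise * rng.standard_normal(n)

def fit_mlp(x, y, hidden=20, lr=0.1, epochs=3000):
    """One-hidden-layer tanh network trained by full-batch gradient
    descent on squared error, i.e. plain error backpropagation."""
    W1 = rng.standard_normal((hidden, 1))
    b1 = rng.uniform(-1.0, 1.0, hidden)
    W2 = rng.standard_normal(hidden) / np.sqrt(hidden)
    b2 = 0.0
    X = x[:, None]
    n = len(x)
    for _ in range(epochs):
        H = np.tanh(X @ W1.T + b1)      # (n, hidden) hidden activations
        err = H @ W2 + b2 - y           # (n,) residuals
        # Gradients of mean squared error (factor 2 absorbed into lr).
        gW2 = H.T @ err / n
        gb2 = err.mean()
        dA = np.outer(err, W2) * (1.0 - H**2)   # back through tanh
        gW1 = dA.T @ X / n
        gb1 = dA.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda xq: np.tanh(xq[:, None] @ W1.T + b1) @ W2 + b2

# Monte Carlo estimate of bias^2 and variance: train on many independent
# training sets, then compare the average prediction with the true
# regression (bias) and the spread of predictions around that average
# (variance).
grid = np.linspace(0.0, 1.0, 50)
preds = np.array([fit_mlp(*make_data(n=25))(grid) for _ in range(30)])
bias2 = np.mean((preds.mean(axis=0) - true_f(grid)) ** 2)
variance = np.mean(preds.var(axis=0))
print(f"bias^2 ~ {bias2:.3f}   variance ~ {variance:.3f}")

Increasing hidden or epochs makes the estimator more flexible, which tends to push the bias term down while inflating the variance term; shrinking the training set has the same effect on variance. That tradeoff is what the statistical viewpoint of the article makes explicit.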
