PUMA
Istituto di Studi sui Sistemi Intelligenti per l'Automazione     
Cervellera C., Muselli M. A Deterministic Learning Approach Based on Discrepancy. In: G. Goos, J. Hartmanis, J. van Leeuwen (eds.), Lecture Notes in Computer Science, vol. 2859. Berlin: Springer, 2003.
 
 
Abstract
The general problem of reconstructing an unknown function from a finite collection of samples is considered, in the case where the position of each input vector in the training set is not fixed beforehand but is part of the learning process. In particular, the consistency of the Empirical Risk Minimization (ERM) principle is analyzed when the points in the input space are generated by a purely deterministic algorithm (deterministic learning). When the output generation is not affected by noise, classical number-theoretic results involving discrepancy and variation yield a sufficient condition for the consistency of the ERM principle. In addition, the adoption of low-discrepancy sequences makes it possible to achieve a learning rate of O(1/L), where L is the size of the training set. An extension to the noisy case is discussed.
DOI: 10.1007/b13826
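
The abstract describes generating the training inputs with a deterministic low-discrepancy sequence and then applying Empirical Risk Minimization. Purely as an illustration of that idea (not the construction used in the paper), the following Python sketch generates a Halton sequence in [0,1]^2, evaluates a hypothetical noise-free target function at those points, and minimizes the empirical squared risk over a simple polynomial model; the target function, model class, and degree are all illustrative assumptions.

    import numpy as np

    def radical_inverse(n, base):
        """Van der Corput radical inverse of the integer n in the given base."""
        inv, denom = 0.0, 1.0
        while n > 0:
            n, digit = divmod(n, base)
            denom *= base
            inv += digit / denom
        return inv

    def halton(num_points, dim):
        """First num_points points of the Halton low-discrepancy sequence in [0,1]^dim."""
        primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:dim]
        return np.array([[radical_inverse(i + 1, p) for p in primes]
                         for i in range(num_points)])

    # Hypothetical target function; outputs are noise-free, as in the first setting of the paper.
    def target(x):
        return np.sin(2 * np.pi * x[:, 0]) * np.cos(np.pi * x[:, 1])

    L = 256                  # training-set size
    X = halton(L, 2)         # deterministic, low-discrepancy input points
    y = target(X)

    # Illustrative model class: linear-in-parameters polynomial features,
    # so Empirical Risk Minimization reduces to a least-squares fit.
    def poly_features(X, degree=5):
        feats = [np.ones(len(X))]
        for d in range(1, degree + 1):
            for k in range(d + 1):
                feats.append(X[:, 0] ** (d - k) * X[:, 1] ** k)
        return np.column_stack(feats)

    Phi = poly_features(X)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    # Empirical risk (mean squared error) on the deterministic training set.
    print("empirical risk:", np.mean((Phi @ w - y) ** 2))

The Halton sequence is one standard example of a low-discrepancy sequence; any sequence with suitably small discrepancy could be substituted for the input-point generator in this sketch.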



 


For further information, contact: Librarian http://puma.isti.cnr.it
