First, well-known concepts from Statistical Learning Theory are reviewed. In reference to the problem of modelling an unknown input/output (I/O) relationship by fixed-structure parametrized functions, the concepts of expected risk, empirical risk, and generalization error are described. The generalization error is then split into approximation and estimation errors. Four quantities of interest are emphasized: the accuracy, the number of arguments of the I/O relationship, the model complexity, and the number of samples generated for the estimation. The possibility of generating such samples by deterministic algorithms such as quasi-Monte Carlo methods, orthogonal arrays, Latin hypercubes, etc., gives rise to the so-called Deterministic Learning Theory. This possibility is an intriguing alternative to the random generation of input data, typically obtained by Monte Carlo techniques, since it enables one to reduce the number of samples (for the same accuracy) and to obtain upper bounds on the errors in deterministic rather than probabilistic terms. Deterministic learning relies on basic quantities such as variation and discrepancy. Special families of deterministic sequences called "low-discrepancy sequences" are useful in the computation of integrals and in dynamic programming, to mitigate the danger of incurring the curse of dimensionality that derives from the use of regular grids.
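To make the contrast between random and deterministic sample generation concrete, the following sketch estimates a simple two-dimensional integral both with pseudo-random Monte Carlo points and with a Halton low-discrepancy sequence (one of the classical quasi-Monte Carlo constructions mentioned above). The choice of integrand and sample size is purely illustrative and not taken from the chapter.

```python
import random

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def halton_points(n, bases=(2, 3)):
    """First n points of a 2-D Halton low-discrepancy sequence."""
    return [tuple(halton(i, b) for b in bases) for i in range(1, n + 1)]

def estimate(points, f):
    """Sample-mean estimate of the integral of f over the unit square."""
    return sum(f(x, y) for x, y in points) / len(points)

f = lambda x, y: x * y  # exact integral over [0, 1]^2 is 0.25

n = 4096
qmc = estimate(halton_points(n), f)
random.seed(0)
mc = estimate([(random.random(), random.random()) for _ in range(n)], f)
print(f"quasi-MC: {qmc:.5f}  MC: {mc:.5f}  exact: 0.25")
```

For smooth integrands the quasi-Monte Carlo error typically decays close to O((log n)^d / n), versus the O(1/sqrt(n)) probabilistic rate of plain Monte Carlo, which is the sample-saving effect the text alludes to; the Koksma-Hlawka inequality bounds the quasi-Monte Carlo error deterministically in terms of the integrand's variation and the point set's discrepancy.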
|Title:||Design of mathematical models by learning from data and FSP functions|
|Publication date:||2020|
|Appears in type:||02.01 - Book contribution (Chapter or essay)|