Some families of FSP functions and their properties

Zoppoli R.; Sanguineti M.; Gnecco G.
2020-01-01

Abstract

We report properties of fixed-structure parametrized (FSP) functions that give insights into the effectiveness of the “Extended Ritz Method” (ERIM) as a methodology for the approximate solution of infinite-dimensional optimization problems. First, we present the structure of some widespread FSP functions, including linear combinations of fixed-basis functions, one-hidden-layer (OHL) and multiple-hidden-layer (MHL) networks, and kernel smoothing models. Second, focusing on the case of OHL neural networks based on ridge and radial constructions, we report their density properties under different metrics. Third, we present rates of function approximation via ridge OHL neural networks, by reporting a fundamental theorem by Maurey, Jones, and Barron, together with its extensions, based on a norm tailored to approximation by computational units from a given set of functions. We also discuss approximation properties valid for MHL networks. Fourth, we compare the classical Ritz method and the ERIM from the point of view of the curse of dimensionality, proving advantages of the latter for a specific class of problems, where the functional to be optimized is quadratic. Finally, we provide rates of approximate optimization by the ERIM, based on the concepts of modulus of continuity and modulus of convexity of the functional to be optimized.
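As a minimal illustrative sketch (the notation below is assumed for orientation and is not quoted from the chapter), a ridge OHL network with n computational units has the form

\[ \gamma_n(x) = \sum_{i=1}^{n} c_i\, \sigma\!\left(a_i^{\top} x + b_i\right), \qquad a_i \in \mathbb{R}^{d},\ b_i, c_i \in \mathbb{R}, \]

whereas a radial OHL network has the form

\[ \gamma_n(x) = \sum_{i=1}^{n} c_i\, \varphi\!\left(\beta_i \,\lVert x - \tau_i \rVert\right), \qquad \tau_i \in \mathbb{R}^{d},\ \beta_i, c_i \in \mathbb{R}. \]

In the same spirit, the Maurey-Jones-Barron bound states that if f lies in the closure of the convex hull of a set G of computational units in a Hilbert space, with \( \sup_{g \in G} \lVert g \rVert \le s_G \), then

\[ \inf_{\gamma_n \in \operatorname{span}_n G} \lVert f - \gamma_n \rVert \;\le\; \frac{s_G}{\sqrt{n}}, \]

i.e., a rate of order \( n^{-1/2} \) whose exponent does not depend on the input dimension d; the extensions mentioned above are phrased in terms of a norm tailored to approximation by computational units from the given set G.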
Publication year: 2020
ISBN: 978-3-030-29691-9; 978-3-030-29693-3
Files in this item:
File: Chapter 3 - Some Families of FSP Functions and Their Properties.pdf
Access: closed access
Type: publisher's version
Size: 842.08 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/997327
Citations
  • Scopus: 0