Dual descent regularization algorithms in variable exponent Lebesgue spaces for imaging

B. Bonino; C. Estatico; M. Lazzaretti
2023-01-01

Abstract

We consider one-step iterative algorithms for solving ill-posed inverse problems in the framework of variable exponent Lebesgue spaces Lp(·). These unconventional spaces are particular (non-Hilbertian) Banach spaces which can induce adaptive, local regularization in the solution of inverse problems. We first study gradient descent iteration schemes in Banach spaces, where the classical Riesz representation theorem does not hold and, consequently, primal and dual spaces are no longer isometrically isomorphic. In particular, we prove that gradient methods in Banach spaces can be fully explained and understood within proximal operator theory, with appropriate norm or Bregman distances as proximity measures, which reveals a deep connection between iterative regularization schemes and convex optimization. We review the key concept of the duality map and provide an explicit formula for the duality map of the space Lp(·). We then apply the Landweber and Conjugate Gradient methods, extended to the Banach setting, to deblurring problems in imaging posed in Lp(·), and propose an effective strategy for selecting the pointwise variable exponent function p(·). Our numerical tests show the advantages of variable exponent Lebesgue spaces over both the standard Hilbert space L2 and the constant exponent Lebesgue space Lp, in terms of both reconstruction quality and convergence speed.
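For orientation, on a constant-exponent space Lp the duality map (with gauge t^(p-1)) has the well-known closed form Jp(f) = |f|^(p-2) f, mapping Lp into its dual Lp', with 1/p + 1/p' = 1; the dual (Banach-space) Landweber iteration then performs the gradient step in the dual space and maps the result back to the primal space through the inverse map Jp'. The snippet below is a minimal NumPy sketch of that constant-exponent iteration, not the paper's algorithm: the forward operator is a generic matrix A, the residual is measured in L2, the fixed step size tau is an illustrative assumption (in practice it must be chosen small enough relative to the operator norm), and the paper's variable exponent setting Lp(·) would replace the scalar p by a pointwise exponent map.

```python
import numpy as np

def duality_map(x, p):
    """p-duality map on a discretized Lp space: J_p(x) = sign(x) * |x|^(p-1).

    It maps the primal space Lp into its dual Lp', 1/p + 1/p' = 1;
    for p = 2 it reduces to the identity (the Hilbert-space case).
    """
    return np.sign(x) * np.abs(x) ** (p - 1)

def dual_landweber(A, y, p=1.5, tau=1.0, n_iter=100):
    """Illustrative Landweber iteration in dual (Banach-space) form.

    The update is computed in the dual space and mapped back to the
    primal space with the inverse duality map J_{p'} = J_p^{-1}.
    The residual A x - y is measured in L2 for simplicity.
    """
    p_conj = p / (p - 1)            # Hoelder conjugate exponent p'
    x = np.zeros(A.shape[1])        # primal iterate
    x_dual = duality_map(x, p)      # dual iterate J_p(x) (zero at start)
    for _ in range(n_iter):
        residual = A @ x - y
        x_dual = x_dual - tau * (A.T @ residual)   # gradient step in the dual space
        x = duality_map(x_dual, p_conj)            # back to the primal space via J_{p'}
    return x
```

For p close to 1 the outer duality map promotes sparse, edge-preserving reconstructions, whereas p = 2 recovers the classical Hilbert-space Landweber scheme; this is the mechanism that the variable exponent p(·) exploits locally, pixel by pixel.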

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1101254