A study on regularization for discrete inverse problems with model-dependent noise
Benvenuto, Federico
2017-01-01
Abstract
In this paper we consider discrete inverse problems for which noise becomes negligible compared to the data as the model norm increases. We introduce two novel definitions of regularization for characterizing inversion methods that provide approximations of ill-conditioned inverse operators consistent with such noisy data. These definitions require, respectively, that the reconstruction error computed from normalized data (p-asymptotic regularization) and the relative reconstruction error (p-relative regularization) go to zero as the model norm tends to infinity, where 0 ≤ p < 1 is a parameter controlling the growth rate of the noise level. We investigate the relationship between these two definitions and prove that they are equivalent for positively homogeneous iterative algorithms with suitable stopping rules. A crucial consequence of this result is that such iterative algorithms realize regularization independently of the noise model. We then give sufficient conditions for such methods to be p-asymptotic and p-relative regularizations in a discrete setting, and we prove that the classical expectation maximization algorithm for Poisson data and the Landweber algorithm, if suitably stopped, are regularization methods in this sense. We perform numerical simulations in the case of image deconvolution and computerized tomography to show that, in the presence of model-dependent noise, the reconstructions provided by the above-mentioned methods improve with increasing model norm, as required by the p-asymptotic and p-relative regularization properties. More extensive studies on p-asymptotic and p-relative regularization for Tikhonov-type methods will be the object of future work.
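The equivalence result in the abstract hinges on the iterative method being positively homogeneous: scaling the data by a constant c > 0 scales every iterate by c. As a minimal sketch (not taken from the paper; matrix sizes, step size, and starting points are illustrative assumptions), both algorithms named above — the Landweber iteration and the expectation maximization (Richardson–Lucy) iteration for Poisson data — can be checked for this property numerically:

```python
import numpy as np

def landweber(A, y, n_iter, tau=None):
    """Landweber iteration x_{k+1} = x_k + tau * A^T (y - A x_k), started
    from x_0 = 0.  Early stopping (a small n_iter) plays the role of the
    regularization; tau must satisfy 0 < tau < 2 / ||A||_2^2."""
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2) ** 2  # safe step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + tau * A.T @ (y - A @ x)
    return x

def em_poisson(A, y, n_iter):
    """EM (Richardson-Lucy) iteration for Poisson data, started from
    x_0 = 1: x_{k+1} = (x_k / A^T 1) * A^T (y / (A x_k)).
    Requires A and y nonnegative."""
    x = np.ones(A.shape[1])
    s = A.T @ np.ones(A.shape[0])  # column sums of A
    for _ in range(n_iter):
        x = (x / s) * (A.T @ (y / (A @ x)))
    return x

rng = np.random.default_rng(0)

# Landweber is linear in y when x_0 = 0, hence positively homogeneous:
A = rng.standard_normal((20, 10))
y = rng.standard_normal(20)
assert np.allclose(landweber(A, 3.0 * y, 50), 3.0 * landweber(A, y, 50))

# EM is nonlinear, yet still positively homogeneous: scaling y by c
# scales the first iterate by c, and the multiplicative update factor
# A^T(y / (A x)) / A^T 1 is invariant under (x, y) -> (c x, c y).
Ap = rng.random((20, 10)) + 0.1  # nonnegative model matrix
yp = rng.random(20) + 0.1        # nonnegative data
assert np.allclose(em_poisson(Ap, 3.0 * yp, 50), 3.0 * em_poisson(Ap, yp, 50))
```

Both checks pass exactly (up to floating point), which is what allows the paper's stopping-rule analysis to transfer between the p-asymptotic and p-relative definitions.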
File: M104905.pdf
Description: Main article
Type: Post-print document
Size: 847.17 kB
Format: Adobe PDF
Access: authorized users only (request a copy)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.