Improving the Euclid performance: from spectroscopic simulations to the 6 × 2pt statistics.
Luca Paganin
2022-01-21
Abstract
Cosmology is the field of physics that studies the Universe as a whole. Our ancestors were already asking themselves profound questions about the cosmos thousands of years ago. However, the birth of modern cosmology is quite recent, dating back to the second decade of the twentieth century, after Albert Einstein published his theory of General Relativity. At first the Universe was thought to be static: Einstein himself was convinced that it should be static. However, in 1929 Edwin Hubble discovered that it is expanding: in particular, he observed that the light coming from distant galaxies is redshifted by an amount proportional to the distance of the galaxy itself. This discovery paved the way for the theory of the Big Bang, according to which the whole Universe expanded from a hot, dense plasma to everything we can observe today. The Big Bang paradigm was consolidated in 1964, when Penzias and Wilson observed for the first time the Cosmic Microwave Background (CMB) radiation. This radiation is made up of the photons that were once tightly coupled to protons and electrons in the primeval plasma, when the Universe was hot and dense. As it expanded, the Universe grew progressively colder, until ∼380,000 years after the Big Bang it was cold enough to allow the formation of the first atoms. At this time the photons coming from the primeval plasma – no longer having enough energy to ionise the atoms – decoupled and started to free-stream across the Universe, forming what we observe today as the CMB.

In 1998 another important discovery was made in the field of cosmology. From observations of type Ia supernovae, research groups led by Riess [1] and Perlmutter [2] obtained the first evidence that the expansion of the Universe is accelerating. This was also the first strong evidence for the possible existence of a cosmological constant (Λ), the simplest model of dark energy, a form of energy which can drive cosmic acceleration through its negative pressure. These observations helped to build and establish the ΛCDM model, which is currently accepted as the Standard Model of Cosmology. According to this model, about 68% of the energy content of the Universe is made up of dark energy in the form of a cosmological constant, about which we know nothing at the microscopic level. The remaining 32% consists of non-relativistic matter: about 5% is ordinary (baryonic) matter made of protons, neutrons and electrons, and 27% is dark matter. The name “dark” comes from the fact that it does not interact via the electromagnetic force, but only through gravity; beyond this, nothing else is known about dark matter. Thus, about 95% of the Universe seems to be made of something whose physical nature is not yet known.

One of the greatest successes of the ΛCDM model is the high accuracy with which it predicts the power spectrum of the CMB temperature fluctuations. The Planck satellite [3], operating between 2009 and 2013, provided the highest-resolution map of the CMB sky, and the mission data analysis [4] showed remarkable agreement between the observations and the theoretical predictions of ΛCDM. Despite being consistent with the observations, the ΛCDM model also presents various problems. It does not explain what dark energy and dark matter are. About the former we only know that it has a negative pressure which drives the current cosmic acceleration.
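As a brief reminder of why a negative pressure can accelerate the expansion (standard textbook material, not specific to this thesis), the acceleration equation of General Relativity reads, in LaTeX notation,

    \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right),

where a is the scale factor, ρ the energy density and p the pressure: a dominant fluid with equation of state p = wρc² gives ä > 0 whenever w < −1/3, and the cosmological constant corresponds to the limiting case w = −1.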
As for dark matter, we only know that it is essentially non-relativistic and that it keeps galaxies and clusters of galaxies together through its gravitational pull.

In order to tackle these and other open questions, new observations are planned for the coming years. One of these is Euclid, an ESA medium-class mission which will launch a satellite by the end of 2022, and which is the context of this thesis. The Euclid mission will observe about one third of the sky, performing one of the largest galaxy surveys ever made and probing the last 10 billion years of the expansion history of the Universe. The main cosmological probes for which Euclid is designed are weak lensing (WL) and galaxy clustering (GC). Weak lensing is the slight deformation of the images of galaxies due to density fluctuations in the intervening matter distribution, either dark or baryonic; through weak lensing measurements it is possible to probe the matter distribution which sources it. Galaxy clustering instead consists of studying the statistical properties of the distribution of galaxies, which is not random. In particular, it contains an oscillating pattern which is an imprint of the sound waves that propagated in the primeval plasma of baryons and photons permeating the Universe before the CMB photons decoupled. This pattern is given by the Baryon Acoustic Oscillations (BAO), and their length scale can be inferred from the galaxy distribution. Moreover, galaxies are so-called biased tracers of the underlying dark matter field, so their distribution can also be used to trace the dark matter distribution, which cannot be observed directly.

Euclid will study these probes with two instruments: the Visible Imager (VIS) and the Near-Infrared Spectro-Photometer (NISP). With VIS, Euclid will obtain high-resolution images of 1.5 billion galaxies for weak lensing measurements. With the NISP instrument in photometric mode, it will measure the photometric redshifts of the same galaxies observed with VIS. In spectroscopic mode, NISP will instead measure the spectroscopic redshifts – about 50 times more accurate than the photometric ones – of 20 million Hα-emitting galaxies. Thus, Euclid will produce two galaxy samples, a photometric one and a spectroscopic one, and the galaxy clustering probe can accordingly be subdivided into photometric galaxy clustering (GCph) and spectroscopic galaxy clustering (GCsp). GCsp and the so-called Euclid 3 × 2pt statistics – composed of WL, GCph and their cross-correlation – represent the two main probes of Euclid.
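For the photometric probes, the observables entering the 3 × 2pt statistics are angular power spectra. As a schematic illustration (omitting the tomographic bin indices and convention-dependent factors of c used in the actual Euclid analyses), under the Limber approximation they take the form

    C_\ell^{AB} = \int \mathrm{d}z\, \frac{W^A(z)\, W^B(z)}{H(z)\, \chi^2(z)}\, P_{\delta}\!\left(\frac{\ell + 1/2}{\chi(z)},\, z\right),

where A and B label the probes (WL or GCph), W^A is the corresponding window function, χ(z) the comoving distance, H(z) the Hubble rate, and P_δ the matter power spectrum.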
During the three years of my PhD at the University of Genoa, I had the opportunity to work in two complementary areas of Euclid. In the first half of my PhD I worked on the image simulations needed for the validation of the official spectroscopic data-reduction software of Euclid. In fact, in order to accurately measure the redshifts of the Hα-emitting galaxies of the Euclid spectroscopic sample, their one-dimensional spectra have to be extracted from the two-dimensional dispersed images acquired by NISP in spectroscopic mode. Software which performs this data reduction – from images to 1D intensity-versus-wavelength spectra – is therefore needed. In Euclid a specific Organisation Unit (OU-SIR) is in charge of developing this data-reduction software, the SIR Pipeline. In order for the mission to reach the expected performance in the spectroscopic channel, the SIR Pipeline must satisfy specific requirements. Therefore, it needs to be validated before launch with simulations of the real data which NISP will acquire during the mission. The extraction of the spectra is in fact affected by several sources of systematic error, and the effect of each of these must be accurately assessed. For this reason a wide and varied set of simulations is needed, together with full control over every detail of each simulation. In this context I developed a software package which produces these simulations, built on top of the official Euclid spectroscopic simulator developed by OU-SIM. I added the possibility of easily customising all the settings of a given simulation, mainly the instrumental noise effects and the input catalogue of sources, together with their theoretical spectra. I also built a unified end-to-end pipeline which automates the run of the SIR Pipeline on the simulated images, allowing a direct comparison between the input theoretical spectra and the extracted output ones. At the time of writing, the software I have developed is being used by OU-SIR as the main tool for carrying out the validations. There are also plans to apply it to specific simulations for the Legacy Science groups, such as the AGN and Galaxy Evolution teams.

In the second half of my PhD, as a member of the Work Package (WP) Likelihood of the Galaxy Clustering Science Working Group (GC-SWG) of Euclid, I performed the first Euclid cosmological parameter forecast which includes the correlations between GCsp and the 3 × 2pt statistics. The data analysis of the survey must in fact be carefully planned, and pre-launch forecasts of the expected scientific performance are needed to guide this planning. A previous Euclid forecast – the IST:F [5] – showed that including in the analysis the correlation between WL and GCph significantly improves the constraints on the cosmological parameters. In my work I extended the IST:F analysis by also including the two correlations between GCsp and GCph and between GCsp and WL, in order to understand their impact on the constraints foreseen for Euclid. The results presented here show that these correlations may not affect the constraints as much as the XC(WL, GCph) studied in the IST:F [5]. However, in the near future some extensions of my work will be studied in order to reach a final decision on whether or not to include these correlations in the official Euclid data analysis.
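Forecasts of this kind, including IST:F [5], are based on the Fisher-matrix formalism. As a minimal sketch in Python – with hypothetical inputs, not the actual Euclid likelihood code – the marginalised 1σ errors follow from the derivatives of the joint data vector and its covariance:

    import numpy as np

    def fisher_matrix(dmu, cov):
        # Gaussian Fisher matrix F_ab = (dmu/dtheta_a)^T C^-1 (dmu/dtheta_b)
        # for a parameter-independent data covariance.
        # dmu: (n_params, n_data) derivatives of the data vector
        #      with respect to the cosmological parameters.
        # cov: (n_data, n_data) covariance of the joint data vector.
        cinv = np.linalg.inv(cov)
        return dmu @ cinv @ dmu.T

    def marginalised_errors(fisher):
        # 1-sigma marginalised errors: square roots of the diagonal
        # of the inverse Fisher matrix.
        return np.sqrt(np.diag(np.linalg.inv(fisher)))

Comparing the errors obtained when the data vector and covariance include the XC(GCsp, GCph) and XC(GCsp, WL) blocks with those obtained when these blocks are neglected quantifies the impact of the cross-correlations.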
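On the simulation side, the end-to-end validation described earlier reduces, at its simplest, to comparing each extracted 1D spectrum against the input theoretical one on a common wavelength grid. A minimal sketch of such a comparison (hypothetical arrays and helper, not the actual SIR validation code) could look like:

    import numpy as np

    def relative_residuals(wave_in, flux_in, wave_out, flux_out):
        # Interpolate the extracted spectrum onto the wavelength grid
        # of the input theoretical spectrum (assumed strictly positive)
        # and return the relative flux residuals, which should be
        # consistent with zero within the noise for an unbiased extraction.
        flux_out_interp = np.interp(wave_in, wave_out, flux_out)
        return (flux_out_interp - flux_in) / flux_in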