The need for better accuracy, precision, and stability in spectroscopic observations is part of a quest for what can be called "spectral fidelity", i.e. the reliable reconstruction of the original spectral information (flux as a function of wavelength). Spectral fidelity can be achieved only when all steps of the data flow chain are covered by adequate instruments, adequate calibration schemes, and adequate tools for data reduction and analysis. Both instrumentation and calibration have undergone a major overhaul in the last 20 years, with the adoption of fibre injection optimised for scrambling capabilities, simultaneous wavelength calibration with Laser Frequency Combs (LFCs), and temperature- and pressure-stabilised vacuum vessels. The latest generation of optical fibre-fed echelle spectrographs, like HARPS and ESPRESSO, has brought wavelength accuracy and precision down to the level of tens of cm/s, and is acting as a proper pathfinder for the ELT-era instruments, which will require a tenfold improvement on this figure to achieve the scientific goals of current-age cosmology and astrophysics, e.g.:
(a) The measurement of fundamental constant variability from the relative wavelength shift of absorption lines observed towards distant quasars;
(b) The direct measurement of the expansion of the universe from a redshift drift of the spectral features of distant sources;
(c) The assessment of primordial nucleosynthesis from an accurate abundance measurement of Deuterium in low-metallicity systems;
(d) The measurement of the gravitational redshift of the Sun and other stars, and the determination of the nature of gravity in the weak-field regime.
Data treatment, unfortunately, is lagging behind. The main limitation is that the typical approach to spectral reduction is still based on reverse modelling: given the detector output, the input spectrum is recovered by tracing the signal along the echelle orders and integrating it along the transverse direction with an appropriate weighting ("optimal extraction"). While fast and robust, this method is formally incorrect, as it assumes that the instrument point-spread function (PSF) is separable along the CCD axes, which is never the case for curved echelle orders. The correct approach ("spectro-perfectionism" or SP; Bolton and Schlegel 2010, PASP 122, 248) is based on forward modelling, similar to what is commonly applied e.g. to X-ray data: given the instrument characteristics, the detector output is interpreted as the result of a linear transformation of the input spectrum, which can be recovered by inverting the transformation itself. The advantage of SP is indisputable, as it allows a simultaneous modelling of the target signal (including its contaminants, e.g. the sky background) and of the instrumental signature (including R). Analyses of simulated data show that the approach performs more than 10 times better than optimal extraction, according to different figures of merit. The cost is an increased technical and computational difficulty in evaluating and inverting the large matrices that characterise the instrumental transformation.
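The forward-modelling idea can be sketched in a few lines. The following toy 1D example is purely illustrative (a Gaussian PSF, uniform Gaussian noise, and small array sizes are assumptions, not HARPS values): it builds a design matrix whose columns are the PSFs of the individual wavelength samples, simulates a detector frame as a linear transformation of the input spectrum, and recovers the spectrum by weighted least squares.

```python
import numpy as np

# Toy 1D sketch of forward-model extraction: the detector output d is
# modelled as d = A f + noise, where column j of the design matrix A is
# the instrument PSF of wavelength sample j. The Gaussian PSF, the sizes,
# and the noise model below are illustrative assumptions.

n_wave, n_pix = 40, 60
wave = np.linspace(0.0, 1.0, n_wave)   # spectrum sampling
pix = np.linspace(0.0, 1.0, n_pix)     # detector pixel positions

sigma_psf = 0.02
A = np.exp(-0.5 * ((pix[:, None] - wave[None, :]) / sigma_psf) ** 2)
A /= A.sum(axis=0, keepdims=True)      # each PSF carries unit flux

# simulate a frame: true spectrum through the instrument, plus read noise
rng = np.random.default_rng(0)
f_true = 1.0 + 0.5 * np.sin(6.0 * np.pi * wave)
noise_sigma = 1e-4
d = A @ f_true + rng.normal(0.0, noise_sigma, n_pix)

# invert the linear transformation by weighted least squares:
# f_hat = (A^T N^-1 A)^-1 A^T N^-1 d, with N = noise_sigma^2 * I here
ATA = A.T @ A / noise_sigma**2
ATd = A.T @ d / noise_sigma**2
f_hat = np.linalg.solve(ATA, ATd)
```

In the real 2D case the columns of A are full 2D PSF images flattened over the detector pixels, which is where the large matrix sizes quoted below come from.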
The aim of this project is to implement the first full SP approach for super-stable spectrographs, namely HARPS, and to validate it on archival HARPS data. The time is ripe to bring SP to the high-precision cosmology arena it was originally meant for. In the case of HARPS, the size of the matrix to be inverted is of the order of 10^7, a figure that is now manageable in light of the following technical improvements:
(a) Computing capabilities have increased significantly (for CPUs: roughly 1.5x in clock speed and 3x in number of cores; for GPUs: roughly 3x in clock speed, 15x in VRAM size, and 30x in number of CUDA cores);
(b) The problem can be downscaled by splitting the spectra into chunks to reduce the matrix size.
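The chunking idea in (b) can be sketched as follows. The function below, its chunk and padding sizes, and the pixel-selection rule are illustrative assumptions, not the planned implementation: each chunk of wavelength samples is solved as a small least-squares problem, with padding on both sides to absorb PSF bleed across chunk borders (the padded samples are then discarded).

```python
import numpy as np

# Illustrative chunked solver for d = A f: split the wavelength axis into
# chunks, pad each chunk so the PSF wings of its samples are modelled, and
# keep only detector pixels whose flux comes (almost) entirely from the
# padded chunk, so the small linear model is exact for the rows it uses.
# Uniform noise is assumed, so plain least squares is used for brevity.

def extract_in_chunks(A, d, chunk=200, pad=12):
    """Solve d = A f chunk-by-chunk along the wavelength axis.

    A     : (n_pix, n_wave) design matrix (one PSF per column)
    d     : (n_pix,) detector counts
    chunk : wavelength samples solved per sub-problem
    pad   : extra samples kept on each side, then discarded
    """
    n_wave = A.shape[1]
    row_tot = A.sum(axis=1)
    f_hat = np.empty(n_wave)
    for start in range(0, n_wave, chunk):
        stop = min(start + chunk, n_wave)
        lo, hi = max(0, start - pad), min(n_wave, stop + pad)
        A_sub = A[:, lo:hi]
        # pixels essentially uncontaminated by wavelengths outside [lo, hi)
        rows = A_sub.sum(axis=1) > (1.0 - 1e-6) * row_tot
        sol, *_ = np.linalg.lstsq(A_sub[rows], d[rows], rcond=None)
        f_hat[start:stop] = sol[start - lo: stop - lo]
    return f_hat
```

Since each sub-problem involves a matrix with only `chunk + 2 * pad` columns, the cost of the inversion grows linearly with the spectrum length instead of cubically.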
The software will be written mostly in Python using common libraries (e.g. NumPy, SciPy, Astropy, XLA) and released as a stand-alone module through standard channels (e.g. INAF GitLab, pip, Anaconda). We will interface it with the official ESO HARPS reduction pipeline and with the Astrocook data analysis package (Cupani et al. 2020) to the benefit of the end users. We will also develop a pluggable SP module to be adapted to other instruments, specifically HARPS-N and ESPRESSO. We will finally study the possibility of extending the SP approach to next-generation spectrographs, like ELT ANDES, using state-of-the-art simulated data to investigate possible critical points.
The development of a full-SP data reduction tool for HARPS is impactful at two levels:
(a) It will boost the instrument's performance, exploiting its large amount of archival data (including double LFC references). HARPS's smaller detector size makes the problem of evaluating and inverting the transformation matrix A more tractable compared to ESPRESSO; in fact, a detailed assessment of the HARPS PSF from LFC frames is already in progress.
(b) From a wider perspective, it will foster the extension of the approach to different contexts. The HARPS test case will help explore, e.g., the degree to which the definition of A can be approximated without undermining the quality of the reduction, or the possibility of adapting SP to long-slit spectrographs, like the future VLT CUBES.
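The PSF assessment from LFC frames mentioned in (a) amounts to fitting a model to the nearly unresolved spots that the comb lines imprint on the detector. A minimal sketch, assuming a 2D elliptical Gaussian PSF model and a simulated spot (the real HARPS PSF is more complex and varies across the detector):

```python
import numpy as np
from scipy.optimize import curve_fit

# Each LFC line produces an isolated, nearly unresolved spot that directly
# samples the local PSF. Here a 2D elliptical Gaussian (an assumption for
# this sketch) is fitted to one simulated spot on a small detector cut-out.

def gauss2d(coords, amp, x0, y0, sx, sy):
    x, y = coords
    return amp * np.exp(-0.5 * (((x - x0) / sx) ** 2 + ((y - y0) / sy) ** 2))

# simulate a single comb-line spot with read noise
ny, nx = 15, 15
y, x = np.mgrid[0:ny, 0:nx]
truth = (100.0, 7.2, 6.8, 1.4, 1.1)   # amp, x0, y0, sx, sy (illustrative)
rng = np.random.default_rng(1)
spot = gauss2d((x, y), *truth) + rng.normal(0.0, 0.5, (ny, nx))

# fit the PSF model to the spot
p0 = (spot.max(), nx / 2, ny / 2, 1.0, 1.0)
popt, pcov = curve_fit(gauss2d, (x.ravel(), y.ravel()), spot.ravel(), p0=p0)
```

Repeating such fits over the thousands of comb lines in a frame maps the PSF shape as a function of detector position, which is the ingredient needed to build the columns of A.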