Affiliation: Univ Lyon, Univ Lyon1, Ens de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon, France

Abstract Reference: 31090
Identifier: P1.8
Presentation: Poster presentation
Key Theme: 1 Reduction and Analysis Algorithms for Large Databases and Vice-versa

Advanced Data Reduction for the MUSE Deep Fields

Authors:
Conseil Simon, Bacon Roland, Piqueras Laure, Shepherd Martin

The Multi Unit Spectroscopic Explorer (MUSE) is a second-generation instrument installed at the Very Large Telescope (VLT). It is an integral-field spectrograph operating in the visible wavelength range. The official MUSE pipeline is available from ESO. However, for the data reduction of the Deep Fields program (Bacon et al., in prep.), we have built a more sophisticated pipeline that extends the official one with additional recipes. With so many exposures (~300 in our case, ~2.2 TB of raw data), it becomes infeasible to keep track of everything manually; it is crucial to build automated and reproducible procedures for tasks like associating multiple calibration files with a specific observation. Similarly, since the official pipeline only provides tools for reducing a single exposure, we have built a custom data-reduction system around a database that contains all of the information from the science and calibration exposures. This lets us reliably identify files based on criteria such as time offsets or temperature differences. A number of additional recipes are also used to reduce the data:
- An automatic flat-fielding procedure, using the sky level as a reference. This is part of the recently released MPDAF Python package (see the presentation at ADASS).
- Advanced sky subtraction with ZAP (Soto et al., 2016MNRAS.458.3210S), also released this year. ZAP uses principal component analysis (PCA) to isolate and remove residual sky-subtraction features.
- Various masking steps, to remove instrumental artifacts that cannot be corrected.
- Combination of exposures at the data-cube level, which allows additional processing steps to be run on the cubes before they are combined. Also part of MPDAF.
- An automated procedure for estimating pointing errors and PSF dimensions by comparing MUSE and HST images of the same fields.
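The database-driven calibration matching mentioned above can be sketched as follows. This is a minimal illustration, not the actual pipeline code: the table schema, column names, and tolerance values are assumptions. For a given science exposure, it selects the flat-field taken closest in time whose ambient temperature is within a chosen tolerance:

```python
import sqlite3

# Hypothetical schema: one row per calibration exposure.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE calib (filename TEXT, dtype TEXT, mjd REAL, temp REAL)"
)
conn.executemany(
    "INSERT INTO calib VALUES (?, ?, ?, ?)",
    [("FLAT_A.fits", "FLAT", 57600.10, 12.1),
     ("FLAT_B.fits", "FLAT", 57600.45, 11.4),
     ("FLAT_C.fits", "FLAT", 57601.90, 12.0)],
)

def find_flat(mjd_sci, temp_sci, max_dtemp=0.5):
    """Return the flat closest in time with |dT| <= max_dtemp."""
    row = conn.execute(
        """SELECT filename FROM calib
           WHERE dtype = 'FLAT' AND ABS(temp - ?) <= ?
           ORDER BY ABS(mjd - ?) LIMIT 1""",
        (temp_sci, max_dtemp, mjd_sci),
    ).fetchone()
    return row[0] if row else None

# For a science exposure at MJD 57600.50 and 12.2 C, FLAT_B is
# nearest in time but fails the temperature cut (|dT| = 0.8), so
# FLAT_A (|dT| = 0.1, |dt| = 0.40 d) is selected instead.
```

The same pattern extends to any pairing criterion stored in the database (time offsets, temperatures, instrument modes), which is what makes the association reproducible rather than manual.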
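The idea behind the sky-based flat-fielding recipe can be illustrated in spirit (this is not the MPDAF implementation, just the underlying principle): the blank-sky level measured in each channel serves as a reference, and each channel is rescaled so that all of them see the same sky flux:

```python
from statistics import median

def autocalib_by_sky(channels):
    """Compute per-channel multiplicative correction factors.

    `channels` maps a channel id to a list of sky-pixel fluxes.
    Illustration only: the real recipe works on the IFU slices of a
    MUSE cube, with robust (sigma-clipped) statistics.
    """
    sky = {ch: median(v) for ch, v in channels.items()}
    ref = median(sky.values())
    return {ch: ref / s for ch, s in sky.items()}

factors = autocalib_by_sky({
    "ifu1": [98, 100, 102],   # sky level 100
    "ifu2": [118, 120, 122],  # sky level 120 -> scaled down
    "ifu3": [79, 80, 81],     # sky level 80  -> scaled up
})
# Reference sky = median(100, 120, 80) = 100, so the factors are
# ifu1 -> 1.0, ifu2 -> 100/120, ifu3 -> 1.25.
```

Using the sky itself as the reference makes the correction independent of any astrophysical source in the field.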
This reduction pipeline is built with Python and various libraries, such as python-cpl (a wrapper for the ESO pipelines) and doit (a workflow management system). We also make frequent use of Jupyter notebooks for quality analysis; these can be run on a server and accessed remotely. Once a notebook is ready, we use it to generate HTML pages for each individual exposure or for the combined cubes.
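As a sketch of how doit fits in: each reduction step is declared as a task with file dependencies and targets, so only out-of-date steps are re-run when an input changes. The file names and command below are hypothetical, not the actual pipeline tasks:

```python
# dodo.py -- a minimal, hypothetical doit task file.
# doit discovers functions named task_* and runs their actions only
# when a target is missing or a file_dep has changed.

RAW = "MUSE_SCI_001.fits"       # hypothetical raw science exposure
PIXTABLE = "PIXTABLE_001.fits"  # hypothetical reduction product

def task_scibasic():
    """Basic science reduction for one exposure (illustrative)."""
    return {
        "actions": [f"esorex muse_scibasic {RAW}"],
        "file_dep": [RAW],
        "targets": [PIXTABLE],
    }
```

Running `doit` in the directory containing such a file executes only the tasks whose targets are stale, which keeps a ~300-exposure reduction both automated and reproducible.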