A new statistical reduction method was recently proposed by Hanot et al. (2011) to improve the accuracy of interferometric null measurements. The main idea behind this method is to consider the full statistical distribution of the measured null depth and to reproduce it as the sum of three terms: (i) the astrophysical null, (ii) the background-related null, and (iii) a random contribution related to all instrumental imperfections (phase difference, intensity mismatch, polarisation errors, etc.). For the latter two terms, we assume that the distributions of the measurable quantities (individual beam intensities and background) can be injected directly into the model, and that the phase difference can be modelled by a Gaussian distribution, from which we create random phase sequences. This leaves three free parameters in the model: the astrophysical null, the mean phase difference, and its standard deviation. We explore a range of values for these parameters and, for each triplet, we build several random theoretical null histograms that we compare to the observed one using a least-squares method. The best estimators for the three parameters are then obtained by minimising the chi-square. In this talk, we will show how this statistical method has been used at the Palomar Fiber Nuller to improve the accuracy of the measured astrophysical null by a factor of 10 compared to classical data reduction and calibration methods. We will then explain how we plan to adapt it to the LBTI/NOMIC nulling interferometer to reach unprecedented sensitivity levels in terms of detectable exozodiacal disks. First results based on LBTI/NOMIC data will be presented.
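The fitting procedure described in the abstract can be sketched in a few lines of Python. This is an illustrative mock-up, not the authors' pipeline: the two-beam null-depth formula, the synthetic intensity and background sequences, and the grid ranges are all assumptions made for the example (here the mean phase is held at zero and only the astrophysical null and the phase jitter are scanned).

```python
import numpy as np

rng = np.random.default_rng(0)

def instrumental_null(i1, i2, dphi):
    # Illustrative two-beam null depth: ratio of destructive to
    # constructive output for beam intensities i1, i2 and phase error dphi.
    num = i1 + i2 - 2.0 * np.sqrt(i1 * i2) * np.cos(dphi)
    den = i1 + i2 + 2.0 * np.sqrt(i1 * i2) * np.cos(dphi)
    return num / den

def model_histogram(na, mu_phi, sig_phi, i1, i2, bkg, bins):
    # Draw a random Gaussian phase sequence and combine it with the
    # *measured* intensity and background sequences, as the method does.
    dphi = rng.normal(mu_phi, sig_phi, size=i1.size)
    n = na + instrumental_null(i1, i2, dphi) + bkg
    hist, _ = np.histogram(n, bins=bins, density=True)
    return hist

# Synthetic stand-ins for the measured beam and background sequences.
n_pts = 20000
i1 = rng.normal(1.00, 0.02, n_pts)
i2 = rng.normal(0.98, 0.02, n_pts)
bkg = rng.normal(0.0, 1e-4, n_pts)
true = (2e-3, 0.0, 0.05)            # (Na, mean phase, phase rms)
bins = np.linspace(0.0, 0.02, 101)
obs = model_histogram(*true, i1, i2, bkg, bins)

# Brute-force grid search minimizing the chi-square between histograms.
best, best_chi2 = None, np.inf
for na in np.linspace(0.0, 4e-3, 9):
    for sig in np.linspace(0.02, 0.08, 7):
        # Average several random realizations to reduce Monte Carlo noise.
        mod = np.mean([model_histogram(na, 0.0, sig, i1, i2, bkg, bins)
                       for _ in range(5)], axis=0)
        chi2 = np.sum((obs - mod) ** 2)
        if chi2 < best_chi2:
            best, best_chi2 = (na, sig), chi2

print("best-fit Na, sigma_phi:", best)
```

The key point of the method is visible here: because the whole histogram shape is fitted rather than a mean null value, the astrophysical null can be disentangled from the phase-jitter contribution without observing a calibrator star.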
Disciplines :
Space science, astronomy & astrophysics
Author, co-author :
Marion, Lindsay ; Université de Liège - ULiège > Département d'astrophys., géophysique et océanographie (AGO) > Astroph. extragalactique et observations spatiales (AEOS)