Date: Mon, 6 Jan 2003 11:41:02 -0500 (EST)
From: F Peter Schloerb
To: "Wesley A. Traub"
cc: rafael@ipac.caltech.edu, monnier@umich.edu, epedretti@cfa.harvard.edu,
    sragland@cfa.harvard.edu, schloerb@astro.umass.edu
Subject: Re: ph cl test

I think this is a good idea too. Here's what I've been up to and some
initial results....

1. Developed a closure estimation algorithm, as discussed in the previous
set of emails circulated.

2. Adapted my data reduction program to read the new three-fringe data
type and estimate visibilities for all three fringes. Recall that my
procedure is to fit the ideal response to the actual fringe data over
about two fringes in the center of the packet (a rough sketch of this
kind of fit is appended at the end of this note). This is now "working",
but I find that the new data have some problems not seen in the previous
NICMOS data sets. The "fast fringe" data look quite good, comparable to
what we are used to, but the frequency of the "slow fringes" is quite
variable, and this has caused complications. To wit:

   - Finding the best place to make a fit has been problematic. I updated
     the fringe finder in my program to use the algorithm used by the
     fringe tracker with variable frequency. This seems to work well to
     find the "central" fringe and also get an initial guess at the best
     frequency. A decided improvement in the program.

   - Even though veterans of my program will recall that it allows the
     option to solve for the fringe frequency, I find that I don't get a
     decent solution in all cases. Thus, there are typically a lot of bad
     fits (maybe up to 10-20% sometimes) in a 500-scan file of slow
     fringes. I am trying to figure out why this is the case.

3. Next steps:

   - Add the closure calculation to the program, using the algorithm I've
     already developed unless we come up with a better one. One basic
     decision I've made has been to estimate the visibilities from the
     peaks in the packets, but estimate the closure phase from the same
     time interval, as it needs to be (a sketch of the triple-product
     estimate is also appended below).

   - Try to figure out what the distribution of visibilities should look
     like given the fibre beam combiner. I note that the distributions I
     am observing look a lot like those seen with NICMOS and the optical
     beam splitter. Thus, they are not Gaussian, but rather have a
     low-visibility tail, which in the old system was thought to be due
     to beam overlap issues. I was expecting that the visibilities would
     be much more stable with the new system and that there would be
     greater variations in the intensities, but that doesn't seem to be
     the case. Perhaps I'm missing something.

   - Replace the hybrid mapping software I removed long ago.

4. For the record, I am now working primarily with the following scans in
directory 2002Nov12: iota0-iota7 (iota8, 9, 10, and 11 are background
measurements). This is a calibrator with decent-looking fringes on a
decent night of seeing. Interested parties are welcome to do tests on
these files, or any others from "my" run. After all, we are all in this
together....

5. Once the program works to get amplitudes and closure phases, I'll
install it at the site for post-realtime analysis and quick-look
reduction. I note that we COULD implement some of these algorithms (like
the closure phase algorithm) on the data immediately and display initial
results at the end of each scan as a part of the control system. If
people think that sounds like a good idea, I think it would be easy to
add.
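As a concrete (if oversimplified) illustration of the fit in item 2, here
is a minimal sketch in Python. The model is just a constant plus a single
cosine at an assumed known fringe frequency; the packet-centre and window
logic, the name fit_fringe, and its arguments are placeholders for
illustration only, not the actual code in my program:

import numpy as np

def fit_fringe(scan, f_fringe, dt, n_periods=2.0):
    """Fit offset + fringe at a known frequency over ~n_periods fringes
    centred on the packet peak.

    scan     : 1-D array of detector counts for one scan
    f_fringe : fringe frequency in Hz (e.g. from the fringe finder)
    dt       : sample spacing in seconds
    Returns (amplitude, phase, offset).
    """
    t = np.arange(scan.size) * dt
    # Very crude packet-centre guess: largest excursion from the median.
    centred = scan - np.median(scan)
    centre = np.argmax(np.abs(centred))
    # Window spanning about n_periods fringe periods around the centre.
    half = int(round(0.5 * n_periods / (f_fringe * dt)))
    lo, hi = max(0, centre - half), min(scan.size, centre + half + 1)
    tw, sw = t[lo:hi], scan[lo:hi]
    # Linear model: s(t) = offset + b*cos(2*pi*f*t) + c*sin(2*pi*f*t)
    A = np.column_stack([np.ones_like(tw),
                         np.cos(2 * np.pi * f_fringe * tw),
                         np.sin(2 * np.pi * f_fringe * tw)])
    offset, b, c = np.linalg.lstsq(A, sw, rcond=None)[0]
    amplitude = np.hypot(b, c)     # fringe amplitude
    phase = np.arctan2(-c, b)      # s ~ offset + A*cos(2*pi*f*t + phase)
    return amplitude, phase, offset

The raw visibility would then be this fitted amplitude divided by an
appropriate intensity normalization, which I leave out here.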
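For item 3, here is a similarly minimal sketch of the standard
triple-product (bispectrum) estimate of the closure phase, with made-up
numbers only to show that the random per-baseline phases drop out. The
sign convention (phi_AB + phi_BC - phi_AC, i.e. baseline AC = AB + BC)
and the names are assumptions for illustration, not necessarily what the
program will use:

import numpy as np

def closure_phase(v_ab, v_bc, v_ac):
    """Closure phase from per-scan complex visibilities, all estimated
    from the same time interval within each scan.

    Convention assumed here: baseline AC = AB + BC, so the closure
    phase is phi_AB + phi_BC - phi_AC.
    """
    # Per-scan triple product (bispectrum).
    bispectrum = v_ab * v_bc * np.conj(v_ac)
    # Average the complex triple product over scans, then take the
    # argument; this is more robust than averaging wrapped phases.
    return np.angle(np.mean(bispectrum))

# Toy example: a 10-degree closure phase under random per-baseline phases.
rng = np.random.default_rng(0)
n = 500
phi_ab = rng.uniform(-np.pi, np.pi, n)        # random atmospheric phase
phi_bc = rng.uniform(-np.pi, np.pi, n)
phi_ac = phi_ab + phi_bc - np.deg2rad(10.0)   # closes to +10 degrees
cp = closure_phase(0.8 * np.exp(1j * phi_ab),
                   0.7 * np.exp(1j * phi_bc),
                   0.6 * np.exp(1j * phi_ac))
print(np.rad2deg(cp))                          # ~ +10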
Pete

On Mon, 23 Dec 2002, Wesley A. Traub wrote:

> All:
>
> Several of us are writing phase-closure programs, independently.
> We may or may not choose to have a standard version in the future.
> However, for the moment, it would be good if we applied a basic sanity
> check to the currently-developing versions.
>
> Therefore I suggest that we select 3 blocks of data, one each from the
> runs of Monnier/Millan-Gabet, Schloerb, and Ragland, for each of the
> others to analyze. This would be a purely internal exercise.
>
> I am willing to act as a middleman in the process, but if one of you is
> dying to do it instead, that's fine too. Whatever the process, I think
> we need this sort of self-check.
>
> Ideally a data set would be at least one cycle of the following:
> cal scan set
> target scan set (preferably a non-pt target, but anything is OK)
> cal scan set
>
> I think that we could allow 2 weeks for an initial pass at the data
> sets, followed by a compilation of results, and perhaps a second pass.
>
> Let me know what you think about this. And if you can, send us all a
> data set at the same time.
>
> Best wishes,
> Wes
>
> --------------------------------------------------------------
> Wesley A. Traub                     wtraub@cfa.harvard.edu
> Harvard-Smithsonian Center for Astrophysics
> 60 Garden St., Cambridge MA 02138
> http://cfa-www.harvard.edu/cfa/oir/IOTA/
> tel 617-495-7406; cell 978-852-6390; fax 617-496-0121
> Travel:
> vacation, NYC, 23-26 Dec.
> AAS meeting, TPF talk, Seattle, 5-9 Jan.
> FIRST meeting, Logan, Utah, 22-23 Jan.
> Darwin/TPF meeting, Heidelberg, 22-25 April
> --------------------------------------------------------------
>