Status as of 2003-03-06
- Get positions to better than 0.25 arcseconds
You should use only a large catalog with good sky coverage and good
astrometry, such as the
USNO-B1.0.
The catalog from which you select positions should contain all program
objects plus guide stars.
WCSTools SCAT should be able to convert anything reasonable
into an acceptable input for the next step.
Astrometric cluster catalogs
are probably not suitable because guide stars on the same astrometric
system will usually not be available.
We await the release of the complete 2MASS Point Source Catalog
and would like to have a local copy of the GSC II as the STScI server
is often down on weekends and at other awkward times.
Use the procedure outlined on the
"Getting Good Coordinates" web page
to make catalogs for each image with the WCSTools software, using one
of the above catalogs as the reference catalog. The large catalog can
then provide guide star candidates.
Software is still needed to merge the per-image catalogs into a single
project catalog.
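The missing merge step could work along these lines: concatenate the per-image catalogs and drop entries that duplicate an earlier position within a small match radius. This is only a sketch; the function name, input format, and match radius are illustrative, and a real tool would also reconcile photometry between catalogs.

```python
import math

def merge_catalogs(catalogs, match_radius_arcsec=1.0):
    """Merge per-image catalogs (lists of (id, ra_deg, dec_deg) tuples)
    into one project catalog, dropping entries whose position duplicates
    an earlier entry within match_radius_arcsec."""
    merged = []
    for cat in catalogs:
        for entry in cat:
            _, ra, dec = entry
            dup = False
            for _, mra, mdec in merged:
                # small-angle separation with cos(dec) correction, arcsec
                dra = (ra - mra) * math.cos(math.radians(dec)) * 3600.0
                ddec = (dec - mdec) * 3600.0
                if math.hypot(dra, ddec) < match_radius_arcsec:
                    dup = True
                    break
            if not dup:
                merged.append(entry)
    return merged
```

For the modest catalog sizes involved here the quadratic position check is adequate; a production tool would index by declination zone.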
- Find fiber assignments and select guide stars
Use John Roll's fitfibs or xfitfibs to create fiber
assignment tables to be taken to the telescope.
This step needs further testing and documentation before being
unleashed on the world.
It is not clear how guide stars are selected, and we need to know
the magnitude limits for guide stars. We should be able
to use existing software, such as
scat, to prepare the input for this stage.
- Make sure you have prepared your target list so that coordinates are
good to 0.25".
DO NOT COME TO THE TELESCOPE WITHOUT A PREPARED AND CHECKED TARGET LIST.
- Before arriving at the telescope, run the fiber positioning program;
check to make sure some fibers are on sky and the rest are on the targets.
- Take [x] biases at the beginning and end of evening.
- Take [x] flats at the beginning and the end of the evening.
If you are going to change spectral orders (filters)
and/or grating tilts, you need to take flats in each setup.
During the night also?
- Take x darks at least once during your run. Dark exposure times
should be the same as your longest object exposures.
- Take one comparison exposure at the beginning of the evening and
again at the end of the night.
- For each target pointing, take at least 3 identically timed exposures
so we can automatically remove cosmic rays.
From Lee Hartmann:
For many purposes, I don't think cosmic ray rejection is absolutely
necessary. For instance, when cross-correlating spectra for radial
velocities, we have so many lines in the bandpass that excising (a few)
CR events makes no difference to the correlation, at least in my
experience; these can also be excised in the spectrum if needed.
There might be some disadvantage to splitting up the exposure into three
chunks for faint objects. I suspect that cr rejection isn't going
to be important for most of my own programs.
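The reason three identically timed exposures allow automatic cosmic-ray removal is that a hit appears in only one exposure, so a per-pixel median rejects it. A minimal Python sketch of the idea (the actual step would use IRAF's imcombine; this function is illustrative only):

```python
from statistics import median

def median_combine(exposures):
    """Pixel-by-pixel median of three or more equally exposed frames
    (given as 2-D lists). A cosmic-ray hit present in only one frame
    is an outlier at its pixel and is rejected by the median."""
    ny, nx = len(exposures[0]), len(exposures[0][0])
    return [[median(exp[y][x] for exp in exposures) for x in range(nx)]
            for y in range(ny)]
```

With only two exposures the median degenerates to a mean and cannot tell a cosmic ray from signal, which is why at least three are requested.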
- Check with quick look after each new target exposure to make sure you
have an adequate integration time and to check your positions.
Again, for many purposes echelle users aren't going to want or
need to subtract sky because their objects are very bright.
- Sky exposures (during night or at twilight?)
If this sky is used for wavelength calibration, what do you do when
the beginning or end of the night clouds up?
- Velocity standards?
There should be standard fields for standards.
If standards are not part of a program, should they be observed?
Target List: Observer's list of objects to be observed.
Field: Observer defined region of target list
Pointing: Fiber positioning of a field; there may be multiple pointings
of a field.
Exposure/Integration: One of three or more images taken at a pointing.
Quick look is needed at multiple times:
- After first exposure to check brightness. Display counts above sky
in each fiber.
Maureen is tracking down an IRAF task which will give the total flux
along a trace, so this look would be a combination of viewing the image
and checking the flux for any or all fibers.
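Once the per-fiber trace fluxes are available, the "counts above sky" display could be as simple as subtracting the median sky-fiber level. A sketch, assuming the fluxes come from the IRAF task above; the function and input names are illustrative:

```python
from statistics import median

def counts_above_sky(fiber_fluxes, sky_fiber_ids):
    """Given total flux along the trace for each fiber
    ({fiber_id: counts}) and the ids of the sky fibers,
    return counts above the median sky level per fiber."""
    sky = median(fiber_fluxes[f] for f in sky_fiber_ids)
    return {f: flux - sky for f, flux in fiber_fluxes.items()}
```

Using the median of the sky fibers rather than the mean keeps one fiber that landed on a faint source from biasing the sky estimate.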
- After second exposure to check whether sources are showing up.
View images of each chip after running ccdproc and combining amplifiers
using ds9 or SAOimage.
This can be done now.
- After third or final exposure to make sure data is usable.
Reduce data using dohecto and sumspec, viewing the
resulting 2-d stacked spectrum files (one per chip) with image viewer.
This is the pipeline we have developed. It would be fairly easy
in either SAOimage or ds9 to add a task to view any specified fiber as a
plot.
- Remove bias and combine amplifiers
Use mscred ccdproc in IRAF to remove the bias and dark levels
from the frames and combine amplifiers into two images, one per detector,
each containing up to 150 fiber targets.
Existing IRAF code should work perfectly, but needs to be tested.
We need raw images from the separate chips for testing and long
exposure test data to see if the dark current is significant.
- Remove Cosmic Rays (Optional)
Use imcombine to combine the multiple exposures at a single
pointing, removing cosmic rays in the process.
Existing IRAF code should work perfectly, but needs to be tested.
This may not need to be done for Hectochelle; the plethora of spectral
features overwhelms a few bright cosmic rays.
- Extract Spectra
dohspec is the main processing program. Before running, check the
parameter settings by typing epar dofibers and typing
params. Run dofibers on each of the images created in step 1.
The output will be a multispec file in which each of the fibers has
been normalized, perhaps scattered light corrected, traced, extracted
and wavelength calibrated.
This task will have to be rewritten or the wavelength dispersion
correction done as a separate step afterward if we have to correct for
shifts in spectrum position. The existing form of dofibers can be used
for quick-look reduction. How will sky flats be taken? What sort of
scattered light correction is needed?
- Recalibrate Spectra
Find and repair any shift which occurred to the spectra.
An IRAF task (to be written) similar to reref will be used,
once it is debugged, to analyze and fix calibration shifts.
For Hectochelle, reference fibers will be used rather than sky lines.
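The core of such a shift analysis is a cross-correlation of an extracted spectrum against a reference (for Hectochelle, a reference fiber). A minimal integer-pixel sketch of that idea; a real task would fit the correlation peak to get a sub-pixel shift, and the function name is illustrative:

```python
def find_shift(reference, spectrum, max_shift=5):
    """Return the integer-pixel shift of `spectrum` relative to
    `reference`, found by maximizing the cross-correlation over
    shifts of +/- max_shift pixels."""
    best, best_corr = 0, float("-inf")
    n = len(reference)
    for s in range(-max_shift, max_shift + 1):
        # overlap region where both indices are in bounds
        corr = sum(reference[i] * spectrum[i + s]
                   for i in range(max(0, -s), min(n, n - s)))
        if corr > best_corr:
            best, best_corr = s, corr
    return best
```

The measured shift would then be applied to the wavelength solution (or the spectrum resampled) before sky subtraction or velocity work.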
- Rebin Spectra for sky subtraction (Optional)
Rebin each of the 300 extracted spectra from the multispec file
into a stacked-spectrum 2-D file so they all have the same
number of pixels and cover the same wavelength range.
This task works. This is where quick look for all of the spectra
would be done.
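Putting every spectrum on a common grid amounts to interpolating each one onto the same wavelength array. A simple linear-interpolation sketch; a production rebinner would conserve flux rather than interpolate, and the function name is illustrative:

```python
def rebin(wave, flux, new_wave):
    """Linearly interpolate a spectrum (wave, flux), with wave in
    increasing order, onto a common wavelength grid so all spectra
    share the same number of pixels and coverage. Points outside
    the input range are held at the end values."""
    out = []
    for w in new_wave:
        if w <= wave[0]:
            out.append(flux[0])
        elif w >= wave[-1]:
            out.append(flux[-1])
        else:
            # find the input pixel bracketing w from below
            j = max(i for i in range(len(wave)) if wave[i] <= w)
            t = (w - wave[j]) / (wave[j + 1] - wave[j])
            out.append(flux[j] + t * (flux[j + 1] - flux[j]))
    return out
```

With all 300 spectra on one grid, the stacked-spectrum 2-D file is just these output rows written in fiber order.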
- Create individual spectrum files
Use the hectospec.hsplit task to split the two files
of stacked spectra into individual multispec files, putting these newly
created files into a new directory.
This task works.
- Remove Background/Sky (Optional)
- In this newly created master subdirectory, and outside of IRAF, run
skyproc which finds, sums, and then subtracts skies from the
object spectra. The output files are multispec files with 3 vectors:
sky-subtracted object, sky + object, and sky.
This program works, but it will not always be needed for Hectochelle.
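The three output vectors described for skyproc can be illustrated with a minimal sketch, assuming the sky fibers have already been identified; the function name and the use of a straight mean for the summed sky are assumptions for illustration:

```python
from statistics import mean

def sky_subtract(object_spec, sky_specs):
    """Average the sky-fiber spectra and subtract from an object
    spectrum, returning the three vectors in the output order
    described: sky-subtracted object, sky + object, and sky."""
    npix = len(object_spec)
    sky = [mean(s[i] for s in sky_specs) for i in range(npix)]
    subtracted = [o - s for o, s in zip(object_spec, sky)]
    return subtracted, object_spec, sky
```

Keeping all three vectors in the output file lets a user redo the subtraction later with a different sky estimate.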
- Produce a total count summary and compare with input magnitudes.
A script using IRAF's imstat task (or Doug's more scriptable
variation istat) would do the trick.
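Comparing total counts with input magnitudes needs the counts converted to instrumental magnitudes first. A sketch of that conversion; the zeropoint value is a placeholder, not a measured Hectochelle quantity:

```python
import math

def instrumental_mags(counts, zeropoint=25.0):
    """Convert summed counts per fiber ({fiber_id: counts}) to
    instrumental magnitudes, m = zeropoint - 2.5 log10(counts),
    for comparison with catalog magnitudes. Fibers with no
    positive counts are skipped."""
    return {f: zeropoint - 2.5 * math.log10(c)
            for f, c in counts.items() if c > 0}
```

A constant offset between these and the catalog magnitudes is the throughput term; fibers far off the trend flag misplaced fibers or bad positions.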
- Compute redshifts (Optional)
Use IRAF rvsao.xcsao to
find the redshifts of all of the object spectra.
We will have to make or find good templates for Hectochelle.
- Quality Control
Methods are yet to be determined.
- Distribute Data
Use the gather scripts written
by Bill Wyatt to distribute data to principal investigators.
This will be handled almost exactly the way FAST distribution is done,
given that the final data are FITS multispec files just like FAST.
Individual spectra will be stored in a secure archive in a structure
to be determined. Pointing directories within night directories could
be easily accessed through reduced file numbers yyyymmdd.pppnnn,
where yyyymmdd is the UT date, ppp is the pointing
sequence number (or maybe the sequence number of the first exposure within
a pointing), and nnn is the fiber number.
Calibration files and velocity standards should
go into a parallel public archive immediately; PI files should be copied
from the secure archive to the public archive when their proprietary period
expires or when the PI releases the data, whichever comes first.
This is pretty close to how FAST works now. There would have to
be a new web interface and RFN to path code would have to be written for
the new RFN format, but not much else would be different. A database
needs to be set up to enable access by a variety of characteristics,
allowing searches for observations with similar orders, order-separating
filters, or grating settings.
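The RFN-to-path code called for above could start from a simple parser for the proposed yyyymmdd.pppnnn form. The directory layout and file extension here are assumptions for illustration; only the RFN format itself comes from the plan:

```python
def rfn_to_path(rfn):
    """Split a reduced file number of the proposed form yyyymmdd.pppnnn
    (UT date, pointing sequence number, fiber number) into its parts
    and build an archive path. The night/pointing directory layout is
    an illustrative assumption."""
    date, rest = rfn.split(".")
    pointing, fiber = rest[:3], rest[3:]
    return f"{date}/{pointing}/{fiber}.fits", (date, pointing, fiber)
```

Keeping all layout knowledge in one function like this means a later change to the archive structure touches only this code, not the web interface.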