Status as of 2003-01-30
- Get positions to better than 0.25 arcseconds
You should use only a large catalog with good sky coverage and good
astrometry, such as the USNO-B1.0.
The catalog from which you select positions should include all program
objects plus guide stars.
WCSTools SCAT should be able to convert anything reasonable
into an acceptable input for the next step.
Astrometric cluster catalogs
are probably not suitable because guide stars on the same astrometric
system will usually not be available.
We await the release of the complete 2MASS Point Source Catalog
and would like to have a local copy of the GSC II as the STScI server
is often down on weekends and at other awkward times.
Use the procedure outlined on the
"Getting Good Coordinates" web page
to make catalogs for each image with the WCSTools software, using one
of the above catalogs as the reference catalog. The large catalog can
then provide guide star candidates.
Software is still needed to merge these catalogs into a single project catalog.
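As a rough sketch of what the missing merge software might do, assuming simple (id, ra, dec) catalogs in decimal degrees (the function name, input format, and matching tolerance are all placeholders, not the real tool):

```python
# Hypothetical sketch of merging per-image catalogs into one project
# catalog, deduplicating sources that fall within a small matching
# radius.  Brute-force position match; ignores the cos(dec) factor
# for simplicity.

def merge_catalogs(catalogs, tol_arcsec=0.5):
    """Merge lists of (id, ra_deg, dec_deg) tuples, dropping sources
    closer than tol_arcsec to one already in the merged list."""
    tol_deg = tol_arcsec / 3600.0
    merged = []
    for cat in catalogs:
        for src in cat:
            _, ra, dec = src
            dup = any(abs(ra - r) < tol_deg and abs(dec - d) < tol_deg
                      for _, r, d in merged)
            if not dup:
                merged.append(src)
    return merged
```

A real merger would also have to reconcile magnitudes and IDs between catalogs, not just positions.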
- Find fiber assignments and select guide stars
Use John Roll's fitfibs or xfitfibs to create fiber
assignment tables to be taken to the telescope.
This step needs further testing and documentation before being
unleashed on the world.
It is not clear how guide stars are selected, and we need to know
the magnitude limits for guide stars. We should be able
to use existing software, such as
scat, to prepare the input for this stage.
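Once the magnitude limits are known, the guide star pre-selection could be as simple as the sketch below; the bright and faint limits shown are placeholder values, not real Hectospec requirements:

```python
# Sketch of selecting guide star candidates from a scat-style catalog
# by magnitude.  The actual guide star magnitude limits are not yet
# known; bright/faint here are hypothetical defaults.

def guide_star_candidates(stars, bright=12.0, faint=15.0):
    """stars: list of (id, ra, dec, mag) tuples.
    Keep stars within the usable magnitude range, inclusive."""
    return [s for s in stars if bright <= s[3] <= faint]
```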
- Make sure you have prepared your target list so that coordinates are
good to 0.25".
DO NOT COME TO THE TELESCOPE WITHOUT A PREPARED AND CHECKED TARGET LIST.
- Before arriving at the telescope, run the fiber positioning program;
check to make sure some fibers are on sky and the rest are on targets.
- Take at least 10 biases each night.
- Take 10 domeflats each night.
- Each night take 10 twilight sky flats varying the exposure time from
2 seconds to 50 seconds.
- Take at least 5 darks at least once during your run. Dark exposure times
should be the same as your longest object exposures.
- Take a minimum of three 5-minute comparison exposures each night.
- For each target pointing, take at least 3 identically timed exposures
so we can automatically remove cosmic rays.
- Check with quick look after each new target exposure to make sure you
have an adequate integration time and to check your positions.
- Velocity standards?
There should be standard fields for velocity standards.
Target List: Observer's list of objects to be observed.
Field: Observer-defined region of the target list.
Pointing: Fiber positioning of a field; there may be multiple pointings of a
single field.
Exposure/Integration: One of three or more images taken at a single pointing.
Needed at multiple times:
- After first exposure to check brightness. Display counts above sky
in each fiber.
Maureen has found an IRAF task which will give total flux along a
trace, so this look would be a combination of viewing the image and
checking the flux for any or all fibers.
- After second exposure to check whether sources are showing up.
View images of each chip after running ccdproc and combining amplifiers
using ds9 or SAOimage.
This can be done now.
- After third or final exposure to make sure data is usable.
Reduce data using dohecto and sumspec, viewing the
resulting 2-d stacked spectrum files (one per chip) with image viewer.
This is the pipeline we have developed. It would be fairly easy
in either SAOimage or ds9 to add a task to view any specified fiber as a
plot.
- Remove bias and combine amplifiers
Use mscred ccdproc in IRAF to remove the bias and dark levels
from the frames and combine amplifiers into two images, one per detector,
each containing up to 150 fiber targets.
Existing IRAF code should work perfectly, but needs to be tested.
We need raw images from the separate chips for testing and long
exposure test data to see if the dark current is significant.
- Remove Cosmic Rays
Use imcombine to combine the multiple exposures at a single
pointing, removing cosmic rays in the process.
Existing IRAF code should work perfectly, but needs to be tested.
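What imcombine does for cosmic rays can be illustrated with a per-pixel median across the identically timed exposures; this toy version ignores the scaling, weighting, and other rejection options the real task offers:

```python
# Minimal illustration of cosmic-ray rejection by combining: given
# three or more identically timed exposures, a per-pixel median
# discards a pixel value hit by a cosmic ray in any single frame.

def median_combine(frames):
    """frames: list of equal-shaped 2-D lists of pixel values.
    Returns the per-pixel median image."""
    ny, nx = len(frames[0]), len(frames[0][0])
    out = [[0.0] * nx for _ in range(ny)]
    for y in range(ny):
        for x in range(nx):
            vals = sorted(f[y][x] for f in frames)
            out[y][x] = vals[len(vals) // 2]
    return out
```

This is why the observing checklist asks for at least 3 identically timed exposures per pointing: with only two frames a median cannot tell which pixel is the cosmic ray.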
- Extract Spectra
dohspec is the main processing program. Before running, check the
parameter settings by typing epar dofibers and then
params. Run dofibers on each of the images created in step 1.
The output will be a multispec file in which each of the fibers has
been normalized, perhaps scattered light corrected, traced, extracted
and wavelength calibrated.
This task will have to be rewritten, or the wavelength dispersion
correction done as a separate step afterward, if we have to correct for
shifts in spectrum position. The existing form of dofibers can be used
for quick-look reduction. How will sky flats be taken?
- Recalibrate Spectra
Find and repair any shift which occurred to the spectra.
An IRAF task (to be written), similar to reref, will be used,
once it is debugged, to analyze and fix calibration shifts.
- Rebin Spectra for sky subtraction
Rebin each of the 300 extracted spectra from the multispec file
into a stacked-spectrum 2-D file so they all have the same
number of pixels and cover the same wavelength range.
This task works. This is where quick look for all of the spectra
could be done.
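The rebinning step amounts to putting every fiber on one common linear wavelength grid. A minimal sketch using plain linear interpolation follows (a real pipeline would want a flux-conserving rebin; the function name and arguments are illustrative):

```python
# Interpolate one extracted spectrum (wave, flux) onto a common
# linear grid of npix points from w0 to w1, so every fiber ends up
# with the same number of pixels over the same wavelength range.
# Input wavelengths are assumed ascending.

def rebin(wave, flux, w0, w1, npix):
    step = (w1 - w0) / (npix - 1)
    out = []
    for i in range(npix):
        w = w0 + i * step
        # find the input pixel pair bracketing w
        j = 0
        while j < len(wave) - 2 and wave[j + 1] < w:
            j += 1
        frac = (w - wave[j]) / (wave[j + 1] - wave[j])
        out.append(flux[j] + frac * (flux[j + 1] - flux[j]))
    return out
```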
- Create individual spectrum files
Use the hectospec.hsplit task to split the two files
of stacked spectra into individual multispec files, putting these newly
created files into a new directory.
This task works.
- Remove Background/Sky
- In this newly created master subdirectory, and outside of IRAF, run
skyproc which finds, sums, and then subtracts skies from the
object spectra. The output files are multispec files with 3 vectors:
sky-subtracted object, sky + object, and sky.
This program works.
- Or use singular value decomposition to remove skies from the stacked
spectrum file after rebinning and before creating the individual
spectrum files.
There is a script in the IRAF svdfit package which will run
the multiple passes necessary to do this.
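The first (skyproc-style) approach can be sketched as a mean of the sky fibers subtracted from each object fiber; the function name and data layout below are illustrative only, not skyproc's actual interface:

```python
# Toy version of the skyproc step: average the designated sky fibers
# into a mean sky, subtract it from every object fiber, and return
# the three vectors described above for each object:
# (sky-subtracted object, sky + object, sky).

def subtract_sky(objects, skies):
    """objects, skies: lists of equal-length flux lists."""
    n = len(skies[0])
    mean_sky = [sum(s[i] for s in skies) / len(skies) for i in range(n)]
    results = []
    for obj in objects:
        sub = [o - m for o, m in zip(obj, mean_sky)]
        results.append((sub, obj, mean_sky))
    return results
```

The SVD alternative differs in that it fits each object's sky as a combination of sky eigenspectra rather than using one mean sky.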
- Produce a total count summary and compare with input magnitudes.
A script using IRAF's imstat task (or Doug's more scriptable
variation istat) would do the trick.
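Such a script boils down to converting summed counts per fiber to an instrumental magnitude and differencing with the catalog value; the zero point below is a placeholder, not a measured Hectospec value:

```python
import math

# Sketch of comparing total counts with input catalog magnitudes:
# convert counts to an instrumental magnitude with a (hypothetical)
# zero point and look at the residuals.  Large residuals flag fibers
# that missed their targets or got too little signal.

def count_residuals(fibers, zp=25.0):
    """fibers: list of (total_counts, catalog_mag) pairs.
    Returns instrumental-minus-catalog residuals in magnitudes."""
    return [zp - 2.5 * math.log10(c) - m for c, m in fibers]
```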
- Compute redshifts
Use IRAF rvsao.xcsao to
find the redshifts of all of the object spectra.
We will have to make or find good templates for Hectospec.
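The idea behind xcsao can be illustrated in miniature: on a log-wavelength grid a redshift is a rigid pixel shift, so the lag that maximizes the correlation against a zero-velocity template measures cz. This toy version omits the continuum removal, apodizing, and error analysis the real task performs:

```python
# Toy cross-correlation: slide a zero-velocity template past an
# object spectrum (both assumed on identical log-lambda grids) and
# return the integer pixel lag with the highest correlation.

def best_lag(spectrum, template, max_lag):
    best, best_c = 0, float("-inf")
    n = len(spectrum)
    for lag in range(-max_lag, max_lag + 1):
        # correlate over the overlapping pixel range only
        c = sum(spectrum[i] * template[i - lag]
                for i in range(max(0, lag), min(n, n + lag)))
        if c > best_c:
            best, best_c = lag, c
    return best
```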
- Quality Control
Methods are yet to be determined.
- Distribute Data
Use the gather scripts written
by Bill Wyatt to distribute data to principal investigators.
This will be handled almost exactly the way FAST distribution is done,
given that the final data are FITS multispec files, just like FAST.
Individual spectra will be stored in a secure archive in a structure
to be determined. Pointing directories within night directories could
be easily accessed through reduced file numbers yyyymmdd.pppnnn,
where yyyymmdd is the UT date, ppp is the pointing
sequence number (or maybe the sequence number of the first exposure within
a pointing), and nnn is the fiber number.
Calibration files and velocity standards should
go into a parallel public archive immediately; PI files should be copied
from the secure archive to the public archive when their proprietary period
expires or when the PI releases the data, whichever comes first.
This is pretty close to how FAST works now. There would have to
be a new web interface and RFN to path code would have to be written for
the new RFN format, but not much else would be different.
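Whatever the final field definitions turn out to be, the RFN-to-path code will need to split the yyyymmdd.pppnnn string into its parts. A trivial sketch, assuming exactly three digits each for ppp and nnn:

```python
# Parse the proposed reduced-file-number (RFN) format yyyymmdd.pppnnn
# into (UT date, pointing sequence, fiber number).  Whether ppp is
# the pointing sequence or the first-exposure sequence is still an
# open question above; this sketch just splits the fields.

def parse_rfn(rfn):
    """Split 'yyyymmdd.pppnnn' into (ut_date, pointing, fiber)."""
    date, rest = rfn.split(".")
    return date, int(rest[:3]), int(rest[3:])
```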