24µm Focal Plane Survey, Coarse

Principal: Jocelyn Keene
Deputy: Jane Morrison, Bill Wheaton
Data Monkey(s): Jane Morrison, Bill Wheaton
Priority: Critical
Downlink Priority: Normal
Analysis Time: Campaign F: 60 hours, Campaign G: 60 hours, Combined results for F and G: 2 hours
Last Updated:


Objective

To measure the pixel locations (i.e. array orientation, scale and distortion) as a function of scan mirror angle for the 24µm array.


Description

Use an IER based on the 24 µm photometry AOT for a compact source to measure the locations on the sky of several positions on the array as a function of scan mirror angle. This task is performed in both Campaign F and Campaign G. After Campaign G is finished, the results from Campaigns F and G are merged to update Frame Table # 9. The frame table updates for all the campaigns can be found in the following table:
Frame Table Updates

Data Collected

The IER for the 24µm Focal Plane Survey, Coarse collects 336 DCEs, of which 234 have the source on the array (the others will contain only much weaker sources). The IER is set up so that data are collected in 3 columns on the array: on the left side, center and right side of the array. After moving the telescope in the V direction by the V offset, the observational pattern is repeated 6 more times (7 sets in all, with a V offset between each set).

Calibration Star

See IER for the star chosen. This star was chosen as the 24 µm focal plane calibrator based on the following requirements:
  • The star needs to be used as both the PCRS star and the MIPS focal plane star. Note that PCRS stars have V magnitudes between 7 and 10 (this catalog is based on the Tycho and Hipparcos stars).
  • In the CVZ
  • MIPS requirements: stellar brightness corresponding to a S/N of 30 in 3-second integrations, i.e. at least 14 mJy (K mag 6.53); acceptable range 14 to 500 mJy (K = 6.5 to 1.3)

    Observing Strategy


    Figure 1. Observing pattern at 3 locations on the detector. Note that the size of the box is not the total size of the array but a portion of the central region, which depends on the array and on whether the survey is coarse or fine.

    Definitions:

  • W axis direction is defined by the Frame Table, and is always within +/- 90 degrees of the TPF z axis as projected on the sky. Motion along this axis corresponds to spacecraft motion (left/right).
  • V axis direction is defined by the Frame Table, and is always within +/- 90 degrees of the TPF y axis as projected on the sky. Motion along this axis corresponds to motion in the scan mirror direction (up/down).
  • W offset: the amount of motion in the W direction, which sets the spacing between the left-array, middle-array and right-array observations.
  • V offset: the amount of motion in the V direction between successive sets of observations.

    24 µm Coarse FPS observational parameters:

  • W offset = 138 arc seconds
  • V offset = 69 arc seconds
  • W dither = 3.75 arc seconds
  • V dither = 3.75 arc seconds
  • scan mirror positions for one pointing, as shown in Figure 1:
    1. position 1 = 0
    2. position 2 = -69
    3. position 3 = 23
    4. position 4 = -46
    5. position 5 = 46
    6. position 6 = -23
    7. position 7 = 69
    8. position 8 = position 1 = 0
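The eight positions above form a uniform 23-arcsecond grid spanning +/- 69 arc seconds, with the last position repeating the first; this reading of the parameters can be checked with a short sketch (illustrative only, not project code):

```python
# Scan mirror positions 1-8 from the list above (arc seconds)
MIRROR_POSITIONS = [0, -69, 23, -46, 46, -23, 69, 0]

# Position 8 repeats position 1 (a return to the starting position) ...
assert MIRROR_POSITIONS[0] == MIRROR_POSITIONS[-1] == 0

# ... and the distinct values form a uniform grid with 23-arcsec spacing,
# spanning the same +/-69 arcsec range as the V offset between sets.
grid = sorted(set(MIRROR_POSITIONS))
assert [b - a for a, b in zip(grid, grid[1:])] == [23] * 6
```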

    Observational Strategy

  • Step 1
    1. PCRS observation
  • Step 2
    1. Position the telescope so the source falls on the left side of the array
    2. Take 8 DCEs at the positions shown in Figure 1 (left)
    3. Dither in V and W
    4. Take 8 DCEs at the positions shown in Figure 1 (left)
  • Step 3
    1. Move the spacecraft by the W offset (138 arc seconds); the image should now fall on the center of the array.
    2. Take 8 DCEs at the positions shown in Figure 1 (middle)
    3. Dither in V and W
    4. Take 8 DCEs at the positions shown in Figure 1 (middle)
  • Step 4
    1. Move the spacecraft by the W offset (138 arc seconds); the image should now fall on the right side of the array.
    2. Take 8 DCEs at the positions shown in Figure 1 (right)
    3. Dither in V and W
    4. Take 8 DCEs at the positions shown in Figure 1 (right)
  • Step 5
    1. PCRS observation
  • Step 6: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5
  • Step 7: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5
  • Step 8: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5
  • Step 9: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5
  • Step 10: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5
  • Step 11: move the telescope according to the V offset (69 arc seconds) and repeat steps 1-5

    Steps 1-5 yield 48 observations (2 passes of 8 DCEs at each of 3 array positions). Steps 1-5 are executed 7 times in all, for a total of (48 * 7) = 336 observations.
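The bookkeeping above can be sketched in a few lines (illustrative only; names and code are not flight software, and the offsets are the parameters listed earlier):

```python
# Sketch of the 24 um coarse FPS observation sequence (illustrative only).
# Offsets and mirror positions are in arc seconds, from the parameters above.
MIRROR_POSITIONS = [0, -69, 23, -46, 46, -23, 69, 0]   # CSMM positions 1-8

def build_sequence(n_sets=7):
    """Return one (set, column, pass, mirror) tuple per DCE."""
    dces = []
    for s in range(n_sets):                         # each V-offset repeat of steps 1-5
        for column in ("left", "center", "right"):  # W-offset moves (steps 2-4)
            for dither_pass in (0, 1):              # two dithered passes of 8 DCEs
                for m in MIRROR_POSITIONS:
                    dces.append((s, column, dither_pass, m))
    return dces

seq = build_sequence()
assert len(seq) == 336        # 48 DCEs per set x 7 sets
```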

    Simulated data


    Figure 2: Simulated images of the 24 µm Focal Plane Survey, Coarse.


    Data Reformatting Requirements

    Array Data Desired:

    24 µm

    Data Reformatting Option:


    Task Dependencies

    The telescope must be focused. The spacecraft must be pointing and tracking optimally. The PCS, PCRS, and IRU must be calibrated. The 24 µm array and scan mirror must be fully operational. The 24 µm photometry AOT must be validated.

    Calibration Dependencies


    Output and Deliverable Products


    Data Analysis

    Task 130 is run in both Campaign F and Campaign G. The results from both runs are used to update Frame Table # 9.
  • Data Analysis Campaign F:
    1. JPL MIPL transfers downlink data to SSC.
    2. SSC downlink ops processes data and places results in sandbox.
    3. SSC MIPS IST uses FTZ to transfer the AOR data from the sandbox to SSCIST21. The data are run through the MIPS DATPACK routine, which repackages them for the Arizona data analysis tool (DAT). The data are then calibrated using the Arizona DAT package.
    4. MIPS DAT options.
      1. Run mips_sloper with the following options:
        • mips_sloper -j dirname filename
        • -j : directory below $MIPS_DIR/Cal in which to find the calibration files.
        • For the 24 micron array the dark is applied at the sloper level; use -b to skip the 24 micron dark subtraction.
      2. Run mips_caler with the following options:
        • mips_caler -F flatfield_filename -C pathname
        • -C : path to where the calibration files live. If -C is not used, the default is used.
        • -l filename : turn off the latent correction.
        • To apply the latent correction, omit -l and use -L filename (a text file with the Si latent-correction coefficients).
        • The calibrated data file mipsfps_YYY095.fits (YYY a 3-digit integer string, identifying the run number) is a FITS multi-extension image file, one extension per DCE.
      3. Run Xsloper_view and check that the data look reasonable.
    5. The calibrated data file is placed in sscist21: /mipsdata/fps/inpdat. File is linked to processing subdirectory (currently /home/sscmip/users/waw/fps).
    6. The MIPS IDL Centroid File Generation tool, mipspos.pro, is run according to the detailed instructions in sscist21: /home/sscmip/users/waw/fps/fps.doc. This program uses STINYTIM to generate a set of PSF files for the centroiding program, as a function of source (X,Y) position on the array and CSMM position.
    7. The centroided output files CAYYY095.m (and CSYYY095.m, which is just passed along) are placed on the TFS at SSC and transferred to the DOM at JPL.
    8. IPF team retrieves previously approved spreadsheet and SSC's CA/CS files from DOM.
    9. The IPF filter is run using the input files (CB, A, AS, O, CA and CS files) and the previously approved spreadsheet.
    10. The output files (IF files) are placed on the DOM for MIPS team to analyze.
    11. End of Campaign F.
  • Data Analysis for Campaign G:
    1. The entire process (described in Campaign F) is repeated for Campaign G.
  • Analysis of output from Campaign F and Campaign G
    1. Results from the IPF filter for Campaign F and Campaign G are analyzed by the MIPS team and compared to one another for consistency.
    2. Once consistency is confirmed, the IPF team runs the IPF multi-run tool and produces an output file, MF, which is placed on the DOM.
    3. The MIPS IT/IST team retrieves the multi-run results and reviews them.
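Step 6 of the Campaign F analysis generates STINYTIM PSF files for the centroiding program. As a minimal illustration of centroiding itself (a simple moment estimate, not the PSF-fitting actually used by mipspos.pro), a flux-weighted centroid can be computed as:

```python
def flux_weighted_centroid(image):
    """Return the (x, y) flux-weighted centroid of a 2-D list of pixel values.

    Illustrative only: the flight analysis fits STINYTIM PSF models as a
    function of array position and CSMM angle; this is the simplest moment
    estimate of a source position on an array.
    """
    total = sum(sum(row) for row in image)
    if total == 0:
        raise ValueError("image has zero total flux")
    x = sum(v * j for row in image for j, v in enumerate(row)) / total
    y = sum(v * i for i, row in enumerate(image) for v in row) / total
    return x, y

# A symmetric stamp centroids onto its central pixel (indices start at 0)
assert flux_weighted_centroid([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) == (1.0, 1.0)
```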

    Software Requirements


    Actions Following Analysis


    Failure Modes and Responses

    1. Failure of one campaign (F or G) to produce useful data
      • If one of the runs looks bad, go back and re-analyze the data. We may need to plot the data to look for bad points, or edit the CA file and resend it to the IPF team.
      • We might need to ask the IPF team to look more closely at their analysis.
      • If one set is bad and cannot be fixed, use the data from the other campaign, provided those data appear reasonable and the change is not large.
    2. If the results of F and G are inconsistent but neither is obviously bad, average the results.
    3. If both results are bad, reschedule the observations.
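The decision logic above can be summarized in a short sketch (the function and the idea of a single per-campaign offset estimate are illustrative assumptions, not project code):

```python
def merge_campaigns(f_result=None, g_result=None):
    """Combine frame-table solutions from Campaigns F and G.

    None marks a campaign whose data were bad and could not be fixed;
    the values stand in for per-campaign offset estimates (illustrative).
    """
    if f_result is not None and g_result is not None:
        # Consistent results, or inconsistent with neither obviously bad:
        # average them, per the failure-mode plan above.
        return (f_result + g_result) / 2.0
    if f_result is not None:
        return f_result           # use the single good campaign
    if g_result is not None:
        return g_result
    return None                   # both bad: reschedule the observations
```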

    Additional Notes