IOC-MIPS_225a
10510C, 10550C

Title: Instrument Stability

Objective:
To measure how long after the turn-on sequence it takes the instrument to become stable enough to start science observations. Also, to measure how stable the instrument remains through several on/off cycles.

Description:
This test will require taking data at a regular frequency soon after the instrument is turned on, until the output is stable. Similar data should be taken periodically throughout the instrument campaign to ensure that the instrument response remains stable.

This template should simply be a series of standard tests we run through every time we turn the instrument on and periodically throughout a campaign, similar to what we use in laboratory testing. A sequence of 50 dark DCEs (Data Collection Events) and 50 stim DCEs is probably appropriate. This will also help us track how the dark current changes throughout a campaign. This means we won't need the scan mirror or any of the observing AOTs working yet - we just want to turn a stim on and look at it for a while.

Place in Schedule:
This test should be done as soon as the telescope is cold enough to measure standard stars at 70 and 160 microns.

Required Conditions:
We should be pointed at a dark patch of sky as for the standard dark measurement, but PCS accuracy and stability are not required (a few arcminutes is ok).

Resources:
duration (best estimate): Using 10-second DCEs, 20 minutes each time it's used (e.g., 10 times = 3.3 hours)
real time downlink (Y/N): N
special post event actions (Y/N): Y - data must be analyzed
number of DCEs: 300 each time it's used (e.g., 10 times = 3000)
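The per-campaign totals quoted above multiply out as follows (a minimal sanity-check sketch; the variable names are ours, and the per-run figures are taken directly from the table):

```python
# Sanity check of the per-campaign resource estimates quoted above.
minutes_per_run = 20   # best-estimate duration each time the test is used
dces_per_run = 300     # number of DCEs each time the test is used
runs = 10              # example number of uses over a campaign

total_hours = minutes_per_run * runs / 60   # 200 min, i.e. about 3.3 hours
total_dces = dces_per_run * runs            # 3000 DCEs

print(f"{total_hours:.1f} hours, {total_dces} DCEs")
```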
Outcome:
description: set of SUR exposures
can proceed in parallel with other activity: Y
must outcome be confirmed before next event/test/activity: N
method of confirmation (sensor TLM, data analysis, etc.): simple analysis (average slope and dispersion)
estimate of data turn around if required for confirmation:
Unique or included in planned uplink or downlink/analysis tool: should execute dark and stim IETs
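The "average slope and dispersion" confirmation could be sketched roughly as follows (a hypothetical illustration, not the actual analysis tool: the function name, array layout, and read interval are all our assumptions, with each DCE taken as a stack of sample-up-the-ramp reads):

```python
import numpy as np

def average_slope_and_dispersion(ramp, read_interval_s=0.5):
    """Fit a least-squares slope to each pixel's sample-up-the-ramp
    reads and return the mean slope and its dispersion over the array.

    ramp            : array of shape (n_reads, ny, nx), counts per read
    read_interval_s : assumed time between non-destructive reads, seconds
    """
    n_reads = ramp.shape[0]
    t = np.arange(n_reads) * read_interval_s       # time of each read
    # Per-pixel least-squares slope: cov(t, y) / var(t)
    t_centered = t - t.mean()
    y_centered = ramp - ramp.mean(axis=0)
    slopes = np.tensordot(t_centered, y_centered, axes=(0, 0)) / (t_centered ** 2).sum()
    return slopes.mean(), slopes.std()

# Example on synthetic data: a uniform 10 counts/s ramp plus read noise.
rng = np.random.default_rng(0)
times = np.arange(8)[:, None, None] * 0.5          # 8 reads, 0.5 s apart
ramp = 10.0 * times + rng.normal(0.0, 0.1, (8, 32, 32))
mean_slope, dispersion = average_slope_and_dispersion(ramp)
# mean_slope should come out near 10 counts/s, with a small dispersion
```

Tracking these two numbers per DCE, from turn-on onward, is enough to see when the output settles.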
Contingency Plan:
"What if..."

IOC Critical: Y

References:

Template last updated: 11-Sep-95
C.W. Engelbracht