Detector Performance

The gain of the detector was monitored with the calibration lamp over the 18-month interval between spectrograph integration and launch. The gain declined steadily over this period, so the detector voltage was increased at the start of the mission to restore the detector to its proper operating point. Frequent gain monitoring during the mission revealed a further small decline over the first 160 hours. The detector high voltage was then raised slightly to return the pulse-height distribution to its nominal position, and the detector gain remained stable thereafter. This voltage increase also raised the quantum detection efficiency (QDE) by roughly 3%.
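The monitoring-and-adjustment loop described above can be sketched in a few lines: track the centroid of the detector pulse-height distribution (PHD) over mission time and flag when it sags below a tolerance band around the nominal operating point, signalling that the high voltage should be raised. All numerical values in this sketch (nominal centroid, tolerance, sample timeline) are illustrative assumptions, not flight values.

```python
# Illustrative sketch of the gain-monitoring logic: flag when the PHD
# centroid drops below a tolerance band so the high voltage can be raised.
# The numbers below are hypothetical, not actual flight values.

NOMINAL_PHD_CENTROID = 100.0   # arbitrary pulse-height units (assumed)
TOLERANCE = 0.05               # flag a 5% drop below nominal (assumed)

def needs_hv_increase(phd_centroid, nominal=NOMINAL_PHD_CENTROID, tol=TOLERANCE):
    """Return True when the PHD centroid has sagged below the tolerance band."""
    return phd_centroid < nominal * (1.0 - tol)

# Hypothetical mission timeline of (elapsed hours, PHD centroid) samples,
# mimicking the small steady decline over the first ~160 hours in the text.
samples = [(0, 100.0), (40, 98.5), (80, 97.0), (120, 95.5), (160, 94.0)]

# First time at which an HV increase would be triggered.
first_flag = next((t for t, c in samples if needs_hv_increase(c)), None)
print(first_flag)
```

With this hypothetical timeline, the centroid first crosses the 5% band at the 160-hour sample, mirroring the point at which the voltage was actually raised.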

Localized regions of the detector exhibited more significant gain decreases over the course of the mission, as expected, due to ``scrubbing'' by bright emission lines. The intense geocoronal Lyman and O I emission lines resulted in almost complete loss of QDE at the line centers. Much weaker localized depressions in detector QDE were also found, due to O I airglow emission and the bright Wolf-Rayet star emission lines of N V, C IV, and He II. These features are time- and door-state-dependent; by the end of the mission they had equivalent widths of 0.2--0.3 Å. Corrections for these features are not included in the present calibration, but will be in future versions. Three similar gain-loss features are included in the present calibration, since they appear to have been induced by calibration lamps used in pre-flight instrument processing and do not vary during the mission. These features are centered at 1048 Å, 1066 Å, and 1664 Å, with equivalent widths of 0.28, 0.20, and 1.34 Å, respectively.
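One way such gain-loss features can be folded into a calibration is to model each as a Gaussian depression in relative QDE whose integrated deficit equals the quoted equivalent width. The sketch below uses the three lamp-induced feature centers and equivalent widths from the text; the Gaussian profile shape and the assumed 3 Å FWHM are illustrative assumptions, not the actual calibration model.

```python
import math

# Lamp-induced gain-sag features from the text: (center [Angstrom], EW [Angstrom]).
FEATURES = [
    (1048.0, 0.28),
    (1066.0, 0.20),
    (1664.0, 1.34),
]
ASSUMED_FWHM = 3.0  # Angstrom; hypothetical feature width, not from the text

def qde_factor(wavelength, features=FEATURES, fwhm=ASSUMED_FWHM):
    """Relative QDE (1.0 = unaffected) with Gaussian gain-sag dips applied.

    Each dip's depth d is chosen so that its integral, d * sigma * sqrt(2*pi),
    equals the feature's equivalent width.
    """
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    factor = 1.0
    for center, ew in features:
        depth = ew / (sigma * math.sqrt(2.0 * math.pi))
        factor -= depth * math.exp(-0.5 * ((wavelength - center) / sigma) ** 2)
    return max(factor, 0.0)

# Far from any feature the correction is negligible; at a feature center the
# QDE is depressed in proportion to that feature's equivalent width.
print(round(qde_factor(1400.0), 3))            # unaffected region -> ~1.0
print(qde_factor(1664.0) < qde_factor(1066.0))  # larger EW gives a deeper dip
```

Dividing an observed spectrum by this factor would undo the gain-sag depressions under the stated Gaussian assumption; the actual flight calibration may parameterize the features differently.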