The analysis was done on the extracted net spectra, before application
of the absolute calibration (i.e., in flux numbers), and the data sets
for each standard star were treated separately. The spectra were
corrected for camera head amplifier temperature (THDA)-induced
sensitivity variations (Garhart 1991), and sections of the spectra
affected by camera reseaux were interpolated across using adjacent good
data points. Several absorption features (e.g., Si IV, C IV, and
geocoronal Lyα) were also interpolated across using the same technique.
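As a rough illustration of this interpolation step (a sketch only, assuming the affected regions are supplied as wavelength intervals; the function name and data layout are illustrative and not part of the actual processing software):

    import numpy as np

    def patch_flagged_regions(wave, flux, bad_intervals):
        # Replace flux values inside flagged wavelength intervals (reseaux
        # marks, or absorption features such as Si IV, C IV, and geocoronal
        # Lyman-alpha) by linear interpolation across adjacent good points.
        patched = flux.copy()
        bad = np.zeros(wave.size, dtype=bool)
        for lo, hi in bad_intervals:
            bad |= (wave >= lo) & (wave <= hi)
        patched[bad] = np.interp(wave[bad], wave[~bad], flux[~bad])
        return patched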
Each spectrum was then normalized by dividing by an average of several
spectra taken in a six-month time period centered on 1985. The
normalized data were then binned in 5 Å intervals, and a set of
degradation ratios was produced by performing a final binning of the
data at six-month intervals.
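A minimal sketch of the normalization and binning just described, assuming the spectra have been resampled onto a common wavelength grid; the array layout and names are assumptions for illustration only:

    import numpy as np

    def degradation_ratios(wave, fluxes, obs_epochs, ref_spectrum, t0):
        # fluxes: one extracted net spectrum (flux numbers) per row.
        # ref_spectrum: average of several spectra from a six-month
        # period centered on 1985, on the same wavelength grid.
        normalized = fluxes / ref_spectrum          # normalize to 1985 epoch
        wave_bin = np.digitize(wave, np.arange(wave.min(), wave.max(), 5.0))
        time_bin = ((np.asarray(obs_epochs) - t0) // 0.5).astype(int)
        ratios = {}
        for t in np.unique(time_bin):
            subset = normalized[time_bin == t]
            # Mean ratio in each 5 A wavelength bin for this six-month interval.
            ratios[t] = np.array([subset[:, wave_bin == w].mean()
                                  for w in np.unique(wave_bin)])
        return ratios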
The ratios derived from each standard star were compared and found to be
in good agreement, so the last step of the process was repeated using
all the data and a combined set of degradation ratios was derived. The
same analysis was performed on
low-dispersion trailed data and a separate set of degradation ratios was
produced. Subsequent testing of the trailed corrections showed that only
the SWP solutions provided any improvement over use of point source
corrections when applied to trailed data. Therefore, trailed data from
the LWP and LWR cameras are corrected using the point source degradation
corrections.
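A hedged sketch of the resulting rule for choosing which correction set to apply to a given image (the function and labels are hypothetical, not taken from the processing system):

    def select_degradation_correction(camera, trailed):
        # Only the SWP trailed solutions improved on the point source
        # corrections, so trailed LWP and LWR data use the point source set.
        if trailed and camera == "SWP":
            return "trailed"
        return "point_source"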