The WSI is designed to run either autonomously or networked to a site data system. It runs 24 hours a day, acquiring images at user-specified intervals. For the ARM program, this interval is normally a full data set every 6 minutes, with partial data sets every 2 minutes. For other applications, acquisition is often set to every 10 minutes in stand-by mode and every 1 minute (2 minutes at night) in test mode. Some sites acquire data every 40 seconds at night in test mode.
The system performs its own housekeeping, which includes positioning the solar/lunar occultor, acquiring the image set, saving data to tape drive for archival and to hard disk for network access, processing the data to yield cloud cover, and performing self-checks. To complete these steps, the system determines the solar or lunar position (given the correct WSI location). The WSI also includes a flux control algorithm that checks the sun location, moon location, moon phase, and earth-moon distance, and from these parameters determines the neutral density filter and exposure to use.
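A flux control decision of this kind can be sketched as a simple lookup on the sky-brightness regime. The function name, filter identifiers, thresholds, and exposure values below are all illustrative assumptions, not the WSI's actual settings:

```python
# Hypothetical sketch of a flux-control decision: choose a neutral-density
# filter and an exposure time from solar/lunar geometry. Filter IDs,
# thresholds, and exposures are illustrative, not the WSI's actual values.

def select_flux_control(sun_elev_deg, moon_elev_deg, moon_phase_frac):
    """Return (nd_filter, exposure_s) for the current sky-brightness regime."""
    if sun_elev_deg > 0:                     # daytime: strong attenuation
        return "ND3", 0.01
    if sun_elev_deg > -12:                   # twilight: moderate attenuation
        return "ND1", 0.1
    if moon_elev_deg > 0 and moon_phase_frac > 0.5:
        return "ND1", 1.0                    # bright moonlit night
    return "open", 60.0                      # dark night: long exposure
```

In practice the moon phase and earth-moon distance would feed a continuous lunar-flux estimate rather than a single threshold; the branch structure above only illustrates the kind of decision being made.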
When data are acquired, the first step is to display an image for the user. The raw data are 16-bit, i.e., have a grey-scale range of 0 - 65,535. To view the data, it is necessary to determine the portion of the range occupied by the data (such as grey levels 1000 - 8000) and map this range into the 8-bit range (256 grey levels) used to display an image. The WSI includes an automated windowing algorithm that determines a reasonable range to display, derives the 8-bit image, and displays it. The 16-bit raw data are always retained in unmodified form for further use.
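The windowing step can be sketched with a percentile-based range selection; this is one plausible scheme, and the WSI's actual algorithm may choose the display range differently:

```python
import numpy as np

def window_to_8bit(raw16, lo_pct=1.0, hi_pct=99.0):
    """Map the occupied portion of a 16-bit image into 0-255 for display.

    Percentile-based windowing is an assumed scheme for illustration;
    the WSI's automated algorithm may select the range differently.
    """
    lo, hi = np.percentile(raw16, [lo_pct, hi_pct])
    if hi <= lo:                       # flat image: avoid divide-by-zero
        return np.zeros_like(raw16, dtype=np.uint8)
    scaled = (raw16.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)
```

Note that only the display copy is rescaled; as stated above, the 16-bit raw data are retained unmodified.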
On some systems, the cloud decision algorithm may be applied on the control computer as soon as the data are archived. Other systems, which require both speed in data acquisition and near-real-time processing, run the processing on a networked processing computer. Still other systems are designed to send the raw data to a sponsor's archival computer, where the data are processed later in archival mode.
The ARM Program uses its own version of the cloud algorithms, developed in part from the MPL algorithms. The daytime version of the MPL cloud algorithm used in real-time mode first applies calibration factors to the 16-bit data, then ratios the red and blue images to produce a ratio image. The ratios are then thresholded to identify the opaque clouds; thus opaque clouds are identified by their spectral character.
In addition, a library of clear-sky ratio images for a full range of solar zenith angles must be stored in advance. This library is site dependent, since it depends on factors such as site altitude (thus it can only be determined after the imager has been at the site for a reasonable period). The library is used to determine the background clear-sky ratio for a given solar zenith angle, which is then adjusted to compensate for variations in the aerosol load. The ratio image being analyzed is then compared with this background ratio on a pixel-by-pixel basis. The algorithm identifies a pixel as thin cloud if the test image ratio exceeds the background ratio by 20%. Thus a pixel is identified as thin if its spectral ratio is not as high as that of an opaque cloud, but is significantly higher than that of the clear sky at the same look angle and solar/lunar angle.
If the aerosol load is so high (i.e., the sky is so hazy) that the clear-sky ratio exceeds the opaque ratio in any pixels, then these pixels may be labeled indeterminate. For example, on a very hazy day, the aureole may be labeled indeterminate. Once the full image has been processed, the cloud cover may be determined by counting the pixels in the image that are labeled opaque or thin cloud. (Normally no correction for the solid angle per pixel is required, as this introduces very small changes in the total result.) The algorithm also labels pixels that are offscale bright or dark, as well as pixels that are blocked by the occultor.
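The daytime decision described above can be sketched as follows. The 20% thin-cloud criterion is the one stated in the text; the opaque threshold value and the label codes are assumptions for illustration:

```python
import numpy as np

# Illustrative sketch of the daytime cloud decision. The 20% thin-cloud
# criterion comes from the text; the opaque threshold (1.3) and the label
# codes are assumed values for illustration only.
CLEAR, THIN, OPAQUE, INDET = 0, 1, 2, 3

def classify_day(ratio, clear_ratio, opaque_thresh=1.3):
    """Label each pixel of a red/blue ratio image.

    ratio       -- measured red/blue ratio image
    clear_ratio -- background clear-sky ratio image for this solar zenith angle
    """
    labels = np.full(ratio.shape, CLEAR, dtype=np.uint8)
    labels[ratio > clear_ratio * 1.20] = THIN      # >20% above clear sky
    labels[ratio >= opaque_thresh] = OPAQUE        # spectrally opaque
    labels[clear_ratio >= opaque_thresh] = INDET   # hazy sky: ambiguous pixel
    # Cloud fraction: share of pixels labeled thin or opaque (no solid-angle
    # weighting, per the text).
    cloud_fraction = np.mean((labels == THIN) | (labels == OPAQUE))
    return labels, cloud_fraction
```

A full implementation would also mask offscale and occultor pixels before computing the fraction; those masks are omitted here for brevity.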
We have recently updated this daytime algorithm for better handling of sunrise and sunset. We also plan to develop it further for better distinction between thin cloud and aerosol.
At night, the real-time cloud algorithm is currently based on the detection of stars, using approximately xx stars in each image. The sky is divided into 237 regions. Within each region, the stars are evaluated and used to classify the average cell condition as opaque cloud, thin cloud, or generally clear. We are currently extending this algorithm to identify the earth-to-space beam transmittance. As funding becomes available, we also plan to extend the algorithm, using the measured radiance distributions on clear and cloudy nights, to provide results at pixel-by-pixel resolution as is done in the day algorithm.
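The per-region star evaluation can be sketched as a detected-versus-expected comparison. The fraction thresholds below are purely illustrative assumptions, not the WSI's actual criteria:

```python
# Hypothetical sketch of a star-based region classification: judge each sky
# region by the fraction of cataloged stars actually detected in it.
# The 0.8 and 0.3 thresholds are illustrative, not the WSI's actual values.

def classify_region(n_expected, n_detected):
    """Classify one sky region as 'clear', 'thin', or 'opaque'."""
    if n_expected == 0:
        return "indeterminate"              # no catalog stars in this region
    visible = n_detected / n_expected
    if visible >= 0.8:
        return "clear"
    if visible >= 0.3:
        return "thin"                       # stars dimmed or partly blocked
    return "opaque"
```

A transmittance-based version, as mentioned above, would compare measured star brightness against catalog brightness rather than simply counting detections.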
The cloud decision algorithms used for post-archival processing by the ARM program are similar. The day algorithm was based on the original day algorithm developed at MPL, modified to include the NIR data taken at 800 nm in the assessment. The night algorithm was based on the ideas developed at MPL (which later evolved into the real-time night algorithm); however, it uses somewhat different procedures for interpolating between stars.
From the cloud decision images, in which each pixel is identified as opaque cloud or no cloud, the cloud cover over the full sky is computed, as well as the cloud cover in 9 sectors: a 10° circle at the zenith, four quadrants running from 0° to 45° zenith angle, and four quadrants running from 45° to 90° zenith angle. In addition, the ARM Program computes other cloud products such as cloud path length, area, and perimeter at the ARM Experiment Center.
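The nine-sector geometry can be sketched as a mapping from a pixel's look angles to a sector index. The indexing scheme and the azimuth quadrant boundaries here are assumptions for illustration:

```python
# Illustrative mapping from a pixel's (zenith, azimuth) look angles to the
# nine sky sectors described above: a 10-degree circle at the zenith, four
# inner quadrants (0-45 deg zenith), and four outer quadrants (45-90 deg).
# The index order and quadrant boundaries are assumed for illustration.

def sector_of(zenith_deg, azimuth_deg):
    """Return a sector index 0-8 (0 = zenith circle)."""
    if zenith_deg <= 10.0:
        return 0
    quadrant = int(azimuth_deg % 360.0 // 90.0)   # 0..3 by azimuth
    ring = 0 if zenith_deg <= 45.0 else 1         # inner or outer ring
    return 1 + ring * 4 + quadrant
```

Per-sector cloud cover then follows by accumulating the cloud-decision labels of the pixels assigned to each sector.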
These cloud product results may be used to evaluate the nature of the cloud heterogeneities, their impact on radiative transfer models, and the impact of the cloud cover on the surface fluxes. The zenith measurements may be used to evaluate zenith-looking instruments.
An additional product is the sky radiance distribution. When each image is calibrated for absolute radiance, the image consists of over 200,000 measurements of sky radiance acquired simultaneously at 1/3° spatial resolution. This product can be used in several ways: evaluation of the clear-sky radiance distributions, potentially including extraction of aerosol characteristics; evaluation of the cloud radiances and their relation to heterogeneities; evaluation of the variations in the diffuse irradiance (which can be computed from the radiances); evaluation of the solar disk for potential determination of optical depth and direct irradiance; and evaluation of the fine-scale temporal and spatial variations in these quantities.
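Computing diffuse horizontal irradiance from a radiance distribution is a hemispherical integral, E = ∫ L(θ, φ) cos θ sin θ dθ dφ. A minimal numerical sketch, assuming a hypothetical callable `radiance(zen, az)` returning W m⁻² sr⁻¹:

```python
import math

# Sketch of diffuse horizontal irradiance from a sky radiance distribution:
#   E = ∫∫ L(zen, az) cos(zen) sin(zen) d(zen) d(az)  over the hemisphere.
# radiance(zen, az) is a hypothetical callable (angles in radians) returning
# radiance in W m^-2 sr^-1; the WSI pixels would supply L on a discrete grid.

def diffuse_irradiance(radiance, n_zen=90, n_az=360):
    total = 0.0
    d_zen = (math.pi / 2) / n_zen
    d_az = (2 * math.pi) / n_az
    for i in range(n_zen):
        zen = (i + 0.5) * d_zen                 # midpoint rule in zenith
        weight = math.cos(zen) * math.sin(zen) * d_zen * d_az
        for j in range(n_az):
            az = (j + 0.5) * d_az
            total += radiance(zen, az) * weight
    return total
```

A quick sanity check: for an isotropic sky of unit radiance, the integral converges to π.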
Finally, a third product which is often useful is a visual presentation of the images. Because the data can be presented as visual images (like pictures), they present tremendous information to the trained observer. One can qualitatively evaluate cloud type and opacity, and the general character of the cloud field. Cloud time-lapse loops are often particularly interesting and beautiful, and useful for qualitative evaluations of cloud motion and cloud cover persistence. Cloud loops are an easy tool for evaluating features such as orographic clouds that may affect the net flux and yet not be detected by zenith-looking sensors. Potential future developments include use of the instruments for sub-visual cirrus detection, 3-D retrievals, and cloud typing.
Produced by the Marine Physical Laboratory, SIO. Send questions, comments, and suggestions about the Atmospheric Optics Group website to: webmaster@mpl.ucsd.edu. Copyright © 2002. Official web page of the University of California, San Diego.