During the past 40 years, imaging the Earth from space has progressed from simple photographs to digital multi-spectral and radar data. The Apollo astronauts first used color film, which captured the form and composition of surface features: ancient rocks rich in iron and other dark minerals appeared brown, sediments such as limestone looked bright, and sands showed up in yellow tones.
The second generation of images was relayed by digital sensors, beginning with NASA’s Landsat program in 1972. From the spacecraft altitude of 920 kilometers, an instrument sensed rows of tiny spots, measured the intensity of reflected light from each spot on a scale from 0 to 250, and beamed the numbers to ground receiving stations, where the data were archived, processed, and distributed for study and analysis.
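The idea of an image as a grid of transmitted numbers can be sketched in a few lines of Python. The grid and its values below are invented purely for illustration; they are not actual Landsat data or its transmission format:

```python
# A digital image is simply a grid of numbers: each cell holds the
# measured intensity of reflected light for one ground spot (pixel).
# Values here are invented for illustration, not real Landsat data.
scene = [
    [12, 200, 198, 15],
    [14, 210, 205, 16],
    [13,  18,  20, 17],
]

# Brighter surfaces (e.g. dry sediments) return higher numbers;
# darker surfaces (e.g. iron-rich rock) return lower ones.
for row in scene:
    print(" ".join(f"{v:3d}" for v in row))
```

Once the scene exists as numbers rather than film, it can be stored, copied, and reprocessed without loss, which is what made the archiving and distribution described above possible.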
Detail in a space image depends on two factors: the altitude of the spacecraft (the lower the orbit, the higher the resolution) and the focal length of the camera lens (the longer the focal length, the greater the detail). Furthermore, digital imaging from space allows the use of filters to separate the reflected light into various wavelengths, including infrared and thermal bands that measure differences in the temperatures of exposed surfaces and help identify economic mineral resources. Another advantage of digital imaging is the possibility of repeat coverage of the same area from the same height by the same sensor. By overlaying two such datasets using computer software, accurate “change detection” maps are produced for evaluating environmental change due to natural as well as manmade processes.
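The overlay step behind change detection can be sketched as a simple per-pixel comparison of two co-registered grids. The grids, values, and threshold below are illustrative assumptions, not real satellite data or any specific software package:

```python
# Minimal change-detection sketch: subtract two intensity grids of the
# same area taken at different times, then flag pixels whose difference
# exceeds a threshold. All values are invented for illustration.
before = [
    [100, 102, 101],
    [ 99, 180, 182],
    [101, 181, 179],
]
after = [
    [100, 101, 102],
    [ 98,  40,  42],  # a bright patch has darkened between the two dates
    [100,  41,  39],
]

THRESHOLD = 30  # intensity difference large enough to count as real change

change_map = [
    [abs(a - b) > THRESHOLD for a, b in zip(row_a, row_b)]
    for row_a, row_b in zip(after, before)
]

# Print the change map: '#' marks changed pixels, '.' marks stable ones.
for row in change_map:
    print("".join("#" if changed else "." for changed in row))
```

The threshold matters in practice: too low, and sensor noise or lighting differences between the two dates register as false change; too high, and subtle real changes are missed.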
The third generation of satellite images was provided by radar remote sensing, for example by the Shuttle Imaging Radar (SIR) and Radarsat. Unlike passive sensors, which record reflected sunlight, a radar sensor emits waves toward the Earth and records the returned beam, or echo. One unique characteristic of radar is its ability to penetrate dry, fine-grained sand to reveal hidden topography. This makes it possible to unveil the courses of former rivers beneath desert sand and hence to locate groundwater resources in deserts.
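The active emit-and-listen principle can be illustrated with the standard radar ranging relation, range = (speed of light × round-trip time) / 2. The timing value below is an assumption chosen only to roughly match the 920-kilometer orbit mentioned earlier, not a figure from SIR or Radarsat:

```python
# Radar ranging sketch: the sensor emits a pulse and times its echo.
# Because the pulse travels to the surface and back, the one-way
# distance is half the round-trip distance.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def echo_range_m(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after about 6.13 milliseconds corresponds to a
# target roughly 920 km away (an illustrative, not measured, delay).
print(round(echo_range_m(0.00613) / 1000), "km")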
The Boston University Center for Remote Sensing was established in 1986 to apply advanced imaging techniques in the fields of archaeology, geography and geology. Archaeological applications included using space-age instruments to unveil the contents of, and measure environmental parameters within, a sealed boat chamber at the base of the Great Pyramid of Giza. Similarly, infrared and ultraviolet imaging revealed the cause of deterioration (salt crystallization) of the wall paintings in the tomb of Nefertari near Luxor, Egypt, prior to their restoration.
At the Center, remotely sensed data applied to geography and environmental studies played a key role in classifying tree species in California forests; detecting changes due to agricultural practices in semi-arid regions worldwide; and identifying global changes in land-cover and land-use patterns. Change detection techniques were also used to map the detrimental effects of burning oil wells and of heavy military vehicle traffic on the desert surface of Kuwait during the 1991 Gulf War.
Geological applications emphasized the study of desert landforms and their comparison with the surface features of Mars. Techniques developed to count sand dunes in the desert were applied to resolve the controversy over the number of participants in the “Million Man March” of 1995. Furthermore, both multi-spectral images and Shuttle Imaging Radar (SIR) and Radarsat data were used at the Center to identify locations of new groundwater resources in the deserts of Egypt, Oman and the United Arab Emirates. Based on its innovative research approach in such varied applications, the Center was selected by NASA in 1997 as a “Center of Excellence in Remote Sensing.”