Even today, the volume of data collected by remote sensing instruments challenges the processing and interpretation capabilities of the Earth science community. By the mid-1990s, an additional terabit (10^12 bits) per day is expected from the National Aeronautics and Space Administration (NASA) Earth Observing System (EOS) spacecraft alone. The Washington, D.C., phone book (white pages) contains only about 1/10,000 of this amount of information (10^8 bits). To put this another way, if you could read and absorb a quantity of data comparable to two 200-page books per week, it would take over 5000 years to ingest a single day's data (Moses lived about 3300 years ago!).
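The reading-rate comparison can be checked with a short back-of-envelope calculation. The characters-per-page and bits-per-character figures below are assumptions chosen for illustration, since the article does not state them:

```python
# Back-of-envelope check of the reading-rate comparison above.
# Assumed (not stated in the article): ~1,200 characters per page,
# 8 bits per character, 52 weeks of reading per year.
BITS_PER_DAY = 1e12      # one terabit of EOS data per day
CHARS_PER_PAGE = 1200    # assumed typical book page
PAGES_PER_BOOK = 200
BOOKS_PER_WEEK = 2

bits_per_week = CHARS_PER_PAGE * 8 * PAGES_PER_BOOK * BOOKS_PER_WEEK
years_to_read_one_day = BITS_PER_DAY / (bits_per_week * 52)
print(round(years_to_read_one_day))  # on the order of 5,000 years
```

Under these assumptions a reader absorbs roughly 2 x 10^8 bits per year, so a single day's terabit of data takes about five millennia, consistent with the figure quoted above.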
However, there is a growing appreciation for the interconnected nature of the processes shaping the terrestrial environment [e.g., Earth System Science Committee, 1988], and this appreciation is driving the need to collect and study such large data sets. A range of time and space scales must be sampled if critical phenomena affecting the Earth's surface are to be captured by the observations, and data from multiple sources, measuring different aspects of those phenomena, must be intercompared. As a result, many of the important new insights we hope to gain from future Earth-observing spacecraft can be achieved only if data-handling tools are adequate to cope with the volume of new information.