## 1. Introduction

[2] With the advent of slant total electron content (TEC) measurements from dual-frequency Global Positioning System (GPS) receivers, the volume of ionospheric data available for scientific analysis has greatly increased. When these measurements are combined with more traditional ionospheric observations (e.g., ionospheric soundings, in situ satellite measurements), several thousand, even tens of thousands, of observations of the ionospheric electron density distribution can be collected within a given 15 min period. These data represent both point measurements of the density and integrated measurements of electron content. They are distributed around the globe with a concentration in North America, Europe, Japan, and Australia. While some of the measurements are coincident, many are complementary. For example, TEC measurements from a ground-based GPS receiver contain information on the ionosphere as well as the plasmasphere, and at times the TEC measurements can be dominated by the topside ionosphere and plasmasphere. Ionosondes measure the electron density below the *F* region peak. By combining these data sets in a consistent manner, the electron density profile can be specified to higher altitudes, providing a better topside specification. In a similar manner, a collection of geographically dispersed measurements can be combined into a single picture of the large-scale ionospheric behavior [e.g., *García-Fernández et al.*, 2003a, 2003b].
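
The slant TEC observable referred to above follows from the 1/*f*² scaling of the ionospheric group delay: differencing the two GPS pseudoranges isolates the integrated electron content along the line of sight. A minimal sketch (the 40.3 m³ s⁻² constant comes from the first-order refractive-index expansion; receiver and satellite interfrequency biases, which real processing must estimate, are ignored here):

```python
F1 = 1575.42e6  # GPS L1 carrier frequency (Hz)
F2 = 1227.60e6  # GPS L2 carrier frequency (Hz)

def slant_tec(p1_m, p2_m, f1=F1, f2=F2):
    """Slant TEC (in TECU) from L1/L2 pseudoranges in meters.

    The ionospheric group delay on frequency f is 40.3 * TEC / f**2,
    so the L2-L1 range difference is proportional to the slant TEC.
    """
    tec_el_per_m2 = (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) * (p2_m - p1_m)
    return tec_el_per_m2 / 1e16  # 1 TECU = 1e16 electrons/m^2
```

Carrier-phase differences give the same quantity with far less noise but an unknown constant offset, which is why phase-leveled TEC is the usual data product.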

[3] While this infusion of large amounts of data creates new and exciting opportunities for ionospheric physics, it also raises several issues: (1) What is the optimum way to combine these different data sets, each with its own sources of error, into a consistent synoptic or global specification? (2) How important are the various data sources to the overall global specification of electron density? (3) How accurate is the resulting electron density specification and what is its error?

[4] As the ionospheric community confronts these issues, it can rely on the techniques developed by the meteorological and oceanographic communities. Because the meteorological community developed the initial techniques, it also developed the terminology. This paper will use the meteorological terminology and attempt to explain these terms in the ionospheric context. A more complete description of the terminology and techniques used by meteorologists can be found in the works of *Daley* [1991], *Courtier et al.* [1997], *Lorenc* [1986], *Tarantola* [1987], and *Menke* [1989]. The term “analysis” will be used for a large-scale specification based upon a collection of different data. In meteorology, an analysis is also called a weather map or chart. By analogy, an ionospheric analysis is a space weather map. The term “objective analysis” is used to describe an analysis generated through an automated process. An analysis generated through the active participation of a human scientist is called a “subjective analysis.” A “spatial analysis” is an analysis that completely specifies the weather in space at a given time; it is a snapshot of the weather. The term “statistical minimization” is used to describe any numerical technique that minimizes a cost function built from statistical quantities. Meteorological objective analysis algorithms were first developed in the late 1940s [e.g., *Panofsky*, 1949] and are now mature. In the past 50 years the meteorological community has developed numerous mathematical techniques for performing objective analysis (see *Daley* [1991] and *Menke* [1989] for a survey of these techniques).

[5] An ionospheric objective analysis algorithm has several requirements that it must satisfy. It must be able to ingest measurements from different types of sensors and to use any measurement that can be derived from the electron density, the instrumentation, and the observation geometry. Because of the different types of data and their geometry, the algorithm must be three-dimensional. It also must weight the influence of the data sources. In addition, the objective analysis algorithm must have a mathematically rigorous way of determining the extent of data influence into regions where there is no data and must smoothly fall back to a predictive model far from data. Finally, it should be usable as an input to a data assimilative model.

[6] The development of an ionospheric objective analysis algorithm is an important step toward a full (driven by a full-physics model), data-assimilative (time-dependent) ionospheric forecast. The standard data assimilation cycle [*Daley*, 1991] is a four-step cycle that includes quality control, an objective analysis (to generate a complete spatial field from the available data), model initialization, and a theoretical forecast (to propagate the spatial field forward in time). The necessary components in this cycle are quality control algorithms, an objective analysis algorithm, and a forward predictive model. Model initialization is the essential process of data assimilation, while the other components can be viewed as input algorithms. A good data assimilation model should provide an initialization scheme that allows new and different quality control, objective analysis, and/or predictive models to replace existing algorithms seamlessly.
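
The four-step cycle above can be sketched as a simple loop. All of the component functions here are placeholders passed in by the caller, not IDA3D interfaces; the point is only the ordering and the interchangeability of the components:

```python
def assimilation_cycle(state, obs_stream, quality_control,
                       objective_analysis, initialize, forecast, steps):
    """Run `steps` passes of the standard four-step data assimilation
    cycle: quality control -> objective analysis -> initialization ->
    forecast.  Each component is an injected callable, so any one of
    them can be replaced without touching the others."""
    for _ in range(steps):
        obs = quality_control(next(obs_stream))    # 1: screen bad data
        analysis = objective_analysis(state, obs)  # 2: complete spatial field
        state = initialize(analysis)               # 3: prepare model state
        state = forecast(state)                    # 4: propagate forward in time
    return state
```

Structuring the cycle this way is what lets an objective analysis algorithm such as IDA3D be developed, tested, and swapped independently of the physics model.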

[7] Instead of ingesting data directly into the predictive model (i.e., replacing the model output with an observation only at the points of the observation), observations are projected by an objective analysis algorithm onto the proper scale and grid for the predictive model and combined with the existing model prediction. The weather map created by the analysis algorithm replaces the model output from the previous time step during the initialization phase, and the physics model moves forward in time. The data assimilation cycle exists for operational, numerical, and physical reasons. Operationally, nowcasts and forecasts will not be useful if the predictive model must wait for all of the available observations to be collected and quality checked. Numerically, the computational resources required both to assimilate each individual measurement and to calculate the future state of the system are immense. For both physical and numerical reasons, introducing individual data points into the predictive model can generate artificial discontinuities that propagate through the model system. These discontinuities can be avoided by collecting the data into a spatially continuous structure that is ingested into the predictive model at a single time. While this procedure can be performed within a predictive model, it is still a separate algorithm that can be treated independently of the physics model. Because of its role within the data assimilation cycle, an objective analysis algorithm must reproduce the background model predictions far from data and produce corrections in data-rich regions that are as free of spurious nonphysical modes as possible.
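
The combination of background prediction and observations described above is, in the 3DVAR framework introduced in section 2, the standard minimum-variance update. A minimal sketch in generic notation (dense toy matrices, not IDA3D's actual background or error covariances): the background state x_b is corrected by the gain-weighted innovation, and where observations carry no information the gain vanishes, so the analysis falls back to the background, as required above.

```python
import numpy as np

def analysis_update(x_b, B, y, R, H):
    """Minimum-variance analysis x_a = x_b + K (y - H x_b), with
    gain K = B H^T (H B H^T + R)^{-1}.

    x_b : background (model) state vector
    B   : background error covariance
    y   : observation vector, with error covariance R
    H   : linear observation operator mapping state to observations

    Where H has no sensitivity (or R is large), K -> 0 and the
    analysis smoothly reverts to the background prediction.
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)
```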

[8] Several different techniques have been developed to generate objective analyses of the ionospheric electron density. One of the earliest techniques is computerized ionospheric tomography (CIT). CIT is a remote sensing inversion technique that, in standard usage, develops a two-dimensional electron density specification from a series of one-dimensional ionospheric observations (typically line-of-sight column electron density) and various minimization criteria [*Austen et al.*, 1988; *Kersley et al.*, 1993; *Raymund et al.*, 1994; *Bust et al.*, 1994; *Kronschnabl et al.*, 1995; *Raymund*, 1995]. As such, CIT is a spatial analysis technique. Standard CIT techniques use a single array of receivers, normally aligned in latitude along a constant longitude, and a low-Earth-orbiting (LEO) “beacon” satellite as the transmitter source (typically in polar orbit). For LEO satellite orbits whose orbit longitude almost coincides with the array longitude, the collected data can approximately be considered to lie in a two-dimensional plane defined by the latitude extent of the array and the altitude of the satellite. For such an alignment the standard mathematical techniques for limited angle tomography apply, and CIT produces two-dimensional electron density maps in the region of the receiver array. This leads to three limitations for standard LEO-based tomographic methods. First, the reconstructed electron density is only two-dimensional; for satellite orbits offset by a large amount from the receiver array, the fidelity of the tomographic inversions degrades significantly. Second, the ionosphere is considered static over the time period of the data collection (typically ∼20 min). Third, the satellite passes do not occur continuously in time. For example, a midlatitude receiver array will collect data from ∼15 to 20 satellite passes per day, with the passes spaced irregularly in time over the day. Thus traditional CIT cannot suitably produce a global spatial analysis of the electron density at regular update times. If there were enough satellite sources and ground receivers distributed globally to make CIT practical, CIT would be a useful global objective analysis method. Currently, even with the addition of a global network of ground GPS stations, the data are too sparse for CIT techniques to be applied directly. Thus different numerical methods are needed to produce the desired specification.
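
At its core, the CIT inversion described above is a linear system: each TEC observation is a discretized line integral, so a geometry matrix of ray path lengths through the pixel grid relates the unknown densities to the data. A toy sketch (regularized least squares standing in for the various minimization criteria cited above; real limited-angle problems are far more ill-posed than this small example):

```python
import numpy as np

def cit_inversion(A, d, alpha=1e-2):
    """Toy tomographic inversion.

    A     : (n_rays, n_pixels) matrix of ray path lengths through
            the 2-D pixel grid
    d     : measured line-integral data (relative TEC per ray)
    alpha : Tikhonov regularization weight, needed because the
            limited viewing angles leave the system ill-conditioned

    Solves (A^T A + alpha I) n = A^T d for the pixel densities n.
    """
    n_pix = A.shape[1]
    lhs = A.T @ A + alpha * np.eye(n_pix)
    return np.linalg.solve(lhs, A.T @ d)
```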

[9] To overcome the limitations of CIT and to exploit other data sources (especially GPS-TEC measurements), other numerical techniques have been developed. Regional specifications of electron density using combined data sets of CIT, ground GPS, and ionosondes have been developed at ARL:UT to overcome these and other weaknesses [*Coker*, 1997; *Kronschnabl et al.*, 1997]. When compared with independent sources, the results from these regional specification algorithms have been promising. However, the methodologies employed are somewhat ad hoc and do not allow for a mathematically rigorous method of adding arbitrary data sources. Similar work by other groups has led to the development of more advanced spatial analysis techniques. *Fremouw et al.* [1992] developed a direct inverse theory [*Menke*, 1989] approach that made use of empirical orthogonal functions (EOFs). *Howe et al.* [1998] built upon the work by Fremouw and developed a Kalman filter method for the ionosphere based upon spherical harmonics horizontally and EOFs vertically. *Hernández-Pajares et al.* [1999, 2002] and *García-Fernández et al.* [2003a, 2003b] have developed a voxel-based Kalman filter of the vertical electron content using GPS-TEC measurements from ground receivers, occultation satellite receivers, and ionosondes. Recently, *Schunk et al.* [2004] have described a Global Assimilation of Ionospheric Measurements (GAIM) model. They have a Gauss-Markov Kalman filter currently implemented and a full physics-based Kalman filter under development. *Wang et al.* [2004] describe a Global Assimilative Ionospheric Model (GAIM). They have a physics-based Kalman filter implemented and a 4DVAR method under development.

[10] This paper describes the development of the Ionospheric Data Assimilation Three-Dimensional (IDA3D) algorithm, an ionospheric objective analysis algorithm. IDA3D creates a global three-dimensional electron density specification by ingesting ionospheric measurements from a variety of instruments. IDA3D builds upon and extends previous work at ARL:UT in computerized ionospheric tomography (CIT) and in combining CIT, GPS-TEC, and ionosonde data to produce regional specifications of electron density [*Bust et al.*, 1994; *Kronschnabl et al.*, 1995; *Coker*, 1997; *Kronschnabl et al.*, 1997]. While in many ways IDA3D is an extension of this previous work, it handles both observational data and their errors in different ways and is based upon a fundamentally different mathematical technique than the previous CIT algorithms developed at ARL:UT.

[11] The paper is organized as follows. Section 2 introduces three-dimensional variational data assimilation (3DVAR), the numerical method upon which IDA3D is built. Section 3 contains a mathematical description of the IDA3D algorithm. Section 4 describes the data sources currently used by IDA3D. Section 5 presents sample results from IDA3D to demonstrate how it performs on actual data. Finally, section 6 provides a discussion of some of the outstanding issues regarding objective analysis for the ionosphere and IDA3D, as well as some future plans for improvements of the algorithm.