Physical Oceanography: The Shift to a Global View and Its Changing Culture

Many changes took place in physical oceanography over the last 60 years, as the science encountered the global-scale problems of climate and as numerous new technologies became available. Changes have occurred both in the known science and in the intellectual culture in which it is pursued.


Perspectives of Earth and Space Scientists
at NSF, said that "All successful programs require a hero." He meant that in the sense that much time and effort were required of an individual, one whose role, as with altimetry, might not always ultimately be recognized.) Crudely speaking, I would divide physical oceanography into two eras: the "classical" period prior to MODE and the "modern" period we are now in (cf. Wunsch & Ferrari, 2019). The classical period was that of a primarily ship-borne, hydrographic oceanography, which gave rise to elegant analytic models of a steady, laminar ocean. (For present purposes, "oceanography" from here on in this essay will mean "physical oceanography" unless otherwise stated.) The change in shipborne navigational practice is prototypical: from sextants, elaborate tables, and lengthy manual calculations, to far more accurate results obtained by simply pushing a button. (Post World War II, LORAN navigation was available in a few restricted regions.) Today we live in an era of global-scale remote observations of all kinds, of hugely complicated multi-million-line numerical model codes, and of a perception of a strongly geographically variable ocean, one with both short and very long time-scales of change. As one might expect, the transition was a painful one for practitioners in the classical world. Dry (2019) describes some of this transition in lay-person's terms. (Anyone interested in the nature of physical oceanography in the era of Ekman, Nansen, et al., or in the many decades both preceding and following, should consult Mills (2009) or Deacon (1971).)
As is well known, in 1973 the UK-US physical oceanographic communities undertook, to a great extent under the urging of Henry Stommel, what became known as the Mid-Ocean Dynamics Experiment (MODE), conducted under the aegis of the National Science Foundation's (NSF) IDOE (the International Decade of Ocean Exploration; see Jennings (2000) or Lambert (2000)), and which gave birth to a number of other cooperative regional programs (e.g., the Coastal Upwelling Ecosystems Analysis, CUEA). The MODE field program, which lasted about 3 months, vindicated the expectation of a few oceanographers that the ocean circulation had to be turbulent, analogous to the atmosphere. Meteorologist Victor Starr (building on the 1920s work of Harold Jeffreys) had shown that atmospheric weather, in the sense of "eddy" motions, has a profound influence on the longer-term mean (the "climate"): the dynamics of the latter could not be understood without accounting quantitatively for the eddying motion. Stommel had recognized as early as 1948 (Stommel, 1948) (in a semi-popular account; see the references) that the high Reynolds number ocean also had to be turbulent, but then he and others conveniently set the concern aside while generating a series of classical papers treating the ocean as steady and laminar. But, circa 1970, he decided that the time had come to face up to the uncomfortable fact that the system was almost certainly turbulent in the Starr sense. Fragmentary observations, particularly float data from John Swallow and Jim Crease, had shown unanticipated (by them) strong transient motions at depth. (Stommel's original essay and sketch proposing MODE can be found in the Supporting Information of Wunsch, 2021a.)
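The "high Reynolds number" claim is easy to make quantitative. A minimal order-of-magnitude sketch, using illustrative gyre-scale values that are my own assumptions rather than anything stated in the text:

```python
# Order-of-magnitude Reynolds number for a basin-scale ocean gyre.
# All numbers are illustrative assumptions for this sketch:
U = 0.1      # characteristic horizontal speed, m/s
L = 1.0e6    # basin length scale, m (~1000 km)
nu = 1.0e-6  # molecular kinematic viscosity of seawater, m^2/s

Re = U * L / nu
print(f"Re ~ {Re:.0e}")  # ~1e11, vastly beyond any laminar regime
```

Any plausible choice of scales gives a similarly enormous value, which is why a steady, laminar picture of the circulation was always physically suspect.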

Toward Globalization
One might argue that oceanography was global nearly from the outset: HMS Challenger in 1872-1876 circumnavigated the world. But because the world-view treated the ocean as a static laminar system, data from multiple decades and multiple regions could readily be combined to form the global picture. The Atlantic-focused Meteor expeditions of the 1920s (Wüst & Defant, 1936) were a basin-scale analogue. (An extreme example of the steady-state assumption was Wüst's celebrated (1924) first putative demonstration of the validity of the geostrophic relationship. As Warren (2006) notes, he combined anchored-ship current meter observations in the Florida Current from 1885 to 1886 with hydrographic measurements there from 1914; Warren concluded that the comparisons were more ingenious than convincing.) But, arising under the impact of concern about climate change, came the need to examine the ocean as a whole, as it varied in time from weeks and years to multiple decades. Indeed, almost the first question arising from MODE was whether the measurement site was in some way atypical of the rest of the world ocean.
The oceanographic community had a long history of international collaborative programs, but they had been primarily of the sort where different groups, often in several countries, divided up shipborne hydrographic work (e.g., the International Geophysical Year (IGY), the International Indian Ocean Expedition (IIOE), the Geochemical Ocean Sections Study (GEOSECS)). Although regional, MODE was prototypical of a far more intense form of cooperative effort, much of it under various IDOE physical oceanographic programs. These efforts involved a variety of instrument types, moored, free-floating, and shipborne, and included theoreticians and modelers in a serious dialogue. To the intense dislike of some individuals, the complexity of the human interactions thus required grew enormously. (Some elegant instruments, such as the ship-borne vertical profiling devices of Ted Pochapsky and of Tom Sanford (Pochapsky & Malone, 1972; Sanford et al., 1978), became victims of the need for long time series at enormous numbers of points. More will be said below about acoustic tomography in particular.) The parallel revolution in computer power and data storage, embodied in Moore's Law, is well known. Other developments proved important: the advent of passenger-carrying jet aircraft permitted oceanographic vessels to operate far from their home ports for months or years at a time, with scientists and crew being rotated aboard in distant oceans. Email came early to oceanography, through the commercial efforts of Robert Heinmiller and his Omnet company, starting about 1979 in the PEQUOD (Pacific Equatorial Dynamics) program, and quickly became essential to organizing participants spread across vast time-zone differences. (MODE investigators had relied upon the teletype.)
At that time, the meteorological community was completing the First GARP Global Experiment (FGGE; later called the Global Weather Experiment; GARP was the Global Atmospheric Research Program), directed at weather prediction improvement. Part of that community was turning its attention to the climate system through the World Climate Research Program (WCRP). The Keeling carbon dioxide curve (Figure 1) had appeared. Prominent oceanographers, including particularly Roger Revelle of SIO, were calling ever-more-urgent attention to increasing carbon dioxide in the Earth system as a whole, and pointing out potentially dire effects about which little was understood.
In the summer of 1979 the National Academy of Sciences, through its National Research Council arm, convened a small committee meeting in Woods Hole to consider what was known about the climatic influence of increasing CO2. Of the nine committee members, three were physical oceanographers: Henry Stommel, D. James Baker, and I. We were part of what became known as the Charney Report committee (National Research Council, 1979). The resulting comparatively brief report is still worth reading. It made plain how little we could actually say about the ocean, beyond the expectation that global warming would be delayed by some unknown factor, and that another unknown fraction of the carbon would go into the sea. Nothing was said about ocean acidification nor about sea-level rise, consistent with who we were and with the focus on heat uptake. The report, notably, made the point that the central purpose of the then-developing complex numerical models was to be certain that the much simpler ones, some dating back to Arrhenius and others, had not omitted important processes.
To a few of us, Baker and I included, it appeared that we had both a practical and an intellectual crisis at hand. Physical (and chemical and biological) oceanography was still mainly the domain of ship-board hydrography (temperature, salinity, oxygen) and of lowered current meters. Major parts of the world ocean had never been sampled even once below a couple of thousand meters (Kenyon, 1983; Stommel et al., 1973), and hydrographic sampling generally ignored the presence of what, after MODE, came to be called "mesoscale" or "geostrophically balanced" eddies. Such atmospheric models as did exist treated the ocean, sensibly, given what was known, as a passive "swamp." (Holland & Lin, 1975, published what may have been the first "eddy-like" numerical ocean model. Bryan et al., 1975, had pioneered the coupled modeling of ocean and atmosphere.) Climate is intrinsically a global problem: one cannot rule out any region of the ocean as making no significant contribution to determining distant climate and its changes on a huge variety of time-scales.
The necessity of a new, globally capable observing system, and the sense that the technology was going to make it possible, led a self-appointed small group to propose a global-scale experiment that later came to be called WOCE. I have written at greater length about my own perspective on how it all came about (Wunsch, 2006). Some people in the history of science field now claim that oceanography in this Cold War era was dominated and distorted by military needs and direction. I would emphasize that essentially none of what was a huge collective effort, and of that which followed, was funded to any serious extent by the US Department of Defense/Office of Naval Research (ONR), in large part because the underlying motivation was specifically directed at climate effects. At that time, ONR had decided it would not support any program concerning climate and its changes. (One regional part of WOCE, the subduction experiment, had to be formally separated from the main program to accommodate a Navy wish to fund it.) Many of the most capable classical-period seagoing oceanographers (it takes a very real talent to work effectively at sea) would probably be best described as naturalists. They were uncomfortable with mathematics and were deeply skeptical of the efforts of outsiders to produce novel or better methods of measurement at sea, having witnessed numerous failures. A natural conservatism pervaded the field. (I have written elsewhere about the early correspondence, 1947-1953, between Walter Munk and Henry Stommel (Wunsch, 2021b). They were two individuals who had independently recognized the power mathematics would have in the future, and who eventually found each other.) The consequence was that, among some scientists at the major US oceanographic institutions, a real hostility existed toward the notion of a collective, global measurement program like WOCE, and particularly one that involved such apparently outlandish data as that from satellite altimeters, scatterometers,
acoustics, etc. People working at the big oceanographic institutions generally had colleagues who did similar things, and so they often lacked the day-to-day rubbing of elbows with scientists, engineers, and mathematicians in very different fields, the origin of much scientific serendipity. With some important exceptions, I noticed that the most enthusiastic and helpful people in putting these programs together tended to come from the much smaller programs at various universities, where ongoing contact necessarily existed with scientists and engineers who were not oceanographers. The WOCE Steering Committee, for example, was well aware that, despite the hostility of many sea-going observers, the necessity of a global hydrographic program meant that the big institutions were going to get the major fraction of Federal agency support. And they did! An unfortunate impulse over the last few decades has led many governments to decide that, above all, oceanographers need a sea view. Thus many first-class oceanographic groups have been separated from the knowledge and stimulation of proximity to diverse sciences and engineering skills. The Australians moved their central group from Sydney to Hobart, Tasmania; the French moved from Paris to Brittany; the British took their group from the proximity of London to the commercial docks in Southampton; the US Navy moved a major laboratory from Washington, DC to southern Mississippi. Over decades, the effect is a pernicious isolation. (Movement in the opposite direction included the growth of the University of California branch near the Scripps Institution of Oceanography, and the Woods Hole Oceanographic Institution's links to MIT, albeit at a distance of about 150 km. Other examples exist.)

A Bit About ECCO
By the early 1990s, some form of WOCE appeared to be becoming a reality. In my mind, that provoked another difficult question: a great variety of near-global data sets were about to become available, ranging from traditional shipboard measurements, including a whole suite of tracers, floats, etc., to the then-exotic satellite data. Meteorological fields over the ocean were known to contain errors on all space and time-scales and had to be dealt with. How was all of this going to be put together into some sort of coherent picture of the global ocean circulation and its changes?
That question led, after some difficulties, to the ECCO Project, whose initial stages were formulated at MIT by a combination of planning and of luck (serendipity). For a few years we had people, mostly on a single floor of the Department's building, who understood much of the oncoming data streams; a new model was under construction, one that became the MITgcm (Marshall et al., 1997); and several individuals were comfortable with the ideas of optimization as practiced in the wider field of control theory. We did encounter some strong pushback from the meteorological community, who correctly insisted that they were the experts in combining real data with ongoing, evolving numerical weather prediction models. What was not appreciated by many in that community was that the problem of prediction was well known in the wider literature as being distinct from that of understanding a system through an extended interval of time. In particular, weather forecasting as practiced for decades through "data assimilation" paid little or no attention to such fundamental building blocks of understanding as conservation of energy, mass, vorticity, etc., a problem which persists to this day in what meteorologists call "reanalyses." But WOCE and successor programs were not directed toward prediction; they were directed instead at understanding the physics (and ultimately the chemistry and biology) so that sensible predictions might become possible at some future time (prediction for decades rather than for the days of weather prediction). Could one claim to understand the physics of a system evolving without mass or energy conservation? I thought not.
I knew little of the workings of ocean GCMs, or of the technical processes by which optimization of such models to data could be carried out using what has come to be called the "adjoint method" (but which, in discrete space-time, is most readily recognizable as a very large problem in nonlinear least-squares with Lagrange multipliers). With the help of the modelers down the hall, people including Jochem Marotzke, Detlef Stammer, Patrick Heimbach, and a few other individuals undertook the problems of large-scale optimization. ECCO is now 25 years down the road, with its main home at the Jet Propulsion Laboratory, and has given rise to what I prefer to call a process of "state estimation," extending over 30+ years of data coverage. ECCO came about in large part because we had assembled, in very close contact, modelers, data, and computer experts. Numerous papers have been written using the results, including applications in biology and chemistry. Much more can be said, but space does not permit a more detailed discussion. As with the other major programs such as TOPEX/Poseidon and its altimetric successors (Figure 2), I am now mainly an interested bystander, encouraging the experts to carry on. To a great extent my own role lay in being able to envision what was going to be required, in having some ideas about how it might be done, and in knowing who in the wider community might be able to bring it about. I remain strongly interested in the science being done with these instruments and efforts (I wanted the resulting data and the estimates!).
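The "adjoint method" idea can be shown in miniature. The following is my own illustrative sketch, not ECCO's machinery: a scalar toy model with invented numbers, in which a forward run is followed by a backward (adjoint) pass that accumulates the gradient of a least-squares misfit with respect to the initial condition, and simple gradient descent then recovers that initial condition.

```python
# Toy adjoint-method state estimation: fit the initial condition x0 of a
# scalar linear model x[k+1] = a*x[k] to "observations" y[k], minimizing
# J = 0.5 * sum_k (x[k] - y[k])**2.  All values here are invented.
a, N = 0.9, 20

def forward(x0):
    """Run the model forward, returning the full trajectory x[0..N]."""
    x = [x0]
    for _ in range(N):
        x.append(a * x[-1])
    return x

# Synthetic observations generated from a "true" initial condition.
x0_true = 2.0
y = forward(x0_true)

def gradient(x0):
    """dJ/dx0 via the adjoint recursion lam[k] = a*lam[k+1] + (x[k]-y[k])."""
    x = forward(x0)
    lam = x[N] - y[N]
    for k in range(N - 1, -1, -1):
        lam = a * lam + (x[k] - y[k])
    return lam  # equals sum_k a**k * (x[k]-y[k]), the exact gradient

# Gradient descent on the initial condition alone.
x0 = 0.0
for _ in range(100):
    x0 -= 0.1 * gradient(x0)

print(round(x0, 6))  # converges to the true value, 2.0
```

The point of the adjoint pass is that one backward sweep yields the gradient with respect to all control variables at roughly the cost of a single model run, which is what makes optimization of a multi-million-variable GCM state even conceivable.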

Tomography
As noted above, a few historians of science have attempted to make the case that ocean acoustic tomography was a key example of the distortion of oceanography by military concerns (and that oceanographers became interested in climate only as an easy, opportunistic way to replace Cold War funding). Some perspective on this interpretation can be found in Briscoe (2021). I was there at the beginning of tomography (see Wunsch, 2021a), and I would summarize the situation as one in which tomography came about with little interest or help from the US Navy.
In fact, the major activities in oceanography as a whole during the period in which tomography was being developed included the numerous IDOE programs of the 1970s, funded primarily by the National Science Foundation. And WOCE, funded in the US by NSF, the National Oceanic and Atmospheric Administration (NOAA), and the Department of Energy, was explicitly a climate program under the umbrella of the World Climate Research Program, one dating back to 1979.
At the time WOCE was being considered, I thought we had a better chance of establishing a major large-scale tomographic role in it than of convincing NASA, the French, and the European space agencies to spend hundreds of millions of dollars on ocean-observing satellites. (See Munk & Wunsch, 1982, for a vision at that time of what might be coming.) It didn't work out that way, for a number of reasons: the hue and cry over hypothetical acoustic damage to marine mammal life; the expense of building and deploying large acoustic sources, awkward to handle at sea; the relative complexity of the data analysis, compared, for example, to that of temperature and salinity profiles; and the lack of familiarity with ocean acoustics generally in the oceanographic community, exacerbated by the presence of the fiercely classified Navy activities. But the virtues of using acoustic signals in the ocean remain; the story is not over yet (e.g., Wunsch, 2020). Fifty years after Munk and I first proposed tomography, its real promise is still to be realized. Patience, perseverance, and longevity can be rewarded!
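The physical basis of those virtues remains simple: sound speed increases with temperature, so warming along a fixed path shortens acoustic travel times. A back-of-envelope sketch, with all numbers invented but plausible (they are my assumptions, not values from the text):

```python
# Travel-time change from a sound-speed perturbation along an acoustic path:
# t = L/c, so dt ~ -L * dc / c**2 for a small uniform perturbation dc.
# All numbers below are illustrative assumptions.
L = 1.0e6          # path length, m (a 1000-km section)
c = 1500.0         # reference sound speed in seawater, m/s
dT = 0.1           # assumed warming along the path, deg C
dc_per_degC = 4.6  # rough near-surface sensitivity of sound speed, m/s per deg C

dc = dc_per_degC * dT
dt = -L * dc / c**2
print(f"travel-time change ~ {dt*1000:.0f} ms")
```

A tenth of a degree of warming over a megameter path shifts travel time by a couple of hundred milliseconds, which is enormous compared with achievable acoustic timing precision; that sensitivity is what made basin-scale thermometry attractive.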

The Changed Scene
Recently, the 30th anniversary of the launch of the TOPEX/Poseidon altimeter system was noted within the several ongoing international altimetry projects. In some email exchanges, a few people who had been key to an ultimately successful mission lamented that no one in the now-operational altimetric programs seemed interested in how it all came about, and in who was responsible. My own response was that the general indifference was a measure of success: why should anyone grappling with sustaining missions decades later, or trying to improve the correction algorithms and calibration, pay attention to the now remote history of it all? True success may well imply being forgotten. One must be a bit of a philosopher.

Conundrum of Space and Time
Oceanic change, and that of climate more generally, is ultimately global. One cannot, by fiat, exclude particular regions of the ocean as unimportant to changes on long time scales. Life would be far easier if all of the interesting physics could be dealt with on a regional scale, over time-intervals more accessible to humans. Some (far from all) of the wider community has come to grips with the need for global-scale observations of change, but one can also perceive a strong growth in the number of scientists becoming primarily regional experts, much as is true in geology. The issue of time-scale is far more intractable. I used the slide in Figure 3 in a couple of faculty discussions of the time-scale problem, and then changed the subject. On one occasion, a faculty member came up to me afterward and said, "What advice do you give her?" I said something like, "Don't do it. You won't have a career." The problem is now ever with us. The rise of indefinitely sustained, operational measurement systems requires serious scientific attention from knowledgeable scientists and cannot be left wholly to technicians. Difficult questions always arise after some years: Is the calibration being maintained, and is it sufficiently accurate and documented? (Existing long-duration climate measurements are all accompanied by troubling questions about adequate calibration. The paper of Chan et al.
(2019) shows how difficult it is to use a seemingly simple set of measurements, those nominally of sea surface temperature.) Should the technology be replaced by newer instruments, given the inevitable difficulties in relating data from before and after the change, if made? Has the system become redundant and hence obsolete? (Stopping long-duration measurements influences some people's employment, a built-in resistance.) Do the data prove, in practice, adequate for the originally intended purpose? Jim Baker, Ray Schmitt, and I (Baker et al., 2007; Wunsch et al., 2013) have tried to make the point that climate is an inter-generational problem and that, for many purposes, it has to be pursued by established scientists able to sacrifice time and energy for payoffs that may well lie beyond their own lifetimes. We have not been very successful in finding the resources to support that.
Great progress has occurred in the numerical modeling of the global ocean (and atmosphere and cryosphere). The more complex a model is, the more demands are made for data capable of testing it. Consistent with the rest of fluid mechanics, the observational component is more essential than ever as we go forward. But like the global-scale models, observations are complex, with complicated error structures both random and systematic, and they require very close attention if they are not to mislead.
In my own view, oceanography thus has been, and remains, a dominantly observational science. Whenever a new instrument or laboratory setup has been developed, theory has attempted to explain what was seen. Although examples do exist where theory predicted an oceanic phenomenon subsequently observed, a much longer list exists of phenomena that theory could have predicted, but did not. Examples include the equatorial undercurrents, the deep western boundary currents, fine- and micro-structure, and a quasi-universal isotropic internal wave field.

Changing Culture
Adequately observing and understanding a global fluid leads to a scientific culture involving large teams, government agencies, and considerable patience. (A scientist was once introduced to me as "Dr. et al.," reflective of his usual place in the authorial pecking order of team-written papers.) Many of the major figures in oceanography in past decades can be regarded as generalists, working on many different aspects of the subject. Doing that is now far more difficult: too much is known, large groups are often needed, the literature is vast and fragmented, and, as noted, long-duration records are hard to get. One might draw an analogy with medicine: no one understands all of it, leading to intense specialization. Long records are crucial, but are difficult to fund and sustain: the cardiac "Framingham" study (see Mahmood et al., 2014) has been running, with difficulty, for 75 years. Studies of the drug known as DES are now exploring its impact on the third generation of women. And like medicine, climate change science has its own quota of snake-oil salesmen.
Historically in oceanography, newly developed instruments were used in an exploration mode, with little regard for the (usually unknown) accuracy and precision requirements. New phenomena were expected (fine- and micro-structure studies following the development of the STD/CTD are a good example). A more mature science, one deploying expensive, long-lived observation types, requires (though we rarely get) a quantitative understanding of sampling densities and distributions, and of the accuracy and precision needed for the stated goals. In the studies that led to the T/P launch, for example, at least four different estimates of the relevant accuracies had to be considered: what (a) the scientists wanted; (b) the scientists would still find useful; (c) the engineers thought they could do; and (d) the engineers were willing publicly to commit to producing. Intense debate took place concerning restricted latitude coverage versus the aliasing of tidal lines, etc. Parsing all that is not easy.
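The tidal-aliasing debate is quantitative at heart: a repeat-orbit altimeter revisits each ground-track point at a fixed interval, which folds the fast tidal lines to long alias periods that must not masquerade as ocean variability. A sketch, using the T/P 9.9156-day exact-repeat interval and the M2 tidal period (values I supply here for illustration):

```python
# Alias period of the M2 tide under repeat-track altimeter sampling.
# Assumed values: T/P repeat interval 9.9156 days; M2 period 12.4206012 hours.
dt = 9.9156               # sampling interval at a ground-track point, days
f_m2 = 24.0 / 12.4206012  # M2 frequency, cycles/day

# Fold the tidal frequency into the sampled (sub-Nyquist) band.
n = round(f_m2 * dt)      # nearest integer number of tidal cycles per sample
f_alias = abs(f_m2 - n / dt)
alias_period = 1.0 / f_alias
print(f"M2 alias period ~ {alias_period:.0f} days")  # ~62 days
```

The semidiurnal tide thus reappears as an apparent two-month oscillation in the altimetric record, which is exactly the kind of contamination the orbit designers had to weigh against latitude coverage.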
Directors of the major oceanographic institutions no longer "direct" in the conventional sense. They are today primarily fund-raisers and administrative managers, and it is not easy to identify the intellectual leaders we had in the past. In the classical era, strong figures such as Maurice Ewing, Columbus Iselin, Roger Revelle, or George Deacon could actively influence the science being done at their institutions, for better or worse, sometimes controlling almost the entire research budget. The idea of "oceanography," not just physical, has become almost meaningless; an analogy would be the lumping together of land geology, meteorology, botany, zoology, agriculture, seismology, etc., into one grand discipline. These disciplines were, in the early days, forced together by the need to obtain and sustain research vessels. Although that need is still there, the rise of global-scale problems and the growth of knowledge in each area have rendered the glue a very feeble one.
One can envision a time when extremely high-resolution numerical models of such demonstrable skill will exist that no more unexpected phenomena will appear. At that stage, physical oceanography as a science will no longer exist, although it will have all kinds of applications. That time still seems far away.
What will an essay like this one look like in 60 years? Perhaps one of today's post-doctoral scientists or graduate students will write it.
In the period following MODE, a number of technical developments were underway, including ideas and beginnings of new technologies based upon the oncoming transistor/integrated-circuit revolution: new salinometers, chemical auto-analyzers, moored instruments, developments arising from the SOFAR (Sound Fixing and Ranging) floats and autonomous vehicles, speculation about possible novel satellite measurements, the STD (Salinity-Temperature-Depth) and CTD (Conductivity-Temperature-Depth), a primitive form of satellite navigation, and more. (Not all clever measurement ideas come into widespread use: for example, the bottom pressure gauges of Walter Munk and of Jim Baker were overtaken by the altimetric and gravity satellites.)

Figure 1. Measurements of atmospheric carbon dioxide concentration, as initiated and sustained for many years by D. Keeling. (NOAA).

Figure 2. The TOPEX/Poseidon spacecraft configured as flown in 1992. (NASA). A post-doc once said to me, "Flying TOPEX must have been really easy." Me: "What do you mean?" He: "It's such an obviously good idea." Me: "It took 12 years." He: "Oh."

Figure 3 .
Figure3.Inset is the central England atmospheric temperature record showing multi-decadal variability and the very short time of an NSF 5-year grant and of the duration of the very short modern oceanographic observing systems.Used by me in faculty discussions of the time-scale problem-one which is not confined to the academic community.