Medium-Range Weather Prediction: The European Approach
Austin Woods
Chapter 14: Re-analysis — towards a new ERA

The Level IIIb analyses were to be archived in the two World Data Centres for Meteorology, in Moscow and Asheville, North Carolina, for worldwide distribution. They were also, of course, archived by the Centre.
Many diagnostic tools for monitoring the analysis production had to be developed. Weaknesses in the analysis system were identified and corrected. In February 1979, and again in April and May, a series of near-real-time tests of the entire FGGE data-processing chain, from the primary data producers to the final analyses, was completed: the so-called "End-to-End Tests". During one of these, a visitor from ESA/ESOC, John Morgan, later EUMETSAT Director, was impressed to see for the first time "cloud drift" winds — winds estimated from cloud movement measured by satellite — being used in a numerical analysis.
Sea surface temperature analyses based on non-satellite data were produced for every ten-day period.
We saw in Chapter 8 the influence of Andrew Lorenc’s work on the ECMWF assimilation system. His results were vitally important for the success of the Centre’s FGGE efforts. Already he had developed the “Data Quality” files, predecessors of what today are called “feedback files”, which recorded events during the complex quality control and analysis operations.
These were crucial to the Centre later being designated a Lead Centre for monitoring global upper-air data in the WMO system, as noted in Chapter 8.
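The idea behind such feedback files can be sketched as a record pairing each observation with its departures from the background forecast and from the analysis, together with the quality-control decision. The field names and values below are purely illustrative, not the Centre's actual file format:

```python
from dataclasses import dataclass

@dataclass
class FeedbackRecord:
    """One quality-control event for one observation (illustrative only)."""
    station_id: str
    variable: str       # e.g. "temperature" (K)
    observed: float
    background: float   # short-range forecast ("first guess") value
    analysis: float     # value produced by the analysis
    qc_decision: str    # e.g. "used", "rejected"

    def background_departure(self) -> float:
        # Observation minus background: the signal quality control acts on
        return self.observed - self.background

    def analysis_departure(self) -> float:
        # Observation minus analysis: how closely the analysis drew to it
        return self.observed - self.analysis

rec = FeedbackRecord("03772", "temperature", 271.4, 270.1, 270.9, "used")
print(round(rec.background_departure(), 1))  # 1.3
print(round(rec.analysis_departure(), 1))    # 0.5
```

Archiving these departures alongside the decisions is what makes it possible, years later, to ask how any given station or instrument was actually treated by the assimilation.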
The Centre could not begin production of the IIIb Dataset until December 1979. Production was initially slow, with many teething problems both with the Level IIb data as received and with the ECMWF operational analysis scheme, which was used to produce the IIIb Dataset.
Boxes of 1,600 bits-per-inch (bpi) tapes were collected from Sweden. A major effort was made to produce a complete, successful assimilation for 00 UTC on 16 January 1979, and a ten-day forecast was run. This analysis and forecast were used as a reference; any new development was first verified against them. The quality of this forecast was, somewhat fortuitously, excellent. Bengtsson plagued the Research Department, and Andrew Lorenc in particular, for months afterwards to find out why changes being introduced into the fledgling forecasting system made this one case worse!
Three months of analyses were completed by mid-May 1980, and six months by early October. By April 1981, production had reached into September 1979 and analysis tapes up to June had been delivered to the World Data Centres. Already significant changes had been made to the ECMWF operational assimilation system, but the FGGE system was kept unchanged; the goal was to produce a consistent set of analyses for use in general circulation and climate studies.
The assimilation of the FGGE data, at the time the most complete set of global observations ever assembled, to produce a set of global analyses every six hours throughout the FGGE year — the IIIb Dataset — was completed in summer 1981. This Dataset allowed for the first time detailed examination of phenomena in areas of the globe normally almost devoid of observations, such as the Indian Ocean. Paul Julian and Masao Kanamitsu, experts in tropical circulation, paid particular attention to these areas.
Significant cross-equatorial flows could be followed clearly in the analyses.
A series of “Observing System Experiments” (OSEs) was begun in association with FGGE, and carried out by groups of scientists coordinated by the European Working Group on Future Observing Systems discussed in Chapter 8. These important experiments were designed to assess the impact of different types of observations on the resulting analyses and forecasts.
How important were satellite-measured temperature data for forecasting over the South Atlantic? Or cloud drift winds for predicting tropical weather? Series of forecasts to ten days ahead were run, with particular data types withheld before carrying out the analyses. For example, over the Northern Hemisphere, seven-day forecasts starting from analyses using all data had the same accuracy as five-and-a-half-day forecasts made without satellite or aircraft data: a gain of 36 hours in medium-range forecast skill.
From the OSE results, some of which were surprisingly significant, planning replaced informed guesswork in deciding on the future observing systems for the World Weather Watch.
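The 36-hour figure is the kind of number such data-denial comparisons yield: find the forecast range at which each experiment's skill falls to a chosen threshold, then take the difference. A minimal sketch, with invented anomaly-correlation curves and an assumed 0.6 skill threshold (all numbers hypothetical):

```python
import numpy as np

# Hypothetical anomaly-correlation curves by forecast day (invented values)
days = np.arange(1, 11)
acc_all    = np.array([0.98, 0.95, 0.91, 0.86, 0.80, 0.73, 0.65, 0.57, 0.49, 0.42])
acc_denial = np.array([0.97, 0.93, 0.88, 0.81, 0.73, 0.65, 0.57, 0.49, 0.42, 0.36])

def lead_at_threshold(days, acc, threshold=0.6):
    """Forecast range (in days) at which skill falls to `threshold`,
    by linear interpolation of a monotonically decreasing curve."""
    # np.interp requires increasing x, so reverse the decreasing curve
    return float(np.interp(threshold, acc[::-1], days[::-1]))

gain_hours = 24 * (lead_at_threshold(days, acc_all)
                   - lead_at_threshold(days, acc_denial))
print(gain_hours)  # 24.0 with these invented curves
```

The same comparison run over many cases, regions and data types is what turned the OSEs from anecdote into a quantitative basis for observing-system planning.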
Kållberg had returned to Sweden in 1982, but work continued on the FGGE data in the new Numerical Experimentation Group with Sakari Uppala and Stefano Tibaldi. Visiting scientists from the USA and from China, as well as from the Member States, took part in the work. The US National Science Foundation funded some of the visitors.
A new set of the observational data was delivered to the Centre in 1984–85, including additional observations and corrections to errors in the earlier Level IIb Dataset. By this time many improvements had also been made to the ECMWF analysis system. To reinforce the section, Kållberg returned to the Centre for nine months in 1985. The FGGE data, including the Special Observing Periods of 5 January to 5 March and 1 May to 20 June 1979, when intensive measurement campaigns were carried out, were re-analysed using the upgraded Final Level IIb data. The Final IIIb analyses were delivered to the World Data Centres by the end of 1986. The new analyses proved to be measurably superior to those made earlier. Subsequently the Final Level IIb data were used extensively for new and "comprehensive" OSEs over two separate two-week periods.
Drawing on the important and valuable experience of the FGGE re-analyses, Bengtsson consulted many scientists worldwide on the possibilities of re-analysing the operational archive of the Centre: all the observations that had been received since 1979. Born in India, Prof Jagadish Shukla was a member of the US TOGA panel and of the scientific steering group of the international TOGA programme. Shukla invited Bengtsson to the Center for Ocean-Land-Atmosphere Studies (COLA) in Calverton, Maryland, USA, and discussed with him proposals for wider re-analysis projects, his own proposal having been turned down by NMC Washington. Bengtsson and Shukla published a paper in 1988 advancing the concept of re-analysis.
• A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth’s climate system.
• Current and future observing systems are very expensive and dominate the expenditure budgets of the meteorological Services.
• There is no doubt that a reanalysis of global data over, say, a period of ten years is a considerable effort, both in manpower and computer resources.
Kevin Trenberth and Jerry Olson in the USA had independently suggested that major global re-analyses be carried out. These suggestions about extensive re-analyses to produce climate data sets, which included detailed comment on the difficulties and how they might be overcome, were not well received in the beginning. Gradually, however, the meteorological community came to accept the concept. Several groups around the world are today carrying out re-analyses to produce data for climate research. Typical research applications that make good use of re-analyses include general circulation diagnostics, atmospheric low-frequency variability, the global hydrological and energy cycles, studies of predictability, coupled ocean-atmosphere modelling and observing-system performance.
Slowly the concept of an ECMWF Re-Analysis (ERA) was developed. It was planned to use the 15 years of data in the archives from 1979 to 1993 inclusive: “ERA-15”. Bengtsson left the Centre at the end of 1990. The new Director David Burridge gave his full support to the project, and became the Project Chairman. The project team was Rex Gibson as Project Manager together with Kållberg and Uppala, who both returned to the Centre.
In the planning and development phase a Steering Group advised on matters of scientific and policy importance. Additional advice was obtained by setting up an External Advisory Group, comprising eminent scientists from Europe and the USA.
Before beginning the ERA production, the assimilation system had to be defined. Proven, modern data assimilation, not necessarily identical to that of the operational suite, was required. The project began in February 1993 with a comprehensive set of experiments in the form of parallel data assimilations and forecasts, usually over three-week periods and with extensive diagnostics. The first phase of the work also included the acquisition and preparation of the observations and of forcing fields such as sea surface temperatures, experimentation to determine the composition of the production system, and the development of both the production system and the internal validation tools.
A reliable production system capable of performing data assimilation was developed. The system was separate from both the operational and research systems. Using the combined experience of the Centre’s Operations and Research Departments, the systems in use were studied carefully, slimmed down where necessary, modified to use the data in an archive as opposed to real-time data, and optimised for performance. This resulted in a prototype system capable of performing at the required rate, which was further refined and completed while being used as the principal vehicle for the initial ERA experimentation.
It was decided early on that to optimise the use of resources the re-analyses should be carried out with a horizontal resolution of T106. For the vertical resolution 31 levels were used, rather than the 19 that corresponded more closely to the horizontal resolution, since the higher vertical resolution produced clearly superior analyses particularly around the tropopause.
At the time, “envelope orography” was being used in operations to parametrize the effects of sub-gridscale mountains. A new parametrization of the effects of sub-gridscale orography based on mean orography, and including a revised formulation of the gravity wave drag, developed by François Lott and Martin Miller, was also available; this was discussed in Chapter 9 when we considered the model. Test assimilations using this scheme showed no negative effects, while up to 10–15% more observations were accepted at 1000 hPa and 925 hPa. This scheme was chosen.
Using a prescribed soil climatology, as in the pre-1995 operational system, ran the risk of “forcing” a re-analysis towards its climate, since such a climatology is based on very sparse information and may suffer from inconsistencies. Hence a new four-level self-contained soil parametrization
scheme developed at the Centre by Anton Beljaars and Pedro Viterbo for operational implementation was selected for ERA.
The new variational assimilation scheme (3D-Var) and a new cloud parametrization with cloud water and cloud fraction as predictive parameters were still being developed in the Research Department; neither was sufficiently mature at the time of decision, and they were not selected for the re-analysis. The final production system was adopted in 1994. There followed a period of sustained production, monitoring and validation throughout 1995 and the first nine months of 1996.
The observations used by ERA came mainly from the ECMWF Meteorological Archive and Retrieval System (MARS). Additional sources included:
• 250 km cloud-cleared satellite radiance data.
• Ship and buoy observations from the “Comprehensive Ocean Atmosphere Data Set” (COADS), the most extensive collection of global surface marine data over the period.
• FGGE and Alpine Experiment (ALPEX) II-b data.
• Satellite cloud winds made available by Japan.
• The “pseudo-observations” (PAOBs) made available by NMC Melbourne: sea-level pressures, estimated from satellite imagery and forecast fields, over data-sparse parts of the Southern Hemisphere.
• Supplementary radiosonde and aircraft data, also provided by Japan.
• TOGA buoy and other oceanic data.
By 1979, winds and temperatures were being received from commercial aircraft all over the globe, although most of the flights, and therefore most of the data, were in the Northern Hemisphere. The reports improved significantly over time in both coverage and quality, as automatic in-flight reports every ten minutes replaced infrequent manual reporting. During the 1990s, reporting frequency was increased automatically during take-off and landing, giving a “profile” of wind and temperature through the atmosphere.
Once production began late in 1994, the scientific emphasis gradually moved from experimentation to monitoring and validation. The external forcing fields were validated before production started by means of maps, averages and time series. Every effort was made to detect, as early as possible, potential problems that would require further investigation. When appropriate, production was halted and re-started from an earlier date. In some cases production was allowed to continue, but the month or months concerned were re-run later. The monitoring made use of a set of quality control tools, whose output, usually in the form of graphical information, was continuously assessed. All graphical and tabular monitoring results were kept both as hard copies and as files. Diaries were kept of all special events and problems encountered.
Production and monitoring continued throughout 1995 and into 1996.
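A monitoring tool of this kind can be as simple as flagging months whose fit statistics jump relative to recent history, prompting a halt or a later re-run. The sketch below, with invented numbers and an illustrative threshold, shows the idea:

```python
import statistics

def flag_anomalies(monthly_rms, window=6, factor=1.5):
    """Flag months whose background-fit RMS exceeds `factor` times the
    median of the preceding `window` months (thresholds illustrative)."""
    flagged = []
    for i in range(window, len(monthly_rms)):
        baseline = statistics.median(monthly_rms[i - window:i])
        if monthly_rms[i] > factor * baseline:
            flagged.append(i)
    return flagged

# Invented monthly RMS fit statistics; month 6 shows a sudden jump
rms = [1.9, 2.0, 2.1, 2.0, 1.9, 2.0, 3.4, 2.0, 2.1]
print(flag_anomalies(rms))  # [6]
```

A flagged month would then be examined through the graphical diagnostics, and either production halted and restarted from an earlier date or the month queued for a later re-run.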