Medium-Range Weather Prediction: The European Approach
Austin Woods
The reasons for GFDL’s institutional generosity became evident in the succeeding weeks. Dr Frederick G. Shuman had been Director of the National Meteorological Center (NMC), Washington, since 1963. He had had the difficult task of keeping an operational NWP system running, producing forecasts on schedule every day whilst introducing necessary improvements. His was not primarily a research institute. Burridge noted that Shuman’s job was, to say the least, a challenge, since “it took a mixture of science and art to run an operational NWP system at that time”.
Since the late 1960s Smagorinsky had been trying to persuade Shuman that NMC should follow up Miyakoda’s forecast results by initiating a vigorous programme in medium-range forecasting. Smagorinsky had failed in this effort. Perhaps this was partly because of Shuman’s conservatism and lack of will to introduce methods not originating at NMC, possibly based on a desire to avoid the difficulties inherent in introducing new software into operations. Another factor may have been institutional rivalry. However, Hollingsworth suspected that most of the problems arose from the sometimes abrasive relationship between these two formidable personalities. After failing for eight years to persuade NMC to get involved in medium-range forecasting, by 1975 Smagorinsky was eager to help the infant European institute, which was led by respected friends and which was charged with the operational implementation of one of GFDL’s most important initiatives.
1974 to 1980: the Formative Years

Hollingsworth returned to the Centre towards the end of July with copies of the model software and with initial data sets. At the same time Sadourny returned from UCLA, where Mintz had provided him with copies of their model code.
Sadourny, who had expertise in designing finite difference schemes for atmospheric models, co-operated with Burridge in developing the Centre’s barotropic model. In fact, Sadourny’s finite difference scheme was used in the operational model until the spectral model came into use some years later. For personal reasons — getting married! — Sadourny returned to France after only six months. He recalled the difference between his pure research work at CNRS and his work at the Centre. At the Centre, “he had felt under some pressure to produce results which were oriented to the Centre’s forecasting goal”, although he later recalled the “pleasant atmosphere and good working relationships” with his colleagues.
Hollingsworth got the GFDL model running on the 6600 within a few days, and completed and validated a low-resolution forecast to ten days by mid-August. By mid-September he had adapted the cunningly-contrived GFDL I/O scheme to enable him to make a forecast with a higher-resolution model on the CDC 6600, which had only about 24K memory. At the first Council session on Tuesday 4 November 1975, Wiin-Nielsen was able to report that “the scientific staff by working very hard in the last weeks have on Friday night last finished the first experimental forecast to 10 days.
The forecast was made from real data from 1965.” The model had a grid of 4° in latitude and longitude. Even with this coarse grid it took more than four hours of computer time for a one-day forecast. Graphical output was produced as “zebra-charts” plotted on line-printer paper, which Bengtsson and Hollingsworth enjoyed highlighting with coloured pens.
In parallel with this work, David Dent got the GFDL model running on the IBM 360/195 computer at the Met Office in Bracknell. The resulting verification, and comparison with the UCLA model, was the subject of the Centre’s first internal scientific publication, ECMWF Technical Report 1.
In 1976 and 1977 David Burridge, Jan Haseler and Rex Gibson wrote the adiabatic code for the ECMWF grid-point model, which a consortium of European countries was still using 25 years later as the basis of the High Resolution Limited Area Model (HIRLAM). The software design of this model benefited a great deal from the detailed study of GFDL’s software design.
We have mentioned in Chapter 1 the heated discussions with Wiin-Nielsen on Bengtsson’s decision, which some thought a high-risk gamble, to use the semi-implicit scheme for the forecast model. At that time, use of such a scheme, which correctly conserved important statistical properties of the atmosphere, such as energy, had been restricted to models for limited areas. However, Burridge had worked on this sort of scheme at the Met Office. Bengtsson recruited Dr Ian Rutherford, a research scientist at the “Division de Recherche en Prévision Numérique” (RPN) in Montreal, who served as Head of the Data Assimilation Section. Andrew Lorenc, recruited from the Met Office, also played a key part in the development of the data assimilation system.
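The idea behind the semi-implicit scheme can be illustrated with a toy oscillation equation (a minimal sketch, not the Centre’s actual model code; the frequencies and time step are arbitrary illustrative values). The fast frequency, standing in for gravity waves, is averaged over the new and old time levels, while the slow frequency, standing in for advection, is treated explicitly; the scheme then remains stable at a time step that would be far too long for a fully explicit treatment.

```python
# Toy oscillation equation du/dt = i*(w_slow + w_fast)*u, where w_fast
# stands in for fast gravity waves and w_slow for slow advection.
# Frequencies and time step are arbitrary illustrative values.
w_slow, w_fast = 1.0, 50.0
dt = 0.1   # far too long for a fully explicit treatment of w_fast

def step_explicit(u_prev, u_now):
    # Fully explicit leapfrog: unstable whenever (w_slow + w_fast)*dt > 1.
    return u_prev + 2j * dt * (w_slow + w_fast) * u_now

def step_semi_implicit(u_prev, u_now):
    # Leapfrog with the fast term averaged over the new and old levels:
    # (u_new - u_prev)/(2*dt) = i*w_slow*u_now + i*w_fast*(u_new + u_prev)/2
    rhs = u_prev + 2j * dt * w_slow * u_now + 1j * dt * w_fast * u_prev
    return rhs / (1.0 - 1j * dt * w_fast)

u_prev = u_now = 1.0 + 0j
for _ in range(200):
    u_prev, u_now = u_now, step_semi_implicit(u_prev, u_now)
print(f"semi-implicit after 200 steps: |u| = {abs(u_now):.3f}")  # stays bounded
```

Running the explicit step instead with the same time step makes the amplitude grow without bound within a few dozen steps, which is why explicit global models of the day were forced to use much shorter time steps.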
In one sense, the Centre had an advantage over National Meteorological Services: its model was global. Models covering limited areas had problems at the edges, or boundaries, of the areas covered; these boundaries made the use of stable numerical techniques difficult.
The Centre’s reputation was growing in the world meteorological community. In autumn 1977, Prof M. A. Petrossiants, the Director of the Hydrometeorological Research Centre in Moscow, accompanied by Dr V. Sadokov, visited the Centre. Wiin-Nielsen, Bengtsson and Labrousse made a return visit to Moscow in January 1978. Soon after, two visitors from Academgorodok in Siberia came to the Centre: Dr Gennadi Kontarev, who stayed from 1979 to 1980, and Dr Vassily Lykossov, from 1979 to 1981. Both were students and graduates of the renowned Prof Guri Marchuk.
During his visit, Kontarev gave several seminars on the adjoint method.
He wrote a report “The adjoint equation technique applied to meteorological problems”, published internally at the Centre in September 1980. The method had been developed by Prof Marchuk in 1974 to calculate the sensitivity of seasonal forecasts of Atlantic sea surface temperatures at three-month or six-month ranges to the initial sea surface temperatures in other areas of the world. We will see in Chapter 8 that the adjoint technique was to become important in development of the Centre’s forecasting system.
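The principle of the adjoint technique can be sketched in a few lines (an illustrative toy, not Marchuk’s formulation or the Centre’s code; the matrix, measure and dimensions below are invented for the example). For a linear model, a single backward sweep with the transposed model gives the sensitivity of a forecast measure to every component of the initial state at once, where a forward approach would need one run per component.

```python
import numpy as np

# Toy linear model step x_{n+1} = M x_n; forecast measure J = c . x_N.
# The sensitivity dJ/dx_0 is obtained by running the transposed (adjoint)
# model backwards: dJ/dx_0 = (M^T)^N c.  All values here are invented.
rng = np.random.default_rng(0)
n_state, n_steps = 4, 5
M = 0.4 * rng.standard_normal((n_state, n_state))   # toy linear model
c = rng.standard_normal(n_state)                    # measure of the forecast

# Adjoint run: one backward sweep yields the full gradient.
grad = c.copy()
for _ in range(n_steps):
    grad = M.T @ grad

# Check against a direct forward calculation for a perturbed initial state.
def forecast_measure(x):
    for _ in range(n_steps):
        x = M @ x
    return c @ x

x0 = rng.standard_normal(n_state)
dx = 1e-6 * rng.standard_normal(n_state)
lhs = forecast_measure(x0 + dx) - forecast_measure(x0)
print(abs(lhs - grad @ dx))   # agrees to rounding error, as the model is linear
```

The same one-sweep economy is what made the adjoint attractive for questions like Marchuk’s, where the sensitivity of one forecast quantity to initial conditions over many other areas is wanted.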
In these years, the Centre’s educational programme became well established. Distinguished invited lecturers, as well as the Centre’s scientists, gave presentations at the annual autumn Seminars. Meteorological and computer training courses extending over several weeks were given for the benefit of advanced students from the National Meteorological Services.
In 1978 the computer hall and office block at Shinfield Park were ready for occupation, while work continued on the conference block. During the last days of October the staff moved from Bracknell to Shinfield Park. The CRAY-1 Serial Number 9 installed in the computer hall replaced the Centre’s prototype CRAY-1 Serial Number 1, which had been installed at the Rutherford Laboratory. The CDC Cyber 175 was transferred from the Rutherford Laboratory to the new computer hall. Member State scientists began using the system immediately; Council had decided that 25% of the Centre’s computer time should be allocated for National Meteorological Service use.
There were two alternatives for the model physics. The GFDL physics code was implemented in the ECMWF model as one option. The second was the first physics package developed by the research staff at the Centre.
This package in effect brought together the results of 15 years of intellectual capital, based on worldwide research into modelling atmospheric physics, which to date had been relatively unexploited. Bengtsson’s plan was to hope for success with the ECMWF physics package, which had many modern ideas, and to use the well-known and proven GFDL physics package as the fallback. Hollingsworth recalled Bengtsson expressing his nervousness about the development of the physics: “For God’s sake Tony, don’t let them put too much dynamite in the model!” — “them” being the scientists of Hollingsworth’s section: Michael Tiedtke, Jean-François Geleyn and Jean-François Louis.
Hollingsworth led the team in the long and technically difficult job of making a set of ten-day forecasts on the Centre’s first CRAY-1 at the Rutherford Laboratory. Response times were slow, and it was difficult to get the data to and from the computer. Assessing the performance of the two sets of physics was the principal objective. The GFDL physics package used a rather simplified representation of rain, snow, convection, internal turbulence in the “free” atmosphere aloft, turbulence at and close to the surface, and the effects of radiation and its interaction with the model clouds.
It had been in use at GFDL for 15 years, and so was robust and well tested, with well-known properties. In contrast the ECMWF physics package was more complex with more feedback loops; it was a state-of-the-art system, but with unknown characteristics. The model used was chosen to be close to the planned first operational model: a horizontal resolution of about 300 km, 15 levels in the vertical, an enstrophy-conserving finite difference scheme, and a semi-implicit time-stepping scheme. Good-quality global analyses from February 1976 provided by NMC Washington were used as the initial data from which the forecast experiments were run.
The main result of this work, completed in 1978/79, was something of a shock. Both sets of forecasts made with the Centre’s numerical scheme, whether using ECMWF or GFDL physics, had large-amplitude, similarly distributed systematic errors in the large-scale flow. However, the differences between the GFDL and ECMWF physics packages were surprisingly small.
There was no obvious way of choosing between the two with respect to forecast quality. Objective scores were no help; they were on the whole similar for both versions. Bengtsson decided to use the Centre’s own physics package for the operational model. It had the best science, and the best prospect for later improvements — and as it happens the most dynamite!
This work, called the “Spring Experiments”, provided vital clues for later diagnostic work on orography and surface exchange processes, and set the research agenda for developments of the model physics for the next decade, leading to major model improvements in the period 1980–83.
The Centre’s first model was based on a grid-point approach, in which the forecast variables are specified on a set of evenly spaced grid points. The model resolution is defined by the spacing between the grid points: the closer they are, the higher the resolution. For a model covering a limited area this is fine. But the Centre had a model covering the globe. As we have noted above, this gave it a significant advantage over the models used by the National Meteorological Services: it had no horizontal boundaries.
These boundaries give rise to computational problems, which can quickly spread towards the centre of the model area. A global model, however, faces a different problem: approaching the North and South Poles, the grid points get closer and closer together, so that the east–west spacing shrinks towards zero, eventually causing computational problems at the Poles themselves.
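The shrinking of the east–west grid spacing towards the Poles is easy to quantify (a back-of-the-envelope sketch; the only inputs are a mean Earth radius and the 1.875° spacing of the Centre’s first operational grid, mentioned below):

```python
import math

# Zonal (east-west) spacing of a regular latitude-longitude grid shrinks
# with the cosine of latitude: dx = R * cos(lat) * dlon (dlon in radians).
R_EARTH_KM = 6371.0       # mean Earth radius
DLON_DEG = 1.875          # the first operational model's grid spacing

def zonal_spacing_km(lat_deg, dlon_deg=DLON_DEG):
    return R_EARTH_KM * math.cos(math.radians(lat_deg)) * math.radians(dlon_deg)

for lat in (0, 45, 60, 85, 89):
    print(f"{lat:2d} deg: {zonal_spacing_km(lat):6.1f} km")
```

Near the equator the spacing is roughly 200 km, but at 89° latitude it has collapsed to a few kilometres, and such tiny grid lengths force a correspondingly tiny time step on the whole model unless special measures are taken.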
There is an alternative: the spectral model, which uses continuous waves to solve the forecast equations, and which was designed specifically for global domains. In fact, Lennart Bengtsson first met David Burridge and Adrian Simmons at a meeting on spectral models held in August 1974 in Copenhagen. Work on designing a spectral version of the Centre’s model had already begun in 1976. In May 1976, Bengtsson noted that “great attention is being paid to semi-implicit integration schemes and also to spectral representations”, and that an “experiment will replace the computation of finite difference horizontal derivatives in the GFDL model by spectral derivatives”. Indeed, the Centre had a spectral model formulated even before operations with the grid-point model began.
As the highest priority was to get operational prediction started, the grid-point model was used; it was a good model, with efficient and stable numerical techniques and, as we have seen, a good physics package. The horizontal grid was 1.875° in latitude and longitude, equivalent to about 200 km near the equator, with 15 levels between the surface of the earth and the top of the model atmosphere at about 25 km.
To run the forecast operationally, software was required to manage the entire operational suite. The observational data were received on magnetic tapes delivered by car or motorbike several times a day from the Met Office in Bracknell, until high-speed telecommunications links were installed. These data had to be checked and quality-controlled on the front-end computer, and put in a database. At analysis time the data required were extracted. Then the analysis and forecast were run. The forecast fields required for the Member States, for the Centre’s monitoring of the data, analyses and forecasts, and for the archives, had to be extracted as the forecast was running.
Many more operations of a technical or operational nature were required in real time. Roger Newson from the UK, as head of the Meteorological Division in the Operations Department, had overall responsibility for the initial pre-processing programs, graphic software and telecommunications.
The ECMWF Meteorological Operational System, or EMOS, developed in the Operations Department by Joël Martellet and his team, managed this complex operation. Martellet was able to save time by taking advantage of the fact that the Météorologie Nationale in France had the same CDC front-end processor; he based the Centre’s system on the French data pre-processing program suite. There were some lively discussions within the Operations Department on the relative merits of adapting or re-writing the programs.
A Meteorological Operations Room was established and suitably equipped. Here the operational forecasts were monitored, rejected observations examined, and the consistency and accuracy of the daily forecast runs discussed by scientists of the Meteorological Operations Section and the Research Department. This careful, systematic monitoring of the observational data flowing to the Centre from all over the globe was unique in meteorology, an on-going and increasing effort that would within a few years prove its worth to the world meteorological community.