Medium-Range Weather Prediction: The European Approach
Austin Woods
By mid-2002, production was progressing in three streams:
• the 1957 stream had reached September 1962,
• the 1972 stream had reached September 1976, and
• the 1989 stream had reached April 1997.
Forecasts run from the ERA-40 analyses were superior in many ways to the operational forecasts that had been run before 1999. Detection of tropical cyclones was good.
As planned, production of the ERA-40 analyses from 1 September 1957 to 31 December 2001 was completed shortly before the Fujitsu service ceased on 31 March 2003. Fujitsu in fact allowed the VPP700E computer to remain on site for a further month, and ERA-40 was extended to August 2002.
ERA-40 was the first re-analysis dataset in which an ocean wind-wave model was coupled to the atmospheric model. It provides the longest and most complete existing wave dataset. The ERA-40 ocean wave analyses became the natural choice for studies of the climatology and variability of ocean waves, and for predicting extreme values of wave parameters over the whole globe.
The existence of the ERA-40 dataset allowed detailed studies, including “re-forecasts”, of major European weather events. To study the Dutch storm of 1 February 1953 mentioned in Chapter 12, the period 1 January to 10 February 1953 was analysed; see the figure.
The study shows not only that today’s analysis techniques can be used for periods up to 50 years ago, but also that today’s numerical prediction models, if available, could have given warning of the weather leading to the floods.
Before finishing this story, we can quantify the rapid advances in technology.
• In a two-year period starting in 1980, a global re-analysis for just one year, the FGGE Level IIIb Dataset, was produced. It added 10 GB — ten thousand million bytes — of data and fields to the Centre’s archive.
• In 1994–96, a 15-year period was analysed to give ERA-15, adding 2000 GB of data and fields — about 130 GB per year.
• A 45-year period (ERA-40) was analysed in 2000-03. The archives increased by 70,000 GB, more than 1,500 GB per year.
Finer resolution, together with requirements for a greater range of information from the analyses and forecasts, account for the increases.
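The growth rates quoted above follow directly from the archive volumes and the periods analysed; a short calculation, using only the figures given in the text, makes the acceleration plain:

```python
# Archive growth per re-analysis project, using the figures in the text:
# (GB added to the archive, number of years analysed).
projects = {
    "FGGE Level IIIb (1 year, early 1980s)": (10, 1),
    "ERA-15 (15 years, 1994-96)": (2000, 15),
    "ERA-40 (45 years, 2000-03)": (70000, 45),
}

for name, (gb, years) in projects.items():
    # Simple division gives GB archived per year analysed.
    print(f"{name}: about {gb / years:,.0f} GB per year analysed")
```

This reproduces the rates in the text: 10 GB per year for FGGE, about 130 GB per year for ERA-15, and more than 1,500 GB per year for ERA-40.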
The upper-air (500 hPa) three-day forecast for 1 February 1953 is on the left; the ERA numerical analysis for that date is on the right. Compare these with the tapestry on page 142, for which a hand analysis of the storm made in 1953 was used in the design. Note the good analysis and “forecast” of the storm made with ERA data.
Thus, today’s analysis technique can be extended back, to analyse significant historical weather events.
The ease of access to these datasets has improved. The FGGE analyses were available only by mounting up to 50 tape reels. All the ERA-15 and ERA-40 products — about 33 TB, or 33 million million bytes — are available effectively on-line for users from the Member States and Co-operating States. About 400 GB of the most useful ERA-40 products are also accessible freely on-line to researchers worldwide through the ECMWF public data server.
The ERA-40 re-analyses have been used for a wide range of applications:
studies of bird migration, detection of climatic temperature trends, seasonal variations of climate and their better prediction, and much more.
Re-analyses in general, and ERA-40 in particular, have contributed greatly to many aspects of climate research.
The success of the ERA re-analyses and of the first-generation US re-analyses led the Japan Meteorological Agency, in association with the Japanese Central Research Institute of Electric Power Industry, to undertake JRA-25, a re-analysis from 1979 to 2004. A re-analysis from 1948, which is being continued in close to real time, has been produced in the USA by NCEP in collaboration with NCAR. In addition, the Data Assimilation Office of the National Aeronautics and Space Administration, USA, has produced a sixteen-year re-analysis from March 1980. NCAR has set up a comprehensive ERA-40 data service for UCAR and other US members of the research community. At the time of writing, around 4,000 users worldwide have downloaded data from the subset made freely available online by ECMWF.
Archives and Graphics: towards MARS, MAGICS and Metview
The long-term goal was clear: to support the large and growing scientific community by providing a service from the ECMWF databases.
Observational data, analyses and forecasts would all be easily retrieved and supplied to support scientific field experiments, climate studies and more.
However, it was all a bit chaotic at first. Different ad-hoc solutions were being applied to individual requirements.
Newly recruited research staff started their work in 1975. Research progressed quickly in modelling, data assimilation and other areas. Model software had been acquired and was being modified and tested. The results of experiments had to be stored. They were copied onto tapes. The tapes were stored in John Scott House in Bracknell, at the Rutherford Laboratory after installation of the CRAY-1 there, and later in the computer hall at Shinfield Park. In practice, they were easily accessible only to those who had written the data to the tapes. So long, that is, as they could remember the formats used to write them, where they were stored and so on.
However, although formats were generally documented, the data were not easily accessible to anyone else. It was not even clear how long they should be kept. Some scientists left the Centre after a short period, effectively abandoning their files. There was no functioning operational archival and retrieval system at the Centre until 1979, when operational forecasting started.
By dint of necessity, the Research Department set up its own system for storing and retrieving their data, interpolating data to give meaningful fields, and plotting and displaying the results. This system, advanced by the standards of the time, was used for the “Spring Experiments”, mentioned in Chapter 7, FGGE work and other Research Department activities. On the whole, it worked fairly well. However, the system was still based on private files. Researchers had to spend more and more of their valuable time in housekeeping their private archives. It was not a satisfactory permanent solution.
Besides the practical danger that the archives of the Centre would eventually become a black hole, into which data would be placed, never to be seen again, there was a strong legal and political need for a proper archive.
As early as 1970, the Project Study Plan for the Centre specified, after “better medium and long term forecasts for Europe and training facilities for post-graduate scientists”, the requirements for a functioning archive:
2. Nature and quantity of information to be stored
– material requirements
– personnel
It was planned that the Centre would have an efficient data bank for use by meteorologists in the Member States. The Convention as adopted in 1973 specified among the objectives of the Centre: “to collect and store appropriate meteorological data”, and to make these data “available to the meteorological offices of the Member States”. They would no longer have to depend on the service provided by the World Meteorological Data Centre in Obninsk, Russia. With the technology of the time, for example, Obninsk could not read 200 bpi tapes sent from Regional Meteorological Centre (RMC) Bracknell — although they could read those from ECMWF.
A group of experts met in July 1975 to consider requirements for graphical systems, both for the interim period leading up to the completion of the Centre’s building, and later when the Centre’s own computing facilities had been installed. Requirements for chart production, volume of output and coding for graphics were discussed. Recommendations were made on hardware and software, leading eventually to Versatec 8122 online electrostatic plotters being installed in Shinfield Park, superseding the Varian Statos offline plotters at the Rutherford Laboratory.
Design of the ECMWF Meteorological Operational System (EMOS) began in early 1977, when the newly recruited staff of the Meteorological Applications Section of the Operations Department were in Fitzwilliam House in Bracknell, working under the direction of Joël Martellet, recruited from Météorologie Nationale, France. By October of 1978, even before the move to Shinfield Park, the plan for EMOS had been finalised. EMOS had the same logical structure required for any large operational numerical weather prediction system:
• System for acquiring weather reports
• System for pre-processing and quality-controlling the reports
• Reports Data Base (RDB) into which the reports were streamed
• Analysis and Forecast system
• Post-processing system, to prepare forecasts for despatch to the Member States and for archiving
• System for disseminating the analyses and forecasts
• Archiving system
• Scheduling system, called the “Supervisor-Monitor-Scheduler”
• Operational Watch
The “Supervisor-Monitor-Scheduler” (SMS) was the software that ensured proper synchronisation and scheduling of the operational programs.
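The essence of such a scheduler is that each component runs only after the components it depends on have completed. The dependency graph below is a plausible reading of the EMOS component list above, not ECMWF’s actual SMS configuration, but it illustrates the idea:

```python
# A minimal sketch of dependency-driven scheduling of the kind SMS
# provided. The task names and dependencies are illustrative only,
# loosely based on the EMOS component list in the text.
from graphlib import TopologicalSorter

dependencies = {
    "pre-processing": {"acquisition"},          # quality control needs raw reports
    "analysis/forecast": {"pre-processing"},    # model runs on checked data
    "post-processing": {"analysis/forecast"},   # products derived from model output
    "dissemination": {"post-processing"},       # send products to Member States
    "archiving": {"post-processing"},           # store products for later retrieval
}

# A topological sort yields an order in which every task follows
# everything it depends on.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

In a real supervisor-monitor-scheduler the tasks would also be monitored for failure and restarted, but the ordering constraint is the core of the synchronisation problem.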
In autumn 1979 the entire complex system was ready for implementation.
The Operational Watch provided information to the meteorologist on duty in the Meteorological Operations Room: “information on request typed on a keyboard from an alphanumeric VDU or a graphical VDU terminal”.
One important component of EMOS was the archiving system. It was planned that “8 to 10 6250 bpi (bits per inch) tapes will be mounted every day” to archive the weather reports from three days earlier, together with the current day’s analysis and forecasts. For security, the tapes would be duplicated. Weekly or monthly, a third copy would be made and stored outside the computer hall. Punch cards with various directives would be used to extract observations, analyses or forecasts.
A system “GETDATA” was designed and implemented by analyst John Chambers in the early 1980s, to give easy access to the archived data. It located the data that had been produced recently and was still held on disk, and the archived data that had been stored on tapes. Which tape held which data was recorded in a “master index”, providing a primitive database. The tape reels were kept on racks in the computer hall. Following a request for data, an operator received a printed tape ID number. The operator retrieved the tape and loaded it on a tape reader. When a faulty tape was discovered it was discarded, the backup immediately copied, and the copy used to meet the request. GETDATA worked well for a time. Its users were on the whole happy with the system, and soon became familiar with the directives used to retrieve data. The user no longer had to be aware of the operational timetable, of the methods and formats used to store the different kinds of data, or of technical changes to the archives.
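The mechanism described above — a master index mapping data to tapes, with a backup copy substituted when a tape proved faulty — can be sketched in a few lines. All identifiers and the data layout here are hypothetical, purely to illustrate the lookup, not the original implementation:

```python
# Illustrative sketch of a GETDATA-style "master index": a lookup from a
# data descriptor to the tape holding it. Tape IDs and keys are invented.

master_index = {
    # (date, data kind) -> (primary tape ID, backup tape ID)
    ("1983-06-01", "analysis"): ("T04721", "T04722"),
    ("1983-06-01", "forecast"): ("T04723", "T04724"),
}

faulty_tapes = {"T04721"}  # tapes found unreadable and discarded

def locate(date: str, kind: str) -> str:
    """Return the tape ID an operator should mount, falling back to the
    backup copy when the primary tape has been discarded as faulty."""
    primary, backup = master_index[(date, kind)]
    return backup if primary in faulty_tapes else primary

print(locate("1983-06-01", "analysis"))  # backup copy, primary is faulty
print(locate("1983-06-01", "forecast"))  # primary copy
```

The point of the design, then as now, is that the user asks for data by meteorological description and the system translates that into physical storage, whether disk or a numbered tape reel on a rack.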
However, it had its problems. Data from experimental forecasts were being stored in different formats, and some formats were dependent on the hardware and software of a specific computer system. Special routines were required to access the data. An additional irritant for Member State scientists was that they were charged units from their allocation of computer resources, not only for the data retrieved, but also for the computing resources used to carry out the retrieval. These were unknown in advance,
and could be large. A consultant from the University of Reading developed a utility called “FINDATA” that got around this. A short FINDATA job was submitted, costing only a few units, that launched a GETDATA request, and the user got his data without paying the unfair overheads.
By the standards of the time, enormous amounts of information were being stored. By mid-1983, it was foreseen that “the rate of growth of archived data is such that the tape library will be completely full within two years”.
Visualisation tools were required for research and for monitoring the operational forecast. No acceptable package was available. In-house development started. This led over the years to advanced packages that were tailored to the developing demands from the research and operational users at the Centre and in the various institutes in the Member States.
In early 1982, a “GETPLOT” system for plotting fields was introduced.
Now analyses and forecasts retrieved from the archives could be easily plotted. Later, overlaying of fields, plotting observations onto plotted fields, data coverage maps and cross-sections showing vertical slices through the model atmosphere were added. By 1985, GETPLOT had been replaced by the Meteorological Applications Graphics Integrated Colour System (MAGICS), a powerful software system for plotting map contours, satellite images, wind fields, observations, symbols, streamlines, isotachs, axes, graphs, text and legends.
As the scientists in the Research Department completed enhancements to the analysis and forecasting systems, the improvements had to be introduced into the operational forecast running under EMOS. There was of course a wonderful system in place to make introduction of the changes foolproof. But of course, any system can be defeated! And it was, an embarrassing number of times. We will draw a veil over most of these, but one was memorable.
In principle, changes were made once weekly, on Tuesdays, thus avoiding weekends; the rest of the working week was available to sort out unforeseen consequences. One analyst needed to make just “one tiny change” to adjust a single archive model level on a Friday afternoon before going on vacation to an isolated telephone-free farmhouse in France for a week. This was before mobile telephones were available. It took him several weeks’ work, including re-running a number of complete forecasts, to regenerate the fields missing from the archives as a result of his “one tiny change”.