
Medium-Range Weather Prediction: The European Approach, by Austin Woods

[Page 18]

Thus, the system was multivariate — measurements of several variable quantities were used to analyse a single variable. The “first guess” or “background value” at the grid point was interpolated to all the observation points, and a “correction” to the background value was found by subtracting it from the observed value. The analysed value was then found by adding the weighted average of the corrections to the background value. The analysis was made statistically “optimal” by ensuring as far as possible that the weights took into account the relationships between the wind, temperature, pressure and so on. Further, the accuracy of the different types of observations was assessed, to ensure that each was given its proper weight.
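The weighted-correction step described above can be sketched in a toy one-dimensional setting. The Gaussian covariance model, the correlation length and all numbers below are illustrative assumptions, not the Centre’s actual statistics:

```python
import numpy as np

# Toy optimum interpolation: analyse grid-point values from nearby
# observations.  x_b: background ("first guess") at the grid points;
# y: observed values.  H interpolates the background to the observation
# locations; B and R are background- and observation-error covariances
# (here a simple Gaussian and a diagonal, respectively).

def oi_analysis(x_b, y, H, B, R):
    """Return x_b plus the optimally weighted average of the corrections y - H x_b."""
    innovation = y - H @ x_b                      # corrections at the obs points
    W = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # statistically optimal weights
    return x_b + W @ innovation

# 1-D grid of 5 points; 2 observations located at grid points 1 and 3.
grid = np.arange(5.0)
H = np.zeros((2, 5)); H[0, 1] = 1.0; H[1, 3] = 1.0
L = 1.5                                           # correlation length (illustrative)
B = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / (2 * L**2))
R = 0.25 * np.eye(2)                              # obs errors assumed uncorrelated

x_b = np.zeros(5)                                 # background value
y = np.array([1.0, -0.5])                         # observed values
x_a = oi_analysis(x_b, y, H, B, R)                # analysed values
```

The analysed values at the observed points lie between the background and the observations, and the corrections are spread to neighbouring grid points according to the assumed covariances.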

Analyses based on OI are not completely “balanced”; the mass and wind fields are not fully consistent. Consequently, if forecasts are run directly from the analyses, adjustments of the mass, temperature and wind fields are required, and these generate large-amplitude gravity wave oscillations in the first few hours of the forecast. A process called “initialization” removes these oscillations without destroying the meteorologically significant structures. Different techniques can be used in the initialization. At the Centre, Dave Williamson, a visitor from the US National Center for Atmospheric Research, and staff member Clive Temperton implemented the so-called “Non-linear Normal Mode Initialization” or NNMI. The model’s “normal modes” — mathematical idealizations that can describe the evolution of perturbations — were used to adjust the initial conditions of the model so that the unwanted high-frequency oscillations were removed from the subsequent forecast. Temperton and Williamson had the benefit of help from the visiting Danish scientist Bennert Machenhauer, the inventor of NNMI.
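The mode-filtering idea can be illustrated in the linear case. This toy sketch simply removes the eigenmodes of a linear operator above a frequency cutoff; full NNMI instead iteratively balances the fast modes against the model’s nonlinear tendencies, so this is a deliberate simplification with an invented example system:

```python
import numpy as np

# Linear normal-mode filtering, a much-simplified stand-in for NNMI:
# expand the state in the eigenmodes of a linear operator and discard the
# components whose oscillation frequencies exceed a cutoff.

def filter_fast_modes(x, A, cutoff):
    """Remove from state x the eigenmodes of A oscillating faster than cutoff."""
    freqs, modes = np.linalg.eig(A)            # columns of `modes` are normal modes
    coeffs = np.linalg.solve(modes, x)         # expand x in the mode basis
    coeffs[np.abs(freqs.imag) > cutoff] = 0.0  # zero the high-frequency modes
    return (modes @ coeffs).real               # reconstruct the filtered state

# A toy system with one slow mode pair (|frequency| = 1, like a Rossby-type
# mode) and one fast pair (|frequency| = 10, like a gravity-type mode).
A = np.block([
    [np.array([[0.0, -1.0], [1.0, 0.0]]), np.zeros((2, 2))],
    [np.zeros((2, 2)), np.array([[0.0, -10.0], [10.0, 0.0]])],
])
x0 = np.array([1.0, 0.0, 1.0, 0.0])            # projects onto both mode pairs
x_init = filter_fast_modes(x0, A, cutoff=5.0)  # fast-mode part removed
```

The filtered initial state retains the slow (meteorologically significant) component and no longer excites the fast oscillations.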

90 Chapter 8

Work progressed well in 1978 with a nine-level version of the model. The horizontal grid spacing, or resolution, used for the analysis was 3.75°. A continuous data assimilation test was run over six days of observations from the DST set. The results compared favourably with analyses from other major centres. The following year the analysis system was improved to analyse the data at the horizontal resolution of the model, 1.875°, and at 15 levels. The analysis system was ready in time for operational prediction to begin in mid-1979.

Another significant milestone was reached soon afterwards: production of the FGGE analyses began in December 1979 using the system; see Chapter 14.

Ensembles of grid points within “boxes” were used in the analysis system. It was found that there could be substantial gains in computational efficiency, with very small changes in the resulting forecasts, by (a) reducing the number of data selected for the analysis levels and for variables such as wind and temperature, and (b) reducing the area covered by the boxes.

The “incremental” approach to the analysis mentioned above was introduced late in 1980, bringing significant improvements in the modelling of global convection and in the heat transferred to the surface.

Not only the atmosphere was analysed. The earth’s surface — soil moisture, soil temperature and snow cover — also influenced the forecast, and had to be analysed. A method of analysis developed by DWD, the German Weather Service, was used as the basis for this.

Much research was now under way to ensure that the observations were checked as thoroughly as possible, and that erroneous data were identified and corrected where possible, and otherwise rejected. One particularly interesting piece of scientific detective work was the discovery of a systematic error in the data from an isolated radiosonde station: Marion Island.

Marion Island, Republic of South Africa, located in the southern Indian Ocean, 2,300 km southeast of Cape Town, is one of the most isolated places in the world. A volcanic island, it has an area of 290 km². The discovery of the island is credited to the French explorer Nicolas Marion-Dufresne in 1772. Neither he, nor later Cook in 1776, Ross in 1840, nor the Gauss expedition of 1901, was able to land because of adverse weather conditions!

South Africa established a radiosonde station there that began sending reports of the wind, temperature and humidity above the island twice each day from January 1961. Weather reports from such an isolated region, previously a data-void for meteorology, were of course extremely valuable.

The Analysis System — from OI to 4D-Var

As we have seen, the Centre’s data assimilation system uses a short-range forecast to give the background for the analysis. This background is modified to take account of the observational data. The daily reports from Marion Island were unremarkable, and were routinely assimilated into the analysis. However, looking at the monthly mean data during 1981, something odd was noticed: there were systematic differences averaging about 10° to 12° between the background wind directions and the reported wind directions. There were of course no nearby stations for comparison. This was worrying. Was there a fault in the ECMWF analysis scheme or forecast model that was unrealistically and systematically backing the wind? Thorough testing showed nothing obvious that could explain the discrepancy.





A polite query was sent to the South African Meteorological Service. An investigation showed that when the software to calculate the wind direction had been installed many years earlier, magnetic north, instead of geographic north, had been assigned as the reference for wind direction! The local operators took the necessary corrective action.
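The fix amounts to rotating the reported directions by the local magnetic declination, i.e. the angle between magnetic and geographic north. A minimal sketch; the declination values below are illustrative, not the actual value at Marion Island:

```python
# The Marion Island bug in miniature: wind directions reported relative to
# magnetic north must be rotated by the local magnetic declination before
# they refer to geographic (true) north.

def magnetic_to_true(direction_magnetic, declination_east):
    """Convert a wind direction from a magnetic- to a true-north reference.

    Both arguments are in degrees; declination is positive when magnetic
    north lies east of true north (sign convention assumed here).
    """
    return (direction_magnetic + declination_east) % 360.0

# A wind reported as 270 degrees (westerly) relative to magnetic north,
# with an assumed westerly declination of 25 degrees, is really 245 degrees
# true -- the kind of systematic offset the monthly means revealed.
true_dir = magnetic_to_true(270.0, -25.0)
```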

An intensive joint project in 1982 between scientists in the Meteorological Operations Section and in the Research Department showed that other data were having a detrimental effect on the analyses. Some observing platforms were sending persistently incorrect reports, some had large random or systematic errors, and some simply did not code their data properly according to the agreed standards! In 1985, the Centre was designated by WMO as Lead Centre for monitoring global upper-air data. In mid-1985, the Centre provided WMO with the results of monitoring surface ship and radiosonde data for the three months March to May 1985, beginning regular reporting that led to improvements in the Global Observing System of the World Weather Watch. Since then the Centre has regularly produced consolidated reports, or “suspect lists”, of observations that are consistently of low quality. Action by local operators usually follows.

Earlier, in November 1979, the Council had set up a Working Group on a future observing system. The Group, chaired by Andrew Gilchrist of the UK Meteorological Office, was asked to “assess the requirements to be met by a future observing system”. The group met during 1980, and considered “Observing System Experiments”. These are carried out with numerical models and analysis systems to investigate a variety of issues:

• Assessing how observations affect analyses.

• Planning an observing network to give sufficiently accurate analyses: what kind of observations? How far apart? What accuracy is required?

• Testing alternative observing systems to determine their cost-effectiveness, thus guiding how resources should best be allocated.
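The impact measured by such experiments can be sketched as a “data denial” comparison: run the assimilation and forecasts twice, with and without the observation type under test, and score both against verifying analyses. Everything below is synthetic placeholder data, purely to show the bookkeeping:

```python
import numpy as np

# Sketch of how an Observing System Experiment is scored.  The "truth"
# stands in for the verifying analyses; the two forecast sets stand in
# for the control (all observations) and denial (one type withheld) runs.

def rmse(forecasts, verifying_analyses):
    """Root-mean-square forecast error over all cases and grid points."""
    return float(np.sqrt(np.mean((forecasts - verifying_analyses) ** 2)))

rng = np.random.default_rng(0)
truth = rng.normal(size=(30, 100))           # 30 cases, 100 grid points

# Synthetic stand-ins: the control run is given smaller errors than the
# denial run, mimicking a beneficial observation type.
fc_control = truth + rng.normal(scale=0.8, size=truth.shape)
fc_denial = truth + rng.normal(scale=1.2, size=truth.shape)

# Positive impact means the withheld observations were improving forecasts.
impact = rmse(fc_denial, truth) - rmse(fc_control, truth)
```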


The last of these was emphasised. For the European Meteorological Services, decisions had to be made on the future observing system for the North Atlantic region as well as over Europe. Conventional systems were becoming more and more costly. New observing systems (e.g. automatic stations, buoys and satellites) would become increasingly available. The science of forecasting was advancing rapidly; it should not be hampered by deficiencies in the observations. Up to then the European Services had acted independently. It was now time for co-operative action, taking into account the commonality of interests of the services. A series of Observing System Experiments, involving scientists from four Member States, was under way by 1984.

Frédéric Delsol joined the Operations Department from Météo France at the beginning of 1982 for a four-year stint at the Centre. After studying modelling of precipitation schemes and boundary layer processes under Daniel Rousseau, he had been in charge of the avalanche-forecasting centre at Grenoble, and then had become Director of the Bordeaux regional centre. On his arrival at the Centre, he was quickly impressed by how the Centre had managed to harness the complex analysis and forecasting system to apply research ideas and results in a practical way. In Delsol’s mind, he compared it to an astronomical telescope; without the telescope, astronomers’ theories would have remained unproven. For the first time, the entire global observing system could be actively monitored in real time and erroneous data quickly and efficiently identified.

An interesting joint study between the Centre, the UK Met Office and NMC Washington in 1983 used identical sets of observational data to produce analyses and forecasts from the three systems. In some cases, the analyses were quite different. The differences were amplified in the forecasts. The research allowed identification of the best features of the different analysis systems, and indicated how the systems could be improved.

The study showed that what is happening to the weather in mid-Pacific today can affect the weather over Europe less than a week from now. The figure shows one case of a relatively small difference between the ECMWF analysis and that of NMC in mid-Pacific. The difference resulted from slightly different ways of handling satellite and weather ship data. We can see that small differences in the two-day forecasts over North America grew to larger differences over east Canada and stretching into the Atlantic by day four, and by day six, gave substantial differences in the forecasts over the North Sea and Europe, extending to Italy.

A comprehensive evaluation by David Shaw, Peter Lönnberg, Tony Hollingsworth and Per Undén identified many deficiencies in the optimum interpolation statistics, data selection, and quality control applied in the analysis. In 1984–85, major changes to the assimilation system were made to correct these deficiencies. Further research addressed the question of spreading information from the observations horizontally and vertically in the analysis, and how the information in one variable, for example wind, can be applied to another variable, for example pressure, in the multivariate analysis.

[Figure] Differences between two forecasts made with the ECMWF model. One forecast was made starting from the ECMWF analysis, the other from the analysis made by NMC Washington. “Day 0” shows the difference between the analyses. The differences increase as they move from west to east in the wind flow, from the Pacific over North America and on towards Europe. By Day 6, there are significant differences between the forecasts over the European area. Level: 300 hPa, contour interval 1 decametre, starting from 00 UTC on 18 February 1979. See Hollingsworth et al. (1985) The response of numerical weather prediction systems to FGGE level IIb data. Part I. Analyses. Quart J Roy Meteor Soc 111: 1–66.


The suite of analysis programmes, more than 90,000 lines of code, was rewritten in 1985–86 to give a new, more efficient and more flexible analysis system. We have seen above that interpolation was required from the pressure levels of the analysis to the model’s “sigma” levels. While the “incremental” approach to this interpolation had improved the situation, the new system eliminated the pressure-to-sigma interpolation entirely. Data were now interpolated directly, using a new three-dimensional multivariate analysis scheme, at the levels at which the measurements were taken, without having to interpolate to “standard” levels. The entire troposphere was now analysed at once, no longer divided into “slabs” of atmosphere. The humidity analysis was also significantly improved.
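The eliminated pressure-to-sigma step can be illustrated with a toy profile. Sigma coordinates follow the terrain, sigma = p / p_surface, so the pressure of a model level varies with surface pressure; a common choice (assumed here, not necessarily the Centre’s) is linear interpolation in log pressure. All values are illustrative:

```python
import numpy as np

# Interpolate an analysed profile from fixed pressure levels to the
# model's terrain-following sigma levels (sigma = p / p_surface),
# linearly in log(pressure).

def to_sigma_levels(field_on_p, p_levels, sigma_levels, p_surface):
    """Interpolate a profile from pressure levels to sigma levels (log-p linear)."""
    p_target = sigma_levels * p_surface   # pressure of each sigma level
    return np.interp(np.log(p_target), np.log(p_levels), field_on_p)

p_levels = np.array([300.0, 500.0, 700.0, 850.0, 1000.0])  # hPa, increasing
temp = np.array([230.0, 252.0, 268.0, 280.0, 288.0])       # K, illustrative
sigma = np.array([0.3, 0.5, 0.7, 0.85, 1.0])

# With a surface pressure of 950 hPa, each sigma level sits at a slightly
# lower pressure (higher altitude) than the matching pressure level, so
# the interpolated temperatures are slightly colder.
t_sigma = to_sigma_levels(temp, p_levels, sigma, p_surface=950.0)
```

Note that `np.interp` clamps at the profile ends, so a sigma level above the topmost pressure level simply takes the topmost value; a real scheme would extrapolate more carefully.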

Throughout these years, the real or effective horizontal resolution of the analysis was significantly below that of the forecast model; in fact, weather systems with length scales below about 500 km could not be properly analysed. The resolution of the analysis was strongly controlled by the horizontal forecast-error correlations; work began that led to an improvement in the resolution to about 300 km by July 1988.
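The link between correlations and resolution can be made concrete with a generic Gaussian correlation model (an illustrative form, not the Centre’s actual statistics): an isolated correction is spread over roughly one correlation length, so shortening that length sharpens the smallest features the analysis can draw:

```python
import numpy as np

# Illustrative Gaussian horizontal background-error correlation.  The
# analysis spreads each observation's correction over a footprint set by
# the correlation length, which therefore limits the analysable scales.

def correlation(r_km, length_km):
    """Gaussian background-error correlation at separation r_km."""
    return np.exp(-(r_km ** 2) / (2.0 * length_km ** 2))

def half_width_km(length_km):
    """Separation at which the correlation falls to one half."""
    return length_km * np.sqrt(2.0 * np.log(2.0))

# Shortening the assumed correlation length narrows the footprint and
# so sharpens the analysis increments.
broad = half_width_km(500.0)   # coarser effective resolution
sharp = half_width_km(300.0)   # finer effective resolution
```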

After a period of steady improvement in the forecasts, Burridge recalled how from the mid-1980s the scores levelled out. It seemed that a plateau had been reached in the Centre’s forecast accuracy. Even some on the Scientific Advisory Committee believed that the Centre had reached its limit; one said that it had in a sense “used up its intellectual capital” by that time. Burridge had the growing feeling that in fact the Centre’s Optimum Interpolation data assimilation system had been pushed to its limits. The many different kinds of data coming from the satellite instruments were just not being used optimally. Something needed to be done here, but it was not yet clear just what.

We will see in the next chapter how collaborative work between the Centre and Météo France, starting in 1987 with the development of an “incore” model, led to the development of what was to become known in 1992 as the “Integrated Forecast System” or IFS. Philippe Courtier, seconded to ECMWF from Météo France in Toulouse, had been investigating both “variational data assimilation” — a new technique that was to become a key part of the Centre’s system — and the potential for a “stretched” computational grid, which would allow enhanced resolution of the spectral model in places of particular interest. The former was of direct interest to both the Centre and Météo France; the latter appealed to Jean-François Geleyn for use in a model for Météo France. Toulouse was running a global model as well as a model covering a limited area. A model with variable resolution, one with an “elastic” or “stretched” grid allowing lower resolution over the Pacific, for example, and higher resolution over France, could replace both of these.
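Variational assimilation replaces the weighted-correction step of OI with the minimisation of a cost function J(x) penalising, through the inverse error covariances, the misfit to the background and to the observations. A minimal, generic 3D-Var-style sketch on a tiny state, using plain gradient descent; the matrices and sizes are illustrative, and this is in no way the IFS implementation:

```python
import numpy as np

# Minimal variational analysis sketch.  The cost function is
#   J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - H x)^T R^-1 (y - H x),
# balancing distance from the background against distance from the
# observations.  (Conventional formulations include factors of 1/2.)

def cost_and_grad(x, x_b, y, H, B_inv, R_inv):
    """Return the cost J and its gradient with respect to x."""
    db, do = x - x_b, y - H @ x
    J = db @ B_inv @ db + do @ R_inv @ do
    grad = 2.0 * (B_inv @ db - H.T @ R_inv @ do)
    return J, grad

x_b = np.zeros(3)                  # background state
y = np.array([1.0])                # single observation of component 0
H = np.array([[1.0, 0.0, 0.0]])    # observation operator
B_inv, R_inv = np.eye(3), np.eye(1)  # equal background and obs confidence

x = x_b.copy()
for _ in range(200):               # plain gradient descent to the minimum
    J, g = cost_and_grad(x, x_b, y, H, B_inv, R_inv)
    x -= 0.1 * g
```

With equal background and observation confidence, the analysed first component settles halfway between background (0) and observation (1), while unobserved components stay at the background.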


