Medium-Range Weather Prediction: The European Approach, by Austin Woods

The Analysis System — from OI to 4D-Var

In variational data assimilation one begins, as in OI, with the differences between the analysed values on the one hand, and the observed and background values on the other. Determining the adjustments to the background forecast that will minimize the sum of the weighted measures of these differences gives the analysis. The weights applied depend on estimates of the typical errors of the observations and background forecasts. They take dynamical imbalances, for example between wind and pressure fields, into account. In three-dimensional variational (3D-Var) assimilation, the differences from the observed values are somewhat artificially assumed to be valid at specific analysis times (usually the “synoptic hours” of 00, 06, 12 or 18 UTC). In four-dimensional (4D-Var) assimilation, the differences are processed at the time of each observation. The minimization therefore involves repeated model runs for the period over which observations are being assimilated, typically six or twelve hours. This clearly requires very large computing resources.
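The minimization described here can be illustrated with a toy example. The sketch below, in Python with NumPy, solves the standard variational cost function J(x) = (x - xb)^T B^-1 (x - xb) + (y - Hx)^T R^-1 (y - Hx) for a two-component state. All the numbers, and the use of the closed-form solution rather than an iterative minimization, are illustrative assumptions, not details of the ECMWF system.

```python
import numpy as np

# Toy 3D-Var: find the analysis x that minimizes
#   J(x) = (x - xb)^T B^-1 (x - xb) + (y - H x)^T R^-1 (y - H x)
# where xb is the background, y the observations, H the observation
# operator, and B, R the background- and observation-error covariances.

xb = np.array([10.0, 5.0])   # background state (e.g. two grid values)
B = np.diag([4.0, 4.0])      # background-error covariance
y = np.array([12.0])         # a single observation
H = np.array([[1.0, 0.0]])   # observes only the first state component
R = np.array([[1.0]])        # observation-error covariance

# The quadratic J has the closed-form minimizer
#   xa = xb + K (y - H xb),  with gain  K = B H^T (H B H^T + R)^-1
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)

print(xa)  # analysis drawn towards the observation, weighted by the error estimates
```

With these numbers the observed component moves from 10.0 to 11.6: the background error (4.0) outweighs the observation error (1.0), so the analysis sits closer to the observation, while the unobserved component is untouched because B here carries no cross-correlations.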

Development of 4D-Var was seen at the outset as especially promising because of its optimal use of the so-called “asynoptic” data measured continuously by satellite, and because variational assimilation in general opened the door to the direct use of radiance data from satellites that we will consider in Chapter 13.

Where did the concept of variational assimilation originate? We saw in Chapter 7 that in 1980 a scientist visiting from Russia, Dr Kontarev, gave several seminars on the adjoint method that had been developed by Prof Marchuk in 1974. This method allows computation of the sensitivity of any output parameter to any input parameter of any dynamical system at reasonable cost. Olivier Talagrand, who as we have seen developed the incremental approach to OI, followed the lectures. He returned to his institute in Paris, the Laboratoire de Météorologie Dynamique (LMD), and started working on the adjoint method in collaboration with the mathematician Xavier Le Dimet. Initial experiments with a shallow-water model were unsuccessful; gravity waves generated too much noise. However, he proposed further research to his students. One of them, Philippe Courtier, newly arrived at LMD from Météo France, started to work with a filtered model, that is, one that filtered out the unwanted effects of the gravity waves.
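The economy of the adjoint method can be seen in a toy linear model: one backward sweep with the transposed (adjoint) model yields the sensitivity of an output to every component of the input at once, where finite differences would need a forward run per component. The matrix, output weights and step count below are invented purely for illustration.

```python
import numpy as np

# Adjoint-method sketch: for a linear model step x_{n+1} = M x_n and a
# scalar output J = c . x_N, the sensitivity dJ/dx_0 is obtained by one
# backward sweep with the transpose model M^T.

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3)) * 0.5   # toy linear "dynamics"
c = np.array([1.0, 0.0, -1.0])          # weights defining the output J
steps = 4

def forward(x0):
    """Run the model forward and evaluate the output J(x0)."""
    x = x0
    for _ in range(steps):
        x = M @ x
    return c @ x

# Adjoint sweep: propagate c backwards through M^T
grad = c
for _ in range(steps):
    grad = M.T @ grad                   # grad now equals dJ/dx_0

# Check against central finite differences, one perturbed run per component
x0 = np.array([1.0, 2.0, 3.0])
eps = 1e-6
fd = np.array([(forward(x0 + eps * e) - forward(x0 - eps * e)) / (2 * eps)
               for e in np.eye(3)])
print(np.allclose(grad, fd, atol=1e-5))  # True: adjoint matches finite differences
```

For a weather model with millions of input values, the single adjoint sweep replaces millions of forward runs, which is what made the variational approach computationally conceivable.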

By 1985, Courtier and Talagrand had obtained results showing that they had tamed the gravity-wave noise. Now the possibility was opened to apply the variational technique to an operational NWP system.

Talagrand returned to the Centre in early 1987. With Courtier, now a staff member on secondment from Météo France, he started a feasibility study on the use of variational analysis in the Centre’s system. Their conclusion, that it would be more efficient to re-code the entire model than to write the adjoint of an old code, was not universally welcomed. However they persisted, with encouragement from Burridge. Their pioneering work eventually resulted in an award from the Académie des Sciences.

There was much work to be done before the benefits of the investment in variational data assimilation could be reaped. In October 1988, Lennart Bengtsson noted that “major efforts are required before this technique can be developed into a practical system”. This was true; 3D-Var did not become operational at ECMWF until January 1996 and almost ten years had elapsed before Florence Rabier put the finishing touches to the world’s first operational 4D-Var system, implemented at the Centre in November 1997.

Throughout this long period, Burridge, first as Head of Research, then as Director, “kept the faith”. He defended his research programme from those who queried the computing cost, and the overall feasibility, of 4D-Var. He was disappointed that the UK Met Office did not become involved and share the workload. For him, this was “a very tough time”. He remembered the Council as being generous in its approach; it was not overly critical when quick results were not forthcoming from the long research programme. The benefits indeed took some time to become apparent; some claimed that years of research work seemed not to be producing anything useful. Eventually, however, Burridge was pleased that his conviction had been vindicated; it was not until the mid-to-late 1990s that it became clear that the decisions of the late 1980s to work towards 4D-Var were justified.

He noted later with satisfaction that at last “it became generally recognised that the substantial forecast improvements over the following years came largely from 4D-Var”. In the next Chapter, we will see just how much forecast accuracy improved from the late 1990s.

Burridge believes that still, at the time of writing, the potential of 4D-Var has not been fully realised. He is confident that there are “one or two more days of predictability to be gained from the Centre’s forecasting system”.

The challenge remains: to exploit fully the new data types.

The Centre was at the forefront in using these kinds of data. Operational introduction of 4D-Var has followed at other centres. Jean-Noël Thépaut, one of the pioneers of pre-operational development of 4D-Var at the Centre, played a key role in the work leading to implementation at Météo France in June 2000, and Andrew Lorenc, who had returned to the UK Met Office in 1980, led the work there that brought 4D-Var into operation in October 2004.

The ECMWF data assimilation system will play an important role in studies of observing-system impact and observation-network design, aiming at optimisation of the global observing system. This international work is coordinated through WMO and through a programme called the EUMETNET Composite Observing System (EUCOS), which is run under the auspices of the Network of European Meteorological Services (EUMETNET).

Chapter 9

The Medium-Range Model

The comprehensive atmosphere-ocean-land model developed at the Centre over the years forms the basis for the Centre’s data assimilation and forecasting activities. In other Chapters, we review the Centre’s activities in analysis, wave modelling, seasonal prediction and ensemble forecasting.

Here we will review briefly the development of the main high-resolution medium-range model.

We see in Article 2 of the Convention that inter alia the objectives of the Centre shall be:

• to develop dynamic models of the atmosphere with a view to preparing medium-range weather forecasts by means of numerical methods;

• to carry out scientific and technical research directed towards improving the quality of these forecasts.

A model covering the globe would be required. As we have seen, the weather in mid-Pacific today can influence the weather over Europe five or six days later. Today’s weather south of the equator will influence the weather next week in the Northern Hemisphere. Besides, States in Europe have an interest in global weather: for ship-routeing, for offshore oil exploration in the southern Pacific and elsewhere, for expeditions to the Antarctic, and for many other activities.

In Chapter 7 we saw how the Centre prepared its first operational medium-range forecasts beginning in August 1979. For its time, the Centre’s model of the world’s atmosphere was sophisticated. It delivered five-day forecasts to the National Meteorological Services with average accuracy similar to that of the best of the two-day forecasts that had been available to them ten years earlier.

We saw that a grid-point model was used, in which the temperature, wind and humidity were predicted on a network of points, separated by about 200 km around the equator, but closer together in the east-west direction nearer the poles. The network was repeated at 15 levels between the surface, on which pressure, as well as rain- and snowfall, were predicted, and the top of the model atmosphere, which was at a height of 25 km. The lower levels were separated vertically by a few hundred metres, those aloft by a couple of kilometres. Each level had 28,800 points; the model had 432,000 grid points in total.
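The grid bookkeeping quoted above can be checked directly; this trivial sketch simply multiplies out the figures given in the text.

```python
# Grid totals as stated in the text: 28,800 points per level, 15 levels.
points_per_level = 28_800
levels = 15

total = points_per_level * levels
print(total)   # 432000 grid points in all, as the text says
```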

At the beginning, the definition of cloud in the model was perhaps by today’s standards somewhat primitive, but was nonetheless impressive. When the humidity at a grid point exceeded 100%, stratus clouds formed. Rain or snow would fall if the temperature was low enough or if there was enough liquid water. Convective or cumulus clouds were formed depending on the instability of the grid column and convergence of water vapour. Rain falling through the model atmosphere would evaporate in dry air.
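Threshold rules of this kind can be sketched in a few lines of Python. The function names, thresholds and units below are invented for illustration only; they are not taken from the actual ECMWF code, which treated these processes in far more detail.

```python
# A highly simplified sketch of rule-based cloud and precipitation
# diagnosis of the kind described in the text. All thresholds are
# invented illustrative values.

def diagnose_stratus(relative_humidity: float) -> bool:
    """Stratus forms where grid-point humidity exceeds saturation (100%)."""
    return relative_humidity > 1.0

def precipitation_type(temperature_c: float, liquid_water: float,
                       threshold: float = 0.1) -> str:
    """Precipitate if enough liquid water is present; snow when cold."""
    if liquid_water < threshold:
        return "none"
    return "snow" if temperature_c <= 0.0 else "rain"

print(diagnose_stratus(1.05))          # True  (105% humidity: stratus forms)
print(precipitation_type(-5.0, 0.5))   # snow
print(precipitation_type(10.0, 0.5))   # rain
```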

Short-wave radiation incoming from the sun, long-wave infrared radiation from the earth to space, and multiple scattering of radiation between cloud layers, were all calculated. Absorption of heat by water vapour, ozone and carbon dioxide was taken into account as well. Computing the effects of radiation took lots of computer power, and so was done only twice each forecast day at the start.

The laws of physics tell us what moves the air around, what makes it warmer or cooler, and what makes clouds give rain or snow. The model was based on the gas law for a mixture of dry air and water vapour, the laws of conservation of mass and water, the equation for momentum and the first law of thermodynamics. Heating and cooling of the atmosphere by radiation, the turbulent transfer of heat, moisture and momentum, the thermodynamic effects of evaporation, sublimation and condensation, and the formation of rain and snow were all described.

Starting from the analysis at noon, a forecast was made of the tiny changes in wind speed and direction, temperature, and humidity at each of the 432,000 grid points for 15 minutes later, at 12.15. This gave a new starting point. A new forecast was made for 12.30, and so on, until after 960 15-minute time steps the forecast to ten days was completed. For each step, seven numbers — the temperature, wind and so on — were required at two time steps at each grid point: a total of six million numbers. The fields were stored on four disks of the CRAY-1. All the data for a vertical slice of atmosphere above a line of latitude were moved from the disks to the CRAY-1 memory. The CRAY-1 would perform the calculations for this slice, return the results to disk, and then move on to the next. About 50 million calculations were made each second, and the forecast to ten days took a little less than four hours. Although the analysis cycles were run over weekends, forecasts were run only from Monday to Friday. Weekend running of the forecast began in August 1980.
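The step counting and storage arithmetic above can be checked explicitly; the breakdown into seven variables and two time levels follows the text, and the exact product comes out at just over the “six million numbers” it quotes.

```python
# Time-stepping and storage arithmetic from the text, checked explicitly.
step_minutes = 15
forecast_days = 10
steps = forecast_days * 24 * 60 // step_minutes
print(steps)        # 960 fifteen-minute steps to reach ten days

grid_points = 432_000
variables = 7       # temperature, wind components, humidity, etc.
time_levels = 2     # each step needs values at two time levels
numbers = grid_points * variables * time_levels
print(numbers)      # 6048000: roughly the six million numbers the text quotes
```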

Development of the model from scratch to operational implementation was an achievement that was a source of pride to Wiin-Nielsen, and indeed to all the staff of the Centre. David Burridge had been given the task of designing the numerical scheme for the model. Burridge, Jan Haseler from the UK, Zavisa Janic from Yugoslavia and others made their first experiments, making forecasts from low-resolution “Data Systems Test” analyses, which had been compiled for FGGE. It was soon evident that the model had the benefit of a robust and stable numerical scheme. Tony Hollingsworth, Head of the Physical Aspects section, with Jean-François Geleyn from France, Michael Tiedtke from Germany and Jean-François Louis from Belgium, were largely responsible for the model physics.

A research team including David Burridge, Jan Haseler, David Dent, Michael Tiedtke and Rex Gibson went to Chippewa Falls, the Cray factory, in mid-1977 on a memorable trip. In between sometimes heated discussions between Tiedtke and Gibson, who did not always find it easy to see eye-to-eye, with Burridge trying to keep the peace, Dent calmly typing away at the console, and Haseler getting some sleep under the table, the team managed to complete a one-day global “forecast” on a CRAY-1 at a speed about ten times faster than that of the CDC 6600.

By the end of the year, more predictions to ten days were being run. The scientists of the Research Department would run many thousands of numerical experiments in the years to come. Work was easier when the staff moved to Shinfield Park in late 1978, where the Centre’s CRAY-1 and CDC Cyber 175 had been installed in the Computer Hall.

Broadly, the work on modelling the atmosphere numerically to give a forecast can be separated into:

• the analysis (or assimilation of the observations to give the initial fields from which the prediction starts); this is dealt with in the previous Chapter;

• the “physical aspects” of the model, such as modelling the processes that cause condensation of water to form clouds, rain, and snow; the consequent generation or absorption of heat, friction as the wind blows close to the surface and so on; and

• the “numerical aspects”, including modelling the movement of parcels of air, heating of air by compression and cooling by expansion, what sort of grid is best, or even if the calculations should be made not on a grid, but instead using continuous waves in a “spectral” version of the model.

Within this broad-brush description, other essential work was required. Systems were developed to diagnose the model’s behaviour, accuracy and performance. Basic questions had to be answered. Given the power of the CRAY-1, what was best: to increase the model resolution, i.e. bring the grid points closer together, or to make the physics more sophisticated? What was the best way to eliminate from the calculations those things not required for the forecast? For example, the atmosphere is suffused with gravity waves, most of which have little influence on what tomorrow’s weather will be like. A numerical model will use up lots of resources modelling these unless they are somehow eliminated.
