Weather and Climate and the Power Sector: Needs, Recent Developments and Challenges

2014 ◽  
pp. 379-398 ◽  
Author(s):  
Laurent Dubus


Author(s):  
Bryan Lim ◽  
Stefan Zohren

Numerous deep learning architectures have been developed to accommodate the diversity of time-series datasets across different domains. In this article, we survey common encoder and decoder designs used in both one-step-ahead and multi-horizon time-series forecasting—describing how temporal information is incorporated into predictions by each model. Next, we highlight recent developments in hybrid deep learning models, which combine well-studied statistical models with neural network components to improve pure methods in either category. Lastly, we outline some ways in which deep learning can also facilitate decision support with time-series data. This article is part of the theme issue ‘Machine learning for weather and climate modelling’.
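As a concrete illustration of the one-step-ahead setting surveyed above, the following minimal sketch (a hypothetical example in PyTorch, not an architecture from the article) encodes a window of past observations with an LSTM encoder and maps the final hidden state to a single-step prediction:

```python
# Minimal one-step-ahead forecaster: LSTM encoder + linear decoder.
# A hypothetical sketch, not any specific model from the survey.
import torch
import torch.nn as nn

class OneStepForecaster(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 32):
        super().__init__()
        # Encoder: summarizes the input window into a hidden state.
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Decoder: maps the last hidden state to the next value.
        self.decoder = nn.Linear(hidden_size, 1)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, time, features)
        _, (h_last, _) = self.encoder(window)
        return self.decoder(h_last[-1])      # (batch, 1)

model = OneStepForecaster(n_features=1)
x = torch.randn(8, 24, 1)                    # 8 series, 24 past time steps
y_hat = model(x)                             # one-step-ahead forecasts
```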


2017 ◽  
Vol 98 (3) ◽  
pp. 565-588 ◽  
Author(s):  
Judith Berner ◽  
Ulrich Achatz ◽  
Lauriane Batté ◽  
Lisa Bengtsson ◽  
Alvaro de la Cámara ◽  
...  

Abstract The last decade has seen the success of stochastic parameterizations in short-term, medium-range, and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to represent model inadequacy better and to improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing long-standing climate biases and is relevant for determining the climate response to external forcing. This article highlights recent developments from different research groups that show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface, and cryosphere of comprehensive weather and climate models 1) gives rise to more reliable probabilistic forecasts of weather and climate and 2) reduces systematic model bias. We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics, and turbulence is reviewed; its relevance for the climate problem is demonstrated; and future research directions are outlined.
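To make the idea concrete, the toy sketch below (an illustrative example, not any operational scheme) perturbs a deterministic parameterized tendency multiplicatively with an AR(1) noise pattern, in the spirit of stochastically perturbed parameterization tendency (SPPT) schemes; the tendency function, amplitude, and autocorrelation are all assumed values:

```python
# Toy SPPT-style perturbation: multiply the deterministic parameterized
# tendency by (1 + r), where r evolves as a first-order autoregressive process.
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_points = 100, 64
phi, sigma = 0.95, 0.3              # autocorrelation, noise amplitude (assumed)

def parameterized_tendency(state):
    # Stand-in for a deterministic physics parameterization (assumed form).
    return -0.1 * state

state = rng.standard_normal(n_points)
r = np.zeros(n_points)
for _ in range(n_steps):
    # Evolve the perturbation pattern; sqrt(1 - phi**2) keeps variance fixed.
    r = phi * r + sigma * np.sqrt(1.0 - phi**2) * rng.standard_normal(n_points)
    # Clip so the multiplicative factor (1 + r) stays non-negative.
    tendency = (1.0 + np.clip(r, -1.0, 1.0)) * parameterized_tendency(state)
    state = state + tendency        # unit time step
```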


2019 ◽  
pp. 45-63
Author(s):  
Krishna AchutaRao ◽  
Friederike Otto

Attribution of observed changes in long-term climate to anthropogenic and other external forcings has been a mainstay of many global assessments. Evidence of the role of anthropogenic forcings in the changing climate over the Indian region has been growing in recent years. Recent developments in event attribution techniques now make it possible to link global warming to individual weather and climate-related events. While a large number of event-attribution studies of extreme events from around the globe exist, very few have been carried out over India. In this chapter, a review of available event-attribution studies as well as studies that address attribution of long-term climate change over India is presented.
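A quantity common to the event-attribution studies reviewed here is the probability ratio PR = p1/p0 (with the fraction of attributable risk FAR = 1 − p0/p1), comparing the probability of exceeding an event threshold in a "factual" ensemble with anthropogenic forcing against a "counterfactual" one without it. A minimal sketch with synthetic data:

```python
# Probability ratio and fraction of attributable risk from two ensembles.
# All numbers are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
factual = rng.normal(loc=31.0, scale=1.5, size=10_000)         # with forcing
counterfactual = rng.normal(loc=30.0, scale=1.5, size=10_000)  # without

threshold = 34.0                  # event definition, e.g. a heatwave threshold
p1 = np.mean(factual > threshold)
p0 = np.mean(counterfactual > threshold)

PR = p1 / p0                      # probability ratio
FAR = 1.0 - p0 / p1               # fraction of attributable risk
print(f"p1={p1:.4f}, p0={p0:.4f}, PR={PR:.2f}, FAR={FAR:.2f}")
```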


Author(s):  
C. Colliex ◽  
P. Trebbia

The physical foundations for the use of electron energy loss spectroscopy for analytical purposes now seem rather well established and have been discussed extensively in recent publications. In this brief review we intend only to mention the most recent developments in this field that have come to our knowledge. We also set out some lines of discussion to define more clearly the limits of this analytical technique in materials science problems.

The spectral information carried in both the low (0 < ΔE < 100 eV) and high (ΔE > 100 eV) energy regions of the loss spectrum is capable of providing quantitative results. Spectrometers have therefore been designed to work with all kinds of electron microscopes and to cover large energy ranges for the detection of inelastically scattered electrons (for instance, the L-edge of molybdenum at 2500 eV has been measured by van Zuylen with 80 kV primary electrons). It is rather easy to fit post-specimen magnetic optics to a STEM, but Crewe has recently underlined that great care should be devoted to optimizing the collecting power and the energy resolution of the whole system.
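Quantitative use of core-loss edges of the kind mentioned above conventionally rests on stripping the background beneath an edge. The generic sketch below (the standard inverse-power-law method, not code from this paper) fits B(E) = A·E^(−r) in a pre-edge window and integrates the background-subtracted counts over a post-edge window:

```python
# Power-law background subtraction for a core-loss edge (generic EELS method).
import numpy as np

def edge_signal(energy, counts, fit_win, int_win):
    """Fit B(E) = A * E**(-r) over fit_win (pre-edge), extrapolate it under
    the edge, and integrate the background-subtracted counts over int_win."""
    m = (energy >= fit_win[0]) & (energy < fit_win[1])
    # Linear least squares in log-log space: log B = log A - r log E.
    slope, logA = np.polyfit(np.log(energy[m]), np.log(counts[m]), 1)
    background = np.exp(logA) * energy ** slope     # slope = -r (negative)
    n = (energy >= int_win[0]) & (energy < int_win[1])
    return np.trapz(counts[n] - background[n], energy[n])

# Synthetic example: power-law background plus an edge jump near 532 eV.
E = np.linspace(400, 700, 600)
counts = 1e9 * E ** -3.0
counts += np.where(E >= 532.0, 2e8 * E ** -3.0, 0.0)
signal = edge_signal(E, counts, fit_win=(420, 520), int_win=(532, 632))
```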


Author(s):  
Kent McDonald

At the light microscope level, recent developments in and interest in antibody technology have permitted the localization of certain non-microtubule proteins within the mitotic spindle, e.g., calmodulin, actin, intermediate filaments, protein kinases and various microtubule-associated proteins. Also, the use of fluorescent probes like chlorotetracycline suggests the presence of membranes in the spindle. Localization of non-microtubule structures in the spindle at the EM level has been less rewarding. Some mitosis researchers, e.g., Forer, have maintained that actin is involved in mitotic movements, though the bulk of evidence argues against this interpretation. Others suggest that a microtrabecular network such as that found in chromatophore granule movement might be a possible force generator, but there is little evidence for or against this view. At the level of regulation of spindle function, Harris and more recently Hepler have argued for the importance of studying spindle membranes. Hepler also believes that membranes might play a structural or mechanical role in moving chromosomes.


Author(s):  
G.Y. Fan ◽  
J.M. Cowley

In recent developments, the ASU HB5 has been modified so that the timing, positioning, and scanning of the finely focused electron probe can be entirely controlled by a host computer. This has made possible an asynchronous handshake between the HB5 STEM and the image processing system, which consists of a host computer (PDP 11/34), a DeAnza image processor (IP 5000) interfaced with a low-light-level TV camera, an array processor (AP 400), and various peripheral devices. This greatly facilitates the pattern recognition technique initiated by Monosmith and Cowley. Software called NANHB5 is under development which, instead of employing a set of photodiodes to detect strong spots on a TV screen, uses various software techniques, including on-line fast Fourier transform (FFT), to recognize patterns of greater complexity, taking advantage of the sophistication of our image processing system and the flexibility of computer software.
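In the spirit of the on-line FFT approach described above (a generic sketch, not the NANHB5 software), strong periodicities in a digitized frame can be located in software by picking peaks in its power spectrum rather than detecting spots with hardware:

```python
# Locate strong periodicities in an image via its 2D power spectrum,
# replacing hardware spot detection with software peak picking.
import numpy as np

def spot_positions(image, n_peaks=8):
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    spectrum[spectrum.shape[0] // 2, spectrum.shape[1] // 2] = 0.0  # drop DC
    flat = np.argsort(spectrum, axis=None)[-n_peaks:]       # brightest bins
    return np.column_stack(np.unravel_index(flat, spectrum.shape))

# Test pattern with two spatial frequencies; real input would be a TV frame.
img = np.fromfunction(lambda y, x: np.sin(2*np.pi*x/8) + np.sin(2*np.pi*y/16),
                      (128, 128))
print(spot_positions(img, n_peaks=4))    # pixel indices of the strongest spots
```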


Author(s):  
William Krakow ◽  
David A. Smith

Recent developments in specimen preparation, imaging and image analysis together permit the experimental determination of the atomic structure of certain simple grain boundaries in metals such as gold. Single-crystal, ∼125 Å thick, (110)-oriented gold films are vapor deposited onto ∼3000 Å of epitaxial silver on (110)-oriented, cut and polished rock salt substrates. Bicrystal gold films are then made by first removing the silver-coated substrate and then placing in contact two suitably misoriented pieces of the gold film on a gold grid. Controlled heating in a hot stage first produces twist boundaries, which then migrate, so reducing the grain boundary area, to give mixed boundaries and finally tilt boundaries perpendicular to the foil. These specimens are well suited to investigation by high-resolution transmission electron microscopy.


Author(s):  
W.J. de Ruijter ◽  
P. Rez ◽  
David J. Smith

There is growing interest in the on-line use of computers in high-resolution electron microscopy, which should reduce the demands on highly skilled operators and thereby extend the range of the technique. An on-line computer could obviously perform routine procedures currently done by hand, or else facilitate automation of various restoration, reconstruction and enhancement techniques. These techniques are slow and cumbersome at present because of the need for recording micrographs and off-line processing. In low-resolution microscopy (most biological applications), the primary incentive for automation and computer image analysis is to create a user-friendly instrument with standard programmed procedures. In HREM (materials research), computer image analysis should lead to better utilization of the microscope. Instrumental developments (improved lens design and higher accelerating voltages) have improved the interpretable resolution to the level of atomic dimensions (approximately 1.6 Å), and further improvements in instrumental resolution should become feasible in the near future.
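One routine procedure amenable to such on-line automation is focusing. A simple criterion (a hypothetical sketch, not the authors' method) scores each digitized frame by the high-frequency content of its power spectrum and keeps the defocus setting that maximizes it:

```python
# Score image sharpness by spectral power outside a low-frequency radius;
# an automated focus routine would maximize this over trial defocus settings.
import numpy as np

def sharpness(image, r_cut=0.1):
    f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    ny, nx = image.shape
    y, x = np.ogrid[-ny//2:ny//2, -nx//2:nx//2]
    radius = np.sqrt((y / ny) ** 2 + (x / nx) ** 2)   # normalized frequency
    return np.sum(np.abs(f[radius > r_cut]) ** 2)

# best_frame = max(frames, key=sharpness)   # frames: images at trial defoci
```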


Author(s):  
S.J. Krause ◽  
W.W. Adams

Over the past decade low voltage scanning electron microscopy (LVSEM) of polymers has evolved from an interesting curiosity into a powerful analytical technique. This development has been driven by improved instrumentation and, in particular, reliable field emission gun (FEG) SEMs. The usefulness of LVSEM has also grown because of an improved theoretical and experimental understanding of sample-beam interactions and because of advances in sample preparation and operating techniques. This paper will review progress in polymer LVSEM and present recent results and developments in the field.

In the early 1980s a new generation of SEMs produced beam currents that were sufficient to allow imaging at low voltages, from 5 kV down to 0.5 kV. Thus, for the first time, it became possible to routinely image uncoated polymers at voltages below their negative charging threshold, the "second crossover" E2 (Fig. 1). LVSEM also improved contrast and reduced beam damage in metal sputter-coated polymers. Unfortunately, resolution was limited to a few tenths of a micron by the low brightness and chromatic aberration of thermal electron emission sources.
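The "second crossover" E2 is the higher of the two beam energies at which the total electron yield equals one; above the yield maximum the sample begins to charge negatively again. Given any measured or modeled yield curve, E2 can be located numerically. A sketch with a generic single-peak yield model (illustrative parameters, not data for any particular polymer):

```python
# Locate the second crossover E2, where the total electron yield falls back
# to unity. The yield model below is a generic illustrative form, not data.
import numpy as np
from scipy.optimize import brentq

def total_yield(E_keV, sigma_max=2.0, E_max=0.8):
    # Generic single-peak yield curve: rises below E_max, falls above it.
    x = E_keV / E_max
    return sigma_max * x * np.exp(1.0 - x)

# E2 lies above the yield maximum; imaging below E2 avoids negative charging.
E2 = brentq(lambda E: total_yield(E) - 1.0, 0.8, 20.0)
print(f"E2 ≈ {E2:.2f} keV for this illustrative yield curve")
```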

