The impact of inhomogeneous subgrid clumping on cosmic reionization

2019
Vol 491 (2)
pp. 1600-1621
Author(s):
Yi Mao
Jun Koda
Paul R Shapiro
Ilian T Iliev
Garrelt Mellema
...  

Abstract. Cosmic reionization was driven by the imbalance between early sources and sinks of ionizing radiation, both of which were dominated by small-scale structure and are thus usually treated in cosmological reionization simulations by subgrid modelling. The recombination rate of intergalactic hydrogen is customarily boosted by a subgrid clumping factor, 〈n²〉/〈n〉², which corrects for unresolved fluctuations in gas density n on scales below the grid spacing of coarse-grained simulations. We investigate in detail the impact of this inhomogeneous subgrid clumping on reionization and its observables, as follows: (1) Previous attempts generally underestimated the clumping factor because of insufficient mass resolution. We perform a high-resolution N-body simulation that resolves haloes down to the pre-reionization Jeans mass to derive the time-dependent, spatially varying local clumping factor and a fitting formula for its correlation with local overdensity. (2) We then perform a large-scale N-body and radiative transfer simulation that accounts for this inhomogeneous subgrid clumping by applying this clumping factor-overdensity correlation. Boosting recombination significantly slows the expansion of ionized regions, which delays completion of reionization and suppresses 21 cm power spectra on large scales in the later stages of reionization. (3) We also consider a simplified prescription in which the globally averaged, time-evolving clumping factor from the same high-resolution N-body simulation is applied uniformly to all cells in the reionization simulation, instead. Observables computed with this model agree fairly well with those from the inhomogeneous clumping model, e.g. predicting 21 cm power spectra to within 20 per cent error, suggesting it may be a useful approximation.
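The clumping factor defined in this abstract is a simple second-moment statistic of the gas density field. A minimal sketch of how it is computed (the toy density arrays are illustrative assumptions, not the paper's simulation data):

```python
import numpy as np

def clumping_factor(density):
    """Subgrid clumping factor C = <n^2> / <n>^2 for a gas density array.

    C = 1 for a uniform field and C > 1 for any inhomogeneous field;
    the cell's recombination rate is boosted by this factor.
    """
    n = np.asarray(density, dtype=float)
    return np.mean(n ** 2) / np.mean(n) ** 2

# Toy example: a few dense cells raise <n^2> much faster than <n>^2.
uniform = np.ones(64)
clumpy = np.ones(64)
clumpy[:8] = 9.0
print(clumping_factor(uniform))  # 1.0
print(clumping_factor(clumpy))   # 2.75
```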

2017
Vol 10 (3)
pp. 1383-1402
Author(s):
Paolo Davini
Jost von Hardenberg
Susanna Corti
Hannah M. Christensen
Stephan Juricke
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
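For readers unfamiliar with stochastic parameterisation, the core idea of SPPT-style schemes (multiplying physics tendencies by a smoothly evolving random pattern) can be sketched as follows; the pattern size, amplitude, and AR(1) parameters are illustrative assumptions, not the SPHINX or EC-Earth configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_tendency(tendency, pattern, amplitude=0.5):
    """Multiply a physics tendency field by (1 + a * r) with random pattern r."""
    return tendency * (1.0 + amplitude * pattern)

def ar1_update(pattern, phi=0.95, sigma=0.1):
    """Evolve the random pattern in time as a first-order autoregressive process."""
    return phi * pattern + sigma * rng.standard_normal(pattern.shape)

pattern = np.zeros((4, 4))
tendency = np.full((4, 4), 2.0)  # e.g. a heating rate in K/day
for _ in range(10):              # spin up the pattern over a few time steps
    pattern = ar1_update(pattern)
perturbed = perturb_tendency(tendency, pattern)
print(perturbed.mean())
```

With a zero pattern the tendency is returned unchanged, so the scheme perturbs the model symmetrically around its deterministic physics.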




2016
Vol 9 (11)
pp. 4185-4208
Author(s):
Reindert J. Haarsma
Malcolm J. Roberts
Pier Luigi Vidale
Catherine A. Senior
Alessio Bellucci
...  

Abstract. Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.


2021
Author(s):
Alessio Domeneghetti
Antonio Leonardi
Oliver E. J. Wing
Francesca Carisi
Armando Brath

The execution of large-scale (i.e., continental or global) hydraulic modeling is nowadays a reality thanks to increasing computational capacity, data availability, and understanding of essential physical dynamics. Such achievements typically come with a compromise in model resolution (the finest being a few tens of meters, with a coarsened representation of the terrain) and, thus, in the accuracy with which the topographic peculiarities of flood-prone areas are represented. Nevertheless, the experience gained observing the dynamics of past inundations highlights the role of small-scale topographic features (e.g., minor embankments, road decks, railways) in driving the flow paths. Recent advances in the automated identification of flood defenses from high-resolution digital elevation models paved the way to including hydraulically relevant features (e.g., main levees) while preserving a model resolution suitable for large-scale applications (Wing et al., 2020).
The present study extends this approach to flood-prone areas by investigating how the automatic detection of minor topographic discontinuities can enhance the estimation of flood dynamics in large-scale models. Taking advantage of high-resolution topographic data (i.e., 1-2 m), the approach automatically detects hydraulically relevant features and preserves their height while coarsening the resolution of the terrain used in the hydraulic model. The impact of this approach on inundation dynamics is tested on three case studies that recently experienced riverine flooding: the Secchia and Enza rivers (2014 and 2017, respectively; Italy) and Des Moines (Iowa, USA). The results confirm the relevance of small-scale topographic features, which, when considered, ensure a close correspondence to observations and local models.
A key strength of the presented approach is that this performance is achieved without requiring high grid resolutions and, thus, without affecting the overall computational costs.
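A minimal sketch of the coarsening idea described above: average a high-resolution DEM into coarse cells, but keep the crest height wherever a cell contains a flagged hydraulic feature. The block size, toy DEM, and feature mask are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def coarsen_dem(dem, feature_mask, block=4):
    """Coarsen a DEM by block-averaging, preserving crest heights of features."""
    ny, nx = dem.shape
    cy, cx = ny // block, nx // block
    blocks = dem[:cy * block, :cx * block].reshape(cy, block, cx, block)
    masks = feature_mask[:cy * block, :cx * block].reshape(cy, block, cx, block)
    mean = blocks.mean(axis=(1, 3))       # default: average elevation per cell
    peak = blocks.max(axis=(1, 3))        # crest height within each cell
    has_feature = masks.any(axis=(1, 3))  # cells containing a flagged feature
    return np.where(has_feature, peak, mean)

dem = np.zeros((8, 8))
dem[3, :] = 2.5                 # a thin embankment one pixel wide
mask = dem > 2.0                # flag it as hydraulically relevant
coarse = coarsen_dem(dem, mask)
print(coarse)
```

Plain averaging would dilute the one-pixel embankment to a fraction of its height; preserving the crest keeps it as a barrier in the coarse grid.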


2019
Vol 15 (S359)
pp. 312-317
Author(s):  
Francoise Combes

Abstract. Gas fueling of AGN (Active Galactic Nuclei) is now traceable at high resolution with ALMA (Atacama Large Millimeter Array) and NOEMA (NOrthern Extended Millimeter Array). Dynamical mechanisms are essential to exchange angular momentum and drive the gas to the supermassive black hole. While at 100 pc scale the gas is sometimes stalled in nuclear rings, recent observations reaching 10 pc scale (50 mas) may bring smoking-gun evidence of fueling within a randomly oriented nuclear gas disk. AGN feedback is also observed, in the form of narrow and collimated molecular outflows, which point towards the radio mode, or entrainment by a radio jet. Precession has been observed in a molecular outflow, indicating precession of the radio jet. One of the best candidates for the origin of this precession is the Bardeen-Petterson effect at small scale, which exerts a torque on the accreting material and produces an extended disk warp. The misalignment between the inner and large-scale disk enhances the coupling of the AGN feedback, since the jet sweeps a large part of the molecular disk.


Sensors
2018
Vol 18 (10)
pp. 3232
Author(s):
Yan Liu
Qirui Ren
Jiahui Geng
Meng Ding
Jiangyun Li

Efficient and accurate semantic segmentation is the key technique for automatic remote sensing image analysis. While there have been many segmentation methods based on traditional hand-crafted feature extractors, it is still challenging to process high-resolution and large-scale remote sensing images. In this work, a novel patch-wise semantic segmentation method with a new training strategy based on fully convolutional networks is presented to segment common land resources. First, to handle the high-resolution images, they are split into local patches and a patch-wise network is built. Second, training data are preprocessed in several ways to meet the specific characteristics of remote sensing images, i.e., color imbalance, object rotation variations and lens distortion. Third, a multi-scale training strategy is developed to solve the severe scale-variation problem. In addition, the impact of a conditional random field (CRF) on precision is studied. The proposed method was evaluated on a dataset collected from a capital city in West China with the Gaofen-2 satellite. The dataset contains ten common land resources (grassland, road, etc.). The experimental results show that the proposed algorithm achieves 54.96% mean intersection over union (MIoU) and outperforms other state-of-the-art methods in remote sensing image segmentation.
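The MIoU score reported above is a per-class intersection-over-union averaged over classes. A minimal sketch of the metric (the toy label arrays are illustrative assumptions, not the Gaofen-2 dataset):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection over union across classes present in either map."""
    ious = []
    for c in range(num_classes):
        inter = np.sum((pred == c) & (target == c))
        union = np.sum((pred == c) | (target == c))
        if union > 0:  # skip classes absent from both prediction and target
            ious.append(inter / union)
    return float(np.mean(ious))

target = np.array([0, 0, 1, 1])
pred = np.array([0, 1, 1, 1])
# class 0: IoU = 1/2; class 1: IoU = 2/3; mean = 7/12
print(mean_iou(pred, target, num_classes=2))
```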


2017
Vol 10 (5)
pp. 2031-2055
Author(s):
Thomas Schwitalla
Hans-Stefan Bauer
Volker Wulfmeyer
Kirsten Warrach-Sagi

Abstract. Increasing computational resources and the demands of impact modellers, stakeholders, and society call for seasonal and climate simulations at convection-permitting resolution. So far such a resolution is only achieved with a limited-area model, whose results are impacted by the zonal and meridional boundaries. Here, we present the setup of a latitude-belt domain that reduces disturbances originating from the western and eastern boundaries and therefore allows for studying the impact of model resolution and physical parameterization. The Weather Research and Forecasting (WRF) model coupled to the NOAH land-surface model was operated during July and August 2013 at two different horizontal resolutions, namely 0.03° (HIRES) and 0.12° (LOWRES). Both simulations were forced by the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis at the northern and southern domain boundaries, and by the high-resolution Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) data at the sea surface. The simulations are compared to the operational ECMWF analysis for the representation of large-scale features. To analyze the simulated precipitation, the operational ECMWF forecast, the CPC MORPHing (CMORPH) data, and the ENSEMBLES gridded observation precipitation data set (E-OBS) were used as references. Analyzing pressure, geopotential height, wind, and temperature fields as well as precipitation revealed (1) a benefit from the higher resolution in terms of reduced monthly biases and root mean square errors and an improved Pearson skill score, and (2) deficiencies in the physical parameterizations leading to notable biases in distinct regions, such as the polar Atlantic for the LOWRES simulation, and the North Pacific and Inner Mongolia for both resolutions. In summary, the application of a latitude belt at convection-permitting resolution shows promising results that are beneficial for future seasonal forecasting.
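The verification statistics mentioned above (monthly bias, root mean square error, and a Pearson correlation) are standard measures; a minimal sketch with toy fields (the arrays are illustrative assumptions, not WRF or ECMWF output):

```python
import numpy as np

def verify(model, analysis):
    """Return bias, RMSE, and Pearson correlation of a model field vs. analysis."""
    diff = model - analysis
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    corr = np.corrcoef(model.ravel(), analysis.ravel())[0, 1]
    return bias, rmse, corr

analysis = np.array([1.0, 2.0, 3.0, 4.0])
model = analysis + 0.5  # a uniform warm bias, perfectly correlated
bias, rmse, corr = verify(model, analysis)
print(bias, rmse, corr)  # 0.5 0.5 ~1.0
```

A uniform offset gives equal bias and RMSE with perfect correlation; spatially varying errors raise RMSE above the absolute bias and lower the correlation.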


Author(s):  
I. Smyrnov

Rural tourism is now seen as an important direction of development of the regional economy. From the perspective of sustainable development, rural tourism affects the economic, social and environmental aspects of the regional and local economy. Rural tourism is closely linked with agrotourism, eco-tourism, nature tourism and so on. Sustainable rural tourism can be realized by applying logistic, geographic and marketing approaches as components of sustainable development strategies. The logistic approach is determined by the logistic potential of the resource base of rural tourism and the appropriate regulation of tourist flows. In this context, the article uses the concept of tourism capacity, i.e. the capacity of the resource base of rural tourism. The problem of defining tourism pressure on the resource base of rural tourism, particularly on natural landscapes, is discussed. Unlike the environmental and recreation sciences, which stop at defining the capacity of the tourism resource base, tourism logistics compares this figure with the existing tourist flows and accordingly determines a safe way of managing tourism to ensure its sustainable nature. It is shown that these strategies boil down to two basic types: further development of tourism in a particular area, or limiting such activities to conserve the resource base of tourism. Recreational (travel) load is the indicator that reflects the impact of tourism on the resource base of tourism (especially the landscape complex), expressed as the number of tourists or tourist-days per unit area or per tourist site for a certain period of time (day, month, season, year). There are actual, allowable (maximum) and destructive (dangerous) levels of travel load; the latter can destroy a recreational area or the resource base of rural tourism. Thus, depending on the intensity of use, the resource base of rural tourism may change in line with tourist consumption.
Large numbers of tourists affect the entire range of recreational destinations and their individual components. The most vulnerable part of the environment in this sense is vegetation, though significant changes may also occur to soil, water bodies, air and so on. The geographic dimension of sustainable rural tourism development includes the concept of zoning, i.e. dividing the territory proposed for rural tourism into several zones with different modes of tourist use, from a total ban (in protected areas) to complete freedom, with transitional stages involving various degrees of limitation on the development of rural tourism. The marketing approach applies R. Butler's tourism area life cycle curve to the development of rural tourism destinations, distinguishing stages such as exploration, involvement, development, consolidation, stagnation (also called saturation), and rejuvenation or decline. Models are presented that link the stage of tourist development of the resource base (per the Butler curve), the intensity of tourism consumption, the magnitude of such impacts (e.g. weak (dispersed) impact at large scale, strong (concentrated) impact at large scale, strong (concentrated) impact at small scale, weak (dispersed) impact at small scale), and the dynamics of tourism development in the territory.
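The travel-load indicator defined above is tourist-days per unit area per period, compared against allowable and destructive thresholds. A minimal sketch (all numbers are illustrative assumptions, not values from the article):

```python
def travel_load(tourist_days, area_ha):
    """Recreational load: tourist-days per hectare for the period considered."""
    return tourist_days / area_ha

def load_status(actual, allowable, destructive):
    """Classify the actual load against the allowable and destructive thresholds."""
    if actual <= allowable:
        return "sustainable"
    if actual <= destructive:
        return "over maximum allowable"
    return "destructive"

load = travel_load(tourist_days=1200, area_ha=400)  # 3.0 tourist-days/ha
print(load, load_status(load, allowable=5.0, destructive=10.0))
```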


2016
Vol 144 (4)
pp. 1407-1421
Author(s):  
Michael L. Waite

Abstract Many high-resolution atmospheric models can reproduce the qualitative shape of the atmospheric kinetic energy spectrum, which has a power-law slope of −3 at large horizontal scales that shallows to approximately −5/3 in the mesoscale. This paper investigates the possible dependence of model energy spectra on the vertical grid resolution. Idealized simulations forced by relaxation to a baroclinically unstable jet are performed for a wide range of vertical grid spacings Δz. Energy spectra are converged for Δz ≤ 200 m but are very sensitive to resolution for 500 m ≤ Δz ≤ 2 km. The nature of this sensitivity depends on the vertical mixing scheme. With no vertical mixing or with weak, stability-dependent mixing, the mesoscale spectra are artificially amplified by low resolution: they are shallower and extend to larger scales than in the converged simulations. By contrast, vertical hyperviscosity with fixed grid-scale damping rate has the opposite effect: underresolved spectra are spuriously steepened. High-resolution spectra are converged except for the stability-dependent mixing case, which is damped by excessive mixing due to enhanced shear over a wide range of horizontal scales. It is shown that converged spectra require resolution of all vertical scales associated with the resolved horizontal structures: these include quasigeostrophic scales for large-scale motions with small Rossby number and the buoyancy scale for small-scale motions at large Rossby number. It is speculated that some model energy spectra may be contaminated by low vertical resolution, and it is recommended that vertical-resolution sensitivity tests always be performed.
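Checking a simulated spectrum against the −3 or −5/3 ranges amounts to fitting a slope in log-log space. A minimal 1-D sketch (the synthetic velocity field and fitting range are illustrative assumptions, not the paper's model output):

```python
import numpy as np

rng = np.random.default_rng(1)

def synthetic_field(n, slope):
    """Build a real 1-D field whose energy spectrum follows E(k) ~ k**slope."""
    k = np.arange(1, n // 2)
    amp = k ** (slope / 2.0)  # E ~ |u_k|^2, so amplitudes scale as k^(slope/2)
    phases = rng.uniform(0.0, 2.0 * np.pi, k.size)
    spec = np.zeros(n // 2 + 1, dtype=complex)
    spec[1:n // 2] = amp * np.exp(1j * phases)
    return np.fft.irfft(spec, n)

def spectral_slope(u):
    """Least-squares slope of log E(k) versus log k."""
    uk = np.fft.rfft(u)
    k = np.arange(1, uk.size - 1)
    energy = np.abs(uk[1:-1]) ** 2
    return np.polyfit(np.log(k), np.log(energy), 1)[0]

u = synthetic_field(1024, slope=-5.0 / 3.0)
print(spectral_slope(u))  # close to -5/3
```

The same fit over different wavenumber bands is how the large-scale −3 range and the mesoscale −5/3 range would be distinguished in model output.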

