node type
Recently Published Documents


TOTAL DOCUMENTS

42
(FIVE YEARS 10)

H-INDEX

9
(FIVE YEARS 1)

2021 ◽  
Vol 22 (15) ◽  
pp. 8067
Author(s):  
Kinga Szyman ◽  
Bartek Wilczyński ◽  
Michał Dąbrowski

Maps of Hi-C contacts between promoters and enhancers can be analyzed as networks, with cis-regulatory regions as nodes and their interactions as edges. We checked whether, in the published promoter–enhancer network of mouse embryonic stem (ES) cells, the differences in the node type (promoter or enhancer) and the node degree (number of regions interacting with a given promoter or enhancer) are reflected by the sequence composition or sequence similarity of the interacting nodes. We used counts of all k-mers (k = 4) to analyze the sequence composition, and the Euclidean distance between the k-mer count vectors (k-mer distance) as the measure of sequence (dis)similarity. The results we obtained with 4-mers are interpretable in terms of dinucleotides. Promoters are GC-rich compared to enhancers, as is already known. Enhancers are enriched in scaffold/matrix attachment region (S/MAR) patterns and depleted of CpGs. Furthermore, we show that promoters are more similar to their interacting enhancers than vice versa. Most notably, in both promoters and enhancers, the GC content and the CpG count increase with the node degree. As a consequence, enhancers of higher node degree become more similar to promoters, whereas higher-degree promoters become less similar to enhancers. We confirmed the key results also for human keratinocytes.
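The two measures used in this abstract can be sketched in a few lines: a 4-mer count vector per sequence and the Euclidean distance between two such vectors. This is a minimal illustration, not the authors' pipeline; the two toy sequences are invented for the example.

```python
# Minimal sketch of the sequence-composition measures described above:
# k-mer (k = 4) count vectors and the Euclidean k-mer distance between them.
from itertools import product
from math import sqrt

K = 4
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]  # all 256 possible 4-mers

def kmer_counts(seq):
    """Count every overlapping 4-mer window in `seq`."""
    counts = dict.fromkeys(KMERS, 0)
    for i in range(len(seq) - K + 1):
        kmer = seq[i : i + K]
        if kmer in counts:  # skip windows containing N or other non-ACGT symbols
            counts[kmer] += 1
    return [counts[k] for k in KMERS]

def kmer_distance(seq1, seq2):
    """Euclidean distance between the two k-mer count vectors."""
    c1, c2 = kmer_counts(seq1), kmer_counts(seq2)
    return sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

promoter_like = "GCGCGGCGCCGCGGGC"  # GC/CpG-rich toy sequence
enhancer_like = "ATATTAATTATAATAT"  # AT-rich toy sequence
print(kmer_distance(promoter_like, promoter_like))  # 0.0 (identical composition)
print(kmer_distance(promoter_like, enhancer_like) > 0)  # True
```

A small k-mer distance means two regions have similar composition; the abstract's asymmetry claim arises from averaging this distance over each node's set of interacting partners.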


2021 ◽  
Author(s):  
Oliver Bent ◽  
Julian Kuehnert ◽  
Sekou Remy ◽  
Anne Jones ◽  
Blair Edwards

The increase in extreme weather associated with acute climate change is leading to more frequent and severe flood events. On the timescale of months and years, climate change adaptation is critical to mitigating risk to socio-economic systems. Mathematical and computational models have become widely used tools to quantify the impact of catastrophic flooding and to predict future flood risks. For decision makers to plan ahead and to select informed policies and interventions, it is vital that the uncertainties of these models are well estimated. Besides the inherent uncertainty of the mathematical model, uncertainties arise from parameter calibration and from the driving observational climate data.

Here we focus on the uncertainty of seasonal flood risk prediction, for which we treat uncertainty propagation as a two-step process. First, uncertainty enters through calibration of model parameter distributions based on observational data; to propagate parameter uncertainties, the calibration framework is required to infer model parameter posterior distributions, as opposed to a single best-fit estimate. Second, uncertainty is propagated through the seasonal weather forecasts driving the flood risk prediction models, since such model drivers carry their own inherent uncertainty as predictions. By handling both sources of uncertainty and their propagation, we investigate the impact of combined uncertainty quantification methods on flooding predictions: the first step characterises the flood model's own uncertainty, and the second characterises how uncertain model drivers affect future predictions.

To achieve these features of a calibration framework for flood models, we leverage concepts from machine learning. At the core, we frame calibration of the flood model as a supervised learning task in which the methods minimise a loss function. Uncertainty quantification is equally a growing field in machine learning and AI with regard to the interpretability of parametric models. For this purpose we adopt a Bayesian framework, which provides natural descriptions of model expectation and variance. By combining uncertainty quantification with the steps of supervised learning for parameter calibration, we propose a novel approach to seasonal flood risk prediction.
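The two-step propagation described above can be sketched with a toy Monte Carlo scheme: draw model parameters from a posterior (step 1) and pair each draw with a member of a forecast ensemble (step 2). Everything here is invented for illustration; the Gaussian "posterior" stands in for the output of a real Bayesian calibration, and the one-line flood model is purely hypothetical.

```python
# Toy sketch of two-step uncertainty propagation, assuming a Gaussian
# parameter posterior and a Gaussian rainfall forecast ensemble.
import random
import statistics

random.seed(0)

def flood_depth(runoff_coeff, rainfall):
    """Hypothetical one-parameter flood model: depth scales with runoff * rainfall."""
    return runoff_coeff * rainfall

# Step 1: parameter uncertainty -- samples standing in for a calibrated
# posterior distribution over the runoff coefficient.
posterior_samples = [random.gauss(0.6, 0.05) for _ in range(1000)]

# Step 2: driver uncertainty -- an ensemble of seasonal rainfall forecasts (mm).
forecast_ensemble = [random.gauss(120.0, 25.0) for _ in range(1000)]

# Propagate both sources jointly: pair each parameter draw with a forecast
# member and run the model, yielding a predictive distribution of depths.
depths = [flood_depth(p, r) for p, r in zip(posterior_samples, forecast_ensemble)]

mean_depth = statistics.mean(depths)
spread = statistics.stdev(depths)
print(f"predicted depth: {mean_depth:.1f} +/- {spread:.1f}")
```

The spread of `depths` combines both uncertainty sources; using a single best-fit parameter instead of `posterior_samples` would collapse the first source and understate the total uncertainty.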


2021 ◽  
Author(s):  
Andreas Geiges ◽  
Claire Fyson ◽  
Frederic Hans ◽  
Louise Jeffery ◽  
Silke Mooldijk ◽  
...  

In 2020, climate target announcements were dominated by net-zero commitments, including by a number of major emitters. Despite the urgency of more ambitious NDCs in the short term, long-term net-zero targets are important for the transition to global zero emissions. Tracking progress towards these targets, and assessing their adequacy, requires an assessment of what they mean for transition pathways and associated emissions trajectories at both national and global levels.

We present an assessment of the net-zero targets of the major emitting countries and their implications for long-term emissions trajectories and warming levels. Based on the work of the Climate Action Tracker, country-specific analyses are aggregated into a global emissions pathway to derive a best estimate of the resulting global warming in 2100. Undertaking this analysis requires assumptions to be made regarding projected emissions and removals from the land-use sector, non-CO2 emissions, and the trajectory of total net emissions after net-zero, which we explain and explore. For example, by computing the cumulative emissions of our aggregated net-zero target emissions pathway, we can compare this pathway with modelled global emissions pathways from the IPCC's SR1.5 Special Report, to draw broad conclusions about what current net-zero commitments might mean for carbon dioxide removal and non-CO2 emissions, and the uncertainties therein.
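The aggregation and cumulative-emissions step mentioned above is simple arithmetic, sketched below with three invented country pathways (the values, country labels, and five-year spacing are all hypothetical, not Climate Action Tracker data).

```python
# Hypothetical illustration: aggregate country emissions pathways
# (GtCO2/yr at ten-year steps) into a global pathway, then integrate
# to get cumulative emissions. All numbers are invented.
country_pathways = {
    "A": [10.0, 8.0, 5.0, 2.0, 0.0],
    "B": [5.0, 4.5, 3.0, 1.0, 0.0],
    "C": [3.0, 2.5, 2.0, 1.5, 1.0],
}
step = 10  # years between pathway points (2020, 2030, ..., 2060)

# Global pathway: sum across countries at each time step.
global_pathway = [sum(vals) for vals in zip(*country_pathways.values())]

# Cumulative emissions (GtCO2) via trapezoidal integration over the pathway.
cumulative = sum(
    (a + b) / 2 * step for a, b in zip(global_pathway, global_pathway[1:])
)
print(global_pathway)  # [18.0, 15.0, 10.0, 4.5, 1.0]
print(cumulative)      # 390.0
```

A cumulative figure like this is what gets compared against the carbon budgets of modelled SR1.5 pathways.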


2021 ◽  
Author(s):  
Tatsuya Ishikawa ◽  
Takao Moriyama ◽  
Paolo Fraccaro ◽  
Anne Jones ◽  
Blair Edwards

Floods have a significant impact on social and economic activities, with flood frequency projected to increase in the future in many regions of the world due to climate change. Quantification of current and future flood risk at lead times of months to years is potentially of high value for planning activities in a wide range of humanitarian and business applications across multiple sectors. However, there are also many technical and methodological challenges in producing accurate, local predictions that adequately quantify uncertainty. Multiple geospatial datasets are freely available to improve flood predictions, but their size and complexity mean they are difficult to store and combine. Generation of flood inundation risk maps requires the combination of several static geospatial data layers with potentially multiple simulation models and ensembles of climate inputs.

Here we present a geospatial climate impact modelling framework, which we apply to the challenge of flood risk quantification. Our framework is modular, scalable and cloud-based, and allows for the easy deployment of different impact models and model components with a range of input datasets (at different spatial and temporal scales) and model configurations.

The framework allows us to use automated tools to carry out AI-enabled parameter calibration, model validation and uncertainty quantification and propagation, with the ability to quickly run the impact models for any location where the appropriate data are available. We can additionally trial different sources of input data, pulling data from IBM PAIRS Geoscope and other sources, as we have done with our pluvial flood models.

In this presentation, we provide pluvial flood risk assessments generated through our framework. We calibrate our flood models to accurately reproduce inundations derived from historical precipitation datasets, validated against flood maps obtained from corresponding satellite imagery, and quantify uncertainties for hydrological parameters. Probabilistic flood risk is generated through ensemble execution of these models, incorporating climate change and model parameter uncertainties.
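The combination of a static geospatial layer with an ensemble of climate inputs can be illustrated with a deliberately crude sketch: a per-cell rainfall capacity stands in for the static layers, a threshold exceedance stands in for the flood model, and the ensemble fraction gives an inundation probability map. All numbers and the "model" itself are invented; a real pluvial model would route water dynamically.

```python
# Hypothetical sketch: static layer + climate ensemble -> probabilistic map.
import random

random.seed(7)

# Static geospatial layer (invented): rainfall depth (mm) each cell of a
# 3x3 grid can absorb before it is considered inundated.
capacity = [
    [120.0, 90.0, 60.0],
    [100.0, 80.0, 50.0],
    [95.0, 70.0, 40.0],
]

# Ensemble of climate inputs: seasonal-maximum rainfall depths (mm).
ensemble = [random.gauss(80.0, 25.0) for _ in range(500)]

# Probabilistic inundation map: fraction of ensemble members in which
# rainfall exceeds each cell's capacity.
flood_prob = [
    [sum(r > capacity[i][j] for r in ensemble) / len(ensemble) for j in range(3)]
    for i in range(3)
]
for row in flood_prob:
    print(["%.2f" % p for p in row])
```

Low-capacity cells (bottom-right) accumulate high flood probabilities; swapping in calibrated parameter ensembles alongside the climate ensemble would fold model uncertainty into the same map.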


2020 ◽  
Vol 72 (1) ◽  
Author(s):  
Kaoru Sawazaki ◽  
Takeshi Nakamura

Abstract We have identified “N”-shaped Y/X amplitude spectral ratios in S-coda records from a significant number of OBSs (ocean bottom seismometers) belonging to the in-line-type ocean bottom networks of S-net and ETMC, deployed around the Japan Trench and Sagami Trough, respectively. The “N”-shape reflects a sharp peak and notch at approximately 5–13 Hz and 10–23 Hz, respectively. This shape does not characterize OBSs belonging to the node-type ocean bottom network of DONET deployed near the Nankai Trough. For S-net stations, the “N”-shape is not clearly formed at stations installed within grooves dug in the seafloor. We interpret the “N”-shaped Y/X amplitude spectral ratio as being caused by the natural vibrations of a cylindrical pressure vessel placed sideways (long axis lying in the horizontal plane) on the seafloor. The notch and peak frequencies in the Y/X amplitude spectral ratio likely correspond to the natural frequencies of longitudinal (X-direction) and torsional and/or bending (Y-direction) vibrations, respectively. These natural vibrations are not observed for buried OBSs or those installed within grooves in the seafloor, probably because they are better coupled to the seafloor. We propose a simple model to evaluate the extent to which the peak and notch have formed, which depends on the natural frequencies and the coupling of the pressure vessel. We suggest that users of in-line-type OBSs carefully examine whether the X and Y components respond differently when frequencies above about 3 Hz are used. When installing OBS networks in the future, placing OBSs and cables within grooves dug in the seafloor, or burying them, will be effective in suppressing such natural vibrations.
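The diagnostic quantity in this abstract, a Y/X amplitude spectral ratio, can be computed from two component records with a pair of FFTs. The sketch below is illustrative only (not the authors' processing): it builds synthetic X and Y records sharing a common coda signal, adds an 8 Hz resonance to Y as a stand-in for a vessel vibration, and recovers it as a peak in the ratio. The sampling rate, window length, and resonance frequency are all assumed.

```python
# Illustrative sketch: Y/X amplitude spectral ratio from two records.
import numpy as np

fs = 100.0                      # sampling rate (Hz), assumed
t = np.arange(0, 20.0, 1 / fs)  # 20 s coda window, assumed

rng = np.random.default_rng(0)
coda = rng.standard_normal(t.size)  # common coda signal on both components
x = coda
# Y carries an extra narrow-band "natural vibration" near 8 Hz (invented).
y = coda + 0.8 * np.sin(2 * np.pi * 8.0 * t)

# Amplitude spectra and their ratio.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
ratio = np.abs(np.fft.rfft(y)) / np.abs(np.fft.rfft(x))

peak_freq = freqs[np.argmax(ratio)]
print(f"Y/X ratio peaks near {peak_freq:.1f} Hz")
```

In real data the peak-and-notch pair would be read off the same ratio; here the shared coda cancels everywhere except at the injected resonance, so the peak stands out cleanly.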


Wooden trusses are widely used in construction and come in a variety of structural forms. In general, their bearing capacity and stiffness are determined by the design of the node joints. In order to carry significant loads and reduce the overall deformation of trusses, it is necessary to develop new types of nodes that are also characterized by low manufacturing labor intensity and a high degree of operational reliability. The wooden truss nodes based on glued-in flat steel rods proposed by the authors meet these requirements. The article describes the results of experimental studies of a wooden truss with nodal joints on glued-in flat rods under short-term loads. The layout principles of the proposed node type are given; the test procedure for the experimental structures and the results of the experimental studies are presented, and the behavior of the steel connecting plates glued into the wood at the nodes is revealed. It is shown that the adopted node design belongs to the rigid type of wooden structure joints and provides sufficient load-bearing capacity of the trusses and increased rigidity. The mode of failure and the magnitude of the failure load confirmed the operational reliability of the proposed type of wooden trusses, including under the action of long-term loads. Analysis of the results revealed directions for further improvement of wooden truss nodes with glued-in flat steel rods.


2020 ◽  
Vol 5 (3) ◽  
pp. 387-394
Author(s):  
Boniface A. Oriji ◽  
Iribhogbe Silas Aire

Stuck pipe incidents translate to non-productive time. There is a need to mitigate stuck pipe incidents, which can be achieved by conforming to recommended practices. Quick diagnosis is also necessary in order to free a stuck pipe, as a trial-and-error approach can further complicate the situation. This work aims at diagnosing stuck pipe mechanisms and recommending practices to free a stuck pipe. spANALYZE also estimates the axial force and torque needed to free a pipe stuck by differential sticking. spANALYZE is a thick desktop client application developed in C# using the Microsoft Visual Studio 2019 development environment. It is an object-oriented .NET application that utilizes the Windows Presentation Foundation (WPF) architecture for its user interface. Each of the analyzers within spANALYZE was implemented generically as a list of nodes, representing the concept of a flow chart. New analyzers can easily be added simply by programmatically defining each node in the flow chart. Each node has a node identifier, a node type, node text, and the node identifiers of each answer: yes, no and restricted. spANALYZE offers the following benefits: quick and early detection of stuck pipe mechanisms, recommended action steps to free the pipe, calculation of stuck pipe depth, and computation of the torque and axial force needed to free a stuck string.
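The flow-chart-as-list-of-nodes idea described above can be sketched generically. This is a hypothetical reconstruction in Python rather than the application's C# code: the node fields mirror the abstract (identifier, node type, text, and per-answer successor identifiers), while the two-question chart and its diagnoses are invented for illustration.

```python
# Hypothetical sketch of a flow-chart analyzer defined as a list of nodes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    node_id: str
    node_type: str               # e.g. "question" or "diagnosis"
    text: str
    yes: Optional[str] = None    # successor node ids per answer
    no: Optional[str] = None
    restricted: Optional[str] = None

# A tiny two-question analyzer; a new analyzer is just a new node list.
analyzer = {
    n.node_id: n
    for n in [
        Node("q1", "question", "Is circulation possible?", yes="q2", no="d_pack"),
        Node("q2", "question", "Did sticking occur while static?", yes="d_diff", no="d_geo"),
        Node("d_pack", "diagnosis", "Likely pack-off / bridging."),
        Node("d_diff", "diagnosis", "Likely differential sticking."),
        Node("d_geo", "diagnosis", "Likely wellbore-geometry sticking."),
    ]
}

def diagnose(answers):
    """Walk the flow chart from 'q1' using a dict of yes/no answers."""
    node = analyzer["q1"]
    while node.node_type == "question":
        next_id = node.yes if answers[node.node_id] else node.no
        node = analyzer[next_id]
    return node.text

print(diagnose({"q1": True, "q2": True}))  # Likely differential sticking.
```

Because the walk only inspects node fields, adding an analyzer requires no new control flow, just another node list, which matches the extensibility the abstract claims.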


Author(s):  
Katerine Guerrero ◽  
Jorge Finke

Abstract Many networks are made up of different node types, determined by a set of common quantitative or qualitative node properties. Understanding the effects of homophilic relationships, that is, the tendency of nodes to establish links to other nodes that are alike, requires formal frameworks that explain how local decision-making mechanisms contribute to the formation of particular network structures. Based on two simple stochastic mechanisms for establishing links, this article introduces a model that explains the emergence of homophily as an aggregate group- and network-level outcome. We characterize the dynamics of homophily and present conditions that guarantee that the amount of homophily exceeds the expected amount of a purely random decision-making process. Moreover, we show that the proposed model resembles patterns of homophily in a citation network of political blogs. Finally, we use the model to design a non-homophilic node detection algorithm for identifying nodes that establish connections without a particular preference for either node type.
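The comparison against a purely random baseline can be illustrated with a toy simulation. This is not the authors' model: the mixing of a same-type preference with uniform partner choice, the preference parameter, and the group sizes are all invented for the sketch.

```python
# Toy simulation: links form with a same-type preference, and the observed
# homophily is compared against the purely random baseline.
import random

random.seed(1)
n, frac_a, pref = 400, 0.5, 0.8  # pref: prob. of choosing a same-type partner
types = ["A"] * int(n * frac_a) + ["B"] * (n - int(n * frac_a))

edges = []
for _ in range(2000):
    u = random.randrange(n)
    if random.random() < pref:
        # Homophilic mechanism: pick a partner of the same type.
        candidates = [v for v in range(n) if types[v] == types[u] and v != u]
    else:
        # Random mechanism: pick any other node uniformly.
        candidates = [v for v in range(n) if v != u]
    edges.append((u, random.choice(candidates)))

# Fraction of same-type edges, vs. the expectation under random choice.
homophily = sum(types[u] == types[v] for u, v in edges) / len(edges)
baseline = frac_a ** 2 + (1 - frac_a) ** 2
print(f"observed homophily {homophily:.2f} vs random baseline {baseline:.2f}")
```

With equal group sizes the random baseline is 0.5, so any persistent excess of same-type edges signals the preference at work; a node whose own edges sit near the baseline is the kind of non-homophilic node the abstract's detection algorithm targets.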


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 46480-46490 ◽  
Author(s):  
Fei Xue ◽  
Shaofeng Lu ◽  
Ettore Bompard ◽  
Ciwei Gao ◽  
Lin Jiang ◽  
...  
