Implementation of the information-diffusion-based routing on a large-scale ASON test-bed

Author(s):  
Yue Chen ◽  
Nan Hua ◽  
Xiaoping Zheng ◽  
Hanyi Zhang ◽  
Bingkun Zhou
2012 ◽  
Vol 424-425 ◽  
pp. 132-136
Author(s):  
Guo Jin Chen ◽  
Zhang Ming Peng ◽  
Jian Guo Yang ◽  
Qiao Ying Huang

On a diesel engine test bed, this paper studies parameters such as the diesel engine's rotational speed and the piston ring's width and wear, and their relation to the output signal of a magnetoresistive sensor under reverse drawing of the diesel engine. The research found a consistent correspondence between piston ring wear and the magnetoresistive sensor's output. In addition, on an oil tanker equipped with a 6RTA52U diesel engine, the influence of the engine's operating parameters and load conditions on the sensor's output was surveyed under four different operating modes. The test results and conclusions provide a technical foundation for online wear monitoring of the piston rings of large-scale marine diesel engines.


2014 ◽  
Vol 2 (1) ◽  
pp. 26-65 ◽  
Author(s):  
MANUEL GOMEZ RODRIGUEZ ◽  
JURE LESKOVEC ◽  
DAVID BALDUZZI ◽  
BERNHARD SCHÖLKOPF

Abstract: Time plays an essential role in the diffusion of information, influence, and disease over networks. In many cases we can only observe when a node is activated by a contagion—when a node learns about a piece of information, makes a decision, adopts a new behavior, or becomes infected with a disease. However, the underlying network connectivity and transmission rates between nodes are unknown. Inferring the underlying diffusion dynamics is important because it leads to new insights and enables forecasting, as well as influencing or containing information propagation. In this paper we model diffusion as a continuous temporal process occurring at different rates over a latent, unobserved network that may change over time. Given information diffusion data, we infer the edges and dynamics of the underlying network. Our model naturally imposes sparse solutions and requires no parameter tuning. We develop an efficient inference algorithm that uses stochastic convex optimization to compute online estimates of the edges and transmission rates. We evaluate our method by tracking information diffusion among 3.3 million mainstream media sites and blogs, and experiment with more than 179 million different instances of information spreading over the network in a one-year period. We apply our network inference algorithm to the top 5,000 media sites and blogs and report several interesting observations. First, information pathways for general recurrent topics are more stable across time than for ongoing news events. Second, clusters of news media sites and blogs often emerge and vanish in a matter of days for ongoing news events. Finally, major events, for example large-scale civil unrest as in the Libyan civil war or the Syrian uprising, increase the number of information pathways among blogs, and also increase the network centrality of blogs and social media sites.
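
To make the modeling concrete, the following Python sketch implements the kind of continuous-time likelihood and projected stochastic gradient update that this line of work builds on, using an exponential transmission model. It is only an illustrative approximation of the approach described above, not the authors' implementation; the function names, the learning-rate and epoch parameters, and the cascade encoding are assumptions introduced here.

import numpy as np

def neg_log_likelihood_grad(alpha, cascade, T):
    """Gradient of one cascade's negative log-likelihood under an exponential
    transmission model (an illustrative stand-in for the paper's convex objective).

    alpha   : (n, n) array of candidate rates, alpha[j, i] = rate from node j to node i
    cascade : dict mapping node -> activation time (absent nodes never activated)
    T       : length of the observation window
    """
    n = alpha.shape[0]
    grad = np.zeros_like(alpha)
    activated = sorted(cascade, key=cascade.get)
    for i in range(n):
        t_i = cascade.get(i, T)
        parents = [j for j in activated if cascade[j] < t_i]
        # Survival terms: every earlier-activated j keeps "pressuring" i until t_i (or T).
        for j in parents:
            grad[j, i] += t_i - cascade[j]
        # Hazard term applies only if i actually activated and has potential parents.
        if i in cascade and parents:
            hazard = sum(alpha[j, i] for j in parents)
            for j in parents:
                grad[j, i] -= 1.0 / max(hazard, 1e-12)
    return grad

def infer_rates(cascades, n, T, lr=0.01, epochs=50, seed=0):
    """Projected stochastic gradient descent over cascades, keeping alpha >= 0."""
    rng = np.random.default_rng(seed)
    alpha = np.full((n, n), 0.1)
    np.fill_diagonal(alpha, 0.0)
    for _ in range(epochs):
        for c in rng.permutation(len(cascades)):
            alpha -= lr * neg_log_likelihood_grad(alpha, cascades[c], T)
            np.clip(alpha, 0.0, None, out=alpha)  # projection onto non-negative rates
            np.fill_diagonal(alpha, 0.0)
    return alpha  # entries near zero ~ "no edge"; larger values ~ faster transmission

# Usage sketch: two toy cascades over 4 nodes observed for T = 10 time units.
cascades = [{0: 0.0, 1: 1.2, 2: 2.5}, {2: 0.0, 3: 0.7, 1: 3.0}]
print(np.round(infer_rates(cascades, n=4, T=10.0), 2))

In this simplified setting, sparsity of the inferred network shows up as estimated rates that are driven towards zero for node pairs that never appear to transmit.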


Author(s):  
S. Varatharajan ◽  
K. V. Sureshkumar ◽  
K. V. Kasiviswanathan ◽  
G. Srinivasan

The second stage of the Indian nuclear programme envisages the large-scale deployment of fast reactors for the effective use of India's limited uranium reserves. The Fast Breeder Test Reactor (FBTR) at Kalpakkam is a loop-type, sodium-cooled fast reactor intended as a test bed for the fuels and structural materials of the Indian fast reactor programme. The reactor was made critical with a unique high-plutonium MK-I carbide fuel (70% PuC + 30% UC). Because this fuel was untested, it was decided to use it as a driver fuel with conservative limits on linear heat rating (LHR) and burn-up, based on out-of-pile studies. FBTR went critical in October 1985 with a small core of 23 MK-I fuel subassemblies. The LHR and burn-up limits for the fuel were conservatively set at 250 W/cm and 25 GWd/t, respectively. Based on out-of-pile simulation in 1994, it was possible to raise the LHR to 320 W/cm. It was decided that when the fuel reached the target burn-up of 25 GWd/t, the MK-I core would be progressively replaced with a larger core of MK-II carbide fuel (55% PuC + 45% UC). Induction of MK-II subassemblies started in 1996. However, based on post-irradiation examination (PIE) of the MK-I fuel at 25, 50 and 100 GWd/t, it became possible to raise the burn-up limit of the MK-I fuel to 155 GWd/t. More than 900 fuel pins of MK-I composition have reached 155 GWd/t without a single failure and have been discharged. One subassembly (61 pins) was taken to 165 GWd/t on a trial basis, without any clad failure. The core has been progressively enlarged, with MK-I subassemblies added to compensate for the burn-up loss of reactivity and to replace discharged subassemblies. The induction of MK-II fuel was stopped in 2003. One test subassembly simulating the composition of the MOX fuel (29% PuO2) to be used in the 500 MWe Prototype Fast Breeder Reactor was loaded in 2003; it is undergoing irradiation at 450 W/cm and has successfully reached a burn-up of 92.5 GWd/t. In 2006, it was proposed to test high-Pu MOX fuel (44% PuO2) in order to validate the fabrication and fuel-cycle processes developed for the power-reactor MOX fuel. Eight MOX subassemblies were loaded into the FBTR core in 2007. The current core has 27 MK-I, 13 MK-II, eight high-Pu MOX and one power-reactor MOX fuel subassemblies. With the progressive enlargement of the core, the reactor power has been increased from 10.5 MWt to 18.6 MWt. This paper presents the evolution of the core based on the progressive enhancement of the burn-up limit of this unique high-Pu carbide fuel.


Author(s):  
M. Minutillo ◽  
E. Jannelli ◽  
F. Tunzio

The main objective of this study is to evaluate the performance of a proton exchange membrane (PEM) fuel cell generator for residential applications. The fuel cell performance has been evaluated on the test bed of the University of Cassino. The experimental activity focused on evaluating performance under different operating conditions: stack temperature, feeding mode, and fuel composition. For large-scale use of PEM fuel cell technology in distributed electric power generation, it may be necessary to feed fuel cells with a conventional fuel such as natural gas and to generate hydrogen in situ, because the infrastructure for hydrogen distribution is currently almost nonexistent. Therefore, the fuel cell performance has been evaluated using both pure hydrogen and reformate gas produced by a natural gas reforming system.


Entropy ◽  
2021 ◽  
Vol 23 (9) ◽  
pp. 1216
Author(s):  
Jedidiah Yanez-Sierra ◽  
Arturo Diaz-Perez ◽  
Victor Sosa-Sosa

One of the main problems in graph analysis is the correct identification of relevant nodes for spreading processes. Spreaders are crucial for accelerating or hindering information diffusion, increasing product exposure, controlling diseases and rumors, and more. Correctly identifying spreaders is therefore essential to exploit the network structure and ensure a more efficient flow of information. Network topology has also proven to play a relevant role in spreading processes. In this sense, most existing methods based on local, global, or hybrid centrality measures select relevant nodes based only on their ranking values and do not intentionally account for how those nodes are distributed over the graph. In this paper, we propose a simple yet effective method that takes advantage of the underlying graph topology to guarantee that the selected nodes are not only relevant but also well scattered. Our proposal also suggests how to choose the number of spreaders to select. The approach consists of two phases: first, graph partitioning; and second, identification and distribution of relevant nodes. We tested our approach by applying the SIR spreading model over nine real complex networks. The experimental results showed that the set of relevant nodes identified by our approach is more influential and more scattered than those of several reference algorithms, including degree, closeness, betweenness, VoteRank, HybridRank, and IKS. The results further showed an improvement in propagation influence when our distribution strategy is combined with classical metrics such as degree, outperforming computationally more complex strategies. Moreover, our proposal has low computational complexity and can be applied to large-scale networks.
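
As a rough illustration of such a two-phase scheme (partition the graph, then pick well-ranked nodes from different parts), the following Python sketch uses networkx community detection for the partitioning and degree as the ranking. The partitioning method, the round-robin selection rule, and the function name are assumptions made here for illustration; they do not reproduce the authors' algorithm.

import networkx as nx
from networkx.algorithms import community

def scattered_spreaders(G, k, rank=None):
    """Select k spreaders that are both well ranked and scattered over the graph
    (an illustrative two-phase sketch, not the method proposed in the paper)."""
    # Phase 1 (illustrative): partition the graph into communities.
    parts = sorted(community.greedy_modularity_communities(G), key=len, reverse=True)
    # Default ranking: degree; any other centrality score could be plugged in.
    if rank is None:
        rank = dict(G.degree())
    k = min(k, G.number_of_nodes())
    chosen = []
    # Phase 2 (illustrative): round-robin over partitions, taking the best
    # not-yet-chosen node in each, so the spreaders end up well scattered.
    while len(chosen) < k:
        for part in parts:
            if len(chosen) >= k:
                break
            candidates = [v for v in part if v not in chosen]
            if candidates:
                chosen.append(max(candidates, key=lambda v: rank.get(v, 0)))
    return chosen

# Usage on a small benchmark graph.
G = nx.karate_club_graph()
print(scattered_spreaders(G, k=4))

Combining the scattering step with a classical metric such as degree only changes the rank argument; the partition-driven selection is what keeps the chosen nodes apart.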


2013 ◽  
Vol 2013 ◽  
pp. 1-11 ◽  
Author(s):  
Chia-Wei Lin ◽  
Tzuu-Hseng S. Li ◽  
Chung-Cheng Chen

The paper presents a novel feedback linearization controller for nonlinear multi-input multi-output time-delay large-scale systems that achieves both tracking and almost disturbance decoupling (ADD) performance. The significant contribution of this paper is to construct a control law such that the overall closed-loop system is stable for a given initial condition and bounded tracking trajectory, with the input-to-state stability characteristic and almost disturbance decoupling performance. As a test bed for the study of decentralized control of large-scale systems, we simulate two inverted pendulums coupled by a spring in a networked control setting.
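
For readers unfamiliar with feedback linearization, the following Python sketch shows the basic idea on a single-input pendulum: the control law cancels the nonlinearity and imposes stable linear dynamics on the tracking error. It is a deliberately simplified, hypothetical example; the paper's controller for multi-input multi-output time-delay large-scale systems and the spring-coupled two-pendulum benchmark are not reproduced here, and the plant parameters and gains below are arbitrary.

import numpy as np
from scipy.integrate import solve_ivp

# Toy plant: theta'' = a*sin(theta) - b*theta' + c*u, tracking y_r(t) = sin(t).
a, b, c = 9.81, 0.5, 1.0   # illustrative plant parameters
k1, k2 = 4.0, 4.0          # gains placing the error poles at s = -2 (double)

def reference(t):
    # Reference trajectory and its first two derivatives.
    return np.sin(t), np.cos(t), -np.sin(t)

def control(t, x):
    theta, dtheta = x
    yr, dyr, ddyr = reference(t)
    e, de = theta - yr, dtheta - dyr
    # Cancel the nonlinearity, then enforce e'' + k2*e' + k1*e = 0.
    return (-a * np.sin(theta) + b * dtheta + ddyr - k1 * e - k2 * de) / c

def plant(t, x):
    theta, dtheta = x
    u = control(t, x)
    return [dtheta, a * np.sin(theta) - b * dtheta + c * u]

sol = solve_ivp(plant, (0.0, 10.0), [1.0, 0.0])
print("final tracking error:", sol.y[0, -1] - np.sin(sol.t[-1]))

The tracking error decays according to the chosen linear dynamics regardless of the pendulum's nonlinearity, which is the property the paper extends to the much harder MIMO, time-delay, disturbance-affected setting.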


OPNET is a network simulation tool that can model various elements of a network. It can analyze traffic and, more recently, can also simulate security events. If simulations are performed prior to the actual implementation or construction of a large-scale system, system designers can identify vulnerabilities or find efficient algorithms and protocols that reduce costs. This chapter therefore presents a method for simulating a PLC system, the design of a PLC-based vertical fish farm, and an expandable simulation test bed with which students or researchers can simulate their own implementation approaches. With these methods, it becomes possible to perform large-scale simulations.


2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S443-S443
Author(s):  
Sandra Varey ◽  
Mandy Dixon ◽  
Alejandra Hernandez ◽  
Ceu Mateus ◽  
Tom Palmer ◽  
...  

Abstract: Ways to address the increasing healthcare needs of older people are a priority for the National Health Service (NHS) in England. The NHS England Test Bed programme was designed to trial new models of care supported by digital health technologies. This paper reports findings from one Test Bed programme, the Lancashire and Cumbria Innovation Alliance (LCIA), a partnership between NHS England, industry and Lancaster University, which ran from 2016 to 2018. A key aim of the LCIA Test Bed was to explore the extent to which supported self-care telehealth technology helped older people with long-term conditions to better self-manage their care, promoting independence and enabling them to remain at home for longer. Each patient received a combination of health technologies over a six-month period. This paper presents results from the qualitative data that formed part of a large-scale mixed-methods evaluation. Specifically, it draws on the analysis of 34 observational interviews with 17 participants with chronic obstructive pulmonary disease (COPD) to understand the role of these technologies in the self-management of their care. The data revealed that the majority of participants felt more confident about self-managing COPD as a result of their participation in the programme. These increases in confidence resulted from participants' increased knowledge and skills in managing their COPD. The paper demonstrates how patients learned to better manage their respiratory condition, the impact of this learning on their daily lives and those of their family carers, and the implications for healthcare practice.


2019 ◽  
Vol 9 (18) ◽  
pp. 3758 ◽  
Author(s):  
Xiang Li ◽  
Xiaojie Wang ◽  
Chengli Zhao ◽  
Xue Zhang ◽  
Dongyun Yi

Locating the source of a diffusion-like process is a fundamental and challenging problem in complex networks; it can help inhibit the outbreak of epidemics among humans, suppress the spread of rumors on the Internet, prevent cascading failures of power grids, and so on. However, our ability to accurately locate the diffusion source is strictly limited by the incomplete information of nodes and the inevitable randomness of the diffusion process. In this paper, we propose an efficient optimization approach based on maximum likelihood estimation to locate the diffusion source in complex networks with limited observations. By modeling the informed times of the observers, we derive an optimal source localization solution for arbitrary trees and then extend it to general graphs via suitable approximations. Numerical analyses on synthetic and real networks indicate that our method is superior to several benchmark methods in terms of average localization accuracy, high-precision localization, and approximate area localization. In addition, its low computational cost enables our method to be widely applied to the source localization problem in large-scale networks. We believe that our work provides valuable insights into the interplay between information diffusion and source localization in complex networks.
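
As a toy illustration of observer-based localization, the following Python sketch scores every candidate source by how consistently shortest-path hop counts explain the observers' informed times under a single assumed mean per-hop delay. This is a simplified stand-in for the maximum likelihood estimator described above, not the authors' method; the mean-delay parameter, the variance-based score, and the example network are assumptions made here.

import networkx as nx
import numpy as np

def locate_source(G, informed, mu=1.0):
    """Rank candidate sources by how well hop distances explain informed times.

    Assumes each hop adds an average delay mu; a candidate s is scored by the
    variance of (t_o - mu * d(s, o)) over observers o, which is small when the
    observers' delays are consistent with a common start time at s.
    """
    observers = list(informed)
    scores = {}
    for s in G.nodes:
        lengths = nx.single_source_shortest_path_length(G, s)
        if any(o not in lengths for o in observers):
            continue  # s cannot reach every observer
        residuals = [informed[o] - mu * lengths[o] for o in observers]
        scores[s] = np.var(residuals)
    return sorted(scores, key=scores.get)  # most plausible sources first

# Usage on a toy tree: true source 0, unit hop delays, three observers.
G = nx.balanced_tree(r=2, h=3)
informed = {7: 3.0, 13: 3.0, 3: 2.0}
print(locate_source(G, informed)[:3])

With observers on different branches, only the true source explains all informed times with a single start time, so it comes first in the ranking; on noisy, partially observed graphs the full likelihood model of the paper is needed.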

