Robust Wireless Sensor Network Deployment

2016 ◽  
Vol. 17 no. 3 (Distributed Computing and...) ◽  
Author(s):  
Milan Erdelj ◽  
Nathalie Mitton ◽  
Tahiry Razafindralambo

In this work we present a decentralized deployment algorithm for wireless mobile sensor networks focused on deployment Efficiency, connectivity Maintenance and network Reparation (EMR). We assume that a group of mobile sensors is placed in the area of interest to be covered, without any prior knowledge of the environment. The goal of the algorithm is to maximize the covered area and cope with sudden sensor failures. By relying on locally available information about the environment and the neighborhood, and without requiring any kind of synchronization in the network, each sensor iteratively chooses its next movement location so as to form a hexagonal lattice grid. Relying on the graph of wireless mobile sensors, we provide properties regarding the quality of coverage, the connectivity of the graph and the termination of the algorithm. We run extensive simulations to assess the compactness of the deployment and evaluate the robustness against sensor failures. We show through analysis and simulations that the EMR algorithm is robust to node failures and can restore the lattice grid. We also show that, even after a failure, the EMR algorithm can still provide a compact deployment in a reasonable time.
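
The abstract describes the movement rule only at a high level. As a minimal illustrative sketch of the general idea, the snippet below lets a sensor pick, among the six ideal hexagonal-lattice slots around an already-placed neighbor, the nearest slot that is still free; the function names, the lattice spacing d, and the occupancy test are assumptions for illustration, not the EMR specification.

```python
# Illustrative sketch only: choosing a next-step position on a hexagonal lattice.
import math

def hex_candidates(center, d):
    """Six ideal hexagonal-lattice slots at distance d around a placed neighbor."""
    cx, cy = center
    return [(cx + d * math.cos(k * math.pi / 3),
             cy + d * math.sin(k * math.pi / 3)) for k in range(6)]

def next_step(position, placed_neighbors, d, occupied):
    """Pick the closest free slot among those offered by already-placed neighbors."""
    best, best_dist = None, float("inf")
    for nb in placed_neighbors:
        for cand in hex_candidates(nb, d):
            if any(math.dist(cand, o) < d / 2 for o in occupied):
                continue  # slot already taken by another sensor
            dist = math.dist(position, cand)
            if dist < best_dist:
                best, best_dist = cand, dist
    return best

# A sensor at (1.2, 0.4) moving relative to a single placed neighbor at the origin.
print(next_step((1.2, 0.4), placed_neighbors=[(0.0, 0.0)], d=1.0, occupied=[(0.0, 0.0)]))
```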

2018 ◽  
Vol. 27 - 2017 - Special... ◽  
Author(s):  
Vianney Kengne Tchendji ◽  
Blaise Paho Nana

Wireless sensor networks (WSN) face many implementation problems, such as connectivity, security, energy saving, fault tolerance, interference, collision, routing, etc. In this paper, we consider a low-density WSN in which the distribution of sensors is poor, together with the virtual architecture introduced by Wadaa et al., which provides a powerful and fast partitioning of the network into a set of clusters. In order to effectively route the information collected by each sensor node to the base station (sink node, located at the center of the network), we first propose a technique based on multiple communication frequencies to avoid interference during communications. Secondly, we propose an empty-cluster detection algorithm, which reveals the area actually covered by the sensors after deployment and therefore makes it possible to react accordingly. Finally, we propose a strategy that allows mobile sensors (actuators) to move in order to preserve the WSN's connectivity, improve the routing of collected data, save the sensors' energy, improve the coverage of the area of interest, etc.
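
The detection procedure itself is not given in the abstract. As a rough sketch of the underlying idea, assuming the corona/wedge partitioning of Wadaa et al. and a centralized comparison at the sink, one could flag clusters from which no sensor has reported; the names and the centralized formulation below are assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: flag corona/wedge clusters with no reporting sensors.
import math

def cluster_of(x, y, corona_width, n_wedges):
    """Map a sensor position (sink at the origin) to its (corona, wedge) cluster."""
    corona = int(math.hypot(x, y) // corona_width)
    wedge = int((math.atan2(y, x) % (2 * math.pi)) / (2 * math.pi / n_wedges))
    return corona, wedge

def empty_clusters(sensor_positions, n_coronas, n_wedges, corona_width):
    """Clusters of the corona/wedge grid from which no sensor has reported."""
    seen = {cluster_of(x, y, corona_width, n_wedges) for x, y in sensor_positions}
    return [(c, w) for c in range(n_coronas) for w in range(n_wedges)
            if (c, w) not in seen]

positions = [(0.5, 0.2), (1.8, -0.3), (-0.7, 1.1)]
print(empty_clusters(positions, n_coronas=3, n_wedges=4, corona_width=1.0))
```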


Molecules ◽  
2021 ◽  
Vol 26 (6) ◽  
pp. 1672
Author(s):  
Ysadora A. Mirabelli-Montan ◽  
Matteo Marangon ◽  
Antonio Graça ◽  
Christine M. Mayr Marangon ◽  
Kerry L. Wilkinson

Smoke taint has become a prominent issue for the global wine industry as climate change continues to affect the length and severity of fire seasons around the world. Although the issue has prompted a surge in research in recent years, no single solution has yet been identified that is capable of maintaining the quality of wine made from smoke-affected grapes. In this review, we summarize the main research on smoke taint, the key discoveries, as well as the prevailing uncertainties. We also examine methods for mitigating smoke taint in the vineyard, in the winery, and post production. We assess the effectiveness of remediation methods (proposed and actual) based on available research. Our findings are in agreement with previous studies, suggesting that the most viable remedies for smoke taint are still the commercially available activated carbon fining and reverse osmosis treatments, but that the quality of the final treated wines is fundamentally dependent on the initial severity of the taint. In this review, suggestions for future studies are introduced for improving our understanding of methods that have thus far only been preliminarily investigated. We select regions that have already been subjected to severe wildfires, and therefore to smoke taint (particularly Australia and California), as a case study to inform other wine-producing countries that will likely be impacted in the future, and suggest specific data collection and policy implementation actions that should be taken, even in countries that have not yet been impacted by smoke taint. Ultimately, we streamline the available information on the topic of smoke taint, apply it to a global perspective that considers the various stakeholders involved, and provide a launching point for further research on the topic.


Impact ◽  
2021 ◽  
Vol 2021 (7) ◽  
pp. 15-17
Author(s):  
Hirofumi Hamada

Education reform helps ensure that the education in a given country is of the highest possible quality and is a key area of focus for many developed countries. Japan's education system rates highly and the evolution of education reform is key to ensuring this high level is sustained. School principals play a key role in delivering high-quality education and, indeed, a school principal's leadership correlates with the quality of education available. This is an area of interest for Professor Hirofumi Hamada of the School Management Laboratory in the Faculty of Human Sciences, University of Tsukuba, Japan, who is currently exploring the institutional and organisational conditions that affect the leadership of principals. The goal of this research is to help shape education reform in Japan. Hamada believes it is necessary to create an environment of independent and collaborative learning and to value the individuality of children. In addition, problem situations among children are diverse and complex, and how schools respond influences the quality of education. Given that the principal is in charge of how a school is run, they play a vital role in assuring the quality of education. Key to Hamada's work is the idea that principals can share their knowledge and leadership with teachers, creating an environment of shared leadership. He believes that empowering teachers and encouraging them to take on leadership duties is essential. He is working to convince educators that schools require the leadership of principals and that principals should promote a distributed approach to leadership.


2012 ◽  
Vol 241-244 ◽  
pp. 3116-3120
Author(s):  
Xiao Mei Hu ◽  
Biao Wang

A Collaborative Virtual Environment (CVE) system supports a large number of users exploring a virtual world and interacting with each other over a network, so one of the key issues in the design of scalable CVE systems is the partitioning problem. Existing partitioning algorithms for CVE systems based on a multiple-server architecture, in our opinion, hardly consider the communication characteristics of the virtual environment. In this paper, we propose a new partitioning method based on area of interest (AOI) model matching to improve the quality of partitioning. The experimental results show, preliminarily, that our partitioning approach based on AOI model matching decreases the traffic among the servers in the system and improves partitioning performance.
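
The abstract does not detail how AOI model matching is performed. Purely as an illustration of the broader goal of AOI-aware partitioning, the sketch below greedily assigns map cells to servers so that users whose areas of interest overlap tend to be co-located, which reduces inter-server traffic; the cell grid, the pairwise overlap weights, and the greedy heuristic are assumptions, not the authors' method.

```python
# Illustrative sketch only: greedy, AOI-aware assignment of map cells to servers.
from collections import defaultdict
from itertools import combinations

def aoi_overlap_weights(users, radius):
    """Weight each pair of cells by how many user pairs have overlapping AOIs."""
    weight = defaultdict(int)
    for (ca, pa), (cb, pb) in combinations(users, 2):
        if ca != cb and (pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2 <= (2 * radius) ** 2:
            weight[frozenset((ca, cb))] += 1
    return weight

def partition(cells, users, n_servers, radius):
    """Greedily place each cell on the server that already holds its AOI partners."""
    weight = aoi_overlap_weights(users, radius)
    assignment, load = {}, [0] * n_servers
    for cell in cells:
        scores = [0.0] * n_servers
        for other, srv in assignment.items():
            scores[srv] += weight.get(frozenset((cell, other)), 0)
        best = max(range(n_servers), key=lambda s: scores[s] - 0.1 * load[s])
        assignment[cell] = best
        load[best] += 1
    return assignment

# users are (cell_id, (x, y)) pairs; cells is the list of map cells to distribute.
cells = ["c0", "c1", "c2", "c3"]
users = [("c0", (0, 0)), ("c0", (1, 1)), ("c1", (2, 1)), ("c2", (9, 9)), ("c3", (10, 9))]
print(partition(cells, users, n_servers=2, radius=2))
```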


Author(s):  
Karim Achour ◽  
Nadia Zenati ◽  
Oualid Djekoune

The reduction of blur and noise is an important task in image processing: these two types of degradation are undesirable components for high-level processing. In this paper, we propose an optimization method based on a neural network model for regularized image restoration. In this application we use a modified Hopfield neural network. We propose two algorithms using the modified Hopfield neural network with two updating modes: an algorithm with sequential updates and an algorithm with n-simultaneous updates. The quality of the results obtained attests to the efficiency of the proposed method when applied to several images degraded by blur and noise.
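
The abstract does not give the network equations. As a minimal sketch, assuming the usual mapping of regularized restoration, minimize ||y - Hx||^2 + lam*||Dx||^2, onto a quadratic Hopfield-style energy, the sequential update mode can be illustrated by coordinate-wise minimization over pixels; the matrices, sizes, and non-negativity clamp below are assumptions, not the authors' exact network.

```python
# Illustrative sketch only: sequential (pixel-by-pixel) minimization of a
# Hopfield-style quadratic energy for regularized restoration.
import numpy as np

def restore_sequential(H, D, y, lam, n_sweeps=50):
    """Coordinate-wise (sequential) minimization of ||y - Hx||^2 + lam*||Dx||^2."""
    n = H.shape[1]
    A = H.T @ H + lam * D.T @ D      # quadratic "weights" of the energy
    b = H.T @ y                      # linear "bias" term
    x = np.zeros(n)
    for _ in range(n_sweeps):        # sequential update mode: one neuron (pixel) at a time
        for i in range(n):
            r = b[i] - A[i] @ x + A[i, i] * x[i]
            x[i] = max(0.0, r / A[i, i])   # keep pixel intensities non-negative
    return x

# Toy 1-D example: mild blur with a 2-tap kernel and a first-difference smoothness prior.
n = 8
H = np.eye(n)
for i in range(n - 1):
    H[i, i + 1] = 0.5
D = np.eye(n) - np.eye(n, k=1)
x_true = np.linspace(0.0, 1.0, n)
y = H @ x_true + 0.01 * np.random.randn(n)
print(np.round(restore_sequential(H, D, y, lam=0.1), 3))
```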


2020 ◽  
Author(s):  
Xavier Couvelard ◽  
Christophe Messager ◽  
Pierrick Penven ◽  
Phillipe Lattes

Abstract The oceanic circulation south of Africa is characterized by complex dynamics and strong variability due to the presence of the Agulhas Current and numerous eddies. The area of interest of this paper is also the location of several natural gas fields beneath the seafloor which are targeted for drilling and exploitation. The complex and powerful ocean currents create significant issues for ship operations at the surface as well as below the surface for deep-sea operations. Therefore, knowledge of the state of the currents and the ability to forecast them realistically could greatly enhance the safety of various marine operations. With this objective, an array of HF radar systems was deployed to provide detailed knowledge of the Agulhas Current and its associated eddy activity. It is shown in this study that 4DVAR assimilation of HF radar data allows the surface circulation to be represented more realistically. Two kinds of experiments were performed: a one-month analysis and two-day forecasts. The one-month 4DVAR experiment was compared to geostrophic currents derived from altimetry and shows an important improvement of the geostrophic currents. Furthermore, despite the restricted size of the area covered by the HF radars, we show that the solution is improved almost over the whole domain, mainly upstream and downstream of the HF radar coverage area. We also show that, while the benefit of the assimilation on surface current intensity is significantly reduced within the first 6 hours of the forecast, the correction in direction persists beyond 48 hours.


2021 ◽  
Author(s):  
Kay Wilhelm ◽  
Tonelle Handley ◽  
Catherine McHugh ◽  
David Lowenstein ◽  
Kristy Arrold

BACKGROUND The internet is increasingly seen as an important source of health information for consumers and their families. Accessing information related to their illness and treatment enables consumers to discuss their health and treatments with their doctors more confidently, but the abundance of readily available information can also be confusing, making it hard to judge how reliable the information is as a basis for consumers, families and clinicians to participate in decisions about care. OBJECTIVE The current study aimed to rate the quality of websites providing psychosis-related information, using a validated instrument (DISCERN) and a purpose-developed Psychosis Website Quality Checklist (PWQC), to assess quality over time and aid professionals in directing consumers to the best available information. METHODS Entering the search terms ‘psychotic’, ‘psychosis’, ‘schizophrenia’, ‘delusion’ and ‘hallucination’ into the search engine Google (www.google.com.au) provided 25 websites, which were evaluated with DISCERN and PWQC by three health professionals from different disciplines at two time points, January-March 2014 and January-March 2018. RESULTS Only the six highest-ranked websites achieved DISCERN scores indicating “good” quality. The overall mean score of the websites was 43.96 (SD=12.08), indicating “fair” quality. PWQC ratings were high on “availability and usability” but poor on “credibility,” “currency,” and “breadth and accuracy”, with no substantial improvement in quality over time. Having an editorial/review process (56% of websites) was significantly associated with higher quality scores on both scales. CONCLUSIONS The quality of available information was ‘fair’ and had not significantly improved over time. While higher-quality websites exist, there is no easy way to assess this at face value. Having a readily identifiable editorial/review process was one indicator of website quality. CLINICALTRIAL Not applicable


Author(s):  
Besma Khalfi ◽  
Cyril De Runz ◽  
Herman Akdag

When analyzing spatial issues, geographers are often confronted with many problems concerning the uncertainty of the available information. These problems may affect the geometric or semantic quality of objects, and as a result only low precision can be assumed. It is therefore necessary to develop representation and modeling methods suited to the imprecise nature of geographic data. This recently led to the proposal of F-Perceptory for managing fuzzy geographic data modeling. Starting from the model described in Zoghlami et al. (2011), some limitations are identified: F-Perceptory does not manage fuzzy composite geographic objects. This paper proposes to enhance the approach by managing this type of object in the modeling and in its transformation to UML. At the technical level, commonly used object modeling tools do not take fuzzy data into account. The authors propose new functional modules integrated into an existing CASE tool.


2001 ◽  
Vol 21 (2) ◽  
pp. 36-38 ◽  
Author(s):  
M MacKlin

Heart failure is a common reason for admission to the hospital and to critical care units. The care of patients with heart failure is changing almost daily as new research and therapies become available. Nurses caring for these patients must use available information and assessment findings to discern which type of heart failure exists in each patient. In this way, the care provided can be enhanced, and outcomes can be optimized. Critically thinking nurses can positively influence patients' quality of life and potentially reduce the devastating morbidity and mortality associated with heart failure.

