Web Application for Atmospheric Aerosol Data Management: Software and Case Study in the Spanish Network on Environmental Differential Mobility Analysers

Atmosphere ◽  
2019 ◽  
Vol 10 (5) ◽  
pp. 279 ◽  
Author(s):  
Javier Andrade-Garda ◽  
Sonia Suárez-Garaboa ◽  
Antonio Álvarez-Rodríguez ◽  
María Piñeiro-Iglesias ◽  
Purificación López-Mahía ◽  
...  

SCALA© (Sampling Campaigns for Aerosols in the Low Atmosphere) is a web-based software system developed in a multidisciplinary manner to provide integrated support for the documentation, management, and analysis of atmospheric aerosol data from sampling campaigns. The software development process combined prototyping and evolutionary approaches. SCALA© allows for comprehensive management of the sampling campaign life cycle (management of the profiles and processes involved in the start-up, development, and closure of a campaign) and provides support for both intra- and inter-campaign data analysis. The pilot deployment of SCALA© covers the Spanish Network on Environmental Differential Mobility Analysers (DMAs) (REDMAAS) and the PROACLIM project. This research project involves, among other objectives, the study of temporal and spatial variations of the atmospheric aerosol through a set of microphysical properties (size distribution, optical properties, hygroscopicity, etc.) measured at several locations in Spain. The main conclusions regarding size distribution are presented in this work. These were extracted through SCALA© from the data collected in the REDMAAS 2015 and 2019 intercomparison campaigns and from two years (2015 and 2016) of measurements with two Scanning Mobility Particle Sizers (SMPS) at the CIEMAT (Madrid, central Spain) and UDC (A Coruña, NW Spain) sites.
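As a rough illustration of the inter-campaign analysis SCALA© supports, the following Python sketch compares median SMPS size distributions from two sites. The file names and column layout are illustrative assumptions, not SCALA©'s actual data format.

```python
# Minimal sketch (not SCALA itself): an inter-campaign comparison of SMPS
# size distributions, assuming each campaign exports a CSV with one row per
# scan and one column per mobility-diameter bin (dN/dlogDp, cm^-3).
# File names and column layout are illustrative assumptions.
import pandas as pd

def median_size_distribution(csv_path: str) -> pd.Series:
    """Median dN/dlogDp per diameter bin over all scans in a campaign file."""
    scans = pd.read_csv(csv_path, index_col="timestamp")
    return scans.median(axis=0)

ciemat = median_size_distribution("ciemat_smps_2015.csv")  # Madrid site
udc = median_size_distribution("udc_smps_2015.csv")        # A Coruna site

# Per-bin ratio between sites: a crude inter-campaign comparison.
print((ciemat / udc).describe())
```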

2010 ◽  
Vol 3 (6) ◽  
pp. 5521-5587 ◽  
Author(s):  
A. Wiedensohler ◽  
W. Birmili ◽  
A. Nowak ◽  
A. Sonntag ◽  
K. Weinhold ◽  
...  

Abstract. Particle mobility size spectrometers, often referred to as DMPS (Differential Mobility Particle Sizers) or SMPS (Scanning Mobility Particle Sizers), have found wide application in atmospheric aerosol research. However, the comparability of measurements conducted world-wide is hampered by the lack of generally accepted technical standards for the instrumental set-up, measurement mode, data evaluation, and quality control. This article results from several instrument intercomparison workshops conducted within the European infrastructure project EUSAAR (European Supersites for Atmospheric Aerosol Research). Under controlled laboratory conditions, the number size distributions from 20 to 200 nm determined by mobility size spectrometers of different designs agree within an uncertainty range of ±10% after correction for internal particle losses, while below and above this size range the discrepancies increase. Instruments of identical design agreed within ±3% in the peak number concentration when all settings were made carefully. Technical standards were developed as a minimum requirement for mobility size spectrometry in atmospheric aerosol measurements. Technical recommendations are given for atmospheric measurements, including continuous monitoring of flow rates, temperature, pressure, and relative humidity of the sheath and sample air in the differential mobility analyser. In cooperation with EMEP (European Monitoring and Evaluation Programme), a new uniform data structure was introduced for saving and disseminating the data within EMEP. This structure contains three levels: raw data, processed data, and final particle size distributions. Importantly, we recommend reporting raw measurements, including all relevant instrument parameters, together with complete documentation of all data transformation and correction steps. These technical and data-structure standards aim to enhance the quality of long-term size distribution measurements, their comparability between different networks and sites, and their transparency and traceability back to raw data.
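To make the three-level structure concrete, here is a minimal Python sketch of how such a hierarchy might be represented. The field names are illustrative assumptions, not the actual EMEP/EBAS format.

```python
# A minimal sketch of the three-level data structure described above
# (raw data, processed data, final particle number size distributions).
# Field names are illustrative assumptions, not the EMEP/EBAS format itself.
from dataclasses import dataclass, field

@dataclass
class Level0Raw:
    """Raw mobility distributions plus the instrument parameters needed to
    reprocess them (flows, T, p, RH of sheath and sample air)."""
    mobility_counts: list[list[float]]
    sheath_flow_lpm: list[float]
    sample_flow_lpm: list[float]
    temperature_k: list[float]
    pressure_hpa: list[float]
    rel_humidity_pct: list[float]

@dataclass
class Level1Processed:
    """Inverted number size distributions before final corrections."""
    diameters_nm: list[float]
    dndlogdp: list[list[float]]
    corrections_applied: list[str] = field(default_factory=list)

@dataclass
class Level2Final:
    """Final size distributions, corrected e.g. for internal particle
    losses, with documentation of every transformation step."""
    diameters_nm: list[float]
    dndlogdp: list[list[float]]
    processing_log: list[str] = field(default_factory=list)
```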


2012 ◽  
Vol 2 (2) ◽  
pp. 112-116
Author(s):  
Shikha Bhatia ◽  
Mr. Harshpreet Singh

With the mounting demand for web applications, a number of issues related to their quality have come into existence. In the field of web applications, it is very difficult to develop high-quality software. A design pattern is a general, repeatable solution to a commonly occurring problem in software design. It should be noted that a design pattern is not a finished product that can be directly transformed into source code; rather, it is a description or template for how to solve a problem that can be reused in many different situations. Past research has shown that design patterns can greatly improve the execution speed of a software application. Design patterns are classified as creational, structural, behavioral, and so on. The MVC (Model-View-Controller) design pattern is very productive for architecting interactive software systems and web applications. This design pattern is partition-independent, because it is expressed in terms of an interactive application running in a single address space. We will design and analyze an algorithm using the MVC approach to improve the performance of web-based applications. The objective of our study is to reduce coupling, one of the major object-oriented metrics, between the model and view segments of a web-based application, as sketched below. The implementation will be done using the .NET framework.
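The following Python sketch illustrates the decoupling in question (the paper's own implementation uses the .NET framework): the model exposes only an observer interface, so it never references a concrete view.

```python
# A minimal sketch of the MVC decoupling the study targets: the model
# publishes change events and never imports a concrete view, reducing
# model-view coupling to a one-way observer interface.
from typing import Callable

class Model:
    def __init__(self) -> None:
        self._value = 0
        self._observers: list[Callable[[int], None]] = []

    def subscribe(self, callback: Callable[[int], None]) -> None:
        self._observers.append(callback)

    def set_value(self, value: int) -> None:
        self._value = value
        for notify in self._observers:  # model knows callbacks, not views
            notify(value)

class View:
    def render(self, value: int) -> None:
        print(f"current value: {value}")

class Controller:
    """Wires user input to the model; the model never sees the view type."""
    def __init__(self, model: Model, view: View) -> None:
        self.model = model
        model.subscribe(view.render)

    def handle_input(self, value: int) -> None:
        self.model.set_value(value)

controller = Controller(Model(), View())
controller.handle_input(42)  # prints "current value: 42"
```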


2019 ◽  
Author(s):  
Ruslan N. Tazhigulov ◽  
James R. Gayvert ◽  
Melissa Wei ◽  
Ksenia B. Bravaya

eMap is a web-based platform for identifying and visualizing electron or hole transfer pathways in proteins based on their crystal structures. The underlying model can be viewed as a coarse-grained version of the Pathways model, where each tunneling step between hopping sites represented by electron transfer active (ETA) moieties is described with one effective decay parameter that captures protein-mediated tunneling. ETA moieties include aromatic amino acid residue side chains and aromatic fragments of cofactors, which are detected automatically, as well as electron/hole residing sites that can be specified by the users. The software searches for the shortest paths connecting the user-specified electron/hole source to either all surface-exposed ETA residues or to a user-specified target. The identified pathways are ranked by their length and visualized in 2D as a graph, in which each node represents an ETA site, and in 3D using available protein visualization tools. Here, we present the capabilities and user interface of eMap 1.0, which is available at https://emap.bu.edu.
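The description above suggests a weighted shortest-path search over a graph of ETA sites. The following sketch, not eMap's actual code, illustrates the idea with networkx; the residue names and decay penalties are hypothetical.

```python
# A minimal sketch of the graph search implied above: nodes are ETA sites,
# each edge carries one effective decay penalty, and Dijkstra's algorithm
# ranks source-to-target pathways by total length.
import networkx as nx

g = nx.Graph()
# Hypothetical ETA sites and per-step decay penalties (arbitrary units).
g.add_edge("FAD", "TRP-382", weight=1.2)
g.add_edge("TRP-382", "TRP-359", weight=0.9)
g.add_edge("TRP-359", "TRP-306", weight=1.1)  # surface-exposed
g.add_edge("FAD", "TYR-224", weight=2.5)      # surface-exposed

surface_exposed = ["TRP-306", "TYR-224"]

# Shortest path from the user-specified source to each surface-exposed
# site, ranked by total path length, as the abstract describes.
paths = sorted(
    (nx.dijkstra_path_length(g, "FAD", t), nx.dijkstra_path(g, "FAD", t))
    for t in surface_exposed
)
for length, path in paths:
    print(f"{length:.2f}: {' -> '.join(path)}")
```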


2021 ◽  
pp. 193229682098557
Author(s):  
Alysha M. De Livera ◽  
Jonathan E. Shaw ◽  
Neale Cohen ◽  
Anne Reutens ◽  
Agus Salim

Motivation: Continuous glucose monitoring (CGM) systems are an essential part of novel technology in diabetes management and care. CGM studies have become increasingly popular among researchers, healthcare professionals, and people with diabetes due to the large amount of useful information that can be collected with CGM systems. The analysis of the data from these studies for research purposes, however, remains a challenge due to the characteristics and large volume of the data. Results: Currently, there are no publicly available interactive software applications that can perform statistical analysis and visualization of data from CGM studies. With the rapidly increasing popularity of CGM studies, such an application is becoming necessary for anyone who works with these large CGM datasets, in particular for those with little background in programming or statistics. CGMStatsAnalyser is a publicly available, user-friendly, web-based application that can be used to interactively visualize, summarize, and statistically analyze voluminous and complex CGM datasets together with the subject characteristics.
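As a hint of the kind of summary such an application automates, here is a minimal pandas sketch computing per-subject mean glucose and time in range; the file and column names are assumptions, not CGMStatsAnalyser's interface.

```python
# A minimal sketch of a typical CGM summary (not CGMStatsAnalyser's code):
# per-subject mean glucose and time in range from a long-format table with
# assumed columns subject_id, time, glucose_mmol.
import pandas as pd

cgm = pd.read_csv("cgm_readings.csv")

def time_in_range(glucose: pd.Series, low: float = 3.9, high: float = 10.0) -> float:
    """Fraction of readings within the consensus 3.9-10.0 mmol/L range."""
    return glucose.between(low, high).mean()

summary = cgm.groupby("subject_id")["glucose_mmol"].agg(
    mean_glucose="mean",
    sd_glucose="std",
    tir=time_in_range,
)
print(summary)
```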


2018 ◽  
Vol 7 (4.15) ◽  
pp. 130
Author(s):  
Emil Semastin ◽  
Sami Azam ◽  
Bharanidharan Shanmugam ◽  
Krishnan Kannoorpatti ◽  
Mirjam Jonokman ◽  
...  

The contemporary business world has incorporated Web Services and Web Applications into the core of its operating cycle, and security plays a major role in the amalgamation of such services and applications with business needs worldwide. OWASP (the Open Web Application Security Project) states that the effectiveness of the security mechanisms in a web application can be estimated by evaluating its degree of vulnerability to each of the OWASP Top Ten vulnerabilities. This paper sheds light on a number of existing tools that can be used to test for the CSRF (Cross-Site Request Forgery) vulnerability. The main objective of the research is to identify the available solutions to prevent CSRF attacks. By analyzing the techniques employed in each of the solutions, the optimal tool can be identified. Tests against exploitation of the vulnerabilities were conducted after implementing the solutions into the web application to check the efficacy of each solution. The research also proposes a combined solution that integrates the passing of an unpredictable token through a hidden field, validated on the server side, with the passing of the token through the URL, as sketched below.
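A minimal sketch of the proposed combined defence, written here with Flask purely for illustration (the paper does not prescribe a framework): the server issues one unpredictable token, embeds it both in a hidden form field and in the form's action URL, and rejects any request where either copy fails validation.

```python
# Sketch of the combined CSRF defence: an unpredictable per-session token
# carried in a hidden form field AND in the URL, both checked server-side.
import secrets
from flask import Flask, request, session, abort

app = Flask(__name__)
app.secret_key = secrets.token_hex(32)

@app.route("/form")
def form():
    token = secrets.token_urlsafe(32)  # unpredictable token
    session["csrf_token"] = token
    return (f'<form method="post" action="/submit?token={token}">'
            f'<input type="hidden" name="csrf_token" value="{token}">'
            f'<input type="submit"></form>')

@app.route("/submit", methods=["POST"])
def submit():
    expected = session.get("csrf_token")
    # Both channels must carry the same server-issued token.
    if not expected or request.form.get("csrf_token") != expected \
            or request.args.get("token") != expected:
        abort(403)
    return "ok"
```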


2020 ◽  
pp. 5-9
Author(s):  
Manasvi Srivastava ◽  
Vikas Yadav ◽  
Swati Singh ◽  
...

The Internet is the largest source of information created by humanity. It contains a variety of materials available in various formats such as text, audio, video, and much more. Web scraping is one way to access all of it: a set of techniques for obtaining information from websites instead of copying the data manually. Many web-based data extraction methods are designed to solve specific problems and work on ad-hoc domains. Various tools and technologies have been developed to facilitate web scraping. Unfortunately, the appropriateness and ethics of using these web scraping tools are often overlooked. There are hundreds of web scraping programs available today, most of them designed for Java, Python, and Ruby, spanning both open-source and commercial software. Web-based tools such as YahooPipes, Google Web Scrapers, and the OutWit extension for Firefox are good starting points for beginners in web scraping. Web extraction is basically used to replace the manual extraction-and-editing process and to provide an easy and better way to collect data from a web page, convert it into the desired format, and save it to a local or archive directory. In this paper, among the kinds of scraping, we focus on techniques that extract the content of a Web page. In particular, we apply scraping techniques to a variety of diseases with their own symptoms and precautions, as sketched below.
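As a concrete example of such content extraction, the following sketch fetches a page and pulls out the items of a symptoms list using requests and BeautifulSoup; the URL and element id are illustrative assumptions.

```python
# A minimal sketch of the content-extraction technique discussed above.
# The URL and CSS id are illustrative assumptions; real pages need their
# own selectors, and robots.txt and terms of use should be respected.
import requests
from bs4 import BeautifulSoup

def scrape_symptoms(url: str) -> list[str]:
    """Fetch a page and extract the text of each list item under a
    hypothetical 'symptoms' section."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    section = soup.find(id="symptoms")  # assumed element id
    if section is None:
        return []
    return [li.get_text(strip=True) for li in section.find_all("li")]

print(scrape_symptoms("https://example.org/disease/influenza"))
```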


2014 ◽  
Vol 102 (1) ◽  
pp. 69-80 ◽  
Author(s):  
Torregrosa Daniel ◽  
Forcada Mikel L. ◽  
Pérez-Ortiz Juan Antonio

Abstract We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based, computer-generated suggestions as they type. Most ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach: suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
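The resource-agnostic approach can be sketched as follows: translations of source subsegments, obtained from any black-box bilingual resource, become suggestions whenever they extend what the translator has typed. This toy Python sketch (not Forecat's code) illustrates the idea; the `translate` callable stands in for the unmodified resource.

```python
# Toy sketch of resource-agnostic interactive translation prediction:
# translate source subsegments with a black-box resource and suggest
# those that extend the word the translator is currently typing.
from typing import Callable

def subsegments(tokens: list[str], max_len: int = 3) -> list[str]:
    """All contiguous source subsegments up to max_len tokens."""
    return [" ".join(tokens[i:i + n])
            for n in range(1, max_len + 1)
            for i in range(len(tokens) - n + 1)]

def suggest(source: str, typed_prefix: str,
            translate: Callable[[str], str]) -> list[str]:
    """Suggestions whose translation extends what the translator typed."""
    last_word = typed_prefix.rsplit(" ", 1)[-1]
    candidates = {translate(s) for s in subsegments(source.split())}
    return sorted(c for c in candidates
                  if c.lower().startswith(last_word.lower()) and c != last_word)

# Toy black-box resource: a lookup table standing in for an MT system.
toy = {"la": "the", "casa": "house", "verde": "green",
       "la casa": "the house", "casa verde": "green house"}
print(suggest("la casa verde", "the h", lambda s: toy.get(s, "")))
# prints ['house']
```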

