loss of information
Recently Published Documents


TOTAL DOCUMENTS: 299 (last five years: 64)
H-INDEX: 22 (last five years: 3)

Author(s): Husna Sarirah Husin

Messaging applications are among the largest and most popular categories of smartphone applications. They allow users to communicate with one another via text messages, photos, and files, and all of these messages need to be safeguarded. Privacy is one of the biggest issues that most users of instant messaging ignore. Although several instant messaging applications offer varying levels of security, the weaknesses and the danger of data attacks are increasing. Data must be protected not only in business conversations but also in everyday ones, since data is sensitive for everybody and its protection is crucial to prevent undesired loss of information. To address these weaknesses and the hazards associated with data attacks, an encrypted messaging protocol and a method for hiding the IP address are required for safe interaction. The goal of this paper is to protect conversations from targeted attackers by securing the communication between users and hiding their IP addresses from unauthorized access.


2021, Vol 21 (1)
Author(s): Márcio A. Diniz, Gillian Gresham, Sungjin Kim, Michael Luu, N. Lynn Henry, ...

Background: Graphical displays and data visualization are essential components of statistical analysis that can lead to improved understanding of clinical trial adverse event (AE) data. Correspondence analysis (CA) was introduced decades ago as a multivariate technique that can communicate AE contingency tables using two-dimensional plots, while quantifying the loss of information, as do other dimension-reduction techniques such as principal component and factor analysis.

Methods: We propose the application of stacked CA using contribution biplots as a tool to explore differences in AE data among treatments in clinical trials. We defined five levels of refinement for the analysis based on data derived from the Common Terminology Criteria for Adverse Events (CTCAE) grades, domains, terms and their combinations. In addition, we developed a Shiny app built into an R package, visae, publicly available on the Comprehensive R Archive Network (CRAN), to interactively investigate CA configurations based on the contribution to the explained variance and the relative frequency of AEs. Data from two randomized controlled trials (RCTs) were used to illustrate the proposed methods: NSABP R-04, a neoadjuvant rectal 2 × 2 factorial trial comparing radiation therapy with either capecitabine (Cape) or 5-fluorouracil (5-FU), alone or with oxaliplatin (Oxa), and NSABP B-35, a double-blind RCT comparing tamoxifen to anastrozole in postmenopausal women with hormone-positive ductal carcinoma in situ.

Results: In the R-04 trial (n = 1308), CA biplots displayed the discrepancies between single-agent treatments and their combinations with Oxa at all levels of AE classes, such that these discrepancies were responsible for the largest portion of the explained variability among treatments. In addition, an interaction effect when adding Oxa to Cape/5-FU was identified: the distance between Cape + Oxa and 5-FU + Oxa was larger than the distance between 5-FU and Cape, with Cape + Oxa and 5-FU + Oxa in different quadrants of the CA biplots. In the B-35 trial (n = 3009), CA biplots showed different patterns for non-adherent anastrozole and tamoxifen compared with their adherent counterparts.

Conclusion: CA with contribution biplots is an effective tool for summarizing AE data in a two-dimensional display while minimizing the loss of information and easing interpretation.
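
As a rough illustration of the underlying technique, the sketch below runs classical correspondence analysis on a toy AE contingency table via the singular value decomposition; the paper's stacked CA and contribution biplots (implemented in the authors' visae R package) involve additional steps, and the table values here are invented.

```python
import numpy as np

def correspondence_analysis(N):
    """Classical correspondence analysis of a contingency table N (rows x cols).

    Returns principal row/column coordinates and the proportion of inertia
    (explained variance) carried by each dimension.
    """
    P = N / N.sum()                          # correspondence matrix
    r = P.sum(axis=1)                        # row masses
    c = P.sum(axis=0)                        # column masses
    # Standardized residuals from the independence model
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    F = (U * sv) / np.sqrt(r)[:, None]       # principal row coordinates
    G = (Vt.T * sv) / np.sqrt(c)[:, None]    # principal column coordinates
    inertia = sv**2 / (sv**2).sum()          # share of variance per dimension
    return F, G, inertia

# Toy AE table: rows = treatments, columns = AE grades 1-3 (made-up counts)
N = np.array([[40, 25, 10],
              [35, 30, 20],
              [20, 35, 30]], dtype=float)
F, G, inertia = correspondence_analysis(N)
print("Inertia captured by a 2D biplot: %.1f%%" % (100 * inertia[:2].sum()))
```

The printed share of inertia is exactly the "loss of information" the abstract refers to: whatever the first two dimensions do not capture is sacrificed by the two-dimensional display.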


Author(s): Ji Seung Yang, Brian MacWhinney, Nan Bernstein Ratner

Purpose: The Index of Productive Syntax (IPSyn) is a well-known language sample analysis tool. However, its psychometric properties have not been assessed across a wide sample of typically developing preschool-age children and children with language disorders. We sought to determine the profile of IPSyn scores by age over early childhood. We additionally explored whether the IPSyn could be shortened to fewer items without loss of information, and whether the required language sample could be shortened from the currently required 100 utterances to 50.

Method: We used transcripts from the Child Language Data Exchange System, including 1,051 samples of adult–child conversational play with toys, within the theoretical framework of item response theory. Samples included those from typically developing children as well as children with hearing loss, Down syndrome, and late language emergence.

Results: The Verb Phrase and Sentence Structure subscales showed more stable developmental trajectories over the preschool years, and greater differentiation between typical and atypical cohorts, than did the Noun Phrase and Question/Negation subscales. A number of current IPSyn scoring items can be dropped without loss of information, and 50-utterance samples demonstrate most of the same psychometric properties as longer samples.

Discussion: Our findings suggest ways in which the IPSyn can be automated and streamlined (the proposed IPSyn-C) so as to provide useful clinical guidance with fewer items and a shorter required language sample. Reference values for the IPSyn-C are provided. Trajectories for one subscale (Question/Negation) appear inherently unstable and may require structured elicitation. Potential limitations, ramifications, and future directions are discussed.

Supplemental Material: https://doi.org/10.23641/asha.16915690
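
The abstract does not spell out which IRT model was fitted; as a hedged illustration of the usual criterion for dropping items "without loss of information", the snippet below computes the Fisher item information function under a standard two-parameter logistic (2PL) model. The item parameters are made up.

```python
import numpy as np

def item_information_2pl(theta, a, b):
    """Fisher information of a 2PL IRT item at ability level(s) theta.

    a: discrimination, b: difficulty. I(theta) = a^2 * P * (1 - P),
    where P is the model probability of exhibiting the scored behavior.
    """
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

theta = np.linspace(-3, 3, 61)                # grid of ability levels
items = {"informative item": (1.8, 0.0),      # high discrimination
         "weak item": (0.3, 1.0)}             # low discrimination
for name, (a, b) in items.items():
    info = item_information_2pl(theta, a, b)
    print(f"{name}: peak information = {info.max():.3f}")
```

Items whose information curve stays near zero across the ability range of interest contribute almost nothing to measurement precision, which is what licenses removing them from a shortened instrument.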


Author(s): G. S. Floros, C. Ellul

Abstract. Modern cities will have a catalytic role in regulating global economic growth and development, highlighting their role as centres of economic activity. As urbanisation follows, the built environment is under pressure to meet the rapidly increasing demand for buildings as well as for safe, resilient and sustainable transportation infrastructure. Transportation infrastructure has a unique characteristic: it is interconnected, and it is therefore essential for stakeholders to be able to capture, analyse and visualise these interlinked relationships efficiently and effectively. This requirement is addressed by an Asset Information Management System (AIMS), which enables the capture of such information from the early stages of a transport infrastructure construction project. Building Information Modelling (BIM) and Geographic Information Science/Systems (GIS) are two domains which facilitate the authoring, management and exchange of asset information by providing the location underpinning, both in the short term and through the very long lifespan of the infrastructure. These systems are not interoperable by nature, and extensive Extract/Transform/Load (ETL) procedures are required when developing an integrated location-based Asset Management system, with consequent loss of information. The purpose of this paper is to provide insight into the information lifecycle during Design and Construction on a Highways Project, focusing on identifying the stages in which loss of information can impact decision-making during operational Asset Management: (i) 3D Model to IFC, (ii) IFC to AIM, and (iii) IFC to 3D GIS for AIM. The discussion highlights the significance of custom property sets and classification systems in bridging the different data structures, as well as the power of 3D in visualising Asset Information, with future work focusing on the potential of early BIM-GIS integration for operational AM.
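
To make the ETL loss concrete, here is a minimal, library-free sketch of the kind of property-set mapping step the paper examines; every mapping key, property name and value below is hypothetical, and a real pipeline would operate on IFC files rather than Python dictionaries.

```python
# Hypothetical ETL step: map IFC property-set entries onto a flat GIS
# attribute row, logging any properties with no mapping (i.e., information
# that would be silently lost in the IFC-to-GIS stage).
IFC_TO_GIS = {                                 # hypothetical mapping table
    "Pset_Asset.AssetID": "asset_id",
    "Pset_Asset.InstallDate": "install_date",
}

def map_properties(ifc_props):
    gis_row, dropped = {}, []
    for key, value in ifc_props.items():
        target = IFC_TO_GIS.get(key)
        if target is None:
            dropped.append(key)                # lost in this ETL stage
        else:
            gis_row[target] = value
    return gis_row, dropped

row, lost = map_properties({
    "Pset_Asset.AssetID": "BR-0042",
    "Pset_Asset.InstallDate": "2019-06-01",
    "Pset_Asset.WarrantyYears": 10,            # no mapping -> dropped
})
print("GIS row:", row)
print("Lost in translation:", lost)
```

Extending the mapping table with custom property sets, as the paper suggests, is precisely what shrinks the "lost" list at each stage.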


2021, Vol 4 (2(112)), pp. 6-17
Author(s): Vladimir Barannik, Serhii Sidchenko, Dmitriy Barannik, Sergii Shulgin, Valeriy Barannik, ...

Along with the widespread use of digital images, an urgent scientific and applied issue has arisen: the need to reduce the volume of video information while keeping it confidential and reliable. Cryptocompression coding methods can resolve this issue; however, no existing method covers all of the processing steps. This paper reports the development of a conceptual method for the cryptocompression coding of images on a differentiated basis without loss of information quality. It involves a three-cascade technology for the generation of cryptocompression codograms. The first two cascades generate code structures for the information components, ensuring their confidentiality, together with key elements as a service component. In the third cascade, it is proposed to manage the confidentiality of the service component. The code values for the information components, of non-deterministic length, are derived from a non-deterministic number of elements of the source video data in a reduced dynamic range. The generation of service data is organized in blocks of the initial images with a dimension of 16×16 elements. The method reduces the volume of source images during the generation of cryptocompression codograms by 1.14–1.58 times (12–37 %), depending on the degree of their saturation. This is 12.7–23.4 % better than TIFF technology and 9.6–17.9 % better than PNG technology. The volume of the service component of cryptocompression codograms is 1.563 % of the volume of the source video data, or no more than 2.5 % of the total code stream. That reduces the amount of data to be encrypted by up to 40 times compared with TIFF and PNG technologies. The devised method does not introduce errors into the data during coding and belongs to the class of methods without loss of information quality.
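
The reported 40-fold reduction follows directly from the quoted share of the service component; a quick back-of-the-envelope check, using only the percentages given in the abstract:

```python
# Illustrative check of the reported encryption-volume reduction, using the
# figures quoted in the abstract (not a reimplementation of the method).
service_share = 0.025    # service component: <= 2.5% of the total code stream
full_share = 1.0         # TIFF/PNG baseline: the whole stream is encrypted

reduction = full_share / service_share
print(f"Data to encrypt shrinks by up to {reduction:.0f}x")   # -> 40x
```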


2021, Vol 17
Author(s): Nikolaos Naziris, Maria Chountoulesi, Stavros Stavrinides, Michael Hanias, Costas Demetzos

Background: Natural and living systems are dynamical systems that demonstrate complex behavior, which appears to be deterministic chaotic, characterized and governed by entropy increase and loss of information throughout their entire lifespan. Lipidic nanoparticles, such as liposomes, acting as artificial biomembranes, have long been considered appropriate models for studying various membrane phenomena that cell systems exhibit. By utilizing these models, we can better comprehend cellular functions and stability, as well as factors that might alter cell physiology and lead to severe disease states. In addition, liposomes are well-established drug and vaccine delivery nanosystems that are present in the market and play a significant role; issues concerning their effectiveness and stability are therefore research topics that are constantly investigated and updated.

Methods: In this study, the emergent deterministic chaotic behavior of liposomes is described, and their colloidal physical stability is evaluated by utilizing established nonlinear-dynamics tools. Two liposomes of different composition and physical stability were developed, and a chaos analysis of the time series of their size and polydispersity was conducted.

Results: The utilized models revealed instability, loss of information and loss of order for both liposomes over time, with important differences between them. An initial interpretation of the results is provided, and the foundations are laid for further investigating possible exploitation of the demonstrated nonlinearity and adaptability of artificial biomembranes, with projection onto biosystems.

Conclusion: The present approach is expected to impact the application of lipidic nanoparticles and liposomes in various crucial fields, such as drug and vaccine delivery, providing useful information for academia and industry.
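
The abstract does not name its specific tools; as one standard nonlinear-dynamics diagnostic that could be applied to such size/polydispersity time series, the sketch below computes sample entropy, a measure of signal irregularity, on synthetic data (the signals are invented for illustration).

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1D time series: a standard measure of disorder
    (not necessarily the authors' exact tool). Higher values indicate a
    less predictable, more disordered signal."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()                       # tolerance for a "match"
    def count_matches(m):
        templates = np.array([x[i:i + m] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all templates
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1       # exclude the self-match
        return count
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))        # ordered signal
noisy = regular + 0.5 * rng.standard_normal(500)         # degraded signal
print(f"SampEn regular: {sample_entropy(regular):.2f}")
print(f"SampEn noisy:   {sample_entropy(noisy):.2f}")    # larger -> more disorder
```

Applied to repeated size measurements of a liposome formulation, a rising sample entropy over storage time would be one quantitative signature of the loss of order the abstract describes.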


2021
Author(s): Ruchi Gupta, Courtney N Day, W Oliver Tobin, Cynthia S Crowson

Many neuro-oncology studies assess the association between a prognostic factor (predictor) and a disease or outcome, such as the association between age and glioma. Predictors can be continuous (e.g., age) or categorical (e.g., race/ethnicity). Effects of categorical predictors are frequently easier to visualize and interpret than effects of continuous variables. This makes it an attractive, and seemingly justifiable, option to subdivide continuous predictors into categories (e.g., age < 50 years vs. age ≥ 50 years). However, this approach results in loss of information (and power) compared to the continuous version. This review outlines the use cases for continuous and categorized predictors and provides tips and pitfalls for the interpretation of these approaches.
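
A small simulation (an assumed setup, not taken from the review) makes the power loss visible: fitting a logistic model with age kept continuous versus dichotomized at 50 years typically yields a markedly larger test statistic for the continuous version.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
age = rng.uniform(30, 80, n)
p = 1 / (1 + np.exp(-(-4 + 0.06 * age)))     # true risk rises smoothly with age
y = (rng.random(n) < p).astype(float)        # simulated binary outcome

fit_cont = sm.Logit(y, sm.add_constant(age)).fit(disp=0)
fit_cat = sm.Logit(y, sm.add_constant((age >= 50).astype(float))).fit(disp=0)

# The continuous model usually recovers the effect with a larger z-statistic,
# i.e., more power from the same data.
print(f"z, continuous age:   {fit_cont.tvalues[1]:.1f}")
print(f"z, dichotomized age: {fit_cat.tvalues[1]:.1f}")
```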


2021, Vol 10 (9), pp. e57710918243
Author(s): Amanda Letícia Abegg da Silveira, Gustavo Marques e Amorim, Dhonatan Diego Pessi, Normandes Matos da Silva, Camila Leonardo Mioto, ...

The main objective of this study is to characterize the seismic activity captured at the Santo Antônio do Leverger seismic station (SALV), state of Mato Grosso, Brazil. This station was chosen because of its location in the northern portion of the Pantanal Sedimentary Basin, historically considered one of the seismogenic regions of Brazil. The methodology involved downloading the files through the poet.py script, in which events within 440 km of the SALV station were identified, selected and stored. Once events were confirmed, filters were applied using the Butterworth method, which minimizes the loss of information in the recorded events. Most events disagreed with the applied filters, demonstrating that there is a structure that influences the propagation of the wave, changing its original speed and frequency. The triggering method was applied to the behavior of seismic waves that crossed the sedimentary interval of the Pantanal Basin. The results were plotted on maps, overlaid on the sub-regions of the Pantanal using the QGIS geoprocessing software, in which the geological contacts were determined. It was thus possible to observe that each sub-region presents a different profile for each event, with sub-regions that tend to increase the signal's acceleration, maintain it, or slow it down.
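
For readers unfamiliar with the named filter, a generic Butterworth band-pass applied to a toy seismic trace might look like the SciPy sketch below; the corner frequencies, sampling rate and signal are assumptions for illustration, not the study's settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                       # sampling rate in Hz (assumed)
low, high = 1.0, 10.0            # band of interest in Hz (assumed)
# 4th-order Butterworth band-pass, with corner frequencies normalized
# to the Nyquist frequency fs/2
b, a = butter(N=4, Wn=[low / (fs / 2), high / (fs / 2)], btype="bandpass")

t = np.arange(0, 10, 1 / fs)
trace = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)  # toy trace
filtered = filtfilt(b, a, trace)   # zero-phase filtering preserves arrival times
print(f"RMS before: {trace.std():.2f}, after: {filtered.std():.2f}")
```

The Butterworth design is maximally flat in its passband, which is why it alters in-band amplitudes as little as possible, i.e., minimizes the loss of information in the retained frequency band.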


2021, Vol 11 (14), pp. 6405
Author(s): Pere Marti-Puig, Alejandro Bennásar-Sevillá, Alejandro Blanco-M., Jordi Solé-Casals

Today, the use of SCADA data for predictive maintenance and forecasting of wind turbines in wind farms is gaining popularity due to the low cost of this solution compared to others that require the installation of additional equipment. SCADA data provide four statistical measures (mean, standard deviation, maximum value, and minimum value) for hundreds of wind turbine magnitudes, usually over a 5-min or 10-min interval. Several studies have analysed the loss of information associated with reducing the sampling resolution from four seconds to five minutes, or with compressing a time series recorded at 5-min intervals to 10-min intervals, concluding that some, but not all, of these magnitudes are seriously affected. However, to our knowledge, there are no studies on increasing the time interval beyond 10 min when taking these four statistical values, or on how this aggregation affects prognosis models. Our work shows that, despite the irreversible loss of information that occurs in the first 5-min aggregation, increasing the interval over which the four representative statistical values are taken improves the performance of the predicted targets in normality models.
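
A minimal pandas sketch of the aggregation being studied, assuming an illustrative column name and raw sampling rate: the four statistics are taken per interval, first at the usual 10 min and then at a coarser 60 min.

```python
import numpy as np
import pandas as pd

# Toy high-rate signal standing in for a raw SCADA channel (names assumed)
idx = pd.date_range("2021-01-01", periods=8640, freq="10s")   # 24 h at 10 s
df = pd.DataFrame({"power_kw": np.random.randn(8640).cumsum()}, index=idx)

# The four statistical measures over 10-min windows, as in typical SCADA exports
stats_10min = df["power_kw"].resample("10min").agg(["mean", "std", "max", "min"])

# Re-aggregating over a longer 60-min interval, as explored in the paper
stats_60min = df["power_kw"].resample("60min").agg(["mean", "std", "max", "min"])
print(stats_60min.head())
```

Note that the 60-min statistics here are computed from the raw signal; recomputing them from the 10-min statistics instead is exactly where the irreversible loss (e.g., of the within-hour standard deviation) arises.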

