Simulating Organizational Data from Redacted Input for Inference Enterprise Modeling

Author(s):  
Paul Sticha ◽  
Tirso Diaz ◽  
Elise Axelrad ◽  
Sean Vermillion ◽  
Dennis Buede

Organizations that use data to assess insider threats or other workforce outcomes need to evaluate the quality of their assessment methods. This evaluation relies on an accurate representation of the predictor and criterion variables within the organization's workforce. However, privacy concerns often limit the information available for evaluation. For example, the organization might anonymize identifying information about its workforce, or the evaluation might be restricted to group statistics, such as marginal distributions of predictors and criteria along with their intercorrelations. In this paper we demonstrate a hybrid approach for simulating workforce data to support inference-enterprise evaluation, including crowdsourced elicitation of the marginal distributions and correlations of predictors and simulation of a workforce population from the statistical properties of a redacted set of predictor distributions. The methods provide a way to simulate a population that shares the statistical characteristics of the workforce, so that the performance of the assessment methods can be evaluated. The statistical methods are supplemented by expert judgments where required information is not available. We evaluate these methods using anonymized data from an actual organization.
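To make the simulation step concrete, the following is a minimal sketch of one common way to generate records that match given marginal distributions and a correlation matrix (a Gaussian-copula, NORTA-style construction). The predictor names, marginal choices, and correlation values are illustrative assumptions; the paper's actual hybrid procedure and elicited inputs are not reproduced here.

```python
# Minimal Gaussian-copula sketch: simulate records whose marginals and
# correlation structure approximate redacted summary statistics.
# The predictor names, marginal choices, and target correlation are
# illustrative assumptions, not the paper's actual inputs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10_000

# Redacted inputs we assume are available: marginal distributions and
# a correlation matrix elicited from experts or group statistics.
marginals = [
    stats.norm(loc=0.0, scale=1.0),     # e.g., a continuous risk score
    stats.poisson(mu=2.0),              # e.g., a count of policy violations
    stats.bernoulli(p=0.05),            # e.g., a binary criterion variable
]
target_corr = np.array([
    [1.0, 0.3, 0.2],
    [0.3, 1.0, 0.4],
    [0.2, 0.4, 1.0],
])

# 1) Draw correlated standard normals, 2) map to uniforms via the normal CDF,
# 3) push through each marginal's inverse CDF (percent-point function).
z = rng.multivariate_normal(mean=np.zeros(3), cov=target_corr, size=n)
u = stats.norm.cdf(z)
population = np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

print(np.corrcoef(population, rowvar=False).round(2))
```

Note that the correlations of the simulated records only approximate the target matrix: the copula preserves the dependence of the underlying normals, not the Pearson correlations of the transformed marginals, which is one reason expert judgment may still be needed to tune the inputs.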


2020 ◽  
Author(s):  
Agustin Lara-Esqueda ◽  
Sergio A Zaizar-Fregoso ◽  
Violeta M Madrigal-Perez ◽  
Mario Ramirez-Flores ◽  
Daniel A Montes-Galindo ◽  
...  

BACKGROUND Diabetes mellitus is a worldwide health problem and a leading cause of premature death, with prevalence increasing over time. Hypertension frequently accompanies it and acts as an additional risk factor that increases mortality. Both diseases affect national health while also producing an economic burden for society, with billions of dollars invested in their management. OBJECTIVE The present study evaluated the quality of medical care for patients diagnosed with diabetes mellitus (DM), hypertension (HBP), or both pathologies (DM+HBP) within a public health system in Mexico, according to the official Mexican standard for each pathology. METHODS 45,498 patients were included from 2012 to 2015. All information was taken from the electronic medical records database and exported as anonymized data for research purposes. Each patient record was compared against the standard to assess the quality of medical care. RESULTS Glycemia and hypertension goals were reached by 29.6% in DM+HBP, 48.6% in DM, and 53.2% in HBP. Serum lipid goals were reached by 3% in DM+HBP, 5% in DM, and 0.2% in HBP. Combined goals for glycemia, hypertension, and LDL cholesterol were reached by 0.04%. 15% of patients had undiagnosed diabetes or hypertension. Clinical follow-up examinations reached 20% for foot examination and clinical eye examination in the whole population. Specialty referral to angiology or cardiology reached 1% in the whole population. CONCLUSIONS Goals for glycemia and hypertension were reached by about 50% of the overall population, while serum lipids, clinical follow-up examinations, and referral to a specialist were deficient. Patients who had both diseases had more consultations and better control of hypertension and lipids, but inferior glycemic control. Overall, quality of care for DM and/or HBP has not met the standards, and patients with both DM and HBP lack a dedicated standard against which their needs can be evaluated.
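As a toy illustration of how per-patient records can be scored against numeric control goals and aggregated by subgroup, a pandas sketch is shown below. The column names and thresholds are assumptions for illustration only and are not taken from the official Mexican standard used in the study.

```python
# Hypothetical sketch of scoring goal attainment from anonymized records.
# Column names and thresholds are illustrative assumptions, not the values
# defined in the official Mexican standard applied by the study.
import pandas as pd

records = pd.DataFrame({
    "has_dm":  [True,  True,  False, True],
    "has_hbp": [True,  False, True,  True],
    "glucose_mg_dl": [128, 105, 110, 190],
    "systolic_mmhg": [135, 118, 150, 128],
    "ldl_mg_dl":     [95,  130, 88,  160],
})

goals = pd.DataFrame({
    "glycemia_goal": records["glucose_mg_dl"] < 130,
    "bp_goal":       records["systolic_mmhg"] < 140,
    "ldl_goal":      records["ldl_mg_dl"] < 100,
})

# Attainment rates per subgroup (DM+HBP, DM only, HBP only).
both = records["has_dm"] & records["has_hbp"]
for label, mask in [("DM+HBP", both),
                    ("DM only", records["has_dm"] & ~records["has_hbp"]),
                    ("HBP only", ~records["has_dm"] & records["has_hbp"])]:
    print(label, goals[mask].mean().round(2).to_dict())
```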



i-com ◽  
2019 ◽  
Vol 18 (3) ◽  
pp. 197-216 ◽  
Author(s):  
Verena Zimmermann ◽  
Paul Gerber ◽  
Karola Marky ◽  
Leon Böck ◽  
Florian Kirchbuchner

Abstract Smart Home technologies have the potential to increase quality of life and home security and to facilitate elderly care. To do so, they require access to a plethora of data about users' homes and private lives. The resulting security and privacy concerns form a relevant barrier to adopting this promising technology. Aiming to support end users' informed decision-making by addressing these concerns, we first conducted semi-structured interviews with 42 potential and little-experienced Smart Home users. Their diverse concerns were clustered into four themes that center around attacks on Smart Home data and devices, the perceived loss of control, the trade-off between functionality and security, and user-centric concerns as compared to concerns on a societal level. Second, we discuss measures to address the four themes from an interdisciplinary perspective. The paper concludes with recommendations for addressing user concerns and for supporting developers in designing user-centered Smart Home technologies.



2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Michael Hodgkins ◽  
Meg Barron ◽  
Shireesha Jevaji ◽  
Stacy Lloyd

Abstract It took the advent of SARS-CoV-2, a "black swan event", to widely introduce telehealth, remote care, and virtual house calls. Prior to the pandemic, in 2019, the American Medical Association (AMA) conducted a routine study comparing physicians' adoption of emerging technologies to a similar survey in 2016. Most notable were a doubling in the adoption of telehealth/virtual technology to 28% and increases in the use of remote monitoring and management for improved care (13–22%). These results may now seem insignificant when compared to the unprecedented surge in telehealth visits caused by SARS-CoV-2. Even as this surge levels off and begins to decline, many observers believe we will continue to see a persistent increase in the use of virtual visits compared to face-to-face care. The requirements for adoption communicated by physicians in both the 2016 and 2019 surveys are now more relevant than ever: Is remote care as effective as in-person care, and how is it best determined when to use each modality? How do I safeguard my patients and my practice from liability and privacy concerns? How do I optimize the use of these technologies in my practice, especially their integration with my EHR and workflows, to improve efficiency? And how will a mix of virtual and in-person visits affect practice revenue and sustainability? Consumers have also expressed concerns about payment for virtual visits as well as privacy and quality of care. If telehealth and remote care are here to stay, continuing to track their impact during the current public health emergency is critically important, so that policymakers and insurers will take the necessary steps to ensure that the "new normal" reflects a health care delivery model that can provide comparable or improved results today and into the future.



2018 ◽  
Vol 2018 ◽  
pp. 1-7 ◽  
Author(s):  
Mohammed Al-Maitah ◽  
Olena O. Semenova ◽  
Andriy O. Semenov ◽  
Pavel I. Kulakov ◽  
Volodymyr Yu. Kucheruk

Artificial intelligence is employed for solving complex scientific, technical, and practical problems. Artificial intelligence techniques such as neural networks, fuzzy systems, and genetic and evolutionary algorithms are widely used for communication system management, optimization, and prediction. The artificial intelligence approach provides optimized results for the challenging tasks of call admission control, handover, routing, and traffic prediction in cellular networks. 5G mobile communications are designed as heterogeneous networks, an important requirement of which is accommodating large numbers of users while satisfying quality-of-service demands. Call admission control plays a significant role in providing the desired quality of service, and an effective call admission control algorithm is needed to optimize the cellular network. Many call admission control schemes have been proposed. This paper proposes a methodology for developing a genetic neuro-fuzzy controller for call admission in 5G networks. The performance of the proposed admission control is evaluated through computer simulation.
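A heavily simplified sketch of the kind of fuzzy admission decision a genetic algorithm could tune is shown below. The inputs, rule set, and membership breakpoints are illustrative assumptions, not the paper's neuro-fuzzy controller.

```python
# Simplified fuzzy admission-control sketch (not the paper's controller):
# two inputs (current load, requested bandwidth), one output (admit score).
# The membership breakpoints are the kind of parameters a genetic algorithm
# could tune; the values below are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with breakpoints (a, b, c)."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def admit_score(load, bandwidth, params):
    """Mamdani-style rules with weighted-average defuzzification."""
    low_load  = tri(load, *params["low_load"])
    high_load = tri(load, *params["high_load"])
    small_req = tri(bandwidth, *params["small_req"])
    large_req = tri(bandwidth, *params["large_req"])

    # Rule strengths (min as AND) with crisp consequents (1 = admit, 0 = reject).
    rules = [
        (min(low_load,  small_req), 1.0),   # low load, small request  -> admit
        (min(low_load,  large_req), 0.7),   # low load, large request  -> likely admit
        (min(high_load, small_req), 0.4),   # high load, small request -> maybe
        (min(high_load, large_req), 0.0),   # high load, large request -> reject
    ]
    w = sum(strength for strength, _ in rules) + 1e-9
    return sum(strength * out for strength, out in rules) / w

params = {  # one candidate "chromosome" a genetic algorithm could evolve
    "low_load":  (0.0, 0.0, 0.6), "high_load": (0.4, 1.0, 1.0),
    "small_req": (0.0, 0.0, 0.5), "large_req": (0.3, 1.0, 1.0),
}
print(admit_score(load=0.8, bandwidth=0.7, params=params))  # ~0.0 -> reject
```

In a genetic neuro-fuzzy design, the `params` dictionary would be encoded as a chromosome and evolved against a simulation-based fitness such as call blocking or dropping probability.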



2018 ◽  
Vol 9 (4) ◽  
pp. 22-36
Author(s):  
Mohammed Mahseur ◽  
Abdelmadjid Boukra ◽  
Yassine Meraihi

Multicast routing is the problem of finding a spanning tree over a set of destinations, whose root is the source node and whose leaves are the destination nodes, while optimizing a set of quality-of-service parameters and satisfying a set of transmission constraints. This article proposes a new hybrid multicast algorithm, the Hybrid Multi-objective Multicast Algorithm (HMMA), based on the Strength Pareto Evolutionary Algorithm (SPEA), which evaluates and classifies the population into dominated and non-dominated solutions. Dominated solutions are evolved by the Bat Algorithm, non-dominated solutions are evolved by the Firefly Algorithm, and old and weak solutions are replaced by new random solutions through a mutation process. The simulation results demonstrate that the proposed algorithm is able to find good Pareto-optimal solutions compared to other algorithms.
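The core classification step, splitting the population into non-dominated and dominated solutions before handing each subset to a different metaheuristic, can be sketched as follows. The objective values are fabricated for illustration and assumed to be minimized (e.g., tree cost and end-to-end delay).

```python
# Minimal sketch of the dominated / non-dominated split that HMMA performs
# before evolving each subset with a different metaheuristic. Objectives are
# assumed to be minimized (e.g., cost and worst-case delay); the sample
# values are illustrative, not results from the paper.
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def split_population(objs: List[Tuple[float, ...]]):
    non_dominated, dominated = [], []
    for i, a in enumerate(objs):
        if any(dominates(b, a) for j, b in enumerate(objs) if j != i):
            dominated.append(i)
        else:
            non_dominated.append(i)
    return non_dominated, dominated

# Each tuple: (tree cost, worst-case delay) for one candidate multicast tree.
population = [(12.0, 40.0), (10.0, 55.0), (15.0, 35.0), (11.0, 60.0), (13.0, 45.0)]
front, rest = split_population(population)
print("non-dominated:", front)  # evolved by the Firefly Algorithm in HMMA
print("dominated:", rest)       # evolved by the Bat Algorithm in HMMA
```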





2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Shah Imran Alam ◽  
Ihtiram Raza Khan ◽  
Syed Imtiyaz Hassan ◽  
Farheen Siddiqui ◽  
M. Afshar Alam ◽  
...  

The benefits of open data have been recognized worldwide over the past decades, and efforts to move more data under open-data licenses have intensified, producing a steep rise of open data in government repositories. In our study, we point out that privacy is one of the most consistent and prominent barriers: strong privacy laws restrict data owners from opening data freely. In this paper we study the applied solutions and, to the best of our knowledge, find that anonymity-preserving algorithms have done a substantial job of protecting privacy in the release of structured microdata. Such algorithms argue, and compete on the objective, that the released anonymized data can preserve privacy while also retaining the required level of quality. The k-anonymity algorithm was the foundation of many successor privacy-preserving algorithms, and l-diversity claims to add another dimension of privacy protection; used together, the two are known to provide a good balance between privacy and quality control of the dataset as a whole. In this research, we used the k-anonymity algorithm and compared the results with the addition of l-diversity. We discuss the gap and report the benefits and losses for various combinations of k and l values, considered together with the released data quality from an analyst's perspective. We first use dummy fictitious data to explain the general expectations and then contrast the findings with real data from the food technology domain. The work contradicts the general assumptions for a specific set of evaluation parameters for data quality assessment. Additionally, it argues in favour of further research contributions in the field of anonymity preservation, considering its importance and potential to benefit people.
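For readers unfamiliar with the two notions, the following minimal sketch checks k-anonymity and (distinct) l-diversity on a small released table. The quasi-identifiers, sensitive attribute, and rows are illustrative assumptions, not the study's food-technology data.

```python
# Minimal sketch: check whether a released table satisfies k-anonymity over
# the quasi-identifiers and distinct l-diversity for the sensitive attribute.
# Column names and rows are illustrative assumptions, not the study's data.
import pandas as pd

released = pd.DataFrame({
    "age_band":   ["30-39", "30-39", "30-39", "40-49", "40-49", "40-49"],
    "zip_prefix": ["100**", "100**", "100**", "101**", "101**", "101**"],
    "diagnosis":  ["flu", "flu", "asthma", "diabetes", "flu", "asthma"],
})
quasi_identifiers = ["age_band", "zip_prefix"]
sensitive = "diagnosis"

groups = released.groupby(quasi_identifiers)

# k-anonymity: every equivalence class contains at least k indistinguishable rows.
k = int(groups.size().min())

# Distinct l-diversity: every class contains at least l distinct sensitive values.
l = int(groups[sensitive].nunique().min())

print(f"table is {k}-anonymous and {l}-diverse")
```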



2021 ◽  
Vol 2083 (3) ◽  
pp. 032049
Author(s):  
Xinchang Hu ◽  
Pengbo Wang ◽  
Yanan Guo ◽  
Qian Han ◽  
Xinkai Zhou

Abstract Azimuth ambiguities appear widely in Synthetic Aperture Radar (SAR) images, causing a large number of false targets and seriously affecting the quality of image interpretation. Due to under-sampling in the Doppler domain, ambiguous energy is mixed with energy from the main zone in both the time and frequency domains. In order to effectively suppress the ambiguous energy in SAR images without loss of resolution, this paper presents a novel method combining K-SVD dictionary learning based on variance statistics (VS-KSVD) with compressed sensing (CS) reconstruction. According to the statistical characteristics of distributed targets, the dictionary atoms that represent the ambiguities are selected and suppressed by coefficient weighting, and local window filtering is carried out to remove the block effect and preserve edge information. Finally, high-resolution, low-ambiguity images are reconstructed by CS. The feasibility and effectiveness of the proposed approach in suppressing azimuth ambiguity are validated using satellite data and simulation.
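A schematic sketch of the coefficient-weighting idea, suppressing dictionary atoms flagged by a variance statistic before reconstructing image patches, is given below. The dictionary, sparse codes, and threshold are synthetic placeholders; the actual VS-KSVD learning, local window filtering, and CS reconstruction steps are not reproduced.

```python
# Schematic sketch of coefficient weighting: given a learned dictionary D and
# sparse codes for image patches, flag atoms whose activation statistics look
# like ambiguous (azimuth-ambiguity) energy and down-weight their coefficients
# before reconstruction. All quantities are synthetic placeholders, not the
# VS-KSVD pipeline from the paper.
import numpy as np

rng = np.random.default_rng(1)
n_atoms, patch_dim, n_patches = 32, 64, 500

D = rng.standard_normal((patch_dim, n_atoms))      # learned dictionary (e.g., by K-SVD)
D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms
codes = rng.standard_normal((n_atoms, n_patches)) * (rng.random((n_atoms, 1)) < 0.2)

# Variance statistic per atom over all patches; atoms with unusually high
# activation variance are treated here as carrying ambiguous energy.
atom_var = codes.var(axis=1)
threshold = atom_var.mean() + 2.0 * atom_var.std()
weights = np.where(atom_var > threshold, 0.1, 1.0)  # down-weight flagged atoms

suppressed_codes = codes * weights[:, None]
patches_hat = D @ suppressed_codes                   # reconstruct the patches

print("flagged atoms:", int((weights < 1.0).sum()))
```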



Author(s):  
Tristan Allard ◽  
Nicolas Anciaux ◽  
Luc Bouganim ◽  
Philippe Pucheral ◽  
Romuald Thion

During the past decade, many countries launched ambitious Electronic Health Record (EHR) programs with the objective of increasing the quality of care while decreasing its cost. Pervasive healthcare aims at making healthcare information securely available anywhere and anytime, even in disconnected environments (e.g., at the patient's home). Current server-based EHR solutions handle disconnected situations poorly and fail to provide ultimate security guarantees for patients. The solution proposed in this paper capitalizes on a new hardware device combining a secure microcontroller (similar to a smart card chip) with a large external Flash memory in a USB key form factor. Embedding the patient folder, as well as a database system and a web server, in such a device makes it possible to manage a healthcare folder securely and in complete autonomy. This paper also proposes a new way of personalizing access-control policies to meet patients' privacy concerns with minimal assistance from practitioners. While both proposals are orthogonal, their integration in the same infrastructure allows building trustworthy pervasive healthcare folders.
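As a toy illustration of patient-personalized access control, the sketch below evaluates simple (role, document category) rules with a deny-by-default fallback. The roles, categories, and sample policy are assumptions for illustration and do not reproduce the paper's access-control model.

```python
# Toy sketch of evaluating personalized access-control rules over a patient
# folder. Roles, document categories, and the sample policy are illustrative
# assumptions and do not reproduce the paper's access-control model.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    role: str        # who the rule applies to (e.g., "gp", "nurse", "specialist")
    category: str    # which part of the folder (e.g., "prescriptions", "psychiatry")
    allow: bool      # the patient's choice for this (role, category) pair

def is_allowed(policy: list, role: str, category: str) -> bool:
    """Deny by default; the first matching rule wins."""
    for rule in policy:
        if rule.role == role and rule.category == category:
            return rule.allow
    return False

# A patient-personalized policy: the GP sees everything, the nurse does not
# see psychiatric notes.
policy = [
    Rule("gp", "prescriptions", True),
    Rule("gp", "psychiatry", True),
    Rule("nurse", "prescriptions", True),
    Rule("nurse", "psychiatry", False),
]

print(is_allowed(policy, "nurse", "psychiatry"))   # False
print(is_allowed(policy, "gp", "psychiatry"))      # True
```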



2019 ◽  
Vol 2019 (6) ◽  
pp. 92-97
Author(s):  
Kirill Batenkov ◽  
Aleksandr Korolev ◽  
Mikhail Ilyushin ◽  
...  

The aim of this work is to investigate the statistical characteristics of multimedia-service traffic in an IP network on the basis of an analysis of the service information in packet headers. The investigation method is mathematical modeling in the Mathcad 14 environment. As a result of the investigation, a mathematical apparatus was developed that allows point and interval estimation of the individual components of voice-messaging quality, as well as decision-making regarding the degree of user satisfaction with the offered service. Based on measurements made with the Wireshark program over the TCP protocol, the permissible ranges of variation of the voice-message perception quality parameters were investigated. A stochastic relationship between standard transfer quality parameters was revealed by plotting diagrams and histograms of the estimates of these parameters' mutual impact. The most significant transfer quality parameters, from the standpoint of their impact on the perceived quality of voice messages, are the RTT indices and loss factors.
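A minimal sketch of the point and interval estimation step, applied to RTT samples and a packet-loss count, might look as follows. The sample values are synthetic; in practice they would come from measurements exported from a capture tool such as Wireshark.

```python
# Minimal sketch of point and interval estimation for voice-quality-related
# metrics (RTT and loss rate). The samples are synthetic placeholders.
import numpy as np
from scipy import stats

rtt_ms = np.array([41.2, 38.7, 55.3, 47.9, 43.1, 61.0, 39.5, 44.8, 52.2, 46.4])
packets_sent, packets_lost = 2000, 37

# Point estimates.
rtt_mean = rtt_ms.mean()
loss_rate = packets_lost / packets_sent

# 95% interval for the mean RTT (t-interval) and for the loss rate (normal approximation).
rtt_ci = stats.t.interval(0.95, df=len(rtt_ms) - 1,
                          loc=rtt_mean, scale=stats.sem(rtt_ms))
se_loss = np.sqrt(loss_rate * (1 - loss_rate) / packets_sent)
loss_ci = (loss_rate - 1.96 * se_loss, loss_rate + 1.96 * se_loss)

print(f"RTT mean = {rtt_mean:.1f} ms, 95% CI = ({rtt_ci[0]:.1f}, {rtt_ci[1]:.1f})")
print(f"loss rate = {loss_rate:.3%}, 95% CI = ({loss_ci[0]:.3%}, {loss_ci[1]:.3%})")
```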


