code base
Recently Published Documents

TOTAL DOCUMENTS: 106 (five years: 64)
H-INDEX: 7 (five years: 2)

2022 ◽  
Vol 2 (1) ◽  
pp. 26-31
Author(s):  
Hendra Rohman

Background: Analyzing the accuracy and validity of diagnosis codes in medical record documents is essential: codes that do not conform to ICD-10 degrade the quality of health center services and lower the validity of the resulting data. Accurate codes underpin the indexing process and statistical reporting, serve as the basis for outpatient morbidity reports and top-ten disease reports, and inform policy decisions by primary health center management. This study analyzes the accuracy and validity of ICD-10 disease diagnosis codes for the fourth quarter of 2020 at Imogiri I Health Center, Bantul. Methods: A descriptive qualitative approach with a case study design. The subjects were a doctor, a nurse, the head of medical records, and medical records staff; the object was outpatient medical record documents at Imogiri I Health Center, Bantul, with a total sample of 99 medical record files. Data were obtained through interviews and observations. Results: 60 diagnosis codes (60.6%) were complete and accurate, 26 (26.3%) were accurate but incomplete, and 13 (13.1%) were inaccurate. Inaccuracies included errors in determining the code, errors in the 4th character of the ICD-10 code, omission of the 4th and 5th characters, omission of external cause codes, and unrecorded multiple diseases. Conclusions: Factors behind the inaccuracies are medical records staff who lack coding competence, incomplete diagnosis writing, absence of training, absence of coding evaluations or audits, and standard operating procedures that have not been socialized.
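The structural side of such a coding audit (4th- and 5th-character completeness) can be sketched in Python; the regex and the example codes below are illustrative assumptions, not rules or data from the study:

```python
import re

# Structural pattern for a WHO ICD-10 code: a letter, two digits, and an
# optional 4th (and 5th) character after a dot, e.g. "J06.9". This checks
# form only; it cannot tell whether the code matches the written diagnosis.
ICD10_PATTERN = re.compile(r"^[A-Z]\d{2}(\.\d{1,2})?$")

def check_code(code: str) -> list[str]:
    """Return a list of structural problems found in an ICD-10 code string."""
    problems = []
    if not ICD10_PATTERN.match(code):
        problems.append("not a valid ICD-10 structure")
    elif "." not in code:
        # Many categories require a 4th character for full specificity.
        problems.append("no 4th character (may be incomplete)")
    return problems

print(check_code("J06"))    # plausible category, but no 4th character
print(check_code("J06.9"))  # structurally complete
```

A real audit would also verify the code against the written diagnosis, which no purely structural check can do.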


2021 ◽  
Vol 14 (12) ◽  
pp. 7673-7704
Author(s):  
Mark G. Flanner ◽  
Julian B. Arnheim ◽  
Joseph M. Cook ◽  
Cheng Dang ◽  
Cenlin He ◽  
...  

Abstract. The Snow, Ice, and Aerosol Radiative (SNICAR) model has been used in various capacities over the last 15 years to model the spectral albedo of snow with light-absorbing constituents (LACs). Recent studies have extended the model to include an adding-doubling two-stream solver and representations of non-spherical ice particles; carbon dioxide snow; snow algae; and new types of mineral dust, volcanic ash, and brown carbon. New options also exist for ice refractive indices and solar-zenith-angle-dependent surface spectral irradiances used to derive broadband albedo. The model spectral range was also extended deeper into the ultraviolet for studies of extraterrestrial and high-altitude cryospheric surfaces. Until now, however, these improvements and capabilities have not been merged into a unified code base. Here, we document the formulation and evaluation of the publicly available SNICAR-ADv3 source code, web-based model, and accompanying library of constituent optical properties. The use of non-spherical ice grains, which scatter less strongly into the forward direction, reduces the simulated albedo perturbations from LACs by ∼9 %–31 %, depending on which of the three available non-spherical shapes are applied. The model compares very well against measurements of snow albedo from seven studies, though key properties affecting snow albedo are not fully constrained with measurements, including ice effective grain size of the top sub-millimeter of the snowpack, mixing state of LACs with respect to ice grains, and site-specific LAC optical properties. The new default ice refractive indices produce extremely high pure snow albedo (>0.99) in the blue and ultraviolet part of the spectrum, with such values only measured in Antarctica so far. More work is needed particularly in the representation of snow algae, including experimental verification of how different pigment expressions and algal cell concentrations affect snow albedo. Representations and measurements of the influence of liquid water on spectral snow albedo are also needed.
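The broadband albedo derivation mentioned above amounts to an irradiance-weighted average of spectral albedo. A minimal sketch with made-up values, not SNICAR-ADv3 optical data:

```python
import numpy as np

# Broadband albedo as an irradiance-weighted average of spectral albedo:
#   alpha_bb = sum(alpha(lambda) * I(lambda)) / sum(I(lambda))
# The wavelength grid, albedos, and irradiances are illustrative placeholders.
wavelength_um = np.array([0.4, 0.7, 1.0, 1.5, 2.0])
spectral_albedo = np.array([0.99, 0.97, 0.75, 0.30, 0.10])
spectral_irradiance = np.array([1.2, 1.4, 0.8, 0.3, 0.1])  # W m-2 um-1

broadband_albedo = (np.sum(spectral_albedo * spectral_irradiance)
                    / np.sum(spectral_irradiance))
print(round(float(broadband_albedo), 3))
```

Because the weighting depends on the irradiance spectrum, the same snowpack yields a different broadband albedo under a different solar zenith angle, which is why SNICAR's zenith-angle-dependent irradiance options matter.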


2021 ◽  
Vol 923 (1) ◽  
pp. 19
Author(s):  
Eileen C. Gonzales ◽  
Ben Burningham ◽  
Jacqueline K. Faherty ◽  
Channon Visscher ◽  
Mark Marley ◽  
...  

Abstract We present the first retrieval analysis of a substellar subdwarf, SDSS J125637.13−022452.4 (SDSS J1256−0224), using the Brewster retrieval code base. We find SDSS J1256−0224 is best fit by a cloud-free model with an ion (neutral H, H−, and electron) abundance corresponding to [Fe/H]_ion = −1.5. However, this model is indistinguishable from a cloud-free model with [Fe/H]_ion = −2.0 and from a cloud-free model with [Fe/H]_ion = −1.5 assuming a subsolar carbon-to-oxygen ratio. We are able to constrain abundances for H2O, FeH, and CrH, with an inability to constrain any carbon-bearing species likely due to the low metallicity of SDSS J1256−0224. We also present an updated spectral energy distribution (SED) and semiempirical fundamental parameters. Our retrieval- and SED-based fundamental parameters agree with the Baraffe low-metallicity evolutionary models. From examining our “rejected” models (those with ΔBIC > 45), we find that we are able to retrieve gas abundances consistent with those of our best fitting model. We find the cloud in these poorer fitting “cloudy” models is either pushed to the bottom of the atmosphere or made optically thin.
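The ΔBIC > 45 rejection criterion compares models by their Bayesian information criterion. A toy illustration, with invented parameter counts and likelihoods rather than the paper's actual fits:

```python
import math

def bic(n_params: int, n_data: int, max_log_likelihood: float) -> float:
    """Bayesian information criterion: k*ln(n) - 2*ln(L_max)."""
    return n_params * math.log(n_data) - 2.0 * max_log_likelihood

# Illustrative values: a cloud-free model with fewer parameters vs. a
# "cloudy" model that adds cloud parameters but fits slightly worse.
bic_cloud_free = bic(n_params=18, n_data=500, max_log_likelihood=-1200.0)
bic_cloudy = bic(n_params=22, n_data=500, max_log_likelihood=-1225.0)

delta_bic = bic_cloudy - bic_cloud_free
print(delta_bic > 45)  # True: the cloudy model would be "rejected"
```

BIC penalizes extra parameters via the k·ln(n) term, so a cloud layer must buy a substantial likelihood improvement to survive the comparison.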


2021 ◽  
Author(s):  
◽  
Paul Radford

<p>Event log messages are currently the only genuine interface through which computer systems administrators can effectively monitor their systems and assemble a mental perception of system state. The popularisation of the Internet and the accompanying meteoric growth of business-critical systems has resulted in an overwhelming volume of event log messages, channeled through mechanisms whose designers could not have envisaged the scale of the problem. Messages regarding intrusion detection, hardware status, operating system status changes, database tablespaces, and so on, are being produced at the rate of many gigabytes per day for a significant computing environment. Filtering technologies have not been able to keep up. Most messages go unnoticed; no filtering whatsoever is performed on them, at least in part due to the difficulty of implementing and maintaining an effective filtering solution. The most commonly-deployed filtering alternatives rely on regular expressions to match pre-defined strings, with 100% accuracy, which can then become ineffective as the code base for the software producing the messages 'drifts' away from those strings. The exactness requirement means all possible failure scenarios must be accurately anticipated and their events catered for with regular expressions, in order to make full use of this technique. Alternatives to regular expressions remain largely academic. Data mining, automated corpus construction, and neural networks, to name the highest-profile ones, only produce probabilistic results and are either difficult or impossible to alter in any deterministic way. Policies are therefore not supported under these alternatives. This thesis explores a new architecture which utilises rich metadata in order to avoid the burden of message interpretation. The metadata itself is based on an intention to improve end-to-end communication and reduce ambiguity. A simple yet effective filtering scheme is also presented which filters log messages through a short and easily-customisable set of rules. With such an architecture, it is envisaged that systems administrators could significantly improve their awareness of their systems while avoiding many of the false-positives and -negatives which plague today's filtering solutions.</p>
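The regular-expression filtering that the thesis critiques can be sketched as a first-match rule list; the rules and messages below are hypothetical, not drawn from any real deployment:

```python
import re

# A conventional regex-based filter: each rule pairs a pattern with an
# action, and the first matching rule wins.
RULES = [
    (re.compile(r"kernel: .*Out of memory"), "page-oncall"),
    (re.compile(r"sshd\[\d+\]: Failed password"), "count-auth-failures"),
    (re.compile(r".*"), "discard"),  # default: everything else is dropped
]

def filter_message(message: str) -> str:
    """Return the action of the first rule whose pattern matches."""
    for pattern, action in RULES:
        if pattern.search(message):
            return action
    return "discard"

# Exact-match brittleness: a small wording change in the producing
# software ("Failed password" -> "Authentication failure") silently
# defeats the rule, so the second message falls through to "discard".
print(filter_message("sshd[812]: Failed password for root"))
print(filter_message("sshd[812]: Authentication failure for root"))
```

The last two lines demonstrate the 'drift' failure mode described above: the filter keeps running, but the event it was written to catch no longer matches.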




2021 ◽  
Vol 20 (5s) ◽  
pp. 1-22
Author(s):  
Yuheng Shen ◽  
Hao Sun ◽  
Yu Jiang ◽  
Heyuan Shi ◽  
Yixiao Yang ◽  
...  

A real-time operating system (RTOS) is an operating system designed to meet certain real-time requirements. It is widely used in embedded applications, and its correctness is safety-critical. However, validating an RTOS is challenging due to its complex real-time features and large code base. In this paper, we propose Rtkaller, a state-aware kernel fuzzer for vulnerability detection in RTOSs. First, Rtkaller implements automatic task initialization to transform syscall sequences into initial tasks carrying more real-time information. Then, a coverage-guided task mutation is designed to generate tasks that explore more in-depth real-time-related code for parallel execution. Moreover, Rtkaller realizes task modification to correct tasks that may hang during fuzzing. We evaluated it on recent versions of rt-Linux, one of the most widely used RTOSs. Compared to the state-of-the-art kernel fuzzers Syzkaller and Moonshine, Rtkaller achieves the same code coverage 1.7× and 1.6× faster, and gains 26.1% and 22.0% more branch coverage within 24 hours, respectively. More importantly, Rtkaller has confirmed 28 previously unknown vulnerabilities that were missed by other fuzzers.
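A coverage-guided mutation loop of the kind Rtkaller applies to tasks can be sketched in miniature; the `execute` function here is a toy stand-in for running a task against an instrumented kernel, not Rtkaller's actual mechanism:

```python
import random

def execute(task: list[int]) -> set[int]:
    """Toy 'coverage': the set of branch ids this task touches."""
    return {x % 7 for x in task}

def fuzz(seed_tasks, rounds=200, rng=random.Random(0)):
    """Keep a mutated task in the corpus only if it reaches new coverage."""
    corpus = [list(t) for t in seed_tasks]
    seen = set()
    for task in corpus:
        seen |= execute(task)
    for _ in range(rounds):
        task = list(rng.choice(corpus))              # pick a parent task
        task[rng.randrange(len(task))] = rng.randrange(100)  # mutate one call
        cov = execute(task)
        if not cov <= seen:                          # new branch reached
            seen |= cov
            corpus.append(task)
    return corpus, seen

corpus, seen = fuzz([[1, 2, 3]])
print(len(seen))  # distinct branches reached, starting from 3 seed branches
```

The real system layers task initialization and hang-correction on top of a loop like this, but the keep-if-new-coverage feedback is the core of the coverage-guided approach.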


Author(s):  
Isaac Lyngaas ◽  
Matt Norman ◽  
Youngsung Kim

In this work, we demonstrate the process of porting the cloud resolving model (CRM) used in the Energy Exascale Earth System Model Multi-Scale Modeling Framework (E3SM-MMF) from its original Fortran code base to C++ using a portability library. The port uses the Yet Another Kernel Library (YAKL), a simplified C++ portability library that specializes in Fortran porting. In particular, we detail our step-by-step approach for porting the System for Atmospheric Modeling (SAM), the CRM used in E3SM-MMF, using a hybrid Fortran/C++ framework that allows for systematic reproduction and correctness testing of gradually ported YAKL C++ code. Additionally, we analyze the performance of the ported code on OLCF’s Summit supercomputer.
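The correctness-testing idea behind such a hybrid framework, running the reference kernel and the ported kernel on identical inputs and requiring agreement within tolerance, can be sketched as follows. Both kernels here are stand-ins, written in Python rather than Fortran and C++ for brevity:

```python
import math

def reference_kernel(values):
    """Stand-in for the original Fortran routine."""
    return [v * v + 1.0 for v in values]

def ported_kernel(values):
    """Stand-in for the gradually ported version of the same routine."""
    return [math.fsum([v * v, 1.0]) for v in values]

def agrees(ref, port, rel_tol=1e-12):
    """Elementwise agreement check between reference and ported output."""
    return all(math.isclose(r, p, rel_tol=rel_tol)
               for r, p in zip(ref, port))

inputs = [0.1 * i for i in range(1000)]
print(agrees(reference_kernel(inputs), ported_kernel(inputs)))  # True
```

Running such a check after each routine is ported, rather than after the whole model is converted, localizes any numerical divergence to the most recently ported code.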


2021 ◽  
Vol 81 (10) ◽  
Author(s):  
◽  
Richard D. Ball ◽  
Stefano Carrazza ◽  
Juan Cruz-Martinez ◽  
Luigi Del Debbio ◽  
...  

Abstract. We present the software framework underlying the NNPDF4.0 global determination of parton distribution functions (PDFs). The code is released under an open source licence and is accompanied by extensive documentation and examples. The code base comprises a PDF fitting package, tools to handle experimental data and to efficiently compare it to theoretical predictions, and a versatile analysis framework. In addition to ensuring the reproducibility of the NNPDF4.0 (and subsequent) determinations, the public release of the NNPDF fitting framework enables a number of phenomenological applications and the production of PDF fits under user-defined data and theory assumptions.
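A core operation of such a framework, comparing experimental data to theoretical predictions, reduces to a covariance-weighted chi-square. The numbers below are illustrative placeholders, not NNPDF data or predictions:

```python
import numpy as np

# chi^2 = (D - T)^T C^{-1} (D - T), where D is data, T is theory, and
# C is the experimental covariance matrix (here with one correlated pair).
data = np.array([1.02, 0.98, 1.10])        # measured values (illustrative)
theory = np.array([1.00, 1.00, 1.05])      # predictions from a PDF set
cov = np.array([[0.0004, 0.0001, 0.0],
                [0.0001, 0.0004, 0.0],
                [0.0,    0.0,    0.0009]])

residual = data - theory
# Solve C x = residual instead of forming the explicit inverse.
chi2 = residual @ np.linalg.solve(cov, residual)
print(round(float(chi2), 2))
```

A fitting framework evaluates this quantity across thousands of data points while varying the PDF parametrization, so handling correlated uncertainties through the full covariance matrix, as above, rather than through diagonal errors is essential.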


Author(s):  
Andy Kelleher Stuhl

Rivendell, a free and open source software suite for automated radio broadcasting, has brought several groups with clashing stances on technology, communication, and cultural politics into cooperation. This paper treats Rivendell as an opening onto the politics at play when the liberal ethos propelling free and open source software (Coleman, 2013) meets the autonomy-prizing traditions of independent broadcasting within an automation system. Complicating this already tense juncture, Rivendell has drawn users and code contributors from drastically opposed political groups within American broadcasting—right-wing Christian talk radio networks and progressive community stations—and has sustained a difficult terrain of working compromise that the activist push for low-power FM broadcasting inaugurated (Dunbar-Hester, 2014). In this paper, analysis of Rivendell's open source code base sheds light on its development and helps connect it to longer histories of media automation and its attendant social frictions. Interviews with lead Rivendell developers complete the picture of the project's trajectory, of its relation to the religious right context where the project began, and of the negotiations that have played out among its developers and its community of users in terrestrial and internet radio. The ongoing compromises and tensions threaded through Rivendell can offer insight into an issue that becomes larger and more pressing as media become increasingly complex and networked: how artists, activists, and media technologists who prioritize independence have reckoned with their reliance on socio-technical infrastructures whose connections may strike them as far less than savory.

