Journal of Computer Science and Technology
Latest Publications


TOTAL DOCUMENTS: 82 (FIVE YEARS: 43)
H-INDEX: 2 (FIVE YEARS: 1)

Published by Universidad Nacional de La Plata
ISSN: 1666-6038, 1666-6046

2021, Vol 21 (2), pp. e18
Author(s): Vinay Raj, Ravichandra Sadam

Distributed systems have evolved rapidly as the demand for independent design and deployment of software applications has increased. Architectural styles have moved from the monolithic client-server architecture to service-oriented architecture (SOA), and then to the trending microservices. Monolithic applications are difficult to update, maintain, and deploy because the application code becomes very complex to understand. To overcome the design and deployment challenges of monolithic applications, service-oriented architecture emerged as a style of decomposing the entire application into loosely coupled, scalable, and interoperable services. Although SOA has become popular for integrating multiple applications through the enterprise service bus, there are still challenges related to the delivery, deployment, governance, and interoperability of services. Additionally, services in SOA applications tend to grow monolithic in size as user requirements change. To overcome the design and maintenance challenges of SOA, microservices have emerged as a new architectural style for designing applications with loose coupling, independent deployment, and scalability as key features.


2021, Vol 21 (2), pp. e14
Author(s): Diego Omar Encinas, Lucas Maccallini, Fernando Romero

This publication presents an approach to a simulator for recreating a large number of scenarios and making agile decisions in the planning of a real emergency room system. The model and simulation focus on the point prevalence of intra-hospital infections in an emergency room and how it is affected by different factors related to hospital management. The simulator was modeled with the Agent-Based Modeling and Simulation (ABMS) paradigm; thus, the different agents intervening in the emergency room environment (patients and doctors, among others) were classified. A user belonging to the health system can configure the simulation with different data, such as the number of patients, the number of available beds, etc. Based on the tests carried out and the measurements obtained, it is concluded that the disease propagation model, relative to the time and contact area of the patients, has greater precision than the purely statistical model of the intensive care unit.
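
A minimal Python sketch of the agent-based idea the abstract describes is shown below: infection spreads between patients through distance-based contacts over simulated time steps. All names and parameters (N_PATIENTS, N_BEDS, P_TRANSMISSION, CONTACT_RADIUS, the ward layout) are illustrative assumptions, not values from the paper.

    # Minimal agent-based sketch of infection spread in an emergency room.
    # All parameters below are illustrative assumptions, not values from the paper.
    import random

    N_PATIENTS = 30        # configurable by the health-system user, as in the simulator
    N_BEDS = 20            # occupancy is capped by the number of available beds
    P_TRANSMISSION = 0.05  # per-contact transmission probability (assumed)
    CONTACT_RADIUS = 1.5   # metres within which two patients are "in contact" (assumed)
    STEPS = 200            # simulated time steps

    class Patient:
        def __init__(self, infected=False):
            self.infected = infected
            # random position inside a 10 m x 10 m ward (assumed layout)
            self.x, self.y = random.uniform(0, 10), random.uniform(0, 10)

    patients = [Patient(infected=(i == 0)) for i in range(min(N_PATIENTS, N_BEDS))]

    for _ in range(STEPS):
        for a in patients:
            if not a.infected:
                continue
            for b in patients:
                if b.infected or b is a:
                    continue
                # contact depends on distance (contact area) and exposure time (steps)
                if (a.x - b.x) ** 2 + (a.y - b.y) ** 2 <= CONTACT_RADIUS ** 2:
                    if random.random() < P_TRANSMISSION:
                        b.infected = True

    prevalence = sum(p.infected for p in patients) / len(patients)
    print(f"point prevalence after {STEPS} steps: {prevalence:.2f}")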


2021, Vol 21 (2), pp. e15
Author(s): Federico Walas, Andrés Redchuk

The advance of digitalization in industry is making it possible for connected products and processes to help people, industrial plants, and equipment become more productive and efficient, and the results for operative processes should have an impact throughout the economy and the environment. Connected products and processes generate data that is seen as a key source of competitive advantage, and the management and processing of that data is generating new challenges in the industrial environment. This article looks into the framework for the adoption of Artificial Intelligence and Machine Learning and its integration with IIoT or IoT under the Industry 4.0, or smart manufacturing, framework. The work is focused on the discussion around Artificial Intelligence/Machine Learning and IIoT/IoT as drivers for industrial process optimization. The paper explores some related articles found relevant to start the discussion and includes a bibliometric analysis of the key topics around Artificial Intelligence/Machine Learning as a value-added solution for process optimization under the Industry 4.0 or Smart Manufacturing paradigm. The main findings are related to the importance that the subject has acquired since 2013 in terms of published articles, and to the complexity of addressing the issue proposed by this work in the industrial environment.


2021, Vol 21 (2), pp. e09
Author(s): Federico Favaro, Ernesto Dufrechou, Pablo Ezzatti, Juan Pablo Oliver

The dissemination of multi-core architectures and the later irruption of massively parallel devices led to a revolution in High-Performance Computing (HPC) platforms in the last decades. As a result, Field-Programmable Gate Arrays (FPGAs) are re-emerging as a versatile and more energy-efficient alternative to other platforms. Traditional FPGA design implies using low-level Hardware Description Languages (HDL) such as VHDL or Verilog, which follow an entirely different programming model than standard software languages, and their use requires specialized knowledge of the underlying hardware. In recent years, manufacturers have started to make big efforts to provide High-Level Synthesis (HLS) tools in order to allow a greater adoption of FPGAs in the HPC community. Our work studies the use of multi-core hardware and different FPGAs to address Numerical Linear Algebra (NLA) kernels such as the general matrix multiplication (GEMM) and the sparse matrix-vector multiplication (SpMV). Specifically, we compare the behavior of fine-tuned kernels on a multi-core CPU processor and HLS implementations on FPGAs. We perform the experimental evaluation of our implementations on a low-end and a cutting-edge FPGA platform, in terms of runtime and energy consumption, and compare the results against the Intel MKL library on CPU.
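
For reference, the two NLA kernels under study can be expressed in a few lines of NumPy/SciPy; this is only an untuned CPU illustration of what the kernels compute, not the MKL- or HLS-optimized code evaluated in the paper, and the problem size and sparsity are assumed.

    # Untuned CPU reference of the two kernels studied: dense GEMM and sparse
    # matrix-vector multiplication (SpMV). Sizes and density are illustrative.
    import numpy as np
    from scipy.sparse import random as sparse_random

    n = 1024
    A = np.random.rand(n, n).astype(np.float32)
    B = np.random.rand(n, n).astype(np.float32)
    C = A @ B                       # GEMM: C = A * B

    S = sparse_random(n, n, density=0.01, format="csr", dtype=np.float32)
    x = np.random.rand(n).astype(np.float32)
    y = S @ x                       # SpMV: y = S * x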


2021, Vol 21 (1), pp. e6
Author(s): Karina Beatriz Eckert, Paola Verónica Britos

Decision making can be highly complex in competitive environments, where supporting methods become very relevant. The article presents an extension of the Analytic Hierarchy Process, complemented with Personal Construct Theory, whose purpose is to reduce ambiguity when defining and assigning values to the criteria of a given problem. In recent years, the scope of data-driven decision making has grown considerably, which is why Data Science is rising in popularity as a scientific field; one of the main activities of data scientists is selecting an adequate methodology to guide a project with these traits. The steps defined in the proposed model guide this task: establishing and prioritizing criteria based on degrees of compliance, grouping them by levels, completing the hierarchical structure of the problem, performing the correct comparisons across the different levels in an ascending manner, and finally obtaining the definitive priorities of each methodology for each validation case and sorting them by their adequacy percentages. Two disparate cases, one from an industrial/commercial setting and the other from an academic one, were effective in corroborating the usefulness of the proposed model; in both cases MoProPEI obtained the best results.
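
The core AHP step the model builds on, deriving priority weights from a pairwise-comparison matrix through its principal eigenvector, can be sketched as follows. The comparison values are made up, and the paper's extensions (criteria grouped by levels, Personal Construct Theory, the MoProPEI comparison) are not reproduced.

    # Core AHP step: derive priority weights for three hypothetical criteria from
    # a pairwise-comparison matrix via the principal eigenvector. Values assumed.
    import numpy as np

    # pairwise comparisons on Saaty's 1-9 scale (illustrative values)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    priorities = w / w.sum()                 # normalized criterion weights
    print("priorities:", priorities)

    # consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3
    ci = (eigvals[k].real - len(A)) / (len(A) - 1)
    print("consistency ratio:", ci / 0.58)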


2021, Vol 21 (1), pp. e2
Author(s): Remzi GÜRFİDAN, Mevlüt Ersoy

The works produced within the music industry are presented to their listeners on digital platforms, taking advantage of technology. The problems of the past, such as pirated cassettes and CDs, have given way to the problem of copyright protection on digital platforms today. Blockchain is one of the most reliable and preferred technologies in recent times regarding data integrity and data security. In this study, a blockchain-based music wallet model is proposed for the safe and legal listening of audio files. The user's selected audio files are converted into a blockchain structure using different techniques and algorithms and are kept securely in the user's music wallet. In the study, performance comparisons are made between the proposed model implementation and an ordinary audio player, in terms of the time needed to add new audio files to the list and the response times perceived by the user. The findings suggest that the proposed model implementation shows acceptable differences in performance with respect to an ordinary audio player.
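
A minimal sketch of the underlying idea, chaining audio files into blocks where each block stores the hash of the previous one, could look like the following. The field names and the use of SHA-256 are illustrative assumptions, not the specific techniques of the proposed music wallet.

    # Minimal sketch of chaining audio files into hash-linked blocks.
    # Field names and SHA-256 are assumptions for illustration only.
    import hashlib, json, time

    def sha256(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def new_block(audio_bytes: bytes, prev_hash: str) -> dict:
        block = {
            "timestamp": time.time(),
            "audio_hash": sha256(audio_bytes),   # fingerprint of the audio file
            "prev_hash": prev_hash,              # link to the previous block
        }
        block["block_hash"] = sha256(json.dumps(block, sort_keys=True).encode())
        return block

    # build a tiny "music wallet": a chain over three dummy audio payloads
    wallet = []
    prev = "0" * 64                              # genesis reference
    for payload in (b"track-1", b"track-2", b"track-3"):
        blk = new_block(payload, prev)
        wallet.append(blk)
        prev = blk["block_hash"]

    # integrity check: a modified block breaks the chain of prev_hash links
    ok = all(wallet[i]["prev_hash"] == wallet[i - 1]["block_hash"]
             for i in range(1, len(wallet)))
    print("chain valid:", ok)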


2021, Vol 21 (1), pp. e4
Author(s): Ramiro Germán Rodríguez Colmeiro, Claudio Verrastro, Daniel Minsky, Thomas Grosges

The correction of attenuation effects in Positron Emission Tomography (PET) imaging is fundamental to obtain a correct radiotracer distribution. However, direct measurement of this attenuation map is not error-free and normally results in an additional ionizing radiation dose to the patient. Here, we explore the task of whole-body attenuation map generation using 3D deep neural networks. We analyze the advantages that adversarial network training can provide to such models. The networks are trained to learn the mapping from non-attenuation-corrected [18F]-fluorodeoxyglucose PET images to a synthetic Computerized Tomography (sCT) image and also to label the tissue of the input voxels. The sCT image is then further refined using an adversarial training scheme to recover higher-frequency details and lost structures using context information. This work is trained and tested on publicly available datasets containing several PET images from different scanners with different radiotracer administration and reconstruction modalities. The network is trained with 108 samples and validated on 10 samples. The sCT generation was tested on 133 samples from 8 distinct datasets. The resulting mean absolute error of the networks is 90±20 HU and 103±18 HU, and the peak signal-to-noise ratio is 19.3±1.7 dB and 18.6±1.5 dB, for the base model and the adversarial model respectively. The attenuation correction is tested by means of attenuation sinograms, obtaining a line-of-response attenuation mean error lower than 1% with a standard deviation lower than 8%. The proposed deep learning topologies are capable of generating whole-body attenuation maps from uncorrected PET image data. Moreover, the accuracy of both methods holds in the presence of data from multiple sources and modalities, and both are trained on publicly available datasets. Finally, while the adversarial layer enhances the visual appearance of the produced samples, the 3D U-Net achieves higher metric performance.
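
A heavily simplified sketch of the PET-to-sCT mapping idea, a 3D encoder-decoder trained with an MAE-style objective, is given below. Channel counts, depth, and the omission of skip connections, tissue labeling, and the adversarial refinement are simplifications for illustration; this is not the paper's architecture.

    # Simplified 3D encoder-decoder sketch of the PET -> synthetic CT mapping.
    # Layer sizes are illustrative; not the architecture evaluated in the paper.
    import torch
    import torch.nn as nn

    class TinyPET2CT(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),  # downsample
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose3d(32, 16, kernel_size=2, stride=2), nn.ReLU(),    # upsample
                nn.Conv3d(16, 1, kernel_size=3, padding=1),                        # sCT output
            )

        def forward(self, pet):
            return self.decoder(self.encoder(pet))

    model = TinyPET2CT()
    pet_volume = torch.randn(1, 1, 32, 64, 64)        # (batch, channel, depth, H, W) dummy input
    sct = model(pet_volume)
    loss = nn.functional.l1_loss(sct, torch.randn_like(sct))  # MAE-style training objective
    print(sct.shape, loss.item())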


2021, Vol 21 (1), pp. e7
Author(s): Pablo Mennuto, Julio César Meca Belahonia, Patricia Bazán

The use of BPM (Business Process Management) has matured over the years, reaching high levels of acceptance and utilization. Despite this, there are still points that BPM does not fully resolve. One of the main limitations of BPM is the incomplete acquisition of valuable information during the design stage, which takes place in contexts where communication between stakeholders is not adequate and it is not possible to fully collect essential data. At the execution stage, user participation has not been studied in depth as a means to record detected problems or indicate improvements in business processes. The emergence and development of Web 2.0 opened a way to solve these problems. This work proposes to show how socialization tools can solve current problems in BPM, through a theoretical analysis combined with the practical development of a socialization tool integrated into a BPMS (Business Process Management System).


2021, Vol 21 (1), pp. e5
Author(s): Nelson Dugarte Jerez, Antonio Alvarez, Edison Dugarte, Negman Alvarado, Sonu Bhaskar

This paper introduces a practical technique for the design of an instrument for air flow measurement, or flowmeter. This instrument is an essential component in the functioning of hospital medical ventilation equipment; therefore, the design parameters presented in this article focus on that purpose. However, the instrument can be applied at any measurement scale. The technique is based on indirect flow measurement, using a sensor that converts the flow into a differential pressure measurement. An electronic transducer allows the differential pressure values to be obtained as an electrical signal, which is then digitized and analyzed to recover the original parameter. The experimental procedure presented in this paper uses a computational algorithm to perform the signal analysis; however, given the simplicity of the procedure, it could be adapted to any digital processing card or platform in order to show the measurement immediately. Preliminary analyses demonstrated the efficiency of the instrument, with a sensitivity of 0.0681 L/s. The accuracy evaluation showed an average measurement error lower than 1.4%, with a standard deviation of 0.0612 and a normal distribution over the set of test measurements.
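
A possible sketch of the processing chain, from digitized differential-pressure samples to a flow estimate, is shown below. The square-root relation Q = K·sqrt(ΔP) is the usual model for differential-pressure flow sensors, and the calibration constant and ADC scaling are assumptions for illustration, not values from the paper.

    # Illustrative conversion from a digitized differential-pressure signal to flow.
    # The calibration constant K and the ADC scaling are assumed, not from the paper.
    import math

    K = 0.35                    # calibration constant, L/s per sqrt(Pa) (assumed)
    ADC_FULL_SCALE = 4095       # 12-bit ADC (assumed)
    PRESSURE_RANGE_PA = 500.0   # transducer full-scale differential pressure (assumed)

    def adc_to_pressure(adc_value: int) -> float:
        """Convert a raw ADC sample to differential pressure in Pa."""
        return adc_value / ADC_FULL_SCALE * PRESSURE_RANGE_PA

    def flow_from_pressure(dp_pa: float) -> float:
        """Estimate flow (L/s) from differential pressure using Q = K * sqrt(dP)."""
        return K * math.sqrt(max(dp_pa, 0.0))

    samples = [120, 480, 950, 2010]              # example raw ADC readings
    flows = [flow_from_pressure(adc_to_pressure(s)) for s in samples]
    print([round(q, 3) for q in flows])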


2021, Vol 21 (1), pp. e1
Author(s): Sergio Gastón Burdisso, Marcelo Errecalde, Manuel Montes-y-Gómez

Psychologists have used tests and carefully designed survey questions, such as Beck's Depression Inventory (BDI), to identify the presence of depression and to assess its severity level. On the other hand, methods for automatic depression detection have gained increasing interest, since all the information available in social media, such as Twitter and Facebook, enables novel measurements based on language use. These methods learn to characterize depression through natural language use and have shown that, in fact, language usage can provide strong evidence for detecting depressive people. However, not much attention has been paid to measuring finer-grained relationships between both aspects, such as how language usage is connected with the severity level of depression. The present study is a first step in that direction. We train a binary text classifier to detect "depressed" users and then use its confidence value to estimate the user's clinical depression level. In order to do that, our system has to be able to fill in the standard BDI depression questionnaire on users' behalf, based only on their posts on Reddit. Our proposal was publicly tested in the eRisk 2019 task, obtaining the best and second-best performance among the 13 submitted models.
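
The core idea, training a binary text classifier and reusing its confidence to estimate severity, can be sketched as follows. The toy training data and the linear mapping of confidence onto the BDI range are illustrative assumptions, not the system submitted to eRisk 2019.

    # Sketch: binary "depressed vs. control" text classifier whose confidence is
    # reused as a rough severity estimate. Toy data and mapping are assumptions.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    posts = ["I feel hopeless and can't sleep", "Great hike with friends today",
             "Nothing matters anymore", "Excited about my new project"]
    labels = [1, 0, 1, 0]                      # 1 = depressed, 0 = control (toy data)

    vec = TfidfVectorizer()
    X = vec.fit_transform(posts)
    clf = LogisticRegression().fit(X, labels)

    new_posts = ["I can't get out of bed lately"]
    confidence = clf.predict_proba(vec.transform(new_posts))[0, 1]

    # naive severity estimate: scale confidence onto the 0-63 BDI range (assumption)
    bdi_estimate = confidence * 63
    print(f"confidence={confidence:.2f}, estimated BDI score={bdi_estimate:.1f}")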

