design of algorithms
Recently Published Documents

TOTAL DOCUMENTS: 99 (FIVE YEARS: 26)
H-INDEX: 11 (FIVE YEARS: 2)

2022 ◽  
Vol 31 (2) ◽  
pp. 1-32
Author(s):  
Luca Ardito ◽  
Andrea Bottino ◽  
Riccardo Coppola ◽  
Fabrizio Lamberti ◽  
Francesco Manigrasso ◽  
...  

In automated Visual GUI Testing (VGT) for Android devices, the available tools often suffer from low robustness to mobile fragmentation, leading to incorrect results when running the same tests on different devices. To mitigate these issues, we evaluate two feature matching-based approaches for widget detection in VGT scripts, which use, respectively, the complete full-screen snapshot of the application (Fullscreen) and the cropped images of its widgets (Cropped) as visual locators to match on emulated devices. Our analysis includes validating the portability of different feature-based visual locators over various apps and devices and evaluating their robustness in terms of cross-device portability and correctly executed interactions. We assessed our results through a comparison with two state-of-the-art tools, EyeAutomate and Sikuli. Despite a limited increase in computational burden, our Fullscreen approach outperformed the state-of-the-art tools in terms of correctly identified locators across a wide range of devices and led to a 30% increase in passing tests. Our work shows that the dependability of VGT tools can be improved by bridging the testing and computer vision communities. This connection enables the design of algorithms targeted to domain-specific needs and thus inherently more usable and robust.
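
To make the feature-matching idea concrete, here is a minimal sketch of visual-locator matching with OpenCV's ORB detector. It illustrates the general technique rather than the authors' implementation; the file paths, match threshold, and RANSAC parameters are placeholders.

```python
# Minimal sketch of feature-based widget localization, assuming OpenCV
# (cv2), a cropped widget image, and a device screenshot on disk.
# Illustrates the general technique, not the paper's tool.
import cv2
import numpy as np

def locate_widget(widget_path: str, screen_path: str, min_matches: int = 10):
    widget = cv2.imread(widget_path, cv2.IMREAD_GRAYSCALE)
    screen = cv2.imread(screen_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(widget, None)
    kp2, des2 = orb.detectAndCompute(screen, None)
    if des1 is None or des2 is None:
        return None  # no features detected

    # Brute-force Hamming matcher suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None  # widget not found on this device/resolution

    # Estimate where the widget lies on screen via a homography.
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    h, w = widget.shape
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)  # widget outline on screen
```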


2021 ◽  
Author(s):  
Sumit Kumar Ram ◽  
Shyam Nandan ◽  
Didier Sornette

Abstract We investigate the predictability and persistence (the hot-hand effect) of individual and team performance by analyzing the complete recorded history of international cricket. We introduce an original temporal representation of performance streaks, which lends itself to modelling as a self-exciting point process. We confirm the presence of predictability and hot hands in individual performance and their absence in team performance and game outcomes. Thus, cricket is a game of skill for individuals and a game of chance for teams. Our study contributes to recent historiographical debates concerning the presence of persistence in individual and collective productivity and success. The metrics and methods we introduce can be used to test for and exploit clustering of performance in the study of human behavior and in the design of algorithms for predicting success.
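
As an illustration of the modelling approach, here is a minimal sketch of a self-exciting (Hawkes) point process with an exponential kernel, simulated via Ogata's thinning algorithm. The parameters mu, alpha, and beta are illustrative, not values fitted to cricket data.

```python
# Minimal sketch of a self-exciting (Hawkes) point process, the class
# of model referred to above. Parameters are illustrative, not fitted.
import numpy as np

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity lambda(t) = mu + alpha * sum_i exp(-beta (t - t_i))."""
    past = events[events < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

def simulate_hawkes(T, mu=0.5, alpha=0.8, beta=1.2, seed=0):
    """Ogata's thinning: propose events under an upper bound, then accept/reject."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < T:
        # Upper bound on the (decaying) intensity until the next event.
        lam_bar = hawkes_intensity(t, np.array(events), mu, alpha, beta) + alpha
        t += rng.exponential(1.0 / lam_bar)  # candidate event time
        if t < T and rng.uniform() * lam_bar <= hawkes_intensity(
                t, np.array(events), mu, alpha, beta):
            events.append(t)  # accepted: each event raises the future rate
    return np.array(events)

streaks = simulate_hawkes(T=100.0)
print(f"{len(streaks)} events; clustering reflects the hot-hand feedback")
```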


2021 ◽  
Vol 2021 ◽  
pp. 1-7
Author(s):  
Mohammad Alsaffar ◽  
Gharbi Alshammari ◽  
Abdullah Alshammari ◽  
Saud Aljaloud ◽  
Tariq S. Almurayziq ◽  
...  

Machine learning is a branch of computing that studies the design of algorithms with the ability to “learn.” A subfield is deep learning, a family of techniques that use deep artificial neural networks, that is, networks with more than one hidden layer, to computationally imitate the structure and functioning of the human brain. The analysis of images of medical interest with deep learning is not limited to clinical diagnosis; it can also, for example, facilitate surveillance of disease vectors. There are other recent efforts to use deep learning as a diagnostic tool. Chest X-rays are one approach to identifying tuberculosis: analysing the radiograph can reveal abnormalities consistent with the disease. A method for detecting the presence of tuberculosis in medical X-ray imaging is presented in this paper. Three classification methods were used to evaluate it: support vector machines, logistic regression, and nearest neighbors. Two evaluation scenarios were used: cross-validation and separate training and test sets. The results obtained allow the practicality of the method to be assessed.
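
The evaluation protocol described above can be sketched with scikit-learn as follows. The feature matrix X and labels y are random stand-ins for features extracted from chest X-ray images; the classifier hyperparameters are illustrative.

```python
# Minimal sketch of the evaluation protocol described above, using
# scikit-learn. X and y are placeholders for feature vectors extracted
# from chest X-ray images and their TB/normal labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))      # stand-in features
y = rng.integers(0, 2, size=200)    # stand-in labels (0 = normal, 1 = TB)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Nearest neighbors": KNeighborsClassifier(n_neighbors=5),
}

# Scenario 1: k-fold cross-validation.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: CV accuracy {scores.mean():.3f} +/- {scores.std():.3f}")

# Scenario 2: a single train/test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in classifiers.items():
    print(f"{name}: test accuracy {clf.fit(X_tr, y_tr).score(X_te, y_te):.3f}")
```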


Author(s):  
Zhicheng Guo ◽  
Cheng Ding ◽  
Xiao Hu ◽  
Cynthia Rudin

Abstract Objective. Wearable devices equipped with photoplethysmography (PPG) sensors provide a low-cost, long-term solution for early diagnosis and continuous screening of heart conditions. However, PPG signals collected from such devices often suffer from corruption caused by artifacts. The objective of this study is to develop an effective supervised algorithm to locate the regions of artifacts within PPG signals. Approach. We treat artifact detection as a 1D segmentation problem and solve it via a novel combination of an active-contour-based loss and an adapted U-Net architecture. The proposed algorithm was trained on the PPG DaLiA training set and further evaluated on the PPG DaLiA testing set, the WESAD dataset, and the TROIKA dataset. Main results. We evaluated with the DICE score, a well-established metric for segmentation accuracy in computer vision. The proposed method outperforms baseline methods on all three datasets by a large margin (≈7 percentage points above the next best method), achieving 0.8734±0.0018 on the PPG DaLiA testing set, 0.9114±0.0033 on WESAD, and 0.8050±0.0116 on TROIKA; the next best method achieved only 0.8068±0.0014, 0.8446±0.0013, and 0.7247±0.0050, respectively. Significance. The proposed method can pinpoint the exact locations of artifacts with high precision, where previously only a binary classification of a PPG signal as good or poor quality was available. This more nuanced information will be critical in further informing the design of algorithms to detect cardiac arrhythmia.
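
For reference, here is a minimal sketch of the DICE score applied to 1D artifact-segmentation masks, the metric reported above; the example masks and the smoothing constant are illustrative.

```python
# Minimal sketch of the Dice score for binary 1D segmentation masks.
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-8) -> float:
    """Dice = 2|P ∩ T| / (|P| + |T|); eps guards against empty masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Example: true artifact spans samples 100-200, prediction spans 120-210.
target = np.zeros(1000)
target[100:200] = 1
pred = np.zeros(1000)
pred[120:210] = 1
print(f"Dice: {dice_score(pred, target):.3f}")  # ~0.842 for this overlap
```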


2021 ◽  
Vol 9 (4) ◽  
pp. 222-233 ◽  
Author(s):  
Florian Saurwein ◽  
Charlotte Spencer-Smith

Social media platforms like Facebook, YouTube, and Twitter have become major objects of criticism for reasons such as privacy violations, anticompetitive practices, and interference in public elections. Some of these problems have been associated with algorithms, but the roles that algorithms play in the emergence of different harms have not yet been systematically explored. This article contributes to closing this research gap with an investigation of the link between algorithms and harms on social media platforms. Evidence of harms involving social media algorithms was collected from media reports and academic papers within a two-year timeframe from 2018 to 2019, covering Facebook, YouTube, Instagram, and Twitter. Harms with similar causal mechanisms were grouped together to inductively develop a typology of algorithmic harm based on the mechanisms involved in their emergence: (1) algorithmic errors and undesirable or disturbing selections; (2) manipulation of algorithmic outputs by users to harass other users or disrupt public discourse; (3) algorithmic reinforcement of pre-existing harms and inequalities in society; (4) enablement of harmful practices that are opaque and discriminatory; and (5) strengthening of platform power over users, markets, and society. Although the analysis emphasizes the role of algorithms as a cause of online harms, it also demonstrates that harms do not arise from the application of algorithms alone. Instead, harms are best conceived of as socio-technical assemblages, composed of the use and design of algorithms, platform design, commercial interests, social practices, and context. The article concludes with reflections on possible governance interventions in response to the identified socio-technical mechanisms of harm. Notably, while algorithmic errors may be fixed by the platforms themselves, growing platform power calls for external oversight.


Electronics ◽  
2021 ◽  
Vol 10 (19) ◽  
pp. 2438
Author(s):  
Amnah Nasim ◽  
David Nchekwube ◽  
Yoon Kim

Standing up and sitting down are prerequisite motions in most activities of daily living. The ability to sit down in and stand up from a chair or a bed deteriorates with increasing age and becomes a complex task. Hence, research on the analysis and recognition of these two activities can help in the design of algorithms for assistive devices. In this work, we propose a reliability analysis for testing the internal consistency of nonlinear recurrence features for the sit-to-stand (Si2St) and stand-to-sit (St2Si) activities, using motion acceleration data collected by a wearable sensing device from 14 healthy older subjects aged 78 ± 4.9 years. Four recurrence features—%recurrence rate, %determinism, entropy, and average diagonal length—were calculated from recurrence plots for both activities. A detailed relative and absolute reliability statistical analysis based on Cronbach's alpha (α) and the standard error of measurement was performed for all recurrence measures. Correlation values as high as α = 0.68 (%determinism) and α = 0.72 (entropy) for Si2St, and α = 0.64 (%determinism) and α = 0.69 (entropy) for St2Si—with low standard error of measurement—show that %determinism and entropy are reliable across repeated acceleration measurements for characterizing both activities in healthy older adults.
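
For illustration, here is a minimal sketch of two of the recurrence features used above (%recurrence rate and %determinism), computed from the recurrence plot of a 1D signal. The threshold, minimum diagonal length, and stand-in signal are illustrative choices, not the paper's settings.

```python
# Minimal sketch of recurrence-plot features for a 1D signal.
# Threshold and minimum diagonal length are illustrative choices.
import numpy as np

def recurrence_matrix(x: np.ndarray, threshold: float) -> np.ndarray:
    """R[i, j] = 1 where |x_i - x_j| < threshold (a recurrent state pair)."""
    return (np.abs(x[:, None] - x[None, :]) < threshold).astype(int)

def recurrence_rate(R: np.ndarray) -> float:
    return R.mean()  # fraction of recurrent points in the plot

def determinism(R: np.ndarray, min_len: int = 2) -> float:
    """Fraction of recurrent points on diagonals of length >= min_len.
    (The main diagonal is included here; RQA software often excludes it.)"""
    n = R.shape[0]
    diag_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):  # 0 flushes last run
            if v:
                run += 1
            else:
                if run >= min_len:
                    diag_points += run
                run = 0
    return diag_points / max(R.sum(), 1)

signal = np.sin(np.linspace(0, 8 * np.pi, 200))  # stand-in periodic signal
R = recurrence_matrix(signal, threshold=0.1)
print(f"%REC = {100 * recurrence_rate(R):.1f}, %DET = {100 * determinism(R):.1f}")
```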


2021 ◽  
Vol 12 ◽  
Author(s):  
Agnieszka Rychwalska ◽  
Magdalena Roszczyńska-Kurasińska ◽  
Karolina Ziembowicz ◽  
Jeremy V. Pitt

Recent discourse on the impact of Information and Communication Technologies (ICT) on societies has been dominated by the negative side-effects of information exchange in huge online social systems. Yet, the size of ICT-based communities also provides an unprecedented opportunity for collective action, as exemplified by crowdfunding, crowdsourcing, and peer production. This paper aims to provide a framework for understanding what makes online collectives succeed or fail in achieving complex goals. It combines insights from the social and complexity sciences on structures, mechanics, and emergent phenomena in social systems to define a Community Complexity Framework for evaluating three crucial components of complexity: multi-level structuration, procedural self-organization, and common identity. The potential value of such a framework is to shift the focus of efforts aimed at curing the malfunctions of online social systems away from the design of algorithms that can automatically solve such problems, and toward the development of technologies that enable online social systems to self-organize in a more productive and sustainable way.


Author(s):  
Klaus Jansen ◽  
Kim-Manuel Klein ◽  
Marten Maack ◽  
Malin Rau

Abstract Integer linear programs of configurations, or configuration IPs, are a classical tool in the design of algorithms for scheduling and packing problems in which a set of items has to be placed in multiple target locations. Here, a configuration describes a possible placement on one of the target locations, and the IP is used to choose suitable configurations covering the items. We give an augmented IP formulation, which we call the module configuration IP. It can be described within the framework of n-fold integer programming and can therefore be solved efficiently. As an application, we consider scheduling problems with setup times, in which a set of jobs has to be scheduled on a set of identical machines with the objective of minimizing the makespan. For instance, we investigate the case in which jobs can be split and scheduled on multiple machines, but before a part of a job can be processed, an uninterrupted, job-dependent setup has to be paid. For both variants—jobs executable in parallel or not—we obtain an efficient polynomial-time approximation scheme (EPTAS) with running time $f(1/\varepsilon)\cdot\mathrm{poly}(|I|)$. Previously, only constant-factor approximations of $5/3$ and $4/3+\varepsilon$, respectively, were known. Furthermore, we present an EPTAS for a problem in which classes of (non-splittable) jobs are given, and a setup has to be paid for each class of jobs executed on one machine.
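
For intuition, here is a minimal sketch of a classical configuration IP for makespan minimization on identical machines, solved with PuLP. It illustrates the basic configuration-IP idea only, not the paper's module configuration IP or its n-fold machinery; the instance is illustrative.

```python
# Minimal sketch of a configuration IP for makespan scheduling on
# identical machines: guess a makespan T, enumerate all configurations
# (multisets of job sizes fitting into T), and ask an IP solver whether
# m machines' worth of configurations cover all jobs. Uses PuLP; the
# instance below is illustrative.
from collections import Counter
from itertools import combinations_with_replacement
import pulp

def feasible(sizes, counts, m, T):
    # Enumerate configurations: multisets of sizes with total size <= T.
    configs = set()
    for r in range(int(T // min(sizes)) + 1):
        for combo in combinations_with_replacement(sizes, r):
            if sum(combo) <= T:
                configs.add(combo)
    configs = [Counter(c) for c in configs]

    prob = pulp.LpProblem("config_IP", pulp.LpMinimize)
    x = [pulp.LpVariable(f"x{i}", lowBound=0, cat="Integer")
         for i in range(len(configs))]
    prob += pulp.lpSum(x)                       # minimize machines used
    for s in sizes:                             # every job must be covered
        prob += pulp.lpSum(c[s] * xi for c, xi in zip(configs, x)) >= counts[s]
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective) <= m

sizes, counts = (2, 3, 5), {2: 4, 3: 3, 5: 2}   # job sizes and multiplicities
m = 3                                           # number of identical machines
T = max(sizes)                                  # lower bound on the makespan
while not feasible(sizes, counts, m, T):        # raise T until m machines suffice
    T += 1
print(f"Minimum makespan: {T}")                 # 9 for this instance
```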


Author(s):  
Hadjer Benmeziane ◽  
Kaoutar El Maghraoui ◽  
Hamza Ouarnoughi ◽  
Smail Niar ◽  
Martin Wistuba ◽  
...  

There is no doubt that making AI mainstream by bringing powerful, yet power-hungry, deep neural networks (DNNs) to resource-constrained devices will require an efficient co-design of algorithms, hardware, and software. The increased popularity of DNN applications deployed on a wide variety of platforms, from tiny microcontrollers to data centers, has resulted in multiple questions and challenges related to the constraints introduced by the hardware. In this survey on hardware-aware neural architecture search (HW-NAS), we present some of the answers proposed in the literature to the following questions: "Is it possible to build an efficient DL model that meets the latency and energy constraints of tiny edge devices?" and "How can we reduce the trade-off between the accuracy of a DL model and its ability to be deployed on a variety of platforms?". The survey provides a new taxonomy of HW-NAS and assesses hardware cost estimation strategies. We also highlight the challenges and limitations of existing approaches and potential future directions. We hope that this survey will help fuel research towards efficient deep learning.
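
As a toy illustration of the HW-NAS setting, the following sketch performs constrained random search over a hypothetical architecture space. Both the accuracy and latency estimators are placeholder surrogates, not real cost models, and the search space is invented for the example.

```python
# Minimal sketch of hardware-aware NAS as constrained random search:
# sample architectures, estimate accuracy and latency with stand-in
# surrogate functions, and keep the best model under a latency budget.
import random

SEARCH_SPACE = {              # hypothetical architecture choices
    "depth": [8, 12, 16, 20],
    "width": [32, 64, 128],
    "kernel": [3, 5, 7],
}

def estimate_accuracy(arch):  # placeholder surrogate (e.g., a proxy task)
    return 0.5 + 0.02 * arch["depth"] ** 0.5 + 0.001 * arch["width"] ** 0.5

def estimate_latency_ms(arch):  # placeholder lookup-table / cost model
    return 0.4 * arch["depth"] * (arch["width"] / 32) * (arch["kernel"] / 3)

def hw_aware_random_search(budget_ms: float, trials: int = 500, seed: int = 0):
    rng = random.Random(seed)
    best, best_acc = None, -1.0
    for _ in range(trials):
        arch = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        if estimate_latency_ms(arch) > budget_ms:
            continue          # hardware constraint prunes the candidate
        acc = estimate_accuracy(arch)
        if acc > best_acc:
            best, best_acc = arch, acc
    return best, best_acc

arch, acc = hw_aware_random_search(budget_ms=10.0)
print(f"best architecture under 10 ms: {arch} (proxy accuracy {acc:.3f})")
```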

