Evaluating the Interface Using Expert-heuristic Method

Author(s):  
Ulyana Khaleeva

The research aims to develop a new multi-criteria method for evaluating interfaces that eliminates the shortcomings of previous methods. A combination of the expert and heuristic approaches is proposed in order to detect a wide range of UI/UX problems, ensure assessment competence, and reduce distrust of the expert. In the first experiment, two groups of interfaces with different characteristics were evaluated, with two interfaces in each group. Fifteen heuristics were assessed: ten general-purpose criteria and five specialized criteria. Thirteen experts were involved, for whom weighting coefficients had previously been calculated, taking into account their professional competencies and the personal qualities influencing the soundness of their evaluations. After analyzing the results of the first experiment, it was decided to investigate how the number of experts in the sample influences the overall UI score. For the second experiment, therefore, the optimal number of experts per group was calculated to minimize score variance. Applications were evaluated in five groups (the number of heuristics did not change). In each experiment, outlier weights of the experts were also calculated to ensure consistency of opinion among the members of the sample group. In conclusion, the feasibility of applying the new method to mobile interfaces was analyzed, and conclusions were drawn on the suitability of the chosen mathematical apparatus and on further directions for developing the method.
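The abstract does not give the aggregation formula; the following is a minimal sketch, assuming a weighted average in which each expert's heuristic ratings are combined using that expert's weighting coefficient, with the between-expert variance used to compare group sizes. All function and variable names are hypothetical.

```python
import numpy as np

def weighted_ui_score(ratings, expert_weights):
    """Aggregate an interface score from expert ratings.

    ratings        : array of shape (n_experts, n_heuristics),
                     e.g. 13 experts x 15 heuristics on a fixed scale.
    expert_weights : array of shape (n_experts,), the weighting
                     coefficients reflecting expert competence.
    Returns the overall UI score and the variance of the per-expert scores.
    """
    ratings = np.asarray(ratings, dtype=float)
    w = np.asarray(expert_weights, dtype=float)
    w = w / w.sum()                          # normalise the weights
    per_expert = ratings.mean(axis=1)        # each expert's mean over heuristics
    overall = float(np.dot(w, per_expert))   # weighted aggregate UI score
    variance = float(np.var(per_expert))     # spread of expert opinions
    return overall, variance

# Hypothetical usage: 3 experts, 5 heuristics rated 1-5
score, var = weighted_ui_score([[4, 5, 3, 4, 4],
                                [3, 4, 3, 3, 4],
                                [5, 5, 4, 4, 5]],
                               expert_weights=[0.5, 0.2, 0.3])
```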

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1031
Author(s):  
Joseba Gorospe ◽  
Rubén Mulero ◽  
Olatz Arbelaitz ◽  
Javier Muguerza ◽  
Miguel Ángel Antón

Deep learning techniques are being increasingly used in the scientific community as a consequence of the high computational capacity of current systems and the increase in the amount of data available as a result of the digitalisation of society in general and the industrial world in particular. In addition, the emergence of edge computing, which focuses on integrating artificial intelligence as close as possible to the client, makes it possible to implement systems that act in real time without the need to transfer all of the data to centralised servers. The combination of these two concepts can lead to systems with the capacity to make correct decisions and act on them immediately and in situ. Despite this, the low capacity of embedded systems greatly hinders this integration, so the possibility of integrating them into a wide range of micro-controllers can be a great advantage. This paper contributes the generation of an environment based on Mbed OS and TensorFlow Lite that can be embedded in any general-purpose embedded system, allowing the introduction of deep learning architectures. The experiments herein prove that the proposed system is competitive when compared with other commercial systems.
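The paper's toolchain details are not reproduced in the abstract; the sketch below only illustrates the standard TensorFlow Lite workflow of converting a trained Keras model into a quantised .tflite flatbuffer that could then be compiled into an Mbed OS firmware image. The model architecture and file name are placeholders, not the authors' setup.

```python
import tensorflow as tf

# Placeholder model: a tiny classifier standing in for whatever
# architecture the target application actually needs.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Convert to a TensorFlow Lite flatbuffer with default optimisations
# (weight quantisation), which shrinks the model for micro-controllers.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting bytes are typically embedded in the firmware as a C array
# (e.g. via `xxd -i model.tflite`) and executed with TFLite Micro on device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```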


2020 ◽  
Vol 11 (1) ◽  
pp. 241
Author(s):  
Juliane Kuhl ◽  
Andreas Ding ◽  
Ngoc Tuan Ngo ◽  
Andres Braschkat ◽  
Jens Fiehler ◽  
...  

Personalized medical devices adapted to the anatomy of the individual promise greater treatment success for patients, thus increasing the individual value of the product. In order to cater to individual adaptations, however, medical device companies need to be able to handle a wide range of internal processes and components. These are here referred to collectively as the personalization workload. Consequently, support is required in order to evaluate how best to target product personalization. Since the approaches presented in the literature are not able to sufficiently meet this demand, this paper introduces a new method that can be used to define an appropriate variety level for a product family taking into account standardized, variant, and personalized attributes. The new method enables the identification and evaluation of personalizable attributes within an existing product family. The method is based on established steps and tools from the field of variant-oriented product design, and is applied using a flow diverter—an implant for the treatment of aneurysm diseases—as an example product. The personalization relevance and adaptation workload for the product characteristics that constitute the differentiating product properties were analyzed and compared in order to determine a tradeoff between customer value and personalization workload. This will consequently help companies to employ targeted, deliberate personalization when designing their product families by enabling them to factor variety-induced complexity and customer value into their thinking at an early stage, thus allowing them to critically evaluate a personalization project.
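The paper's concrete scoring scheme is not given in the abstract; as a purely illustrative sketch, one could tabulate, for each differentiating product characteristic, a personalization-relevance score (a proxy for customer value) against an adaptation-workload score and rank the candidates by their ratio. All attribute names and scales below are hypothetical.

```python
# Hypothetical attribute data: (characteristic, relevance 1-5, workload 1-5)
candidates = [
    ("stent diameter",  5, 2),
    ("mesh porosity",   4, 4),
    ("marker position", 2, 1),
    ("wire material",   1, 5),
]

def personalization_ranking(attrs):
    """Rank characteristics by customer value per unit of adaptation workload."""
    scored = [(name, relevance / workload) for name, relevance, workload in attrs]
    return sorted(scored, key=lambda item: item[1], reverse=True)

for name, ratio in personalization_ranking(candidates):
    print(f"{name:16s} value/workload = {ratio:.2f}")
```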


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Seyed Hossein Jafari ◽  
Amir Mahdi Abdolhosseini-Qomi ◽  
Masoud Asadpour ◽  
Maseud Rahgozar ◽  
Naser Yazdani

Abstract The entities of real-world networks are connected via different types of connections (i.e., layers). The task of link prediction in multiplex networks is to find missing connections based on both intra-layer and inter-layer correlations. Our observations confirm that in a wide range of real-world multiplex networks, from social to biological and technological, a positive correlation exists between connection probability in one layer and similarity in other layers. Accordingly, a similarity-based automatic general-purpose multiplex link prediction method, SimBins, is devised that quantifies the amount of connection uncertainty based on observed inter-layer correlations in a multiplex network. Moreover, SimBins enhances the prediction quality in the target layer by incorporating the effect of link overlap across layers. Applying SimBins to various datasets from diverse domains, our findings indicate that SimBins outperforms the compared methods (both baseline and state-of-the-art) in most instances when predicting links. Furthermore, it is discussed that SimBins imposes only minor computational overhead on the base similarity measures, making it a potentially fast method suitable for large-scale multiplex networks.
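SimBins itself is not specified in the abstract; the sketch below only illustrates the general idea of similarity-based link prediction across layers, using a common-neighbours score computed on an auxiliary layer to rank candidate links in a target layer. The binning and uncertainty-quantification steps of the actual method are omitted, and all data are toy placeholders.

```python
import itertools

def common_neighbours(adjacency, u, v):
    """Number of shared neighbours of u and v in a single layer."""
    return len(adjacency[u] & adjacency[v])

# Two toy layers over the same node set. The paper's observation is that
# similarity in one layer correlates with connection probability in another,
# so candidate links of the target layer are scored by similarity measured
# in the auxiliary layer.
auxiliary = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {2, 3}}
target    = {1: {2},    2: {1},       3: set(),     4: set()}

candidates = []
for u, v in itertools.combinations(sorted(target), 2):
    if v not in target[u]:                      # link missing in target layer
        candidates.append(((u, v), common_neighbours(auxiliary, u, v)))

# Highest-scoring pairs are predicted as the most likely missing links.
candidates.sort(key=lambda kv: kv[1], reverse=True)
print(candidates)
```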


2015 ◽  
Vol 17 (32) ◽  
pp. 20687-20698 ◽  
Author(s):  
Serena De Santis ◽  
Giancarlo Masci ◽  
Francesco Casciotta ◽  
Ruggero Caminiti ◽  
Eleonora Scarpellini ◽  
...  

Fourteen cholinium-amino acid based room temperature ionic liquids were prepared using a cleaner synthetic method. Chemicophysical properties were well correlated with the wide range of amino acid chemical structures.


2017 ◽  
Vol 7 (1) ◽  
Author(s):  
Simuck F. Yuk ◽  
Krishna Chaitanya Pitike ◽  
Serge M. Nakhmanson ◽  
Markus Eisenbach ◽  
Ying Wai Li ◽  
...  

Abstract Using the van der Waals density functional with C09 exchange (vdW-DF-C09), which has been applied to describing a wide range of dispersion-bound systems, we explore the physical properties of prototypical ABO3 bulk ferroelectric oxides. Surprisingly, vdW-DF-C09 provides a superior description of the experimental values for lattice constants, polarization, and bulk moduli, exhibiting accuracy similar to that of the modified Perdew-Burke-Ernzerhof functional designed specifically for bulk solids (PBEsol). The relative performance of vdW-DF-C09 is strongly linked to the form of the exchange enhancement factor which, like that of PBEsol, tends to behave like the gradient expansion approximation for small reduced gradients. These results suggest the general-purpose nature of the class of vdW-DF functionals, with particular consequences for predicting material functionality across dense and sparse matter regimes.
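As a rough illustration of the point about exchange enhancement factors (not taken from the paper), the sketch below evaluates a PBE-type enhancement factor with the PBEsol gradient coefficient mu = 10/81 and compares it with the second-order gradient expansion 1 + mu*s^2 at small reduced gradients s; the C09 form and its parameters are deliberately not reproduced here.

```python
import numpy as np

# PBE-type exchange enhancement factor; with mu = 10/81 this is the PBEsol
# choice, which reduces to the gradient expansion 1 + mu*s^2 for small
# reduced gradients s - the regime the abstract highlights.
KAPPA = 0.804
MU_GEA = 10.0 / 81.0

def f_x_pbesol(s):
    return 1.0 + KAPPA - KAPPA / (1.0 + MU_GEA * s**2 / KAPPA)

def f_x_gea(s):
    return 1.0 + MU_GEA * s**2

for s in (0.1, 0.5, 1.0, 2.0):
    print(f"s = {s:3.1f}  PBEsol: {f_x_pbesol(s):.4f}  GEA: {f_x_gea(s):.4f}")
# For small s the two agree closely; they separate as s grows, which is
# where different functionals (such as C09 exchange) make different choices.
```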


2010 ◽  
Vol 20 (02) ◽  
pp. 103-121 ◽  
Author(s):  
MOSTAFA I. SOLIMAN ◽  
ABDULMAJID F. Al-JUNAID

Technological advances in IC manufacturing provide us with the capability to integrate more and more functionality into a single chip. Today's modern processors have nearly one billion transistors on a single chip. With the increasing complexity of today's systems, designs have to be modeled at a high level of abstraction before being partitioned into hardware and software components for final implementation. This paper explains in detail the implementation and performance evaluation of a matrix processor called Mat-Core in SystemC (a system-level modeling language). Mat-Core is a research processor aiming at exploiting the increasing number of transistors per IC to improve the performance of a wide range of applications. It extends a general-purpose scalar processor with a matrix unit. To hide memory latency, the extended matrix unit is decoupled into two components, address generation and data computation, which communicate through data queues. Like vector architectures, the data computation unit is organized in parallel lanes. However, on parallel lanes, Mat-Core can execute matrix-scalar, matrix-vector, and matrix-matrix instructions in addition to vector-scalar and vector-vector instructions. For controlling the execution of vector/matrix instructions on the matrix core, this paper extends the well-known scoreboard technique. Furthermore, the performance of Mat-Core is evaluated on vector and matrix kernels. Our results show that the performance of a four-lane Mat-Core with matrix registers of size 4 × 4 (16 elements) each, a queue size of 10, a start-up time of 6 clock cycles, and a memory latency of 10 clock cycles is about 0.94, 1.3, 2.3, 1.6, 2.3, and 5.5 FLOPs per clock cycle, achieved on scalar-vector multiplication, SAXPY, Givens, rank-1 update, vector-matrix multiplication, and matrix-matrix multiplication, respectively.
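As a back-of-the-envelope illustration (not from the paper), the sketch below relates the reported kernel throughputs to a nominal peak, under the assumption that each of the four lanes can issue one multiply and one add per cycle (8 FLOPs/cycle). The peak figure is an assumption about the datapath, not a number stated in the abstract; the kernel throughputs are those reported above.

```python
# Rough efficiency estimate for the reported kernel throughputs, assuming
# each of the 4 lanes issues one multiply and one add per clock cycle,
# giving a nominal peak of 8 FLOPs/cycle. This assumption is illustrative only.
LANES = 4
FLOPS_PER_LANE_PER_CYCLE = 2          # one multiply + one add
peak = LANES * FLOPS_PER_LANE_PER_CYCLE

reported = {
    "scalar-vector multiplication": 0.94,
    "SAXPY":                        1.3,
    "Givens":                       2.3,
    "rank-1 update":                1.6,
    "vector-matrix multiplication": 2.3,
    "matrix-matrix multiplication": 5.5,
}

for kernel, flops_per_cycle in reported.items():
    print(f"{kernel:30s} {flops_per_cycle:4.2f} FLOPs/cycle "
          f"({100 * flops_per_cycle / peak:4.1f}% of an assumed {peak}-FLOP peak)")
```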


Birds ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 250-260
Author(s):  
Christoph Randler

The purpose of this study was to segment birdwatchers into clusters. Members of a wide range of bird-related organizations, ranging from highly specialized birders to Facebook bird-group members, were studied to provide a diverse dataset (n = 2766; 50.5% men). Birding specialization was measured with a battery of questionnaires and encompassed the three constructs of skill/competence, behavior, and personal and behavioral commitment. Additionally, involvement, measured by centrality to lifestyle, attraction, social bonding, and identity, was used. The NbClust analyses showed that a three-cluster solution was optimal. k-means cluster analysis was then applied, yielding three groups: casual/novice, intermediate, and specialist/advanced birdwatchers. More men than women were in the specialist/advanced group and more women than men in the casual/novice group. In conclusion, this study confirms a three-cluster solution for segmenting German birdwatchers, based on a large and diverse sample and a broad conceptualization of the birding specialization construct. These data can be used to address different target audiences (novices, advanced birders) with different programs, e.g., in nature conservation.
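The exact NbClust configuration is not given in the abstract; the sketch below only illustrates the generic workflow of choosing the number of clusters via a silhouette criterion and then running k-means on standardized specialization/involvement scores. scikit-learn is assumed and the feature values are synthetic stand-ins for questionnaire data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for respondent scores on specialization/involvement scales.
X = np.vstack([rng.normal(loc, 0.5, size=(50, 4)) for loc in (1.0, 3.0, 5.0)])
X = StandardScaler().fit_transform(X)

# Pick the number of clusters by silhouette width (NbClust aggregates many
# such indices; this single criterion is a simplified stand-in for that step).
best_k, best_score = None, -1.0
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score

labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
print(f"chosen k = {best_k}, silhouette = {best_score:.2f}")
```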


1989 ◽  
Vol 21 (8-9) ◽  
pp. 889-897 ◽  
Author(s):  
J. M. Lopez-Real ◽  
E. Witter ◽  
F. N. Midmer ◽  
B. A. O. Hewett

Collaborative research between Southern Water and Wye College, University of London, has led to the development of a static aerated pile composting process for the treatment of dewatered activated sludge cake/straw mixtures. The process reduces the bulk volume of the sludge, producing an environmentally acceptable, stabilised, odour- and pathogen-free product. Characteristics of the compost make it a suitable general-purpose medium for container-grown plants, provided that the salt concentration is reduced by washing the compost prior to planting. Compared with peat, the compost has a higher bulk density, a lower water-holding capacity, a lower cation exchange capacity, a high content of soluble salts, and a higher content of plant nutrients. In the growing trials, a compost mixture containing equal quantities of compost, Sphagnum peat, and horticultural vermiculite was successfully developed. The compost has been used successfully to grow a wide range of plants. Plants grown in mixtures based on the compost were in general similar to those grown in peat-based growing media. The compost is a valuable soil conditioner and slow-release fertilizer.


Author(s):  
Tobias Leibner ◽  
Mario Ohlberger

In this contribution we derive and analyze a new numerical method for kinetic equations based on a variable transformation of the moment approximation. Classical minimum-entropy moment closures are a class of reduced models for kinetic equations that conserve many of the fundamental physical properties of solutions. However, their practical use is limited by their high computational cost, as an optimization problem has to be solved for every cell in the space-time grid. In addition, the implementation of numerical solvers for these models is hampered by the fact that the optimization problems are only well-defined if the moment vectors stay within the realizable set. For the same reason, further reducing these models by, e.g., reduced-basis methods is not a simple task. Our new method overcomes these disadvantages of classical approaches. The transformation is performed on the semi-discretized level, which makes it applicable to a wide range of kinetic schemes, and it replaces the nonlinear optimization problems with inversions of positive-definite Hessian matrices. As a result, the new scheme avoids the realizability-related problems. Moreover, a discrete entropy law can be enforced by modifying the time-stepping scheme. Our numerical experiments demonstrate that the new method is often several times faster than the standard optimization-based scheme.
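The abstract does not spell out the transformed equations; as a minimal sketch of the underlying idea (using standard minimum-entropy notation with a Maxwell-Boltzmann-type ansatz and a simple quadrature, not the authors' exact formulation), one can work with the multiplier vector alpha instead of the moment vector u: the map alpha -> u(alpha) has a positive-definite Jacobian H(alpha), so a semi-discrete moment update du/dt = R can be propagated as dalpha/dt = H(alpha)^{-1} R, replacing the per-cell optimization with a linear solve.

```python
import numpy as np

# 1D slab-geometry toy problem: basis m(v) = (1, v) on v in [-1, 1], ansatz
# f(v) = exp(alpha . m(v)), Gauss-Legendre quadrature for velocity integrals.
v, w = np.polynomial.legendre.leggauss(16)
M = np.vstack([np.ones_like(v), v])          # basis evaluated at quadrature nodes

def moments(alpha):
    """u(alpha) = integral of m(v) * exp(alpha . m(v)) dv."""
    f = np.exp(alpha @ M)
    return M @ (w * f)

def hessian(alpha):
    """H(alpha) = integral of m m^T exp(alpha . m) dv; positive definite."""
    f = np.exp(alpha @ M)
    return (M * (w * f)) @ M.T

def alpha_rhs(alpha, moment_rhs):
    """Transform a semi-discrete moment update du/dt = R into dalpha/dt."""
    return np.linalg.solve(hessian(alpha), moment_rhs)

# One forward-Euler step in the transformed variables (illustrative only).
alpha = np.array([0.0, 0.1])
R = np.array([0.0, -0.05])                   # stand-in for the spatial residual
dt = 1e-2
alpha = alpha + dt * alpha_rhs(alpha, R)
print("updated moments:", moments(alpha))
```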


2012 ◽  
Vol 1 (1) ◽  
pp. 59-70
Author(s):  
Rulli Nasrullah

It is interesting to consider the statement of the Head of Criminal Investigation of the Indonesian National Police, Commissioner General Ito Sumardi (Kompas, 22/9/2010), who warns that the crime of terrorism is closely related to ideology. The sociologist Van Dijk (1993) states that ideology is basically a mental system that is exchanged and represented, both at the level of discourse and of action, in order to achieve certain goals or desires within a group (defined as the system of mental representations and processes of group members). Why is the Internet so powerful in spreading the message of terrorism? First, interaction on the Internet can take place anywhere and at any time. Second, the number of Internet users in Indonesia is growing steadily, which means that terrorist sites and content can be accessed easily. Third, the Internet provides access that is not only cheap but free. Fourth, the Internet allows anyone to construct a new identity. In fact, the identities of individuals in the cyber world have two possibilities: they can be the same as, or different from, their identities in the real world. Furthermore, an individual does not have only one identity on the Internet; they can have multiple identities with characteristics that differ from one another. According to Gilmore (1996), on the Internet nobody knows who you are, neither your race nor your sex. This is an opportunity that can be exploited by the perpetrators of terrorism to spread the ideology of terrorism and of violence in the name of religion without worrying that their identities will be revealed. Key words: cybermedia, virtual terrorism, internet, identity.

