Core Idea
Recently Published Documents

2022 ◽  
Vol 18 (1) ◽  
pp. 1-26
Youjing Lu ◽  
Fan Wu ◽  
Qianyi Huang ◽  
Shaojie Tang ◽  
Linghe Kong ◽  

To build a secure wireless networking system, it is essential that the cryptographic key is known only to the two (or more) communicating parties. Existing key extraction schemes place the devices in physical proximity and utilize the common inherent randomness between the devices to agree on a secret key, but they often rely on specialized hardware (e.g., a specific wireless NIC model) and have low bit rates. In this article, we seek a key extraction approach that leverages only off-the-shelf mobile devices, while achieving significantly higher key generation efficiency. The core idea of our approach is to exploit the fast-varying inaudible acoustic channel as the common random source for key generation, and parallel wireless communication for exchanging reconciliation information, to improve the key generation rate. We have carefully studied and validated the feasibility of our approach through both theoretical analysis and a variety of measurements. We implement our approach on different mobile devices and conduct extensive experiments in different real scenarios. The experiment results show that our approach achieves high efficiency and satisfactory robustness. Compared with state-of-the-art methods, our approach improves the key generation rate by 38.46% and reduces the bit mismatch ratio by 42.34%.
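The quantization step at the heart of such common-randomness schemes can be sketched as follows. This is a toy illustration only: the noise levels, guard band, and `quantize` helper are invented for exposition and are not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Shared random source: fast-varying channel gains observed by both
# parties; each side sees the common signal plus independent noise.
common = rng.standard_normal(1000)
alice = common + 0.05 * rng.standard_normal(1000)
bob = common + 0.05 * rng.standard_normal(1000)

def quantize(samples, guard=0.3):
    """Map each sample to 1 above +guard or 0 below -guard;
    drop samples inside the guard band to suppress disagreements."""
    bits = {}
    for i, s in enumerate(samples):
        if s > guard:
            bits[i] = 1
        elif s < -guard:
            bits[i] = 0
    return bits

a, b = quantize(alice), quantize(bob)
shared = sorted(set(a) & set(b))              # indices both sides kept
key_a = [a[i] for i in shared]
key_b = [b[i] for i in shared]
mismatch = sum(x != y for x, y in zip(key_a, key_b)) / len(shared)
print(f"key length={len(shared)}  bit mismatch ratio={mismatch:.3f}")
```

In a real scheme the residual mismatch would then be removed by the reconciliation step the abstract mentions; the guard band trades key length against mismatch ratio.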

2022 ◽  
Vol 12 (1) ◽  
Rhorry Gauld

The prediction of differential cross-sections in hadron-hadron scattering processes is typically performed in a scheme where the heavy-flavour quarks (c, b, t) are treated either as massless or massive partons. In this work, a method to describe the production of colour-singlet processes which combines these two approaches is presented. The core idea is that the contribution from power corrections involving the heavy-quark mass can be numerically isolated from the rest of the massive computation. These power corrections can then be combined with a massless computation (where they are absent), enabling the construction of differential cross-section predictions in a massive variable flavour number scheme. As an example, the procedure is applied to the low-mass Drell-Yan process within the LHCb fiducial region, where predictions for the rapidity and transverse-momentum distributions of the lepton pair are provided. To validate the procedure, it is shown how the n_f-dependent coefficient of a massless computation can be recovered from the massless limit of the massive one. This feature is also used to differentially extract the massless N^3LO coefficient of the Drell-Yan process in the gluon-fusion channel.
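Schematically, the combination described above can be written as follows (our notation, not the paper's):

```latex
% The massive computation is split into its massless limit plus power
% corrections in the heavy-quark mass m; only the latter are kept and
% added to the genuinely massless computation.
\[
  \mathrm{d}\sigma^{\mathrm{VFNS}}
  = \mathrm{d}\sigma^{(m=0)}
  + \Big[\, \mathrm{d}\sigma^{(m)}
  - \lim_{m \to 0} \mathrm{d}\sigma^{(m)} \,\Big]
\]
```

The bracketed term isolates the mass power corrections, which vanish in the massless limit; the same limit provides the consistency check that the n_f-dependent massless coefficient is recovered from the massive computation.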

2022 ◽  
Vol 8 (1) ◽  
pp. 171-191
Stefan Schnell ◽  
Nils Norman Schiborr

Corpus-based studies have become increasingly common in linguistic typology over recent years, amounting to the emergence of a new field that we call corpus-based typology. The core idea of corpus-based typology is to take languages as populations of utterances and to systematically investigate text production across languages in this sense. From a usage-based perspective, investigations of variation and preferences of use are at the core of understanding the distribution of conventionalized structures and their diachronic development across languages. Specific findings of corpus-based typological studies pertain to universals of text production, for example, in prosodic partitioning; to cognitive biases constraining diverse patterns of use, for example, in constituent order; and to correlations of diverse patterns of use with language-specific structures and conventions. We also consider remaining challenges for corpus-based typology, in particular the development of crosslinguistically more representative corpora that include spoken (or signed) texts, and its vast potential in the future.

Lucas Woltmann ◽  
Claudio Hartmann ◽  
Dirk Habich ◽  
Wolfgang Lehner

Cardinality estimation is a fundamental task in database query processing and optimization. As shown in recent papers, machine learning (ML)-based approaches may deliver more accurate cardinality estimations than traditional approaches. However, a large number of training queries have to be executed during the model training phase to learn a data-dependent ML model, which makes the training phase very time-consuming. Many of those training or example queries use the same base data, have the same query structure, and differ only in their selective predicates. To speed up the model training phase, our core idea is to determine a predicate-independent pre-aggregation of the base data and to execute the example queries over this pre-aggregated data. Based on this idea, we present a specific aggregate-based training phase for ML-based cardinality estimation approaches in this paper. As we show with different workloads in our evaluation, we achieve an average speedup of 90 with our aggregate-based training phase and thus outperform indexes.
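The pre-aggregation idea can be illustrated with a toy sketch. The table, columns, and query shape below are hypothetical, not the paper's workload: all example queries filter the same two columns, so one group-by pass over the base data answers every training query.

```python
from collections import Counter

# Hypothetical base table: rows of (a, b); the training queries are
# COUNT(*) with predicates of the form a <= x AND b <= y.
base = [(a % 10, (a * 7) % 5) for a in range(100_000)]

# Predicate-independent pre-aggregation: one pass over the base data,
# grouped by the columns the example queries filter on.
agg = Counter(base)

def count_query(max_a, max_b):
    """Answer a training query from the aggregate, not the base table."""
    return sum(n for (a, b), n in agg.items() if a <= max_a and b <= max_b)

# Cardinality label for one example query, computed over the distinct
# groups (here at most 50) instead of the 100,000 base rows.
print(count_query(4, 2))  # → 30000
```

Each additional training query costs a scan of the (small) aggregate rather than of the base data, which is where the reported speedup comes from.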


Transdisciplinarity is a paradigm based on the integration and balance of opposite points of view (dualities). The methodology of this paper applies transdisciplinarity to problem solving, drawing mainly on (1) Platonic philosophy; (2) the Taoist principle of the Yin-Yang duality; (3) Jungian psychology, as connected to modern physics; and (4) Weil, Leloup and Crema's psychological vision of holistic transdisciplinarity. Our findings in this regard are presented in Figure 11, on the Jungian functions, and in Figure 12, a model for problem solving through duality plus four elements. The objective is to create a comprehensive understanding of reality through Platonic and Taoist philosophy, the Alchemical tradition, and Jungian psychology as refined by the MBTI system, as tools for problem solving. Understanding the psychological types shows how to draw the best from each one, given their innate strengths and capacities, so that a synergy of results can be created within relationships. The MBTI serves both (i) for self-knowledge, supporting one's own self-management in day-to-day life, and (ii) for understanding how other people work psychologically, so that a synergy can be created in the process of relationships. The applied transdisciplinary approach is based on the principles of (i) duality, the interaction and integration of opposites, especially the analytical and synthetic methods, and (ii) the four elements: rationality, feasibility, reasonableness and meaning. However, as far as modern physics is concerned, it is not the simplistic or magical-vitalistic approach it may at first seem to rationalists. The holistic view of reality, including holology (the study of the whole) and holopraxis (the praxis of the whole), cannot be confused with political ideology, a confusion that occurs very frequently among scientificists who consider themselves "exempt" and "impartial".
Finally, the core idea is to promote transformation of culture and personal behavior, connected to reasonableness and meaning and to the emotional and intuitive intelligences, mainly for the sake of psychological sustainability and mental health.

2022 ◽  
pp. 265-279
Elisabetta Risi ◽  
Riccardo Pronzato

The role of digital platforms in everyday life is a concern within different research fields; therefore, several authors have supported the need to investigate them and their underlying meshing of human and computational logic. In this chapter, the authors present a methodological proposal according to which auto-ethnographic diaries can be fruitfully employed to examine the relationship between individuals and algorithmic platforms. By drawing on a critical pedagogy approach, they consider auto-ethnography both as a practice of accessing algorithmic logics through rich first-hand data on everyday usage practices and as a response to datafication. The core idea behind this narrative method is to use inductive self-reflexive methodological tools to help individuals critically reflect on their daily activities, thereby making their consumption of algorithmic contents more aware and allowing researchers to collect in-depth reports about their use of digital platforms and the ensuing processes of subjectification.

Tsaqofah ◽  
2021 ◽  
Vol 19 (02) ◽  
pp. 91
Asep Yusup Hudayat

Women, nature, ghosts, and taboo are the main discourses related to magical realism in "Burak Siluman", a novel by Moh. Ambri. In "Burak Siluman", women (the main sign) are connected to the discourses of nature, ghosts, and taboo. Women in the novel represent the suppressed desires of the lower class for wealth, position, and honor, wrapped in narratives of fascination, search, wandering, misfortune, and a curse. The discourses on the supernatural, the half-ghost, and taboo legends in the novel are important traditional realities, examined here through the workings of the concepts of magical realism in the colonial period of the Dutch East Indies. The main problem is: how does the concept of magical realism affect the construction of the world (physical and supernatural), especially in relation to the ghost and taboo narratives in "Burak Siluman"? Thus, the main objective of this research is the interaction of the influence of magical realism on the narrative construction related to women, nature, ghosts, and taboo. To resolve this issue, the concept of contemporary magical realism is used from a postcolonial perspective. The result of this study is that the placement of the "between" space (the magical within the rational), represented in the wandering figure, is the core idea of magical realism in "Burak Siluman".

2021 ◽  
pp. 135-141
Jason Brennan

This chapter lays out a general theoretical case for democracy, specifically the kind of democracy that democratic theorists call “deliberative democracy,” which traces the legitimacy of laws and policies to the reasoned exchange of arguments among free and equal citizens. The chapter shows the benefits of distributing political decision-power in an inclusive and egalitarian manner, especially in the deliberative phase of the legislative process. The core idea is that many minds deliberating together are better than few when it comes to dealing with the uncertainty and complexity of the world and figuring out solutions that work for all within it.

Csaba Varga

As to the conceptualisation of any one institution, the apparently identical notional term can cover four types of institutional systems: (1) the actually existing concrete system, which is a unit that functions as it is (e.g., the constitutional system of liberalism as practised in a given area at a given time, e.g., in the United States nowadays); (2) the historically developed concrete system, which is a unit that functions as it has been (e.g., the constitutional system of liberalism as practised in a given area over a given period, e.g., in the United States since the time it developed); (3) the generalisation of the historically concrete systems as developed in our civilisation (e.g., the constitutional system of liberalism as known and practised in our civilisation); and (4) the core idea of the functioning underlying all kinds of generalisation (e.g., the abstract universal formulation of the ultimate principles of operation, of which the constitutional system of liberalism is but one of the theoretically possible forms of realisation). Within a quasi-monographic analysis of them, both their role as a normative ideology and their actual objectivity and contingency are treated.

2021 ◽  
Zhi Lin ◽  
Xia Wang ◽  
Rongfang Bie ◽  
Hongwei Shi

The DTN (Delay/Disruption Tolerant Network) protocol, which relies on nodes to handle network interruptions, is one of the important components of wireless sensor network (WSN) routing. However, due to resource consumption, nodes may be unable to relay data unconditionally. To address this issue, several incentive mechanisms have recently been proposed to encourage node participation. However, the existing solutions either do not fully consider the sender's budget or do not consider the relay cost limitation, which violates the practical incentive-mechanism requirements of the DTN protocol. In this paper, we focus on developing a new incentive mechanism for DTN routing that specifically addresses the challenges brought up by the budget and relay cost limitations. Our core idea is to define the payoff functions of the sender and the next hops, and then to optimize the strategies under the constraints of the sender's budget and the relays' costs. Our experimental results demonstrate that maximized social welfare for all participants can be realized under these constraints.
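The budget- and cost-constrained selection can be illustrated with a toy greedy sketch. The relay names, costs, and values below are invented for exposition, and the greedy rule is a stand-in: the paper's actual payoff optimization may differ.

```python
# Hypothetical numbers: each candidate next hop has a forwarding cost
# and an expected delivery value for the sender.
relays = {"r1": (2.0, 5.0), "r2": (3.0, 4.0), "r3": (1.0, 1.5)}  # cost, value
budget = 4.0

def select_relays(relays, budget):
    """Greedy sketch: pick hops in order of value/cost while the sender
    can still cover each relay's cost (paying cost keeps relays whole)."""
    chosen, spent, welfare = [], 0.0, 0.0
    for name, (cost, value) in sorted(
        relays.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True
    ):
        if spent + cost <= budget:       # sender's budget constraint
            chosen.append(name)
            spent += cost
            welfare += value - cost      # contribution to social welfare
    return chosen, spent, welfare

chosen, spent, welfare = select_relays(relays, budget)
print(chosen, spent, welfare)
```

The two constraints from the abstract appear directly: the budget check bounds the sender's total payment, and paying each chosen relay its cost respects the relay-cost limitation.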
