A Formal Framework for Cognitive Models of Multitasking

2020 ◽  
Author(s):  
Michael Lesnick ◽  
Sebastian Musslick ◽  
Biswadip Dey ◽  
Jonathan D. Cohen

This note introduces mathematical foundations for the modeling of human multitasking performance. Using basic definitions from set theory and graph theory, we give formal definitions of the environment in which multitasks are performed, of an agent that attempts to perform a multitask, and of the success rate of the agent on a multitask. Drawing on the recent literature on the modeling of multitasking, we give two simple examples of multitasking agents and illustrate their performance on two multitasking problems: the well-known Stroop task and a more complex variant.
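The paper's formal definitions are not reproduced in the abstract, but the general idea can be sketched in toy form. All names and the interference rule below are our own simplifications, not the authors' definitions: tasks are modeled as edges from input to output channels, and a task counts as successful only if it shares no channel with another task in the multitask.

```python
# Hypothetical sketch (names and the interference rule are our own,
# not the paper's formalism): a task is an (input, output) channel
# pair, a multitask is a set of tasks to perform simultaneously, and
# a task fails when it shares a channel with another task.

def success_rate(multitask):
    """Fraction of tasks that run without sharing an input or
    output channel with any other task in the multitask."""
    ok = 0
    for i, (src, dst) in enumerate(multitask):
        clash = any(src == s or dst == d
                    for j, (s, d) in enumerate(multitask) if j != i)
        ok += 0 if clash else 1
    return ok / len(multitask)

# Stroop-like setup: colour naming and word reading both map onto a
# shared verbal output channel, so they interfere.
stroop = [("colour", "verbal"), ("word", "verbal")]
disjoint = [("colour", "verbal"), ("word", "manual")]
print(success_rate(stroop))    # 0.0: both tasks need the verbal output
print(success_rate(disjoint))  # 1.0: no shared channels
```

In this toy reading, the Stroop effect appears as a purely structural property of the task graph: any two tasks converging on the same output channel cannot be multitasked.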

2021 ◽  
Author(s):  
Jingyi Cai

Blockchain technology is a distributed database and a public ledger. It records every transaction made since its inception, and once entered, these records cannot be modified or erased. The technology uses various cryptographic algorithms to reach consensus; these cryptographic functions also ensure the integrity and authenticity of the data exchanged across the network. Because of these features, blockchain technology has been applied in various financial and non-financial fields. In this thesis, we introduce the mathematical foundations of Bitcoin and the different cryptographic functions used in blockchain, especially elliptic curve multiplication. We construct two MATLAB models to study fork events, one of the typical consensus problems in the system. Moreover, we use graph theory and MATLAB models to represent and describe the Bitcoin protocols.
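Elliptic curve multiplication, mentioned above, reduces to repeated point addition via the standard double-and-add method. A minimal sketch over a tiny textbook curve (not the secp256k1 curve Bitcoin actually uses, and independent of the thesis's MATLAB models):

```python
# Toy curve y^2 = x^3 + 2x + 2 (mod 17) for exposition only; Bitcoin
# uses secp256k1 with a 256-bit prime. Requires Python 3.8+ for the
# modular inverse via pow(x, -1, m).
P_MOD, A = 17, 2

def ec_add(P, Q):
    """Add two curve points; None is the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                      # P + (-P) = infinity
    if P == Q:                           # tangent slope at a doubling
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                # chord slope through P and Q
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, P):
    """Double-and-add: compute k*P in O(log k) point additions."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

G = (5, 1)            # a generator; the curve group has order 19
print(ec_mul(2, G))   # (6, 3)
print(ec_mul(19, G))  # None: multiplying by the group order gives infinity
```

The one-way nature of this operation (easy to compute k*G, hard to recover k) is what the signature scheme in Bitcoin relies on.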


Author(s):  
Yasuo Kudo ◽  
Tetsuya Murai

This paper focuses on rough set theory, which provides mathematical foundations for the set-theoretical approximation of concepts, as well as for reasoning about data. Also presented is the concept of relative reducts, one of the most important notions for rule generation based on rough set theory. From the viewpoint of approximation, the authors introduce an evaluation criterion for relative reducts using the roughness of the partitions constructed from them. The proposed criterion evaluates each relative reduct by the average coverage of the decision rules based on it, which also corresponds to evaluating the roughness of the partition constructed from the relative reduct.
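A minimal sketch of this criterion, under our own simplified reading and with an invented decision table: evaluate an attribute subset by the mean coverage of the decision rules its partition generates, where the coverage of a rule "conditions ⇒ d" is the fraction of the decision class d that the rule's condition block captures.

```python
# Toy decision table and a simplified version of the criterion
# (our own illustration, not the authors' exact formulation).
from collections import defaultdict

table = [  # (condition attribute values, decision)
    ({"a": 1, "b": 0}, "yes"),
    ({"a": 1, "b": 1}, "yes"),
    ({"a": 0, "b": 0}, "no"),
    ({"a": 0, "b": 1}, "no"),
]

def mean_coverage(reduct):
    blocks = defaultdict(list)          # partition induced by the reduct
    for cond, dec in table:
        blocks[tuple(cond[a] for a in reduct)].append(dec)
    class_size = defaultdict(int)       # sizes of the decision classes
    for _, dec in table:
        class_size[dec] += 1
    coverages = []
    for decs in blocks.values():
        for d in set(decs):             # one rule per decision in a block
            coverages.append(decs.count(d) / class_size[d])
    return sum(coverages) / len(coverages)

print(mean_coverage(["a"]))        # 1.0: coarse partition, high coverage
print(mean_coverage(["a", "b"]))   # 0.5: finer partition, lower coverage
```

The contrast between the two calls illustrates the paper's point: a coarser partition that still discerns the decision classes yields rules with higher average coverage.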


2005 ◽  
Vol 44 (04) ◽  
pp. 498-507 ◽  
Author(s):  
B. Smith ◽  
L. Goldberg ◽  
W. Ceusters

Summary Objective: The National Cancer Institute Thesaurus is described by its authors as “a biomedical vocabulary that provides consistent, unambiguous codes and definitions for concepts used in cancer research” and which “exhibits ontology-like properties in its construction and use”. We performed a qualitative analysis of the Thesaurus in order to assess its conformity with principles of good practice in terminology and ontology design. Materials and Methods: We used both the on-line browsable version of the Thesaurus and its OWL-representation (version 04.08b, released on August 2, 2004), measuring each in light of the requirements put forward in relevant ISO terminology standards and in light of ontological principles advanced in the recent literature. Results: We found many mistakes and inconsistencies with respect to the term-formation principles used, the underlying knowledge representation system, and missing or inappropriately assigned verbal and formal definitions. Conclusion: Version 04.08b of the NCI Thesaurus suffers from the same broad range of problems that have been observed in other biomedical terminologies. For its further development, we recommend the use of a more principled approach that allows the Thesaurus to be tested not just for internal consistency but also for its degree of correspondence to that part of reality which it is designed to represent.


2000 ◽  
Vol 177 ◽  
pp. 37-38
Author(s):  
M. Kramer ◽  
B. Klein ◽  
D. Lorimer ◽  
P. Müller ◽  
A. Jessner ◽  
...  

Abstract We report the status of a search for pulsars in the Galactic Centre, using a completely revised and improved high-sensitivity double-horn system at 4.85 GHz. We also present calculations of the success rate of periodicity searches for such a survey, showing that, in contrast to conclusions in the recent literature, pulsars can indeed be detected at the chosen search frequency.


2012 ◽  
Vol 178-181 ◽  
pp. 1887-1890
Author(s):  
Wen Bin Liu

In this paper, using graph theory, set theory, and iteration, we give a stepwise search algorithm that takes the number of transfers as a parameter. By preprocessing the transit-line data, lines are merged within the algorithm and the computational model is simplified. Through optimization over the left and right circuit stations and over stations repeated on the same line, the shortest-time function on the circuit is realized. We further take into account the subway and walking times at all stations.
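The abstract does not give the algorithm itself, but a transfer-count-parameterized search can be sketched as follows (the network, line names, and the line-graph formulation are our own invention): treat each transit line as a node, connect two lines when they share a station, and breadth-first search over lines so that search depth counts transfers.

```python
# Hedged sketch of a transfer-minimising route search over a toy
# network; not the paper's algorithm or data.
from collections import deque

lines = {
    "L1": {"A", "B", "C"},
    "L2": {"C", "D", "E"},
    "L3": {"E", "F", "G"},
}

def min_transfers(src, dst):
    """BFS over lines: depth in the search equals transfers made."""
    start = {name for name, st in lines.items() if src in st}
    seen, queue = set(start), deque((name, 0) for name in start)
    while queue:
        name, transfers = queue.popleft()
        if dst in lines[name]:
            return transfers
        for other, stations in lines.items():
            if other not in seen and lines[name] & stations:
                seen.add(other)          # lines sharing a station connect
                queue.append((other, transfers + 1))
    return None                          # destination unreachable

print(min_transfers("A", "B"))  # 0: same line
print(min_transfers("A", "D"))  # 1: change at C
print(min_transfers("A", "G"))  # 2: change at C and at E
```

Merging lines that share stations into a single line-level graph is what shrinks the model: the search runs over a handful of lines rather than every station.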


2011 ◽  
Vol 21 (4) ◽  
pp. 883-911 ◽  
Author(s):  
MIHNEA IANCU ◽  
FLORIAN RABE

Over recent decades there has been a trend towards formalised mathematics, and a number of sophisticated systems have been developed both to support the formalisation process and to verify the results mechanically. However, each tool is based on a specific foundation of mathematics, and formalisations in different systems are not necessarily compatible. Therefore, the integration of these foundations has received growing interest. We contribute to this goal by using LF as a foundational framework in which the mathematical foundations themselves can be formalised and therefore also the relations between them. We represent three of the most important foundations – Isabelle/HOL, Mizar and ZFC set theory – as well as relations between them. The relations are formalised in such a way that the framework permits the extraction of translation functions, which are guaranteed to be well defined and sound. Our work provides the starting point for a systematic study of formalised foundations in order to compare, relate and integrate them.


2019 ◽  
Vol 17 (1) ◽  
pp. 423-438
Author(s):  
Choonkil Park ◽  
Nasir Shah ◽  
Noor Rehman ◽  
Abbas Ali ◽  
Muhammad Irfan Ali ◽  
...  

Abstract Soft set theory and rough set theory are two new tools for handling uncertainty. Graph theory is a convenient way to depict certain kinds of information, and soft graphs serve this purpose particularly well. In order to discuss uncertainty in soft graphs, some new types of graphs, called soft covering based rough graphs, are introduced. Several basic properties of these newly defined graphs are explored. Applications of soft covering based rough graphs in decision making can be very fruitful; to this end, an algorithm is proposed.


2003 ◽  
Vol 57 (3) ◽  
pp. 643-659 ◽  
Author(s):  
Daniel W. Drezner

Why do policymakers consistently employ economic sanctions even though scholars consider them an ineffective tool of statecraft? Game-theoretic models of economic coercion suggest the success rate may be understated because of selection effects. When the targeted country prefers conceding to incurring the cost of sanctions, it has an incentive to acquiesce before the imposition of sanctions. The bulk of successful coercion episodes should therefore end with sanctions threatened but not imposed. This contradicts the recent literature on sanctions, which assumes that sanctions rarely, if ever, work at generating significant concessions from the targeted country and are imposed for domestic or symbolic political reasons. If the game-theoretic argument is correct, the crucial cases to study are those in which coercion is threatened but not implemented. A statistical analysis of data on sanctions in pursuit of economic or regulatory goals strongly supports the game-theoretic argument. These results suggest that the significance of economic coercion has been undervalued in the study of statecraft and international relations more generally.


This paper contains a formal framework within which logic, set theory and programming are presented together. These elements can be presented together because, in this work, we no longer regard a (procedural) programming notation (such as PASCAL) as a notation for expressing a computation; rather, we regard it as a mere extension to the conventional language of logic and set theory. The extension constitutes a convenient (economical) way of expressing certain relational statements. A consequence of this point of view is that the activity of program construction is transformed into that of proof construction. To ensure that this activity of proof construction can be given a sound mechanizable foundation, we present a number of theories in the form of some basic deduction and definition rules. These theories comprise the two logical calculi, a weaker version of standard Zermelo-Fraenkel set theory, and some other elementary mathematical theories leading up to the construction of the natural numbers. This last theory acts as a paradigm for the construction of other types such as sequences or trees. In parallel with these mathematical constructions, we axiomatize a certain programming notation by giving equivalents of its basic constructs within logic and set theory. A number of other non-logical theories are also presented, which allow us to completely mechanize the calculus of proof implied by this framework.


Author(s):  
Arturo Tozzi

When an edge is removed, a cycle graph Cn becomes a tree with n − 1 edges (a path graph). This observation from extremal set theory leads us to the realm of topology, in which a topological manifold of genus 1 turns out to be of genus 0. Starting from these premises, we prove a theorem suggesting that a manifold with disjoint points must be of genus 0, while a manifold of genus 1 cannot encompass disjoint points.
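The opening observation is easy to verify computationally (toy code of our own, not from the paper): deleting any single edge from the cycle graph C_n leaves a connected graph with n − 1 edges, which is exactly the characterization of a tree (here, a path).

```python
# Check that C_n minus one edge is a tree, using the standard
# criterion: connected and exactly n-1 edges.

def is_tree(n, edges):
    """A graph on n vertices is a tree iff it has n-1 edges and is connected."""
    if len(edges) != n - 1:
        return False
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    seen, stack = {0}, [0]
    while stack:                        # depth-first reachability from vertex 0
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    return len(seen) == n

n = 6
cycle = [(i, (i + 1) % n) for i in range(n)]   # the cycle graph C_6
print(is_tree(n, cycle))        # False: the cycle itself has n edges
print(is_tree(n, cycle[1:]))    # True: removing one edge leaves a path
```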

