On modular approaches to grammar: Evidence from Polish

Author(s):  
Bartłomiej Czaplicki

Abstract
Modularity of grammar has been explicitly or tacitly assumed in many generative analyses. Modules are separate computational systems that perform specific tasks and make use of domain-specific information. It is argued that the concept is difficult to maintain in the light of evidence from Polish. I look at palatalization effects before vowels and conclude that phonological regularities must have access to morphosyntactic information. In addition, certain regularities in the selection of diminutive allomorphs suggest that morphology must have access to phonetic information. As domain specificity, the core concept of modular approaches, is compromised, modularity does not seem a likely candidate for a universal property of grammar.

2020, Vol 10 (9), pp. 1153
Author(s):  
Yumin Gong

Sven Birkerts (1951-) is an American essayist whose essay The Strange Days has been well received by readers. In the context of globalization, literary translation is an important part of cultural exchange. Skopos Theory applies the concept of skopos (purpose) to translation; its core idea is that translation strategies and methods are determined by the purpose of the translation. In the process of translating, the translator should follow three principles: the skopos rule, the coherence rule, and the fidelity rule. The translation of literary texts fits well with this framework. This paper analyzes the advantages of Skopos Theory in selecting translation strategies for The Strange Days from the perspective of the skopos, coherence, and fidelity rules.


2019, Vol 2 (5)
Author(s):  
Mengda Zhang ◽  
Chenjing Zhou ◽  
Tian-tian Zhang ◽  
Yan Han

Quantitatively selecting check indices is central to calibrating microscopic traffic simulation parameters at signalized intersections. Five indices from the node (intersection) module of VISSIM were selected as the check index set, and twelve simulation parameters from the core module were selected as the simulation parameter set. An optimized parameter calibration process was proposed, and a VISSIM model of the intersection of Huangcun West Street and Xinghua Street in Beijing was built to verify it. A sensitivity analysis between each check index and each simulation parameter in their respective sets was conducted, and the sensitive parameter sets of the different check indices were obtained and compared. The results show that the sensitive parameter sets differ in size across check indices, with average vehicle delay producing the largest set, which confirms the need to select check indices quantitatively. The results can serve as a reference for the scientific selection of check indices and improve the efficiency of parameter calibration studies.
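As an illustration of the kind of sensitivity screen described above, the sketch below performs a one-at-a-time (OAT) perturbation of each simulation parameter and flags those whose perturbation noticeably shifts a check index such as average vehicle delay. It is a minimal sketch: run_simulation is a hypothetical placeholder rather than part of any real VISSIM interface, and the parameter names, values, and thresholds are made up for illustration.

```python
# One-at-a-time (OAT) sensitivity screen between simulation parameters and a
# check index (e.g. average vehicle delay). run_simulation() is a hypothetical
# placeholder for a VISSIM run; it is NOT part of any real VISSIM API.

def run_simulation(params):
    # Toy deterministic stand-in: returns a fake check-index value (seconds of
    # average delay) computed from the parameter values.
    return sum(abs(v) for v in params.values()) * 0.1

def oat_sensitivity(base_params, perturbation=0.10, threshold=0.05):
    """Flag parameters whose +/-10% perturbation shifts the check index by
    more than `threshold` relative to the baseline run."""
    baseline = run_simulation(base_params)
    sensitive = []
    for name, value in base_params.items():
        shifts = []
        for factor in (1 - perturbation, 1 + perturbation):
            perturbed = dict(base_params, **{name: value * factor})
            shifts.append(abs(run_simulation(perturbed) - baseline))
        if max(shifts) / abs(baseline) > threshold:
            sensitive.append(name)
    return sensitive

# Example with three made-up parameters standing in for the twelve core-module
# parameters of the study.
params = {"desired_speed": 50.0, "min_headway": 0.9, "accepted_deceleration": -3.0}
print(oat_sensitivity(params))   # -> ['desired_speed'] for this toy model
```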


2021, Vol 11 (9), pp. 4011
Author(s):  
Dan Wang ◽  
Jindong Zhao ◽  
Chunxiao Mu

In the field of modern bidding, electronic bidding is leading a new trend of development; its convenience, efficiency, and other significant advantages are effectively promoting reform and innovation in China's bidding sector. Nowadays, most systems require a strong, trusted third party to guarantee their integrity and security. However, with the development of blockchain technology and the rise of privacy protection, researchers have begun to emphasize the core concept of decentralization. This paper introduces a decentralized electronic bidding system based on blockchain and smart contracts. The system uses a blockchain to replace the traditional database and uses chaincode to process business logic. For data interaction, encryption techniques such as a zero-knowledge proof based on graph isomorphism are used to strengthen privacy protection, improving the anonymity of participants, the privacy of data transmission, and the traceability and verifiability of data. Compared with other electronic bidding systems, this system is more secure and efficient and supports anonymous operation, fully protecting private information in the bidding process.
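The zero-knowledge proof based on graph isomorphism mentioned above follows a well-known commit-challenge-response pattern. The sketch below shows one simplified, unblinded round of that classic protocol; it is illustrative only, omits the cryptographic commitments and the many repetitions a real system would require, and is not code from the system described in the paper.

```python
# One round of the classic zero-knowledge proof of graph isomorphism: the
# prover knows a secret isomorphism sigma with G2 = sigma(G1) and convinces
# the verifier without revealing sigma. Graphs are sets of 2-element frozensets.
import random

def apply_perm(perm, graph):
    """Relabel every vertex v of `graph` as perm[v]."""
    return {frozenset(perm[v] for v in edge) for edge in graph}

def compose(p, q):
    """Permutation p o q (apply q first, then p)."""
    return [p[q[i]] for i in range(len(p))]

def invert(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return inv

n = 5
g1 = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]}
sigma = list(range(n))
random.shuffle(sigma)              # prover's secret isomorphism
g2 = apply_perm(sigma, g1)         # public instance: G2 = sigma(G1)

# Commit: the prover sends H, a freshly re-randomized copy of G1.
pi = list(range(n))
random.shuffle(pi)
h = apply_perm(pi, g1)

# Challenge: the verifier asks for the isomorphism from G1 or from G2 to H.
challenge = random.choice([0, 1])
if challenge == 0:
    response = pi                          # H = pi(G1)
    assert apply_perm(response, g1) == h
else:
    response = compose(pi, invert(sigma))  # H = (pi o sigma^-1)(G2)
    assert apply_perm(response, g2) == h
print("round accepted, challenge =", challenge)
```

Repeating such rounds many times makes cheating succeed with only negligible probability, while no single round reveals the secret mapping itself.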


2021, Vol 7 (12), pp. eabc9800
Author(s):  
Ryan J. Gallagher ◽  
Jean-Gabriel Young ◽  
Brooke Foucault Welles

Core-periphery structure, the arrangement of a network into a dense core and sparse periphery, is a versatile descriptor of various social, biological, and technological networks. In practice, different core-periphery algorithms are often applied interchangeably despite the fact that they can yield inconsistent descriptions of core-periphery structure. For example, two of the most widely used algorithms, the k-cores decomposition and the classic two-block model of Borgatti and Everett, extract fundamentally different structures: The latter partitions a network into a binary hub-and-spoke layout, while the former divides it into a layered hierarchy. We introduce a core-periphery typology to clarify these differences, along with Bayesian stochastic block modeling techniques to classify networks in accordance with this typology. Empirically, we find a rich diversity of core-periphery structure among networks. Through a detailed case study, we demonstrate the importance of acknowledging this diversity and situating networks within the core-periphery typology when conducting domain-specific analyses.
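To make the contrast between the two notions concrete, the sketch below computes the layered k-core decomposition of a small benchmark graph with networkx and, next to it, a crude degree-threshold split into one core block and one periphery block. The two-block split is only an illustrative stand-in, not the Borgatti-Everett algorithm or the Bayesian stochastic block models used in the paper.

```python
# Contrast the layered (k-core) and binary (two-block) views of core-periphery
# structure on Zachary's karate club graph.
import networkx as nx

G = nx.karate_club_graph()

# Layered view: each node is assigned the k of the innermost k-core it belongs to.
core_number = nx.core_number(G)
layers = sorted(set(core_number.values()), reverse=True)
print("k-core layers (k, size):",
      [(k, sum(1 for v in core_number.values() if v == k)) for k in layers])

# Binary hub-and-spoke view: a crude split into one dense core and a sparse
# periphery by thresholding degree at the mean degree (illustration only).
mean_deg = sum(d for _, d in G.degree()) / G.number_of_nodes()
core = {v for v, d in G.degree() if d > mean_deg}
periphery = set(G) - core
print(f"two-block stand-in: |core|={len(core)}, |periphery|={len(periphery)}")
```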


Author(s):  
Yufei Li ◽  
Xiaoyong Ma ◽  
Xiangyu Zhou ◽  
Pengzhen Cheng ◽  
Kai He ◽  
...  

Abstract
Motivation: Bio-entity coreference resolution focuses on identifying the coreferential links in biomedical texts, which is crucial for completing bio-events' attributes and interconnecting events into bio-networks. Previously, deep neural network-based general-domain systems, among the most powerful tools available, have been applied to the biomedical domain with domain-specific information integrated. However, such methods may introduce considerable noise because they combine context and complex domain-specific information insufficiently.
Results: In this paper, we explore how to leverage an external knowledge base in a fine-grained way to better resolve coreference by introducing a knowledge-enhanced Long Short-Term Memory (LSTM) network, which flexibly encodes the knowledge information inside the LSTM. Moreover, we propose a knowledge attention module to extract informative knowledge effectively based on context. The experimental results on the BioNLP and CRAFT datasets achieve state-of-the-art performance, with gains of 7.5 F1 on BioNLP and 10.6 F1 on CRAFT. Additional experiments also demonstrate superior performance on cross-sentence coreference.
Supplementary information: Supplementary data are available at Bioinformatics online.
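The sketch below shows the general shape of a knowledge attention module of the kind described: an LSTM context vector attends over candidate knowledge-base embeddings and returns a weighted knowledge summary. It is a generic attention layer written in PyTorch with made-up dimensions, not the authors' exact architecture.

```python
# Generic knowledge attention: a context vector scores candidate KB embeddings
# and the softmax-weighted sum is returned as a knowledge summary.
import torch
import torch.nn as nn

class KnowledgeAttention(nn.Module):
    def __init__(self, hidden_dim, kb_dim):
        super().__init__()
        self.query = nn.Linear(hidden_dim, kb_dim)

    def forward(self, context, kb_embeddings):
        # context: (batch, hidden_dim); kb_embeddings: (batch, n_facts, kb_dim)
        q = self.query(context).unsqueeze(1)                      # (batch, 1, kb_dim)
        scores = (q * kb_embeddings).sum(-1)                      # (batch, n_facts)
        weights = torch.softmax(scores, dim=-1)
        summary = (weights.unsqueeze(-1) * kb_embeddings).sum(1)  # (batch, kb_dim)
        return summary, weights

# Toy usage: a BiLSTM over token embeddings plus attention over four KB facts.
lstm = nn.LSTM(input_size=50, hidden_size=64, batch_first=True, bidirectional=True)
attn = KnowledgeAttention(hidden_dim=128, kb_dim=32)
tokens = torch.randn(2, 10, 50)      # (batch, seq_len, emb_dim)
kb = torch.randn(2, 4, 32)           # (batch, n_facts, kb_dim)
outputs, _ = lstm(tokens)
summary, weights = attn(outputs[:, -1, :], kb)
print(summary.shape, weights.shape)  # torch.Size([2, 32]) torch.Size([2, 4])
```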


2004, Vol 02 (01), pp. 215-239
Author(s):  
TOLGA CAN ◽  
YUAN-FANG WANG

We present a new method for conducting protein structure similarity searches that improves on the efficiency of some existing techniques. Our method is grounded in the theory of differential geometry on 3D space curve matching. We generate shape signatures for proteins that are invariant, localized, robust, compact, and biologically meaningful. The invariance of the shape signatures allows us to improve similarity search efficiency by adopting a hierarchical coarse-to-fine strategy. We index the shape signatures using an efficient hashing-based technique, which lets us screen out unlikely candidates and perform detailed pairwise alignments only for the small number of candidates that survive the screening process. In contrast to other hashing-based techniques, ours employs domain-specific information (not just geometric information) in constructing the hash key and is therefore better tuned to the domain of biology. Furthermore, the invariance, localization, and compactness of the shape signatures allow us to use a well-known local sequence alignment algorithm for aligning two protein structures. One measure of the efficacy of the proposed technique is that we were able to perform structure alignment queries 36 times faster (on average) than a well-known method while keeping the quality of the query results at an approximately similar level.
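The coarse-to-fine screening idea can be sketched as follows: an invariant signature is quantized into a hash key, and only database entries that fall into the same bucket as the query go on to detailed pairwise alignment. The signature and data below are made-up placeholders, not the paper's actual space-curve descriptor.

```python
# Coarse-to-fine screening with hashed shape signatures: bucket structures by a
# quantized invariant signature, then align only the query's bucket-mates.
from collections import defaultdict

def shape_signature(profile):
    # Placeholder signature: coarse summary statistics of a 1-D "curvature"
    # profile standing in for a 3-D space-curve descriptor.
    mean = sum(profile) / len(profile)
    spread = max(profile) - min(profile)
    return (mean, spread)

def hash_key(signature, bin_size=0.5):
    # Quantize the invariant signature so similar shapes share a bucket.
    return tuple(round(x / bin_size) for x in signature)

# Build the hash index over a toy "database" of curvature profiles.
database = {
    "prot_A": [0.10, 0.20, 0.30, 0.20],
    "prot_B": [0.90, 1.10, 1.00, 0.80],
    "prot_C": [0.10, 0.25, 0.30, 0.15],
}
index = defaultdict(list)
for name, profile in database.items():
    index[hash_key(shape_signature(profile))].append(name)

# Coarse screen: only bucket-mates of the query survive to detailed alignment.
query = [0.12, 0.22, 0.28, 0.18]
candidates = index[hash_key(shape_signature(query))]
print("candidates for detailed alignment:", candidates)   # prot_A, prot_C
```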


2016, Vol 34 (4), pp. 280-289
Author(s):  
Ellen T Crumley

Background: Internationally, physicians are integrating medical acupuncture into their practice. Although there are some informative surveys and reviews, there are few international, exploratory studies detailing how physicians have accommodated medical acupuncture (eg, by modifying schedules, space and processes).
Objective: To examine how physicians integrate medical acupuncture into their practice.
Methods: Semi-structured interviews and participant observations of physicians practising medical acupuncture were conducted using convenience and snowball sampling. Data were analysed in NVivo and themes were developed. Despite variation, three principal models were developed to summarise the different ways that physicians integrated medical acupuncture into their practice, using the core concept of 'helping'. Quotes were used to illustrate each model and its corresponding themes.
Results: There were 25 participants from 11 countries: 21 agreed to be interviewed and four engaged in participant observations. Seventy-two per cent were general practitioners. The three models were: (1) appointments (44%); (2) clinics (44%); and (3) full-time practice (24%). Some physicians held both appointments and regular clinics (models 1 and 2). Most full-time physicians initially tried appointments and/or clinics. Some physicians charged to offset administration costs or compensate for their time.
Discussion: Despite variation within each category, the three models encapsulated how physicians described their integration of medical acupuncture. Physicians varied in how often they administered medical acupuncture and the amount of time they spent with patients. Although 24% of physicians administered medical acupuncture full-time, most practised it part-time. Each physician incorporated medical acupuncture in the way that worked best for their practice.


2018, Vol 7 (11), pp. 236
Author(s):  
Ank Michels ◽  
Harmen Binnema

In recent decades, so-called "mini-publics" have been organized in many countries to renew policy making and democracy. One characteristic of mini-publics is that the selection of participants is based on random sampling, or sortition, which gives each member of the community an equal chance of being selected. Another feature is that deliberation forms the core of the process by which proposals are developed. In this paper, we investigate the possibilities and challenges of sortition and deliberation in the context of the call for a deepening of democracy and more citizen engagement in policy making. Based on extensive research on citizens' forums (G1000) in the Netherlands, we show the potential of mini-publics, but a number of shortcomings as well. Some of these are related to the specific design of the G1000, while others are of a more fundamental nature and stem from the contradictory democratic values that deliberative mini-publics try to combine. One of these concerns the tension between the quality of deliberation and political impact. We conclude that combining institutional approaches could be a way to deal with these tensions and a step toward both deepening and connecting democratic processes.

