Discourse markers and (dis)fluency in English and French

2017 ◽  
Vol 22 (2) ◽  
pp. 242-269 ◽  
Author(s):  
Ludivine Crible

While discourse markers (DMs) and (dis)fluency have been extensively studied in the past as separate phenomena, corpus-based research combining large-scale yet fine-grained annotations of both categories has never been carried out before. Integrating these two levels of analysis, while methodologically challenging, is not only innovative but also highly relevant to the investigation of spoken discourse in general and of form-meaning patterns in particular. The aim of this paper is to provide corpus-based evidence of the register-sensitivity of DMs and other disfluencies (e.g. pauses, repetitions) and of their tendency to combine in recurrent clusters. These claims are supported by quantitative findings on the variation and combination of DMs with other (dis)fluency devices in DisFrEn, a richly annotated and comparable English-French corpus representative of eight interaction settings. The analysis uncovers the prominent place of DMs within (dis)fluency and meaningful association patterns between forms and functions, in a usage-based approach to meaning-in-context.
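To make the notion of DMs clustering with other (dis)fluency devices concrete, here is a minimal sketch; the tokens and tag set are invented for illustration and are not the DisFrEn annotation scheme. It counts adjacent pairs in which a DM neighbours another (dis)fluency device in a toy annotated utterance.

```python
from collections import Counter

# Toy token-level annotation (hypothetical tags): DM = discourse marker,
# UP = unfilled pause, FP = filled pause, RP = repetition, _ = fluent token.
utterance = [
    ("well", "DM"), ("(0.4)", "UP"), ("I", "_"), ("I", "RP"),
    ("mean", "DM"), ("uh", "FP"), ("it", "_"), ("works", "_"),
    ("you", "DM"), ("know", "DM"),
]

def cluster_bigrams(tokens):
    """Count adjacent tag pairs where a DM neighbours another (dis)fluency device."""
    pairs = Counter()
    for (_, a), (_, b) in zip(tokens, tokens[1:]):
        if "DM" in (a, b) and "_" not in (a, b):
            pairs[(a, b)] += 1
    return pairs

print(cluster_bigrams(utterance))
# Counter({('DM', 'UP'): 1, ('RP', 'DM'): 1, ('DM', 'FP'): 1, ('DM', 'DM'): 1})
```

Aggregating such bigram counts per register is one simple way to quantify the "recurrent clusters" the abstract refers to.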

1989 ◽  
Vol 67 (7) ◽  
pp. 2005-2016 ◽  
Author(s):  
Alan Dickman ◽  
Stanton Cook

Two mortality factors create large-scale pattern in forests of Tsuga mertensiana in the subalpine central Oregon Cascade Mountains. Half of an 18 000-ha study area has experienced stand-destroying fire during the last 500 years. These fires varied in size from 1 to 3200 ha. Individual Phellinus weirii infestations are smaller than most fires and collectively cover less total area, but they are more numerous and fine-grained in their dispersion. Postfire stands are colonized by Pinus contorta, which persists for two centuries before being replaced by mountain hemlock. In stands older than 200 years, Phellinus weirii becomes apparent as it spreads from centers and forms patches where it infects trees and alters the plant community. Fungal isolates were collected from 61 infestations and subjected to clonal analysis. The spatial dispersion of ramets and genets supports the inference that many infestations are sibling ramets of genets that survived stand-destroying fire. The age distribution of genets suggests that infestations have been initiated or modified by basidiospore infection during the past 1300 years; several genets are older than 1000 years. Fire has reduced the visible area of infestation of the fungus, probably by favoring less susceptible host species and by reducing the modal size of dead roots and logs. While repeated fire could reduce the level of fungal infestation, infestations may enhance the probability of stand-destroying fires.


2015 ◽  
Vol 61 ◽  
pp. 34-48
Author(s):  
Catherine Morgan

Over the past year the School has delivered a rich and varied research programme, combining projects on antiquity (spanning the Palaeolithic to Byzantine periods, and from science-based archaeology to epigraphy, including the work of the Fitch Laboratory and the Knossos Research Centre) with research in fields from the fine arts to history and the social sciences (see Map 2). At Knossos, a new investigation in the suburb of Gypsadhes, directed by Ioanna Serpetsedaki (23rd EPCA), Eleni Hatzaki (Cincinnati), Amy Bogaard (Oxford) and Gianna Ayala (Sheffield), forms part of Oxford University's ERC-funded project Agricultural Origins of Urban Civilisation. The Gypsadhes excavation features large-scale bioarchaeological research, aimed at providing the fine-grained information necessary to reconstruct the Knossian economy through time.


2016 ◽  
Vol 2 (7) ◽  
pp. e1501215 ◽  
Author(s):  
Nabeel Abdur Rehman ◽  
Shankar Kalyanaraman ◽  
Talal Ahmad ◽  
Fahad Pervaiz ◽  
Umar Saif ◽  
...  

Thousands of lives are lost every year in developing countries because epidemics are not detected early, owing to the lack of real-time disease surveillance data. We present results from a large-scale deployment of a telephone triage service as a basis for dengue forecasting in Pakistan. Our system uses statistical analysis of dengue-related phone calls to accurately forecast suspected dengue cases 2 to 3 weeks ahead of time at a sub-city level (correlation of up to 0.93). Our system has been operational at scale in Pakistan for the past 3 years and has received more than 300,000 phone calls. The predictions from our system are widely disseminated to public health officials and form a critical part of active government strategies for dengue containment. Our work is the first to demonstrate, with significant empirical evidence, that an accurate, location-specific disease forecasting system can be built using analysis of call volume data from a public health hotline.
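The core idea, lagging call volumes against later case counts, can be sketched as follows. This is an illustrative simplification with invented weekly numbers, not the authors' actual forecasting model.

```python
import numpy as np

# Invented weekly series: hotline call volumes and confirmed dengue cases.
calls = np.array([12, 18, 25, 40, 65, 90, 130, 110, 80, 50], dtype=float)
cases = np.array([3, 4, 6, 10, 16, 28, 45, 70, 60, 42], dtype=float)

LAG = 2                                    # forecast horizon in weeks
x, y = calls[:-LAG], cases[LAG:]           # calls now vs. cases 2 weeks later

r = np.corrcoef(x, y)[0, 1]                # lagged correlation
slope, intercept = np.polyfit(x, y, 1)     # simple lagged linear predictor

print(f"lagged correlation: {r:.2f}")
print(f"forecast 2 weeks ahead: {slope * calls[-1] + intercept:.0f} cases")
```

In practice one would fit such a predictor per sub-city area and validate it out of sample, but the lag-then-regress structure is the essential mechanism.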


2020 ◽  
Author(s):  
Donna Rose Addis

Mental time travel (MTT) is defined as projecting the self into the past and the future. Despite growing evidence of the similarities of remembering past and imagining future events, dominant theories conceive of these as distinct capacities. I propose that memory and imagination are fundamentally the same process – constructive episodic simulation – and demonstrate that the ‘simulation system’ meets the three criteria of a neurocognitive system. Irrespective of whether one is remembering or imagining, the simulation system: (1) acts on the same information, drawing on elements of experience ranging from fine-grained perceptual details to coarser-grained conceptual information and schemas about the world; (2) is governed by the same rules of operation, including associative processes that facilitate construction of a schematic scaffold, the event representation itself, and the dynamic interplay between the two (cf. predictive coding); and (3) is subserved by the same brain system. I also propose that by forming associations between schemas, the simulation system constructs multi-dimensional cognitive spaces, within which any given simulation is mapped by the hippocampus. Finally, I suggest that simulation is a general capacity that underpins other domains of cognition, such as the perception of ongoing experience. This proposal has some important implications for the construct of ‘MTT’, suggesting that ‘time’ and ‘travel’ may not be defining, or even essential, features. Rather, it is the ‘mental’ rendering of experience that is the most fundamental function of this simulation system, enabling humans to re-experience the past, pre-experience the future, and also comprehend the complexities of the present.


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention, to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia (Figure 1). The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilized the brown coal resource of the region, e.g. gasification and char production. Consequently, industrial wastewaters have dominated the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and with the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of the adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures which could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when the total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs) and anti-inflammatory peptides (AIPs). It is considered that peptides can regulate various complex diseases that were previously untreatable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small organic drugs, peptide-based therapy exhibits high specificity and minimal toxicity, so peptides are widely used in the design and discovery of new potent drugs. Currently, large-scale screening of peptide activity with traditional approaches is costly, time-consuming and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced to predict peptide activity, owing to their accuracy and effectiveness. In this review, we document recent progress in machine learning-based prediction of peptide activity, which will be of great benefit to the discovery of potentially active AMPs, ACPs and AIPs.
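As a hedged illustration of the kind of pipeline such reviews survey (not a method taken from this review), the sketch below featurizes peptide sequences by amino acid composition and trains a scikit-learn classifier. All sequences and activity labels are toy data for illustration only.

```python
from collections import Counter
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Fraction of each of the 20 standard amino acids in the sequence."""
    counts = Counter(seq)
    return [counts[a] / len(seq) for a in AMINO_ACIDS]

# Toy training set: (sequence, label); 1 = active, 0 = inactive (labels invented).
data = [
    ("KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK", 1),
    ("GIGKFLHSAKKFGKAFVGEIMNS", 1),
    ("AAAAGGGGSSSSTTTT", 0),
    ("DEDEDEDEDEDE", 0),
]
X = [composition(seq) for seq, _ in data]
y = [label for _, label in data]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([composition("FLPIIAKLLSGLL")]))  # classify a query peptide
```

Real studies use far larger curated datasets and richer descriptors (physicochemical properties, sequence embeddings), but the featurize-then-classify structure is the same.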


Author(s):  
Jeasik Cho

This book provides the qualitative research community with insight into how to evaluate the quality of qualitative research, a topic that has gained little attention during the past few decades. We, qualitative researchers, read journal articles, serve on master's and doctoral committees, and also decide whether conference proposals, manuscripts, or large-scale grant proposals should be accepted or rejected. It is assumed that various perspectives or criteria, depending on various paradigms, theories, or fields of discipline, have been used in assessing the quality of qualitative research. Nonetheless, until now, no textbook has been specifically devoted to exploring the theories, practices, and reflections associated with the evaluation of qualitative research. This book constructs a typology for evaluating qualitative research, examines actual information from websites and qualitative journal editors, and reflects on some challenges currently encountered by the qualitative research community. Many different kinds of journals' review guidelines and available assessment tools are collected and analyzed, and the core criteria that stand out among these evaluation tools are presented. Readers are invited to join the author in confidently proclaiming: “Fortunately, there are commonly agreed, bold standards for evaluating the goodness of qualitative research in the academic research community. These standards are a part of what is generally called ‘scientific research.’ ”


2019 ◽  
Vol 22 (3) ◽  
pp. 365-380 ◽  
Author(s):  
Matthias Olthaar ◽  
Wilfred Dolfsma ◽  
Clemens Lutz ◽  
Florian Noseleit

In a competitive business environment at the Bottom of the Pyramid, smallholders supplying global value chains may be thought to be at the whim of downstream large-scale players and local market forces, leaving no room for strategic entrepreneurial behavior. In such a context we test the relationship between the use of strategic resources and firm performance. We adopt the resource-based theory and show that seemingly homogeneous smallholders deploy resources differently and, consequently, some do outperform others. We argue that the resource-based theory yields a more fine-grained understanding of smallholder performance than the approaches generally applied in agricultural economics. We develop a mixed-method approach that allows one to pinpoint relevant, industry-specific resources and to identify empirically the relative contribution of each resource to competitive advantage. The results show that proper use of quality labor, storage facilities, timing of selling, and availability of animals are key capabilities.
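A minimal sketch of the kind of analysis implied, regressing smallholder performance on resource-use indicators to gauge each resource's relative contribution; the variable names and all data below are simulated assumptions, not the authors' dataset or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical per-smallholder resource indicators (binary or scaled to 0..1).
labor_quality = rng.random(n)
storage       = rng.integers(0, 2, n).astype(float)
sell_timing   = rng.random(n)
animals       = rng.integers(0, 2, n).astype(float)

# Simulated performance with assumed contributions plus noise.
performance = (0.8 * labor_quality + 0.5 * storage
               + 0.6 * sell_timing + 0.3 * animals
               + rng.normal(0, 0.2, n))

# Ordinary least squares: each coefficient estimates a resource's contribution.
X = np.column_stack([np.ones(n), labor_quality, storage, sell_timing, animals])
coef, *_ = np.linalg.lstsq(X, performance, rcond=None)
for name, b in zip(["const", "labor", "storage", "timing", "animals"], coef):
    print(f"{name:8s} {b:+.2f}")
```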


Geosciences ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 41
Author(s):  
Tim Jurisch ◽  
Stefan Cantré ◽  
Fokke Saathoff

A variety of recent studies has demonstrated the applicability of different dried, fine-grained dredged materials as replacement material for erosion-resistant sea dike covers. In Rostock, Germany, a large-scale field experiment was conducted in which different dredged materials were tested with regard to installation technology, stability, turf development, infiltration, and erosion resistance. The infiltration experiments to study the development of a seepage line in the dike body showed unexpected measurement results. Due to the high complexity of the problem, standard geo-hydraulic models proved unable to explain these results. Therefore, different methods of inverse infiltration modeling were applied, such as the parameter estimation tool PEST and the AMALGAM algorithm; the two approaches are compared and discussed in the paper. A sensitivity analysis confirmed the presumption of non-linear model behavior for the infiltration problem, and the eigenvalue ratio indicates that dike infiltration is an ill-posed problem. Although this complicates the inverse modeling (e.g., termination in local minima), parameter sets close to an optimum were found with both the PEST and AMALGAM algorithms. Together with the field measurement data, this information supports the rating of the effective material properties of the dredged materials used as dike cover material.
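As an illustration of the inverse-modeling idea (a generic least-squares calibration, not PEST or AMALGAM), the sketch below fits a toy seepage-line model to invented piezometer readings and uses the spread of singular values of the Jacobian as a rough conditioning check; model form, parameters, and observations are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Invented observations: seepage-line height (m) at times (h) after flooding.
t_obs = np.linspace(0, 48, 12)
h_obs = np.array([0.05, 0.18, 0.31, 0.42, 0.50, 0.57,
                  0.62, 0.66, 0.69, 0.71, 0.73, 0.74])

def seepage_model(params, t):
    """Toy saturation curve: h(t) = h_max * (1 - exp(-k * t))."""
    h_max, k = params
    return h_max * (1.0 - np.exp(-k * t))

def residuals(params):
    return seepage_model(params, t_obs) - h_obs

# Inverse step: estimate (h_max, k) by minimizing the misfit to observations.
fit = least_squares(residuals, x0=[1.0, 0.05], bounds=([0, 0], [5, 1]))
print("estimated h_max, k:", fit.x)

# A large ratio of largest to smallest singular value of the Jacobian at the
# optimum indicates poor conditioning, i.e. an ill-posed inverse problem.
sv = np.linalg.svd(fit.jac, compute_uv=False)
print("singular-value ratio:", sv[0] / sv[-1])
```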

