A Survey on Edge Performance Benchmarking

2021
Vol 54 (3)
pp. 1-33
Author(s):  
Blesson Varghese
Nan Wang
David Bermbach
Cheol-Ho Hong
Eyal De Lara
et al.

Edge computing is the next Internet frontier, one that will leverage computing resources located near users, sensors, and data stores to provide more responsive services. It is therefore envisioned that a large-scale, geographically dispersed, and resource-rich distributed system will emerge and play a key role in the future Internet. However, given the loosely coupled nature of such complex systems, their operational conditions are expected to change significantly over time. In this context, the performance characteristics of such systems will need to be captured rapidly, a process referred to as performance benchmarking, to support application deployment, resource orchestration, and adaptive decision-making. Edge performance benchmarking is a nascent research avenue that has gained momentum over the past five years. This article first reviews articles published over the past three decades to trace the history of performance benchmarking from tightly coupled to loosely coupled systems. It then systematically classifies previous research on edge performance benchmarking by the system under test, the techniques analyzed, and the benchmark runtime.
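As a rough illustration of the kind of measurement such benchmarking involves, the sketch below times round-trip latency against an edge service endpoint and reports median and tail values. The endpoint URL, sample count, and metric choices are illustrative assumptions, not anything prescribed by the survey.

```python
# Minimal latency benchmark against an edge service endpoint.
# Illustrative sketch only: the URL, sample count, and timeout are
# hypothetical placeholders.
import statistics
import time
import urllib.request

ENDPOINT = "http://edge-node.example.com/health"  # hypothetical endpoint
N_REQUESTS = 50

def measure_latency(url: str, n: int) -> list[float]:
    """Return per-request round-trip latencies in milliseconds."""
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=5) as response:
            response.read()  # drain the body so the full transfer is timed
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

if __name__ == "__main__":
    samples = measure_latency(ENDPOINT, N_REQUESTS)
    print(f"median: {statistics.median(samples):.1f} ms")
    # 95th percentile: the last of the 19 cut points that split the
    # samples into 20 equal groups.
    print(f"p95:    {statistics.quantiles(samples, n=20)[-1]:.1f} ms")
```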

2018
Author(s):  
Yang Xu
Barbara Claire Malt
Mahesh Srinivasan

One way that languages are able to communicate a potentially infinite set of ideas through a finite lexicon is by compressing emerging meanings into words, such that over time, individual words come to express multiple related senses. We propose that overarching communicative and cognitive pressures have created systematic directionality in how new metaphorical senses have developed from existing word senses over the history of English. Given a large set of pairs of semantic domains, we used computational models to test which domains have more commonly been the starting points (source domains) and which the ending points (target domains) of metaphorical mappings over the past millennium. We found that a compact set of variables, including externality, embodiment, and valence, explains directionality in the majority of about 5,000 metaphorical mappings recorded over the past 1,100 years. These results provide the first large-scale historical evidence that metaphorical mapping is systematic and driven by measurable communicative and cognitive principles.
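To make the idea of directionality concrete, the sketch below scores two semantic domains on the variables the study names (externality, embodiment, valence) and predicts the higher-scoring domain as the metaphorical source. The domain names, feature values, and weights are invented for illustration; they are not the study's data, weights, or model.

```python
# Toy model of directionality in metaphorical mapping: score each
# semantic domain on externality, embodiment, and valence, then
# predict the higher-scoring domain as the source. All numbers are
# invented for illustration.

FEATURES = ("externality", "embodiment", "valence")
WEIGHTS = {"externality": 1.0, "embodiment": 1.0, "valence": 0.5}  # assumed

# Hypothetical per-domain feature scores in [0, 1].
DOMAIN_SCORES = {
    "body": {"externality": 0.9, "embodiment": 0.95, "valence": 0.6},
    "mind": {"externality": 0.2, "embodiment": 0.10, "valence": 0.5},
}

def direction_score(domain: str) -> float:
    """Weighted feature sum; higher values favor the source role."""
    return sum(WEIGHTS[f] * DOMAIN_SCORES[domain][f] for f in FEATURES)

def predict_source(a: str, b: str) -> str:
    """Predict which of two domains acts as the metaphorical source."""
    return a if direction_score(a) >= direction_score(b) else b

print(predict_source("body", "mind"))  # -> body (e.g., "grasp an idea")
```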


Author(s):  
Nuria Lloret Romero

E-collaboration and collaborative systems bring geographically dispersed teams together, supporting communication, coordination, and cooperation. From the scientific perspective, the development of theories and mechanisms for building collaborative systems presents exciting research challenges across information subfields. From the applications perspective, the capability to collaborate with users and other systems is essential if large-scale information systems of the future are to assist users in finding the information they need and solving the problems they have. This chapter presents a review of research in the area of creating collaborative applications and taxonomies. The author analyzes previous literature and examines practice cases and research prototypes in the domain of collaborative computing. Finally, the chapter provides a list of basic collaboration services and presents tools in relation to the services they provide. All surveyed tools are then classified into categories of functional services. In conclusion, the chapter highlights a number of areas for consideration and improvement that arise when studying collaborative applications.


2015
Vol 9 (2)
pp. 306-326
Author(s):  
Allan Megill

In recent years David Christian and others have promoted “Big History” as an innovative approach to the study of the past. The present paper juxtaposes to Big History an old Big History, namely, the tradition of “universal history” that flourished in Europe from the mid-sixteenth century until well into the nineteenth century. The claim to universality of works in that tradition depended on the assumed truth of Christianity, a fact that was fully acknowledged by the tradition’s adherents. The claim of the new Big History to universality likewise depends on prior assumptions. Simply stated, in its various manifestations the “new” Big History is rooted either in a continuing theology, or in a form of materialism that is assumed to be determinative of human history, or in a somewhat contradictory amalgam of the two. The present paper suggests that “largest-scale history” as exemplified in the old and new Big Histories is less a contribution to historical knowledge than it is a narrativization of one or another worldview. Distinguishing between largest-scale history and history that is “merely” large-scale, the paper also suggests that a better approach to meeting the desire for large scale in historical writing is through more modest endeavors, such as large-scale comparative history, network and exchange history, thematic history, and history of modernization.


Urban History
2020
pp. 1-17
Author(s):  
Simon Briercliffe

The recreation of urban historical space in museums is inevitably a complex, large-scale endeavour bridging the worlds of academic and public history. BCLM: Forging Ahead at the Black Country Living Museum is a £23m project recreating a typical post-World War II Black Country town. This article uses case studies of three buildings – a Civic Restaurant, a record shop and a pub – to argue that urban-historical research methodology and community engagement can both create a vivid sense of the past and challenge pervasive prejudices. It also argues that such a collaborative and public project reveals much about the urban and regional nature of industrial areas like the Black Country at this pivotal historical moment.


1983
Vol 13 (4)
pp. 539-547
Author(s):  
J. R. Blais

The history of spruce budworm (Choristoneura fumiferana (Clem.)) outbreaks over the past 200 to 300 years in nine regions of eastern Canada indicates that outbreaks have occurred more frequently in the 20th century than previously. Regionally, 21 outbreaks took place in the past 80 years compared with 9 in the preceding 100 years. Earlier infestations were restricted to specific regions, but in the 20th century they have coalesced and increased in size, the outbreaks of 1910, 1940, and 1970 having covered 10, 25, and 55 million ha, respectively. The increase in frequency, extent, and severity of outbreaks appears mostly attributable to man-made changes in the forest ecosystem. Clear-cutting of pulpwood stands, fire protection, and the use of pesticides against the budworm favor fir–spruce stands, rendering the forest more prone to budworm attack. The manner and degree to which each of these practices has altered forest composition is discussed. In the future, most of these practices are expected to continue, and their effects could intensify, especially in regions of recent application. Other practices, including large-scale planting of white spruce, could further increase the susceptibility of forest stands. Forest management aimed at reducing the occurrence of extensive fir–spruce stands has been advocated as a long-term solution to the budworm problem. Implementing this measure at a time when man's actions promote the proliferation of fir presents a most serious challenge to forest managers.


Plant Disease
2014
Vol 98 (11)
pp. 1534-1542
Author(s):  
Anmin Wan
Xianming Chen

Puccinia striiformis f. sp. tritici causes stripe rust (yellow rust) of wheat and is highly variable in virulence toward wheat cultivars with race-specific resistance. During 2010, wheat stripe rust was the most widespread in the recorded history of the United States, resulting in large-scale application of fungicides and substantial yield loss. A new differential set of 18 yellow rust (Yr) single-gene lines was established and used to differentiate races of P. striiformis f. sp. tritici, which were given PSTv race names to distinguish them from the PST races identified in the past. An octal system was used to describe the virulence and avirulence patterns of the PSTv races. From 348 viable P. striiformis f. sp. tritici isolates recovered from a total of 381 wheat and grass stripe rust samples collected in 24 states, 41 races, named PSTv-1 to PSTv-41, were identified using the new set of 18 Yr single-gene differentials, and their equivalent PST race names were determined on the previous set of 20 wheat cultivar differentials. The frequencies and distributions of the races and their virulences were determined. The five most predominant races were PSTv-37 (34.5%), PSTv-11 (17.5%), PSTv-14 (7.2%), PSTv-36 (5.2%), and PSTv-34 (4.9%). PSTv-37 was distributed throughout the country, while PSTv-11 and PSTv-14 were almost entirely restricted to states west of the Rocky Mountains. The races were virulent to between 0 and 13 of the 18 Yr genes. Frequencies of virulence toward resistance genes Yr6, Yr7, Yr8, Yr9, Yr17, Yr27, Yr43, Yr44, YrTr1, and YrExp2 were high (67.0 to 93.7%); those to Yr1 (32.8%) and YrTye (31.3%) were moderate; and those to Yr10, Yr24, Yr32, and YrSP were low (3.4 to 5.7%). All of the isolates were avirulent to Yr5 and Yr15.
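The octal naming scheme lends itself to a compact illustration: virulence (1) or avirulence (0) on each of the 18 Yr differentials yields an 18-bit pattern, and each group of three bits reads as one octal digit. The sketch below assumes a gene ordering; the ordering used for the actual PSTv names may differ.

```python
# Sketch of the octal naming idea: mark virulence (1) or avirulence (0)
# on the 18 Yr single-gene differentials, then read each group of three
# bits as one octal digit. The gene ordering below is an assumption.

YR_GENES = [
    "Yr1", "Yr5", "Yr6", "Yr7", "Yr8", "Yr9",
    "Yr10", "Yr15", "Yr17", "Yr24", "Yr27", "Yr32",
    "Yr43", "Yr44", "YrSP", "YrTr1", "YrExp2", "YrTye",
]

def octal_code(virulent_on: set[str]) -> str:
    """Encode the genes an isolate is virulent on as six octal digits."""
    bits = "".join("1" if g in virulent_on else "0" for g in YR_GENES)
    return "".join(str(int(bits[i:i + 3], 2)) for i in range(0, 18, 3))

# Hypothetical isolate virulent on Yr6, Yr7, Yr8, Yr9, Yr17, and Yr27:
print(octal_code({"Yr6", "Yr7", "Yr8", "Yr9", "Yr17", "Yr27"}))  # 171200
```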


Author(s):  
Bas van der Vossen
Jason Brennan

One popular argument for global redistribution focuses on the history of colonialism, which is rife with injustices perpetrated by the former governments of Western nations. Current citizens of these societies, the argument goes, can be taxed to pay reparations to people in their former colonies. The chapter inspects two different arguments for this view: one focusing on unjustly gotten gains of rich Western citizens, the other focusing on unjust harms befalling citizens of developing nations. The former argument fails because it misdescribes the facts: contrary to popular belief, most Western citizens were actually harmed by colonialism. The latter argument is better and actually supports a case for reparations. However, contrary to its proponents' beliefs, such reparations ought not to take the form of large-scale redistribution, but of removing the unjust barriers people face that continue the harms they now experience. The duty to repair past injustice mostly strengthens the conclusions of this book.


2013
Vol 70 (16)
pp. 1414-1427
Author(s):  
Charles E. Myers

Purpose: The evolution of sterile compounding in the context of hospital patient care, the evolution of related technology, past incidents of morbidity and mortality associated with preparations compounded in various settings, and efforts over the years to improve compounding practices are reviewed.

Summary: Tightened United States Pharmacopeial Convention standards (since 2004) for sterile compounding made it difficult for hospitals to achieve all of the sterile compounding necessary for patient care. Shortages of manufactured injections added to the need for compounding. Non-hospital-based compounding pharmacies increased sterile compounding to meet these needs. Gaps in federal and state laws and regulations about compounding pharmacies led to deficiencies in their oversight. Lapses in sterility led to injuries and deaths. Potential actions are offered, including changes in practitioner education, better surveillance of sterile compounding, regulatory reforms, reexamination of the causes of drug shortages, and the development of new technologies.

Conclusion: Over the years, there have been numerous exhortations for voluntary improvement in sterile compounding. In addition, professional leadership has been vigorous and extensive in the form of guidance, publications, education, enforceable standards, and the development of various associations and organizations dealing with safe compounding practices. Yet problems continue to occur. We must engage in diligent learning from the injuries and tragedies that have occurred. Assuming that we are already doing all we can to avoid problems would be an abdication of the professional mission of pharmacists. It would also be wrong to assume that the recent problems in large-scale compounding pharmacies are the only ones that warrant attention. It is time for a systematic assessment of the nature and dimensions of the problems in every type of setting where sterile compounding occurs. It is also time for some innovative thinking about ensuring safety in sterile compounding.


2021
Vol 4
Author(s):  
Axel Bohmann
Martin Bohmann
Lars Hinrichs

We explore the relationship between word dissemination and frequency change for a rapidly receding feature, the relativizer whom. The success of newly emerging words has been shown to correlate with high dissemination scores. However, the reverse—a correlation of lower dissemination scores with receding features—has not been investigated. Based on two established and two newly developed measures of word dissemination—across texts, linguistic environments, registers, and topics—we show that a general correlation between dissemination and frequency does not obtain in the case of whom. Different dissemination measures diverge from each other and show internally variable developments. These can, however, be explained with reference to the specific sociolinguistic history of whom over the past 300 years. Our findings suggest that the relationship between dissemination and word success is not static, but needs to be contextualized against different stages in individual words’ life-cycles. Our study demonstrates the applicability of large-scale, quantitative measures to qualitatively informed sociolinguistic research.
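For readers unfamiliar with dissemination measures, the sketch below computes a dissemination-across-texts score in the spirit of the established measures the authors build on: the ratio of texts actually containing a word to the number expected if its occurrences were scattered at random (a Poisson baseline). The corpus representation and toy data are illustrative assumptions.

```python
# Dissemination across texts: the ratio of texts that actually contain
# a word to the number expected if its tokens were scattered at random
# (Poisson baseline). Corpus representation and toy data are assumed.
import math

def dissemination(corpus: list[list[str]], word: str) -> float:
    """Observed text count for `word` over its Poisson expectation."""
    total_tokens = sum(len(text) for text in corpus)
    freq = sum(text.count(word) for text in corpus) / total_tokens
    observed = sum(1 for text in corpus if word in text)
    # Probability that a text of length L contains the word at least
    # once under the Poisson baseline is 1 - exp(-freq * L).
    expected = sum(1 - math.exp(-freq * len(text)) for text in corpus)
    return observed / expected if expected else 0.0

# Toy corpus: a score near 1 means the word is spread roughly as the
# random baseline predicts; well below 1 flags clustering in few texts.
corpus = [
    "to whom it may concern".split(),
    "the people who we met".split(),
    "whom did you ask about whom".split(),
]
print(round(dissemination(corpus, "whom"), 2))  # ~1.06 on this toy data
```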

