Exploring inter-organizational paradoxes: Methodological lessons from a study of a grand challenge

2018 ◽  
Vol 17 (1) ◽  
pp. 120-132 ◽  
Author(s):  
Paula Jarzabkowski ◽  
Rebecca Bednarek ◽  
Konstantinos Chalkias ◽  
Eugenia Cacciatori

In this article, we outline a methodological framework for studying the inter-organizational aspects of paradoxes and specify this in relation to grand challenges. Grand challenges are large-scale, complex, enduring problems that affect large populations, have a strong social component and appear intractable. Our methodological insights draw from our study of the insurance protection gap, a grand challenge that arises when economic losses from large-scale disasters significantly exceed the insured losses, leading to economic and social hardship for the affected communities. We provide insights into collecting data to uncover the paradoxical elements inherent in grand challenges and then propose three analytical techniques for studying inter-organizational paradoxes: zooming in and out, tracking problematization, and tracking boundaries and boundary organizations. These techniques can be used to identify and follow how contradictions and interdependencies emerge and dynamically persist within inter-organizational interactions and how these shape and are shaped by the unfolding dynamics of the grand challenge. Our techniques and associated research design help advance paradox theorizing by moving it to the inter-organizational and systemic level. This article also illustrates paradox as a powerful lens through which to further our understanding of grand challenges.

2011 ◽  
Vol 26 (2) ◽  
pp. 99-108 ◽  
Author(s):  
Susan J Winter ◽  
Brian S Butler

The impact of a discipline's research is constrained by its ability to articulate compelling problems. Well-crafted problems are the foundation for mobilizing the effort, resources, and attention essential to scientific progress and broader impact. We argue that Information Systems (IS) scholars, individually and collectively, must develop the practice of articulating and engaging large-scale, broad scope problems – or grand challenges. To support this position, we examine the role and value of grand challenge efforts in science and engineering based on a theory of grand challenges as socially constructed boundary objects. Conceptualizing grand challenges in these terms implies strategies and approaches for magnifying the impact of IS research by engaging these types of problems.


Author(s):  
Simon Thomas

Trends in the technology development of very large scale integrated circuits (VLSI) have been in the direction of higher density of components with smaller dimensions. The scaling down of device dimensions has been not only lateral but also in depth. Such efforts in miniaturization bring with them new developments in materials and processing. Successful implementation of these efforts is, to a large extent, dependent on a proper understanding of the material properties, process technologies and reliability issues, through adequate analytical studies. The analytical instrumentation technology has, fortunately, kept pace with the basic requirements of devices with lateral dimensions in the micron/submicron range and depths of the order of nanometers. Often, newer analytical techniques have emerged or the more conventional techniques have been adapted to meet the more stringent requirements. As such, a variety of analytical techniques are available today to aid an analyst in the efforts of VLSI process evaluation. Generally such analytical efforts are divided into the characterization of materials, evaluation of processing steps and the analysis of failures.


1969 ◽  
Vol 08 (01) ◽  
pp. 07-11 ◽  
Author(s):  
H. B. Newcombe

Methods are described for deriving personal and family histories of birth, marriage, procreation, ill health and death, for large populations, from existing civil registrations of vital events and the routine records of ill health. Computers have been used to group together and "link" the separately derived records pertaining to successive events in the lives of the same individuals and families, rapidly and on a large scale. Most of the records employed are already available as machine-readable punchcards and magnetic tapes, for statistical and administrative purposes, and only minor modifications have been made to the manner in which these are produced. As applied to the population of the Canadian province of British Columbia (currently about 2 million people) these methods have already yielded substantial information on the risks of disease: a) in the population, b) in relation to various parental characteristics, and c) as correlated with previous occurrences in the family histories.
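The grouping-and-linking idea described above can be sketched in a few lines. This is a toy deterministic version only: the field names, the linkage key (surname plus birth year), and the sample records are all hypothetical, whereas real vital-records linkage uses far richer keys and probabilistic matching.

```python
from collections import defaultdict

def link_records(records):
    """Group vital-event records into per-person histories using a
    simple deterministic key (surname + birth year). A toy stand-in
    for large-scale record linkage; field names are hypothetical."""
    histories = defaultdict(list)
    for rec in records:
        key = (rec["surname"].upper(), rec["birth_year"])
        histories[key].append(rec["event"])
    return dict(histories)

# Illustrative records: two events for one person, one for another.
records = [
    {"surname": "Smith", "birth_year": 1921, "event": "birth"},
    {"surname": "Smith", "birth_year": 1921, "event": "marriage"},
    {"surname": "Jones", "birth_year": 1930, "event": "birth"},
]
```

In practice the same grouping step is what lets successive events in one life be analyzed together, e.g. correlating disease records with earlier family history.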


Author(s):  
Sheree A Pagsuyoin ◽  
Joost R Santos

Water is a critical natural resource that sustains the productivity of many economic sectors, whether directly or indirectly. Climate change, alongside rapid growth and development, threatens water sustainability and regional productivity. In this paper, we develop an extension to the economic input-output model to assess the impact of water supply disruptions on regional economies. The model utilizes the inoperability variable, which measures the extent to which an infrastructure system or economic sector is unable to deliver its intended output. While the inoperability concept has been utilized in previous applications, this paper offers extensions that capture the time-varying nature of inoperability as the sectors recover from a disruptive event, such as drought. The model extension is capable of inserting inoperability adjustments within the drought timeline to capture time-varying likelihoods and severities, as well as the dependencies of various economic sectors on water. The model was applied to case studies of severe drought in two regions: (1) the state of Massachusetts (MA) and (2) the US National Capital Region (NCR). These regions were selected to contrast drought resilience between a mixed urban–rural region (MA) and a highly urban region (NCR). These regions also have comparable overall gross domestic products despite significant differences in the distribution and share of the economic sectors comprising each region. The results of the case studies indicate that in both regions, the utility and real estate sectors suffer the largest economic loss; nonetheless, the results also identify region-specific sectors that incur significant losses. For the NCR, three sectors in the top 10 ranking of highest economic losses are government-related, whereas in MA, four sectors in the top 10 are manufacturing sectors.
Furthermore, the accommodation sector also ranks among the highest losses in the NCR case, intuitively because of the region's high concentration of museums and famous landmarks. In contrast, the Wholesale Trade sector was among the sectors with the highest economic losses in the MA case study because of the state's large geographic area, which is conducive to warehouses used as nodes in large-scale supply chain networks. Future modeling extensions could potentially include analysis of water demand and supply management strategies that can enhance regional resilience against droughts. Other regional case studies can also be pursued in future efforts to analyze various categories of drought severity beyond the case studies featured in this paper.
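A minimal sketch of the dynamic inoperability idea follows, using one common discrete-time formulation from the inoperability input-output literature, q[t+1] = q[t] + K(A*q[t] + c*[t] − q[t]). The two-sector matrices, resilience coefficients, and drought perturbation below are illustrative assumptions, not values from the paper's case studies.

```python
import numpy as np

def diim_trajectory(A_star, K, c_star, q0, steps):
    """Discrete-time dynamic inoperability input-output model:
    q[t+1] = q[t] + K @ (A_star @ q[t] + c_star - q[t]),
    where q is the vector of sector inoperabilities in [0, 1],
    A_star the interdependency matrix, K the (diagonal) resilience
    coefficients, and c_star the demand-side perturbation."""
    q = np.asarray(q0, dtype=float)
    traj = [q.copy()]
    for _ in range(steps):
        q = q + K @ (A_star @ q + c_star - q)
        q = np.clip(q, 0.0, 1.0)  # inoperability is bounded
        traj.append(q.copy())
    return traj

# Two-sector toy example: a water utility and one dependent sector.
A_star = np.array([[0.0, 0.1],
                   [0.4, 0.0]])   # illustrative interdependencies
K = np.diag([0.5, 0.3])           # illustrative recovery/resilience rates
c_star = np.array([0.2, 0.0])     # drought perturbs the water sector only
traj = diim_trajectory(A_star, K, c_star, q0=[0.0, 0.0], steps=50)
```

The trajectory converges toward the static equilibrium q* = (I − A*)⁻¹ c*, and the dependent sector's inoperability rises purely through its dependence on water, which is the mechanism the case studies quantify.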


2021 ◽  
Vol 99 (Supplement_3) ◽  
pp. 239-239
Author(s):  
Ashley S Ling ◽  
Taylor Krause ◽  
Amanda Warner ◽  
Jason Duggin ◽  
Bradley Heins ◽  
...  

Abstract Horn flies (Haematobia irritans) are a major nuisance to cattle, especially in warm, humid regions, and are estimated to cause economic losses in excess of $1 billion annually to the U.S. beef cattle industry. Variation in horn fly tolerance has been reported within and across breeds, and heritability estimates ranging between 10 and 80% show a clear genetic basis. However, collecting fly abundance phenotypes is costly and logistically demanding, which precludes large-scale implementation. Consequently, finding correlated phenotypes and endo-phenotypes that are heritable and relatively easy to measure would facilitate implementation of horn fly tolerance genetic improvement programs. Thrombin (TH), a blood coagulation precursor, has a reported association with horn fly count variation within and across cattle breeds. In this study, the genetic basis of thrombin in beef cattle was investigated. Blood samples and horn fly counts were collected on 360 cows and heifers twice during the summer of 2019 (June and August). Due to uncertainty associated with assessment of horn fly abundance and thrombin, and the fact that economic losses occur only when fly abundance exceeds a certain threshold, thrombin was categorized into 4 classes (1 = TH > 500 ng/ml; 2 = 250 < TH < 500 ng/ml; 3 = 100 < TH < 250 ng/ml; and 4 = TH < 100 ng/ml). The trait was analyzed using linear (continuous) and threshold (discrete) mixed models. Both models included farm, pregnancy status, and cow age as fixed effects and additive and permanent environment random effects. The pedigree included 642 animals. Estimates of heritability were 0.24 and 0.29 using linear and threshold models, respectively. Estimates of repeatability were slightly higher using the threshold model (0.21 vs 0.19). Despite the small data size, all estimates were non-zero based on their respective highest posterior density intervals.
These results indicate reasonable genetic variation for thrombin that could be harnessed for improvement of horn fly tolerance in cattle.
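The four-class categorization above can be written as a simple threshold function. The abstract's class boundaries use strict inequalities on both sides, so exact boundary values (e.g. TH = 500) are ambiguous; the convention below, assigning boundaries to the lower-numbered class, is an assumption for illustration.

```python
def thrombin_class(th_ng_per_ml):
    """Map a thrombin concentration (ng/ml) to the 4-class scale
    described above: class 1 is highest (TH > 500) and class 4
    lowest (TH < 100). Boundary-value handling is an assumed
    convention, since the abstract uses strict inequalities."""
    if th_ng_per_ml >= 500:
        return 1
    if th_ng_per_ml >= 250:
        return 2
    if th_ng_per_ml >= 100:
        return 3
    return 4
```

Discretizing like this is what allows the threshold (discrete) mixed model to be fit alongside the linear model on the continuous trait.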


2012 ◽  
Vol 4 (4) ◽  
pp. 475-504 ◽  
Author(s):  
Lindsey N. Kingston ◽  
Saheli Datta

Norms of global responsibility have changed significantly since the 1948 Universal Declaration of Human Rights (UDHR), and today’s international community critically considers responsibilities within and beyond state borders, as evidenced by the adoption of the Responsibility to Protect (R2P) doctrine. From this starting point, protection must be extended to large populations susceptible to structural violence – social harms resulting from the pervasive and persistent impact of economic, political and cultural violence in societies. In order to show the potential of expanded conceptions of global responsibility, this article proceeds as follows: First, a discussion of the evolving concepts of responsibility outlines a shift in thinking about sovereignty that creates a multilayered system of responsibility. This section defines key concepts and highlights an ‘unbundled R2P’ framework for approaching structural violence. Second, an overview of two vulnerable populations – internally displaced persons (IDPs) and the stateless – illustrates that large-scale cases of state abuse and neglect are not limited to acts of physical violence, and that pervasive structural violence requires further attention from the international community. Lastly, recommendations are provided for expanding the scope of global responsibility in order to assist the internally displaced and the stateless. These recommendations address who is responsible, when global responsibility is warranted, and how such responsibility should be implemented.


2018 ◽  
Vol 5 (3) ◽  
pp. 172265 ◽  
Author(s):  
Alexis R. Hernández ◽  
Carlos Gracia-Lázaro ◽  
Edgardo Brigatti ◽  
Yamir Moreno

We introduce a general framework for exploring the problem of selecting a committee of representatives with the aim of studying a networked voting rule based on a decentralized large-scale platform, which can assure a strong accountability of the elected. The results of our simulations suggest that this algorithm-based approach is able to obtain a high representativeness for relatively small committees, performing even better than a classical voting rule based on a closed list of candidates. We show that a general relation between committee size and representatives exists in the form of an inverse square root law and that the normalized committee size approximately scales with the inverse of the community size, allowing the scalability to very large populations. These findings are not strongly influenced by the different networks used to describe the individuals’ interactions, except for the presence of few individuals with very high connectivity which can have a marginal negative effect in the committee selection process.
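The inverse-square-root relation reported above can be illustrated with a one-line function. The functional form error ∝ 1/√k and the constant are assumptions for illustration only; the paper establishes the scaling through simulation, not this closed form.

```python
import math

def representation_error(committee_size, c=1.0):
    """Illustrative inverse-square-root law: misrepresentation of
    the population falls off as c / sqrt(k) with committee size k.
    Both the constant c and the exact form are assumptions."""
    return c / math.sqrt(committee_size)
```

Under this form, quadrupling the committee size halves the representation error, which conveys why relatively small committees can already achieve high representativeness.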


2020 ◽  
Author(s):  
zhenhua Guo ◽  
Kunpeng Li ◽  
Songlin Qiao ◽  
Xinxin Chen ◽  
Ruiguang Deng ◽  
...  

Abstract Background: African swine fever (ASF) is one of the most important diseases of pigs and causes serious economic losses in countries with large-scale swine production. Vaccines are recognized as the most useful tool to prevent and control ASF virus (ASFV) infection. Currently, the MGF505 and MGF360 gene-deleted ASFVs, alone or combined with CD2v deletion, were confirmed to be the most promising vaccine candidates. Thus, it is essential to develop a diagnostic method to discriminate wild-type strains from the vaccines used. Results: In this study, we established a duplex TaqMan real-time PCR based on the B646L gene and MGF505-2R gene. Sequence alignment showed that the regions targeted by the primers and probes are highly conserved in genotype II ASFVs. The duplex real-time assay can specifically detect the B646L and MGF505-2R genes singly or simultaneously, without cross-reaction with the other porcine viruses tested. The limit of detection was 5.8 copies and 3.0 copies for the standard plasmids containing the B646L and MGF505-2R genes, respectively. Clinical samples were tested in parallel by the duplex real-time PCR and a commercial ASFV detection kit, and the detection results of the two assays for the B646L gene were highly consistent. Conclusion: We successfully developed and evaluated a duplex TaqMan real-time PCR method that can effectively distinguish wild-type from MGF505 gene-deleted ASFVs. It should be a useful tool for the clinical diagnosis and control of ASF.
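The discrimination logic of such a duplex assay can be sketched as follows: B646L amplification indicates any ASFV, while MGF505-2R amplification indicates that the MGF505 region is intact (wild-type). The Ct positivity cutoff of 38 cycles and the function interface are hypothetical values for illustration, not parameters from the study.

```python
def classify_asfv(ct_b646l, ct_mgf505_2r, cutoff=38.0):
    """Interpret duplex qPCR Ct values. B646L detects any ASFV;
    MGF505-2R is present in wild-type virus but deleted in the
    candidate vaccine strains. None means no amplification.
    The Ct cutoff is a hypothetical assumption."""
    b646l_pos = ct_b646l is not None and ct_b646l <= cutoff
    mgf_pos = ct_mgf505_2r is not None and ct_mgf505_2r <= cutoff
    if not b646l_pos:
        return "ASFV negative"
    if mgf_pos:
        return "wild-type ASFV"
    return "MGF505 gene-deleted (vaccine-like) ASFV"
```

For example, a sample positive for B646L but negative for MGF505-2R would be flagged as vaccine-like rather than a wild-type field infection.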


2021 ◽  
Vol 1 (1) ◽  
pp. 76-87
Author(s):  
Alexander Buhmann ◽  
Christian Fieseler

Organizations increasingly delegate agency to artificial intelligence. However, such systems can yield unintended negative effects as they may produce biases against users or reinforce social injustices. What pronounces them as a unique grand challenge, however, are not their potentially problematic outcomes but their fluid design. Machine learning algorithms are continuously evolving; as a result, their functioning frequently remains opaque to humans. In this article, we apply recent work on tackling grand challenges through robust action to assess the potential and obstacles of managing the challenge of algorithmic opacity. We stress that although this approach is fruitful, it can be gainfully complemented by a discussion regarding the accountability and legitimacy of solutions. In our discussion, we extend the robust action approach by linking it to a set of principles that can serve to evaluate organisational approaches to tackling grand challenges with respect to their ability to foster accountable outcomes under the intricate conditions of algorithmic opacity.


Author(s):  
David Greenwood ◽  
Ian Sommerville

Society is demanding larger and more complex information systems to support increasingly complex and critical organisational work. Whilst troubleshooting socio-technical issues in small-to-medium scale situations may be achievable using approaches such as ethnography, troubleshooting enterprise-scale situations remains an open research question because of the overwhelming number of socio-technical elements and interactions involved. This paper demonstrates proof-of-concept tools for network analysis and visualisation that may provide a promising avenue for identifying problematic elements and interactions among an overwhelming number of socio-technical elements. The findings indicate that computers may aid the analysis of problematic large-scale complex socio-technical situations by using analytical techniques to highlight elements, or groups of interacting elements, that are important to the overall outcome of a problematic situation.
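The "highlighting" idea above can be sketched with the simplest network-analytic measure, degree centrality: model socio-technical elements as nodes, interactions as edges, and rank elements by how many interactions they participate in. The element names below are illustrative, and real socio-technical analysis would use richer measures (betweenness, community detection) than raw degree.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Rank socio-technical elements by degree (number of
    interactions they participate in) -- the simplest heuristic
    for surfacing elements likely to shape the overall outcome.
    Element names are illustrative."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return sorted(degree.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical interaction network for an enterprise situation.
edges = [
    ("billing system", "help desk"),
    ("billing system", "finance team"),
    ("billing system", "legacy database"),
    ("help desk", "end users"),
]
```

Even this crude ranking immediately surfaces the hub element, which is the kind of triage that becomes valuable when the element count overwhelms manual ethnographic analysis.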

