law of requisite variety
Recently Published Documents

Total documents: 37 (last five years: 8)
H-index: 5 (last five years: 2)

2021 ◽  
pp. 026839622098853
Author(s):  
Jacob L Cybulski ◽  
Rens Scheepers

The field of data science has emerged in recent years, building on advances in computational statistics, machine learning, artificial intelligence, and big data. Modern organizations are immersed in data and are turning toward data science to address a variety of business problems. While numerous complex problems in science have become solvable through data science, not all scientific solutions are equally applicable to business. Many data-intensive business problems are situated in complex socio-political and behavioral contexts that still elude commonly used scientific methods. To what extent can such problems be addressed through data science? Does data science have any inherent blind spots in this regard? What types of business problems are likely to be addressed by data science in the near future, which will not, and why? We develop a conceptual framework to inform the application of data science in business. The framework draws on an extensive review of the data science literature across four domains: data, method, interfaces, and cognition. We draw on Ashby’s Law of Requisite Variety as a theoretical principle. We conclude that data-scientific advances across the four domains, in aggregate, could constitute requisite variety for particular types of business problems. This explains why such problems can be fully or only partially addressed, solved, or automated through data science. We distinguish between situations that can be improved due to cross-domain compensatory effects, and problems where data science, at best, merely contributes to a better understanding of complex phenomena.
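For readers unfamiliar with the theorem these abstracts invoke: Ashby’s law states that a regulator can hold a system’s outcomes within a tolerated set only if the regulator’s variety (its repertoire of distinct responses) at least matches the variety of the disturbances, less whatever outcome variety is tolerable. The following sketch is a generic illustration of the law, measuring variety in bits; it is not code or notation from any of the papers listed here.

```python
import math

def variety(n_states: int) -> float:
    """Variety of a system, measured in bits (log2 of its distinguishable states)."""
    return math.log2(n_states)

def requisite_variety_met(disturbance_states: int,
                          regulator_states: int,
                          tolerated_outcome_states: int = 1) -> bool:
    """Ashby's law: outcome variety >= disturbance variety - regulator variety.
    The regulator keeps outcomes within the tolerated set only if its variety
    covers the shortfall left by the disturbances."""
    residual = variety(disturbance_states) - variety(regulator_states)
    return residual <= variety(tolerated_outcome_states) + 1e-9

# A regulator with 8 responses can absorb 8 disturbance states exactly...
print(requisite_variety_met(8, 8))    # True
# ...but not 16: one bit of residual variety leaks into the outcome.
print(requisite_variety_met(16, 8))   # False
```

In this reading, the cross-domain "compensatory effects" the first abstract describes amount to several partial sources of variety adding up to the requisite total.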


2021 ◽  
pp. 009539972098543
Author(s):  
Ahmed S. Alojairi

This study examines conflict as a factor that can co-determine the performance effectiveness of nonprofit organizations. Based on Ashby’s law of requisite variety, interorganizational conflict is defined in terms of a lack of fit between input variety and variety-handling capabilities. The calculated organizational interaction effectiveness (IE) ratio of 2.04 is used to determine the quality of interactions. “Flexibility” is the dominant category among helpful incidents (49.03%). Among non-helpful incidents (45.67%), however, “Unreliability” is the dominant category. This major source of conflict commonly produces an imbalance between flexibility and reliability, as manifested by a mismatch between input variety and variety-handling capabilities.


Systems ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 53
Author(s):  
Mick Ashby

This paper combines the good regulator theorem with the law of requisite variety and seven other requisites that are necessary and sufficient for a cybernetic regulator to be effective and ethical. The ethical regulator theorem provides a basis for systematically evaluating and improving the adequacy of existing or proposed designs for systems that make decisions with ethical consequences, regardless of whether the regulators are humans, machines, cyberanthropic hybrids, organizations, or government institutions. The theorem is used to define an ethical design process that has potentially far-reaching implications for society. A six-level framework is proposed for classifying cybernetic and superintelligent systems, which highlights the existence of a possibility-space bifurcation in our future timeline. The implementation of “super-ethical” systems is identified as an urgent imperative for humanity, to avoid the danger that superintelligent machines might lead to a technological dystopia. It is proposed to define third-order cybernetics as the cybernetics of ethical systems. Concrete actions, a grand challenge, and a vision of a super-ethical society are proposed to help steer the future of the human race and our wonderful planet towards a realistically achievable minimum viable cyberanthropic utopia.


2020 ◽  
pp. 017084062094455 ◽  
Author(s):  
Konstantinos Poulis ◽  
Efthimios Poulis ◽  
Paul Jackson

Alignment of organizations with external imperatives is seen as a sine qua non of proper organizing and strategizing by many fit and complexity scholars. Any deviation from this management mantra engenders organizational decline and, ultimately, mortality. We put this axiomatic principle under empirical scrutiny and use the law of requisite variety as our organizing principle to do so. The law is an iconic cornerstone of this matching contingency logic and it has served to legitimize a wide range of fit decisions in, e.g., leadership, organizational learning and corporate governance. Inspired by organizational vignettes inhabiting antithetical complexity regimes, we introduce a novel concept, which we label ‘agentic misfit’. In this way, we deconstruct deterministic assumptions related to environmental fittingness, we challenge teleological orientations in the fit literature, and we flesh out the viability of non-matching human agency amid complexity.


2020 ◽  
Vol 10 (13) ◽  
pp. 4442 ◽  
Author(s):  
Susana Suarez-Fernandez de Miranda ◽  
Francisco Aguayo-González ◽  
Jorge Salguero-Gómez ◽  
María Jesús Ávila-Gutiérrez

Engineering 4.0 environments are characterised by the digitisation, virtualisation, and connectivity of products, processes, and facilities composed of reconfigurable and adaptive socio-technical cyber-physical manufacturing systems (SCMS), in which Operator 4.0 works in real time in VUCA (volatile, uncertain, complex, and ambiguous) contexts and markets. This situation gives rise to interest in developing a framework for the conception of SCMS that allows the integration of the human factor and the management, training, and development of the competencies of Operator 4.0 as fundamental aspects of the aforementioned system. The present paper focuses on answering how to conceive the adaptive manufacturing systems of Industry 4.0 through the operation, growth, and development of human talent in VUCA contexts. With this objective, exploratory research is carried out, the contribution of which is a framework called Design for the Human Factor in Industry 4.0 (DfHFinI4.0). Among the conceptual frameworks employed therein, the connectivist paradigm, Ashby’s law of requisite variety, and Vygotsky’s activity theory are taken into consideration, in order to enable the affective-cognitive and timeless integration of the human factor within the SCMS. DfHFinI4.0 can be integrated into the life cycle engineering of enterprise reference architectures, thereby obtaining manufacturing systems for Industry 4.0 focused on the human factor. The suggested framework is illustrated as a case study for the Purdue Enterprise Reference Architecture (PERA) methodology, transforming it into PERA 4.0.




Kybernetes ◽  
2019 ◽  
Vol 48 (4) ◽  
pp. 793-804 ◽  
Author(s):  
Thomas Fischer

Purpose – Ranulph Glanville has argued that ambitions of strict control are misplaced in epistemic processes such as learning and designing. Among other reasons, he has presented quantitative arguments for this ethical position. As part of these arguments, Glanville claimed that strict control of even modest systems transcends the computational limits of our planet. The purpose of this paper is to review the related discourse and to examine the soundness of this claim.

Design/methodology/approach – Related literature is reviewed, and pertinent lines of reasoning are illustrated and critically examined using examples and straightforward language.

Findings – The claim that even modest epistemic processes transcend the computational means of our planet is challenged. The recommendation to assume out-of-control postures in epistemic processes, however, is maintained on ethical rather than quantitative grounds.

Research limitations/implications – The presented reasoning is limited insofar as it is ultimately based on an ethical standpoint.

Originality/value – This paper summarizes an important cybernetic discourse and dispels the notion therein that epistemic processes necessarily involve computational demands of astronomical proportions. Furthermore, this paper presents a rare discussion of Glanville’s Corollary of Ashby’s Law of Requisite Variety.
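The quantitative side of the argument under review is easy to reproduce as a back-of-envelope calculation: a system of n independent binary elements has 2**n distinguishable states, and exhaustive ("strict") control requires distinguishing all of them. The sketch below is a generic illustration of this combinatorial explosion; the element counts and the atom estimate are common order-of-magnitude figures chosen for illustration, not numbers taken from Glanville’s or Fischer’s papers.

```python
# 10**80 is a common order-of-magnitude estimate of the number of atoms
# in the observable universe (an illustrative assumption, not a paper figure).
ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80

def states(n_binary_elements: int) -> int:
    """Number of distinguishable states of n independent binary elements."""
    return 2 ** n_binary_elements

for n in (100, 266, 300):
    exceeds = states(n) > ATOMS_IN_OBSERVABLE_UNIVERSE
    print(f"n={n}: 2**n {'exceeds' if exceeds else 'is below'} the atom estimate")
```

Already at a few hundred binary components, exhaustive state enumeration outstrips any physically conceivable computer, which gives the flavor of the quantitative claim the paper examines and challenges.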


2018 ◽  
Vol 32 (2) ◽  
pp. 201-209
Author(s):  
Sridhar Ramamoorti

SYNOPSIS In organizing my discussant response, I use the “input-process-output-outcomes” framework, where the PCAOB is part of the standard-setting infrastructure (input), the processes are its standard-setting activities, the output is the standards eventually developed and approved by the SEC, and assessing the outcome(s) refers to evaluating how well the PCAOB is achieving its mission and vision. From this perspective, Nolder and Palmrose’s (2018) critique of the PCAOB’s standard-setting pace as “glacial” can only be regarded as an “operational criticism.” In fact, viewed in light of the recently released Monitoring Group’s (2017) Consultation paper, the PCAOB actually appears to be ahead of the IAASB as a much more independent and active standard-setter. Expanding the PCAOB’s “economic analysis” to incorporate behavioral disciplines could certainly make its rationale more grounded and stronger, but would likely further slow the pace of standard-setting, a result that goes against the authors’ preference. Beyond behavioral approaches, I consider the even broader perspective of “political economy” in this context and highlight the relevance of Ashby’s Law of Requisite Variety (Ashby 1956) as an overarching framework. Toward the end, I support certain arguments made by Nolder and Palmrose (2018) and, in particular, commend their suggestion that the PCAOB develop a conceptual framework for standard-setting going forward.

