Entering a Period of bi-polar Internet Standard-Setting? Analyzing the Chinese Contention of US-Dominance in the Internet Engineering Task Force

Author(s):  
David Weyrauch1 ◽  
Thomas Winzen


2012 ◽
Vol 102 (1) ◽  
pp. 305-336 ◽  
Author(s):  
Timothy Simcoe

Voluntary Standard Setting Organizations (SSOs) use a consensus process to create new compatibility standards. Practitioners have suggested that SSOs are increasingly politicized and perhaps incapable of producing timely standards. This article develops a simple model of standard setting committees and tests its predictions using data from the Internet Engineering Task Force, an SSO that produces many of the standards used to run the Internet. The results show that an observed slowdown in standards production between 1993 and 2003 can be linked to distributional conflicts created by the rapid commercialization of the Internet. (JEL C78, L15, L86)


2019 ◽  
Vol 5 (2) ◽  
pp. 36-72
Author(s):  
Vasiliki Koniakou

The article focuses on the relationship between Internet Governance and democracy in the governance of the logical layer of the Internet. Given the impactful role and normative effects of standards, protocols, and technical decisions for the Internet and Internet users, and the centrality of the Internet in almost every aspect of social, financial, and political life, it argues that we ought to examine the ideologies, narratives, and assumptions that have informed and shaped key governance arrangements. It explores the influence of technological determinism as a technocratic governing mentality, applying Taylor Dotson's argument in the context of Internet Governance, and more specifically to the governance of the logical layer, focusing on standard-setting and technical decision-making by the Internet Engineering Task Force (IETF). It argues that technological determinism has been pervasive in Internet Governance discourse since the early days of the Internet, and that standard-setting and technical decision-making are technocratically organized, non-democratic procedures, considering also how the technical community takes decisions, as well as how it frames its tasks and perceives standard-setting and technical decision-making. It concludes by arguing that we need to review the way governance of the logical layer is organized, dispelling technological determinism while introducing social considerations and democratic principles.


2021 ◽  
Vol 51 (3) ◽  
pp. 29-32
Author(s):  
Michael Welzl ◽  
Stephan Oepen ◽  
Cezary Jaskula ◽  
Carsten Griwodz ◽  
Safiqul Islam

RFC 9000, published in May 2021, marks an important milestone for the Internet's standardization body, the Internet Engineering Task Force (IETF): the specification of the QUIC protocol is finally available. QUIC is the result of a five-year effort, and it is also the second of two major protocols (the first being SPDY, which later became HTTP/2) that Google LLC first deployed and then brought to the IETF for standardization. This raises the question: when big players follow such a "shoot first, discuss later" approach, is IETF collaboration still "real", or is the IETF now being (mis)used to approve protocols that are already practically established, without really involving anyone but the main proponents?


Author(s):  
Erkki Harjula ◽  
Jani Hautakorpi ◽  
Nicklas Beijar ◽  
Mika Ylianttila

Due to the increasing popularity of Peer-to-Peer (P2P) computing, the information technology industry and standardization organizations have begun directing their efforts toward standardizing P2P algorithms and protocols. The Internet Engineering Task Force (IETF) has recently formed the Peer-to-Peer SIP (P2PSIP) working group to enable serverless operation of the Session Initiation Protocol (SIP). This chapter introduces P2PSIP by presenting its background and purpose, operational principles, current status, and application areas. The focus is on the challenges and problem areas from the viewpoint of standardization and related research, with particular attention to mobile and heterogeneous environments. The authors provide an overview of existing and emerging solutions that may help tackle these challenges and thus pave the way for successful deployment of P2PSIP in mobile environments.
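The serverless operation described above rests on a distributed hash table (DHT): instead of a central SIP registrar, each peer owns a segment of an identifier ring, and an address-of-record (AoR) binding is stored on the peer whose segment covers the AoR's hash. The following is a minimal, purely illustrative sketch of that idea; the class and method names are hypothetical and greatly simplified compared to real P2PSIP overlays (which use much larger identifier spaces, replication, NAT traversal, and secured data).

```python
# Toy Chord-style overlay illustrating serverless SIP registration:
# each peer is placed on an identifier ring, and an AoR binding is
# stored on the first peer clockwise from the AoR's hash.
import hashlib
from bisect import bisect_left

RING_BITS = 16  # toy identifier space; real overlays use 128+ bits


def ring_id(key: str) -> int:
    """Hash a peer address or SIP AoR onto the identifier ring."""
    digest = hashlib.sha1(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % (1 << RING_BITS)


class Overlay:
    """A toy P2PSIP overlay: peers jointly store AoR -> contact bindings."""

    def __init__(self, peer_addresses):
        # Sorted (ring position, address) pairs define segment ownership.
        self.peers = sorted((ring_id(a), a) for a in peer_addresses)
        self.store = {addr: {} for addr in peer_addresses}

    def _successor(self, key_id: int) -> str:
        """Find the peer responsible for key_id (first peer clockwise)."""
        ids = [pid for pid, _ in self.peers]
        i = bisect_left(ids, key_id) % len(self.peers)  # wrap around the ring
        return self.peers[i][1]

    def register(self, aor: str, contact: str) -> str:
        """SIP REGISTER without a server: store binding on responsible peer."""
        peer = self._successor(ring_id(aor))
        self.store[peer][aor] = contact
        return peer

    def lookup(self, aor: str):
        """Resolve an AoR to a contact by asking the responsible peer."""
        return self.store[self._successor(ring_id(aor))].get(aor)
```

A peer joining or leaving shifts responsibility only for the adjacent ring segment, which is what makes registrar-free operation feasible; the mobile-environment challenges the chapter discusses (churn, NATs, constrained devices) are exactly the parts this sketch omits.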


Global Jurist ◽  
2018 ◽  
Vol 19 (2) ◽  
Author(s):  
Sara De Vido

Abstract
The purpose of this contribution is to analyze two major standard-setting bodies, namely the Financial Action Task Force on money laundering and the Financial Stability Board, from an international law perspective. It will be demonstrated that they are "soft organizations" which, despite their loose structure, can exercise "hard powers" in inducing States to comply with their standards.


Author(s):  
Dan Schiller

This chapter examines the Commerce Department's free-flow policy as part of its power over internet policy. It first provides an overview of U.S.–centric internet and Commerce's Internet Policy Task Force, established to launch an inquiry into “the global free flow of information on the Internet.” The inquiry's purpose was “to identify and examine the impact that restrictions on the flow of information over the Internet have on American businesses and global commerce.” The chapter also considers Commerce's commodification strategies based in part on data centers and the place of cloud computing services in the department's free-flow inquiry. It shows that the Commerce Department's free-flow policy was a major component of the federal government's overall efforts to keep corporate data flows streaming without restriction as new profit sites emerged around an extraterritorial internet managed by the United States.


1997 ◽  
Vol 6 (1) ◽  
pp. 45-54
Author(s):  
Gary C. Kessler

2020 ◽  
Author(s):  
Raphael Rosa ◽  
Christian Rothenberg

Network Functions Virtualization (NFV) aims at high-end carrier-grade performance but lacks common methodologies for testing Virtual Network Functions (VNFs). Benchmarking VNFs should account for their many degrees of freedom rather than relying on the common black-box approaches created for bare-metal network functions. We argue that this status quo needs to change, grounded in extensive and automated experimentation. Since 2015, we have played a role in this scenario, from a position paper to the creation of the draft "Methodology for VNF Benchmarking Automation" in the Internet Engineering Task Force (IETF) Benchmarking Methodology Working Group (BMWG). This paper tells the tale of this draft in the BMWG, together with the perks of developing an open-source reference implementation and academic papers, in the spirit of the old IETF mantra of "running code". The story showcases our experiences in the IETF and the BMWG, covering technical content (e.g., YANG models) as much as draft reviews on mailing lists.
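The automation the abstract advocates boils down to a repeatable procedure: for each load profile, deploy and configure the VNF, run several measurement trials, and collect structured results so runs are comparable rather than one-off black-box tests. The sketch below illustrates that loop only; all class and function names are hypothetical and are not taken from the draft or its reference implementation.

```python
# Hypothetical sketch of an automated VNF benchmarking loop:
# iterate over load profiles, run repeated trials, and aggregate
# results into a structured, reproducible report.
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class Trial:
    """One measurement sample from a single benchmark run."""
    throughput_mbps: float
    latency_ms: float


@dataclass
class BenchmarkReport:
    """Structured results for one VNF across all trials."""
    vnf: str
    trials: list = field(default_factory=list)

    def summary(self):
        return {
            "vnf": self.vnf,
            "avg_throughput_mbps": mean(t.throughput_mbps for t in self.trials),
            "avg_latency_ms": mean(t.latency_ms for t in self.trials),
            "trials": len(self.trials),
        }


def run_benchmark(vnf, load_profiles, trials_per_profile, measure):
    """Drive one benchmarking procedure: for each load profile, invoke the
    measurement function repeatedly and collect every trial in the report."""
    report = BenchmarkReport(vnf=vnf)
    for profile in load_profiles:
        for _ in range(trials_per_profile):
            report.trials.append(measure(vnf, profile))
    return report
```

Repeating trials per profile and emitting a machine-readable summary is the essence of the methodology; the draft additionally models the benchmarking setup and results in YANG so that different tools can exchange them.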

