Global Digital Trade and Implications for Trade Negotiation: Deciphering the Data Flows and Implications on Revenue Losses

2020 ◽  
Author(s):  
Murali Kallummal


1991 ◽  
Vol 30 (4I) ◽  
pp. 579-599
Author(s):  
Robert E. Baldwin

Until negotiations collapsed in early December, the Uruguay Round gave promise of being the most significant multilateral trade negotiation since 1947, when the General Agreement on Tariffs and Trade (GATT) was implemented and the tariff levels of the industrial countries were sharply cut. There are at least three reasons for this conclusion. First, by agreeing at the outset to bring both agriculture and textiles under GATT discipline, the participants created the opportunity for both rich and poor agricultural exporting nations and relatively low-wage, newly industrializing LDCs to benefit significantly from GATT-sponsored trade negotiations. Prior to the Uruguay Round, the benefits to these countries of such negotiations had been limited, since these two sectors were excluded from any significant liberalization. Second, by agreeing to formulate new rules relating to trade in services, trade-related aspects of intellectual property rights, and trade-related investment issues, members took an important step in modernizing the GATT. As economic globalization has accelerated, there is a growing realization that arm's-length merchandise transactions, the traditional concern of the GATT, are only one aspect of the real-side economic relations of current concern to national policy-makers and the economic interests they represent. Now international commercial activities also involve merchandise trade among multinational firms and their foreign affiliates, international trade in services among independent agents as well as among affiliated enterprises, foreign direct investment activities, production of goods and services in foreign affiliates for sale either abroad or at home, international flows of technology, and temporary movements of labour across borders. Although the so-called new issues in the Uruguay Round do not cover all of these matters, they go a considerable way in making the GATT more relevant for dealing with the problems of increasing internationalization.


Author(s):  
Kriss Ravetto-Biagioli

We are confronted with a new type of uncanny experience, an uncanny evoked by parallel processing, aggregate data, and cloud computing. The digital uncanny does not erase the uncanny feeling we experience as déjà vu or when confronted with robots that are too lifelike. Today’s uncanny refers to how nonhuman devices (surveillance technologies, algorithms, feedback, and data flows) anticipate human gestures, emotions, actions, and interactions, intimating that we are machines and that our behavior is predictable because we are machinic. It adds another dimension to those feelings we get when we question whether our responses are subjective or automated—automated as in reducing one’s subjectivity to patterns of data and using those patterns to present objects or ideas that would then elicit one’s genuinely subjective—yet effectively preset—response. This anticipation of our responses is a feedback loop we have produced by designing software that studies our traces, inputs, and moves. Digital Uncanny explores how digital technologies, particularly software systems working through massive amounts of data, are transforming the meaning of the uncanny, shifting it from what Freud tied to a return of repressed memories, desires, and experiences toward their anticipation. Through a close reading of interactive and experimental artworks by Rafael Lozano-Hemmer, Bill Viola, Simon Biggs, Sue Hawksley, and Garth Paine, this book is designed to explore how the digital uncanny unsettles and estranges concepts of “self,” “affect,” “feedback,” and “aesthetic experience,” forcing us to reflect on our relationship with computational media, our relationship to others, and our experience of the world.


Electronics ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 844
Author(s):  
Tsung-Yi Tang ◽  
Li-Yuan Hou ◽  
Tyng-Yeu Liang

With the rise of fog computing, users are no longer restricted to accessing resources located in central and distant clouds and can request services from neighboring fog nodes distributed over networks. This can effectively reduce the network latency of service responses and the load on data centers. Furthermore, it can prevent the Internet’s bandwidth from being used up by massive data flows from end users to clouds. However, fog-computing resources are distributed over multiple levels of networks and are managed by different owners. Consequently, the problem of service discovery becomes quite complicated. To resolve this problem, a decentralized service discovery method is required. Accordingly, this research proposes a service discovery framework based on the distributed ledger technology of IOTA. The proposed framework enables clients to search for service nodes directly through any node in the IOTA Mainnet, achieving the goals of public access and high availability while avoiding the network attacks that target the distributed hash tables commonly used for service discovery. Moreover, clients can obtain more comprehensive information by visiting known nodes and select the fog node able to provide services with the shortest latency. Our experimental results show that the proposed framework is cost-effective for distributed service discovery due to the advantages of IOTA, and that it indeed enables clients to obtain higher service quality through automatic node selection.
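As a rough illustration of the client-side selection step described above, the Python sketch below shows how a client might pick the lowest-latency fog node from a list of candidates. All names are hypothetical: fetch_service_announcements stands in for the actual query of service-registration messages through an IOTA Mainnet node (the paper's framework, not reproduced here), and the latency probe simply times a TCP connection set-up.

```python
import socket
import time


def fetch_service_announcements(tag: str) -> list[dict]:
    """Hypothetical stand-in for querying service-registration messages via an
    IOTA node; returns candidate fog nodes advertising the given service tag."""
    return [
        {"service": tag, "host": "fog-a.example.org", "port": 8080},
        {"service": tag, "host": "fog-b.example.org", "port": 8080},
    ]


def probe_latency(host: str, port: int, timeout: float = 1.0) -> float:
    """Measure connection set-up time to a candidate node; +inf if unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("inf")


def select_fog_node(tag: str) -> dict | None:
    """Return the reachable candidate with the shortest measured latency."""
    candidates = fetch_service_announcements(tag)
    measured = [(probe_latency(c["host"], c["port"]), c) for c in candidates]
    measured = [(lat, c) for lat, c in measured if lat != float("inf")]
    return min(measured, key=lambda x: x[0], default=(None, None))[1]


if __name__ == "__main__":
    node = select_fog_node("image-inference")
    print("selected fog node:", node)
```

The design choice mirrors the abstract: discovery is read from a public, highly available ledger rather than a DHT, and the final selection is made locally by the client from measured latencies.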


2021 ◽  
Vol 11 (11) ◽  
pp. 5067
Author(s):  
Paulo Veloso Gomes ◽  
António Marques ◽  
João Donga ◽  
Catarina Sá ◽  
António Correia ◽  
...  

The interactivity of an immersive environment arises from the relationship established between the user and the system. This relationship results in a set of data exchanges between human and technological actors. Real-time biofeedback devices make it possible to collect the biodata generated by the user during the exhibition as it is produced. Analyzing, processing, and converting these biodata into multimodal data makes it possible to relate the stimuli to the emotions they trigger. This work describes an adaptive model for managing biofeedback data flows used in the design of interactive immersive systems. An affective algorithm identifies the types of emotions felt by the user and their respective intensities. The mapping between stimuli and emotions creates a set of biodata that can be used as elements of interaction to readjust the stimuli generated by the system. The real-time interaction between the evolution of the user’s emotional state and the stimuli generated by the system allows the user to adapt attitudes and behaviors to the situations they face.
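A minimal sketch of such an adaptive biofeedback loop is given below. It is not the model described in the paper: the sensor reader, the affective algorithm, and the stimulus mapping are all placeholder assumptions, shown only to make the flow (biodata → emotion estimate → readjusted stimuli) concrete.

```python
import random
import time
from dataclasses import dataclass


@dataclass
class EmotionEstimate:
    label: str        # e.g. "calm" or "excitement"
    intensity: float  # normalised to the range 0..1


def read_biosignals() -> dict:
    """Placeholder for the biofeedback device driver; returns random heart-rate
    and skin-conductance values instead of real sensor readings."""
    return {"heart_rate": random.uniform(55, 120), "eda": random.uniform(0.1, 1.0)}


def affective_algorithm(sample: dict) -> EmotionEstimate:
    """Toy stand-in for an affective algorithm: a simple threshold rule mapping
    biodata to an emotion label and intensity."""
    arousal = (sample["heart_rate"] - 55) / 65 * 0.5 + sample["eda"] * 0.5
    label = "excitement" if arousal > 0.6 else "calm"
    return EmotionEstimate(label, min(max(arousal, 0.0), 1.0))


def readjust_stimuli(estimate: EmotionEstimate) -> dict:
    """Map the estimated emotional state back onto stimulus parameters,
    closing the feedback loop (illustrative rule only)."""
    return {
        "brightness": 1.0 - 0.5 * estimate.intensity,
        "soundscape": "ambient" if estimate.label == "calm" else "muted",
    }


if __name__ == "__main__":
    for _ in range(5):  # a few iterations of the loop
        sample = read_biosignals()
        estimate = affective_algorithm(sample)
        stimuli = readjust_stimuli(estimate)
        print(estimate, stimuli)
        time.sleep(0.5)
```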

