The Graph Neural Networking Challenge

2021 ◽  
Vol 51 (3) ◽  
pp. 9-16
Author(s):  
José Suárez-Varela ◽  
Miquel Ferriol-Galmés ◽  
Albert López ◽  
Paul Almasan ◽  
Guillermo Bernárdez ◽  
...  

During the last decade, Machine Learning (ML) has increasingly become a hot topic in the field of Computer Networks and is expected to be gradually adopted for a plethora of control, monitoring and management tasks in real-world deployments. This creates the need for new generations of students, researchers and practitioners with a solid background in ML applied to networks. In 2020, the International Telecommunication Union (ITU) organized the "ITU AI/ML in 5G Challenge", an open global competition that introduced some of the current main challenges in ML for networks to a broad audience. This large-scale initiative gathered 23 different challenges proposed by network operators, equipment manufacturers and academia, and attracted a total of 1300+ participants from 60+ countries. This paper narrates our experience organizing one of the proposed challenges: the "Graph Neural Networking Challenge 2020". We describe the problem presented to participants, the tools and resources provided, some organizational aspects and participation statistics, an outline of the top-3 awarded solutions, and a summary of the lessons learned throughout this journey. As a result, this challenge leaves a curated set of educational resources openly available to anyone interested in the topic.

2013 ◽  
Vol 14 (1) ◽  
pp. 51-61 ◽  
Author(s):  
Fabian Fischer ◽  
Johannes Fuchs ◽  
Florian Mansmann ◽  
Daniel A Keim

The enormous growth of data in recent decades has led to a wide variety of database technologies. Nowadays, we are capable of storing vast amounts of structured and unstructured data. To address the challenge of exploring and making sense of big data using visual analytics, tight integration of such backend services is needed. In this article, we introduce BANKSAFE, which was built for the VAST Challenge 2012 and won the outstanding comprehensive submission award. BANKSAFE is based on modern database technologies and is capable of visually analyzing vast amounts of monitoring data and security-related datasets from large-scale computer networks. To better describe and demonstrate the visualizations, we use the Visual Analytics Science and Technology (VAST) Challenge 2012 as a case study. Additionally, we discuss lessons learned during the design and development of BANKSAFE, which are also applicable to other visual analytics applications for big data.


1997 ◽  
Vol 3 (1) ◽  
pp. 23-26 ◽  
Author(s):  
R. Wootton

Telemedicine may be a useful technique for delivering health care in the developing world. However, there is little practical experience to draw on, and there are real concerns that, if additional resources were to become available, telemedicine might not be the most appropriate use for them. The logical steps to determine the place of telemedicine in the developing world therefore appear to be: (1) to identify potential telemedicine projects (the Telecommunication Development Bureau of the International Telecommunication Union is trying to do this and has recently sponsored missions to various countries in Africa and Asia); (2) to carry out properly controlled pilot projects in order to demonstrate technical feasibility and to quantify the benefits to the healthcare system; (3) to calculate the costs of large-scale deployment. Assuming that telemedicine is shown to be beneficial, it is only at this final stage that a rational decision can be made about whether telemedicine would be an appropriate use of additional resources in a developing country, as opposed to alternative uses of those resources to solve other important problems of health care.


Author(s):  
Matthias Kranz ◽  
Andreas Möller ◽  
Florian Michahelles

Large-scale research has gained momentum in the context of Mobile Human-Computer Interaction (Mobile HCI), as many aspects of mobile app usage can only be evaluated in the real world. In this chapter, we present findings on the challenges of research in the large via app stores, in conjunction with selected data collection methods (logging, self-reporting) that we identified and found useful in our research. As a case study, we investigated the adoption of NFC technology based on a gamification approach. To this end, we describe the development of the game NFC Heroes over two release cycles. We conclude with lessons learned and provide recommendations for conducting research in the large for mobile applications.


Author(s):  
Amparo Alonso-Betanzos ◽  
Verónica Bolón-Canedo ◽  
Diego Fernández-Francos ◽  
Iago Porto-Díaz ◽  
Noelia Sánchez-Maroño

With the advent of high dimensionality, machine learning researchers are now interested not only in accuracy but also in the scalability of algorithms. When dealing with large databases, pre-processing techniques are required to reduce input dimensionality, and machine learning can take advantage of feature selection, which consists of selecting the relevant features and discarding the irrelevant ones with minimal degradation in performance. In this chapter, we review the most up-to-date feature selection methods, focusing on their scalability properties. Moreover, we show how these learning methods are enhanced when applied to large-scale datasets and, finally, present some examples of the application of feature selection to real-world databases.
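The core idea this chapter reviews can be sketched in a few lines: score each feature by its relevance to the target and keep only the top-scoring ones. The sketch below is hypothetical, assuming a simple univariate correlation score rather than any specific method from the chapter:

```python
# Minimal feature-selection sketch: rank features by a univariate
# relevance score and keep the top k, discarding the rest.
import numpy as np

def select_k_best(X, y, k):
    """Score each feature by absolute Pearson correlation with the
    target and return the indices of the k highest-scoring features."""
    scores = []
    for j in range(X.shape[1]):
        c = np.corrcoef(X[:, j], y)[0, 1]
        scores.append(abs(c) if not np.isnan(c) else 0.0)
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only features 2 and 7 carry signal; the other 8 are irrelevant.
y = 3 * X[:, 2] - 2 * X[:, 7] + rng.normal(scale=0.1, size=200)
kept = select_k_best(X, y, k=2)
print(sorted(kept.tolist()))  # the two informative features: [2, 7]
```

Univariate scoring like this is what makes such methods scale: each feature is evaluated independently, so the cost grows linearly with dimensionality.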


2019 ◽  
Vol 14 (2) ◽  
pp. 363-374 ◽  
Author(s):  
Yoshitaka Shimizu ◽  
Yasuo Suzuki ◽  
Ryota Sasazawa ◽  
Yuichi Kawamoto ◽  
Hiroki Nishiyama ◽  
...  

Based on the lessons learned from the East Japan Great Earthquake of March 11, 2011, the authors have been engaged in the research and development of movable and deployable information and communication technology (ICT) resource units (MDRUs), which provide immediate, minimally required ICT service in areas struck by disaster. The MDRU is a transportable unit that contains the equipment necessary for ICT service provision and is designed to be quickly transported to and set up in affected areas after a natural disaster has struck. The unit is used to quickly construct a wireless local network in the area, and thus provide immediate, minimally required ICT service to the people there. In this paper, we describe MDRU technology and other technologies that improve performance and/or service when connected or linked with the unit. Alongside this development, we have conducted various activities aimed at its international deployment. Specifically, we conducted trials with resident participation in the Philippines and Nepal to verify the validity of these technologies overseas, where we were able to confirm their validity under various conditions. Furthermore, we have undertaken standardization activities and have succeeded in standardizing the MDRU at the International Telecommunication Union Telecommunication Standardization Sector (ITU-T).


Author(s):  
Georgia A. Papacharalampous ◽  
Hristos Tyralis ◽  
Demetris Koutsoyiannis

We perform an extensive comparison between 11 stochastic and 9 machine learning methods with regard to their multi-step-ahead forecasting properties by conducting 12 large-scale computational experiments. Each of these experiments uses 2,000 time series generated by linear stationary stochastic processes. We conduct each simulation experiment twice: first using time series of 110 values, and a second time using time series of 310 values. Additionally, we conduct 92 real-world case studies using mean monthly streamflow time series, and focus on one of them in particular to reinforce the findings and highlight important facts. We quantify the performance of the methods using 18 metrics. The results indicate that the machine learning methods do not differ dramatically from the stochastic ones, while none of the methods under comparison is uniformly better or worse than the rest. However, there are methods that are regularly better or worse than others according to specific metrics.
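A toy version of this kind of comparison can be sketched as follows: fit one stochastic model and one machine learning model on the same simulated series, then compare their multi-step-ahead forecasts on a held-out tail. The models below (least-squares AR(1) and a recursive k-nearest-neighbour regressor) are illustrative stand-ins, not the specific methods compared in the paper:

```python
# Toy multi-step-ahead comparison: an AR(1) model (stochastic) versus
# a k-NN regressor on lagged values (machine learning), both applied
# recursively to forecast the last 10 values of a simulated series.
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.7, 310
x = np.zeros(n)
for t in range(1, n):                     # simulate a linear AR(1) process
    x[t] = phi * x[t - 1] + rng.normal()

train, test = x[:300], x[300:]            # last 10 values held out

# Stochastic method: estimate phi by least squares, iterate forward.
phi_hat = np.dot(train[:-1], train[1:]) / np.dot(train[:-1], train[:-1])
ar_fc = [train[-1]]
for _ in range(len(test)):
    ar_fc.append(phi_hat * ar_fc[-1])
ar_fc = np.array(ar_fc[1:])

# ML method: k-NN on single lagged values (k=5), applied recursively.
lags, targets = train[:-1], train[1:]
def knn_step(v, k=5):
    idx = np.argsort(np.abs(lags - v))[:k]
    return targets[idx].mean()
ml_fc = [train[-1]]
for _ in range(len(test)):
    ml_fc.append(knn_step(ml_fc[-1]))
ml_fc = np.array(ml_fc[1:])

rmse = lambda f: np.sqrt(np.mean((f - test) ** 2))
print(round(rmse(ar_fc), 2), round(rmse(ml_fc), 2))  # two RMSE values
```

Scaling this pattern up to thousands of simulated series, many methods, and many metrics yields the kind of experiment the abstract describes.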


2020 ◽  
Vol 127 ◽  
pp. 106368 ◽  
Author(s):  
Lucy Ellen Lwakatare ◽  
Aiswarya Raj ◽  
Ivica Crnkovic ◽  
Jan Bosch ◽  
Helena Holmström Olsson

2018 ◽  
Vol 7 (3.12) ◽  
pp. 255 ◽  
Author(s):  
Dhivya V ◽  
Apoorva Kumar Singh

The Internet of Things (IoT) is a very broad concept: it is the name given to the interconnection of everyday devices to simplify tasks, ease their use, or provide useful information to the user. The International Telecommunication Union (ITU) defines IoT as "a global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies". The name "Internet of Things" was first coined in 1999 by Kevin Ashton in a presentation to Procter & Gamble. In this paper, we review the protocols, architecture, and applications that have surfaced in the area of the Internet of Things in recent years. The Internet of Things has the capability of changing a great part of the world we live in. IoT comprises an advanced cluster of sensors embedded into various "things" that ceaselessly transmit and share significant information with other devices and the cloud, information that helps us better understand how these things function and interact. But how can all of this happen on such a large scale, with so many devices transmitting data? A simple answer is the Internet of Things platform, which brings together diverse information and provides a common language for devices and apps to communicate with each other.
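The "common language" an IoT platform provides is typically a publish/subscribe message model, where devices publish readings to named topics and apps subscribe to the topics they care about. The in-memory broker below is a hypothetical miniature of that idea, not any specific platform's API:

```python
# Hypothetical sketch of an IoT platform's pub/sub core: devices and
# apps exchange messages on named topics through a shared broker.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register an app's callback for messages on a topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """Deliver a device's payload to every subscriber of the topic."""
        for cb in self.subscribers[topic]:
            cb(payload)

broker = Broker()
readings = []
broker.subscribe("home/temperature", readings.append)   # an app listens
broker.publish("home/temperature", {"sensor": "t1", "celsius": 21.5})
print(readings)  # [{'sensor': 't1', 'celsius': 21.5}]
```

Real IoT protocols such as MQTT follow this same topic-based pattern, with the broker additionally handling networking, authentication, and delivery guarantees.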


Author(s):  
Iqbal H. Sarker

In the current age of the Fourth Industrial Revolution (4IR, or Industry 4.0), the digital world has a wealth of data, such as Internet of Things (IoT) data, cybersecurity data, mobile data, business data, social media data, health data, etc. To intelligently analyze these data and develop the corresponding real-world applications, knowledge of artificial intelligence (AI), and particularly machine learning (ML), is key. Various types of machine learning algorithms exist in the area, such as supervised, unsupervised, semi-supervised, and reinforcement learning. In addition, deep learning, which is part of a broader family of machine learning methods, can intelligently analyze data at a large scale. In this paper, we present a comprehensive view of these machine learning algorithms, which can be applied to enhance the intelligence and capabilities of an application. Thus, this study's key contribution is explaining the principles of different machine learning techniques and their applicability in various real-world application areas, such as cybersecurity, smart cities, healthcare, business, agriculture, and many more. We also highlight the challenges and potential research directions based on our study. Overall, this paper aims to serve as a reference point not only for application developers but also for decision-makers and researchers in various real-world application areas, particularly from a technical point of view.
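The distinction between the algorithm families the abstract lists can be made concrete on a single dataset: supervised learning uses labels, while unsupervised learning recovers structure without them. The sketch below is a minimal illustration of that contrast (nearest-centroid classification versus 1-D k-means), not any specific method from the paper:

```python
# Supervised vs. unsupervised learning on the same 1-D data:
# labels drive a nearest-centroid classifier, while k-means
# clustering recovers the same structure without seeing any labels.
import numpy as np

X = np.array([1.0, 1.2, 0.9, 5.0, 5.3, 4.8])
y = np.array([0, 0, 0, 1, 1, 1])           # labels known: supervised setting

# Supervised: one centroid per labelled class.
centroids = np.array([X[y == c].mean() for c in (0, 1)])

def classify(x):
    return int(np.argmin(np.abs(centroids - x)))

# Unsupervised: a few iterations of 1-D k-means, labels never seen.
centers = np.array([0.0, 10.0])
for _ in range(10):
    assign = np.argmin(np.abs(X[:, None] - centers[None, :]), axis=1)
    centers = np.array([X[assign == c].mean() for c in (0, 1)])

print(classify(1.1), classify(5.1))    # 0 1
print(np.round(np.sort(centers), 2))   # cluster centers near the class means
```

Semi-supervised learning sits between the two (a few labels plus much unlabelled data), and reinforcement learning replaces the fixed dataset with an agent learning from rewards.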

