internet address
Recently Published Documents

TOTAL DOCUMENTS: 47 (five years: 5)
H-INDEX: 6 (five years: 1)
F1000Research, 2021, Vol 10, pp. 8
Author(s): Philip Jacobs, Arvi Ohinmaa

We present a database listing local government mask orders for COVID-19 that were enacted between April and September, 2020, prior to the date that the governors issued statewide mask wearing mandates. We obtained data from a Google search of web pages of local and national commercial and public broadcasters and newspapers, and of the orders themselves.  In the database, we present data identifying the county, municipality or tribal council, date of the order, and the source’s internet address. In the 34 states with statewide orders, local governments in 21 of these states issued mandates in 218 municipalities, 155 counties, and 1 tribal council.  The dataset can be accessed from https://doi.org/10.7939/DVN/NDFEHK


2020, Vol 67 (4), pp. 1459-1475
Author(s): Stefano Angieri, Alberto Garcia-Martinez, Bingyang Liu, Zhiwei Yan, Chuang Wang, ...

2020, Vol 2 (2), pp. 62-70
Author(s): Miftakhul Anggita Bima Ferdinand, Aji Prasetya Wibawa, Ilham Ari Elbaith Zaeni, Harits Ar Rosyid

The average number of unique visitors per day to an electronic journal indicates how attractive its published papers are. The number of unique visitors is therefore an important indicator of an electronic journal's success in meeting the expansion, dissemination, and acceleration goals of the journal accreditation system. A unique visitor is counted once per Internet address (IP) that accesses the journal within a given period. Several methods are commonly used for forecasting, among them the Multilayer Perceptron (MLP). Data quality has a large influence on building a good MLP model, because the success or failure of MLP modeling depends heavily on the input data. One way to improve data quality is to smooth the data. In this study, an MLP forecasting method from previous research was used, with an 80%-20% training-testing split, a 2-1-1 architecture, and a learning rate of 0.4. To improve data quality, the data were first smoothed with the Single Exponential Smoothing method. The best results were obtained with alpha = 0.9, giving an MSE accuracy of 94.02% and an RMSE of 75.54%, with an execution time of 580.27 seconds.
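The preprocessing pipeline the abstract describes (Single Exponential Smoothing with alpha 0.9, then sliding windows for a 2-input MLP and an 80%-20% split) can be sketched as follows. This is a minimal sketch: the visitor counts are hypothetical, and the actual MLP training is omitted.

```python
def single_exponential_smoothing(series, alpha=0.9):
    """Smooth a series: s[t] = alpha*x[t] + (1 - alpha)*s[t-1]."""
    smoothed = [series[0]]  # seed with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

def make_windows(series, n_inputs=2):
    """Build (input, target) pairs for an MLP with n_inputs input nodes."""
    return [(series[i:i + n_inputs], series[i + n_inputs])
            for i in range(len(series) - n_inputs)]

# Hypothetical daily unique-visitor counts
visits = [120, 135, 128, 150, 142, 160, 155, 170, 165, 180]
smoothed = single_exponential_smoothing(visits, alpha=0.9)
pairs = make_windows(smoothed, n_inputs=2)   # 2 inputs -> 2-1-1 architecture
split = int(0.8 * len(pairs))                # 80% training, 20% testing
train, test = pairs[:split], pairs[split:]
```

With alpha close to 1, each smoothed value stays close to the latest observation, so the smoothing removes noise without lagging far behind the series.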


2020, pp. 1672-1685
Author(s): Timo Kiravuo, Seppo Tiilikainen, Mikko Särelä, Jukka Manner

The developed society depends on many critical infrastructure processes, such as power generation, water treatment, many types of manufacturing, and smart buildings. These processes need control and the automation industry has embraced the Internet to connect all these controls. However, the controlling devices thus opened to the world do not always have adequate safeguards to withstand malicious users. Many automation systems have default passwords or known and unknown backdoors. Also, often those systems are not updated to close security weaknesses found after original installation. The authors argue that while the industry is familiar with the notion of safety of equipment and processes, it has not focused enough on IT security. Several years ago the Shodan search engine showed how easy it is to find these control devices on the Internet. The authors followed this research line further by targeting one nation's IP address space with Shodan and found thousands of control systems, many of which represent models and versions with known vulnerabilities. Their first contribution is presenting these findings and analyzing their significance. Their study started in 2012 and the most recent results are from the end of 2015. To gain further knowledge, they have built a prototype scanner capable of finding industrial control systems. This lets the authors evaluate the possibility of performing routine scans to gauge the vulnerability of a nation. Their second contribution is to present a template for a national Internet scanning program. The authors discuss the technology, performance, and legality of such a program. Based on their findings and analysis they argue that nations should continuously monitor their own Internet address space for vulnerabilities. The authors' findings indicate that the current level of vulnerabilities is significant and unacceptable. 
Scanning a nation's critical infrastructure can be done in minutes, allowing vulnerabilities to be kept under tight control. In addition, the authors argue that current legislation and the rights of government officials should be extended to bring more security to national critical infrastructure; this discussion is their third contribution. Cyberspace has become a playing field for criminals, terrorists, and nation states, all of whom may have a motive to disrupt the daily life of a nation, and causing such disruptions is currently too easy.
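The scanning approach described above, probing an address space for reachable control systems, can be sketched with a minimal TCP connect probe. The port list and service names below are illustrative assumptions, not the authors' actual probe set, and a real national scan would enumerate whole prefixes in parallel rather than probe one host:

```python
import socket

# Common industrial-control-system service ports (illustrative subset).
ICS_PORTS = {
    502: "Modbus/TCP",
    102: "Siemens S7 (ISO-TSAP)",
    44818: "EtherNet/IP",
    20000: "DNP3",
}

def probe_host(ip, ports=ICS_PORTS, timeout=0.5):
    """Return the subset of ports that accept a TCP connection on ip."""
    open_services = {}
    for port, service in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((ip, port)) == 0:  # 0 means the connect succeeded
                open_services[port] = service
    return open_services

# Example (commented out; probing hosts you do not own may be illegal,
# which is exactly the legality question the authors discuss):
# print(probe_host("192.0.2.1"))
```

A service like Shodan adds banner grabbing on top of this, so that the device model and firmware version can be matched against known-vulnerability databases.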


2018, Vol 1 (1)
Author(s): Philip Petrov, Aleksandra Zhecheva
The article presents a website dedicated to sophism problems, which can be used as a didactic tool to help teachers of Mathematics, Informatics, and Information Technologies. The project began as part of a diploma thesis in the Master's program "Technologies of Education in Mathematics and Informatics" at Sofia University "St. Kliment Ohridski", entitled "Sophisms in help of the teachers of Mathematics, Informatics and Informational Technologies". The website is available at the following Internet address: https://sofizmi.cphpvb.net.


2018
Author(s): Sigit Haryadi

This notebook proposes a new method of calculating an equilibrium index for physics and astronomy, using a formula I created in April 2016 that was originally named "the Haryadi Index" and later renamed "the Harmony in Gradation". The idea may be useful for physicists and astronomers examining the "future" of a pair of objects that must move in order to reach a state of equilibrium. I invite the world's physicists and astronomers to examine the usefulness of the method further and to write about it in journals; as a preliminary study, experts may get a feel for the method by running some calculations with the internet calculator I created on my blog, at the internet address http://sigitharyadi.net/multidicipline/equilibrium-index-calculation/


Fog Computing, 2018, pp. 158-182
Author(s): Dan Jen, Michael Meisel, Daniel Massey, Lan Wang, Beichuan Zhang, ...

The global routing system has seen a rapid increase in table size and routing changes in recent years, mostly driven by the growth of edge networks. This growth reflects two major limitations in the current architecture: (a) the conflict between provider-based addressing and edge networks' need for multihoming, and (b) flat routing's inability to provide isolation from edge dynamics. In order to address these limitations, we propose A Practical Tunneling Architecture (APT), a routing architecture that enables the Internet routing system to scale independently from edge growth. APT partitions the Internet address space in two, one for the transit core and one for edge networks, allowing edge addresses to be removed from the routing table in the transit core. Packets between edge networks are tunneled through the transit core. In order to automatically tunnel the packets, APT provides a mapping service between edge addresses and the addresses of their transit-core attachment points. We conducted an extensive performance evaluation of APT using trace data collected from routers at two major service providers. Our results show that APT can tunnel packets through the transit core by incurring extra delay on up to 0.8% of all packets at the cost of introducing only one or a few new or repurposed devices per AS.
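APT's tunneling step, as described above, looks up an edge destination in the mapping service and wraps the packet with the address of its transit-core attachment point. A minimal sketch follows; the prefixes, attachment-point addresses, and dict-based packet representation are all hypothetical, not APT's wire format:

```python
import ipaddress

# Hypothetical mapping table: edge prefix -> transit-core attachment point.
# In APT this table is served by the mapping service; the entries use
# RFC 5737 documentation addresses.
MAPPING = {
    ipaddress.ip_network("198.51.100.0/24"): "10.0.0.1",
    ipaddress.ip_network("203.0.113.0/24"): "10.0.0.2",
}

def lookup_attachment_point(dst_ip):
    """Longest-prefix match of an edge destination to its attachment point."""
    dst = ipaddress.ip_address(dst_ip)
    best = None
    for prefix, attach in MAPPING.items():
        if dst in prefix and (best is None or prefix.prefixlen > best[0].prefixlen):
            best = (prefix, attach)
    return best[1] if best else None

def encapsulate(packet, dst_ip):
    """Tunnel an edge packet: wrap it with a transit-core outer header."""
    attach = lookup_attachment_point(dst_ip)
    if attach is None:
        raise ValueError("no mapping for " + dst_ip)
    return {"outer_dst": attach, "inner": packet}

tunneled = encapsulate({"dst": "198.51.100.7", "payload": b"data"}, "198.51.100.7")
```

Because transit-core routers only carry routes for the attachment points, churn inside edge networks never reaches the core routing table, which is the isolation property the architecture is after.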


2017, Vol 33 (3), pp. 253-263
Author(s): Debora A. Person, Tawnya K. Plumb

Purpose – For many years, the librarians at the University of Wyoming's George William Hopper Law Library fielded questions about the history of the law school and its alumni. Unfortunately, no single collection of institutional historical documents was available to search for relevant answers. The result was a decision to collect historic materials in a digital archive, both to make them available to anyone in the law school who might field such inquiries and to preserve them for future interest. The purpose of this case study is to provide a blueprint for building a digital archive from the ground up.
Design/methodology/approach – The digital archive began with print-born historical documents, scanned as preservation copies and entered into a database of images and files to which searchable metadata could be added. In addition to historical materials, it was important to collect the materials that the law school and the law library were producing. The project was therefore twofold: collect, preserve, and make searchable the printed historic materials in a digital environment; and harvest, preserve, and make searchable print-born and digital-born materials as part of an ongoing process. To do this, appropriate software had to be identified.
Findings – The following steps blueprint the building of an archive on a digital platform: establish the site's internet address, title, and description; select a look-and-feel template and personalize the archive; create collections; identify Dublin Core preferences; add items and files using a controlled vocabulary; experiment with any available plugins; and promote and provide access to the archive.
Originality/value – The digital archive project initiated by the library has led to other initiatives and opportunities for service.
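The metadata steps in the blueprint (Dublin Core fields plus a controlled vocabulary for items) can be illustrated with a minimal sketch. The field subset, vocabulary terms, and record values below are hypothetical, not the library's actual schema:

```python
# Hypothetical controlled vocabulary for the archive's subject field.
CONTROLLED_SUBJECTS = {"Law School History", "Alumni", "Faculty"}

def make_dc_record(title, creator, date, subject, description=""):
    """Build a Dublin Core metadata dict, enforcing the subject vocabulary."""
    if subject not in CONTROLLED_SUBJECTS:
        raise ValueError("subject not in controlled vocabulary: " + subject)
    return {
        "dc:title": title,
        "dc:creator": creator,
        "dc:date": date,
        "dc:subject": subject,
        "dc:description": description,
        "dc:type": "Text",
    }

record = make_dc_record("Commencement Program", "College of Law",
                        "1952-06-02", "Law School History")
```

Enforcing the vocabulary at entry time is what keeps the archive searchable: every item filed under the same subject uses exactly the same term.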

