Does exposed JSON data make paid e-songs vulnerable?

2019 ◽  
Author(s):  
Jimut Bahan Pal

JavaScript Object Notation (JSON) is a popular way of interchanging data between clients and servers. It is easy for computers to parse and generate JSON objects, but is it a secure way of transferring data? If the JSON is exposed, the answer is no, because exposed JSON makes scraping easier. Raw data scraped from the web is useless unless the meaningful items are extracted from it. With the help of automated bots, some companies regularly scrape their competitors’ websites to continuously monitor their progress, without any human interaction. Google is a well-known example of a scraper: it crawls the entire web at a very high rate and stores the pages in its database so that the PageRank algorithm can index them. This paper investigates a popular website for exposed JSON that allows paid e-songs to be downloaded free of cost. Results show that such sites lack security, which may result in the loss of their earnings.
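
A minimal sketch of the kind of scraping an exposed JSON endpoint permits; the endpoint URL and field names below are hypothetical illustrations, not taken from the paper:

```python
# Sketch: if a site exposes a JSON API for paid content, one HTTP request
# yields structured data, and a second retrieves the asset. Hypothetical URL
# and "stream_url" field; the paper's target site is not reproduced here.
import json
import urllib.request

ENDPOINT = "https://example.com/api/songs?id=42"  # hypothetical exposed endpoint

with urllib.request.urlopen(ENDPOINT) as resp:
    data = json.load(resp)

# If the response embeds a direct media URL, the paid asset is one request away.
media_url = data.get("stream_url")
if media_url:
    urllib.request.urlretrieve(media_url, "song.mp3")
```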

2013 ◽  
Vol 569-570 ◽  
pp. 652-659 ◽  
Author(s):  
Gert de Sitter ◽  
Wout Weitjens ◽  
Mahmoud El-Kafafy ◽  
Christof Devriendt

This paper presents the first results of a long-term monitoring campaign on an offshore wind turbine in the Belgian North Sea. It focuses on the vibration levels and resonant frequencies of the fundamental modes of the support structure. These parameters are crucial for minimizing O&M costs and extending the lifetime of offshore wind turbine structures. For monopile foundations, for example, scouring and the reduction of foundation integrity over time are especially problematic because they lower the fundamental structural resonance of the support structure, shifting that resonance frequency closer to the low-frequency band. Since both the broadband wave energy and the rotating frequency of the turbine lie in this band, the lowered natural frequency can create resonant behavior that increases fatigue damage. Continuous monitoring of the effect of scour on the dynamics of the wind turbine will help to optimize maintenance activities on the scour protection system. Proper continuous monitoring during operation requires reliable, state-of-the-art operational modal analysis techniques, which are presented in this paper. The methods are also automated, so that no human interaction is required and the system can track the natural frequencies and damping ratios in a reliable manner.
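
A hedged sketch of automated resonance tracking: the campaign relies on full operational modal analysis, but the core idea of tracking a fundamental frequency from vibration data without human interaction can be illustrated by peak-picking a Welch power spectral density. Sampling rate, window length, and the synthetic 0.35 Hz mode are assumptions for illustration:

```python
# Sketch: estimate the dominant resonance frequency from accelerometer data
# by locating the peak of a Welch PSD. Stands in for (not equivalent to) the
# paper's operational modal analysis techniques.
import numpy as np
from scipy.signal import welch

def fundamental_frequency(acceleration: np.ndarray, fs: float) -> float:
    """Return the frequency (Hz) of the dominant spectral peak."""
    freqs, psd = welch(acceleration, fs=fs, nperseg=4096)
    return freqs[np.argmax(psd)]

# Synthetic example: a 0.35 Hz mode sampled at 50 Hz with measurement noise.
fs = 50.0
t = np.arange(0, 600, 1 / fs)
signal = np.sin(2 * np.pi * 0.35 * t) + 0.5 * np.random.randn(t.size)
print(f"tracked natural frequency: {fundamental_frequency(signal, fs):.2f} Hz")
```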


2019 ◽  
Vol 70 (3) ◽  
pp. 131-145 ◽  
Author(s):  
Raimondo Gallo ◽  
Gianluca Ristorto ◽  
Alex Bojeri ◽  
Nadia Zorzi ◽  
Gabriele Daglio ◽  
...  

Summary: The aim of the WEQUAL project (WEb service centre for QUALity multidimensional design and tele-operated monitoring of Green Infrastructures) is the development of a system able to support quick environmental monitoring of riparian areas subjected to the realization of new green infrastructures (GI). WEQUAL’s idea is to organize a service centre able to manage both the Web Platform and the whole data collection and analysis process. Through a personal account, the final user (designer, technician, researcher) can access the service and request the evaluation of alternative GI projects. On the Web Platform, a set of algorithms computes, through automatic procedures, all the ecological criteria required to evaluate a quality environmental index that describes the eco-morphological value of the monitored riparian areas. For this purpose, the WEQUI index was developed, which uses 15 indicators that are easy to monitor. This paper describes the approach for environmental data collection and the procedures that perform the automatic assessment of two of the ecological criteria. For the computation, the implemented algorithms use data including vegetation indexes, a Digital Terrain Model (DTM), a Digital Surface Model (DSM) and a 3D point cloud classification. All the raw data are collected by UAVs (Unmanned Aerial Vehicles) equipped with a 3D lidar, a multispectral camera and an RGB camera. The WEQUI index is assessed by interpreting all the raw data collected by these sensors with a multi-attribute approach. The computed ecological index is then used to assess riparian environmental quality before (ex-ante) and after (ex-post) river stabilization works. This index, integrated with additional non-technical or non-ecological indicators such as the required investment, maintenance costs or social acceptance, can be used in multicriteria analyses to evaluate the intervention from a wider point of view. The platform is expected to be attractive for GI designers and policy makers by providing a shared environment that integrates the detection and evaluation of complex indexes with a multidimensional evaluation supported by an expert guide.
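
An illustrative sketch, not the WEQUAL implementation, of two inputs its algorithms rely on: a vegetation index (NDVI) computed from multispectral bands, and a canopy height model obtained as the DSM/DTM difference. The array names and toy values are assumptions:

```python
# Sketch: derive NDVI and a canopy height model from UAV raster products,
# the kind of intermediate data the WEQUI criteria algorithms consume.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Vegetation/structure height: surface model minus bare-terrain model."""
    return np.clip(dsm - dtm, 0.0, None)

# Toy 2x2 rasters standing in for UAV-derived grids.
nir = np.array([[0.6, 0.5], [0.7, 0.4]])
red = np.array([[0.1, 0.2], [0.1, 0.3]])
dsm = np.array([[12.0, 9.5], [11.2, 8.0]])
dtm = np.array([[8.0, 8.0], [8.1, 7.9]])
print(ndvi(nir, red))
print(canopy_height_model(dsm, dtm))
```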


Author(s):  
Epi Ludvik Nekaj

A digital transformation is underway, one that is redefining the essence of human interaction: people connect with ideas, share unused resources and create new on-demand services that are customisable and unique. These are only a few examples of real productivity that, when layered on the Internet, create an abundance of resources and opportunity. This people-powered abundance is called the crowd economy. It is changing the way society lives, works and plays. A new paradigm shift challenges traditional notions of the “norm” while expanding possibilities. The hallmark of the digital age is social connections boosted by the web and mobile networks. These technological advances have taken collaboration and cooperation to a level never seen before. Social connections through the web have gone beyond social media likes and shares and have evolved into social productivity: a phenomenon that arises when networked crowds collaborate to solve problems, raise funds, and come up with innovative ideas and solutions.


Author(s):  
Jana Polgar ◽  
Robert Mark Braum ◽  
Tony Polgar

Most of today’s portal implementations provide a model that facilitates plugging various components (portlets) into the portal infrastructure. Portlets run locally on the portal server, process input data, and render output. A local portlet combined with a good caching strategy for the content improves the response times, performance, and scalability of portal systems. However, we often need to access remote Web services. One solution is to use a local portlet to access a remote Web service via its interface, obtain the required results as a raw data stream, and render the results locally in a fragment. This approach is suitable for data-oriented Web services. An alternative solution is to equip the Web service with an additional interface in the form of a portlet. When the Web service is called, it returns the entire portlet instead of raw data. This approach is suitable for presentation-oriented Web services.
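
A language-neutral sketch (in Python rather than a Java portlet API) of the first, data-oriented approach: call the remote service, receive raw data, and render the markup fragment locally. The service URL and JSON field names are hypothetical:

```python
# Sketch: local rendering of a remote data-oriented Web service's raw output,
# as a portlet would do before contributing its fragment to the portal page.
import json
import urllib.request

def render_fragment(service_url: str) -> str:
    """Fetch raw data from a remote service and render it as an HTML fragment."""
    with urllib.request.urlopen(service_url) as resp:
        items = json.load(resp)
    rows = "".join(f"<li>{item['name']}: {item['value']}</li>" for item in items)
    return f"<ul>{rows}</ul>"

# The presentation-oriented alternative would instead return this markup
# from the service itself, so the caller embeds it without local rendering.
print(render_fragment("https://example.com/api/report"))  # hypothetical URL
```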


Author(s):  
Bouchra Frikh ◽  
Brahim Ouhbi

The World Wide Web has emerged as the biggest and most popular means of communication and information dissemination. The Web expands every day, and people generally rely on search engines to explore it. Because of its rapid and chaotic growth, the resulting network of information lacks organization and structure. It is a challenge for service providers to deliver proper, relevant and high-quality information to Internet users by using web page contents and the hyperlinks between web pages. This paper analyses and compares web page ranking algorithms based on various parameters, to find out their advantages and limitations for ranking web pages and to indicate the further scope of research in web page ranking algorithms. Six important algorithms are presented and their performances discussed: PageRank, Query-Dependent PageRank, HITS, SALSA, Simultaneous Terms Query-Dependent PageRank (SQD-PageRank) and Onto-SQD-PageRank.
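
For orientation, a minimal power-iteration sketch of the textbook PageRank formulation that the compared variants build upon; graph, damping factor, and iteration count are illustrative:

```python
# Sketch: basic PageRank by power iteration over an adjacency list.
# d is the damping factor; dangling nodes spread their rank uniformly.
def pagerank(links: dict[str, list[str]], d: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling node
                for p in pages:
                    new[p] += d * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new[target] += d * rank[page] / len(outlinks)
        rank = new
    return rank

# Toy three-page web graph.
print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))
```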


Author(s):  
Sunny Sharma ◽  
Sunita Sunita ◽  
Arjun Kumar ◽  
Vijay Rana

The emergence of Web technology generated a massive amount of raw data by enabling Internet users to post their opinions, comments, and reviews on the web. Extracting useful information from this raw data can be a very challenging task. Search engines play a critical role in these circumstances, and user queries are becoming a main issue for them, so a preprocessing operation is essential. In this paper, we present a framework for natural language preprocessing for efficient data retrieval, covering some of the processing required for effective retrieval, such as elongated word handling, stop word removal, and stemming. The manuscript starts by building a manually annotated dataset and then takes the reader through the detailed steps of the process. Experiments are conducted for individual stages of this process to examine the accuracy of the system.
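
A hedged sketch of the preprocessing stages the paper names: elongated-word handling, stop-word removal, and stemming. The stop list and suffix-stripping stemmer below are deliberately tiny stand-ins, not the paper's resources:

```python
# Sketch: a minimal preprocessing pipeline for noisy user-generated text.
import re

STOP_WORDS = {"the", "is", "a", "an", "of", "and", "to"}  # illustrative subset

def normalize_elongated(token: str) -> str:
    """Collapse runs of 3+ repeated letters: 'coooool' -> 'cool'."""
    return re.sub(r"(.)\1{2,}", r"\1\1", token)

def crude_stem(token: str) -> str:
    """Toy suffix stripper standing in for e.g. Porter's algorithm."""
    for suffix in ("ing", "ed", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def preprocess(text: str) -> list[str]:
    tokens = re.findall(r"[a-z]+", text.lower())
    tokens = [normalize_elongated(t) for t in tokens]
    tokens = [t for t in tokens if t not in STOP_WORDS]
    return [crude_stem(t) for t in tokens]

print(preprocess("The moviiie is amaaazing and the actors performed well"))
```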


This paper addresses a supplementary factor to the general PageRank calculation that Google uses to rank sites in its web index results. These additional factors incorporate a few ideas that explicitly increase the precision of estimating the PageRank value. The similarity between the content of a web page and the text extracted from the pages ranked topmost in a search on a few keywords of the page under consideration is judged with a similarity measure; this yields a value, or percentage, that represents the importance or similarity factor. In the same way, if sentiment analysis is applied, the search results for those keywords can be compared with the keywords of the page under consideration, yielding a sentiment-analysed factor. In this way, the page ranking procedure can be improved and executed with better accuracy. The Hadoop Distributed File System is used to compute the PageRank of the input nodes, and Python is chosen for the parallel PageRank algorithm executed on Hadoop.
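
A sketch of the content-similarity factor described above: cosine similarity between the candidate page's text and text drawn from top-ranked search results, using a simple bag-of-words model. This is a pure-Python stand-in for illustration, not the paper's Hadoop pipeline, and the example texts are invented:

```python
# Sketch: bag-of-words cosine similarity as a supplementary ranking factor.
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

page_text = "fast indexing of web pages for search"
result_text = "search engines index web pages fast"
print(f"similarity factor: {cosine_similarity(page_text, result_text):.2f}")
```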


2016 ◽  
Vol 55 (04) ◽  
pp. 305-311 ◽  
Author(s):  
Nikolaus Marx ◽  
Thomas Deserno

Summary
Background: Since 1942, when Goldberger introduced the 12-lead electrocardiography (ECG), this diagnostic method has not been changed.
Objectives: After 70 years of technologic developments, we revisit Holter ECG from recording to understanding.
Methods: A fundamental change is foreseen towards “computational ECG” (CECG), where continuous monitoring produces big data volumes that are impossible to inspect conventionally and instead require efficient computational methods. We draw parallels between CECG and computational biology, in particular with respect to computed tomography, computed radiology, and computed photography. From that, we identify the technology and methodology needed for CECG.
Results: Real-time transfer of raw data into meaningful parameters that are tracked over time will allow prediction of serious events, such as sudden cardiac death. Evolved from Holter’s technology, portable smartphones with Bluetooth-connected textile-embedded sensors will capture noisy raw data (recording), process meaningful parameters over time (analysis), and transfer them to cloud services for sharing (handling), predicting serious events, and alarming (understanding). To make this happen, the following fields need more research: i) signal processing, ii) cycle decomposition, iii) cycle normalization, iv) cycle modeling, v) clinical parameter computation, vi) physiological modeling, and vii) event prediction.
Conclusions: We shall start immediately developing methodology for CECG analysis and understanding.
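
A hedged sketch of one CECG building block, cycle decomposition: locating R-peaks in a noisy ECG-like signal so that each cardiac cycle can be segmented and parameters tracked over time. The thresholds and the synthetic signal are illustrative assumptions:

```python
# Sketch: R-peak detection as the entry point to cycle decomposition.
import numpy as np
from scipy.signal import find_peaks

def r_peaks(ecg: np.ndarray, fs: float) -> np.ndarray:
    """Indices of R-peaks: prominent maxima at physiologic spacing (>0.3 s)."""
    peaks, _ = find_peaks(ecg, distance=int(0.3 * fs), prominence=0.5)
    return peaks

# Synthetic stand-in: one sharp peak per second (60 bpm) plus noise at 250 Hz.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ecg = np.exp(-((t % 1.0 - 0.5) ** 2) / 0.0005) + 0.05 * np.random.randn(t.size)
peaks = r_peaks(ecg, fs)
print(f"{len(peaks)} beats detected -> heart rate ~ {len(peaks) / 10 * 60:.0f} bpm")
```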


2012 ◽  
Vol 8 (4) ◽  
pp. 708762 ◽  
Author(s):  
Sungmo Jung ◽  
Jae Young Ahn ◽  
Dae-Joon Hwang ◽  
Seoksoo Kim

In ubiquitous healthcare systems, machine-to-machine (M2M) communication promises large opportunities, as it leverages rapidly developing technologies for large-scale networking of devices to monitor patients without depending on human interaction. With the emergence of wireless multimedia sensor networks (WMSNs), M2M communications improve the continuous monitoring, transmission and retrieval of multimedia content such as video and audio streams, images, and sensor data from the patient being monitored. This research deploys a WMSN for continuous monitoring of target patients and reports on tracking for preventive ubiquitous healthcare. The study applies an optimization scheme for movement coordination and data routing within the monitored area. A movement tracking algorithm is proposed for better patient tracking and to aid the optimal deployment of wireless sensor networks. Results show that our optimization scheme is capable of providing scalable and reliable patient monitoring.
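
An illustrative sketch, not the paper's algorithm: an alpha-beta filter smoothing a patient's 2D position from noisy WMSN location fixes, the general kind of movement tracking such a deployment performs. The gains, time step, and sample fixes are assumptions:

```python
# Sketch: alpha-beta tracking of noisy (x, y) position fixes.
import numpy as np

def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Smooth noisy (x, y) fixes into position estimates."""
    pos = np.array(measurements[0], dtype=float)
    vel = np.zeros(2)
    estimates = [pos.copy()]
    for z in measurements[1:]:
        pred = pos + vel * dt                  # predict next position
        residual = np.asarray(z, dtype=float) - pred
        pos = pred + alpha * residual          # correct position estimate
        vel = vel + (beta / dt) * residual     # correct velocity estimate
        estimates.append(pos.copy())
    return estimates

fixes = [(0, 0), (1.2, 0.9), (2.1, 2.2), (2.8, 3.1)]  # noisy sensor readings
for p in alpha_beta_track(fixes):
    print(p)
```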


2020 ◽  
Vol 8 (1) ◽  
pp. 4-15
Author(s):  
Niclas Hagen

The purpose of this paper is to investigate online public participation and engagement in science through crowdsourcing platforms. To fulfil this purpose, the paper uses the crowdsourcing platform Zooniverse as a case study, as it constitutes the most prominent and established citizen science platform today. The point of departure for the analysis is that Zooniverse can be seen as a “platformization” of citizen science and scientific citizenship. The paper suggests that the mobilisation of individuals who participate and engage in science on the Zooniverse platform takes place through an epistemic culture that emphasises both authenticity and the prospect of novel discoveries. Yet, in the process of turning “raw” data into usable data, Zooniverse has implemented a framework that structures the crowd, which limits the sort of participation the platform offers. This limitation means that the platform as a whole can hardly be seen as fostering a more radical democratic inclusion, for example in the form of a co-production of scientific knowledge that dissolves the institutional borders between scientists and non-professional volunteers.

