From photons to big-data applications: terminating terabits

Author(s):  
Noa Zilberman ◽  
Andrew W. Moore ◽  
Jon A. Crowcroft

Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers.

Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 500 ◽  
Author(s):  
Luca Palmieri ◽  
Gabriele Scrofani ◽  
Nicolò Incardona ◽  
Genaro Saavedra ◽  
Manuel Martínez-Corral ◽  
...  

Light field technologies have seen a rise in recent years, and microscopy is a field where such technology has had a deep impact. The possibility of providing spatial and angular information at the same time and in a single shot brings several advantages and allows for new applications. A common goal in these applications is the calculation of a depth map to reconstruct the three-dimensional geometry of the scene. Many approaches are applicable, but most of them cannot achieve high accuracy because of the nature of such images: biological samples are usually poor in features and do not exhibit sharp colors like natural scenes. Under such conditions, standard approaches produce noisy depth maps. In this work, a robust approach is proposed in which accurate depth maps can be produced by exploiting the information recorded in the light field, in particular from images produced with a Fourier integral microscope. The proposed approach can be divided into three main parts. Initially, it creates two cost volumes using different focal cues, namely correspondences and defocus. Secondly, it applies filtering methods that exploit multi-scale and super-pixel cost aggregation to reduce noise and enhance accuracy. Finally, it merges the two cost volumes and extracts a depth map through multi-label optimization.
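The final merge-and-extract step can be sketched on synthetic data. The weighted sum and winner-take-all argmin below are simple stand-ins for the cost aggregation and multi-label optimization the paper actually uses; all array shapes, weights, and depth values are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def merge_cost_volumes(c_corr, c_defocus, w=0.5):
    """Weighted merge of the correspondence and defocus cost volumes
    (each H x W x L). A simple stand-in for the paper's merging step."""
    return w * c_corr + (1 - w) * c_defocus

def depth_from_cost(cost, depths):
    """Winner-take-all: pick the depth label with minimum cost per pixel
    (a crude substitute for multi-label optimization)."""
    return depths[np.argmin(cost, axis=2)]

# Synthetic 4 x 4 cost volumes over 3 hypothetical depth labels
rng = np.random.default_rng(0)
c_corr = rng.random((4, 4, 3))
c_defocus = rng.random((4, 4, 3))
depths = np.array([0.5, 1.0, 1.5])
dmap = depth_from_cost(merge_cost_volumes(c_corr, c_defocus), depths)
print(dmap.shape)  # (4, 4)
```

A real pipeline would insert the multi-scale and super-pixel filtering between the two functions, smoothing each cost slice before the per-pixel decision.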


2014 ◽  
Vol 556-562 ◽  
pp. 6350-6353 ◽  
Author(s):  
Hao Li Ren ◽  
Jian Wei Zhang ◽  
Kong Yang Peng

This paper presents a soft-bus radar system model based on a multi-core processor computer. The soft-bus uses a distributed network communication protocol called Data Distribution Service (DDS). Plug-in management, dataflow control, and system data transport for the data-processing functions are supported by a unified protocol in the design. The design refines and extends the functions of radars currently in use.
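The abstract gives no implementation details, but the soft-bus pattern it describes is topic-based publish/subscribe, which DDS standardizes. The sketch below is a deliberately minimal illustration of that pattern; the class, topic names, and payload are invented, and a real DDS soft-bus adds QoS policies, discovery, and plug-in management.

```python
from collections import defaultdict

class SoftBus:
    """Minimal topic-based publish/subscribe bus (illustrative only;
    a DDS-based soft-bus is far richer)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback for messages on a topic."""
        self._subs[topic].append(callback)

    def publish(self, topic, data):
        """Deliver data to every subscriber of the topic."""
        for cb in self._subs[topic]:
            cb(data)

# Hypothetical radar plug-ins exchanging data over the bus
bus = SoftBus()
received = []
bus.subscribe("radar/raw", lambda d: received.append(d))
bus.publish("radar/raw", {"pulse": 1})
print(received)  # [{'pulse': 1}]
```

Decoupling producers from consumers through topics is what lets plug-ins be added or replaced without changing the rest of the system.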


Proceedings ◽  
2019 ◽  
Vol 19 (1) ◽  
pp. 15 ◽
Author(s):  
José A. Navarro ◽  
María Cuevas ◽  
Roberto Tomás ◽  
Anna Barra ◽  
Michele Crosetto

The H2020 MOMIT project (Multi-scale Observation and Monitoring of railway Infrastructure Threats, http://www.momit-project.eu/) is focused on showing how remote sensing data and techniques may help to monitor railway infrastructures. One of the monitored hazards is ground movement near such infrastructure. Two methodologies targeted at the detection of Active Deformation Areas (ADA) and their later classification using Persistent Scatterers (PS) derived from Sentinel-1 imagery had been developed prior to the start of MOMIT. Although these procedures had already been validated, no tools automating their execution existed—they were applied manually using Geographic Information Systems (GIS). Such a manual process was slow and error-prone due to human intervention. This work presents two new applications, developed in the context of the MOMIT project, that automate the aforementioned methodologies: ADAfinder and ADAclassifier. Their goals were (1) to reduce the possibility of human error to a minimum and (2) to increase performance and reduce the time needed to obtain results, thus allowing more room for experimentation.
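The core of ADA detection is flagging persistent scatterers whose deformation velocity exceeds a significance threshold. The sketch below shows only that thresholding idea on invented numbers; ADAfinder's actual criteria (spatial clustering, noise filtering, quality indices) are considerably more involved, and the threshold value is an assumption.

```python
import numpy as np

def find_active_points(velocities, threshold=5.0):
    """Flag persistent scatterers whose deformation velocity magnitude
    (mm/yr) exceeds a threshold. Illustrative stand-in for ADAfinder's
    first detection step."""
    velocities = np.asarray(velocities, dtype=float)
    return np.abs(velocities) > threshold

# Hypothetical PS velocities (mm/yr) from a Sentinel-1 stack
vels = [-1.2, 0.4, -7.8, 6.1, 0.0]
mask = find_active_points(vels)
print(mask.tolist())  # [False, False, True, True, False]
```

In the real workflow the flagged points would then be grouped into areas and passed to ADAclassifier for interpretation.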


2020 ◽  
Vol 14 (1) ◽  
pp. 113-118 ◽  
Author(s):  
Y. Facio ◽  
M. Berber

Post Processed Static (PPS) and Precise Point Positioning (PPP) techniques are not new; however, they have been refined over the decades. Today these techniques are offered online via GPS (Global Positioning System) data processing services. In this study, one Post Processed Static service (OPUS) and one Precise Point Positioning service (CSRS-PPP) are used to process 24 h GPS data for CORS (Continuously Operating Reference Stations) station P565 over the duration of year 2016. By analyzing the results returned by these two online services, subsidence at the location of CORS station P565 is determined as 3–4 cm for the entire year of 2016. In addition, the precision of these two techniques is determined as ∼2 cm. The accuracy of the PPS and PPP results is 0.46 cm and 1.21 cm, respectively. Finally, the two techniques are compared and the variation between them is determined as 2.5 cm.
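The precision and accuracy figures quoted above come from comparing repeated position estimates against each other and against a reference. A common way to compute them is sketched below on synthetic numbers; the heights and reference value are invented, and the abstract's figures would come from real OPUS and CSRS-PPP coordinate series.

```python
import numpy as np

def precision_and_accuracy(estimates, reference):
    """Precision as the standard deviation of repeated estimates;
    accuracy as the RMS departure from a reference value (same units
    as the inputs). One common convention, assumed here."""
    estimates = np.asarray(estimates, dtype=float)
    precision = estimates.std()
    accuracy = np.sqrt(np.mean((estimates - reference) ** 2))
    return precision, accuracy

# Synthetic daily height estimates (cm) around a known reference of 10.0 cm
heights = [9.5, 10.5, 10.0, 9.8, 10.2]
p, a = precision_and_accuracy(heights, 10.0)
```

Here precision and accuracy coincide because the synthetic estimates are unbiased; a systematic offset from the reference would raise the RMS accuracy figure while leaving the precision unchanged.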


1984 ◽  
Vol 1 (1) ◽  
pp. 26-38 ◽  
Author(s):  
Robert Kowalski

The Japanese Fifth Generation Computer Systems (FGCS) project has chosen logic programming for its core programming language. It has recognized the major contribution that logic programming has to make not only in artificial intelligence but in database systems and software specification as well. It has recognized and intends to exploit the greater potential that logic programming has to offer for taking advantage of the parallelism possible with innovative multiprocessor computer architectures.
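The flavor of logic programming the FGCS project championed is declarative rules plus inference. As a toy illustration only, the loop below performs naive forward chaining over propositional Horn clauses in Python; the facts and rules are invented, and FGCS actually targeted concurrent Prolog-style languages on parallel hardware, not this sequential sketch.

```python
# Each rule is (body, head): if every proposition in the body is a known
# fact, the head may be derived.
rules = [
    ({"parent", "female"}, "mother"),   # parent & female -> mother
    ({"mother"}, "ancestor"),           # mother -> ancestor
]
facts = {"parent", "female"}

# Naive forward chaining: apply rules until no new fact is derived.
changed = True
while changed:
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print(sorted(facts))  # ['ancestor', 'female', 'mother', 'parent']
```

The appeal for parallel architectures is visible even here: independent rule applications carry no shared mutable state beyond the fact set, so they can in principle be evaluated concurrently.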

