Architectural Design and Complexity Analysis of Large-Scale Cortical Simulation on a Hybrid Computing Platform

Author(s):  
Qing Wu ◽  
Qinru Qiu ◽  
Richard Linderman ◽  
Daniel Burns ◽  
Michael Moore ◽  
...  
2020 ◽  
Vol 29 (2) ◽  
pp. 1-24
Author(s):  
Yangguang Li ◽  
Zhen Ming (Jack) Jiang ◽  
Heng Li ◽  
Ahmed E. Hassan ◽  
Cheng He ◽  
...  

2014 ◽  
Vol 687-691 ◽  
pp. 3733-3737
Author(s):  
Dan Wu ◽  
Ming Quan Zhou ◽  
Rong Fang Bie

Massive image processing places high demands on processor performance and memory capacity, requiring a high-performance processor and large-capacity memory; a single processor, or a single-core processor with traditional memory, cannot satisfy the needs of such image processing. This paper introduces cloud computing into a massive image processing system. Through the cloud computing function, the system expands its virtual space, saves computing resources, and improves image processing efficiency. The system processor is a multi-core DSP parallel processor, and a visualization window for parameter setting and result output is developed with VC software. Simulation yields the image processing speed curve and the system's image adaptation curve, providing a technical reference for the design of large-scale image processing systems.
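The abstract describes parallel image processing on a multi-core processor but gives no implementation details. The following is a minimal sketch of the general idea on a general-purpose CPU, splitting an image into row bands and filtering them in parallel worker processes; the box-blur kernel, the band layout, and the worker count are illustrative assumptions, not the paper's DSP design.

```python
# Sketch of band-parallel image processing (illustrative only; the paper
# targets a multi-core DSP, not a general-purpose CPU).
from multiprocessing import Pool

import numpy as np


def process_band(band: np.ndarray) -> np.ndarray:
    """Example per-band kernel: a 3x3 box blur (assumed filter).

    Each band is padded independently, so pixels at band boundaries differ
    slightly from a whole-image blur; a real implementation would exchange
    halo rows between neighbouring bands.
    """
    padded = np.pad(band, 1, mode="edge")
    out = sum(
        padded[i:i + band.shape[0], j:j + band.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    return out.astype(band.dtype)


def parallel_process(image: np.ndarray, workers: int = 4) -> np.ndarray:
    # Split the image into horizontal bands, one per worker process.
    bands = np.array_split(image, workers, axis=0)
    with Pool(workers) as pool:
        results = pool.map(process_band, bands)
    return np.vstack(results)


if __name__ == "__main__":
    img = np.random.randint(0, 256, (1024, 1024)).astype(np.float32)
    blurred = parallel_process(img, workers=4)
    print(blurred.shape)
```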


2017 ◽  
Vol 2 (3) ◽  
pp. 103
Author(s):  
Uwe Rieger

With the current exponential growth in the sector of spatial data technology and mixed reality display devices, we experience an increasing overlap of the physical and digital worlds. Next to making data spatially visible, the attempt is to connect digital information with physical properties. Over the past years a number of research institutions have been laying the ground for these developments. In contemporary architectural design, the dominant applications of data technology are connected to graphical presentation, form finding, and digital fabrication.

The arc/sec Lab for Digital Spatial Operations at the University of Auckland takes a further step. The Lab explores concepts for a new condition of buildings and urban patterns in which digital information is connected with spatial appearance and linked to material properties. The approach focuses on the step beyond digital representation and digital fabrication, where data is reconnected to multi-sensory human perception and physical skills. The work at the Lab is conducted in a cross-disciplinary design environment and is based on experiential investigations. The arc/sec Lab utilizes large-scale interactive installations as the driving vehicle for the exploration and communication of new dimensions in architectural space. The experiments aim to make data "touchable" and to demonstrate real-time responsive environments. In parallel, they are the starting point both for the development of practice-oriented applications and for speculation on how our cities and buildings might change in the future.

The article gives an overview of the current experiments being undertaken at the arc/sec Lab. It discusses how digital technologies allow for innovation between the disciplines by introducing real-time adaptive behaviours to our built environment, and it speculates on the type of spaces we can construct when digital matter is used as a new dynamic building material.


2014 ◽  
Vol 519-520 ◽  
pp. 1451-1454 ◽  
Author(s):  
Ya Kun Shi

While building information modeling (BIM) technology is widely used in the construction industries of developed countries in Europe and the United States, its large-scale adoption in China remains difficult, and the gap with the foreign state of the art continues to widen. To identify and address the problems behind this status quo, this paper examines integrated information in China's architectural design from a project management point of view, analyzes the current state and development obstacles of BIM technology in domestic project management, describes the mutual relations among the parties involved in BIM-based technology, and discusses the prospects for its application in China.


Author(s):  
A. Nascetti ◽  
M. Di Rita ◽  
R. Ravanelli ◽  
M. Amicuzi ◽  
S. Esposito ◽  
...  

The high-performance cloud computing platform Google Earth Engine (GEE) has been developed for global-scale analysis based on Earth observation data. In this work, the geometric accuracy of the two most widely used nearly global free DSMs (SRTM and ASTER) has been evaluated over the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino-Alto Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphologies, land covers, and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. DSM accuracy has been evaluated by computing standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy, in terms of standard deviation and NMAD, ranges for SRTM from 2-3 metres in the first slope class to about 45 metres in the last one, whereas for ASTER the values range from 5-6 to 30 metres.

In general, the analysis shows better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.
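As a rough illustration of this kind of assessment, the sketch below uses the Earth Engine Python API to difference SRTM against the USGS NED reference and compute the standard deviation of the elevation error within slope classes. The asset IDs, the test region, the slope-class boundaries, and the 30 m analysis scale are assumptions based on the public GEE catalog, not the paper's exact processing chain (which also computes the NMAD and uses LiDAR as a second reference).

```python
# Sketch: per-slope-class DSM error statistics in Google Earth Engine.
# Assumed asset IDs: 'USGS/SRTMGL1_003' (SRTM 30 m, evaluated DSM) and
# 'USGS/NED' (reference); the paper's actual datasets may differ.
import ee

ee.Initialize()

srtm = ee.Image('USGS/SRTMGL1_003')
ned = ee.Image('USGS/NED')

# Elevation error of the evaluated DSM with respect to the reference.
error = srtm.subtract(ned).rename('error')

# Slope (in degrees) derived from the reference DSM.
slope = ee.Terrain.slope(ned)

# Hypothetical test area (roughly Colorado).
region = ee.Geometry.Rectangle([-109.05, 37.0, -102.05, 41.0])

# Standard deviation of the error inside each slope class.
classes = [(0, 5), (5, 10), (10, 20), (20, 40), (40, 90)]
for lo, hi in classes:
    mask = slope.gte(lo).And(slope.lt(hi))
    stats = error.updateMask(mask).reduceRegion(
        reducer=ee.Reducer.stdDev(),
        geometry=region,
        scale=30,          # analysis resolution in metres (assumed)
        maxPixels=1e13,
        bestEffort=True,
    )
    print(f'slope {lo}-{hi} deg: std = {stats.getInfo()}')
```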


Author(s):  
Yingxu Wang ◽  
Vincent Chiew

Functional complexity is one of the most fundamental properties of software, because almost all other software attributes and properties, such as functional size, development effort, cost, quality, and project duration, are highly dependent on it. The functional complexity of software is a macro-scope problem concerning the semantic properties of software and human cognitive complexity towards a given software system, while computational complexity is a micro-scope problem concerning algorithmic analysis of machine throughput and time/space efficiency. This paper presents an empirical study of the functional complexity of software, known as cognitive complexity, based on large-scale samples using a Software Cognitive Complexity Analysis Tool (SCCAT). Empirical data are obtained with SCCAT on 7,531 programs and five formally specified software systems. The theoretical foundation of software functional complexity is introduced and the metric of software cognitive complexity is formally modeled. The functional complexities of a large-scale software system and of an air traffic control system (ATCS) are rigorously analyzed. A novel approach to representing software functional complexities and their distributions in software systems is developed. The nature of the functional complexity of software in software engineering is rigorously explained, and the relationship between the symbolic and functional complexities of software is quantitatively analyzed.
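For a concrete feel of this style of metric, the sketch below scores a code fragment by summing cognitive weights over its control structures and scales the result by the number of inputs and outputs, loosely following Wang's published cognitive-weight model. The weight table, the AST mapping, and the helper names are assumptions for illustration; they are not SCCAT's actual implementation.

```python
# Sketch: cognitive-weight scoring of Python code, loosely after Wang's
# cognitive complexity model. Weights and AST mapping are assumed.
import ast

# Assumed cognitive weights of basic control structures.
WEIGHTS = {
    ast.If: 2,        # branch
    ast.For: 3,       # iteration
    ast.While: 3,     # iteration
    ast.Try: 3,       # exception handling (assumed weight)
    ast.Call: 2,      # function call
}


def cognitive_weight(source: str) -> int:
    """Sum cognitive weights over a fragment; the enclosing sequence weighs 1."""
    total = 1
    for node in ast.walk(ast.parse(source)):
        total += WEIGHTS.get(type(node), 0)
    return total


def cognitive_functional_size(n_inputs: int, n_outputs: int, source: str) -> int:
    """CFS = (Ni + No) * Wc, with Wc the total cognitive weight (after Wang)."""
    return (n_inputs + n_outputs) * cognitive_weight(source)


if __name__ == "__main__":
    snippet = """
def classify(x):
    if x < 0:
        return -1
    for _ in range(3):
        x += 1
    return x
"""
    # One input, one output; prints the fragment's cognitive functional size.
    print(cognitive_functional_size(1, 1, snippet))
```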


Author(s):  
Holger Giese ◽  
Stefan Henkler ◽  
Martin Hirsch ◽  
Vladimir Rubin ◽  
Matthias Tichy

Software has become the driving force in the evolution of many systems, such as embedded systems (especially automotive applications), telecommunication systems, and large-scale heterogeneous information systems. These so-called software-intensive systems are characterized by the fact that software influences the design, construction, deployment, and evolution of the whole system. Furthermore, the development of these systems often involves a multitude of disciplines. Besides the traditional engineering disciplines (e.g., control engineering, electrical engineering, and mechanical engineering) that address the hardware and its control, the system often has to be aligned with the organizational structures and workflows addressed by business process engineering. The development artefacts of all these disciplines have to be combined and integrated in the software. Consequently, software engineering assumes the central role in the development of these systems. The development of software-intensive systems is further complicated by the fact that future generations of software-intensive systems will become even more complex and thus pose a number of challenges for the software and its integration with the other disciplines. It is expected that systems will become highly distributed, exhibit adaptive and anticipatory behavior, and act in highly dynamic environments interfacing with the physical world. Consequently, modeling, as an essential design activity, has to support not only the different disciplines but also the new characteristics outlined above. Tool support for model-driven engineering with this mix of composed models is essential to realize the full potential of software-intensive systems. In addition, modeling activities have to cover different development phases, such as requirements analysis, architectural design, and detailed design, and have to support later phases, such as implementation as well as verification and validation, so that systems can be developed systematically and efficiently.

