Efficient data management in a large-scale epidemiology research project

2012 ◽  
Vol 107 (3) ◽  
pp. 425-435 ◽  
Author(s):  
Jens Meyer ◽  
Stefan Ostrzinski ◽  
Daniel Fredrich ◽  
Christoph Havemann ◽  
Janina Krafczyk ◽  
...  
2013 ◽  
Vol 831 ◽  
pp. 276-281
Author(s):  
Ya Jie Ma ◽  
Zhi Jian Mei ◽  
Xiang Chuan Tian

Large-scale sensor networks are systems in which a large number of high-throughput autonomous sensor nodes are distributed over wide areas. Much attention has been paid to providing efficient data management in such systems. A sensor grid provides low-cost, high-performance computing over physical-world data perceived through sensors. This article analyses the challenges a real-time sensor grid faces in large-scale air pollution data management. A sensor grid architecture for pollution data management is proposed, and the processing of the service-oriented grid management is described in pseudocode. A simulation experiment investigates the performance of data management in such a system.
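The service-oriented processing the abstract mentions can be sketched roughly as follows. This is a minimal illustration, not the authors' pseudocode: all names (Reading, GridService, the "mean_pm25" service) are assumptions invented for the example. The idea is that sensor nodes push readings into a grid front end, and named services are invoked over the pooled data.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Callable, Dict, List


@dataclass
class Reading:
    """One measurement from an autonomous sensor node."""
    node_id: str
    pollutant: str   # e.g. "PM2.5"
    value: float


class GridService:
    """Toy service-oriented grid front end over pooled sensor readings."""

    def __init__(self) -> None:
        self.readings: List[Reading] = []
        self.services: Dict[str, Callable[[List[Reading]], float]] = {}

    def ingest(self, reading: Reading) -> None:
        # Sensor nodes push their readings into the shared pool.
        self.readings.append(reading)

    def register(self, name: str, handler: Callable[[List[Reading]], float]) -> None:
        # Expose a named service computed over the pooled data.
        self.services[name] = handler

    def invoke(self, name: str) -> float:
        return self.services[name](self.readings)


grid = GridService()
grid.register("mean_pm25",
              lambda rs: mean(r.value for r in rs if r.pollutant == "PM2.5"))
grid.ingest(Reading("node-1", "PM2.5", 12.0))
grid.ingest(Reading("node-2", "PM2.5", 18.0))
result = grid.invoke("mean_pm25")  # 15.0
```

A real sensor grid would distribute ingestion and service execution across grid nodes; this sketch only shows the service-registration pattern in a single process.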


2001 ◽  
Author(s):  
Bradley Olson ◽  
Leonard Jason ◽  
Joseph R. Ferrari ◽  
Leon Venable ◽  
Bertel F. Williams ◽  
...  

SoftwareX ◽  
2021 ◽  
Vol 15 ◽  
pp. 100747
Author(s):  
José Daniel Lara ◽  
Clayton Barrows ◽  
Daniel Thom ◽  
Dheepak Krishnamurthy ◽  
Duncan Callaway

2021 ◽  
Vol 13 (7) ◽  
pp. 1367
Author(s):  
Yuanzhi Cai ◽  
Hong Huang ◽  
Kaiyang Wang ◽  
Cheng Zhang ◽  
Lei Fan ◽  
...  

Over the last decade, 3D reconstruction techniques have been developed to present the latest as-is information for various objects and to build city information models. Meanwhile, deep-learning-based approaches are employed to add semantic information to the models. Studies have shown that model accuracy can be improved by combining multiple data channels (e.g., XYZ, Intensity, D, and RGB). Nevertheless, redundant data channels in large-scale datasets may incur high computational cost and time during data processing. Few researchers have addressed the question of which combination of channels is optimal in terms of overall accuracy (OA) and mean intersection over union (mIoU). Therefore, a framework is proposed to explore an efficient data fusion approach for semantic segmentation by selecting an optimal combination of data channels. In the framework, a total of 13 channel combinations are investigated to pre-process the data, and the encoder-to-decoder structure is utilized for network permutations. A case study investigates the efficiency of the proposed approach by adopting a city-level benchmark dataset and applying nine networks. It is found that the combination of IRGB channels provides the best OA performance, while IRGBD channels provide the best mIoU performance.
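The channel-selection step at the heart of this framework can be sketched as a simple column slice over a per-point feature array. The channel ordering below is an assumption for illustration only; the paper's actual data layout and the 13 specific combinations it tests are not reproduced here.

```python
import numpy as np

# Assumed per-point channel layout: one column per channel.
CHANNELS = ["X", "Y", "Z", "I", "R", "G", "B", "D"]


def select_channels(points: np.ndarray, combo: str) -> np.ndarray:
    """Keep only the columns named in `combo` (e.g. "IRGB"), in that order."""
    idx = [CHANNELS.index(c) for c in combo]
    return points[:, idx]


# Toy point cloud: 4 points, 8 channels each.
cloud = np.arange(32, dtype=float).reshape(4, 8)

irgb = select_channels(cloud, "IRGB")    # shape (4, 4): Intensity + RGB
irgbd = select_channels(cloud, "IRGBD")  # shape (4, 5): adds depth/D
```

Each reduced array would then be fed to the segmentation networks, and OA/mIoU compared across combinations to pick the most efficient fusion.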


2015 ◽  
Vol 43 (3) ◽  
pp. 7-14 ◽  
Author(s):  
Jim Moffatt

Purpose – This case example looks at how Deloitte Consulting applies the Three Rules synthesized by Michael Raynor and Mumtaz Ahmed based on their large-scale research project that identified patterns in the way exceptional companies think.
Design/methodology/approach – The Three Rules concept is a key piece of Deloitte Consulting’s thought leadership program. So how are the three rules helping the organization perform? Now that research has shown how exceptional companies think, CEO Jim Moffatt could address the question, “Does Deloitte think like an exceptional company?”
Findings – Deloitte has had success with an approach that promotes a bias towards non-price value over price and revenue over costs.
Practical implications – It’s critical that all decision makers in an organization understand how decisions that are consistent with the three rules have contributed to past success as well as how they can apply the rules to difficult challenges they face today.
Originality/value – This is the first case study written from a CEO’s perspective that looks at how the Three Rules approach of Michael Raynor and Mumtaz Ahmed can foster a firm’s growth and exceptional performance.

