PRELIMINARY STUDY ON THE PREVENTIVE PROTECTION OF ROCKERY IN THE MOUNTAIN RESORT BASED ON THE COMPREHENSIVE DIGITAL TECHNOLOGY

Author(s):  
X. Wu ◽  
L. Xie ◽  
D. Chen ◽  
Y. Wang ◽  
X. Wang ◽  
...  

Abstract. The concept of preventive protection originates in the European heritage-protection field. A complete knowledge system and normative reference framework have since been established for museum collections, but preventive protection is only beginning in the field of traditional garden heritage. Building on a review of international theory and practice in the preventive protection of historic buildings and traditional gardens, this paper uses current comprehensive digital mapping technology to produce a full survey, photographic and modelling record of the rockeries of the Mountain Resort. Combining these records with historical archives, it analyses and evaluates the authenticity, deterioration and safety of the rockeries as they stand today; taking Jinshan as an example, it summarizes the main points of digital surveying and mapping for large-scale rockery; finally, it summarizes the current problems of the rockeries and puts forward preventive protection suggestions. It is hoped that this study will enrich the case base for the preventive protection of Qing Dynasty royal garden heritage and provide a new approach to its protection.

1991 ◽  
Vol 44 (1) ◽  
pp. 58-66
Author(s):  
D. G. McCallum

The Automobile Association, Ordnance Survey, Philips BV and Robert Bosch GmbH are collaborating in a project to create and test a prototype navigation database. The project (PANDORA: Prototyping A Navigation Database Of Road-network Attributes), which is being managed by consultants MVA Systematica, is supported by the European Community's DRIVE initiative. The data requirements of future vehicle navigation systems such as CARIN from Philips, Travelpilot/EVA from Bosch and the Autoguide scheme for London have been examined. Digital street networks have been extracted mainly from Ordnance Survey's large-scale digital mapping, and the necessary road and traffic attributes have been collected by the AA. These data have been integrated into a specially designed prototype database covering parts of London and Birmingham and the major interconnecting roads. Data have been abstracted from the database and supplied to Bosch and Philips using the Geographic Data Files (GDF) standard developed in the DEMETER project. This dataset is being tested in field trials using prototype vehicle navigation systems. The dataset will also be provided to the DRIVE project ‘Task Force European Digital Road Map’ as benchmark test task number 12. This paper describes the project, dealing with its objectives and relationship to other European initiatives, the work undertaken, the standards utilized and developed, its results and conclusions, and the lessons learned with respect to the provision of data for larger areas of Europe. A glossary of technical terms and abbreviations is also included.
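The routing queries such a navigation database must ultimately answer reduce to shortest-path search over an attributed road graph. A minimal sketch in Python (the network, attribute names, and values are hypothetical examples, not PANDORA/GDF data):

```python
import heapq

# Hypothetical miniature road network: each edge carries attributes of the
# kind PANDORA collected (length in metres, one-way flag, road class).
ROADS = {
    ("A", "B"): {"length": 400, "oneway": False, "class": "motorway"},
    ("B", "C"): {"length": 250, "oneway": False, "class": "a_road"},
    ("A", "C"): {"length": 900, "oneway": False, "class": "b_road"},
}

def build_adjacency(roads):
    """Expand the edge table into an adjacency list, honouring one-way flags."""
    adj = {}
    for (u, v), attrs in roads.items():
        adj.setdefault(u, []).append((v, attrs["length"]))
        if not attrs["oneway"]:
            adj.setdefault(v, []).append((u, attrs["length"]))
    return adj

def shortest_route(roads, start, goal):
    """Plain Dijkstra over edge lengths: the core of a route-guidance query."""
    adj = build_adjacency(roads)
    dist = {start: 0}
    prev = {}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, length in adj.get(node, []):
            nd = d + length
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(queue, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

path, metres = shortest_route(ROADS, "A", "C")  # A -> B -> C, 650 m
```

Real GDF data additionally encodes turn restrictions and traffic attributes, which would enter the cost function rather than the graph structure.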


2012 ◽  
Vol 33 (2) ◽  
pp. 50-54 ◽  
Author(s):  
Birutė Ruzgienė ◽  
Edita Aleknienė

Up-to-date mapping technologies are in transition from analytical to digital. The adoption of new methods and technologies reflects the desire to increase mapping capability; nevertheless, analytical and digital methods may be used simultaneously to obtain more efficient results. The research objective is to present some aspects of the functionality of the digital and analytical photogrammetric mapping approaches in generating 3D geodata. The experimental results show which of the two methods leads to more flexible mapping production with regard to four criteria: accuracy, flexibility, time and cost. The main finding is that orthophoto generation is performed successfully by fully automatic systems. The digital terrain models created by the two technologies are almost equivalent in terms of time consumption, although the Digital Photogrammetric System requires more time when the terrain is rougher. Although digital photogrammetric mapping technology is developing rapidly, analytical photogrammetry undoubtedly remains a significant production system for large-scale mapping. The results demonstrate little difference in accuracy between analytical processing and digital processing of 14 μm pixel-size imagery. Interpretation of the experimental test area was more complicated in the Digital Photogrammetric System than with the analytical plotter. Integration of the two systems is foreseen: a digital terrain model obtained with the analytical plotter can be transferred to the digital mapping system for orthophoto generation.
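The accuracy comparison above hinges on how a scanned pixel maps onto the ground. A minimal sketch of that relationship (only the 14 μm pixel size comes from the study; the 1:10,000 photo scale is an assumed example):

```python
def ground_sample_distance(pixel_size_um, scale_denominator):
    """Ground footprint of one pixel, in metres, for a scanned aerial image:
    the pixel size on film multiplied by the photo-scale denominator."""
    return pixel_size_um * 1e-6 * scale_denominator

# 14 µm pixels at an assumed 1:10,000 photo scale -> 0.14 m on the ground.
gsd = ground_sample_distance(14, 10_000)
```

This is why pixel size, not the digital/analytical distinction itself, bounds the achievable planimetric accuracy of the digital workflow.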


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4206
Author(s):  
Farhan Nawaz ◽  
Hemant Kumar ◽  
Syed Ali Hassan ◽  
Haejoon Jung

Enabled by fifth-generation (5G) and beyond-5G communications, large-scale deployments of Internet-of-Things (IoT) networks are expected in various application fields to handle massive machine-type communication (mMTC) services. Device-to-device (D2D) communications can be an effective solution in massive IoT networks to overcome the inherent hardware limitations of small devices. In such D2D scenarios, given that a receiver can benefit from a signal-to-noise-ratio (SNR) advantage through diversity and array gains, cooperative transmission (CT) can be employed so that multiple IoT nodes create a virtual antenna array. In particular, the Opportunistic Large Array (OLA), one type of CT technique, is known to provide fast, energy-efficient, and reliable broadcasting and unicasting without prior coordination, which can be exploited in future mMTC applications. However, OLA-based protocol design and operation depend on network models that characterize the propagation behavior and allow the performance to be evaluated. Further, experimental studies have shown that the model most widely used in prior work on OLA is not accurate for networks with low node density. Therefore, stochastic models based on quasi-stationary Markov chains have been introduced; these are more complex, but estimate the key performance metrics of OLA transmissions in practice more exactly. Considering that such propagation models should be selected carefully depending on system parameters such as network topology and channel environment, we provide a comprehensive survey of the analytical models and frameworks for OLA propagation in the literature, which is not available in the existing survey papers on OLA protocols. In addition, we introduce energy-efficient OLA techniques, which are of paramount importance in energy-limited IoT networks. Furthermore, we discuss future research directions for combining OLA with emerging technologies.
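The deterministic OLA propagation model that the stochastic Markov-chain approach refines can be sketched in a few lines: a node decodes in hop k+1 when the aggregate received power from all hop-k relays clears a threshold, and every new decoder relays concurrently in the next slot. A toy 1-D illustration (topology, transmit power, threshold, and path-loss exponent are all hypothetical):

```python
def ola_broadcast(positions, tx_power=1.0, threshold=0.05, path_loss_exp=2.0):
    """Deterministic OLA sketch: hop-by-hop decoding levels on a 1-D network.
    A node decodes when the summed received power from the previous decoding
    level meets the threshold; all nodes of a level relay simultaneously."""
    decoded = {0}          # node 0 is the source
    levels = [{0}]
    while True:
        current = levels[-1]
        new = set()
        for i, pos in enumerate(positions):
            if i in decoded:
                continue
            # Aggregate power from all concurrent relays of the last level.
            power = sum(
                tx_power / max(abs(pos - positions[j]), 1e-3) ** path_loss_exp
                for j in current
            )
            if power >= threshold:
                new.add(i)
        if not new:
            break
        decoded |= new
        levels.append(new)
    return levels

positions = [0, 1, 2, 3, 4, 5]   # hypothetical unit-spaced node positions
levels = ola_broadcast(positions)
```

With these numbers the broadcast reaches node 5 in two hops even though node 5 cannot hear the source directly, illustrating the array gain of concurrent relaying; the Markov-chain models in the survey replace this deterministic threshold with the decoding statistics of randomly placed nodes.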


2021 ◽  
Vol 40 (3) ◽  
Author(s):  
Zhiyu Wang ◽  
Jingyu Wu ◽  
Guang Yu ◽  
Zhiping Song

In traditional historical research, interpreting historical documents subjectively and manually causes problems such as one-sided understanding, selective analysis, and one-way knowledge connection. In this study, we aim to use machine learning to automatically analyze and explore historical documents from a text analysis and visualization perspective. This technology solves the problem of large-scale historical data analysis that is difficult for humans to read and intuitively understand. We use the historical documents of the Qing Dynasty Hetu Dangse, preserved in the Archives of Liaoning Province, as data analysis samples. China's Hetu Dangse is the largest Qing Dynasty thematic archive with Manchu and Chinese characters in the world. Through word frequency analysis, correlation analysis, co-word clustering, the word2vec model, and SVM (Support Vector Machine) algorithms, we visualize historical documents, reveal the relationships between the functions of government departments in the Shengjing area of the Qing Dynasty, achieve automatic classification of historical archives, improve the efficient use of historical materials, and build connections between items of historical knowledge. In this way, archivists can be guided practically in the management and compilation of historical materials.
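Two of the steps named above, word frequency analysis and co-word counting, can be sketched compactly. A toy illustration (the mini-corpus and its English tokens are invented; the study's actual pipeline over Manchu/Chinese text is not reproduced here):

```python
from collections import Counter
from itertools import combinations

# Hypothetical pre-segmented mini-corpus standing in for archive entries.
documents = [
    ["granary", "tax", "shengjing", "grain"],
    ["granary", "grain", "transport"],
    ["tax", "shengjing", "household"],
]

def word_frequencies(docs):
    """Corpus-wide term counts: the 'word frequency analysis' step."""
    return Counter(word for doc in docs for word in doc)

def coword_counts(docs):
    """Document-level co-occurrence counts: the raw input to co-word
    clustering, counted once per document over unique terms."""
    pairs = Counter()
    for doc in docs:
        for a, b in combinations(sorted(set(doc)), 2):
            pairs[(a, b)] += 1
    return pairs

freqs = word_frequencies(documents)
cowords = coword_counts(documents)
```

In a real pipeline the co-word matrix would feed a clustering algorithm, and word2vec/SVM stages would operate on the same segmented tokens.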


Blood ◽  
2020 ◽  
Vol 136 (Supplement 1) ◽  
pp. 17-18
Author(s):  
Shaadi Mehr ◽  
Daniel Auclair ◽  
Mark Hamilton ◽  
Leon Rozenblit ◽  
Hearn Jay Cho ◽  
...  

Abstract: Title: Architecture of sample preparation and data governance of immuno-genomic data collected from bone marrow and peripheral blood samples obtained from multiple myeloma patients.
In multiple myeloma (MM), the interactions between malignant plasma cells and the bone marrow microenvironment are crucial to fully understanding tumor development, disease progression, and response to therapy. The core challenge in understanding those interactions has been the establishment of a standard process and a standard model for handling the data quality workflow and the underlying data models. Here we present the Cloud Immune-Precision Platform (Figure 1), an integrated data flow architecture designed to create data inventory and process tracking protocols for multi-dimensional and multi-technology immune data files. This system has been designed to inventory and track peripheral blood and bone marrow samples from multiple myeloma subjects submitted for immune analysis under the MMRF Immune Atlas initiative (Figure 2), and the processing and storage of single-cell RNA-seq (scRNA-seq) and mass cytometry time-of-flight (CyTOF) data files derived from these immune analyses. While these methods have been previously applied to both tumor and immune populations in MM [2,3], this level of multi-institutional, multi-technology integration is unique. The Cloud Immune-Precision platform contains standardized protocols and bioinformatics workflows for the identification and categorization of immune cell populations and functional states based upon scRNA-seq gene signatures (ref: Bioinformatics manuscript in submission) and CyTOF protein signatures. Upon further expansion, it will contain high-dimensional scRNA-seq and CyTOF immune data from both bone marrow and peripheral blood samples from myeloma patients enrolled in the Multiple Myeloma Research Foundation (MMRF) CoMMpass study (NCT01454297) [1] (Figure 3). 
The architecture covers the automation of data governance protocols, data transformation and ETL model developments that will create an immune proteomic and profiling database and its integration into clinical and genomics databases: e.g. the MMRF CoMMpass clinical trial. This large-scale data integration will establish a cutting-edge Immune-Precision central platform supporting large scale, immune-focused advanced analytics in multiple myeloma patients. This platform will allow researchers to interrogate the relationships between immune transcriptomic and proteomic signatures and tumor genomic features, and their impact on clinical outcomes, to aid in the optimization of therapy and therapeutic sequencing. Furthermore, this platform also promotes the potential to (further) elucidate the mechanisms-of-action of approved and experimental myeloma therapies, drive biomarker discovery, and identify new targets for drug discovery. Figure 1: Cloud Immune-Precision Platform (Integrated data flow architecture designed to create data inventory and process tracking protocols for multi-dimensional and multi-technology immune data files) Figure 2: Sample tracking process architecture Figure 3: Data file creation and repository process tracking References: 1- Settino, Marzia et al. "MMRF-CoMMpass Data Integration and Analysis for Identifying Prognostic Markers." Computational Science - ICCS 2020: 20th International Conference, Amsterdam, The Netherlands, June 3-5, 2020, Proceedings, Part III vol. 12139 564-571. 22 May. 2020, doi:10.1007/978-3-030-50420-5_42 2- Ledergor, Guy et al. "Single cell dissection of plasma cell heterogeneity in symptomatic and asymptomatic myeloma." Nature medicine vol. 24,12 (2018): 1867-1876. doi:10.1038/s41591-018-0269-2 3- Hansmann, Leo et al. "Mass cytometry analysis shows that a novel memory phenotype B cell is expanded in multiple myeloma." Cancer immunology research vol. 3,6 (2015): 650-60. 
doi:10.1158/2326-6066.CIR-14-0236-T Figure 1 Disclosures Bhasin: Canomiiks Inc: Current equity holder in private company, Other: Co-Founder. Dhodapkar:Amgen: Membership on an entity's Board of Directors or advisory committees, Other; Celgene/BMS: Membership on an entity's Board of Directors or advisory committees, Other; Janssen: Membership on an entity's Board of Directors or advisory committees, Other; Roche/Genentech: Membership on an entity's Board of Directors or advisory committees, Other; Lava Therapeutics: Membership on an entity's Board of Directors or advisory committees, Other; Kite: Membership on an entity's Board of Directors or advisory committees, Other.
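A sample inventory with per-assay process tracking, of the kind the platform description outlines, can be modelled minimally as follows (all record types, field names, and IDs are hypothetical illustrations, not the MMRF schema):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Sample:
    """One tracked specimen; field names are illustrative only."""
    sample_id: str
    subject_id: str
    tissue: str                 # e.g. "bone_marrow" or "peripheral_blood"
    collected: date
    assays: List[str] = field(default_factory=list)  # completed assays

@dataclass
class Inventory:
    """Registry supporting the basic tracking queries a governance
    workflow needs: registration, assay logging, and pending-work checks."""
    samples: dict = field(default_factory=dict)

    def register(self, sample: Sample):
        self.samples[sample.sample_id] = sample

    def track_assay(self, sample_id: str, assay: str):
        # Record that a data file for this assay now exists for the sample.
        self.samples[sample_id].assays.append(assay)

    def pending(self, assay: str):
        # Samples still missing a given assay: a basic QC/tracking query.
        return [s.sample_id for s in self.samples.values()
                if assay not in s.assays]

inv = Inventory()
inv.register(Sample("S1", "MM-001", "bone_marrow", date(2020, 3, 1)))
inv.register(Sample("S2", "MM-001", "peripheral_blood", date(2020, 3, 1)))
inv.track_assay("S1", "scRNA-seq")
```

A production system would add provenance, file checksums, and state-machine validation of the workflow steps; the sketch shows only the inventory/tracking core.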


2010 ◽  
pp. 1518-1542
Author(s):  
Janina Fengel ◽  
Heiko Paulheim ◽  
Michael Rebstock

Despite the development of e-business standards, the integration of business processes and business information systems remains a non-trivial issue when business partners use different e-business standards for formatting and describing the information to be processed. Since those standards can be understood as ontologies, ontological engineering technologies can be applied to process them, in particular ontology matching for reconciling them. However, as e-business standards tend to be rather large-scale ontologies, scalability is a crucial requirement. To meet this demand, we present our ORBI Ontology Mediator. It is linked with our Malasco system for partition-based ontology matching using currently available matching systems, which on their own do not scale well, if at all. In a case study we show how to provide dynamic semantic synchronization between business partners using different e-business standards, without initial ramp-up effort, based on ontological mapping technology combined with interactive user participation.
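The label-matching core of ontology matching can be illustrated with plain string similarity. A toy sketch (the two vocabularies and the threshold are invented; real systems such as Malasco combine many more matchers and partition large ontologies first):

```python
from difflib import SequenceMatcher

# Two hypothetical e-business vocabularies to be reconciled.
standard_a = ["InvoiceNumber", "DeliveryDate", "BuyerParty"]
standard_b = ["invoice_no", "delivery_date", "seller_party"]

def normalise(label):
    """Strip separators and case so surface differences do not dominate."""
    return label.replace("_", "").lower()

def match_labels(src, dst, threshold=0.7):
    """Greedy label matching: pair each source label with its most similar
    target label whenever the similarity ratio clears the threshold."""
    mappings = {}
    for s in src:
        best, score = None, 0.0
        for d in dst:
            r = SequenceMatcher(None, normalise(s), normalise(d)).ratio()
            if r > score:
                best, score = d, r
        if score >= threshold:
            mappings[s] = best
    return mappings

mappings = match_labels(standard_a, standard_b)
```

Note that "BuyerParty" is (correctly) left unmatched here: its best candidate, "seller_party", is lexically close but falls below the threshold, which is exactly the kind of ambiguous case where interactive user participation, as in ORBI, resolves the mapping.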


2015 ◽  
Vol 27 (10) ◽  
pp. 2039-2096 ◽  
Author(s):  
Frank-Michael Schleif ◽  
Peter Tino

Efficient learning of a data analysis task strongly depends on the data representation. Most methods rely on (symmetric) similarity or dissimilarity representations by means of metric inner products or distances, providing easy access to powerful mathematical formalisms like kernel or branch-and-bound approaches. Similarities and dissimilarities are, however, often naturally obtained by nonmetric proximity measures that cannot easily be handled by classical learning algorithms. Major efforts have been undertaken to provide approaches that can either directly be used for such data or to make standard methods available for these types of data. We provide a comprehensive survey for the field of learning with nonmetric proximities. First, we introduce the formalism used in nonmetric spaces and motivate specific treatments for nonmetric proximity data. Second, we provide a systematization of the various approaches. For each category of approaches, we provide a comparative discussion of the individual algorithms and address complexity issues and generalization properties. In a summarizing section, we provide a larger experimental study for the majority of the algorithms on standard data sets. We also address the problem of large-scale proximity learning, which is often overlooked in this context and of major importance to make the method relevant in practice. The algorithms we discuss are in general applicable for proximity-based clustering, one-class classification, classification, regression, and embedding approaches. In the experimental part, we focus on classification tasks.
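One standard family of techniques covered by such surveys makes nonmetric proximity data usable by kernel methods through eigenspectrum correction. A minimal sketch of the "clip" correction, which projects an indefinite similarity matrix onto the nearest positive semidefinite one (the similarity matrix is a made-up example with one negative eigenvalue):

```python
import numpy as np

def clip_eigenvalues(S):
    """Eigenvalue 'clip' correction: symmetrise, eigendecompose, zero the
    negative eigenvalues, and reassemble, yielding a valid kernel matrix."""
    S = (S + S.T) / 2.0                     # enforce symmetry first
    vals, vecs = np.linalg.eigh(S)
    vals_clipped = np.clip(vals, 0.0, None)  # remove the indefinite part
    return vecs @ np.diag(vals_clipped) @ vecs.T

# Hypothetical nonmetric similarity matrix (violates the triangle-like
# constraints badly enough to have a negative eigenvalue).
S = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])
S_psd = clip_eigenvalues(S)
```

Alternatives discussed in this literature include "flip" (take absolute values of eigenvalues) and "shift" (add a constant to the diagonal); each distorts the original proximities differently, which is part of what such a survey compares.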


2010 ◽  
Vol 107 (5) ◽  
pp. 2118-2123 ◽  
Author(s):  
Johannes Müller ◽  
Torsten M. Scheyer ◽  
Jason J. Head ◽  
Paul M. Barrett ◽  
Ingmar Werneburg ◽  
...  

The development of distinct regions in the amniote vertebral column results from somite formation and Hox gene expression, with the adult morphology displaying remarkable variation among lineages. Mammalian regionalization is reportedly very conservative or even constrained, but there has been no study investigating vertebral count variation across Amniota as a whole, undermining attempts to understand the phylogenetic, ecological, and developmental factors affecting vertebral column variation. Here, we show that the mammalian (synapsid) and reptilian lineages show early in their evolutionary histories clear divergences in axial developmental plasticity, in terms of both regionalization and meristic change, with basal synapsids sharing the conserved axial configuration of crown mammals, and basal reptiles demonstrating the plasticity of extant taxa. We conducted a comprehensive survey of presacral vertebral counts across 436 recent and extinct amniote taxa. Vertebral counts were mapped onto a generalized amniote phylogeny as well as individual ingroup trees, and ancestral states were reconstructed by using squared-change parsimony. We also calculated the relationship between presacral and cervical numbers to infer the relative influence of homeotic effects and meristic changes and found no correlation between somitogenesis and Hox-mediated regionalization. Although conservatism in presacral numbers characterized early synapsid lineages, in some cases reptiles and synapsids exhibit the same developmental innovations in response to similar selective pressures. Conversely, increases in body mass are not coupled with meristic or homeotic changes, but mostly occur in concert with postembryonic somatic growth. Our study highlights the importance of fossils in large-scale investigations of evolutionary developmental processes.
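Squared-change parsimony, used above for ancestral-state reconstruction, places each internal node at the mean of its neighbours at the optimum, so simple iterative averaging recovers the solution. A toy sketch with unit branch lengths (the tree and the vertebral counts are invented, not the study's 436-taxon data set):

```python
# Tiny unrooted tree: tips carry presacral vertebral counts; the two
# internal nodes "R" and "I" are the states to be reconstructed.
edges = [("R", "I"), ("I", "t1"), ("I", "t2"), ("R", "t3")]
tip_states = {"t1": 26.0, "t2": 27.0, "t3": 24.0}   # hypothetical counts

def squared_change_parsimony(edges, tip_states, iters=500):
    """Ancestral states minimising the sum of squared changes along edges.
    At the optimum every internal node equals the mean of its neighbours,
    so repeated local averaging (Gauss-Seidel) converges to the answer."""
    neighbours = {}
    for u, v in edges:
        neighbours.setdefault(u, []).append(v)
        neighbours.setdefault(v, []).append(u)
    states = dict(tip_states)
    internal = [n for n in neighbours if n not in tip_states]
    start = sum(tip_states.values()) / len(tip_states)
    for n in internal:
        states[n] = start                    # initial guess: tip mean
    for _ in range(iters):
        for n in internal:
            states[n] = sum(states[m] for m in neighbours[n]) / len(neighbours[n])
    return states

states = squared_change_parsimony(edges, tip_states)
# Here the optimum works out to I = 26.0 and R = 25.0.
```

With branch lengths, the averages become branch-length-weighted; the fixed-point structure of the solution is unchanged.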


2020 ◽  
Vol 8 (6) ◽  
pp. 385
Author(s):  
Elvira Armenio ◽  
Michele Mossa

Sustainable management of coastal areas involves dealing with problems such as coastal erosion, rapid growth in the rate of urbanization, tourism, and the environmental degradation associated with industrial and urban activities. Consideration must also be given to the effects of climate change, whose scenarios have significant consequences for coastal systems that are already extremely vulnerable and subject to many human pressures. Over the years, several international and national studies have been conducted to deepen the understanding of coastal processes. To date, despite considerable efforts, problems remain, and two priorities emerge: managing coastal risks and ensuring sustainable coastal management. In response to these challenges, it is worthwhile to elaborate an integrated methodology that, based on the collection, analysis and evaluation of data, may provide an effective guideline for the successful implementation of each action, while providing timely and targeted information for the adoption of governance strategies concerning the prevention and management of marine-coastal risks. In the present study, considering what has emerged from the major research projects in the coastal field during the last decades, a methodological proposal is outlined to pursue the principles of integrated coastal zone management (ICZM) and to combine the management of coastal risks with sustainable uses, focusing on the implementation scale.

