volume data
Recently Published Documents


TOTAL DOCUMENTS

1404
(FIVE YEARS 348)

H-INDEX

50
(FIVE YEARS 6)

2022 ◽  
Vol 14 (2) ◽  
pp. 390
Author(s):  
Dinh Ho Tong Minh ◽  
Yen-Nhi Ngo

Modern Synthetic Aperture Radar (SAR) missions provide an unprecedented, massive interferometric SAR (InSAR) time series. Processing this Big InSAR Data is challenging for long-term monitoring. Indeed, because most deformation phenomena develop slowly, the processing strategy can operate on reduced data volumes. This paper introduces a novel ComSAR algorithm based on a compression technique that reduces computational effort while maintaining robust performance. The algorithm divides the massive data set into many mini-stacks and then compresses them. The compressed estimator is close to the theoretical Cramér–Rao lower bound under a realistic C-band Sentinel-1 decorrelation scenario. Both persistent and distributed scatterers (PSDS) are exploited in the ComSAR algorithm. The ComSAR performance is validated via simulation and via application to Sentinel-1 data to map land subsidence in the Vauvert salt-mine area, France. The proposed ComSAR yields consistently better performance than the state-of-the-art PSDS technique. We release our PSDS and ComSAR algorithms as the open-source TomoSAR package. To make it practical, we build on other open-source projects so that users can apply the PSDS and ComSAR methods in an end-to-end processing chain. To our knowledge, TomoSAR is the first public-domain tool available to jointly handle PS and DS targets.
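The abstract does not spell out the compression step, but related phase-linking work compresses each mini-stack by projecting it onto the dominant eigenvector of its sample covariance matrix. The sketch below, in plain numpy, illustrates that idea on toy data; it is an assumed illustration, not the TomoSAR implementation, and the mini-stack size of 10 is arbitrary.

```python
import numpy as np

def compress_ministack(slc_pixels):
    """Compress one mini-stack of co-registered SLC samples for a pixel
    neighborhood into one complex image: project the mini-stack onto the
    dominant eigenvector of its sample covariance matrix (assumed scheme).
    slc_pixels: (n_images, n_looks) complex array."""
    cov = slc_pixels @ slc_pixels.conj().T / slc_pixels.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    kernel = eigvecs[:, -1]              # eigenvector of the largest eigenvalue
    return kernel.conj() @ slc_pixels    # compressed looks, shape (n_looks,)

def compress_stack(slc_stack, ministack_size=10):
    """Split a long time series (n_images, n_looks) into mini-stacks and
    compress each one, shrinking the temporal dimension."""
    n = slc_stack.shape[0]
    compressed = [compress_ministack(slc_stack[i:i + ministack_size])
                  for i in range(0, n, ministack_size)]
    return np.stack(compressed)          # (n_ministacks, n_looks)

# Toy usage: 100 acquisitions, 50 spatial looks per pixel neighborhood
rng = np.random.default_rng(0)
stack = rng.normal(size=(100, 50)) + 1j * rng.normal(size=(100, 50))
print(compress_stack(stack, ministack_size=10).shape)   # (10, 50)
```

The temporal dimension shrinks from 100 acquisitions to 10 compressed images, which is where the computational savings in such schemes come from.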


2022 ◽  
Vol 4 ◽  
Author(s):  
Kaiqi Zhang ◽  
Cole Hawkins ◽  
Zheng Zhang

A major challenge in many machine learning tasks is that the model's expressive power depends on the model size. Low-rank tensor methods are an efficient tool for handling the curse of dimensionality in many large-scale machine learning models. The major challenges in training a tensor learning model include how to process high-volume data, how to determine the tensor rank automatically, and how to estimate the uncertainty of the results. While existing tensor learning methods focus on a specific task, this paper proposes a generic Bayesian framework that can be employed to solve a broad class of tensor learning problems such as tensor completion, tensor regression, and tensorized neural networks. We develop a low-rank tensor prior for automatic rank determination in nonlinear problems. Our method is implemented with both stochastic gradient Hamiltonian Monte Carlo (SGHMC) and Stein Variational Gradient Descent (SVGD). We compare the automatic rank determination and uncertainty quantification of these two solvers. We demonstrate that our proposed method can determine the tensor rank automatically and can quantify the uncertainty of the obtained results. We validate our framework on tensor completion tasks and tensorized neural network training tasks.
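As a point of reference for the tensor completion task mentioned above, the sketch below fits a low-rank CP factorization to partially observed entries by plain gradient descent. It is only a toy illustration of the low-rank completion objective; the paper's Bayesian solvers (SGHMC and SVGD) and the automatic rank-determination prior are not reproduced here, and all sizes and step sizes are assumptions.

```python
import numpy as np

def cp_reconstruct(factors):
    """Reconstruct a 3-way tensor from CP factors A (I,R), B (J,R), C (K,R)."""
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_completion(T, mask, rank=3, lr=0.005, iters=5000, seed=0):
    """Toy low-rank (CP) tensor completion by gradient descent on the
    observed entries only (not the paper's Bayesian solvers)."""
    rng = np.random.default_rng(seed)
    factors = [rng.normal(scale=0.1, size=(dim, rank)) for dim in T.shape]
    for _ in range(iters):
        R = (cp_reconstruct(factors) - T) * mask   # residual on observed entries
        A, B, C = factors
        gA = np.einsum('ijk,jr,kr->ir', R, B, C)
        gB = np.einsum('ijk,ir,kr->jr', R, A, C)
        gC = np.einsum('ijk,ir,jr->kr', R, A, B)
        for F, g in zip(factors, (gA, gB, gC)):
            F -= lr * g
    return cp_reconstruct(factors)

# Toy usage: recover a rank-2 tensor with about 50% of entries observed
rng = np.random.default_rng(1)
true = cp_reconstruct([rng.normal(size=(8, 2)), rng.normal(size=(9, 2)), rng.normal(size=(7, 2))])
mask = rng.random(true.shape) < 0.5
est = cp_completion(true * mask, mask, rank=2)
print(np.abs(est - true)[~mask].mean())   # error on the unobserved entries
```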


Author(s):  
Saheli Banerjee ◽  
Alka B Garg ◽  
H. K. Poswal

Abstract In this article we report the synthesis, characterization, and high-pressure investigation of the technologically important rare-earth orthotantalate EuTaO4. A single-phase polycrystalline sample of EuTaO4 was synthesized by the solid-state reaction method and adopts the monoclinic M'-type fergusonite phase with space group P2/c. Structural and vibrational properties of the synthesized compound are investigated using synchrotron-based x-ray powder diffraction and Raman spectroscopy, respectively. Both techniques show the presence of an isostructural, first-order, reversible phase transition near 17 GPa. The bulk moduli obtained by fitting the experimental pressure-volume data are 136.0(3) GPa for the low-pressure phase and 162.8(21) GPa for the high-pressure phase. The high-pressure phase is accompanied by an increase in the coordination number of the Ta atom from 6 to 8. First-principles calculations within the framework of density functional theory (DFT) also predict the isostructural phase transition and the change in coordination around the Ta atom, corroborating the experimental findings.
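The abstract does not state which equation of state was fitted to the pressure-volume data; a common choice for such fits is the third-order Birch-Murnaghan equation of state, sketched below with scipy on hypothetical P-V points (the numbers are placeholders, not the measured EuTaO4 data).

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, V0, B0, B0p):
    """Third-order Birch-Murnaghan equation of state: pressure as a function
    of volume, with zero-pressure volume V0, bulk modulus B0, and its
    pressure derivative B0' as fit parameters."""
    eta = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * B0 * (eta**7 - eta**5) * (1.0 + 0.75 * (B0p - 4.0) * (eta**2 - 1.0))

# Hypothetical pressure-volume data (GPa, Å^3) standing in for the measurements
V = np.array([305.0, 300.1, 295.4, 290.8, 286.5, 282.3])
P = np.array([0.0, 2.4, 5.1, 8.1, 11.3, 14.8])

popt, pcov = curve_fit(birch_murnaghan, V, P, p0=(V.max(), 140.0, 4.0))
V0, B0, B0p = popt
print(f"V0 = {V0:.1f} Å^3, B0 = {B0:.1f} GPa, B0' = {B0p:.2f}")
```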


FLORESTA ◽  
2022 ◽  
Vol 52 (1) ◽  
pp. 159
Author(s):  
Stephany Diolino Cunha ◽  
Vagner Santiago Do Vale ◽  
Tatiana Vieira Ramos ◽  
Matheus Da Silva Araújo

Due to its positive impact on the Brazilian economy, eucalyptus is currently the most widely used forest species in the country. The objective of this work was to evaluate different hypsometric and volumetric models for Eucalyptus urograndis clones (Eucalyptus urophylla S.T. Blake × Eucalyptus grandis W. Hill ex Maiden) in a Crop-Forest Integration (CFI) system. The trees were evaluated at 7 years of age and arranged in double rows, occupying 20.76% of the total system area. The individuals were subjected to rigorous volumetric cubing by the Smalian method at intervals of one meter up to total height. The following models were evaluated for the collected height data: Linear, Trorey, Stoffels, Curtis, Henriksen, Prodan, Chapman & Richards, Petterson, and Bailey & Clutter. The Spurr, Hohenadl-Krenn, Stoate, Schumacher-Hall, Meyer, Husch, Ogaya, and Takata models were fitted to the volume data. The fits were assessed through the coefficient of determination (R²), the standard error of the estimate as a percentage (Syx%), the significance of the regression coefficients (β), and the graphical distribution of residuals. Among the tested hypsometric models, the Prodan equation best fit the data, with R² = 0.89, while the best volumetric fit was obtained with the Meyer model, with R² = 0.99. All evaluated models were efficient in estimating height and volume in the Crop-Forest Integration (CFI) system, demonstrating that the GG100 eucalyptus clone is a good option in integrated systems.
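For reference, the sketch below fits one common form of the Prodan hypsometric model and reports R² and Syx%, the fit statistics used in the study. The DBH-height pairs are hypothetical placeholders, not the measured Eucalyptus data, and the exact parameterization used by the authors may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def prodan(d, b0, b1, b2):
    """Prodan hypsometric model (one common form, assumed here): total height
    from diameter at breast height d, with a 1.3 m breast-height offset."""
    return 1.3 + d**2 / (b0 + b1 * d + b2 * d**2)

# Hypothetical DBH (cm) and total height (m) pairs standing in for the field data
d = np.array([8.0, 10.5, 12.3, 14.1, 16.0, 18.2, 20.5])
h = np.array([12.1, 15.4, 17.8, 19.6, 21.0, 22.8, 24.1])

popt, _ = curve_fit(prodan, d, h, p0=(1.0, 0.5, 0.03))
h_hat = prodan(d, *popt)
ss_res = np.sum((h - h_hat) ** 2)
ss_tot = np.sum((h - h.mean()) ** 2)
print("R² =", 1 - ss_res / ss_tot)
print("Syx% =", 100 * np.sqrt(ss_res / (len(h) - len(popt))) / h.mean())
```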


2022 ◽  
Vol 6 (1) ◽  
pp. 89-99
Author(s):  
Annisa Heparyanti Safitri ◽  
Agung Teguh Wibowo Almais ◽  
A'la Syauqi ◽  
Roro Inda Melani

The very large volume of data produced by the Disaster Response Planning and Control (P3B) survey team creates broad and varied problems, consuming system resources and requiring considerable processing time. This study therefore proposes a solution that applies query optimization to the TOPSIS method implemented in a decision support system for determining post-disaster damage levels. Three trials were run with different data sizes: the first used 114 records, the second 228 records, and the third 334 records. In addition, the response time in each trial was measured three times, giving an average response time for each step of the TOPSIS method. The ranking stage with query optimization was found to be 0.00076 faster than the non-optimized query. It can therefore be concluded that the response time obtained with query optimization at each step of the TOPSIS method, in the decision support system for post-disaster sector damage, is smaller than the response time of the non-optimized query.
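The abstract times the ranking stage but does not show the queries themselves; for orientation, the standard TOPSIS ranking steps it refers to look like the following minimal numpy sketch (the decision matrix, weights, and criteria are toy assumptions, not the P3B survey data).

```python
import numpy as np

def topsis(X, weights, benefit):
    """Minimal TOPSIS ranking: X is an (alternatives x criteria) matrix,
    weights sum to 1, benefit[j] is True for benefit criteria and False for
    cost criteria. Returns closeness scores (higher = better)."""
    # 1. Vector-normalize each criterion, then apply the weights
    R = X / np.sqrt((X ** 2).sum(axis=0))
    V = R * weights
    # 2. Ideal and anti-ideal solutions per criterion
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Distances to both and relative closeness
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

# Toy usage: 4 damaged sites scored on 3 assessment criteria
X = np.array([[7.0, 9.0, 3.0],
              [8.0, 6.0, 5.0],
              [5.0, 8.0, 4.0],
              [9.0, 5.0, 2.0]])
scores = topsis(X, weights=np.array([0.5, 0.3, 0.2]), benefit=np.array([True, True, False]))
print(np.argsort(scores)[::-1])   # ranking, best alternative first
```

Query optimization in such a system would change how the decision matrix is fetched and aggregated from the database, not these ranking formulas themselves.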


2022 ◽  
Vol 2161 (1) ◽  
pp. 012023
Author(s):  
Mukta Nivelkar ◽  
S. G. Bhirud

Abstract The mechanisms of quantum computing make it possible to formulate several machine learning tasks in quantum technology. Quantum computing is enriched with quantum-mechanical phenomena such as superposition and entanglement, enabling a standard of computation far different from that of classical computers. The qubit is the core of quantum technology and allows quantum mechanisms to be used for several tasks. Tasks that are intractable on classical machines can be addressed by quantum technology; such tasks are classically hard to compute and are categorized as complex computations. Machine learning on classical models is well established, but it has heavy computational requirements driven by complex, high-volume data processing. Supervised machine learning modelling using quantum computing deals with feature selection, parameter encoding, and parameterized circuit formation. This paper highlights the integration of quantum computation and machine learning into quantum machine learning modelling. Modelling of a quantum parameterized circuit and the design and implementation of a quantum feature set for sample data are discussed. Supervised machine learning using quantum mechanisms such as superposition and entanglement is articulated. Quantum machine learning can enhance various classical machine learning methods for better analysis and prediction using complex measurement.
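As a rough illustration of the "parameter encoding and parameterized circuit formation" the abstract refers to, the sketch below simulates a one-qubit parameterized circuit with plain linear algebra: a feature is encoded as an RY rotation, a trainable RY follows, and the Pauli-Z expectation is read out. This is an assumed minimal example, not the paper's circuit design.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_expectation(x, theta):
    """Tiny parameterized 'circuit' simulated classically: encode feature x
    as RY(x) on |0>, apply a trainable RY(theta), and return the Pauli-Z
    expectation value of the resulting state."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])   # |0> -> encoded, rotated state
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return state @ Z @ state

# Toy usage: the expectation is a smooth, trainable function of the feature
for x in (0.0, 0.5, 1.0):
    print(x, circuit_expectation(x, theta=0.3))
```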


2022 ◽  
Vol 20 (1) ◽  
pp. 010501
Author(s):  
Xin Zhao ◽  
Xinzhu Sang ◽  
Hui Li ◽  
Duo Chen ◽  
Yuanhang Li ◽  
...  

Genes ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 101
Author(s):  
Julie Heng ◽  
Henry H. Heng

The year 2021 marks the 50th anniversary of the National Cancer Act, signed by President Nixon, which declared a national “war on cancer.” Powered by enormous financial support, this past half-century has witnessed remarkable progress in understanding the individual molecular mechanisms of cancer, primarily through the characterization of cancer genes and the phenotypes associated with their pathways. Despite millions of publications and the overwhelming volume of data generated from the Cancer Genome Project, clinical benefits are still lacking. In fact, the massive, diverse data also unexpectedly challenge the current somatic gene mutation theory of cancer, as well as the initial rationales behind sequencing so many cancer samples. Therefore, what should we do next? Should we continue to sequence more samples and push for further molecular characterizations, or should we take a moment to pause and think about the biological meaning of the data we have, integrating new ideas in cancer biology? On this special anniversary, we implore that it is time for the latter. We review the Genome Architecture Theory, an alternative conceptual framework that departs from gene-based theories. Specifically, we discuss the relationship between genes, genomes, and information-based platforms for future cancer research. This discussion will reinforce some newly proposed concepts that are essential for advancing cancer research, including two-phased cancer evolution (which reconciles evolutionary contributions from karyotypes and genes), stress-induced genome chaos (which creates new system information essential for macroevolution), the evolutionary mechanism of cancer (which unifies diverse molecular mechanisms to create new karyotype coding during evolution), and cellular adaptation and cancer emergence (which explains why cancer exists in the first place). We hope that these ideas will usher in new genomic and evolutionary conceptual frameworks and strategies for the next 50 years of cancer research.


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0261383
Author(s):  
Glenna F. Nightingale ◽  
Andrew James Williams ◽  
Ruth F. Hunter ◽  
James Woodcock ◽  
Kieran Turner ◽  
...  

Objectives Traffic speed is important to public health as it is a major contributory factor to collision risk and casualty severity. 20mph (32km/h) speed limit interventions are an increasingly common approach to address this transport and health challenge, but a more developed evidence base is needed to understand their effects. This study describes the changes in traffic speed and traffic volume in the City of Edinburgh, pre- and 12 months post-implementation of phased city-wide 20mph speed limits from 2016–2018. Methods The City of Edinburgh Council collected speed and volume data across one full week (24 hours a day) pre- and post-20mph speed limits for 66 streets. The pre- and post-speed limit intervention data were compared using measures of central tendency, dispersion, and basic t-tests. The changes were assessed at different aggregations and evaluated for statistical significance (alpha = 0.05). A mixed effects model was used to model speed reduction, in the presence of key variables such as baseline traffic speed and time of day. Results City-wide, a statistically significant reduction in mean speed of 1.34mph (95% CI 0.95 to 1.72) was observed at 12 months post-implementation, representing a 5.7% reduction. Reductions in speed were observed throughout the day and across the week, and larger reductions in speed were observed on roads with higher initial speeds. Mean 7-day volume of traffic was found to be lower by 86 vehicles (95% CI: -112 to 286) representing a reduction of 2.4% across the city of Edinburgh (p = 0.39) but with the direction of effect uncertain. Conclusions The implementation of the city-wide 20mph speed limit intervention was associated with meaningful reductions in traffic speeds but not volume. The reduction observed in road traffic speed may act as a mechanism to lessen the frequency and severity of collisions and casualties, increase road safety, and improve liveability.
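For orientation, the sketch below reproduces the basic paired pre/post comparison described in the Methods (mean reduction, 95% confidence interval, and a t-test) on hypothetical per-street speeds; it does not implement the mixed effects model, and the simulated numbers are not the Edinburgh data.

```python
import numpy as np
from scipy import stats

def paired_speed_change(pre, post, alpha=0.05):
    """Paired comparison of street-level mean speeds before and after the
    20mph limit: mean reduction, 95% CI, and a paired t-test p-value."""
    diff = np.asarray(pre) - np.asarray(post)          # positive = speed fell
    n = len(diff)
    mean = diff.mean()
    sem = diff.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    ci = (mean - t_crit * sem, mean + t_crit * sem)
    t_stat, p_value = stats.ttest_rel(pre, post)
    return mean, ci, p_value

# Hypothetical per-street mean speeds (mph) standing in for the 66-street data
rng = np.random.default_rng(2)
pre = rng.normal(24.0, 3.0, size=66)
post = pre - rng.normal(1.3, 1.0, size=66)
mean, ci, p = paired_speed_change(pre, post)
print(f"mean reduction {mean:.2f} mph, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, p = {p:.4f}")
```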


2021 ◽  
Vol 6 (2) ◽  
pp. 117-126
Author(s):  
Febry Purnomo Aji ◽  
Arip Solehudin ◽  
Chaerur Rozikin

In the process of monitoring the capacity of the B3 (hazardous and toxic) waste storage facility at PT Fadira Teknik, a manual method is still used to determine whether the waste load is full (ready to be disposed of) or not: workers must come and look directly at the B3 waste storage area. This adds to the workload of factory workers, because they must continually monitor the B3 waste level before or after carrying out their work. Besides being harmful to humans, the B3 waste discharged from the factory takes the form of small particles, such as invisible dust, which can accidentally be inhaled or get into workers' eyes. The aim of this research is therefore to create a smart trash bin system that can monitor the volume of B3 waste in the bin. The bin uses an IoT (Internet of Things) system built around an Arduino Uno microcontroller and an ultrasonic sensor to detect the waste volume, and it sends the waste volume data to the Blynk application over the internet to display information on the bin's capacity. The research method is experimental, proceeding through system analysis, system design, system implementation, testing, and evaluation. Testing of the smart trash bin system uses black-box testing, and the results are good, with each test case behaving as expected.
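The core computation such a bin performs is converting the ultrasonic echo time into a fill-level percentage. The sketch below shows that arithmetic in Python for clarity; the actual system runs on an Arduino Uno and pushes the value to Blynk, and the bin depth and echo time used here are hypothetical.

```python
def fill_level_percent(echo_us, bin_depth_cm=60.0, speed_of_sound_cm_per_us=0.0343):
    """Convert an HC-SR04-style ultrasonic echo time (microseconds) into a
    fill-level percentage for a bin of known depth. The pulse travels to the
    waste surface and back, so the one-way distance is half the round trip."""
    distance_cm = echo_us * speed_of_sound_cm_per_us / 2.0
    distance_cm = min(max(distance_cm, 0.0), bin_depth_cm)   # clamp to the bin bounds
    return 100.0 * (bin_depth_cm - distance_cm) / bin_depth_cm

# Toy usage: a 60 cm deep bin whose echo returns after ~1750 µs (~30 cm of headroom)
print(fill_level_percent(1750))   # roughly 50% full
```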

