data density
Recently Published Documents

TOTAL DOCUMENTS: 265 (FIVE YEARS: 87)
H-INDEX: 18 (FIVE YEARS: 7)

2021 ◽ Vol 12 (1) ◽ pp. 406
Author(s): Pavel Cheremkhin ◽ Nikolay Evtikhiev ◽ Vitaly Krasnov ◽ Ilya Ryabcev ◽ Anna Shifrina ◽ ...

The need to correct errors that emerge during optical encryption has led to the extensive use of data containers such as QR codes. However, because of the specifics of optical encryption, QR codes are not well suited to the task: their easily corrupted service elements and byte-oriented data structure result in low error correction capability in optical experiments. In this paper, we present an optical implementation of an information encryption system that uses new multilevel, customizable digital data containers with high data density. The results of optical experiments demonstrate the efficient error correction capability of the new data container.
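
The abstract does not specify the container's exact layout or error-correction scheme. Purely as a rough sketch of the multilevel idea, assuming 8 intensity levels per cell and simple bit repetition with majority voting in place of the authors' actual correction method:

```python
# Illustrative sketch only: the paper's container format is not given in
# the abstract. Assumed here: 8 gray levels (3 bits per cell) and 3x bit
# repetition with majority voting instead of the authors' scheme.
import numpy as np

BITS_PER_CELL = 3          # 8 gray levels per cell
REPEAT = 3                 # repetition factor for majority voting

def encode(data: bytes, width: int = 32) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    bits = np.repeat(bits, REPEAT)                       # add redundancy
    bits = np.pad(bits, (0, (-len(bits)) % BITS_PER_CELL))
    cells = bits.reshape(-1, BITS_PER_CELL)
    levels = cells @ (1 << np.arange(BITS_PER_CELL)[::-1])  # 0..7 per cell
    levels = np.pad(levels, (0, (-len(levels)) % width))
    return levels.reshape(-1, width).astype(np.uint8)

def decode(grid: np.ndarray, n_bytes: int) -> bytes:
    levels = grid.flatten().astype(np.uint8)
    cells = (levels[:, None] >> np.arange(BITS_PER_CELL)[::-1]) & 1
    bits = cells.flatten()[: n_bytes * 8 * REPEAT]
    votes = bits.reshape(-1, REPEAT).sum(axis=1)          # majority vote
    return np.packbits((votes > REPEAT // 2).astype(np.uint8)).tobytes()
```

With these assumptions, `encode(b"hello")` produces a small gray-level grid and `decode(grid, 5)` recovers the bytes as long as at most one of the three copies of each bit is corrupted.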


2021 ◽ Vol 1 (1) ◽ pp. 399-406
Author(s): Ediyanto Ediyanto ◽ Sugeng Sugeng ◽ Hadi A.N. ◽ Dewanto RH ◽ Rifai N ◽ ...

Land is an asset in today’s era of development. The management of each patch of land must be planned so that it can be utilized proportionally and professionally. Gunung Kidul Regency, in the Special Region of Yogyakarta, does not yet have a reliable, high-accuracy base map for every patch of land within its area, and land use is hampered by the resulting lack of spatial information. This lack of clear status and spatial information is felt by UPN “Veteran” Yogyakarta, which holds several patches of land in Kuwon Kidul, Pacarejo, Semanu, Gunung Kidul Regency. This research produces a base map at a scale of 1:500, accompanied by information about height differences in the area and a recommended planning area that considers three aspects: geology, environment, and agriculture. The resulting topographic map is useful for planning the construction that will be conducted in the area. The topographic measurement and data processing took 11 days in the field and were carried out in stages: an initial survey, GNSS measurement, detail measurement, and data processing using software to produce a topographic map. The resulting map and the Total Station measurements in the field show that the UPN “Veteran” Yogyakarta site does not have a level surface and has good data density.


GeoJournal ◽ 2021
Author(s): Susanne Schröder-Bergen ◽ Georg Glasze ◽ Boris Michel ◽ Finn Dammann

In its early days, the geodata and mapping project OpenStreetMap (OSM) was widely celebrated for opening up and “democratizing” the production of geographic knowledge. However, critical research highlights that the new socio-technical practices of collaborative mapping often also produce or reproduce patterns of exclusion, not least in the relative data density between the Global South and North. These findings notwithstanding, we consider it important to acknowledge the increasing number of geodata contributions from regions outside the old European core of OSM. This expansion of geodata production in OSM is related to a diversification of OSM actors and socio-technical practices. While OSM has often been described as a crowd-based project bringing together thousands of individual craft mappers, our analysis of OSM metadata indicates that new institutional actors are gaining relevance. These developments have resulted not only in new collaborations but also in conflicts between local mapping communities and institutional actors. We interpret these processes in two ways. First, the expansion of mapping activities can be viewed as a decolonizing process, whereby quantitative differences in data density between the Global North and South are partly reduced and new groups of local mappers are empowered to produce geographic knowledge. Second, these new developments can also be understood as colonizing processes. The engagement of large commercial actors in OSM raises concerns that the project (and its local mappers) could be used as a new means of data extraction and, in particular, that new and diverse voices in the OSM community are marginalized by a fixation on economically exploitable, modernistic and universalistic epistemologies. However, this supposedly clear distinction should not obscure the fact that colonizing and decolonizing processes intertwine in complex ways.
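
As an illustration of the kind of metadata analysis mentioned above (not the authors' actual pipeline), one could tally OSM changesets by their `hashtags` changeset tag, which organized and corporate editing teams commonly set. The file name and the use of hashtags as a proxy for institutional actors are assumptions made here for the sketch:

```python
# Rough sketch, not the authors' method: count OSM changesets by the
# "hashtags" changeset tag (values are usually semicolon-separated).
import xml.etree.ElementTree as ET
from collections import Counter

def hashtag_counts(changeset_xml_path: str) -> Counter:
    counts = Counter()
    # iterparse keeps memory bounded for large changeset dumps
    for _, elem in ET.iterparse(changeset_xml_path, events=("end",)):
        if elem.tag != "changeset":
            continue
        for tag in elem.findall("tag"):
            if tag.get("k") == "hashtags":
                for h in tag.get("v", "").split(";"):
                    if h:
                        counts[h.lower()] += 1
        elem.clear()  # free the element once processed
    return counts

if __name__ == "__main__":
    # "changesets.osm" is a placeholder path to a changeset metadata dump.
    for hashtag, n in hashtag_counts("changesets.osm").most_common(20):
        print(f"{n:8d}  {hashtag}")
```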


Author(s): Hosein Ghaedi ◽ Payam Kalhor ◽ Ming Zhao ◽ Peter T. Clough ◽ Edward J. Anthony ◽ ...

Is it possible to improve CO2 solubility in potassium carbonate (K2CO3)-based transition temperature mixtures (TTMs)? To assess this possibility, a ternary transition-temperature mixture (TTTM) was prepared using a hindered amine, 2-amino-2-methyl-1,3-propanediol (AMPD). Fourier transform infrared spectroscopy (FT-IR) was employed to detect the functional groups, including hydroxyl, amine, carbonate-ion, and aliphatic groups, in the prepared solvents. Thermogravimetric analysis (TGA) showed that adding AMPD to the binary mixture can increase the thermal stability of the TTTM. The viscosity measurements showed that the TTTM has a higher viscosity than the TTM, although the difference decreased with increasing temperature. In addition, Eyring’s absolute rate theory was used to compute the activation parameters (∆G*, ∆H*, and ∆S*). The CO2 solubility in the liquids was measured at a temperature of 303.15 K and pressures up to 1.8 MPa. The results showed that the CO2 solubility of the TTTM was improved by the addition of AMPD: at a pressure of about 1.8 MPa, the CO2 mole fractions of the TTM and TTTM were 0.1697 and 0.2022, respectively. To confirm the experimental data, density functional theory (DFT) was employed. The DFT analysis found that the TTTM + CO2 system has a higher interaction energy (|∆E|) than the TTM + CO2 system, indicating the higher CO2 affinity of the former. This study may help scientists to better understand and improve CO2 solubility in these types of solvents by choosing a suitable amine as the hydrogen bond donor (HBD) and finding the best combination of hydrogen bond acceptor (HBA) and HBD.
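
For reference, a standard textbook form of Eyring's absolute rate theory for viscous flow, which relates the measured viscosity to the activation parameters named above (the abstract does not state the exact convention the authors used):

```latex
% Standard Eyring viscous-flow relation (textbook form; the authors'
% exact molar-volume convention is not given in the abstract).
\eta \;=\; \frac{h\,N_A}{V_m}\,\exp\!\left(\frac{\Delta G^{*}}{RT}\right),
\qquad
\Delta G^{*} \;=\; \Delta H^{*} - T\,\Delta S^{*},
\qquad
\ln\!\left(\frac{\eta\,V_m}{h\,N_A}\right)
  \;=\; \frac{\Delta H^{*}}{RT} \;-\; \frac{\Delta S^{*}}{R}
```

Here η is the dynamic viscosity, h Planck's constant, N_A Avogadro's number, V_m the molar volume of the liquid, and R the gas constant; plotting the left-hand logarithm against 1/T yields ∆H* from the slope and ∆S* from the intercept.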


PLoS ONE ◽ 2021 ◽ Vol 16 (11) ◽ pp. e0259227
Author(s): Mingyang Deng ◽ Yingshi Guo ◽ Chang Wang ◽ Fuwei Wu

To solve the oversampling problem for multi-class small samples and to improve their classification accuracy, we develop an oversampling method based on classification ranking and weight setting. The designed oversampling algorithm sorts the data within each class of the dataset according to the distance from the original data to the hyperplane. Furthermore, iterative sampling is performed within each class, and inter-class sampling is adopted at the boundaries of adjacent classes according to a sampling weight composed of data density and data ordering. Finally, information assignment is performed on all newly generated samples. The algorithm is trained and tested on UCI imbalanced datasets, and established composite metrics are used to evaluate the performance of the proposed algorithm and other algorithms in a comprehensive evaluation. The results show that the proposed algorithm balances the multi-class imbalanced data in terms of quantity, and the newly generated data maintain the distribution characteristics and information properties of the original samples. Moreover, compared with other algorithms such as SMOTE and SVMOM, the proposed algorithm reaches a higher classification accuracy of about 90%. We conclude that the algorithm is highly practical and general for imbalanced multi-class samples.
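
A simplified sketch of the weighting idea follows. It is not the authors' full algorithm: the iterative intra-class and boundary inter-class steps and the information-assignment step are omitted, and the equal mix of rank and density weights is an assumption.

```python
# Sketch of rank/density-weighted SMOTE-style oversampling for one class.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.neighbors import NearestNeighbors

def oversample_class(X, y, target_class, n_new, k=5, rng=None):
    """Generate n_new synthetic samples for one minority class,
    favouring points that are dense and close to the class boundary."""
    rng = np.random.default_rng(rng)
    Xc = X[y == target_class]

    # Distance of each class member to a linear one-vs-rest boundary.
    margin = np.abs(LinearSVC().fit(X, y == target_class)
                    .decision_function(Xc))

    # Local density: inverse of mean distance to k nearest class neighbours.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(Xc)
    dist, idx = nn.kneighbors(Xc)
    density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)

    # Sampling weight: rank by boundary distance mixed with density
    # (the 50/50 mix is an assumption for illustration).
    inv_rank = 1.0 / (margin.argsort().argsort() + 1)  # 1 = nearest boundary
    w = 0.5 * inv_rank / inv_rank.sum() + 0.5 * density / density.sum()
    w /= w.sum()

    # Interpolate between a weight-chosen point and one of its neighbours.
    base = rng.choice(len(Xc), size=n_new, p=w)
    nbr = idx[base, 1 + rng.integers(0, k, size=n_new)]
    lam = rng.random((n_new, 1))
    return Xc[base] + lam * (Xc[nbr] - Xc[base])
```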


2021 ◽ Vol 8 (4) ◽ pp. 309-332
Author(s): Efosa Michael Ogbeide ◽ Joseph Erunmwosa Osemwenkhae

Density estimation is an important aspect of statistics, and statistical inference often requires knowledge of the density of the observed data. A common method of density estimation is kernel density estimation (KDE), a nonparametric approach that requires a kernel function and a window size (smoothing parameter H) and that supports density estimation and pattern recognition. This work focuses on the use of a modified intersection of confidence intervals (MICIH) approach for estimating density. Nigerian crime-rate data reported to the police, as published by the National Bureau of Statistics, are used to demonstrate the new approach. The approach is a data-driven multivariate kernel density estimation: since the main way to improve density estimation is to reduce the mean squared error (MSE), the errors of the approach were evaluated and some improvements were observed. The aim is adaptive kernel density estimation, achieved under a sufficiently smooth technique based on bandwidth selection. The estimates obtained with the MICIH approach show improvements over existing methods: a reduced mean squared error, a relatively faster rate of convergence, and fewer points of discontinuity in the graphical densities of the datasets. This helps to correct discontinuities and display an adaptive density.
Keywords: approach, bandwidth, estimate, error, kernel density
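
As a point of reference only, the sketch below shows a generic adaptive-bandwidth (Abramson-style) Gaussian KDE; it illustrates what adaptive bandwidth selection means, but it is not the MICIH procedure itself.

```python
# Generic 1-D adaptive-bandwidth Gaussian KDE (not the MICIH method).
import numpy as np

def adaptive_kde(x_eval, data, h0=None, alpha=0.5):
    """Local bandwidths shrink where a pilot estimate says the data are
    dense and widen where they are sparse."""
    data = np.asarray(data, dtype=float)
    n = data.size
    if h0 is None:                         # Silverman's rule for the pilot
        h0 = 1.06 * data.std() * n ** (-1 / 5)

    # Pilot (fixed-bandwidth) estimate at the data points.
    u = (data[:, None] - data[None, :]) / h0
    pilot = np.exp(-0.5 * u ** 2).sum(axis=1) / (n * h0 * np.sqrt(2 * np.pi))

    # Local bandwidth factors: lambda_i = (pilot_i / g)^(-alpha).
    g = np.exp(np.mean(np.log(pilot)))
    h_i = h0 * (pilot / g) ** (-alpha)

    # Final estimate at the evaluation points.
    z = (np.asarray(x_eval, float)[:, None] - data[None, :]) / h_i[None, :]
    return (np.exp(-0.5 * z ** 2) / (h_i * np.sqrt(2 * np.pi))).mean(axis=1)
```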


2021
Author(s): Eduardo Soares ◽ Plamen Angelov

This paper presents the RADNN algorithm, which is robust to imperceptible adversarial attacks and uses the concepts of data density and similarity to detect attacks in real time. Unlike traditional deep learning models, which must be trained on the attacks in order to detect them, RADNN has a mechanism that detects changes in data patterns. To evaluate the proposed method, we considered PerC attacks and 1,000 images from the ImageNet dataset. RADNN correctly identified 97.2% of the attacks.
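
The abstract does not give the RADNN density or similarity formulas. A minimal sketch of a generic density-based detector in the same spirit, using a Cauchy-type data-density score over running feature statistics (the feature space and the threshold value are assumptions):

```python
# Minimal sketch of a density-based attack detector; the exact RADNN
# formulation is not given in the abstract and is NOT reproduced here.
import numpy as np

class DensityAttackDetector:
    """Flags inputs whose Cauchy-type data density, computed against
    running statistics of previously seen clean feature vectors,
    drops below a threshold."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.n = 0
        self.mean = None          # running mean of feature vectors
        self.mean_sq = 0.0        # running mean of squared norms

    def update(self, x):
        x = np.asarray(x, dtype=float)
        self.n += 1
        if self.mean is None:
            self.mean = x.copy()
        else:
            self.mean += (x - self.mean) / self.n
        self.mean_sq += (x @ x - self.mean_sq) / self.n

    def density(self, x):
        x = np.asarray(x, dtype=float)
        var = max(self.mean_sq - self.mean @ self.mean, 1e-12)
        return 1.0 / (1.0 + np.sum((x - self.mean) ** 2) / var)

    def is_attack(self, x):
        return self.density(x) < self.threshold
```

In this sketch, clean feature vectors (for example, embeddings from a backbone network) are streamed through update(), and a new input is flagged when its density against those statistics falls below the threshold.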



