dense distribution
Recently Published Documents

TOTAL DOCUMENTS: 62 (five years: 27)
H-INDEX: 11 (five years: 2)

2021 ◽  
Vol 2132 (1) ◽  
pp. 012036
Author(s):  
Dawei Liu ◽  
Shujing Gao

Abstract An improved algorithm is proposed to address the inaccurate recognition and low recall of the Faster Regions with Convolutional Neural Network (Faster-RCNN) algorithm when detecting ship targets in remote sensing images. The algorithm is based on the Faster-RCNN network framework. To handle the small size and dense distribution of ship targets in remote sensing images, the feature extraction network is improved to enhance small-target detection. ResNet50 serves as the basic feature extraction network, and a hole (dilated) residual block is introduced for multi-layer feature fusion, yielding a new feature extraction network with stronger feature extraction capability. Experimental results show that, compared with the original Faster-RCNN algorithm, this algorithm learns richer target features in smaller pixel areas, thereby effectively improving ship detection accuracy.
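The abstract does not specify the structure of the hole (dilated) residual block, but the mechanism it relies on can be illustrated. The minimal numpy sketch below (all names and values are illustrative, not the paper's network) shows how dilation enlarges the receptive field of stacked convolutions without adding parameters, which is why it helps with small, densely packed targets:

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """'Same'-padded 1-D convolution with holes (dilation) between taps."""
    k = len(w)
    span = dilation * (k - 1)            # effective kernel extent
    pad = span // 2
    xp = np.pad(x.astype(float), (pad, span - pad))
    out = np.zeros(len(x))
    for i in range(len(x)):
        for j in range(k):
            out[i] += w[j] * xp[i + j * dilation]
    return out

def receptive_field(kernel_sizes, dilations):
    """Receptive field of stacked dilated convolutions (stride 1)."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += d * (k - 1)
    return rf

# Three stacked 3-tap layers: same parameter count, very different coverage
print(receptive_field([3, 3, 3], [1, 1, 1]))  # plain:   7
print(receptive_field([3, 3, 3], [1, 2, 4]))  # dilated: 15
```

With dilations 1, 2, 4 the three layers cover 15 input positions instead of 7, so deep features keep fine spatial detail while still seeing enough context, which is the usual motivation for dilated blocks in small-object detectors.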


Author(s):  
Vorapoj Patanavijit ◽  
Kornkamol Thakulsukanant

This paper investigates the efficacy of a noise-removal algorithm based on the TTSD (Triple Threshold Statistical Detection) filter, which, since its introduction in 2018, has been among the most effective methods for removing Random-Intensity Impulse Noise (RIIN), particularly at dense noise distributions. The paper makes three main contributions: an exhaustive explanation of the TTSD filter algorithm with worked computation examples, a simulation of its noise-detection accuracy, and an overall comparative simulation of its noise-removal effectiveness. The TTSD filter employs three adjustable thresholds as complementary criteria, which adequately overcome the limitations of earlier noise-removal algorithms. The first threshold determines the noise characteristic of each pixel by direct mathematical verification. The second applies a normal-distribution test (mean and standard deviation), and the third applies a quartile test (median). In the simulation study, a set of standard test images corrupted by RIIN at many noise densities is processed by the TTSD-based algorithm, and both noise detection and noise removal are evaluated.
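The paper's exact thresholds and test rules are not reproduced in the abstract; the toy sketch below only illustrates the general triple-test idea it describes, with made-up threshold values (an extreme-value check, a mean/standard-deviation check, and a median/quartile check, all of which must agree before a pixel is flagged):

```python
import numpy as np

def is_impulse(neighbours, center, t_ext=5.0, k_std=2.5, k_iqr=1.5):
    """Toy triple-test impulse detector (illustrative only; the thresholds
    and tests are NOT the paper's exact TTSD rules). A pixel is flagged
    as RIIN only if it fails all three statistical tests."""
    n = np.asarray(neighbours, dtype=float).ravel()
    c = float(center)
    # Test 1: extreme value relative to the neighbourhood range
    extreme = c < n.min() - t_ext or c > n.max() + t_ext
    # Test 2: normal-distribution check (mean and standard deviation)
    gaussian = abs(c - n.mean()) > k_std * n.std()
    # Test 3: quartile check (median and interquartile range)
    q1, med, q3 = np.percentile(n, [25, 50, 75])
    quartile = abs(c - med) > k_iqr * max(q3 - q1, 1.0)
    return bool(extreme and gaussian and quartile)

flat = np.full(8, 120.0)             # a 3x3 window minus its centre pixel
print(is_impulse(flat, 250.0))       # corrupted centre -> True
print(is_impulse(flat, 122.0))       # clean centre     -> False
```

Requiring all three tests to agree is what keeps the false-positive rate low: an uncorrupted edge pixel may trip one test but rarely all three.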


2021 ◽  
Author(s):  
Xuefeng Li ◽  
Baorong Zhang ◽  
Quan Hu ◽  
Changchao Chen ◽  
Lu Liu ◽  
...  

Abstract Methods for efficient production of proteins as insoluble inclusion bodies remain under-explored. Our data demonstrate that PagP, an E. coli outer membrane protein with high β-sheet content, can function as an efficient fusion partner for inclusion body-targeted expression of the antimicrobial peptides Magainin II, Metchnikowin and Andropin. The primary structure of a polypeptide largely determines its propensity to aggregate. The aggregation “hot spots” (HSs) in PagP were analyzed with the web-based software AGGRESCAN, identifying a C-terminal region with a high density of HSs. The absolute yields of the recombinant antimicrobial peptides Metchnikowin and Andropin increased significantly when expressed in fusion with this version of PagP. Moreover, a proline-rich region was found in the β-strands of PagP. Substituting these prolines with residues of high β-sheet propensity and hydrophobicity significantly improved the partner's ability to form aggregates and greatly increased the yield of the recombinant passenger peptides. Few methods have been reported for separating recombinant target proteins from fusion inclusion bodies. Here, we report an artificial linker peptide, NHT, with three motifs, by which authentic recombinant antimicrobial peptides could be separated and purified.
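Hot-spot prediction of the kind the abstract describes is, at its core, a sliding-window scan over a per-residue propensity scale. The sketch below uses a made-up scale (the values are hypothetical and are NOT AGGRESCAN's calibrated scale) just to show the mechanics, including why a proline in a β-stretch suppresses a hot spot:

```python
# Hypothetical per-residue aggregation-propensity values (illustrative
# only -- NOT the AGGRESCAN scale). Beta-prone hydrophobics score high,
# proline and charged residues score low.
SCALE = {"I": 1.8, "V": 1.6, "L": 1.7, "F": 1.9, "W": 1.2, "Y": 1.0,
         "A": 0.6, "M": 0.9, "G": 0.0, "P": -1.5, "S": -0.3, "T": -0.2,
         "K": -1.2, "R": -1.0, "D": -1.4, "E": -1.3, "N": -0.6,
         "Q": -0.5, "H": -0.1, "C": 0.8}

def hot_spots(seq, win=5, threshold=1.0):
    """Return (start, mean_score) for every window whose average
    propensity exceeds the threshold -- a sliding-window HS sketch."""
    hits = []
    for i in range(len(seq) - win + 1):
        score = sum(SCALE[aa] for aa in seq[i:i + win]) / win
        if score > threshold:
            hits.append((i, round(score, 2)))
    return hits

print(hot_spots("GGGIVLFIGGG"))   # hot spots over the IVLFI stretch
print(hot_spots("GGGIVPFIGGG"))   # a proline substitution weakens them
```

Running the same scan on the proline-containing variant shows fewer and weaker hot-spot windows, which mirrors (in reverse) the abstract's strategy of replacing prolines with β-prone residues to strengthen aggregation.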


BioResources ◽  
2021 ◽  
Vol 16 (4) ◽  
pp. 7444-7460
Author(s):  
Pengwei Zhao ◽  
Hong Yang ◽  
Guoqi Xu ◽  
Congxun Huang ◽  
Yan Zhong

A nano-CuO/silica sol wood preservative was obtained by dispersing CuO nanoparticles in propylene glycol and silica sol. Scanning electron microscopy, Fourier transform infrared spectroscopy, X-ray diffraction analysis, thermogravimetric analysis, and compressive tests were conducted to investigate the effects of different post-treatments, i.e., steaming at 100 °C and freezing at -30 °C, on the microstructural, mechanical, physical, and thermal stability properties of the preservative-impregnated wood. The results revealed that the mechanical properties, water resistance, and thermal stability of the impregnated specimens were greatly improved. The steaming treatment resulted in a more uniform and dense distribution of the preservative in the blocks and performed better in enhancing the compressive strength of the specimens, while the freezing treatment was more effective in improving their thermal stability. Both treatments considerably improved the water resistance of the specimens. Both post-treatments retain the basic properties of the wood but differ in which properties they improve most, providing a basis for choosing between them in the industrial production of nano-preservative-treated wood.


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0255675
Author(s):  
László Zimányi ◽  
Áron Sipos ◽  
Ferenc Sarlós ◽  
Rita Nagypál ◽  
Géza I. Groma

Dealing with a system of first-order reactions is a recurrent issue in chemometrics, especially in the analysis of data obtained by spectroscopic methods applied on complex biological systems. We argue that global multiexponential fitting, the still common way to solve such problems, has serious weaknesses compared to contemporary methods of sparse modeling. Combining the advantages of group lasso and elastic net—the statistical methods proven to be very powerful in other areas—we created an optimization problem tunable from very sparse to very dense distribution over a large pre-defined grid of time constants, fitting both simulated and experimental multiwavelength spectroscopic data with high computational efficiency. We found that the optimal values of the tuning hyperparameters can be selected by a machine-learning algorithm based on a Bayesian optimization procedure, utilizing widely used or novel versions of cross-validation. The derived algorithm accurately recovered the true sparse kinetic parameters of an extremely complex simulated model of the bacteriorhodopsin photocycle, as well as the wide peak of hypothetical distributed kinetics in the presence of different noise levels. It also performed well in the analysis of the ultrafast experimental fluorescence kinetics data detected on the coenzyme FAD in a very wide logarithmic time window. We conclude that the primary application of the presented algorithms—implemented in available software—covers a wide area of studies on light-induced physical, chemical, and biological processes carried out with different spectroscopic methods. The demand for this kind of analysis is expected to soar due to the emerging ultrafast multidimensional infrared and electronic spectroscopic techniques that provide very large and complex datasets. In addition, simulations based on our methods could help in designing the technical parameters of future experiments for the verification of particular hypothetical models.
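The paper's method combines group lasso and elastic net; as a simplified stand-in (plain lasso solved with FISTA, all parameter values illustrative), the sketch below already shows the core idea: fit a decay against a dense pre-defined grid of time constants and let the l1 penalty pick out a sparse set of kinetic components, rather than fitting a fixed number of exponentials:

```python
import numpy as np

# Simulated decay: two exponential components, tau = 1 and tau = 10
t = np.linspace(0, 30, 300)
y = 1.0 * np.exp(-t / 1.0) + 0.5 * np.exp(-t / 10.0)

# Dense pre-defined grid of candidate time constants (the dictionary)
taus = np.logspace(-1, 2, 60)
A = np.exp(-t[:, None] / taus[None, :])

def fista_lasso(A, y, lam=0.05, n_iter=10000):
    """Plain-lasso FISTA: min 0.5*||Ax - y||^2 + lam*||x||_1.
    A simplified stand-in for the paper's group-lasso/elastic-net blend."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    tk = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ z - y)
        w = z - g / L
        x_new = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)  # soft-threshold
        tk_new = (1 + np.sqrt(1 + 4 * tk * tk)) / 2
        z = x_new + ((tk - 1) / tk_new) * (x_new - x)
        x, tk = x_new, tk_new
    return x

x = fista_lasso(A, y)
print(taus[x > 1e-3])   # a few grid points clustered near tau = 1 and tau = 10
```

Tightening or relaxing `lam` tunes the solution from very sparse to very dense over the grid, which is the single-penalty analogue of the hyperparameter tuning the paper automates with Bayesian optimization and cross-validation.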


2021 ◽  
Author(s):  
Jiří Macháček ◽  
Stefan Eichert ◽  
Adéla Balcárková ◽  
Petr Dresler ◽  
Radek Měchura ◽  
...  

Abstract: Interdisciplinary research carried out by Masaryk University Brno and the University of Vienna at the site of Lány (CZ), on the border between Austria and Moravia, has revealed a large settlement (∼12 ha) occupied from the 6th century until the 8th/9th century in a contact zone between Slavic and Avar influences. Aside from pottery ranging from early Slavic finds of the Prague type to specimens of the middle-Danubian tradition („mitteldanubische Kulturtradition“), and other finds such as spindle whorls, several dozen typical Avar belt accessories have been found. Most of them date to the late Avar III period, are brand new, and show no traces of usage. Together with semi-finished products, miscast objects, and remains of the bronze-casting process, this leads us to interpret Lány as a production site/workshop for Avar belts.

Lány lies at the very northwestern periphery of the Avar Khaganate. However, its material culture, aside from the belt accessories, is much more closely associated with what we know from regions settled by Slavic populations in the 7th/8th century.

We furthermore discuss the use of Avar belts among the Slavic elites of the 8th century and possible explanations for the dense distribution of Avar finds outside the Khaganate.


2021 ◽  
Vol 13 (2) ◽  
pp. 115-133
Author(s):  
Pengcheng Cao ◽  
Weiwei Liu ◽  
Guangjie Liu ◽  
Jiangtao Zhai ◽  
Xiao-Peng Ji ◽  
...  

To conceal the very existence of communication, noise-based wireless covert channels modulate secret messages into artificial noise that is added to the normal wireless signal. Although state-of-the-art work based on constellation modulation has made the composite signal indistinguishable from a legitimate one, its reliability suffers from the dense distribution of covert constellation points. In this study, the authors design a wireless covert channel based on a dither analog chaotic code (DACC) to improve reliability without damaging undetectability. The DACC plays the role of an error-correcting code. During modulation, the analog variables converted from secret messages are encoded into joint codewords by the chaotic mapping and dither derivation of the DACC; the joint codewords are then mapped to artificial noise. Simulation results show that the proposed scheme achieves better reliability than the state-of-the-art scheme while maintaining similar undetectability.
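The DACC construction itself is not reproduced in the abstract. The sketch below (all parameters hypothetical, spread-spectrum style rather than the paper's scheme) only illustrates the underlying principle: a chaotic map seeded with a shared key produces a noise-like sequence that carries bits invisibly, while a receiver holding the same key recovers them by correlation:

```python
import numpy as np

def logistic_chips(x0, n, r=3.99):
    """Noise-like chip sequence from the logistic map x -> r*x*(1-x)."""
    x, out = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return 2 * out - 1                    # centre on zero, like noise

def modulate(bits, key=0.37, n_chips=64, amp=0.1):
    """Each secret bit scales a shared chaotic chip block by +/- amp."""
    c = logistic_chips(key, n_chips)
    return np.concatenate([(1 if b else -1) * amp * c for b in bits])

def demodulate(signal, key=0.37, n_chips=64):
    """Correlate each block against the receiver's copy of the chips."""
    c = logistic_chips(key, n_chips)
    return [int(block @ c > 0) for block in signal.reshape(-1, n_chips)]

bits = [1, 0, 1, 1, 0]
clean = modulate(bits)
rng = np.random.default_rng(0)
noisy = clean + rng.normal(0.0, 0.02, clean.size)   # channel noise
print(demodulate(noisy))  # -> [1, 0, 1, 1, 0]
```

Because the chip energy is spread over many samples, the per-block correlation margin is large and the bits survive moderate channel noise, which is the reliability property the paper's error-correcting DACC is designed to strengthen further.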


Author(s):  
М.Ю. МАСЛОВ ◽  
Ю.М. СПОДОБАЕВ

The evolution of infotelecommunications, with its rapid transition to high-tech systems, is accompanied by a deep mutual penetration of technologies: convergence. It is shown that the widespread use of wireless communication systems has saturated the environment with man-made electromagnetic fields (EMF), which in turn has made the protection of the population a pressing problem. It is emphasized that this radical restructuring has led to a uniformly dense distribution of radiating network fragments across residential areas.
The changed parameters of the radiated fields require a revision of the regulatory and methodological framework for electromagnetic safety. A fragmentary structural, functional, and parametric analysis of the problem of protecting the population from man-made fields revealed uncertainty in the interpretation of real situations, as well as the vulnerability, weakness, and groundlessness of the methodological basis of sanitary-hygienic approaches at all stages of the electromagnetic assessment of radiating fragments of network technologies. It is noted that the consequences are distrust among specialists and the public toward the system of sanitary-hygienic control and, more broadly, toward the safety of modern technologies, together with growing social tension and radiophobia. As a basis for solving the problems of protecting the population, it is proposed to move subjective methods and means of EMF monitoring into the domain of information technologies.


2021 ◽  
Vol 2 (4) ◽  
pp. 160-164
Author(s):  
Abul Bashar ◽  
Dinesh Kumar

When large volumes of data are transmitted, many conflicts arise, especially in densely deployed marginal wireless sensor networks (WSNs). Major conflicts that affect overall system operation include heavy transmission delay and large data loss. In this work, multipath reliable transmission is proposed for wireless sensor networks. To improve reliability, the WSN employs a redundancy methodology: data is first divided into packets with added redundancy, and these packets are then transmitted over multiple paths to their destination nodes. Experimental observation shows that the proposed scheme significantly increases network lifetime and reduces transmission delay and the data packet loss rate.
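The abstract does not specify how the redundancy is constructed; a minimal sketch of one common approach, splitting the message into shards plus an XOR parity shard so that losing any single path's packet is recoverable, could look like this (all names are illustrative, not the paper's protocol):

```python
def split_with_parity(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard, so the
    message survives the loss of any single shard (path)."""
    pad = (-len(data)) % k
    data += b"\x00" * pad
    size = len(data) // k
    shards = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = shards[0]
    for s in shards[1:]:
        parity = bytes(a ^ b for a, b in zip(parity, s))
    return shards + [parity], pad

def recover(shards, lost, pad):
    """Rebuild the message when the shard sent on path `lost` is missing."""
    k = len(shards) - 1
    if lost < k:                         # a data shard was lost: XOR the rest
        others = [s for i, s in enumerate(shards) if i != lost]
        rebuilt = others[0]
        for s in others[1:]:
            rebuilt = bytes(a ^ b for a, b in zip(rebuilt, s))
        shards = list(shards)
        shards[lost] = rebuilt
    data = b"".join(shards[:k])
    return data[:len(data) - pad] if pad else data

msg = b"dense sensor field"
shards, pad = split_with_parity(msg, 3)
damaged = list(shards)
damaged[1] = None                        # path 1 dropped its packet
print(recover(damaged, 1, pad))          # b'dense sensor field'
```

The trade-off is one extra shard of transmission overhead per message in exchange for tolerating a single path failure without retransmission, which is how redundancy can cut both delay and loss rate at once.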


Author(s):  
Shweta Kaushik ◽  
Charu Gandhi

Cloud computing has emerged as a technology that allows users to acquire resources anytime, anywhere over the internet, offering rental of infrastructure, storage space, and services. One issue that affects the QoS of cloud computing is network latency in real-time applications: the user interacts directly with the application, and delays and jitter in receiving the service are immediately noticeable. Clients are now moving towards IoT techniques, connecting all manner of things to the internet and obtaining their services from the cloud. This advancement has motivated a new technology termed "fog computing." Fog computing is an extension of cloud computing that provides services at the edge of the network. Its proximity to end users, mobility support, and dense distribution reduce service latency and improve QoS. The fog model is well suited to advertisement and entertainment applications and to distributed data models.
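As a back-of-the-envelope illustration (all figures hypothetical), a toy latency model shows why a nearby fog node can beat a distant cloud data centre even when the edge node processes requests more slowly:

```python
def service_latency(rtt_ms, proc_ms, hops):
    """Toy end-to-end latency model: per-hop round trip plus processing."""
    return hops * rtt_ms + proc_ms

# Hypothetical figures: a distant cloud data centre vs. a nearby fog node
cloud = service_latency(rtt_ms=40, proc_ms=5, hops=2)   # 85 ms
fog = service_latency(rtt_ms=2, proc_ms=8, hops=1)      # 10 ms
print(cloud, fog)
```

In this model the network round trips dominate, so moving the service one hop from the user wins even with slower edge hardware; this is the intuition behind fog computing's latency advantage.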

