A Novel Link-to-System Mapping Technique Based on Machine Learning for 5G/IoT Wireless Networks

Sensors ◽  
2019 ◽  
Vol 19 (5) ◽  
pp. 1196 ◽  
Author(s):  
Eunmi Chu ◽  
Janghyuk Yoon ◽  
Bang Jung

In this paper, we propose a novel machine learning (ML)-based link-to-system (L2S) mapping technique for interconnecting a link-level simulator (LLS) and a system-level simulator (SLS). To validate the proposed technique, we utilized the 5G K-Simulator, which was developed through a collaborative research project in the Republic of Korea and includes an LLS, an SLS, and a network-level simulator (NS). We first describe the general procedure of the L2S mapping methodology for 5G new radio (NR) systems, and we then explain the proposed ML-based exponential effective signal-to-noise ratio (SNR) mapping (EESM) method with a deep neural network (DNN) regression algorithm. We compared the proposed ML-based EESM method with the conventional L2S mapping method. Through extensive simulations, we show that the proposed ML-based L2S mapping technique yields better block error rate (BLER) prediction accuracy while reducing processing time.
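The conventional EESM baseline that the paper compares against compresses the per-subcarrier SNRs of a multicarrier link into one effective SNR through an exponential average controlled by a calibration parameter β (tuned per modulation-and-coding scheme). A minimal NumPy sketch of that standard mapping (the β and SNR values below are illustrative, not from the paper):

```python
import numpy as np

def eesm_effective_snr(snr_db, beta):
    """Map per-subcarrier SNRs (dB) to one effective SNR (dB) via EESM:
    SNR_eff = -beta * ln( mean_i exp(-SNR_i / beta) ), in linear units."""
    snr_lin = 10.0 ** (np.asarray(snr_db) / 10.0)
    eff_lin = -beta * np.log(np.mean(np.exp(-snr_lin / beta)))
    return 10.0 * np.log10(eff_lin)

# A flat channel maps to itself; a faded channel is pulled toward the weak tones.
print(eesm_effective_snr([10.0, 10.0, 10.0], beta=2.0))  # 10 dB
print(eesm_effective_snr([0.0, 20.0], beta=1.0))         # well below the 10 dB mean
```

The effective SNR is then looked up in an AWGN BLER curve; the paper's contribution is replacing the β-calibrated mapping with a DNN regressor.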

Soil Systems ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 41
Author(s):  
Tulsi P. Kharel ◽  
Amanda J. Ashworth ◽  
Phillip R. Owens ◽  
Dirk Philipp ◽  
Andrew L. Thomas ◽  
...  

Silvopasture systems combine tree and livestock production to minimize market risk and enhance ecological services. Our objective was to explore and develop a method for identifying driving factors linked to productivity in a silvopastoral system using machine learning. A multi-variable approach was used to detect factors that affect system-level output (i.e., plant production (tree and forage), soil factors, and animal response based on grazing preference). Variables from a three-year (2017–2019) grazing study, including forage, tree, soil, and terrain attribute parameters, were analyzed. Hierarchical variable clustering and a random forest model selected the 10 most important variables for each of four major clusters. Stepwise multiple linear regression and a regression tree approach were used to predict cattle grazing hours per animal unit (h ha−1 AU−1) using 40 variables (10 per cluster) selected from 130 total variables. Overall, the variable ranking method selected more weighted variables for systems-level analysis. The regression tree performed better than stepwise linear regression for interpreting factor-level effects on animal grazing preference. Cattle were more likely to graze forage on soils with Cd levels <0.04 mg kg−1 (126% greater grazing hours per AU), soil Cr <0.098 mg kg−1 (108%), and a SAGA wetness index of <2.7 (57%). Cattle also preferred grazing (88%) native grasses compared to orchardgrass (Dactylis glomerata L.). The results show that water flow within the landscape (wetness index) and the associated distribution of metals may be used as indicators of animal grazing preference. Overall, soil nutrient distribution patterns drove grazing response, although animal grazing preference was also influenced by aboveground (forage and tree), soil, and landscape attributes. Machine learning approaches helped explain pasture use and the overall drivers of grazing preference in a multifunctional system.
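A regression tree arrives at interpretable thresholds such as "Cd < 0.04 mg kg−1" by scanning candidate split points on each predictor and keeping the one that minimizes the squared error around the two resulting group means. A pure-Python sketch of that single-split search, using made-up soil-Cd and grazing-hour values rather than the study's data:

```python
def best_split(x, y):
    """Find the threshold on predictor x that minimizes the total squared
    error of a one-split regression tree (each side predicts its mean).
    Returns (threshold, total_sse); threshold is None if no split helps."""
    def sse(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    pairs = sorted(zip(x, y))
    best = (None, sse(y))  # baseline: no split at all
    for i in range(1, len(pairs)):
        left = [p[1] for p in pairs[:i]]
        right = [p[1] for p in pairs[i:]]
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint between neighbors
        total = sse(left) + sse(right)
        if total < best[1]:
            best = (thr, total)
    return best

# Toy example: low-Cd soils with high grazing hours, high-Cd soils with low.
cd = [0.02, 0.03, 0.05, 0.06]          # hypothetical soil Cd, mg/kg
hours = [100, 110, 40, 50]             # hypothetical grazing hours per AU
print(best_split(cd, hours))           # splits at 0.04, mirroring the Cd rule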


2014 ◽  
Vol 556-562 ◽  
pp. 6328-6331
Author(s):  
Su Zhen Shi ◽  
Yi Chen Zhao ◽  
Li Biao Yang ◽  
Yao Tang ◽  
Juan Li

The LIFT technology has been applied in the denoising stage to ensure the imaging precision of minor faults and structures in 3D coalfield seismic processing. This paper focuses on the denoising workflow in two study areas where the LIFT technology is used. Signal and noise are first separated, and denoising is then applied to the noise data. The weak effective signal recovered from the noise data is blended with the original effective signal to reconstruct the denoised data, yielding a result with a high signal-to-noise ratio and preserved amplitude. These results show that LIFT is an effective denoising method for 3D coalfield seismic data and could be widely applied in other work areas.
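The described workflow (separate signal from noise, recover the weak effective signal hidden in the noise data, then blend it back) can be caricatured in a few lines of NumPy. The amplitude-threshold split and moving-average recovery below are stand-ins for the actual LIFT filtering, which the abstract does not specify:

```python
import numpy as np

def lift_style_denoise(trace, threshold, kernel=5):
    """Sketch of the LIFT idea on a single trace:
    1) split into strong (effective) signal and residual 'noise' data,
    2) recover weak coherent energy from the residual by smoothing,
    3) blend the weak signal back with the strong signal."""
    strong = np.where(np.abs(trace) >= threshold, trace, 0.0)  # effective signal
    residual = trace - strong                                  # noise + weak signal
    # Stand-in for coherence-based filtering of the noise data:
    weak = np.convolve(residual, np.ones(kernel) / kernel, mode="same")
    return strong + weak

# A clean spike above the threshold passes through unchanged.
t = np.zeros(11)
t[5] = 10.0
print(lift_style_denoise(t, threshold=1.0))
```

In production processing the separation and recovery steps operate on gathers with dip/coherence filters rather than a 1-D threshold, but the reconstruction step, strong plus recovered weak energy, is the amplitude-preserving part the abstract emphasizes.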


2017 ◽  
Vol 27 (03) ◽  
pp. 1850044 ◽  
Author(s):  
Alireza Shamsi ◽  
Esmaeil Najafi Aghdam

Power consumption and bandwidth are two of the most important parameters in the design of low-power wideband modulators, as power consumption grows with increasing bandwidth. In this study, a multi-bit wideband low-power continuous-time feedforward quadrature delta-sigma modulator (CT-FF-QDSM) is designed for WLAN receiver applications by eliminating the adders from the modulator structure. In this method, a real modulator is designed and its excess loop delay (ELD) is compensated; it is then converted into a quadrature structure by applying complex coefficients to the loop filter. The complex coefficients are extracted with the aid of a genetic algorithm to further improve the signal-to-noise ratio (SNR) over the bandwidth. One of the disadvantages of the CT-FF-QDSM is the loop-filter adders, which are power hungry and reduce the effective loop gain. Therefore, the adders have been eliminated while the transfer function of the final modulator is kept intact. The system-level SNR of the proposed modulator is 62.53 dB at an OSR of 12. The circuit is implemented in 180 nm TSMC CMOS technology. The circuit-level SNR and power consumption are 54 dB and 13.5 mW, respectively. The figure of merit (FOM) obtained for the proposed modulator is about 0.824 pJ/conv, an improvement of more than 40% over previous designs.
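The reported FOM is consistent with the standard Walden figure of merit, FOM = P / (2^ENOB · 2·BW), if one assumes a 20 MHz WLAN signal bandwidth (the abstract does not state the bandwidth explicitly, so that value is an assumption here):

```python
import math

snr_db = 54.0       # circuit-level SNR from the abstract
power_w = 13.5e-3   # 13.5 mW from the abstract
bw_hz = 20e6        # assumed WLAN channel bandwidth (not stated in the abstract)

enob = (snr_db - 1.76) / 6.02              # effective number of bits from SNR
fom_j = power_w / (2 ** enob * 2 * bw_hz)  # Walden FOM, joules per conversion step

print(round(fom_j * 1e12, 3), "pJ/conv")   # ≈ 0.824 pJ/conv, matching the abstract
```

That the arithmetic lands on the quoted 0.824 pJ/conv supports the 20 MHz reading, but the assumption should be checked against the full paper.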


Author(s):  
Bradley T. Martin ◽  
Tyler K. Chafin ◽  
Marlis R. Douglas ◽  
John S. Placyk ◽  
Roger D. Birkhead ◽  
...  

Abstract Model-based approaches that attempt to delimit species are hampered by computational limitations as well as the unfortunate tendency of users to disregard algorithmic assumptions. Alternatives are clearly needed, and machine learning (M-L) is attractive in this regard as it functions without the need to explicitly define a species concept. Unfortunately, its performance will vary according to which (of several) bioinformatic parameters are invoked. Herein, we gauge the effectiveness of M-L-based species-delimitation algorithms by parsing 64 variably filtered versions of a ddRAD-derived SNP dataset involving North American box turtles (Terrapene spp.). Our filtering strategies included: (A) minor allele frequencies (MAF) of 5%, 3%, 1%, and 0% (=none), and (B) maximum missing data per-individual/per-population of 25%, 50%, 75%, and 100% (=none). We found that species delimitation via unsupervised M-L impacted the signal-to-noise ratio in our data, as well as the discordance among resolved clades. The latter may also reflect biogeographic history, gene flow, incomplete lineage sorting, or combinations thereof (as corroborated by previously observed patterns of differential introgression). Our results substantiate M-L as a viable species-delimitation method, but also demonstrate how commonly observed patterns of phylogenetic discord can seriously impact M-L classification.
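MAF filtering, one of the two filter axes behind the 64 dataset versions, simply drops SNPs whose rarer allele falls below the cutoff. A NumPy sketch on a toy genotype matrix coded as 0/1/2 alternate-allele counts (the per-individual missing-data filter, the other axis, is omitted for brevity):

```python
import numpy as np

def filter_maf(genotypes, maf_min):
    """Keep only SNP columns whose minor allele frequency is >= maf_min.
    genotypes: individuals x SNPs matrix of 0/1/2 allele counts (this sketch
    assumes no missing calls)."""
    p = genotypes.mean(axis=0) / 2.0      # alternate-allele frequency per SNP
    maf = np.minimum(p, 1.0 - p)          # minor allele frequency per SNP
    return genotypes[:, maf >= maf_min]

# Toy matrix: columns 0 and 2 are monomorphic (MAF = 0) and get dropped.
g = np.array([[0, 0, 2],
              [0, 1, 2],
              [0, 2, 2],
              [0, 1, 2]])
print(filter_maf(g, 0.05))   # only the middle, polymorphic column survives
```

Raising the cutoff toward 5% discards rare variants, which is exactly the knob the study turns to probe how signal-to-noise in the SNP matrix affects unsupervised clustering.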


2020 ◽  
Vol 9 (1) ◽  
pp. 1700-1704

Classification of a target from a mixture of multiple-target information is quite challenging. In this paper, we use a supervised machine learning algorithm, namely linear regression, to classify received data that is a mixture of the target return with noise and clutter. The target state is then estimated from the classified data using a Kalman filter; a linear Kalman filter with a constant-velocity model is used. Minimum mean square error (MMSE) analysis is used to measure the performance of the estimated track at various signal-to-noise ratio (SNR) levels. The results show that the error is high at low SNR and low at high SNR.
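The linear Kalman filter with a constant-velocity model used for track estimation can be sketched as follows for 1-D position measurements; the process- and measurement-noise parameters q and r are illustrative, not values from the paper:

```python
import numpy as np

def kalman_cv(measurements, dt, q, r):
    """Linear Kalman filter, constant-velocity model, state = [position, velocity].
    Returns the filtered position estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
    H = np.array([[1.0, 0.0]])                # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])       # process noise (white accel model)
    R = np.array([[r]])                       # measurement noise variance
    x = np.array([[measurements[0]], [0.0]])  # initial state
    P = np.eye(2)                             # initial covariance
    estimates = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        y = np.array([[z]]) - H @ x           # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return estimates

# Noiseless target moving at 1 unit/step: the filter locks onto the track.
track = kalman_cv([float(i) for i in range(20)], dt=1.0, q=0.1, r=1.0)
print(track[-1])  # close to the true final position of 19
```

The MMSE evaluation in the paper amounts to averaging the squared gap between such estimates and the true trajectory at each SNR level.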


2020 ◽  
Author(s):  
Nandkumar Niture

AI, deep learning, and machine learning algorithms are gaining ground in every application domain of information technology, including information security. The information security domain is known for traditional password management systems, auto-provisioning systems, and user information management systems. Ransomware raises a further concern at the application and system level: ransomware attacks on existing systems, demanding ransom, increase every day. Ransomware is the class of malware whose goal is to seize data through an encryption mechanism and release it back only for a ransom. Ransomware attacks mainly target vulnerable systems that are exposed to the network with weak security measures. With the help of machine learning algorithms, the pattern of the attacks can be analyzed, and a workaround solution can be designed: a machine learning model combined with a cryptographic algorithm that enhances the effectiveness of the system's response to possible attacks. The harder part of the problem is building intelligence that helps organizations prevent ransomware attacks through intelligent password management and intelligent account provisioning. In this paper, I elaborate on the analysis of machine learning algorithms for the intelligent ransomware detection problem; the later part of the paper presents the design of the algorithm.
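Because ransomware's defining action is bulk encryption, one simple feature a detection model can consume is the Shannon entropy of written data: ciphertext approaches 8 bits per byte, while ordinary documents sit far lower. This heuristic is a common illustration of the feature-engineering step, not the paper's specific method, and the threshold below is hypothetical:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Hypothetical single-feature detector: flag near-uniform byte streams,
    as produced by encryption, for downstream ML scoring."""
    return byte_entropy(data) > threshold

print(byte_entropy(b"hello world" * 20))   # low: repetitive plain text
print(byte_entropy(bytes(range(256))))     # 8.0: uniform byte distribution
```

In a full pipeline this would be one column in a feature vector (alongside file-rename rates, API-call patterns, etc.) fed to the trained classifier.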


2021 ◽  
Author(s):  
Agnes M Resto Irizarry ◽  
Sajedeh Nasr Esfahani ◽  
Yi Zheng ◽  
Robin Zhexuan Yan ◽  
Patrick Kinnunen ◽  
...  

Abstract The human embryo is a complex structure that emerges and develops as a result of cell-level decisions guided by both intrinsic genetic programs and cell–cell interactions. Given the limited accessibility and associated ethical constraints of human embryonic tissue samples, researchers have turned to the use of human stem cells to generate embryo models to study specific embryogenic developmental steps. However, to study complex self-organizing developmental events using embryo models, computational and imaging tools are needed for detailed characterization of cell dynamics at the single-cell level. In this work, we obtained live cell imaging data from a human pluripotent stem cell (hPSC)-based epiblast model that can recapitulate the lumenal epiblast cyst formation soon after implantation of the human blastocyst. By processing imaging data with a Python pipeline that incorporates both cell tracking and event recognition with the use of a CNN-LSTM machine learning model, we obtained detailed temporal information on changes in cell state and neighborhood during the dynamic growth and morphogenesis of lumenal hPSC cysts. The use of this tool combined with reporter lines for cell types of interest will drive future mechanistic studies of hPSC fate specification in embryo models and will advance our understanding of how cell-level decisions lead to global organization and emergent phenomena. Insight, innovation, integration: Human pluripotent stem cells (hPSCs) have been successfully used to model and understand cellular events that take place during human embryogenesis. Understanding how cell–cell and cell–environment interactions guide cell actions within a hPSC-based embryo model is a key step in elucidating the mechanisms driving system-level embryonic patterning and growth.
In this work, we present a robust video analysis pipeline that incorporates the use of machine learning methods to fully characterize the process of hPSC self-organization into lumenal cysts to mimic the lumenal epiblast cyst formation soon after implantation of the human blastocyst. This pipeline will be a useful tool for understanding cellular mechanisms underlying key embryogenic events in embryo models.
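In a CNN-LSTM event recognizer like the one described, a CNN first reduces each video frame (or tracked cell crop) to a feature vector, and an LSTM then accumulates those vectors over time to classify events. The temporal half can be sketched framework-free with NumPy; the feature and hidden sizes, weights, and frame count below are all illustrative, not the pipeline's actual architecture:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step over a per-frame feature vector x (as a CNN would emit).
    Gates are packed [input, forget, cell, output] along the first axis."""
    z = W @ x + U @ h + b
    i, f, g, o = np.split(z, 4)
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sig(i), sig(f), sig(o)
    c = f * c + i * np.tanh(g)      # updated cell memory
    h = o * np.tanh(c)              # updated hidden state
    return h, c

rng = np.random.default_rng(0)
feat_dim, hidden = 8, 4                       # illustrative sizes
W = rng.normal(size=(4 * hidden, feat_dim))   # untrained weights, for shape only
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

h = c = np.zeros(hidden)
for x in rng.normal(size=(10, feat_dim)):     # 10 frames of CNN features
    h, c = lstm_step(x, h, c, W, U, b)
# h now summarizes the cell's temporal context for event classification
print(h.shape)
```

A trained model would add a final classification layer on h (e.g., division vs. death vs. no event) and learn W, U, b by backpropagation through time.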


Author(s):  
О.Г. ПОНОМАРЕВ ◽  
М. АСАФ

The paper investigates the issue of sampling clock offset (SCO) in fifth-generation new-radio systems, i.e., the distortion of an OFDM signal caused by a mismatch between the sampling rates of the transmitter and receiver. Because SCO estimation methods are imperfect, correction methods that rely on an SCO estimate are imperfect as well; the proposed compensation method therefore directly corrects the distortion that the offset introduces into the transmitted signal, without estimating the offset itself. The method is designed for correcting signals in the uplink of a fifth-generation cellular system, specifically the physical uplink shared channel (PUSCH), and uses the reference signals recommended by the 3rd Generation Partnership Project (3GPP) standards. Numerical simulation shows that the proposed method increases the efficiency of data transmission over a multipath radio channel by more than 15% across a wide range of signal-to-noise ratio values. Keywords: 5G-NR, CP-OFDM, synchronization, sample clock offset, PUSCH.
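One way to realize "direct correction without an SCO estimate" is to measure the phase ramp that the offset imprints across the reference subcarriers and derotate it immediately, never converting the fitted slope into an explicit SCO value. A NumPy sketch for a single OFDM symbol; the pilot layout and linear-phase fit are illustrative stand-ins, not the authors' exact scheme:

```python
import numpy as np

def sco_correct(rx_symbols, ref_tx, ref_idx):
    """Directly derotate the SCO-induced phase ramp on one OFDM symbol.
    rx_symbols: complex per-subcarrier values after the FFT;
    ref_tx: known reference (pilot) values at subcarrier indices ref_idx."""
    k = np.arange(len(rx_symbols))
    phase = np.angle(rx_symbols[ref_idx] * np.conj(ref_tx))   # pilot phase error
    slope, intercept = np.polyfit(ref_idx, np.unwrap(phase), 1)
    return rx_symbols * np.exp(-1j * (slope * k + intercept))

# Demo: apply a known linear phase ramp, then remove it using every 4th tone.
n = 64
k = np.arange(n)
tx = np.ones(n, dtype=complex)                  # trivial "data" for the demo
rx = tx * np.exp(1j * (0.01 * k + 0.05))        # SCO-like phase ramp
ref_idx = np.arange(0, n, 4)                    # illustrative pilot spacing
out = sco_correct(rx, tx[ref_idx], ref_idx)
print(np.max(np.abs(out - tx)))                 # residual error is tiny
```

In 5G NR the reference values at ref_idx would come from the PUSCH DM-RS defined by 3GPP, and the ramp slope additionally grows with the OFDM symbol index, so the correction is reapplied per symbol.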

