update strategy
Recently Published Documents


TOTAL DOCUMENTS: 182 (FIVE YEARS: 68)

H-INDEX: 8 (FIVE YEARS: 4)

Author(s):  
Shihui Li

The distribution optimization of WSN nodes is one of the key issues in WSN research and a research hotspot in the field of communication. Aiming at this problem, a node distribution optimization scheme based on an improved invasive weed optimization algorithm (IIWO) is proposed. IIWO improves the update strategy for the initial positions of weeds by using a cubic-map chaotic operator, and uses a Gaussian mutation operator to increase the diversity of the population. Simulation results show that the proposed algorithm achieves higher solution quality and faster convergence than IWO and CPSO. In a node distribution optimization example, the optimal network coverage rate obtained by IIWO is 1.82% and 0.93% higher than that of IWO and CPSO, respectively, and IIWO requires fewer nodes to reach the same coverage rate.
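The abstract names the two operators without giving their exact form. The following Python sketch shows one plausible reading (the specific cubic map, function names, and constants are illustrative assumptions, not the paper's implementation): a cubic chaotic map spreads the initial weed positions across the search space, and a Gaussian mutation perturbs solutions to maintain diversity.

```python
import numpy as np

def cubic_chaotic_init(n_weeds, dim, lower, upper, seed=None):
    """Initial weed positions from a cubic chaotic map.

    Uses the cubic form x_{k+1} = 4*x_k**3 - 3*x_k on [-1, 1]
    (one common choice), then scales the sequence into the bounds.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-0.9, 0.9, dim)        # random start, away from fixed points
    positions = np.empty((n_weeds, dim))
    for i in range(n_weeds):
        x = 4.0 * x**3 - 3.0 * x           # cubic chaotic iteration
        positions[i] = x
    # map from [-1, 1] into [lower, upper]
    return lower + (positions + 1.0) / 2.0 * (upper - lower)

def gauss_mutate(position, lower, upper, sigma=0.05, rate=0.2):
    """Gaussian mutation: perturb a random subset of dimensions to keep
    the population diverse, then clip back into the search bounds."""
    mask = np.random.rand(position.size) < rate
    noise = np.random.normal(0.0, sigma * (upper - lower), position.size)
    return np.clip(position + mask * noise, lower, upper)
```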


Author(s):  
Jinhong Di ◽  
Pengkun Yang ◽  
Chunyan Wang ◽  
Lichao Yan

In order to overcome the large error and low precision of traditional power fault record data compression, a new layered lossless compression method for massive fault record data is proposed in this paper. The method builds on the LZW (Lempel-Ziv-Welch) algorithm: it analyzes LZW and its existing problems and improves the algorithm accordingly. Dictionary index values replace the input string sequence, and unknown strings are added to the dictionary dynamically. For parallel search, the dictionary is divided into several small dictionaries with different bit widths so that they can be searched simultaneously, which yields the best compression effect for an LZW hardware implementation. The improved LZW algorithm constructs its dictionary as a multi-way tree and queries it globally with a multi-character parallel search. At the same time, the dictionary size and update strategy of the LZW algorithm are analyzed, and optimized parameters are designed to construct and update the dictionary. Hierarchical lossless compression of large-scale fault record data is then completed through lossless dictionary compression. Experimental results show that, compared with traditional compression methods, the proposed method effectively reduces the mean-square-error percentage and the compression error while achieving a favorable compression rate, thereby ensuring the integrity of the fault record data and reaching the expected compression effect in a short time.
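For readers unfamiliar with the baseline being improved, a minimal Python sketch of standard LZW compression follows. It shows the two steps the abstract refers to, replacing input strings with dictionary index values and adding unknown strings dynamically; the paper's multi-way-tree dictionary and hardware parallel search are not reproduced here.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Baseline LZW: emit dictionary index values for the longest known
    string, and add each newly seen string to the dictionary on the fly."""
    dictionary = {bytes([i]): i for i in range(256)}  # all single bytes
    next_code = 256
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                       # extend the current match
        else:
            out.append(dictionary[w])    # emit index of longest match
            dictionary[wc] = next_code   # add unknown string dynamically
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(dictionary[w])        # flush the final match
    return out
```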


2021 ◽  
Author(s):  
Shaolong Chen ◽  
Changzhen Qiu ◽  
Yurong Huang ◽  
Zhiyong Zhang

Abstract In visual object tracking, algorithms based on discriminative model prediction have shown favorable performance in recent years. Probabilistic discriminative model prediction (PrDiMP) is a typical tracker of this family: it evaluates tracking results through the output of the tracker to guide the online update of the model. However, the tracker output is not always reliable, especially under fast motion, occlusion, or background clutter, and simply using it to guide the model update can easily lead to drift. In this paper, we present a robust model update strategy that effectively integrates maximum-response, multi-peak, and detector cues to guide the model update of PrDiMP. Furthermore, we analyze the impact of different model update strategies on the performance of PrDiMP. Extensive experiments and comparisons with state-of-the-art trackers on four benchmarks, VOT2018, VOT2019, NFS, and OTB100, demonstrate the effectiveness and advancement of our algorithm.
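The abstract does not specify how the three cues are fused, so the sketch below is only one plausible gate (the thresholds and the all-cues-must-agree rule are assumptions, not the paper's design): update the model only when the maximum response is high, no strong secondary peak exists, and an external detector agrees.

```python
import numpy as np

def should_update(response, det_score, peak_thr=0.25, ratio_thr=0.6,
                  det_thr=0.5, exclude=5):
    """Fused update gate over a 2-D response map: maximum-response cue,
    multi-peak cue, and detector cue must all pass before updating."""
    peak_pos = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_pos]
    # multi-peak cue: highest response outside a window around the main peak
    masked = response.copy()
    r, c = peak_pos
    masked[max(0, r - exclude):r + exclude + 1,
           max(0, c - exclude):c + exclude + 1] = -np.inf
    second = masked.max()
    single_peak = (second / (peak + 1e-12)) < ratio_thr
    return peak > peak_thr and single_peak and det_score > det_thr
```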


2021 ◽  
pp. 1-17
Author(s):  
Maodong Li ◽  
Guanghui Xu ◽  
Yuanwang Fu ◽  
Tingwei Zhang ◽  
Li Du

In this paper, a whale optimization algorithm based on an adaptive inertia weight and a variable spiral position update strategy is proposed. The improvement addresses the original whale optimization algorithm's strong dependence on random parameters, which leaves its convergence accuracy and convergence speed insufficient. The adaptive inertia weight, which varies with the fitness of each individual whale, balances the algorithm's global search ability and local exploitation ability, while the variable spiral position update strategy, based on a collaborative convergence mechanism, dynamically adjusts the search range and search accuracy of the algorithm. The effective combination of the two allows the improved whale optimization algorithm to converge to the optimal solution faster. Eighteen international standard test functions, including unimodal, multimodal, and fixed-dimension functions, were used to test the improved algorithm. The test results show that it has faster convergence speed and higher accuracy than the original algorithm and several classic algorithms: it quickly converges near the optimal value in the early stage, then effectively jumps out of local optima through adaptive adjustment, and shows a certain ability to solve large-scale optimization problems.
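A compact sketch of the two ingredients, under the assumption of a linear fitness-based inertia weight and a linearly decaying spiral coefficient (the paper's exact formulas may differ):

```python
import numpy as np

def adaptive_inertia(fit, fit_best, fit_worst, w_min=0.4, w_max=0.9):
    """Inertia weight varying with an individual whale's fitness:
    whales far from the best get a larger weight (exploration),
    whales near it get a smaller one (exploitation)."""
    if fit_worst == fit_best:
        return w_min
    return w_min + (w_max - w_min) * (fit - fit_best) / (fit_worst - fit_best)

def spiral_update(x, x_best, t, t_max, w, b0=1.0):
    """Variable spiral position update: the spiral shape parameter decays
    over iterations, shrinking the search range as the swarm converges."""
    b = b0 * (1.0 - t / t_max)            # variable spiral coefficient
    l = np.random.uniform(-1.0, 1.0)      # random position on the spiral
    d = np.abs(x_best - x)                # distance to the best whale
    return w * x_best + d * np.exp(b * l) * np.cos(2 * np.pi * l)
```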


2021 ◽  
Vol 10 (12) ◽  
pp. 817
Author(s):  
Zhihong Ouyang ◽  
Lei Xue ◽  
Feng Ding ◽  
Da Li

Linear approximate segmentation and data compression of moving-target spatio-temporal trajectories can reduce data storage pressure and improve the efficiency of mining target motion patterns. High-quality segmentation and compression require accurately selecting and storing as few points as possible while still reflecting the characteristics of the original trajectory; existing methods still have room for improvement in segmentation accuracy, compression rate, and simplicity of parameter setting. A trajectory segmentation and compression algorithm based on particle swarm optimization is proposed. First, the trajectory segmentation problem is transformed into a global intelligent optimization problem over segmentation feature points, which makes the selection of segmentation points more accurate; then, a particle update strategy combining neighborhood adjustment and random jumps is established to improve the efficiency of segmentation and compression. Experiments on a real data set and a simulated maneuvering-target trajectory set show that, compared with existing typical methods, this method has advantages in segmentation accuracy and compression rate.
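One way to picture the particle update the abstract describes, with each particle encoding a set of candidate split indices along the trajectory (the parameter names and jump probability are illustrative assumptions):

```python
import numpy as np

def update_particle(seg_points, n_points, jump_prob=0.1, radius=3):
    """Move a particle's segmentation points: mostly small neighborhood
    adjustments, with an occasional random jump to escape local optima."""
    new = []
    for p in seg_points:
        if np.random.rand() < jump_prob:
            q = np.random.randint(1, n_points - 1)         # random jump
        else:
            q = p + np.random.randint(-radius, radius + 1)  # local adjustment
            q = min(max(q, 1), n_points - 2)                # stay inside trajectory
        new.append(q)
    return sorted(set(new))  # keep split indices unique and ordered
```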


2021 ◽  
Vol 2132 (1) ◽  
pp. 012010
Author(s):  
Guorong Xie ◽  
Rongqi Jiang ◽  
Yi Qu

Abstract To alleviate the occlusion problem in single-object tracking scenes, this paper proposes ECO-MHDU, an object tracking algorithm with stronger anti-occlusion performance built on the ECO tracker. First, the algorithm replaces the ResNet network in ECO with a lightweight MobileNetV3 backbone pre-trained on the ImageNet dataset, increasing speed while extracting both shallow and deep feature information from the image; the attention mechanism in MobileNetV3 further strengthens the algorithm's ability to extract target features. Second, a DropBlock operation is applied to the acquired feature map, generating random contiguous masks on the feature-map channels to improve the learning of globally robust spatial structure information. Finally, a confidence-based update strategy is introduced into the GMM sample-generation space: to improve the quality of training samples, confidence detection identifies unreliable tracking states such as occlusion so that the sample space is not updated with corrupted information. Compared with the ECO algorithm, ECO-MHDU achieves a success rate of 68.0% on the occlusion attribute of the OTB100 dataset, 2.3% higher than ECO, and it also shows the best performance on the entire dataset sequence, with a success rate of 69.3%.
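A minimal sketch of the confidence-gated sample update (the threshold, the occlusion flag, and the GMM interface are illustrative assumptions, not the paper's code):

```python
def update_sample_space(gmm, feature, confidence,
                        conf_thresh=0.4, occluded=False):
    """Gate the GMM sample-space update on tracking reliability: frames
    with low confidence or detected occlusion are skipped so damaging
    information never enters the training samples."""
    if occluded or confidence < conf_thresh:
        return False            # skip update; keep existing components
    gmm.add_sample(feature)     # hypothetical GMM sample-space interface
    return True
```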


2021 ◽  
pp. 1-10
Author(s):  
Aamir Ali ◽  
Muhammad Asim

Generally, big interaction networks keep the interaction records of actors over a certain period. With the rapid increase in these networks' users, the demand for frequent subgraph mining on large databases has become more and more intense. However, most existing studies of frequent subgraphs do not consider the temporal information of the graph. To fill this research gap, this article presents a novel temporal frequent subgraph-based mining algorithm (TFSBMA) using Spark. TFSBMA employs frequent subgraph mining with a minimum support threshold in a Spark environment. The proposed algorithm analyzes temporal frequent subgraphs (TFS) using the Frequent Subgraph Mining Based Using Spark (FSMBUS) method with a minimum support threshold and evaluates their frequency in a temporal manner. Furthermore, based on the FSMBUS results, the study computes TFS using an incremental update strategy. Experimental results show that the proposed algorithm can accurately and efficiently compute all TFS with their corresponding frequencies. In addition, we applied the proposed algorithm to a real-world dataset with artificial time information, which confirms its practical usability.
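A toy illustration of the incremental idea, assuming per-window subgraph embeddings have already been mined and canonically labeled (the data layout is an assumption; FSMBUS's distributed Spark structures are not modeled):

```python
from collections import defaultdict

def incremental_tfs_update(freq_by_subgraph, new_window_id,
                           new_embeddings, min_support):
    """Merge only the newly arrived time window into the frequency table
    instead of re-mining the whole history; a subgraph is temporally
    frequent in a window if its count reaches the support threshold."""
    counts = defaultdict(int)
    for subgraph_key in new_embeddings:   # canonical labels of embeddings
        counts[subgraph_key] += 1
    for key, c in counts.items():
        freq_by_subgraph.setdefault(key, {})[new_window_id] = c
    # return the subgraphs that are frequent in the new window
    return {key for key, c in counts.items() if c >= min_support}
```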


Mathematics ◽  
2021 ◽  
Vol 9 (23) ◽  
pp. 3006
Author(s):  
Junqiang Yang ◽  
Wenbing Tang ◽  
Zuohua Ding

During the target tracking process of unmanned aerial vehicles (UAVs), the target may disappear from view or be fully occluded by other objects, resulting in tracking failure. Determining how to identify tracking failure and re-detect the target is therefore the key to long-term UAV target tracking. The kernelized correlation filter (KCF) has been very popular for its satisfactory speed and accuracy since it was proposed, and it is well suited to UAV tracking systems with high real-time requirements; however, it cannot detect tracking failure, so on its own it is not suitable for long-term tracking. Building on this, we propose an improved KCF that meets long-term target tracking requirements. Firstly, we introduce a confidence mechanism that evaluates the tracking results to determine the tracking status. Secondly, we design a tracking-model update strategy that reduces interference from background information, improving the robustness of the algorithm. Finally, Normalized Cross-Correlation (NCC) template matching first generates a region proposal, and the tracking model is then used for target re-detection. We successfully apply the algorithm to a UAV system: the system uses binocular cameras to estimate the target position accurately, and we design a control method that keeps the target in the UAV's field of view. Our algorithm achieves the best results in both the short-term and long-term evaluations on tracking benchmarks, which proves that it is superior to the baseline algorithm and performs quite well, and outdoor experiments show that the developed UAV system can achieve long-term, autonomous target tracking.
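The abstract leaves the confidence mechanism unspecified; the peak-to-sidelobe ratio (PSR) is a common confidence measure for correlation-filter trackers and is sketched here as an illustrative stand-in. A low PSR would suggest occlusion or loss, so the KCF model update is skipped and re-detection triggered instead.

```python
import numpy as np

def psr_confidence(response, exclude=5):
    """Peak-to-sidelobe ratio of a 2-D correlation response map:
    (peak - sidelobe mean) / sidelobe std, with a small window around
    the peak excluded from the sidelobe statistics."""
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]
    mask = np.ones_like(response, dtype=bool)
    r, c = peak_idx
    mask[max(0, r - exclude):r + exclude + 1,
         max(0, c - exclude):c + exclude + 1] = False
    sidelobe = response[mask]
    if sidelobe.size == 0:
        return 0.0
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)
```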


2021 ◽  
Vol 10 (4) ◽  
pp. 0-0

Modern multi-site database applications are not only time-driven but also require an efficient quality of service with no single-node failure, which can ideally be achieved using database replication techniques. Transactions, the basic components of these applications, are likely to miss their deadlines because accessing remote data items can take an unpredictably long time. The temporal validity of data is another issue requiring attention. To address these problems, a Cluster-Replicas with Efficient Distributed Lazy Update (CRED) protocol is proposed in this paper. By utilizing a lazy replica update strategy, CRED increases the chance of timely transaction execution and data freshness in an unpredictable workload environment, and it reduces the negative impact of burst workloads with only a marginal overhead for keeping replicas timely updated. The simulation results confirm that CRED outperforms the ORDER protocol by up to 4%.
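CRED's defining ingredient is lazy replica updating. The Python sketch below illustrates that general technique only (class and method names are illustrative; the protocol's deadline handling and freshness bookkeeping are not modeled): the primary commits locally and acknowledges at once, while a background thread ships updates to the other cluster replicas, trading a bounded staleness window for lower latency.

```python
import queue
import threading
import time

class LazyReplicaUpdater:
    """Asynchronous (lazy) replica propagation: commits return before
    remote replicas are refreshed."""
    def __init__(self, replicas):
        self.replicas = replicas          # objects exposing apply(update, ts)
        self.pending = queue.Queue()
        threading.Thread(target=self._propagate, daemon=True).start()

    def commit(self, update):
        # local commit path: acknowledge the transaction immediately,
        # queue the update for background propagation
        self.pending.put((time.time(), update))

    def _propagate(self):
        while True:
            ts, update = self.pending.get()
            for replica in self.replicas:  # lazy, after-commit refresh
                replica.apply(update, ts)
```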

