importance coefficient
Recently Published Documents


TOTAL DOCUMENTS

14
(FIVE YEARS 8)

H-INDEX

2
(FIVE YEARS 1)

Materials ◽  
2021 ◽  
Vol 14 (18) ◽  
pp. 5135
Author(s):  
Sheng-En Fang ◽  
Chen Wu ◽  
Xiao-Hua Zhang ◽  
Li-Sen Zhang ◽  
Zhi-Bin Wang ◽  
...  

Theoretical or numerical progressive collapse analysis is necessary for important civil structures in case of unforeseen accidents. However, most current analytical research is carried out under the assumption of material elasticity for the sake of simplification, causing the analysis results to deviate from actual situations. On this account, a progressive collapse analysis procedure for truss structures is proposed based on the assumption of elastoplastic materials. A plastic importance coefficient was defined to express the importance of truss members within the entire system. The plastic deformations of members were involved in the construction of the local and global stiffness matrices. The conceptual removal of a member was adopted, and the impact of the member loss on the truss system was quantified by bearing capacity coefficients, which were subsequently used to calculate the plastic importance coefficients. A member failed when its bearing capacity reached the ultimate value, instead of the elastic limit; the extra bearing capacity was embodied by additional virtual loads. The progressive collapse analysis was performed iteratively until the truss became a geometrically unstable system. After that, the critical progressive collapse path inside the truss system was found according to the failure sequence of the members. Lastly, the proposed method was verified against both analytical and experimental truss structures. The critical progressive collapse path of the experimental truss was found from the failure sequence of the damaged members, and the experimental observation agreed well with the corresponding analytical scenario, proving the feasibility of the method.
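
The abstract does not give the exact formulas, but the bookkeeping it describes — a bearing capacity coefficient per conceptual member removal, converted into a plastic importance coefficient and used to rank the failure sequence — can be sketched roughly as follows. All numbers and the specific ratio used here are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def importance_coefficients(c_intact, c_removed):
    """One plausible reading: importance of member i as the relative drop in
    system bearing capacity after the conceptual removal of that member."""
    c_removed = np.asarray(c_removed, dtype=float)
    return (c_intact - c_removed) / c_intact

# Toy numbers (not from the paper): intact capacity 100, capacities of the
# residual system after removing members 0..2 in turn.
coeffs = importance_coefficients(100.0, [80.0, 95.0, 60.0])
failure_order = np.argsort(coeffs)[::-1]  # most critical member first
```

Iterating this ranking after each actual member failure, until the system becomes geometrically unstable, would trace out a candidate collapse path in the spirit of the procedure described above.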


Axioms ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 159
Author(s):  
Yingdan Shang ◽  
Bin Zhou ◽  
Ye Wang ◽  
Aiping Li ◽  
Kai Chen ◽  
...  

Predicting the popularity of online content is an important task for content recommendation, social influence prediction and so on. Recent deep learning models generally utilize graph neural networks to model the complex relationship between the information cascade graph and future popularity, and have shown better prediction results than traditional methods. However, existing models adopt simple graph pooling strategies, e.g., summation or averaging, which are prone to generating inefficient cascade graph representations and lead to unsatisfactory prediction results. Meanwhile, they often overlook the temporal information in the diffusion process, which has been shown to be a salient predictor for popularity prediction. To focus attention on the important users and exclude noise caused by other, less relevant users when generating the cascade graph representation, we learn the importance coefficient of each user and adopt a sampling mechanism in the graph pooling process. To capture the temporal features of the diffusion process, we incorporate the inter-infection duration time information into our model using an LSTM neural network. The results show that temporal information, rather than cascade graph information, is the better predictor of popularity. Experimental results on real datasets show that our model significantly improves prediction accuracy compared with other state-of-the-art methods.
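
The pooling idea — softmax-style importance coefficients per user, a sampling step that keeps only the most relevant users, and an importance-weighted readout — can be sketched in a minimal NumPy form. The function name and the top-k selection rule are assumptions for illustration; the paper's learned scoring network is replaced here by precomputed scores:

```python
import numpy as np

def importance_pooling(node_feats, scores, k):
    """Pool a cascade graph: normalize learned scores into importance
    coefficients, keep only the top-k users (the sampling step), and take
    their importance-weighted mean as the graph representation."""
    w = np.exp(scores - np.max(scores))
    w = w / w.sum()                        # importance coefficient per user
    top = np.argsort(w)[::-1][:k]          # sample the k most important users
    return (w[top, None] * node_feats[top]).sum(axis=0) / w[top].sum()

feats = np.eye(3)                          # toy one-hot user features
rep = importance_pooling(feats, np.array([0.1, 5.0, 0.2]), k=1)
```

With k equal to the number of nodes this reduces to plain attention-weighted average pooling; smaller k is what excludes the noise from less relevant users.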


2020 ◽  
Vol 12 (6) ◽  
pp. 64-74
Author(s):  
Rasim M. Alguliyev ◽  
◽  
Gulnara Ch. Nabibayova ◽  
Saadat R. Abdullayeva

The article proposes a comprehensive method for the multicriteria evaluation of websites. The essence of the method is that, beyond a traditional overall evaluation of a website, it also yields the following useful results: the importance coefficient of each criterion and the evaluation of the website for each criterion individually. Moreover, the sampled websites can be compared and then ranked. It is noted that, to obtain a precise result, the sampled websites must belong to the same category, that is, have the same set of evaluation criteria.
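
A minimal weighted-sum sketch of this kind of multicriteria evaluation is shown below. The criteria names, weights, and scores are invented for illustration; the article's actual method for deriving the importance coefficients is not specified in the abstract:

```python
def evaluate_websites(scores, weights):
    """Combine per-criterion scores with the importance coefficient of each
    criterion, then rank the sampled websites by total score. All websites
    must share the same set of criteria, as the abstract notes."""
    totals = {site: sum(weights[c] * v for c, v in per_crit.items())
              for site, per_crit in scores.items()}
    ranking = sorted(totals, key=totals.get, reverse=True)
    return totals, ranking

weights = {"usability": 0.5, "content": 0.3, "speed": 0.2}   # sum to 1
scores = {"site_a": {"usability": 8, "content": 6, "speed": 9},
          "site_b": {"usability": 6, "content": 9, "speed": 7}}
totals, ranking = evaluate_websites(scores, weights)
```

The per-criterion values in `scores` are exactly the "evaluation for each criterion individually" that the method exposes alongside the overall ranking.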


Sensors ◽  
2020 ◽  
Vol 20 (23) ◽  
pp. 6975
Author(s):  
Yiming Li ◽  
Xiangmin Meng ◽  
Zhongchao Zhang ◽  
Guiqiu Song

Traditional predictive models for remaining useful life (RUL) cannot achieve adaptiveness, which is one of the main problems of such predictions. This paper proposes a LightGBM-based RUL prediction method that considers the process and machining state. Firstly, a multi-information fusion strategy is proposed that can effectively reduce the model error and improve the generalization ability of the model. Secondly, a preprocessing method is proposed that improves the time precision and small-time granularity of feature extraction while avoiding dimensional explosion. Thirdly, an importance coefficient and a custom loss function related to the process and machining state are proposed. Finally, using machining data from an actual tool life cycle, the superiority and effectiveness of the proposed method are verified through five evaluation indexes and 25 sets of comparative experiments.
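
The abstract does not state the form of the custom loss, but LightGBM accepts user-defined objectives as callables returning per-sample gradients and Hessians. A generic sketch of an importance-weighted squared error in that convention is shown below; the weighting scheme `alpha` standing in for the process/machining-state importance coefficient is an assumption:

```python
import numpy as np

def weighted_l2_objective(alpha):
    """Build a custom objective in LightGBM's (grad, hess) convention:
    squared error scaled by a per-sample importance coefficient alpha."""
    alpha = np.asarray(alpha, dtype=float)
    def objective(y_true, y_pred):
        residual = y_pred - y_true
        grad = 2.0 * alpha * residual              # d/dy_pred of alpha*(y_pred - y_true)**2
        hess = 2.0 * alpha * np.ones_like(residual)
        return grad, hess
    return objective

obj = weighted_l2_objective(alpha=np.array([1.0, 3.0]))
grad, hess = obj(np.array([0.0, 0.0]), np.array([1.0, 1.0]))
```

Such a callable can be supplied as the custom objective when training a LightGBM model, so that samples from more critical machining states pull harder on the fit.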


2020 ◽  
Vol 2 (2) ◽  
Author(s):  
Qingfeng Wang ◽  
Zhigang Miao ◽  
Li Zhou

As a key supporting equipment for the construction of LNG carriers, the installation platform supports and safeguards construction inside LNG carrier tanks. This paper takes the secondary shielding installation platform of an A-type tank as the object of study. The study first considers the semi-rigidity of the nodes and the material nonlinearity based on finite element software, and then calculates the residual structure using a static nonlinear method after one, two and three trusses fail simultaneously. The research results show that a truss with a higher component importance coefficient has a greater impact on the residual structure when it fails; after two trusses of the installation platform fail completely, further progressive collapse does not occur; and when A1-HJ, A2-HJ and A2-HJ are dismantled at the same time, local progressive damage occurs, which can cause the collapse of large-scale structures. The research findings can support the design and use of the installation platform.


Author(s):  
James B Wood ◽  
Jessica L Mason ◽  
Alessandra Bianchini

The Department of Defense utilizes complex technologies in numerous fields; each technology must comply with specific parameters and system capabilities. In selecting the complex technology that best meets the prescribed capabilities, the parameter set includes nine areas, each with sub-areas. To handle the complexity of the process, the team identified the Analytical Hierarchy Process (AHP), a methodology providing a reliable solution. The AHP produced an importance coefficient for each area and sub-area, which were combined in the final phase of the ranking process. Users and technical support personnel agreed that the selected system met the requirements, field operational needs, and maintenance requisites. The AHP approach represents a viable tool for handling group decisions in support of technology management and related selection processes.
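
The standard AHP step that produces such importance coefficients is the principal-eigenvector weighting of a pairwise comparison matrix. A minimal sketch follows, using a toy 3-area matrix on Saaty's 1–9 scale rather than the paper's nine areas:

```python
import numpy as np

def ahp_weights(pairwise):
    """Importance coefficients as the normalized principal eigenvector of a
    positive reciprocal AHP pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    principal = vecs[:, np.argmax(vals.real)].real   # Perron eigenvector
    w = np.abs(principal)
    return w / w.sum()

# Toy judgments: area 0 is moderately more important than area 1 and
# strongly more important than area 2 (reciprocals below the diagonal).
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)
```

In a full AHP, sub-area weights computed the same way are multiplied by their parent area's coefficient before the final ranking, which matches the "combined in the final phase" step described above.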


2020 ◽  
Vol 34 (04) ◽  
pp. 4772-4779 ◽  
Author(s):  
Yu Li ◽  
Yuan Tian ◽  
Jiawei Zhang ◽  
Yi Chang

Learning low-dimensional representations of graphs (i.e., network embedding) plays a critical role in network analysis and facilitates many downstream tasks. Recently, graph convolutional networks (GCNs) have revolutionized the field of network embedding and led to state-of-the-art performance in network analysis tasks such as link prediction and node classification. Nevertheless, most existing GCN-based network embedding methods are designed for unsigned networks. In the real world, however, some networks are signed, where the links are annotated with different polarities, e.g., positive vs. negative. Since negative links may have properties different from positive ones and can also significantly affect the quality of the network embedding, in this paper we propose SNEA, a novel framework to learn Signed Network Embedding via graph Attention. In particular, we propose a masked self-attentional layer, which leverages the self-attention mechanism to estimate the importance coefficient for each pair of nodes connected by different types of links during the embedding aggregation process. SNEA then utilizes the masked self-attentional layers to aggregate more important information from neighboring nodes and generate the node embeddings based on balance theory. Experimental results demonstrate the effectiveness of the proposed framework on the signed link prediction task over several real-world signed network datasets.
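
The masking step common to attention layers of this kind — computing softmax importance coefficients only over actual neighbors, with non-neighbors forced to zero — can be sketched as follows. The raw scores here are given directly; in SNEA they would come from learned attention parameters, and separate masks per link polarity are omitted for brevity:

```python
import numpy as np

def masked_attention(scores, adj):
    """Masked self-attention sketch: softmax the raw scores over each node's
    neighbors only; non-neighbors receive importance coefficient 0."""
    s = np.where(adj, scores, -np.inf)           # mask out non-neighbors
    s = s - s.max(axis=1, keepdims=True)         # numerical stability
    e = np.exp(s)                                # exp(-inf) -> 0
    return e / e.sum(axis=1, keepdims=True)

scores = np.array([[1.0, 2.0, 4.0]])
adj = np.array([[True, False, True]])            # node 1 is not a neighbor
coeff = masked_attention(scores, adj)
```

The resulting coefficients weight each neighbor's embedding during aggregation, which is how more important neighbors contribute more to the node representation.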


Entropy ◽  
2019 ◽  
Vol 21 (2) ◽  
pp. 205 ◽  
Author(s):  
Shanyun Liu ◽  
Yunquan Dong ◽  
Pingyi Fan ◽  
Rui She ◽  
Shuo Wan

This paper focuses on the problem of finding a data recommendation strategy based on user preference and the system's expected revenue. To this end, we formulate the problem as an optimization, designing the recommendation mechanism to be as close to the user behavior as possible under a certain revenue constraint. In fact, the optimal recommendation distribution is the one that is closest to the utility distribution in the sense of relative entropy while satisfying the expected revenue constraint. We show that the optimal recommendation distribution follows the same form as the message importance measure (MIM) if the target revenue is reasonable, i.e., neither too small nor too large. Therefore, the optimal recommendation distribution can be regarded as the normalized MIM, where the parameter, called the importance coefficient, reflects the concern of the system and switches the system's attention over data sets with different occurrence probabilities. By adjusting the importance coefficient, our MIM-based framework for data recommendation can be applied to systems with various requirements and data distributions. The obtained results thus illustrate the physical meaning of MIM from the data recommendation perspective and validate the rationality of MIM in one respect.
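
Taking the MIM summand in its commonly used form, a "normalized MIM" recommendation distribution can be sketched as below. Treat the exponential form as an assumption for illustration; the paper's exact expression and the admissible range of the importance coefficient w are not given in the abstract:

```python
import numpy as np

def mim_recommendation(p, w):
    """Normalized-MIM sketch: r_i proportional to p_i * exp(w * (1 - p_i)),
    where w is the importance coefficient. For w > 0 the system's attention
    shifts toward low-probability (rare) data."""
    p = np.asarray(p, dtype=float)
    r = p * np.exp(w * (1.0 - p))
    return r / r.sum()

p = np.array([0.7, 0.3])              # utility (occurrence) distribution
r0 = mim_recommendation(p, w=0.0)     # w = 0 recovers p itself
r2 = mim_recommendation(p, w=2.0)     # the rare event's share is boosted
```

This makes the role of the importance coefficient concrete: w = 0 reproduces user behavior exactly, and increasing w reweights the recommendation toward less probable items.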


2018 ◽  
Vol 2018 ◽  
pp. 1-14 ◽  
Author(s):  
Haoxiang He ◽  
Honggang Xu ◽  
Xiaobing Wang ◽  
Xiaofu Zhang ◽  
Shaoyong Fan

Current methods of optimal sensor placement are mainly based on modal analysis theory and lack consideration of the damage process of the structure. The effect of different minor damage cases acting on the total spatial structure is studied based on vulnerability theory in structural analysis. The concept of generalized equivalent stiffness is introduced, and the importance coefficient of each component is defined. For numerical simulation, the random characteristics of both structural parameters and loads are considered, and random samples are established. The damage path of each sample is calculated and all the important members on the damage failure path are listed; the sensor placement scheme is then determined according to the statistical data. This method is extended to dynamic analysis: for every dynamic time-history analysis, the time-varying responses of the structure are calculated by selecting an appropriate calculation interval and considering the randomness of structural parameters and loads. The time-varying responses are analyzed, the importance coefficients of the members are sorted, and finally the dynamic sensor placement scheme is determined. The effectiveness of the method is demonstrated through examples.
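
The statistical core of this scheme — a per-member importance coefficient from each random sample, averaged over the samples, then sorted to pick sensor locations — can be sketched as follows. The stiffness-ratio definition of the coefficient is one plausible reading of "generalized equivalent stiffness", and all numbers are toy values:

```python
import numpy as np

def importance_coefficient(k_intact, k_damaged):
    """Component importance as the relative drop in generalized equivalent
    stiffness when that component is damaged (an assumed definition)."""
    return (k_intact - np.asarray(k_damaged, dtype=float)) / k_intact

def sensor_placement(importance_samples, n_sensors):
    """Average member importance over the random samples, then place sensors
    on the top-ranked members."""
    mean_imp = np.asarray(importance_samples, dtype=float).mean(axis=0)
    return np.argsort(mean_imp)[::-1][:n_sensors]

# Two random samples of per-member importance (randomized parameters/loads
# would generate many more in practice).
samples = [importance_coefficient(10.0, [9.0, 6.0, 8.0]),
           importance_coefficient(10.0, [8.5, 7.0, 9.0])]
locations = sensor_placement(samples, n_sensors=2)
```

The dynamic extension described above would replace the static coefficients with time-varying ones computed at each step of the time-history analysis before the same sort-and-select step.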

