optimal function
Recently Published Documents

TOTAL DOCUMENTS: 201 (five years: 65)
H-INDEX: 31 (five years: 5)

2022 ◽  
Vol 40 (3) ◽  
pp. 1-29
Author(s):  
Yashar Moshfeghi ◽  
Alvaro Francisco Huertas-Rosero

In this article, we propose an approach to improving quality in crowdsourcing (CS) tasks using Task Completion Time (TCT) as a source of information about the reliability of workers in a game-theoretical competitive scenario. Our approach is based on the hypothesis that some workers are more risk-inclined and tend to gamble with their use of time when put in competition with other workers. This hypothesis is supported by our previous simulation study. We test our approach with 35 topics from experiments on the TREC-8 collection, assessed as relevant or non-relevant by crowdsourced workers in both a competitive (referred to as "Game") and a non-competitive (referred to as "Base") scenario. We find that competition changes the distributions of TCT, making them sensitive to the quality (i.e., wrong or right) and outcome (i.e., relevant or non-relevant) of the assessments. We also test an optimal function of TCT as weights in a weighted majority voting scheme. From probabilistic considerations, we derive a theoretical upper bound for the weighted-majority performance of cohorts of 2, 3, 4, and 5 workers, which we use as a criterion to evaluate the performance of our weighting scheme. We find that our approach achieves remarkable performance, significantly closing the gap between the accuracy of the obtained relevance judgements and the upper bound. Since our approach takes advantage of TCT, a quantity available in any CS task, we believe it is cost-effective and can therefore be applied for quality assurance in crowdsourcing for micro-tasks.
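The weighted-majority scheme the abstract describes can be sketched in a few lines. The weight function below (inverse TCT) is purely illustrative; the article derives an optimal TCT-based weight function that is not reproduced here.

```python
# Hedged sketch: weighted majority voting where each worker's vote is
# weighted by a function of Task Completion Time (TCT). The inverse-TCT
# weight is an illustrative placeholder, not the paper's optimal function.

def weighted_majority(votes, tcts, weight=lambda t: 1.0 / t):
    """votes: list of 0/1 relevance labels; tcts: completion times in seconds."""
    score = sum(weight(t) * (1 if v else -1) for v, t in zip(votes, tcts))
    return 1 if score > 0 else 0

# Three workers: one fast worker says relevant, two slow workers disagree;
# with inverse-TCT weights the fast worker's vote dominates.
label = weighted_majority([1, 0, 0], [5.0, 40.0, 60.0])
```

Under this toy weighting, the single fast assessment outweighs the two slow ones, which is the intuition behind using TCT as a reliability signal.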


Nutrients ◽  
2022 ◽  
Vol 14 (2) ◽  
pp. 301
Author(s):  
Andrew J. Sinclair ◽  
Xiao-Fei Guo ◽  
Lavinia Abedin

The retina requires docosahexaenoic acid (DHA) for optimal function. Alpha-linolenic acid (ALA) and DHA are dietary sources of retinal DHA. This research investigated optimizing retinal DHA using dietary ALA. Previous research identified that a DHA level of 19% in retinal phospholipids was associated with optimal retinal function in guinea pigs. Pregnant guinea pigs were fed dietary ALA ranging from 2.8% to 17.3% of diet fatty acids, at a constant level of linoleic acid (LA) of 18%, for the last third of gestation, and retinal DHA levels were assessed in 3-week-old offspring maintained on the same diets as their mothers. Retinal DHA increased in a linear fashion, with the maximum on the diet with an LA:ALA ratio of 1:1. Feeding diets with an LA:ALA ratio of 1:1 during pregnancy and assessing retinal DHA in 3-week-old offspring was associated with optimized retinal DHA levels. We speculate that current intakes of ALA in human diets, especially in relation to LA intakes, are inadequate to support high DHA levels in the retina.


Author(s):  
Mong Hien Thi Nguyen ◽  
Minh Hieu Tran

This paper presents research results on the automatic estimation of neck girth and inside-leg length, used to extract size and body shape from a male sizing-system table. The data used in the study are 3D scan files (*.obj) from a 3D body scanner. The authors use interpolation and optimization methods in the algorithm to automatically extract the two primary dimensions, combined with a fuzzy-logic method to extract sizes and body shapes. In addition, a rotation-matrix method combined with an optimal function is used to build an algorithm that estimates the neck-girth and inside-leg measurements. Furthermore, a simple approach based on vertex and surface-normal-vector data and optimal search was adapted to estimate these measurements. The extraction results are linked to the fuzzy-logic algorithm to run the automated process. This automatic algorithm will be very useful in face-to-face or online clothing purchases, and for garment manufacturers, by reducing shopping time and helping choose sizes when designing samples for customers.
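One common way to estimate a girth from scan vertices is to slice the point cloud at a given height and measure the perimeter of the slice. The sketch below is an illustrative simplification (convex-hull perimeter of a horizontal slice), not the authors' rotation-matrix algorithm.

```python
# Illustrative sketch (not the paper's method): estimate a girth by
# slicing 3D scan vertices at height y and taking the convex-hull
# perimeter of the slice projected onto the horizontal plane.
import math

def convex_hull(points):
    """Andrew's monotone chain; points are (x, z) tuples."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def girth_at_height(vertices, y, tol=0.01):
    """Perimeter of the hull of vertices whose height is within tol of y."""
    slice_pts = [(x, z) for x, vy, z in vertices if abs(vy - y) < tol]
    hull = convex_hull(slice_pts)
    return sum(math.dist(hull[i], hull[(i + 1) % len(hull)])
               for i in range(len(hull)))

# Sanity check: a unit square of vertices at height 1.0 has girth 4.0.
square = [(0, 1, 0), (1, 1, 0), (1, 1, 1), (0, 1, 1)]
```

A convex hull overestimates girth in concave regions (e.g. around the neck), which is presumably why the authors combine surface-normal data with an optimal search instead.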


Author(s):  
А.В. Башкиров ◽  
И.В. Свиридова ◽  
Т.Д. Ижокина ◽  
Е.А. Зубкова ◽  
О.В. Свиридова ◽  
...  

An analytical approach to determining the optimal post-processing function for the minimum operation in the MIN-SUM algorithm, previously derived for regular low-density parity-check (LDPC) codes, is extended to irregular LDPC codes. The optimal post-processing expression for the irregular case varies from one check node to another, as well as from one iteration to the next, so for practical use this optimal function must be approximated. Unlike the regular case, where a single post-processing function can be used throughout the entire decoding process without loss of bit-error performance, for irregular codes it is critical to vary the post-processing from one iteration to the next to achieve good performance. Using this approach, the bit-error performance of the belief-propagation algorithm was found to correspond to a 1 dB improvement over the MIN-SUM algorithm without post-processing. First, an overview of the approach and the analytical framework for optimal post-processing are presented. Next, the optimal post-processing function for irregular codes is presented and possible simplifications are discussed. Finally, simulation results and the benefits of the approximation are shown.
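The check-node update that the post-processing corrects can be sketched as follows. The fixed offset used here is only a common approximation of post-processing; the article's optimal function varies per check node and per iteration and is not reproduced.

```python
# Hedged illustration: a MIN-SUM check-node update with a simple
# post-processing step (offset correction). The fixed offset stands in
# for the paper's optimal, iteration-dependent post-processing function.

def minsum_check_update(llrs, offset=0.15):
    """Return extrinsic messages for one check node from incoming LLRs."""
    out = []
    for i in range(len(llrs)):
        others = llrs[:i] + llrs[i+1:]
        sign = 1
        for v in others:
            if v < 0:
                sign = -sign
        # MIN-SUM: magnitude is the minimum of the other magnitudes.
        mag = min(abs(v) for v in others)
        # Post-processing: subtract an offset, clipping at zero, to
        # compensate MIN-SUM's overestimation relative to sum-product.
        mag = max(mag - offset, 0.0)
        out.append(sign * mag)
    return out

msgs = minsum_check_update([2.0, -1.0, 0.5])
```

The offset shrinks every outgoing magnitude; for irregular codes the abstract's point is that a single such correction is insufficient and must change across iterations.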


2021 ◽  
pp. 1-10
Author(s):  
Haiyang Huang ◽  
Zhanlei Shang

The traditional process of mining heterogeneous fault-tolerant network data suffers from low accuracy and slow speed. This paper proposes a fast mining method for network heterogeneous fault-tolerant data based on K-means clustering. The confidence space of the heterogeneous fault-tolerant data is determined, yielding the range of motion of the fault-tolerant data. The singular value decomposition (SVD) method is used to construct a classified data model and obtain the characteristics of the heterogeneous fault-tolerant data. Redundant fault-tolerant data are removed with an unsupervised feature-selection algorithm, and the sum of squares and Euclidean distances to the fault-tolerant data cluster centres are determined with the K-means algorithm. A discrete data-clustering space is constructed and the optimal objective function for clustering network heterogeneous fault-tolerant data is obtained, realizing fast mining of the fault-tolerant data. The results show that the mining accuracy of the proposed method reaches 97%.
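The clustering stage can be sketched as plain K-means, alternating assignment by Euclidean distance with centroid updates to minimize the within-cluster sum of squares mentioned in the abstract. This is a generic illustration, not the authors' full SVD-plus-feature-selection pipeline.

```python
# Illustrative K-means sketch: assign each point to its nearest centre,
# then recompute centres as cluster means, repeating for a fixed number
# of iterations.
import math
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        # Recompute each centre as the mean of its cluster
        # (keep the old centre if a cluster emptied out).
        centers = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

# Two well-separated groups of two points each.
pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
centers, clusters = kmeans(pts, 2)
```

On well-separated data like this, the algorithm converges to the two obvious groups regardless of which points seed the centres.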


2021 ◽  
Vol 2087 (1) ◽  
pp. 012041
Author(s):  
Zhipeng Xia ◽  
Ping Jin ◽  
Ling Chang

Abstract In recent years, DC/AC matrix converters with higher power density have become an attractive alternative to traditional voltage-source converters. Traditional PI control has been employed to achieve accurate control, but its dynamic response and stability need to be improved. In this paper, a model predictive control (MPC) scheme that reduces power backflow is introduced for an isolated DC/AC matrix converter with a loosely coupled high-frequency transformer (LCHFT). A prediction model and an optimal function model are established to improve the dynamic response of the system and reduce the power backflow.
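The selection step of finite-control-set MPC can be sketched generically: predict the next state for each candidate switching action and pick the one minimizing a cost combining tracking error and a power-backflow penalty. The scalar first-order model, candidate voltages, and weighting factor below are all illustrative assumptions, not taken from the paper.

```python
# Schematic finite-control-set MPC sketch (illustrative parameters):
# choose the candidate voltage that minimizes tracking error plus a
# penalty on reverse (backflow) power.

def mpc_select(i_now, i_ref, candidates, dt=1e-4, L=1e-3, v_out=10.0, lam=0.1):
    def cost(v):
        i_next = i_now + dt / L * (v - v_out)   # predicted inductor current
        backflow = max(0.0, -v * i_next)        # reverse instantaneous power
        return abs(i_ref - i_next) + lam * backflow
    return min(candidates, key=cost)

# Current must rise from 1 A toward 2 A: the positive voltage wins.
best = mpc_select(i_now=1.0, i_ref=2.0, candidates=[-15.0, 0.0, 15.0])
```

Raising `lam` trades tracking accuracy for stronger suppression of power backflow, which is the balance the abstract's optimal function model encodes.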


2021 ◽  
Vol 880 (1) ◽  
pp. 012029
Author(s):  
Arniza Fitri ◽  
Hao Chen ◽  
Li Yao ◽  
Ke-hong Zheng ◽  
Susarman ◽  
...  

Abstract Scouring problems in the Cimadur River, especially near the Citorek bridge abutments, have become a major topic of discussion among local researchers and the water resources manager in Banten Province. As an effort to reduce scouring around the abutments of the Citorek bridge, a groundsill structure with a specific design is to be installed in the Cimadur River downstream of the bridge. To ensure the optimal function of the structure, the stability of the groundsill in the Cimadur River needs to be evaluated. This study evaluates the stability of the groundsill structure against rolling and sliding under both normal and flood conditions. The eccentricity of the structure is also checked under both conditions to confirm its stability. The required data (a detailed description of the groundsill structure, the river cross-section, rainfall data, topography data, and sediment/soil data) were observed in the field and obtained from P.T. Saeba Konsulindo. The data were analysed to determine: 1) design water discharges for several return periods, 2) the forces acting on the groundsill structure, and 3) the stability of the structure in the river. The results show that the groundsill structure is stable and safe against rolling and sliding, with safety factors (SF) higher than the critical coefficients for rolling and sliding (1.5). At the normal water level, the safety factors for rolling and sliding are 8.07 > 1.5 and 2.7 > 1.5, respectively, while at the flood water level they are 5.61 > 1.5 and 1.88 > 1.5, respectively. The results also show that the groundsill is safe from eccentricity under both normal and flood conditions, since the calculated coefficients of eccentricity are lower than the critical coefficient that could cause rolling and sliding.
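The stability check reduces to comparing safety factors against the critical coefficient of 1.5, using the values reported in the abstract:

```python
# Worked check with the abstract's reported safety factors: the
# structure passes when both rolling and sliding safety factors
# exceed the critical coefficient of 1.5.

def is_stable(sf_rolling, sf_sliding, critical=1.5):
    return sf_rolling > critical and sf_sliding > critical

normal_ok = is_stable(8.07, 2.7)    # normal water level
flood_ok = is_stable(5.61, 1.88)    # flood water level
```

Both conditions pass, with the sliding factor at flood level (1.88) leaving the smallest margin over the criterion.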


Electronics ◽  
2021 ◽  
Vol 10 (18) ◽  
pp. 2232
Author(s):  
Zhibo Sun ◽  
Yan Shi ◽  
Na Wang ◽  
Jian Zhang ◽  
Yixuan Wang ◽  
...  

Pneumatic suspension is one of the most significant subsystems of an automobile. In this paper, a simplified and novel pneumatic spring structure with only a conical rubber surface is presented, designed to reduce the influence of external factors other than the pneumatics. The nonlinear stiffness of the pneumatic spring is analysed based on the ideal-gas model and material mechanics. The natural frequency and transmission rate of the pneumatic suspension are obtained as two effect criteria for the dynamic model. The vibration-isolation system platform is established in both simulation and prototype tests. From the simulation results, the behaviour of the pneumatic suspension is analysed and the optimal function of mass and pressure is obtained. The experimental results show the simulation analysis to be effective. This achievement will become an important basis for future research concerning precise active control of pneumatic suspension in vehicles.
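The nonlinear-stiffness idea from the gas model can be sketched with a polytropic gas spring. All parameter values are illustrative assumptions; the paper's conical-rubber geometry and material mechanics are not modelled here.

```python
# Hedged sketch: a gas spring under a polytropic process P * V**n = const.
# Parameters (initial pressure P0, volume V0, piston area A, exponent n)
# are illustrative, not the paper's values.

def spring_force(x, P0=3e5, V0=2e-3, A=0.01, n=1.4, Patm=1e5):
    """Net force (N) on the piston after compressing the gas by x metres."""
    P = P0 * (V0 / (V0 - A * x)) ** n   # polytropic compression
    return (P - Patm) * A

def stiffness(x, dx=1e-6):
    """Numerical stiffness dF/dx via central difference."""
    return (spring_force(x + dx) - spring_force(x - dx)) / (2 * dx)
```

The stiffness grows as the spring compresses (a hardening characteristic), which is the nonlinearity the ideal-gas analysis in the paper captures.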


2021 ◽  
Author(s):  
Wen-Guang Lin ◽  
Xiao-Dong Liu ◽  
Ren-bin Xiao

Abstract Product functional configuration (PFC) is a common way for a firm to satisfy individual customer requirements and is carried out based on market analysis. This study aims to help firms analyse functions and carry out functional configuration using patent data. The paper proposes a patent-data-driven product function configuration method based on a hypergraph network, constructing a weighted network model to optimize the combination of product function quantity and object from a big-data perspective: (1) the functional knowledge contained in the patents is extracted; (2) the functional hypergraph (FH) is constructed according to the co-occurrence relationships of patents and applicants; (3) function and patent weights are calculated from the perspectives of patent applicant and patent value; (4) a weight calculation model of PFC is built; (5) a weighted frequent-subgraph algorithm is used to obtain the list of optimal function combinations. The method is applied to the innovative design process of a bathroom shower. The result indicates that the method has a positive effect in helping firms identify optimal function candidates and develop multi-function products.
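Steps (4)-(5) can be illustrated with a much simpler stand-in: score each function combination by the summed weights of the patents in which its functions co-occur, and return the best-scoring combination. This is a simplification, not the authors' weighted frequent-subgraph algorithm, and the function names and weights are invented for the example.

```python
# Simplified illustration of weighted co-occurrence scoring: patents are
# (function_set, weight) pairs; each size-k function combination earns
# the weight of every patent containing it.
from itertools import combinations
from collections import defaultdict

def best_function_combo(patents, size=2):
    scores = defaultdict(float)
    for funcs, w in patents:
        for combo in combinations(sorted(funcs), size):
            scores[combo] += w
    return max(scores, key=scores.get)

# Hypothetical shower-product patents with functions and value weights.
patents = [({"massage", "heat"}, 2.0),
           ({"massage", "heat", "led"}, 1.5),
           ({"led", "music"}, 1.0)]
combo = best_function_combo(patents)
```

Here the pair that co-occurs in the most heavily weighted patents wins, mirroring how the paper's weighting model ranks candidate function combinations.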

