Analysis of Dimensional Tolerances on Hydraulic and Acoustic Properties of a New Type of Prototypal Gear Pumps

2020, Vol 10 (23), pp. 8535
Author(s): Adam Deptuła, Piotr Osiński, Marian A. Partyka

This study focuses on the construction of a prototype series of pumps. The technological capabilities of the entire series of gear pumps with a three-poly-involute outline were determined. Neural networks were developed to analyze the dimensional tolerances and the composition of the pump components, and their impact on the distribution of characteristics across the constructed units. The most crucial dimensions to control were then identified, that is, where dimensional and form tolerances are necessary, while the accuracy class was relaxed where it matters less. Measurements of acoustic quantities and vibrations were also carried out. After this positive verification, the printed polyethylene gear wheels can be manufactured in larger, mass-produced quantities, and optimization techniques can then be applied to reduce manufacturing costs and increase efficiency.
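As an illustrative aside (not the authors' actual model), the kind of tolerance-to-performance analysis described above could be sketched with a small feed-forward regressor; all feature names, target quantities, and data below are synthetic assumptions.

    # Minimal sketch: regress hydraulic/acoustic responses on dimensional tolerances.
    # Synthetic data and feature meanings are illustrative only, not from the study.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_units = 200
    # Hypothetical features: deviations of key pump dimensions (mm).
    X = rng.normal(0.0, 0.01, size=(n_units, 4))
    # Hypothetical targets: volumetric efficiency and sound pressure level.
    y = np.column_stack([
        0.95 - 3.0 * np.abs(X).sum(axis=1) + rng.normal(0, 0.002, n_units),
        70.0 + 150.0 * np.abs(X[:, 0]) + rng.normal(0, 0.5, n_units),
    ])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    print("R^2 on held-out units:", model.score(X_te, y_te))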

Author(s): Sarat Chandra Nayak, Subhranginee Das, Mohammad Dilsad Ansari

Background and Objective: Stock closing-price prediction is enormously complicated. Artificial Neural Networks (ANNs) are excellent approximators and are widely applied in this area. Several nature-inspired evolutionary optimization techniques have been proposed in the literature to search for the optimal parameters of ANN-based forecasting models. However, most of them require fine-tuning of several control parameters as well as algorithm-specific parameters to achieve optimal performance, and improper tuning of such parameters leads either to additional computational cost or to local optima. Methods: Teaching Learning Based Optimization (TLBO) is a recently proposed algorithm that does not require any algorithm-specific parameters. The intrinsic capability of the Functional Link Artificial Neural Network (FLANN) to capture the multifaceted nonlinear relationships present in historical stock data has made it popular and widely applied in stock market prediction. This article presents a hybrid model, termed Teaching Learning Based Optimization of Functional Link Neural Networks (TLBO-FLN), that combines the advantages of both TLBO and FLANN. Results and Conclusion: The model is evaluated by predicting the short-, medium-, and long-term closing prices of four emerging stock markets. The performance of the TLBO-FLN model is measured through the Mean Absolute Percentage Error (MAPE), Average Relative Variance (ARV), and coefficient of determination (R²), and is found superior to that of several other state-of-the-art models trained in the same way.
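For reference, the evaluation metrics named in the abstract are commonly defined as follows; the exact formulations used by the authors may differ slightly, and the example values are made up.

    # Common definitions of MAPE, ARV, and R^2 for a forecasting model.
    import numpy as np

    def mape(y_true, y_pred):
        # Mean Absolute Percentage Error, in percent.
        return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

    def arv(y_true, y_pred):
        # Average Relative Variance: error variance relative to the data variance.
        return np.sum((y_true - y_pred) ** 2) / np.sum((y_true - np.mean(y_true)) ** 2)

    def r2(y_true, y_pred):
        # Coefficient of determination, related to ARV by R^2 = 1 - ARV.
        return 1.0 - arv(y_true, y_pred)

    y_true = np.array([101.2, 99.8, 102.5, 100.7])
    y_pred = np.array([100.9, 100.1, 101.8, 100.2])
    print(mape(y_true, y_pred), arv(y_true, y_pred), r2(y_true, y_pred))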


2010, Vol 61 (2), pp. 120-124
Author(s): Ladislav Zjavka

Generalization of Patterns by Identification with Polynomial Neural Network

Artificial neural networks (ANNs) generally classify patterns according to their relationship, responding to related patterns with a similar output. Polynomial neural networks (PNNs) are capable of organizing themselves in response to certain features (relations) of the data. The polynomial neural network for dependence-of-variables identification (D-PNN) describes a functional dependence of the input variables (not of entire patterns). It approximates the hyper-surface of this function with multi-parametric particular polynomials, forming its functional output as a generalization of the input patterns. This new type of neural network is based on the GMDH polynomial neural network and was designed by the author. The D-PNN operates in a way closer to how the brain learns than the ANN does; the ANN is in principle a simplified form of the PNN, in which the combinations of input variables are missing.
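For context, a minimal sketch of the classic GMDH building block that the D-PNN extends is shown below: a quadratic (Ivakhnenko-style) polynomial of two inputs fitted by least squares. This is illustrative only, not the author's D-PNN implementation, and the data are synthetic.

    # Fit the GMDH-style partial polynomial y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2.
    import numpy as np

    def fit_partial_polynomial(x1, x2, y):
        # Build the quadratic design matrix and solve the normal problem by least squares.
        A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    rng = np.random.default_rng(1)
    x1, x2 = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
    y = 0.5 + 2.0 * x1 * x2 - x2 ** 2 + rng.normal(0, 0.01, 200)
    print(fit_partial_polynomial(x1, x2, y))  # approximately recovers the underlying coefficients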


2018, Vol 42 (2), pp. 187-193
Author(s): Jingwen Xu, Yunxuan Zhang, Ziqi Yin

In this paper, an extreme learning machine (ELM) network based on an improved shuffled frog leaping algorithm (CCSFLA) is applied to early bearing fault diagnosis. The ELM is a new type of single-hidden-layer feedforward network. Although its generalization is stronger than that of traditional neural networks, the random setup of its initial parameters increases the instability of the network. An improved SFLA based on a sinusoidal chaotic mapping with infinite collapses and on constriction factors (CCSFLA) is therefore proposed in this paper to optimize the ELM, yielding a CCSFLA–ELM model. Results show that the CCSFLA–ELM model can be used for this optimization and that it improves the recognition of early bearing faults.
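A minimal sketch of the basic ELM idea mentioned above (random hidden-layer parameters, closed-form output weights) follows; the CCSFLA step that the paper uses to optimize those random parameters is not reproduced here, and the data are synthetic.

    # Basic ELM: random hidden weights/biases, output weights from the pseudo-inverse.
    import numpy as np

    def elm_fit(X, y, n_hidden=50, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
        b = rng.normal(size=n_hidden)                  # random biases
        H = np.tanh(X @ W + b)                         # hidden-layer output matrix
        beta = np.linalg.pinv(H) @ y                   # output weights, closed form
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(300, 2))
    y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]
    W, b, beta = elm_fit(X, y)
    print(np.mean((elm_predict(X, W, b, beta) - y) ** 2))  # training MSE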


2013, Vol 37 (1), pp. 129-134
Author(s): Hai-Lin Zhu, Jun Pan, Min Zou, Hong-Nen Wu, Xingpei Qin

Current gear pumps suffer from three major problems: unbalanced radial force, excessive flow pulsation, and short working life. To solve these problems, a new type of gear pump with a flexible ring gear is introduced. Pumping action is achieved through meshing between a flexible ring gear and a rigid external gear. As a result, the radial pressure forces are hydraulically balanced and the volumetric displacement of the new pump is doubled.


Electronics, 2020, Vol 9 (11), pp. 1923
Author(s): Eduardo G. Pardo, Jaime Blanco-Linares, David Velázquez, Francisco Serradilla

The objective of this research is to improve the hydrogen production and total profit of a real steam reforming plant. Given the impossibility of tuning the real factory to optimize its operation, we propose modelling the plant using Artificial Neural Networks (ANNs). In particular, we combine a set of independent ANNs into a single model, where each ANN uses a different set of inputs depending on the physical process it simulates. The model is then optimized as a black-box system using metaheuristics (Genetic and Memetic Algorithms). We demonstrate that the proposed ANN model shows a high correlation between the real output and the predicted one. Additionally, the performance of the proposed optimization techniques has been validated by the engineers of the plant, who reported a significant increase in the profit obtained after optimization. Furthermore, this approach compares favorably with the results provided by a general black-box solver. All methods were tested on real data provided by the factory.
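As a hedged illustration of the black-box setup described above, the sketch below composes two toy stand-in "sub-models" and searches their shared inputs with a simple genetic algorithm; none of the functions, dimensions, or parameters correspond to the real plant model.

    # Treat a composed surrogate as a black box and search its inputs with a small GA.
    import numpy as np

    def submodel_a(u):            # toy stand-in, e.g. predicted hydrogen production
        return -np.sum((u - 0.3) ** 2)

    def submodel_b(u):            # toy stand-in, e.g. predicted operating cost
        return np.sum(np.abs(u - 0.7))

    def profit(u):                # black-box objective combining both sub-model outputs
        return submodel_a(u) - 0.1 * submodel_b(u)

    def genetic_search(obj, dim=5, pop=40, gens=100, seed=0):
        rng = np.random.default_rng(seed)
        P = rng.uniform(0, 1, size=(pop, dim))
        for _ in range(gens):
            fit = np.array([obj(ind) for ind in P])
            parents = P[np.argsort(fit)[-pop // 2:]]             # keep the fitter half
            kids = parents[rng.integers(len(parents), size=pop - len(parents))]
            kids = kids + rng.normal(0, 0.05, size=kids.shape)   # Gaussian mutation
            P = np.vstack([parents, np.clip(kids, 0, 1)])
        best = P[np.argmax([obj(ind) for ind in P])]
        return best, obj(best)

    print(genetic_search(profit))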


2015, Vol 2015, pp. 1-15
Author(s): Yongkun Li, Lili Zhao, Li Yang

On a new type of almost periodic time scales, a class of BAM neural networks is considered. By employing a fixed point theorem and differential inequality techniques, some sufficient conditions ensuring the existence and global exponential stability of C^1-almost periodic solutions for this class of networks with time-varying delays are established. Two examples are given to show the effectiveness of the proposed method and results.
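For orientation, a typical delayed BAM system of the kind studied in such work is sketched below, written on the real line rather than on a time scale; all coefficients, activation functions, and delays are generic placeholders, not the paper's exact formulation.

    % Generic BAM neural network with time-varying delays (illustrative form).
    \begin{aligned}
    x_i'(t) &= -a_i(t)\,x_i(t) + \sum_{j=1}^{m} b_{ij}(t)\, f_j\bigl(y_j(t-\tau_{ij}(t))\bigr) + I_i(t),\\
    y_j'(t) &= -c_j(t)\,y_j(t) + \sum_{i=1}^{n} d_{ji}(t)\, g_i\bigl(x_i(t-\sigma_{ji}(t))\bigr) + J_j(t).
    \end{aligned}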


Author(s): Hai-Lin Zhu, Peng Ning, Min Zou, Xingpei Qin, Jun Pan

To solve the problems of radial fluid-pressure imbalance, large flow ripple, and short service life that exist in traditional gear pumps, a new type of gear pump based on the principle of harmonic gear drive is put forward, in which fluid pumping is achieved by mutual engagement between a flexible gear and a rigid gear. The structural composition, principle, and features of the new gear pump are described in this article. The new pump has two higher-pressure cavities arranged symmetrically, which counteracts the fluid pressure forces so that the pump can work longer. Its displacement is twice that of a conventional gear pump, so its total discharge is larger. Flow pulsation, vibration, and noise in the new pump are evidently diminished, which makes its operation smooth. The new gear pump is superior in performance and could guide further development of gear pump technology.
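As a rough numerical aside, a commonly used first-order approximation for the displacement per revolution of a conventional external gear pump, together with the doubling claimed above, can be written as follows; the symbols are generic (m: module, z: number of teeth, b: face width), exact values depend on the tooth profile, and only the factor of two comes from the abstract.

    % Approximate displacement per revolution (illustrative, not from the paper).
    q_{\mathrm{conv}} \approx 2\pi\, m^{2} z\, b, \qquad
    q_{\mathrm{new}} \approx 2\, q_{\mathrm{conv}}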


2018, Vol 108, pp. 509-526
Author(s): Mahmoud Ismail, Mina Attari, Saeid Habibi, Samir Ziada

2017, Vol 108 (1), pp. 13-25
Author(s): Parnia Bahar, Tamer Alkhouli, Jan-Thorsten Peter, Christopher Jan-Steffen Brix, Hermann Ney

Training neural networks is a non-convex, high-dimensional optimization problem. In this paper, we provide a comparative study of the most popular stochastic optimization techniques used to train neural networks. We evaluate the methods in terms of convergence speed, translation quality, and training stability. In addition, we investigate combinations that seek to improve optimization in these respects. We train state-of-the-art attention-based models and apply them to neural machine translation. We demonstrate our results on two tasks: WMT 2016 En→Ro and WMT 2015 De→En.
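As a self-contained illustration of the kind of optimizers such comparisons cover, the sketch below contrasts plain SGD with Adam on a noisy toy quadratic; the paper itself evaluates these methods on attention-based neural machine translation models, not on this toy problem.

    # Compare plain SGD and Adam on a noisy quadratic objective ||x||^2.
    import numpy as np

    def grad(x, rng):
        return 2.0 * x + rng.normal(0, 0.1, size=x.shape)   # noisy gradient

    def run_sgd(steps=500, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        x = np.ones(10)
        for _ in range(steps):
            x -= lr * grad(x, rng)
        return float(np.sum(x ** 2))

    def run_adam(steps=500, lr=0.05, b1=0.9, b2=0.999, eps=1e-8, seed=0):
        rng = np.random.default_rng(seed)
        x, m, v = np.ones(10), np.zeros(10), np.zeros(10)
        for t in range(1, steps + 1):
            g = grad(x, rng)
            m = b1 * m + (1 - b1) * g                      # first-moment estimate
            v = b2 * v + (1 - b2) * g ** 2                 # second-moment estimate
            x -= lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + eps)
        return float(np.sum(x ** 2))

    print("final loss, SGD :", run_sgd())
    print("final loss, Adam:", run_adam())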

