Bifurcation and stability analysis for 3SPS+1PS parallel hip joint manipulator based on unified theory

Author(s):  
Songtao Wang ◽  
Gang Cheng ◽  
Jianhua Yang ◽  
Xihui Chen

For a parallel hip joint manipulator, unified kinematics and stiffness models are established on the basis of a novel unified theory, and the bifurcation and stability are then analyzed within the same unified framework. In the bifurcation analysis, a chaos method is first applied to solve the nonlinear bifurcation equations and obtain the full configuration of the parallel hip joint manipulator, which improves the convergence rate and accuracy. Based on the full-configuration solution, single-parameter and double-parameter simulations of the bifurcation and stability of the parallel hip joint manipulator are performed. The bifurcation simulation results show that once the configuration of the parallel hip joint manipulator lies on a given path, it evolves only along that path and cannot jump to other paths. The stability simulation results show that when the parallel hip joint manipulator enters the uncontrolled domain of a bifurcation posture along different paths, the posture component that changes most dramatically loses control first, and the other posture components move along the changed configuration. In this paper, the kinematics, stiffness, bifurcation and stability of the parallel hip joint manipulator are solved within the same theoretical framework, which improves solution efficiency and enriches the mechanical theory of parallel manipulators.
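As a hedged illustration of the single-parameter bifurcation scan described above, the Python sketch below traces solution branches of a toy scalar equation by numerical continuation, reseeding each solve with the previous branch point. The equation g(x, λ) = x³ − λx and the sweep range are assumed stand-ins, not the 3SPS+1PS posture equations.

```python
# Minimal single-parameter bifurcation scan (illustrative only).
# g(x, lam) = x**3 - lam*x stands in for the manipulator's nonlinear
# bifurcation equations; branches are followed by numerical continuation,
# seeding each solve with the previous solution on the same path.
import numpy as np
from scipy.optimize import fsolve

def g(x, lam):
    return x**3 - lam * x

lams = np.linspace(2.0, -1.0, 301)       # sweep the bifurcation parameter downward
seeds = [0.0, 1.5, -1.5]                 # one initial guess per candidate branch
branches = {s: [] for s in seeds}

for s in seeds:
    x_prev = s
    for lam in lams:
        x_sol, = fsolve(g, x_prev, args=(lam,))
        branches[s].append((lam, x_sol))
        x_prev = x_sol                   # path-following: stay on the current branch

# Each branch list holds (parameter, equilibrium) pairs; the two outer
# branches merge into the central one at lam = 0, illustrating that a
# configuration follows its own path until a bifurcation point is reached.
```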

2015 ◽  
Vol 7 (4) ◽  
Author(s):  
Gang Cheng ◽  
Song-tao Wang ◽  
De-hua Yang ◽  
Jian-hua Yang

This paper presents a finite element method (FEM) for the kinematic solution of parallel manipulators (PMs), and the approach is applied to analyze the kinematics of a parallel hip joint manipulator (PHJM). The analysis and simulation results indicate that the FEM yields accurate kinematic results for the PHJM, and the solution process shows that the FEM can handle nonlinear and linear kinematic problems within the same mathematical framework, which provides a theoretical basis for establishing an integrated model among the different parameter models of the PHJM.
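The paper's FEM formulation of the PHJM is not reproduced here; as a hedged illustration of the kind of nonlinear kinematic problem such a unified numerical framework must handle, the sketch below solves the forward kinematics of a simple planar two-leg parallel mechanism by Newton iteration on its loop-closure equations. The geometry and leg lengths are assumptions.

```python
# Hedged sketch: forward kinematics of a planar 2-leg parallel mechanism
# solved by Newton iteration on the loop-closure equations.  This is NOT
# the paper's FEM formulation of the PHJM; it only illustrates a generic
# nonlinear kinematic solution of the type discussed above.
import numpy as np

A1 = np.array([0.0, 0.0])      # base anchor of leg 1 (assumed geometry)
A2 = np.array([2.0, 0.0])      # base anchor of leg 2 (assumed geometry)

def closure(p, L1, L2):
    """Residual of the two leg-length constraints for platform point p."""
    return np.array([np.dot(p - A1, p - A1) - L1**2,
                     np.dot(p - A2, p - A2) - L2**2])

def jacobian(p):
    return np.array([2.0 * (p - A1), 2.0 * (p - A2)])

def forward_kinematics(L1, L2, p0=np.array([1.0, 1.0]), tol=1e-10):
    p = p0.copy()
    for _ in range(50):
        r = closure(p, L1, L2)
        if np.linalg.norm(r) < tol:
            break
        p = p - np.linalg.solve(jacobian(p), r)   # Newton step
    return p

print(forward_kinematics(1.5, 1.5))   # platform position for equal leg lengths
```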


Author(s):  
Godfrey C. Hoskins ◽  
V. Williams ◽  
V. Allison

The method demonstrated is an adaptation of a proven procedure for accurately determining the magnification of light photomicrographs. Because of the stability of modern electrical lenses, the method is shown to be directly applicable for providing precise reproducibility of magnification in various models of electron microscopes. A readily recognizable area of a carbon replica of a crossed-line diffraction grating is used as a standard. The same area of the standard was photographed in Philips EM 200, Hitachi HU-11B2, and RCA EMU 3F electron microscopes at taps representative of the range of magnification of each. Negatives from one microscope were selected as guides and printed at convenient magnifications; negatives from each of the other microscopes were then projected to register with these prints. By deferring measurement to the print rather than comparing negatives, correspondence of the magnification of the specimen in the three microscopes could be brought to within 2%.
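The magnification follows directly from the known line spacing of the grating replica. As a hedged numerical illustration (the abstract does not state the grating used, so a common 2,160 lines/mm replica and a 9.26 mm print spacing are assumed):

```latex
M \;=\; \frac{d_{\text{print}}}{d_{\text{grating}}},
\qquad
d_{\text{grating}} \;=\; \frac{1\ \text{mm}}{2160} \;\approx\; 0.463\ \mu\text{m},
\qquad
\frac{9.26\ \text{mm}}{0.463\ \mu\text{m}} \;\approx\; 20\,000\times .
```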


Author(s):  
Supriya Raheja

Background: Extending CPU schedulers with fuzzy logic has proven beneficial because of its unique capability of handling imprecise information. However, other generalized forms of fuzzy sets can be used to further improve scheduler performance. Objectives: This paper introduces a novel approach to designing an intuitionistic fuzzy inference system for a CPU scheduler. Methods: The proposed inference system is implemented with a priority scheduler. The proposed scheduler can dynamically handle the impreciseness of both the priority and the estimated execution time, and it makes the system adaptive based on continuous feedback. It is also capable of scheduling tasks according to dynamically generated priorities. To demonstrate the performance of the proposed scheduler, a simulation environment was implemented and its performance was compared with three baseline schedulers (a conventional priority scheduler, a fuzzy-based priority scheduler and a vague-set-based priority scheduler). Results: The proposed scheduler is also compared with the shortest-job-first CPU scheduler, which is known to be an optimal reference among schedulers. Conclusion: The simulation results demonstrate the effectiveness and efficiency of the intuitionistic fuzzy based priority scheduler. Moreover, it provides near-optimal results, as its results are comparable to those of shortest job first.
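The paper's rule base and defuzzification are not given here; the sketch below only illustrates how an intuitionistic fuzzy value (membership, non-membership, hesitation) might be attached to a task's priority and burst estimate and combined into a dynamic rank. All membership functions, weights and the score function are assumptions for illustration.

```python
# Hedged sketch of an intuitionistic fuzzy priority evaluation.
# Membership/non-membership functions and the aggregation rule are
# assumptions, not the paper's inference system.
from dataclasses import dataclass

@dataclass
class IFV:                      # intuitionistic fuzzy value
    mu: float                   # membership degree
    nu: float                   # non-membership degree
    @property
    def pi(self):               # hesitation degree
        return 1.0 - self.mu - self.nu

def fuzzify_priority(p, p_max=10):
    """Map a crisp priority (smaller = more urgent) to an IFV for 'urgent'."""
    mu = max(0.0, 1.0 - p / p_max)
    nu = max(0.0, (p / p_max) - 0.1)        # leave a small hesitation margin
    return IFV(mu, min(nu, 1.0 - mu))

def fuzzify_burst(t, t_max=100):
    """Map an estimated execution time to an IFV for 'short'."""
    mu = max(0.0, 1.0 - t / t_max)
    nu = max(0.0, (t / t_max) - 0.1)
    return IFV(mu, min(nu, 1.0 - mu))

def dynamic_rank(priority, burst, w=0.6):
    """Aggregate the two IFVs into one score (higher = schedule first)."""
    a, b = fuzzify_priority(priority), fuzzify_burst(burst)
    score = lambda v: v.mu - v.nu + 0.5 * v.pi      # simple assumed score function
    return w * score(a) + (1.0 - w) * score(b)

tasks = [("T1", 2, 40), ("T2", 7, 10), ("T3", 4, 80)]   # (name, priority, burst)
print(sorted(tasks, key=lambda t: dynamic_rank(t[1], t[2]), reverse=True))
```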


Author(s):  
Richard Dagger

This book aims to develop a unified theory of political obligation and the justification of punishment that takes its bearings from the principle of fair play. Much has been written on each of these subjects, of course, including numerous essays in recent years that approach one or the other topic in fair-play terms. However, there has been no sustained effort to link the two in a fair-play theory of political obligation and punishment. This book undertakes such an effort. This introduction explains why such a theory is an attractive possibility and how the argument for it unfolds in the succeeding chapters.


2020 ◽  
Vol 12 (7) ◽  
pp. 2767 ◽  
Author(s):  
Víctor Yepes ◽  
José V. Martí ◽  
José García

The optimization of cost and CO2 emissions in earth-retaining walls is of relevance, since these structures are often used in civil engineering. The optimization of costs is essential for the competitiveness of the construction company, and the optimization of emissions is relevant to the environmental impact of construction. To address the optimization, black hole metaheuristics were used, along with a discretization mechanism based on min–max normalization. The stability of the algorithm was evaluated with respect to the solutions obtained, and the steel and concrete quantities obtained in both optimizations were analyzed. Additionally, the geometric variables of the structure were compared. Finally, the results were compared with those of another algorithm applied to the same problem. The results show that there is a trade-off between the use of steel and concrete: the solutions that minimize CO2 emissions favor the use of concrete more than those that minimize cost. On the other hand, when the geometric variables are compared, most remain similar in both optimizations except for the distance between buttresses. The comparison with the other algorithm shows the good optimization performance of the black hole algorithm.
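A hedged sketch of the black hole metaheuristic with a min–max normalization discretization step follows. The objective function, variable catalogue and parameters are placeholders, not the buttressed-wall model or cost/emission data of the paper.

```python
# Hedged sketch of the black hole metaheuristic with min-max
# normalization discretization (placeholders throughout).
import numpy as np

rng = np.random.default_rng(0)
catalogue = np.linspace(0.2, 3.0, 29)        # allowed discrete values per variable
n_vars, n_stars, n_iter = 4, 20, 200

def discretize(x, lo, hi):
    """Min-max normalize each coordinate to [0, 1], then snap to the catalogue."""
    z = (x - lo) / (hi - lo)
    idx = np.clip(np.round(z * (len(catalogue) - 1)).astype(int), 0, len(catalogue) - 1)
    return catalogue[idx]

def cost(x):                                  # placeholder objective (e.g. cost or CO2)
    return np.sum((x - 1.3) ** 2) + 0.1 * np.sum(x)

lo, hi = catalogue[0], catalogue[-1]
stars = rng.uniform(lo, hi, size=(n_stars, n_vars))
fitness = np.array([cost(discretize(s, lo, hi)) for s in stars])
bh = stars[np.argmin(fitness)].copy()         # best star becomes the black hole

for _ in range(n_iter):
    stars += rng.random((n_stars, 1)) * (bh - stars)        # pull stars toward the BH
    fitness = np.array([cost(discretize(s, lo, hi)) for s in stars])
    if fitness.min() < cost(discretize(bh, lo, hi)):
        bh = stars[np.argmin(fitness)].copy()               # a star overtakes the BH
    radius = cost(discretize(bh, lo, hi)) / fitness.sum()   # event horizon
    absorbed = np.linalg.norm(stars - bh, axis=1) < radius
    stars[absorbed] = rng.uniform(lo, hi, size=(absorbed.sum(), n_vars))

print("best design:", discretize(bh, lo, hi), "cost:", cost(discretize(bh, lo, hi)))
```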


2021 ◽  
Vol 13 (7) ◽  
pp. 3744
Author(s):  
Mingcheng Zhu ◽  
Shouqian Li ◽  
Xianglong Wei ◽  
Peng Wang

Fishbone-shaped dikes are built on soft soil submerged in water, and soft-foundation settlement plays a key role in their stability. In this paper, a novel and simple approach is proposed to predict the soft-foundation settlement of fishbone dikes using the extreme learning machine, a single-hidden-layer feedforward network with high regression and classification accuracy. The data-driven settlement prediction models were built from a small training sample and trained with a fast learning speed. Comparisons of the measured and predicted data showed that the proposed models had good prediction performance. Furthermore, the final settlement of the dike was predicted with the models, and the stability of the soft foundation of the fishbone-shaped dikes was assessed based on the simulation results. The findings suggest that the extreme learning machine can be an effective tool for soft-foundation settlement prediction and assessment of fishbone-shaped dikes.
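A minimal sketch of an extreme learning machine regressor of the kind described above is given below; the synthetic settlement curve, hidden-layer size and activation are assumptions, not the paper's monitoring data.

```python
# Hedged sketch of an extreme learning machine (ELM) regressor.
# Random, fixed input weights; output weights solved in closed form.
import numpy as np

rng = np.random.default_rng(42)

def elm_train(X, y, n_hidden=20):
    """Single-hidden-layer feedforward net: random input weights,
    output weights obtained by least squares via the pseudoinverse."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed)
    b = rng.normal(size=n_hidden)                 # random biases (fixed)
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                  # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy monitoring data: elapsed time (days) -> cumulative settlement (mm).
t = np.linspace(0, 300, 40)[:, None]
s = 80 * (1 - np.exp(-t / 90)).ravel() + rng.normal(0, 1.0, 40)   # synthetic curve

W, b, beta = elm_train(t / 300.0, s)              # scale inputs to [0, 1]
t_future = np.array([[360.0 / 300.0]])
print("predicted settlement at day 360:", elm_predict(t_future, W, b, beta))
```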


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2347
Author(s):  
Yanyan Wang ◽  
Lin Wang ◽  
Ruijuan Zheng ◽  
Xuhui Zhao ◽  
Muhua Liu

In smart homes, the computational offloading technology of edge cloud computing (ECC) can effectively handle the large amount of computation generated by smart devices. In this paper, we propose a back-pressure-based computational offloading strategy for minimizing delay (BMDCO) that determines the offloading decision and the number of tasks that can be offloaded. Specifically, we first construct a system with multiple local smart-device task queues and multiple edge-processor task queues. Then, we formulate an offloading strategy that minimizes the task queue length in each time slot by solving a Lyapunov drift minimization problem, so as to keep the queues stable and improve the offloading performance. In addition, we give a theoretical analysis of the stability of the BMDCO algorithm by deriving an upper bound on all queue lengths in the system. The simulation results confirm the stability of the proposed algorithm and demonstrate that BMDCO is superior to the alternatives: compared with other algorithms, it effectively reduces the computation delay.
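The full BMDCO drift formulation is not reproduced here; the sketch below only illustrates a back-pressure style offloading rule, where each device pushes tasks toward the edge queue with the largest positive backlog differential in every time slot. Arrival rates, service rates and the amount moved per slot are placeholders.

```python
# Hedged sketch of a back-pressure style offloading decision per time slot.
# All rates and parameters are placeholders; the paper's Lyapunov
# drift-plus-penalty optimization is not reproduced here.
import numpy as np

rng = np.random.default_rng(1)
n_devices, n_edges, n_slots = 4, 2, 1000
Q_local = np.zeros(n_devices)        # local task queues (smart devices)
Q_edge = np.zeros(n_edges)           # edge processor task queues
mu_local, mu_edge = 1.0, 3.0         # tasks served per slot (assumed)

for _ in range(n_slots):
    Q_local += rng.poisson(1.5, n_devices)           # new tasks at each device
    for i in range(n_devices):
        # Back-pressure rule: offload toward the edge with the largest
        # positive backlog differential Q_local[i] - Q_edge[j].
        j = np.argmin(Q_edge)
        diff = Q_local[i] - Q_edge[j]
        if diff > 0:
            moved = min(Q_local[i], max(1.0, diff / 2))
            Q_local[i] -= moved
            Q_edge[j] += moved
    Q_local = np.maximum(Q_local - mu_local, 0)      # local service
    Q_edge = np.maximum(Q_edge - mu_edge, 0)         # edge service

print("final backlog averaged over queues:",
      (Q_local.sum() + Q_edge.sum()) / (n_devices + n_edges))
```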


2021 ◽  
Vol 5 (2) ◽  
pp. 32
Author(s):  
Esmehan Uçar ◽  
Sümeyra Uçar ◽  
Fırat Evirgen ◽  
Necati Özdemir

Mobile phone worms are computer viruses that can take control of cell phones by exploiting their flaws and that are transmitted from one device to another in increasing numbers. Today, one of the services increasingly used to circulate these malicious worms is SMS. The differences between computers and mobile devices mean that existing propagation models for computer worms cannot be applied directly in mobile networks, and this is particularly true for the SMS framework. The susceptible–affected–infectious–suspended–recovered (SAIDR) model with a classical derivative was proposed by Xiao et al. (2017) in order to correctly estimate the spread of worms by means of SMS. Building on the SAIDR model, this study is the first to formulate a fractional SAIDR model using the Atangana–Baleanu (AB) derivative. The existence and uniqueness of the model solutions, together with the stability analysis, are established through the Banach fixed point theorem. The special solution of the model is investigated using the Laplace transform, and a set of numerical plots is presented for varying fractional order θ to show the effectiveness of the fractional derivative.
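The fractional SAIDR compartment equations themselves are not reproduced here; for reference, the Atangana–Baleanu derivative in the Caputo sense, which replaces the classical derivative in each compartment equation, has the standard form below, where B(θ) is a normalizing function with B(0) = B(1) = 1 and E_θ is the Mittag-Leffler function:

```latex
{}^{ABC}_{\;\;a}D_{t}^{\theta} f(t)
  \;=\; \frac{B(\theta)}{1-\theta}
    \int_{a}^{t} f'(s)\,
    E_{\theta}\!\left(-\frac{\theta\,(t-s)^{\theta}}{1-\theta}\right) \mathrm{d}s,
\qquad 0 < \theta < 1 .
```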


Games ◽  
2021 ◽  
Vol 12 (3) ◽  
pp. 53
Author(s):  
Roberto Rozzi

We consider an evolutionary model of social coordination in a 2 × 2 game where two groups of players prefer to coordinate on different actions. Players can pay a cost to learn their opponent's group: if they pay it, they can condition their actions on the opponent's group. We assess the long-run stability of outcomes using stochastic stability analysis. We find that three elements matter for equilibrium selection: group size, the strength of preferences, and the cost of information. If the cost is too high, players never learn the group of their opponents in the long run. If one group has stronger preferences for its favorite action than the other, or is sufficiently large compared to the other group, every player plays that group's favorite action. If both groups have strong enough preferences, or if neither group is large enough, players play their own favorite actions and miscoordinate in inter-group interactions. Lower levels of the cost favor coordination: when the cost is low, players always coordinate on their favorite action in within-group interactions, while in inter-group interactions they coordinate on the favorite action of the group that has stronger preferences or is large enough.
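An illustrative payoff structure for such a game (the paper's exact parameterization may differ) is shown below: group-1 players prefer coordinating on A with strength α > 1, group-2 players prefer B with strength β > 1, miscoordination pays 0, and a cost c is subtracted from the stage payoff of a player who chooses to observe the opponent's group.

```latex
\text{Group-1 player:}\quad
\begin{array}{c|cc}
   & A & B \\ \hline
 A & \alpha & 0 \\
 B & 0 & 1
\end{array}
\qquad\qquad
\text{Group-2 player:}\quad
\begin{array}{c|cc}
   & A & B \\ \hline
 A & 1 & 0 \\
 B & 0 & \beta
\end{array}
```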


2011 ◽  
Vol 58-60 ◽  
pp. 1018-1024
Author(s):  
Feng Ye ◽  
Gui Chen Xu ◽  
Di Kang Zhu

This paper reviews several current methods of calculating buffers, pointing out the merits and pitfalls of each, and then introduces a Bayesian statistical approach to the CCS/BM domain to calculate the size of the project buffer, in order to overcome the subjectivity of current buffer calculation methods and their lack of practical application. In Crystal Ball, we compare the simulation results of the implementation process against the C&PM, RESM and SM benchmarks. The results show that the buffer obtained with this method can ensure a stable project completion probability, and that the method has great flexibility.
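The paper's Bayesian prior/posterior treatment and the C&PM, RESM and SM benchmarks are not reproduced here; the sketch below only shows the Monte Carlo core of the kind of simulation Crystal Ball performs, sizing the project buffer as the gap between a target percentile of the simulated chain duration and the sum of aggressive (50%) task estimates. Task distributions and the 90% target are assumptions.

```python
# Hedged sketch of Monte Carlo project-buffer sizing (placeholders only).
import numpy as np

rng = np.random.default_rng(7)

# Critical-chain tasks: (median duration in days, sigma of log-duration)
tasks = [(10, 0.3), (15, 0.4), (8, 0.25), (20, 0.5)]
n_sims = 20_000

durations = np.zeros(n_sims)
for median, sigma in tasks:
    durations += rng.lognormal(mean=np.log(median), sigma=sigma, size=n_sims)

aggressive_total = sum(m for m, _ in tasks)     # sum of 50% (aggressive) estimates
target = np.percentile(durations, 90)           # chain length met 90% of the time
project_buffer = target - aggressive_total

print(f"aggressive chain: {aggressive_total} d, "
      f"90th percentile: {target:.1f} d, buffer: {project_buffer:.1f} d")
```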

