An Efficient Method for Synthesizing Crank-Rocker Mechanisms for Generating Low Harmonic Curves

Author(s):  
Jun Wu ◽  
Q. J. Ge ◽  
Feng Gao

This paper develops an efficient method for synthesizing crank-rocker mechanisms capable of generating perceptually simple and smooth paths that can be approximated by the first and second harmonics of a Fourier series. Through harmonic analysis of the loop-closure equations of the crank-rocker mechanism, analytical relations among the nine design variables are identified. This reduces the dimension of the search space to two and thereby greatly speeds up the synthesis process.
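As an illustration of the low-harmonic idea (not the paper's synthesis procedure), the sketch below measures how much of a closed coupler curve's Fourier energy falls within the first two harmonics; the sampling scheme and all names are assumptions made for the example.

```python
import cmath
import math

def harmonic_energy_ratio(points, k_max=2):
    """Fraction of a closed curve's Fourier energy carried by harmonics
    |k| <= k_max.  `points` are complex samples z_j = x_j + i*y_j taken
    at equal increments of the crank angle; the DC term (the curve's
    centroid) is excluded from both sums."""
    n = len(points)
    energy_low = energy_total = 0.0
    for k in range(-(n // 2), n - n // 2):
        c = sum(z * cmath.exp(-2j * math.pi * k * j / n)
                for j, z in enumerate(points)) / n
        if k == 0:
            continue
        energy_total += abs(c) ** 2
        if abs(k) <= k_max:
            energy_low += abs(c) ** 2
    return energy_low / energy_total if energy_total else 1.0

# An ellipse traced at constant angular speed contains only the +/-1
# harmonics, so essentially all of its energy lies in the first two.
ellipse = [complex(3 * math.cos(2 * math.pi * j / 64),
                   math.sin(2 * math.pi * j / 64)) for j in range(64)]
ratio = harmonic_energy_ratio(ellipse)
```

A ratio close to 1 indicates a "low harmonic" curve of the kind the synthesis targets.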

2012 ◽  
Vol 49 (2) ◽  
pp. 285-327 ◽  
Author(s):  
RUI P. CHAVES

Subject phrases impose particularly strong constraints on extraction. Most research assumes a syntactic account (e.g. Kayne 1983, Chomsky 1986, Rizzi 1990, Lasnik & Saito 1992, Takahashi 1994, Uriagereka 1999), but there are also pragmatic accounts (Erteschik-Shir & Lappin 1979; Van Valin 1986, 1995; Erteschik-Shir 2006, 2007) as well as performance-based approaches (Kluender 2004). In this work I argue that none of these accounts captures the full range of empirical facts, and show that subject and adjunct phrases (phrasal or clausal, finite or otherwise) are by no means impermeable to non-parasitic extraction of nominal, prepositional and adverbial phrases. The present empirical reassessment indicates that the phenomena involving subject and adjunct islands defy the formulation of a general grammatical account. Drawing on insights by Engdahl (1983) and Kluender (2004), I argue that subject island effects have a functional explanation. Independently motivated pragmatic and processing limitations cause subject-internal gaps to be heavily dispreferred, and therefore, extremely infrequent. In turn, this has led to heuristic parsing expectations that preempt subject-internal gaps and therefore speed up processing by pruning the search space of filler–gap dependencies. Such expectations cause processing problems when violated, unless they are dampened by prosodic and pragmatic cues that boost the construction of the correct parse. This account predicts subject islands and their (non-)parasitic exceptions.


Author(s):  
Karem A. Sakallah

Symmetry is at once a familiar concept (we recognize it when we see it!) and a profoundly deep mathematical subject. At its most basic, a symmetry is some transformation of an object that leaves the object (or some aspect of the object) unchanged. For example, a square can be transformed in eight different ways that leave it looking exactly the same: the identity “do-nothing” transformation, 3 rotations, and 4 mirror images (or reflections). In the context of decision problems, the presence of symmetries in a problem’s search space can frustrate the hunt for a solution by forcing a search algorithm to fruitlessly explore symmetric subspaces that do not contain solutions. Recognizing that such symmetries exist, we can direct a search algorithm to look for solutions only in non-symmetric parts of the search space. In many cases, this can lead to significant pruning of the search space and yield solutions to problems which are otherwise intractable. This chapter explores the symmetries of Boolean functions, particularly the symmetries of their conjunctive normal form (CNF) representations. Specifically, it examines what those symmetries are, how to model them using the mathematical language of group theory, how to derive them from a CNF formula, and how to utilize them to speed up CNF SAT solvers.
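The pruning effect described above can be shown with a toy exhaustive SAT search; the formula, the lex-leader rule (valid only because the example formula is invariant under every variable permutation), and all names are illustrative assumptions, not the chapter's machinery.

```python
from itertools import product

def brute_sat(cnf, n, lex_leader_only=False):
    """Exhaustive SAT check over n variables.  Literal +i / -i means
    variable i is true / false.  With lex_leader_only, assignments that
    are not sorted, i.e. not the lexicographic leaders of their orbit
    under all variable permutations, are skipped."""
    tried = 0
    for assign in product([False, True], repeat=n):
        if lex_leader_only and list(assign) != sorted(assign):
            continue  # a symmetric image of this assignment is checked instead
        tried += 1
        if all(any(assign[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in cnf):
            return True, tried
    return False, tried

# Every pair of variables must contain a true and a false literal:
# unsatisfiable for 3 variables, and fully symmetric under permutation.
cnf = [[1, 2], [1, 3], [2, 3], [-1, -2], [-1, -3], [-2, -3]]
full = brute_sat(cnf, 3)
pruned = brute_sat(cnf, 3, lex_leader_only=True)
```

Both searches agree that the formula is unsatisfiable, but the symmetry-aware one examines half as many assignments.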


Author(s):  
Otokar Grošek ◽  
Pavol Zajac

Classical ciphers are used to encrypt plaintext messages written in a natural language so that they are readable only by the sender or the intended recipient. Many classical ciphers can be broken by brute-force search through the key space. Methods of artificial intelligence, such as optimization heuristics, can be used to narrow the search space and to speed up text processing and text recognition in the cryptanalytic process. Here we present a broad overview of different AI techniques usable in the cryptanalysis of classical ciphers. Specific methods to effectively recognize the correctly decrypted text among many possible decrypts are discussed in the follow-up part, Automated cryptanalysis – Language processing.
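As a minimal illustration of recognizing the correct decrypt with a fitness function (a simple stand-in for the AI techniques surveyed), the sketch below breaks a Caesar cipher by scoring all 26 candidate decrypts against English letter statistics; the frequency table and all names are assumptions made for the example.

```python
from string import ascii_lowercase

# Approximate English letter frequencies.
FREQ = dict(zip(ascii_lowercase,
                [.082, .015, .028, .043, .127, .022, .020, .061, .070,
                 .002, .008, .040, .024, .067, .075, .019, .001, .060,
                 .063, .091, .028, .010, .024, .002, .020, .001]))

def shift_text(text, k):
    """Caesar-shift every letter by k positions; leave other chars alone."""
    return ''.join(ascii_lowercase[(ascii_lowercase.index(c) + k) % 26]
                   if c in ascii_lowercase else c for c in text)

def chi_squared(text):
    """How far the letter statistics of `text` are from English."""
    letters = [c for c in text if c in ascii_lowercase]
    n = len(letters)
    return sum((letters.count(c) - FREQ[c] * n) ** 2 / (FREQ[c] * n)
               for c in ascii_lowercase)

def break_caesar(ciphertext):
    """Return the decryption shift whose decrypt looks most like English."""
    return min(range(26), key=lambda k: chi_squared(shift_text(ciphertext, k)))

plaintext = ("the search space of a classical cipher can often be pruned by "
             "scoring every candidate decrypt against the letter statistics "
             "of the target language")
ciphertext = shift_text(plaintext, 7)
recovered = break_caesar(ciphertext)
```

The fitness function replaces manual inspection of every key, which is exactly the kind of text-recognition speed-up the overview discusses.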


2018 ◽  
Vol 24 (3) ◽  
pp. 351-366
Author(s):  
Marcos Aurélio Basso ◽  
Daniel Rodrigues dos Santos

Abstract In this paper, we present a method for 3D mapping of indoor environments using RGB-D data. The contribution of the proposed method is two-fold. First, it combines the speeded-up robust features (SURF) algorithm with a disparity-to-plane model in a coarse-to-fine registration procedure. Because the coarse-to-fine registration task accumulates errors, the same features can appear at two different locations of the map; this is known as the loop closure problem. We therefore propose using the variance-covariance matrix describing the uncertainty of the transformation parameters (3D rotation and 3D translation) for view-based loop closure detection, followed by graph-based optimization, to obtain a consistent 3D indoor map. To demonstrate and evaluate the effectiveness of the proposed method, experimental datasets obtained in three indoor environments with different levels of detail are used. The experimental results show that the proposed framework can create 3D indoor maps with an error of 11.97 cm in object space, corresponding to a positional imprecision of around 1.5% over the 9 m distance travelled by the sensor.


2020 ◽  
Author(s):  
Fulei Ji ◽  
Wentao Zhang ◽  
Tianyou Ding

Abstract Automatic search methods have been widely used in the cryptanalysis of block ciphers, especially for the most classic techniques: differential and linear cryptanalysis. However, automatic search methods, whether based on MILP, SMT/SAT or CP techniques, can be inefficient when the search space is too large. In this paper, we propose three new methods to improve Matsui's branch-and-bound search algorithm, which is known as the first generic algorithm for finding the best differential and linear trails. The three methods, namely reconstructing the DDT and LAT according to weight, executing linear layer operations at minimal cost, and merging two 4-bit S-boxes into one 8-bit S-box, speed up the search by reducing the search space as much as possible and by reducing the cost of executing linear layer operations. We apply our improved algorithm to DESL and GIFT, which remain hard instances for automatic search methods. As a result, we find the best differential trails for DESL (up to 14 rounds) and GIFT-128 (up to 19 rounds). The best linear trails for DESL (up to 16 rounds), GIFT-128 (up to 10 rounds) and GIFT-64 (up to 15 rounds) are also found. To the best of our knowledge, these security bounds for DESL and GIFT in the single-key scenario are given for the first time, and these are the longest exploitable (differential or linear) trails known for DESL and GIFT. Furthermore, benefiting from the efficiency of the improved algorithm, we run experiments demonstrating that the clustering effect of differential trails is weak for both 13-round DES and DESL.
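The bounding idea behind Matsui's branch-and-bound can be sketched on a toy one-round weight table standing in for a DDT; this is a deliberate simplification of the generic algorithm, not the paper's improved variant, and all names and weights are assumptions.

```python
import itertools

# Toy one-round transition weights (a stand-in for a DDT):
# W[s][t] is the weight of moving from difference s to difference t.
W = [[2, 5, 9, 9],
     [9, 1, 4, 9],
     [9, 9, 2, 3],
     [3, 9, 9, 2]]
N_STATES, ROUNDS = 4, 5
BEST_ROUND = min(min(row) for row in W)   # optimistic per-round bound

def search(state, rnd, weight, best):
    """Matsui-style DFS: abandon a branch as soon as even a best-case
    completion of the trail cannot beat the best trail found so far."""
    if rnd == ROUNDS:
        return min(best, weight)
    if weight + (ROUNDS - rnd) * BEST_ROUND >= best:
        return best  # prune: remaining rounds cannot improve on `best`
    for nxt in range(N_STATES):
        best = search(nxt, rnd + 1, weight + W[state][nxt], best)
    return best

pruned = min(search(s, 0, 0, float("inf")) for s in range(N_STATES))
exhaustive = min(sum(W[p[i]][p[i + 1]] for i in range(ROUNDS))
                 for p in itertools.product(range(N_STATES),
                                            repeat=ROUNDS + 1))
```

Because the per-round bound is optimistic, pruning never discards the optimal trail, so the pruned search returns the same minimum weight as full enumeration.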


2014 ◽  
Vol 24 (4) ◽  
pp. 901-916
Author(s):  
Zoltán Ádám Mann ◽  
Tamás Szép

Abstract Backtrack-style exhaustive search algorithms for NP-hard problems tend to have large variance in their runtime. This is because “fortunate” branching decisions can lead to finding a solution quickly, whereas “unfortunate” decisions in another run can lead the algorithm to a region of the search space with no solutions. In the literature, frequent restarting has been suggested as a means to overcome this problem. In this paper, we propose a more sophisticated approach: a best-first search heuristic that moves quickly between parts of the search space, always concentrating on the most promising region. We describe how this idea can be efficiently incorporated into a backtrack search algorithm without sacrificing optimality. Moreover, we demonstrate empirically that, for hard solvable problem instances, the new approach provides significantly higher speed-up than frequent restarting.
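A minimal sketch of best-first search over partial assignments, here applied to graph coloring with an assumed "promise" ordering (fewest distinct colors used, deepest partial assignment first); the problem, heuristic, and names are illustrative assumptions, not the paper's method.

```python
import heapq

def best_first_coloring(n, edges, k):
    """Color vertices 0..n-1 with at most k colors.  Instead of
    chronological backtracking, a priority queue always expands the
    most promising partial assignment on the frontier."""
    heap = [(0, 0, ())]          # (colors used, -depth, partial coloring)
    while heap:
        _, _, partial = heapq.heappop(heap)
        v = len(partial)         # next vertex to color
        if v == n:
            return partial
        for c in range(k):
            # edges are (low, high) pairs, so neighbors with w == v
            # are already colored in `partial`
            if all(partial[u] != c for u, w in edges if w == v):
                child = partial + (c,)
                heapq.heappush(heap, (len(set(child)), -len(child), child))
    return None

# A 5-cycle is not 2-colorable, so any proper coloring uses 3 colors.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]
coloring = best_first_coloring(5, edges, 3)
```

All extensions of every popped node are pushed, so the search remains complete: it cannot miss a solution, matching the paper's point that best-first traversal need not sacrifice optimality.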


2017 ◽  
Vol 2017 ◽  
pp. 1-13 ◽  
Author(s):  
Aushim Koumar ◽  
Tine Tysmans ◽  
Rajan Filomeno Coelho ◽  
Niels De Temmerman

We developed a fully automated multiobjective optimisation framework using genetic algorithms to generate a range of optimal barrel vault scissor structures. Compared to other optimisation methods, genetic algorithms are more robust and efficient for multiobjective optimisation problems: they provide a better view of the search space while reducing the chance of becoming stuck in a local minimum. The novelty of this work is the application and validation (using metrics) of genetic algorithms for the shape and size optimisation of scissor structures, which has not previously been done for two objectives. We tested the feasibility and capacity of the methodology by optimising a 6 m span barrel vault for weight and compactness, obtaining optimal solutions efficiently with NSGA-II. This paper presents the framework and the results of the case study. An in-depth analysis of the influence of the optimisation variables on the results yields new insights that can guide choices regarding the design variables, the constraints, and the number of individuals and generations needed to obtain a set of trade-off optimal solutions efficiently.
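At the core of any two-objective comparison such as weight versus compactness is non-dominated (Pareto) sorting, which NSGA-II applies repeatedly; the sketch below filters hypothetical (weight, compactness) pairs, with all values assumed purely for illustration.

```python
def pareto_front(designs):
    """Non-dominated subset for two minimised objectives, e.g.
    (weight, compactness).  A design is dominated when a distinct
    design is at least as good in both objectives (assumes the
    candidate designs are pairwise distinct)."""
    front = []
    for p in designs:
        dominated = any(q != p and q[0] <= p[0] and q[1] <= p[1]
                        for q in designs)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (weight, compactness) pairs for candidate structures.
designs = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
front = pareto_front(designs)
```

The surviving designs form the trade-off set: improving one objective from any of them necessarily worsens the other.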


2015 ◽  
Vol 137 (4) ◽  
Author(s):  
Jie Hu ◽  
Yan Wang ◽  
Aiguo Cheng ◽  
Zhihua Zhong

Intervals are an alternative to probability distributions for quantifying uncertainty in sensitivity analysis (SA) when there are not enough data to fit a distribution with good confidence; they require only lower and upper bounds. Analytical relations among design parameters, design variables, and target performances under uncertainty can be modeled as interval-valued constraints. By incorporating logic quantifiers, quantified constraint satisfaction problems (QCSPs) can integrate semantics and engineering intent in mathematical relations for engineering design. In this paper, a global sensitivity analysis (GSA) method is developed for feasible design space searching problems that are formulated as QCSPs, where the effects of value variations and quantifier changes for design parameters on target performances are analyzed based on several proposed metrics, including the indeterminacy of target performances, information gain of parameter variations, and infeasibility of constraints. Three examples are used to demonstrate the proposed approach.
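Interval-valued constraints propagate bounds rather than distributions; the following is a minimal interval-arithmetic sketch (the class, the example constraint, and the use of width as an indeterminacy measure are assumptions for illustration, not the paper's QCSP machinery).

```python
class Interval:
    """Closed interval [lo, hi] with just the arithmetic needed below."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # the product interval is bounded by the four endpoint products
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def width(self):
        return self.hi - self.lo

# A target performance y = x1*x2 + x2 as an interval-valued constraint:
# the width of y measures the indeterminacy induced by the parameter
# bounds, with no probability distribution assumed.
x1 = Interval(1.0, 2.0)
x2 = Interval(-1.0, 3.0)
y = x1 * x2 + x2
```

Tightening the bounds on either parameter shrinks the width of y, which is the kind of effect a GSA over interval parameters quantifies.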


2013 ◽  
Vol 2013 ◽  
pp. 1-8
Author(s):  
Ying-Shen Juang ◽  
Hsi-Chin Hsin ◽  
Tze-Yun Sung ◽  
Carlo Cattani

The wavelet packet transform, a substantial extension of the wavelet transform, has drawn considerable attention in visual applications. In this paper, we advocate using an adaptive wavelet packet transform for texture synthesis. The adaptive wavelet packet coefficients of an image are organized into hierarchical trees called adaptive wavelet packet trees, based on which an efficient algorithm is proposed to speed up the synthesis process, proceeding from the low-frequency tree nodes that represent the global characteristics of a texture to the high-frequency tree nodes that represent its local details. Experimental results show that the texture synthesis in adaptive wavelet packet trees (TSIAWPT) algorithm is suitable for a variety of textures and is preferable in terms of computation time.
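The tree structure can be illustrated with a Haar-filter wavelet packet decomposition of a 1D signal, where, unlike the plain wavelet transform, both the approximation and the detail branch are split at every level; the filter choice and all names are assumptions made for the example.

```python
def haar_split(x):
    """One Haar analysis step: (approximation, detail) at half length."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    dif = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg, dif

def packet_tree(x, depth):
    """Full wavelet packet tree: recursively split BOTH branches,
    so every frequency band gets its own subtree."""
    node = {'coeffs': x}
    if depth > 0:
        low, high = haar_split(x)
        node['low'] = packet_tree(low, depth - 1)
        node['high'] = packet_tree(high, depth - 1)
    return node

def leaves(node):
    """Leaf coefficient blocks, ordered from low to high frequency."""
    if 'low' not in node:
        return [node['coeffs']]
    return leaves(node['low']) + leaves(node['high'])

signal = [4.0, 2.0, 6.0, 8.0, 1.0, 3.0, 5.0, 7.0]
tree = packet_tree(signal, 2)
```

Traversing the tree from the low-frequency leaves toward the high-frequency ones mirrors the coarse-to-fine order in which the synthesis algorithm fills in global structure before local detail.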


2021 ◽  
Author(s):  
Hala A. Omar ◽  
Mohammed El-Shorbagy

Abstract The grasshopper optimization algorithm (GOA) is one of the promising algorithms for optimization problems, but it has the main drawback of becoming trapped in a local minimum, which causes slow convergence or failure to find a solution. Several modifications and combinations have been proposed to overcome this problem. In this paper, a modified grasshopper optimization algorithm (MGOA) based on the genetic algorithm (GA) is proposed. The modifications rely on certain mathematical assumptions and on varying the domain of the Cmax control parameter to escape from a local minimum and move the search to a new, improved point. The parameter c is one of the most important parameters in GOA, as it balances exploration and exploitation of the search space. These modifications aim to speed up the convergence rate by reducing repeated solutions and the number of iterations. The proposed algorithm is tested on 19 main test functions to verify and investigate the influence of the proposed modifications. In addition, it is applied to solve five cases of nonlinear systems with different dimensions and regularity to show its reliability and efficiency. Good results are achieved compared to the original GOA.
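The role of a shrinking comfort-zone coefficient can be sketched with a toy swarm whose coefficient c decays linearly from c_max to c_min, the schedule GOA uses to trade exploration for exploitation; the update rule, all parameter values, and names are assumptions for illustration, not the proposed MGOA.

```python
import random

def shrinking_zone_minimise(f, dim, bounds, agents=20, iters=300,
                            c_max=1.0, c_min=1e-4, seed=0):
    """Toy swarm illustrating a GOA-style comfort-zone coefficient:
    c decays linearly from c_max to c_min, so agents explore widely at
    first and exploit the incumbent best point later.  A deliberately
    simplified illustration, not GOA or MGOA itself."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(agents)]
    best = min(pop, key=f)[:]
    for t in range(iters):
        c = c_max - t * (c_max - c_min) / iters   # linear decay of c
        for p in pop:
            for d in range(dim):
                # contract toward the incumbent best, plus c-scaled noise
                p[d] = best[d] + c * (p[d] - best[d]) + c * rng.uniform(-1, 1)
                p[d] = min(hi, max(lo, p[d]))     # keep within bounds
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best

sphere = lambda x: sum(v * v for v in x)
best = shrinking_zone_minimise(sphere, dim=2, bounds=(-5.0, 5.0))
```

Because the noise term is scaled by c, large c early on lets agents escape poor regions, while small c late in the run concentrates samples near the best point found, which is the balance the c parameter controls in GOA.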

