Transmitter Positioning of Distributed Large-Scale Metrology Within Line-Less Mobile Assembly Systems

Author(s):  
Christoph Nicksch ◽  
Alexander K. Hüttner ◽  
Robert H. Schmitt

Abstract In Line-less Mobile Assembly Systems (LMAS), the mobilization of assembly resources and products enables rapid physical system reconfiguration, increasing flexibility and adaptability. The clean-floor approach discards fixed anchor points, so that assembly resources such as mobile robots and automated guided vehicles transporting products can adapt to new product requirements and form new assembly processes without specific layout restrictions. An associated challenge is spatial referencing between mobile resources and product tolerances: because fixed reference points are missing, more positioning data are needed to locate and navigate assembly resources. Distributed large-scale metrology systems can cover a wide shop-floor area and obtain positioning data from several resources simultaneously with uncertainties in the submillimeter range. Positioning the transmitter units of these systems is a demanding task that must take into account visibility during dynamic processes and configuration-dependent measurement uncertainty. This paper presents a novel approach to optimizing the position configuration of distributed large-scale metrology systems by minimizing the measurement uncertainty for dynamic assembly processes. For this purpose, a particle swarm optimization algorithm has been implemented. The results show that the algorithm determines suitable transmitter positions by finding global optima in the assembly-station search space, verified against a brute-force method in simulation.
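The optimization loop behind such an approach can be sketched as a generic particle swarm over candidate transmitter coordinates. This is not the paper's implementation: the quadratic cost below is a hypothetical stand-in for the configuration-dependent measurement uncertainty, which in the paper would also encode visibility during dynamic processes.

```python
import random

def pso(cost, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize `cost` over a box-bounded search space with a basic PSO."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # per-particle best positions
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp each coordinate to the assembly-station bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val_i = cost(pos[i])
            if val_i < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val_i
                if val_i < gbest_val:
                    gbest, gbest_val = pos[i][:], val_i
    return gbest, gbest_val

# Hypothetical stand-in for configuration-dependent measurement uncertainty:
# distance of one transmitter from an ideal vantage point at (5, 5) on a
# 10 m x 10 m shop floor.
random.seed(0)  # reproducibility
best, best_val = pso(lambda p: (p[0] - 5.0) ** 2 + (p[1] - 5.0) ** 2,
                     bounds=[(0.0, 10.0), (0.0, 10.0)])
```

A brute-force verification, as in the paper, would evaluate the same cost on a dense grid over the bounds and compare the minima.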

2012 ◽  
Vol 532-533 ◽  
pp. 1830-1835
Author(s):  
Ying Zhang ◽  
Bo Qin Liu ◽  
Han Rong Chen

Because super-high-dimension complex functions have large numbers of local and global optima, general Particle Swarm Optimizer (PSO) methods converge slowly and are easily trapped in local optima. In this paper, an Adaptive Particle Swarm Optimizer (APSO) is proposed that employs an adaptive inertia factor and dynamically changes the search space and velocity in each cycle, according to the fitness change of the swarm during optimization. This treats large-scale global search and refined local search as a whole, in order to quicken convergence, avoid premature convergence, economize computational expense, and obtain the global optimum. We test the proposed algorithm and compare it with other published methods on several super-high-dimension complex functions; the experimental results demonstrate that the revised algorithm converges rapidly to high-quality solutions.
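One common way to adapt the inertia factor from the swarm's fitness change can be sketched as below. This rule is an illustration of the general idea, not the specific update used in the cited APSO paper, and all constants are assumptions.

```python
def adapt_inertia(w, improved, w_min=0.4, w_max=0.9, shrink=0.99, grow=1.01):
    """Adaptive-inertia rule: shrink w when the swarm's best fitness improved
    this cycle (favor refined local search), grow it when progress stalls
    (favor wider global search). Constants are illustrative assumptions."""
    w = w * (shrink if improved else grow)
    return min(max(w, w_min), w_max)  # keep w inside [w_min, w_max]
```

The clamp to [w_min, w_max] prevents the factor from drifting into regimes where the swarm either diverges or freezes.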


2017 ◽  
Vol 1 (2) ◽  
pp. 66
Author(s):  
Muhammad Hussain Mughal ◽  
Zubair Ahmed Shaikh

The diversity of application domains compels the design of a sustainable classification scheme for a significantly growing software repository. Atomic, reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyzed the semantics of elements in the Periodic Table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search-space complexity of the software repository, and further refine it with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing functions and related components, exploiting the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Inspired by the periodic table, we propose a Software Periodic Table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to compose their software by customizing the ingredients of their software requirements. The classified repository of software ingredients helps users convey their requirements to software engineers and enables requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be continuously fine-tuned based on utilization, and the SPT will be gradually optimized with ant colony optimization techniques, ultimately helping to automate the software development process.


2019 ◽  
Author(s):  
Chem Int

This research work presents a facile and green route for the synthesis of silver sulfide nanoparticles (Ag2S NPs) from silver nitrate (AgNO3) and sodium sulfide nonahydrate (Na2S·9H2O) in the presence of an aqueous extract of rosemary leaves at ambient temperature (27 °C). Structural and morphological properties of the Ag2S NPs were analyzed by X-ray diffraction (XRD) and transmission electron microscopy (TEM). The surface plasmon resonance for the Ag2S NPs was observed around 355 nm. The Ag2S NPs were spherical in shape with an effective diameter of 14 nm. Our novel approach represents a promising and effective method for the large-scale synthesis of eco-friendly silver sulfide nanoparticles with antibacterial activity.


GigaScience ◽  
2020 ◽  
Vol 9 (12) ◽  
Author(s):  
Ariel Rokem ◽  
Kendrick Kay

Abstract Background: Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One challenge of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates, but efficient and appropriate selection of α can be challenging and becomes prohibitive when large amounts of data are analyzed. Because the selected α depends on the scale of the data and the correlations across predictors, it is also not straightforwardly interpretable. Results: The present work addresses these challenges through a novel approach to ridge regression. We propose to reparameterize ridge regression in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. We provide an algorithm that efficiently implements this approach, called fractional ridge regression, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems. On brain imaging data, we demonstrate that this approach delivers results that are straightforward to interpret and compare across models and datasets. Conclusion: Fractional ridge regression has several benefits: the solutions obtained for different γ are guaranteed to vary, guarding against wasted calculations, and they automatically span the relevant range of regularization, avoiding the need for arduous manual exploration. These properties make fractional ridge regression particularly suitable for the analysis of large, complex datasets.
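The reparameterization can be illustrated with a simplified sketch: given a target ratio γ, find the α whose ridge solution has L2-norm equal to γ times the norm of the unregularized solution. The bisection below is a naive stand-in for the published fracridge algorithm, which solves this far more efficiently for many γ values at once.

```python
import numpy as np

def fractional_ridge(X, y, gamma):
    """Find the ridge penalty alpha whose coefficient norm equals `gamma`
    times the norm of the unregularized (OLS) solution, by bisection on
    alpha. A simplified sketch of the fracridge idea, not its algorithm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y

    def coef_norm(alpha):
        # ||beta(alpha)|| from the SVD: beta = V diag(s/(s^2+alpha)) U^T y
        return np.linalg.norm(s * uty / (s ** 2 + alpha))

    target = gamma * coef_norm(0.0)      # alpha = 0 gives the OLS norm
    lo, hi = 0.0, 1.0
    while coef_norm(hi) > target:        # expand until the target is bracketed
        hi *= 10.0
    for _ in range(100):                 # bisection: the norm decreases in alpha
        mid = 0.5 * (lo + hi)
        if coef_norm(mid) > target:
            lo = mid
        else:
            hi = mid
    alpha = 0.5 * (lo + hi)
    beta = Vt.T @ (s * uty / (s ** 2 + alpha))
    return alpha, beta
```

Because γ is a scale-free ratio in [0, 1], it is directly comparable across models and datasets, which is the interpretability benefit the abstract describes.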


Author(s):  
Silvia Huber ◽  
Lars B. Hansen ◽  
Lisbeth T. Nielsen ◽  
Mikkel L. Rasmussen ◽  
Jonas Sølvsteen ◽  
...  

Author(s):  
Jin Zhou ◽  
Qing Zhang ◽  
Jian-Hao Fan ◽  
Wei Sun ◽  
Wei-Shi Zheng

Abstract Recent image aesthetic assessment methods have achieved remarkable progress due to the emergence of deep convolutional neural networks (CNNs). However, these methods focus primarily on predicting the generally perceived preference for an image, which limits their practicability, since each user may have completely different preferences for the same image. To address this problem, this paper presents a novel approach for predicting personalized image aesthetics that fit an individual user's personal taste. We achieve this in a coarse-to-fine manner, by joint regression and learning from pairwise rankings. Specifically, we first collect a small set of personal images from a user and invite him/her to rank the preference of some randomly sampled image pairs. We then search for the K-nearest neighbors of the personal images within a large-scale dataset labeled with average human aesthetic scores, and use these images and the associated scores to train a generic aesthetic assessment model by CNN-based regression. Next, we fine-tune the generic model to accommodate the personal preference by training over the rankings with a pairwise hinge loss. Experiments demonstrate that our method can effectively learn personalized image aesthetic preferences, clearly outperforming state-of-the-art methods. Moreover, we show that the learned personalized image aesthetics benefit a wide variety of applications.
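The fine-tuning step uses a standard pairwise ranking hinge loss, which can be written in a few lines. The margin value here is an assumption; the paper's network architecture and exact hyperparameters are not reproduced.

```python
import numpy as np

def pairwise_hinge_loss(score_preferred, score_other, margin=1.0):
    """Pairwise ranking hinge loss: a pair contributes zero loss only when
    the image the user preferred outscores the other by at least `margin`;
    otherwise the shortfall is penalized linearly. Margin is an assumed
    value, not taken from the paper."""
    return np.maximum(0.0, margin - (score_preferred - score_other)).mean()
```

During fine-tuning, the scores would come from the CNN's outputs on the two images of each user-ranked pair, and the mean loss would be backpropagated through the network.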


2021 ◽  
Vol 13 (3) ◽  
pp. 1274
Author(s):  
Loau Al-Bahrani ◽  
Mehdi Seyedmahmoudian ◽  
Ben Horan ◽  
Alex Stojcevski

Few non-traditional optimization techniques have been applied to the dynamic economic dispatch (DED) of large-scale thermal power units (TPUs), e.g., 1000 TPUs, considering the effects of valve-point loading with ramp-rate limitations. This is a complicated multimodal problem. In this investigation, a novel optimization technique, namely a multi-gradient particle swarm optimization (MG-PSO) algorithm with two stages for exploring and exploiting the search space, is employed as an optimization tool. The M particles (explorers) in the first stage explore new neighborhoods, whereas the M particles (exploiters) in the second stage exploit the best neighborhood. The negative gradient variation of the M particles in both stages balances the global and local search capabilities. The algorithm is validated on five medium-scale to very-large-scale power systems. The MG-PSO algorithm effectively reduces the difficulty of handling the large-scale DED problem, and simulation results confirm its suitability for such a complicated multi-objective problem in terms of fitness, performance measures, and consistency. The algorithm is also applied to estimate the generation required over 24 h to meet load-demand changes. This investigation provides useful technical references for economic dispatch operators to update their power system programs in order to achieve economic benefits.


2021 ◽  
Vol 13 (5) ◽  
pp. 874
Author(s):  
Yu Chen ◽  
Mohamed Ahmed ◽  
Natthachet Tangdamrongsub ◽  
Dorina Murgulet

The Nile River stretches from south to north throughout the Nile River Basin (NRB) in Northeast Africa. Ethiopia, where the Blue Nile originates, has begun the construction of the Grand Ethiopian Renaissance Dam (GERD), which will be used to generate electricity. However, the impact of the GERD on land deformation caused by significant water relocation has not been rigorously considered in scientific research. In this study, we develop a novel approach for predicting large-scale land deformation induced by the construction of the GERD reservoir. We also investigate the limitations of using the Gravity Recovery and Climate Experiment Follow On (GRACE-FO) mission to detect GERD-induced land deformation. We simulated three land deformation scenarios related to filling the expected reservoir volume, 70 km3, using 5-, 10-, and 15-year filling scenarios. The results indicated: (i) trends in downward vertical displacement estimated at −17.79 ± 0.02, −8.90 ± 0.09, and −5.94 ± 0.05 mm/year for the 5-, 10-, and 15-year filling scenarios, respectively; (ii) the western (eastern) parts of the GERD reservoir are estimated to move toward the reservoir's center by +0.98 ± 0.01 (−0.98 ± 0.01), +0.48 ± 0.00 (−0.48 ± 0.00), and +0.33 ± 0.00 (−0.33 ± 0.00) mm/year under the 5-, 10-, and 15-year filling strategies, respectively; (iii) the northern part of the GERD reservoir is moving southward by +1.28 ± 0.02, +0.64 ± 0.01, and +0.43 ± 0.00 mm/year, while the southern part is moving northward by −3.75 ± 0.04, −1.87 ± 0.02, and −1.25 ± 0.01 mm/year during the three examined scenarios, respectively; and (iv) the GRACE-FO mission can only detect 15% of the large-scale land deformation produced by the GERD reservoir. The methods and results demonstrated in this study provide insights into the possible impacts of reservoir impoundment on land surface deformation, which can be adopted in the GERD project or similar future dam construction plans.


2006 ◽  
Vol 04 (03) ◽  
pp. 639-647 ◽  
Author(s):  
ELEAZAR ESKIN ◽  
RODED SHARAN ◽  
ERAN HALPERIN

Common approaches for haplotype inference from genotype data are targeted toward phasing short genomic regions; longer regions are often tackled heuristically, due to the high computational cost. Here, we describe a novel approach for phasing genotypes over long regions, based on combining information from local predictions on short, overlapping regions. The phasing is done in a way that maximizes a natural maximum-likelihood criterion which, among other things, takes into account the physical distance between neighboring single-nucleotide polymorphisms. The approach is very efficient, has been applied to several large-scale datasets, and has proven successful in two recent benchmarking studies (Zaitlen et al., in press; Marchini et al., in preparation). Our method is publicly available via a webserver.
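The core idea of combining local predictions can be illustrated with a toy stitcher: each short window contributes its phased alleles in whichever orientation (as-is or haplotype-swapped) best agrees with the previous window on their shared SNPs. This greedy sketch is only an illustration of the window-combining idea; it omits the likelihood weighting by physical SNP distance that the paper's criterion uses.

```python
def stitch_windows(windows, overlap):
    """Combine locally phased windows (lists of 0/1 alleles for one
    haplotype) into a single long haplotype. For each new window, pick the
    orientation that agrees best with the previous window on the `overlap`
    shared SNPs; flipping a window means switching to the complementary
    haplotype. Greedy toy version of the local-combination idea."""
    hap = list(windows[0])
    for w in windows[1:]:
        tail = hap[-overlap:]
        agree = sum(a == b for a, b in zip(tail, w[:overlap]))
        flipped = [1 - x for x in w]
        agree_flipped = sum(a == b for a, b in zip(tail, flipped[:overlap]))
        chosen = w if agree >= agree_flipped else flipped
        hap.extend(chosen[overlap:])   # append only the non-overlapping SNPs
    return hap
```

A likelihood-based version would score both orientations of every window jointly rather than committing greedily left to right.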


Author(s):  
Luca Accorsi ◽  
Daniele Vigo

In this paper, we propose a fast and scalable, yet effective, metaheuristic called FILO to solve large-scale instances of the Capacitated Vehicle Routing Problem. Our approach consists of a main iterative part, based on the Iterated Local Search paradigm, which employs a carefully designed combination of existing acceleration techniques, as well as novel strategies to keep the optimization localized, controlled, and tailored to the current instance and solution. A Simulated Annealing-based neighbor acceptance criterion is used to obtain a continuous diversification, to ensure the exploration of different regions of the search space. Results on extensively studied benchmark instances from the literature, supported by a thorough analysis of the algorithm’s main components, show the effectiveness of the proposed design choices, making FILO highly competitive with existing state-of-the-art algorithms, both in terms of computing time and solution quality. Finally, guidelines for possible efficient implementations, algorithm source code, and a library of reusable components are open-sourced to allow reproduction of our results and promote further investigations.
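The Simulated Annealing-based neighbor acceptance criterion mentioned above is the classical Metropolis rule: always accept an improving solution, and accept a worsening one with a probability that decays with the cost increase and the current temperature. A minimal sketch (the temperature schedule and parameters are assumptions, not taken from FILO):

```python
import math
import random

def sa_accept(current_cost, candidate_cost, temperature, rng=random.random):
    """Metropolis acceptance: improving candidates are always taken; a
    worsening candidate is accepted with probability exp(-delta / T),
    which keeps the search diversifying across regions while T is high."""
    delta = candidate_cost - current_cost
    if delta <= 0:
        return True
    return rng() < math.exp(-delta / temperature)
```

In an Iterated Local Search loop, this criterion decides whether the perturbed-and-reoptimized solution replaces the current one; a slowly cooling temperature then shifts the balance from diversification toward intensification.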

