A new method of precise orientation adjustment based on matrix similarity for large-scale component

2018 ◽  
Vol 38 (2) ◽  
pp. 207-215 ◽  
Author(s):  
Dian Wu ◽  
Fuzhou Du

Purpose In the assembly process of a satellite, the solar wing is installed on and removed from the main satellite body (or a simulator) multiple times. However, the traditional method of orientation adjustment using a theodolite and a two-axis turntable makes it difficult to coordinate the three rotation angles of yaw, pitch and roll, which complicates the actual operation and leaves it dependent on manual experience. Therefore, this paper aims to propose a new method to achieve rapid and precise orientation adjustment. Design/methodology/approach The similarity relation of the orientation variation matrix in different coordinate systems is studied, and a mapping model of the similarity relation is established. By using multiple element matrices to construct the original rotation matrix, the mapping is solved in quaternion form. Taking the theodolite as the measuring instrument and a Stewart platform as the control equipment, an experiment on installing the solar wing is performed to validate the effectiveness of the algorithm. Findings Based on the solving algorithm, the orientation adjustment process is simplified to a fixed three-step mode: three adjustments to obtain the parameters of the mapping model, one adjustment to bring the component into place, and a final adjustment for fine tuning. The final orientation deviation is less than 0.003°, close to the level achieved with a laser tracker and well within the required accuracy of 0.0115°. Originality/value This paper reveals the similarity relation of the variation matrix in the process of orientation adjustment and presents a new method to achieve rapid and precise orientation adjustment of large-scale components.
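
To make the similarity relation concrete, here is a minimal numerical sketch (not the authors' algorithm; the frame choices and the SciPy-based quaternion conversion are illustrative assumptions). The same orientation variation observed in two coordinate frames yields similar rotation matrices, and in quaternion form the scalar part is shared while the vector part is simply re-expressed by the frame-to-frame rotation.

# A minimal sketch illustrating the similarity relation: the same physical
# orientation change expressed in frames A and B satisfies R_B = T @ R_A @ T.T,
# where T is the rotation mapping frame A to frame B. All numbers below are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.spatial.transform import Rotation as R

def variation_matrix(yaw, pitch, roll):
    """Rotation matrix for a yaw-pitch-roll adjustment (degrees)."""
    return R.from_euler("zyx", [yaw, pitch, roll], degrees=True).as_matrix()

# Unknown frame-to-frame mapping T (chosen arbitrarily for the demo).
T = R.from_euler("zyx", [10.0, -5.0, 3.0], degrees=True).as_matrix()

# The same small adjustment observed in frame A (e.g. the theodolite) and in
# frame B (e.g. the Stewart platform) gives similar matrices: same rotation
# angle, rotation axis re-expressed by T.
R_A = variation_matrix(0.5, -0.2, 0.1)
R_B = T @ R_A @ T.T

q_A = R.from_matrix(R_A).as_quat()   # quaternion as [x, y, z, w]
q_B = R.from_matrix(R_B).as_quat()

# Same scalar part (same rotation angle); vector part rotated by T.
print(np.isclose(q_A[3], q_B[3]))            # True
print(np.allclose(T @ q_A[:3], q_B[:3]))     # True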

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Daniele Peri

Purpose A recursive scheme for the ALIENOR method is proposed as a remedy for the difficulties the method introduces. A progressive focusing on the most promising region, combined with a variation of the density of the alpha-dense curve, is proposed. Design/methodology/approach The ALIENOR method aims to reduce the dimensionality of an optimization problem by spanning the design space with a single alpha-dense curve: the curvilinear abscissa along the curve becomes the only design parameter, regardless of the original design space. As a counterpart, the transformed objective function in the projected space is much more difficult to tackle. Findings A fine tuning of the procedure has been performed in order to identify the correct balance between its different elements. The proposed approach has been tested on a set of algebraic functions with up to 1,024 design variables, demonstrating the ability of the method to solve large-scale optimization problems. An industrial application is also presented. Originality/value To the author's knowledge, there is no similar paper in the current literature.
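
As an illustration of the underlying dimension-reduction idea only (a minimal sketch; the cosine curve, frequency choice and brute-force 1-D scan are assumptions, not Peri's recursive focusing scheme), an alpha-dense curve lets a single curvilinear parameter stand in for all design variables:

# ALIENOR-style reduction sketch: a single parameter t is mapped onto the
# n-dimensional design box by an alpha-dense curve, and the n-D search is
# replaced by a 1-D search along t.
import numpy as np

def alpha_dense_curve(t, n):
    """Map a scalar t >= 0 into [0, 1]^n. Frequencies sqrt(prime_i) are
    pairwise rationally independent, which is what makes the curve fill the
    box ever more densely as t grows (a common textbook choice; the curve
    used in the paper may differ)."""
    primes = np.array([2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:n], dtype=float)
    return 0.5 * (1.0 + np.cos(np.sqrt(primes) * t))

def reduced_objective(t, f, n):
    """The objective seen along the curve: a one-dimensional function of t."""
    return f(alpha_dense_curve(t, n))

# Example objective: a shifted sphere function in n dimensions.
f = lambda x: float(np.sum((x - 0.3) ** 2))

n = 8
ts = np.linspace(0.0, 200.0, 200_000)        # dense 1-D sampling of t
vals = np.array([reduced_objective(t, f, n) for t in ts])
t_best = ts[int(np.argmin(vals))]
print("best t:", t_best, "f(best):", vals.min())
print("design point:", alpha_dense_curve(t_best, n))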


2018 ◽  
Vol 8 (4) ◽  
pp. 656-677 ◽  
Author(s):  
August Raimy Sjauw-Koen-Fa ◽  
Vincent Blok ◽  
Onno S.W.F. Omta

Purpose The purpose of this paper is to assess the impact of smallholder supply chains on sustainable sourcing, answering the question of how food and agribusiness multinationals can best include smallholders in their sourcing strategies and take social responsibility for large-scale sustainable and more equitable supply. A sustainable smallholder sourcing model with a list of critical success factors (CSFs) has been applied to two best-practice cases. In this model, business and corporate social responsibility perspectives are integrated. Design/methodology/approach The primary data of the value chain analyses of two smallholder supply chains of a food and agribusiness multinational have been used. Both cases were part of a joint research program commissioned by the multinational and a non-governmental organization, using the same methods and research tools. Similarities, differences and interference between the cases have been determined and assessed in order to confirm, fine-tune or adjust the CSFs. Findings Both cases could be conceptualized through the smallholder sourcing model. Most CSFs could be found in both cases, but differences were also found, which led to fine-tuning of some CSFs: building a partnership and an effective producer organization, providing farm financing, and using cross-functional teams in smallholder supplier development programs. It was also concluded that the smallholder sourcing model is applicable in different geographical areas. Research limitations/implications The findings of this study are based on just two cases. More best-practice cases are recommended in order to confirm or adjust the developed sourcing model and the CSFs. Originality/value This paper fills the need in the sustainable supply chain management literature to study supply chains that comply with the triple bottom line concept, rather than supply chains that are just more “green.”


2020 ◽  
Vol 47 (3) ◽  
pp. 547-560 ◽  
Author(s):  
Darush Yazdanfar ◽  
Peter Öhman

Purpose The purpose of this study is to empirically investigate the determinants of financial distress among small and medium-sized enterprises (SMEs) during the global financial crisis and post-crisis periods. Design/methodology/approach Several statistical methods, including multiple binary logistic regression, were used to analyse a longitudinal cross-sectional panel data set of 3,865 Swedish SMEs operating in five industries over the 2008–2015 period. Findings The results suggest that financial distress is influenced by macroeconomic conditions (i.e. the global financial crisis) and, in particular, by various firm-specific characteristics (i.e. performance, financial leverage and financial distress in the previous year). However, firm size and industry affiliation have no significant relationship with financial distress. Research limitations/implications Due to data availability, this study is limited to a sample of Swedish SMEs in five industries covering eight years. Further research could examine the generalizability of these findings by investigating other firms operating in other industries and other countries. Originality/value This study is the first to examine the determinants of financial distress among SMEs operating in Sweden using data from a large-scale longitudinal cross-sectional database.
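
For readers unfamiliar with the estimation approach, the following minimal sketch shows a multiple binary logistic regression of the kind described, on synthetic data with illustrative variable names (the study's actual variable definitions and data are not reproduced here):

# Sketch of a binary logistic regression of financial distress on firm-level
# covariates and a crisis-period dummy. Data and coefficients are synthetic,
# chosen only to be roughly consistent with the reported signs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 3865
df = pd.DataFrame({
    "performance": rng.normal(0.05, 0.10, n),   # e.g. return on assets
    "leverage": rng.uniform(0.1, 0.9, n),       # debt / total assets
    "distress_lag": rng.integers(0, 2, n),      # distressed in previous year?
    "log_size": rng.normal(15.0, 1.5, n),       # log total assets
    "crisis": rng.integers(0, 2, n),            # crisis-period dummy
})
logit_p = (-2.0 + 3.0 * df.leverage - 4.0 * df.performance
           + 1.5 * df.distress_lag + 0.8 * df.crisis)
df["distress"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(df[["performance", "leverage", "distress_lag",
                        "log_size", "crisis"]])
model = sm.Logit(df["distress"], X).fit(disp=False)
print(model.summary())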


1979 ◽  
Vol 6 (2) ◽  
pp. 70-72
Author(s):  
T. A. Coffelt ◽  
F. S. Wright ◽  
J. L. Steele

Abstract A new method of harvesting and curing breeder's seed peanuts in Virginia was initiated that would 1) reduce the labor requirements, 2) maintain a high level of germination, 3) maintain varietal purity at 100%, and 4) reduce the risk of frost damage. Three possible harvesting and curing methods were studied. The traditional stackpole method satisfied the latter three objectives, but not the first. The windrow-combine method satisfied the first two objectives, but not the last two. The direct harvesting method satisfied all four objectives. The experimental equipment and curing procedures for direct harvesting had been developed but not tested on a large scale for seed harvesting. This method has been used in Virginia to produce breeder's seed of three peanut varieties (Florigiant, VA 72R and VA 61R) over five years. Compared to the stackpole method, labor requirements have been reduced, satisfactory levels of germination and varietal purity have been obtained, and the risk of frost damage has been minimized.


2016 ◽  
Vol 28 (4) ◽  
pp. 245-262 ◽  
Author(s):  
Annalisa Sannino ◽  
Yrjö Engeström ◽  
Johanna Lahikainen

Purpose The paper aims to examine organizational authoring understood as a longitudinal, material and dialectical process of transformation efforts. The following questions are asked: To what extent can a Change Laboratory intervention help practitioners author their own learning? Are the authored outcomes of a Change Laboratory intervention futile if a workplace subsequently undergoes large-scale organizational transformations? Does the expansive learning authored in a Change Laboratory intervention survive large-scale organizational transformations, and if so, why and how does it survive? Design/methodology/approach The paper develops a conceptual argument based on cultural–historical activity theory. The conceptual argument is grounded in the examination of a case of eight years of change efforts in a university library, including a Change Laboratory (CL) intervention. Follow-up interview data are used to discuss and illuminate the argument in relation to the three research questions. Findings The idea of knotworking constructed in the CL process became a “germ cell” that generates novel solutions in the library activity. A large-scale transformation from the local organization model developed in the CL process to the organization model of the entire university library was not experienced as a loss. The dialectical tension between the local and global models became a source of movement driven by the emerging expansive object. Practitioners are modeling their own collective future competences, expanding them in both socio-spatial scope and interactive depth. Originality/value The article offers an expanded view of authorship, calling attention to material changes and practical change actions. The dialectical tensions identified serve as heuristic guidelines for future studies and interventions.


Author(s):  
Ezzeddine Touti ◽  
Ali Sghaier Tlili ◽  
Muhannad Almutiry

Purpose This paper aims to focus on the design of a decentralized observation and control method for a class of large-scale systems characterized by nonlinear interconnected functions that are assumed to be uncertain but quadratically bounded. Design/methodology/approach Sufficient conditions, under which the designed control scheme can achieve the asymptotic stabilization of the augmented system, are developed within Lyapunov theory in the framework of linear matrix inequalities (LMIs). Findings The derived LMIs are formulated as an optimization problem whose resolution allows the concurrent computation of the decentralized control and observation gains and the maximization of the nonlinearity coverage tolerated by the system without becoming unstable. The reliable performance of the designed control scheme, compared to a distinguished decentralized guaranteed-cost control strategy from the literature, is demonstrated by numerical simulations on an application to a three-generator infinite-bus power system. Originality/value The developed optimization problem subject to LMI constraints is efficiently solved by a one-step procedure to analyze the asymptotic stability and to synthesize all the control and observation parameters. Such a procedure makes it possible to cope with the conservatism and suboptimal solutions produced by the iterative, multi-step optimization procedures usually used for dynamic output feedback decentralized control of nonlinear interconnected systems.
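
The LMI machinery referred to above can be illustrated with a short CVXPY sketch. This is a plain centralized state-feedback stabilization LMI on an assumed toy system, not the paper's decentralized observer-based scheme:

# Stabilization via an LMI: with X > 0 and the change of variable Z = K X,
# the condition A X + X A' + B Z + Z' B' < 0 guarantees A + B K is Hurwitz.
# System matrices below are an arbitrary toy example.
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])          # open-loop unstable example system
B = np.array([[0.0],
              [1.0]])

n, m = A.shape[0], B.shape[1]
X = cp.Variable((n, n), symmetric=True)
Z = cp.Variable((m, n))
eps = 1e-3

constraints = [
    X >> eps * np.eye(n),
    A @ X + X @ A.T + B @ Z + Z.T @ B.T << -eps * np.eye(n),
]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve()

K = Z.value @ np.linalg.inv(X.value)             # recover the feedback gain
print("K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))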


2019 ◽  
Vol 20 (1) ◽  
Author(s):  
Fuyong Xing ◽  
Yuanpu Xie ◽  
Xiaoshuang Shi ◽  
Pingjun Chen ◽  
Zizhao Zhang ◽  
...  

Abstract Background Nucleus or cell detection is a fundamental task in microscopy image analysis and supports many other quantitative studies such as object counting, segmentation and tracking. Deep neural networks are emerging as a powerful tool for biomedical image computing; in particular, convolutional neural networks have been widely applied to nucleus/cell detection in microscopy images. However, almost all models are tailored to specific datasets, and their applicability to other microscopy image data remains unknown. Some existing studies casually learn and evaluate deep neural networks on multiple microscopy datasets, but several critical, open questions remain to be addressed. Results We analyze the applicability of deep models specifically for nucleus detection across a wide variety of microscopy image data. More specifically, we present a fully convolutional network-based regression model and extensively evaluate it on large-scale digital pathology and microscopy image datasets, which cover 23 organs (or cancer diseases) and come from multiple institutions. We demonstrate that for a specific target dataset, training with images from the same types of organs might usually be necessary for nucleus detection. Although images can be visually similar due to the same staining technique and imaging protocol, deep models learned with images from different organs might not deliver desirable results and would require model fine-tuning to be on a par with those trained with target data. We also observe that training with a mixture of target and other/non-target data does not always yield higher nucleus detection accuracy, and proper data manipulation during model training may be required to achieve good performance. Conclusions We conduct a systematic case study on deep models for nucleus detection in a wide variety of microscopy images, aiming to address several important but previously understudied questions. We present and extensively evaluate an end-to-end, pixel-to-pixel fully convolutional regression network and report several significant findings, some of which might not have been reported in previous studies. The model performance analysis and observations should be helpful for nucleus detection in microscopy images.
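
As a rough illustration of what a fully convolutional regression model for nucleus detection looks like (an assumed encoder-decoder architecture in PyTorch, not the authors' exact network), the model maps an image patch to a dense proximity map whose local maxima indicate nucleus centers:

# Minimal fully convolutional regression sketch: RGB patch in, single-channel
# proximity map out, trained with a pixel-wise regression loss.
import torch
import torch.nn as nn

class FCNRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 1),        # 1-channel proximity/density map
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = FCNRegressor()
images = torch.randn(4, 3, 256, 256)         # a batch of image patches
targets = torch.rand(4, 1, 256, 256)         # placeholder proximity maps
loss = nn.MSELoss()(model(images), targets)  # pixel-wise regression loss
loss.backward()
print(loss.item())

In practice the targets would be proximity maps built from annotated nucleus centers, and detections would be obtained by non-maximum suppression on the predicted map.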


2019 ◽  
Vol 35 (14) ◽  
pp. i417-i426 ◽  
Author(s):  
Erin K Molloy ◽  
Tandy Warnow

Abstract Motivation At RECOMB-CG 2018, we presented NJMerge and showed that it could be used within a divide-and-conquer framework to scale computationally intensive methods for species tree estimation to larger datasets. However, NJMerge has two significant limitations: it can fail to return a tree and, when used within the proposed divide-and-conquer framework, has O(n^5) running time for datasets with n species. Results Here we present a new method called ‘TreeMerge’ that improves on NJMerge in two ways: it is guaranteed to return a tree and it has dramatically faster running time within the same divide-and-conquer framework—only O(n^2) time. We use a simulation study to evaluate TreeMerge in the context of multi-locus species tree estimation with two leading methods, ASTRAL-III and RAxML. We find that the divide-and-conquer framework using TreeMerge has a minor impact on species tree accuracy, dramatically reduces running time, and enables both ASTRAL-III and RAxML to complete on datasets (that they would otherwise fail on), when given 64 GB of memory and 48 h maximum running time. Thus, TreeMerge is a step toward a larger vision of enabling researchers with limited computational resources to perform large-scale species tree estimation, which we call Phylogenomics for All. Availability and implementation TreeMerge is publicly available on Github (http://github.com/ekmolloy/treemerge). Supplementary information Supplementary data are available at Bioinformatics online.
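
A high-level sketch of the divide-and-conquer framework may help situate where the merging step fits; the helper names below are hypothetical placeholders, not TreeMerge's actual interface:

# Hypothetical outline of the divide-and-conquer pipeline: decompose the
# taxon set, estimate a tree on each subset with a preferred method, then
# merge the subset trees into one tree on the full taxon set.
from typing import Callable, List, Set

def divide_and_conquer_species_tree(
    species: Set[str],
    decompose: Callable[[Set[str]], List[Set[str]]],   # e.g. via a guide tree
    estimate_subtree: Callable[[Set[str]], "Tree"],    # e.g. an ASTRAL-III or RAxML run
    merge_trees: Callable[[List["Tree"]], "Tree"],     # the NJMerge/TreeMerge-style step
) -> "Tree":
    subsets = decompose(species)                  # subsets of taxa
    subtrees = [estimate_subtree(s) for s in subsets]
    return merge_trees(subtrees)                  # the step TreeMerge speeds up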


2018 ◽  
Vol 92 (22) ◽  
Author(s):  
Tomofumi Mochizuki ◽  
Rie Ohara ◽  
Marilyn J. Roossinck

ABSTRACT The effect of large-scale synonymous substitutions in a small icosahedral, single-stranded RNA viral genome on virulence, viral titer, and protein evolution was analyzed. The coat protein (CP) gene of the Fny strain of cucumber mosaic virus (CMV) was modified. We created four CP mutants in which all the codons of nine amino acids in the 5′ or 3′ half of the CP gene were replaced by either the most frequently or the least frequently used synonymous codons in monocot plants. When the dicot host (Nicotiana benthamiana) was inoculated with these four CP mutants, viral RNA titers in uninoculated symptomatic leaves decreased, while all mutants eventually showed mosaic symptoms similar to those of the wild type. The codon adaptation index of these four CP mutants against dicot genes was similar to that of the wild-type CP gene, indicating that the reduction of viral RNA titer was due to deleterious changes in the secondary structure of RNAs 3 and 4. When two 5′ mutants were serially passaged in N. benthamiana, viral RNA titers were rapidly restored but competitive fitness remained decreased. Although no nucleic acid changes were observed in the passaged wild-type CMV, one to three amino acid changes were observed in the synonymously mutated CP of each passaged virus, and these changes were involved in the recovery of viral RNA titer of the 5′ mutants. Thus, we demonstrated that the deleterious effects of large-scale synonymous substitutions in the RNA viral genome facilitated rapid amino acid mutation(s) in the CP that restored the viral RNA titer. IMPORTANCE Recently, it has become clear that synonymous substitutions in RNA virus genes affect viral pathogenicity and competitive fitness by altering the global or local RNA secondary structure of the viral genome. We confirmed that large-scale synonymous substitutions in the CP gene of CMV resulted in decreased viral RNA titer. Importantly, when viral evolution was stimulated by serial-passage inoculation, viral RNA titer was rapidly restored, concurrent with a few amino acid changes in the CP. This novel finding indicates that the deleterious effects of large-scale nucleic acid mutations on viral RNA secondary structure are readily tolerated by structural changes in the CP, demonstrating a novel aspect of the adaptive evolution of an RNA viral genome. In addition, our experimental system of serial inoculation of large-scale synonymous mutants could uncover roles for new amino acid residues in the viral protein that have not been observed in wild-type virus strains.
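
To illustrate the mutagenesis design in code, here is a minimal sketch of large-scale synonymous recoding; the codon-usage fractions are toy values, not the monocot usage table used in the study:

# Replace every codon of a chosen set of amino acids with either the most or
# the least frequently used synonymous codon. Usage fractions below are toy
# numbers for two amino acids only.
CODON_USAGE = {
    "L": {"CTG": 0.40, "CTC": 0.25, "CTT": 0.13, "TTG": 0.12, "CTA": 0.06, "TTA": 0.04},
    "A": {"GCC": 0.35, "GCT": 0.28, "GCG": 0.20, "GCA": 0.17},
}
CODON_TO_AA = {codon: aa for aa, table in CODON_USAGE.items() for codon in table}

def recode(seq, target_aas, most_frequent=True):
    """Return seq with codons of target_aas swapped for the most (or least)
    frequent synonymous codon; all other codons are left untouched."""
    out = []
    for i in range(0, len(seq) - len(seq) % 3, 3):
        codon = seq[i:i + 3].upper()
        aa = CODON_TO_AA.get(codon)
        if aa in target_aas:
            table = CODON_USAGE[aa]
            pick = max(table, key=table.get) if most_frequent else min(table, key=table.get)
            out.append(pick)
        else:
            out.append(codon)
    return "".join(out)

print(recode("TTAGCAATGCTT", {"L", "A"}, most_frequent=True))   # -> CTGGCCATGCTG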


2019 ◽  
Vol 72 (5) ◽  
pp. 557-565
Author(s):  
Dilek Bulut ◽  
Tatjana Krups ◽  
Gerhard Poll ◽  
Ulrich Giese

Purpose Elastomer seals are used in many applications. They are exposed to lubricants and additives at elevated temperatures, as well as to mechanical stresses, and can only provide a good sealing function when they are resistant to those factors. Many elastomer-lubricant compatibility tests in industry are based on DIN ISO 1817; however, they are insufficient and costly, and correlations between the tests and the applications are inadequate. The purpose of this study is to investigate the lubricant compatibility of fluoroelastomer (FKM) seals in polyethylene-glycol (PG)- and polyalphaolefin (PAO)-based synthetic oils and to develop a methodology to predict seal service life. Design/methodology/approach A new compatibility test, which is more efficient in terms of time and cost, was developed and compared with a standard test currently used in industry. The compatibility of FKM radial lip seals with PG- and PAO-based synthetic oils containing different additives was investigated chemically and dynamically, and failure mechanisms were examined. Findings The new method and the Freudenberg Flender Test FB 73 11 008 showed similar results concerning damage and similar tendencies regarding wear. The additive imidazole derivative was the most critical. Static tests give indications of possibly chemically active additives, but alone they are insufficient to simulate dynamic applications. Originality/value The paper describes a new method to investigate elastomer-lubricant compatibility and gives first results with a variety of lubricants.

