A General Method for Computing the Distance Between Two Moving Objects Using Optimization Techniques

Author(s):  
Ou Ma ◽  
Meyer Nahon

Abstract Presented in this paper is a general method for finding the distance between two moving objects, defined as the length of the shortest path from one object to the other. The objects are assumed to be composed of arbitrary quadratic surface segments. The distance problem is formulated as a quadratic programming problem with linear and/or quadratic constraints, which is solved by efficient and robust quadratic programming techniques. Attention is focused on implementation in order to achieve the computational efficiency needed for real-time applications. Computational tests show that the speed of this method grows linearly with the total number of bounding surfaces of the two objects. It is also shown that, with a minor modification, the method can be used to calculate the interference between objects. A corresponding general software code has been implemented and will be used for kinematics and dynamics modelling and simulation of space manipulators, including situations with transient topologies, contact with the environment, and the capture/release of payloads.
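The quadratic-programming formulation described above can be illustrated with a minimal sketch: the shortest path between two spheres (the simplest quadric surfaces), with SciPy's SLSQP solver standing in for the paper's specialized QP techniques. The geometry and solver choice are illustrative assumptions, not the authors' implementation.

```python
# Sketch: distance between two quadric-bounded objects as a QP.
# Minimize ||p - q||^2 over points p in object 1, q in object 2.
# Spheres and SLSQP are stand-ins for the paper's general quadric
# surfaces and specialized solver.
import numpy as np
from scipy.optimize import minimize

def min_distance(c1, r1, c2, r2):
    """Shortest distance between two spheres with centers c1, c2 and radii r1, r2."""
    def objective(x):
        p, q = x[:3], x[3:]
        return np.sum((p - q) ** 2)          # squared length of the connecting path

    cons = [
        # ineq constraints are fun(x) >= 0: each point must stay inside its sphere
        {"type": "ineq", "fun": lambda x: r1**2 - np.sum((x[:3] - c1) ** 2)},
        {"type": "ineq", "fun": lambda x: r2**2 - np.sum((x[3:] - c2) ** 2)},
    ]
    x0 = np.concatenate([c1, c2])            # start at the centers (feasible)
    res = minimize(objective, x0, constraints=cons, method="SLSQP")
    return np.sqrt(res.fun)

# Two unit spheres with centers 5 apart: surface-to-surface distance is 3
d = min_distance(np.zeros(3), 1.0, np.array([5.0, 0.0, 0.0]), 1.0)
```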

Author(s):  
Meyer Nahon

Abstract The rapid determination of the minimum distance between objects is important in collision avoidance for a robot maneuvering among obstacles. Currently, the fastest algorithms for this problem are based on optimization techniques that minimize a distance function, and to date the problem has been approached purely through the position kinematics of the two objects. Although the minimum distance between two objects can be found quickly on state-of-the-art hardware, modelling a realistic scene entails determining the minimum distances between large numbers of pairs of objects, so the computation time to find the overall minimum distance between any two objects is significant and introduces a delay with serious repercussions on the real-time control of the robot. This paper presents a technique that modifies the original optimization problem to include velocity information. In effect, the minimum distance calculation is performed at a future time step by projecting the effect of the present velocity. This method has proven to give good results for a 6-dof robot maneuvering among obstacles, and has allowed complete compensation of the lags incurred due to computational delays.
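The velocity-projection idea can be sketched as follows: rather than evaluating the distance at the current instant, evaluate it at a look-ahead time by advancing positions along the current velocities. A point-to-point distance stands in for the full object-to-object optimization here; the look-ahead interval and the simple constant-velocity kinematics are assumptions for illustration.

```python
# Sketch of minimum-distance evaluation at a projected future time step.
# Positions are advanced by current velocities before measuring distance,
# so the result anticipates the computational delay instead of lagging it.
import numpy as np

def projected_distance(p_robot, v_robot, p_obst, v_obst, dt):
    """Distance surrogate evaluated at look-ahead time t + dt."""
    p_r = p_robot + v_robot * dt   # project robot point forward
    p_o = p_obst + v_obst * dt     # project obstacle point forward
    return np.linalg.norm(p_r - p_o)

# Robot point closing on a static obstacle at 1 m/s, 2 m away:
d_now = projected_distance(np.array([0.0, 0, 0]), np.zeros(3),
                           np.array([2.0, 0, 0]), np.zeros(3), 0.0)
d_ahead = projected_distance(np.array([0.0, 0, 0]), np.array([1.0, 0, 0]),
                             np.array([2.0, 0, 0]), np.zeros(3), 0.5)
# The look-ahead distance is shorter, flagging the approach earlier
```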


2013 ◽  
Vol 427-429 ◽  
pp. 341-345
Author(s):  
Xue Fei Chang ◽  
Zhe Yong Piao ◽  
Xiang Yu Lv ◽  
De Xin Li

Co-optimization of output and reserve is necessary in order to provide maximum benefit to both consumers and producers. Once renewable generation sources like wind or solar make up a large proportion of the generation mix, this co-optimization becomes much more difficult, since the output of renewable sources is not well known in advance. In this paper, a uniform reliability level is used as a constraint in the co-optimization of output and reserve. The proposed model is tested on a modified 5-bus PJM system, with the co-optimization performed by sequential quadratic programming techniques. The results show that when wind makes up a significant portion of the generation mix, the co-optimization results are strongly related to the uncertainties of wind power, the reliability level of the system, and the reliability of the generators.
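As a rough illustration of this kind of co-optimization (not the paper's model or its reliability constraint), a toy two-generator output/reserve dispatch with a fixed reserve requirement can be solved with a sequential quadratic programming method such as SciPy's SLSQP. All costs, limits, and the demand figure below are invented.

```python
# Toy output/reserve co-optimization solved by an SQP method.
# Decision vector x = [p1, p2, r1, r2]: outputs and reserves per generator.
import numpy as np
from scipy.optimize import minimize

demand, reserve_req = 100.0, 15.0      # MW; assumed values
cost = np.array([2.0, 3.0])            # $/MWh energy cost per generator
rcost = np.array([0.5, 0.4])           # $/MW reserve cost per generator
pmax = np.array([80.0, 80.0])          # capacity shared by output + reserve

def total_cost(x):
    p, r = x[:2], x[2:]
    return cost @ p + rcost @ r

cons = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - demand},       # power balance
    {"type": "ineq", "fun": lambda x: x[2] + x[3] - reserve_req},  # reserve requirement
    {"type": "ineq", "fun": lambda x: pmax - (x[:2] + x[2:])},     # capacity limits
]
res = minimize(total_cost, np.array([50.0, 50.0, 10.0, 10.0]),
               bounds=[(0, None)] * 4, constraints=cons, method="SLSQP")
# Expected dispatch: cheap generator 1 at its 80 MW limit, generator 2
# covers the rest and carries the (cheaper) reserve; total cost ~ $226
```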


2020 ◽  
Author(s):  
Sarah Sonnett ◽  
Amy Mainzer ◽  
Tommy Grav ◽  
Tim Spahr ◽  
Eva Lilly ◽  
...  

<p>The Near-Earth Object Surveillance Mission (NEOSM) is a planned space-based infrared mission that will nominally launch in 2025 and librate at the Earth-Sun L1 Lagrange point. The NEOSM Project was formulated to address the need to detect, catalog, and characterize near-Earth objects (NEOs) to support informed decision making for any potential mitigation activity. NEOSM detects NEOs, obtains high-quality orbits for them, provides physical characterization of the NEOs and their source populations, and provides more detailed physical characterization for individual targets with significant impact probabilities. Specifically, NEOSM will detect, track, and characterize 2/3 of potentially hazardous asteroids (PHAs) larger than 140 m, large enough to cause potentially significant regional damage. NEOSM is expected to detect thousands of comets, hundreds of thousands of NEOs, and millions of main belt asteroids. Since moving objects, in particular NEOs, are the main focus of the NEOSM project, the survey can be optimized for maximum discovery rate by adjusting the survey cadence to ensure efficient and reliable linking of observations into tracklets, which are position-time sets of a minor planet. It is also important for the survey cadence to provide self-followup that yields orbits with quality similar to that of the known NEOs today. The NEOSM Investigation Software Suite (NISS) is a set of tools being developed to support the efforts to optimize the survey and verify the ability of the designed mission to meet its scientific objectives. The NISS consists of a comprehensive representation of the mission performance, including the flight system hardware, mission operations, and ground data system processing. The NISS takes as its input a reference population of solar system bodies, the NEOSM Reference Small Body Population Model (RSBPM), and performs a frame-by-frame simulation of the survey over the course of its entire operational lifetime.
Note that the RSBPM allows for performance to be evaluated as a function of diameter, rather than the traditional method of using absolute magnitude H = 22 mag as a proxy for 140 m. It has been shown that a completeness of 90% of objects with H < 23 mag is needed in order to ensure that 90% of objects larger than 140 m are found. We present here our ongoing work on mission architecture trades and the optimization of the survey cadence for NEO discovery and tracking. We will present the latest NEOSM survey cadence and its expected performance. We will present the completeness rate after the baseline 5-year mission and a possible extended mission. Studies have previously shown that the 90% goal can be achieved by a combination of a space mission like NEOCam and a ground-based survey like LSST. We will also present how the survey cadence provides self-followup of the NEO population and ensures orbital quality on par with the current NEO population.</p>
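The diameter-based evaluation rests on the standard conversion between absolute magnitude H and diameter, D [km] = 1329 / sqrt(p_V) × 10^(−H/5). The geometric albedo p_V = 0.14 below is an assumed typical value, chosen so that H = 22 corresponds to roughly 140 m; it is an illustration of the proxy relation, not a value taken from the abstract.

```python
# Standard absolute-magnitude-to-diameter conversion for asteroids.
# The albedo default is an assumption for illustration.
import math

def diameter_km(H, albedo=0.14):
    """Asteroid diameter in km from absolute magnitude H and geometric albedo."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

d22 = diameter_km(22.0)  # ~0.14 km: why H = 22 mag is the traditional 140 m proxy
d23 = diameter_km(23.0)  # ~0.09 km: darker objects near 140 m can have H up to ~23
```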


Author(s):  
Gary Emch ◽  
Alan Parkinson

Abstract Engineering models can and should be used to understand the effects of variability on a design. When variability is ignored, the result can be brittle designs that fail in service. By contrast, robust designs function properly even when subjected to off-nominal conditions. There is a need for better analytical tools to help engineers develop robust designs. In this paper we present a general method for developing designs that are robust to variability induced by worst-case tolerances. The method adapts nonlinear programming techniques to determine how a design should be modified to account for variability. We show how this can be done with second-order, or even exact, worst-case tolerance models. Results are given for 13 test cases spanning a variety of problems. The method enables a designer to understand and account for the effects of worst-case tolerances, making it possible to build robustness into an engineering design.
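The worst-case-tolerance idea can be sketched at first order: tighten each design constraint g(x) ≤ 0 by the worst-case effect of the tolerances over the tolerance box, Σᵢ |∂g/∂xᵢ| · tolᵢ. The paper works with second-order and exact worst-case models; the example constraint and tolerance values here are invented for illustration.

```python
# First-order worst-case tolerance margin for a design constraint g(x) <= 0.
# The gradient is estimated by central differences; the worst case over a
# box of tolerances is the sum of |gradient| * tolerance per variable.
import numpy as np

def worst_case_margin(g, x, tol, h=1e-6):
    """Worst-case value of g over the tolerance box around design x."""
    grad = np.array([
        (g(x + h * e) - g(x - h * e)) / (2 * h)   # central difference per variable
        for e in np.eye(len(x))
    ])
    return g(x) + np.abs(grad) @ tol

# Invented example: g(x) = x0 + 2*x1 - 10 <= 0, tolerances +/-0.1 each
g = lambda x: x[0] + 2 * x[1] - 10.0
m = worst_case_margin(g, np.array([3.0, 3.0]), np.array([0.1, 0.1]))
# Nominal margin is -1.0; the tolerances add |1|*0.1 + |2|*0.1 = 0.3,
# so the design still satisfies the constraint in the worst case (m = -0.7)
```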


Author(s):  
Ali Adibi ◽  
Ehsan Salari

It has been recently shown that an additional therapeutic gain may be achieved if a radiotherapy plan is altered over the treatment course using a new treatment paradigm referred to in the literature as spatiotemporal fractionation. Because of the nonconvex and large-scale nature of the corresponding treatment plan optimization problem, the extent of the potential therapeutic gain that may be achieved from spatiotemporal fractionation has been investigated using stylized cancer cases to circumvent the arising computational challenges. This research aims at developing scalable optimization methods to obtain high-quality spatiotemporally fractionated plans with optimality bounds for clinical cancer cases. In particular, the treatment-planning problem is formulated as a quadratically constrained quadratic program and is solved to local optimality using a constraint-generation approach, in which each subproblem is solved using sequential linear/quadratic programming methods. To obtain optimality bounds, cutting-plane and column-generation methods are combined to solve the Lagrangian relaxation of the formulation. The performance of the developed methods is tested on deidentified clinical liver and prostate cancer cases. Results show that the proposed method is capable of achieving locally optimal spatiotemporally fractionated plans with an optimality gap of around 10%–12% for the cancer cases tested in this study. Summary of Contribution: The design of spatiotemporally fractionated radiotherapy plans for clinical cancer cases gives rise to a class of nonconvex and large-scale quadratically constrained quadratic programming (QCQP) problems, the solution of which requires the development of efficient models and solution methods. To address the computational challenges posed by the large-scale and nonconvex nature of the problem, we employ large-scale optimization techniques to develop scalable solution methods that find locally optimal solutions along with optimality bounds.
We test the performance of the proposed methods on deidentified clinical cancer cases. The proposed methods in this study can, in principle, be applied to solve other QCQP formulations, which commonly arise in several application domains, including graph theory, power systems, and signal processing.
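A minimal example of the QCQP problem class discussed above (not the clinical treatment-planning model, and without the constraint-generation or Lagrangian bounding machinery): a quadratic objective with one quadratic constraint, solved to local optimality with an SQP method.

```python
# Toy QCQP: minimize (x0 - 2)^2 + (x1 - 2)^2 subject to x0^2 + x1^2 <= 1.
# SLSQP finds a local optimum, which for this convex instance is the
# projection of (2, 2) onto the unit disk: x = (1/sqrt(2), 1/sqrt(2)).
import numpy as np
from scipy.optimize import minimize

obj = lambda x: (x[0] - 2) ** 2 + (x[1] - 2) ** 2
con = {"type": "ineq", "fun": lambda x: 1.0 - x[0] ** 2 - x[1] ** 2}
res = minimize(obj, np.array([0.0, 0.0]), constraints=[con], method="SLSQP")
# res.x lies on the unit circle in the direction of (2, 2)
```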

