Culling a Set of Points for Roundness or Cylindricity Evaluations

2003 ◽  
Vol 13 (03) ◽  
pp. 231-240 ◽  
Author(s):  
Olivier Devillers ◽  
Franco P. Preparata

Roundness and cylindricity evaluations are among the most important problems in computational metrology, and are based on sets of surface measurements (input data points). A recent approach to such evaluations relies on linear programming and yields a rapidly converging solution. Such a solution is determined by a fixed-size subset of a large input set. With the intent of simplifying the main computational task, it appears desirable to cull from the input any point that cannot provably define the solution. In this note we present an analysis of, and an efficient solution to, the problem of culling the input set. For input data points arranged in cross-sections, under mild conditions of uniformity, this algorithm runs in linear time.
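The culling idea can be illustrated with a minimal sketch. For the minimum circumscribed circle involved in a roundness evaluation, only points on the convex hull of a cross-section can be determining points, so interior points may be safely discarded. This is a simpler, well-known culling step shown for illustration, not the paper's linear-time cross-section algorithm; the (n, 2) NumPy input layout is an assumption.

```python
import numpy as np
from scipy.spatial import ConvexHull

def cull_for_outer_circle(points: np.ndarray) -> np.ndarray:
    """Keep only convex-hull vertices of a 2D cross-section.

    The minimum circumscribed circle is determined by convex-hull points,
    so interior points cannot define that part of the solution and can be
    culled before the optimization step. (Illustrative only -- not the
    cross-section algorithm described in the paper.)
    """
    hull = ConvexHull(points)        # O(n log n) in 2D
    return points[hull.vertices]     # hull vertices, counter-clockwise

# Example: 1000 noisy measurements of a nominally round part
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 1000)
r = 10.0 + 0.01 * rng.standard_normal(1000)
pts = np.column_stack((r * np.cos(theta), r * np.sin(theta)))
print(len(cull_for_outer_circle(pts)), "of", len(pts), "points retained")
```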

Author(s):  
Richard Mcintosh ◽  
David Mastronarde ◽  
Kent McDonald ◽  
Rubai Ding

Microtubules (MTs) are cytoplasmic polymers whose dynamics have an influence on cell shape and motility. MTs influence cell behavior both through their growth and disassembly and through the binding of enzymes to their surfaces. In either case, the positions of the MTs change over time as cells grow and develop. We are working on methods to determine where MTs are at different times during either the cell cycle or a morphogenetic event, using thin and thick sections for electron microscopy and computer graphics to model MT distributions. One approach is to track MTs through serial thin sections cut transverse to the MT axis. This work uses a video camera to digitize electron micrographs of cross sections through an MT system and create image files in computer memory. These are aligned and corrected for relative distortions by using the positions of 8-10 MTs on adjacent sections to define a general linear transformation that will align and warp adjacent images to an optimum fit. Two hundred MT images are then used to calculate an “average MT”, and this is cross-correlated with each micrograph in the serial set to locate points likely to correspond to MT centers. This set of points is refined through a discriminant analysis that explores each cross-correlogram in the neighborhood of every point with a high correlation score.
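The cross-correlation step can be sketched as follows, assuming the micrograph and the "average MT" template are 2D NumPy arrays; the peak-picking rule and threshold below are illustrative placeholders, not the authors' parameters.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def candidate_mt_centers(micrograph, template, threshold=0.5):
    """Cross-correlate an 'average MT' template with a micrograph and
    return pixel coordinates of correlation peaks above a threshold.

    Both inputs are 2D float arrays; the threshold is a fraction of the
    maximum correlation value (an illustrative choice)."""
    img = micrograph - micrograph.mean()
    tmpl = template - template.mean()
    # Cross-correlation computed as convolution with the flipped template.
    corr = fftconvolve(img, tmpl[::-1, ::-1], mode="same")
    # A pixel is a candidate center if it is a local maximum and strong enough.
    local_max = corr == maximum_filter(corr, size=template.shape)
    strong = corr > threshold * corr.max()
    return np.argwhere(local_max & strong)   # array of (row, col) centers
```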


Author(s):  
Bengt J. Nilsson ◽  
Paweł Żyliński

We present new results on two types of guarding problems for polygons. For the first problem, we present an optimal linear time algorithm for computing a smallest set of points that guard a given shortest path in a simple polygon having [Formula: see text] edges. We also prove that in polygons with holes, there is a constant [Formula: see text] such that no polynomial-time algorithm can solve the problem within an approximation factor of [Formula: see text], unless P=NP. For the second problem, we present a [Formula: see text]-FPT algorithm for computing a shortest tour that sees [Formula: see text] specified points in a polygon with [Formula: see text] holes. We also present a [Formula: see text]-FPT approximation algorithm for this problem having approximation factor [Formula: see text]. In addition, we prove that the general problem cannot be polynomially approximated better than by a factor of [Formula: see text], for some constant [Formula: see text], unless P [Formula: see text]NP.


Author(s):  
Mahbubur R. Meenar ◽  
John A. Sorrentino

Three-dimensional surface modeling has become an important element in the processing and visualization of geographic information. Models are created from a finite sample of data points over the relevant area. The techniques used for these activities can be broadly divided into raster-based interpolation methods and vector-based triangulation methods. This chapter contains a discussion of the benefits and costs of each set of methods. The functions available using 3D surface models include elevation, queries, contours, slope and aspect, hillshade, and viewshed. Applications include modeling elevation, pollution concentration and run-off and erosion potential. The chapter ends with a brief discussion of future trends, and concludes that the choice among the methods depends on the nature of the input data and the goals of the analyst.
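As a hedged illustration of the raster-based functions mentioned (slope and aspect), here is a minimal NumPy sketch that derives both from a gridded elevation model; the cell size, the row/column orientation (rows increasing southward, columns eastward), and the aspect convention are assumptions, and GIS packages differ in these details.

```python
import numpy as np

def slope_aspect(dem: np.ndarray, cell_size: float = 1.0):
    """Compute slope (degrees) and aspect (degrees clockwise from north)
    from a raster DEM.

    A minimal sketch: real GIS tools handle edge cells, nodata values,
    and geographic coordinate systems more carefully, and aspect
    conventions vary between packages."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)   # rates of change per cell
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Downslope direction, assuming rows increase southward, columns eastward.
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect
```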


2015 ◽  
Vol 770 ◽  
pp. 491-494
Author(s):  
Andrey E. Kovtanyuk

A computed tomography problem, the 3D reconstruction of a density distribution, is considered. The input data are obtained as the result of irradiations. The solution of the computed tomography problem is presented as a set of cross-section images. The reconstruction in a single cross-section is performed by the convolution and back-projection algorithm. The computation is parallelized over the set of cross-sections using the MPI technology.
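A minimal sketch of the slice-parallel scheme follows, assuming one sinogram per cross-section is available as a NumPy array and using skimage's filtered back projection (`iradon`) in place of the paper's own convolution/back-projection implementation; the round-robin distribution of slices over MPI ranks is likewise an assumption.

```python
import numpy as np
from mpi4py import MPI                    # MPI bindings for Python
from skimage.transform import iradon     # filtered back projection

def reconstruct_volume(sinograms, angles_deg):
    """Reconstruct a stack of cross-sections in parallel.

    sinograms  : list of 2D arrays, one sinogram per cross-section
    angles_deg : 1D array of projection angles in degrees
    Each MPI rank reconstructs every size-th slice; rank 0 gathers the
    results into the full 3D volume."""
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    local = {}
    for i in range(rank, len(sinograms), size):            # round-robin split
        local[i] = iradon(sinograms[i], theta=angles_deg)  # one cross-section

    gathered = comm.gather(local, root=0)
    if rank == 0:
        slices = {}
        for part in gathered:
            slices.update(part)
        return np.stack([slices[i] for i in range(len(sinograms))])
    return None
```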


2012 ◽  
Vol 2012 ◽  
pp. 1-10 ◽  
Author(s):  
S. Akhter ◽  
V. Karwal ◽  
R. C. Jain

Fast windowed update algorithms capable of independently updating the odd discrete cosine transform (ODCT) and odd discrete sine transform (ODST) of a running data sequence are analytically developed. In this algorithm, to compute the ODCT coefficients of a real-time sequence, we do not require the ODST coefficients. Similarly, the ODST coefficients of the shifted sequence can be calculated without using ODCT coefficients. The running input data sequence is sampled using a rectangular window; however, this idea can be easily extended to other windows. The update algorithm derived herein can be used to compute the transform coefficients of the shifted sequence as new data points become available. The complexity of the developed algorithm is O(N). The validity of the algorithm is tested by MATLAB simulations.
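The paper's ODCT/ODST update recurrences are not reproduced here; as a stand-in, the sketch below shows the same running-window idea for the ordinary DFT, whose sliding-window update is standard. Each new sample updates all N coefficients at O(N) total cost, matching the stated complexity.

```python
import numpy as np

def sliding_dft_update(X, x_old, x_new, N):
    """Update the length-N DFT of a rectangular window when the oldest
    sample x_old leaves and the newest sample x_new enters.

    Standard sliding-DFT recurrence (analogous in spirit to the ODCT/ODST
    updates derived in the paper, which are not reproduced here):
        X_k <- (X_k - x_old + x_new) * exp(+j*2*pi*k/N)
    Cost: O(N) per new sample, versus O(N log N) for a full FFT."""
    k = np.arange(N)
    return (X - x_old + x_new) * np.exp(2j * np.pi * k / N)

# Quick check against a direct FFT of the shifted window.
rng = np.random.default_rng(1)
N = 8
x = rng.standard_normal(N + 1)
X0 = np.fft.fft(x[:N])
X1 = sliding_dft_update(X0, x[0], x[N], N)
assert np.allclose(X1, np.fft.fft(x[1:N + 1]))
```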


Author(s):  
Alok Sinha

Abstract The partial differential equation of motion of an axially moving beam with spatially varying geometric, mass, and material properties has been derived. Using the theory of linear time-varying systems, a general algorithm has been developed to compute natural frequencies, mode shapes, and the critical speed for stability. Numerical results from the new method are presented for beams with spatially varying rectangular cross sections with sinusoidal variation in thickness and sine-squared variation in width. They are also compared to those from the Galerkin method. It has been found that the critical speed of the beam can be significantly reduced by non-uniformity in the beam's cross section.
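For context, the conventional comparison route (a Galerkin-type discretization leading to a gyroscopic matrix equation) can be sketched as below; this is not the linear time-varying algorithm of the paper, and the speed-dependent mass, gyroscopic, and stiffness matrices are assumed to be supplied by the user. Natural frequencies follow from the eigenvalues of the first-order state matrix, and the critical speed is taken as the lowest transport speed at which the fundamental frequency vanishes.

```python
import numpy as np

def natural_frequencies(M, G, K):
    """Natural frequencies (rad/s) of the discretized gyroscopic system
        M q'' + G q' + K q = 0
    from the eigenvalues of the first-order state matrix."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ G]])
    # Eigenvalues come in +/- i*omega pairs; keep one frequency per pair.
    return np.sort(np.abs(np.linalg.eigvals(A).imag))[::2]

def critical_speed(build_MGK, speeds):
    """Lowest transport speed at which the fundamental frequency drops to zero.

    build_MGK(v) must return the speed-dependent (M, G, K) matrices of the
    discretized moving-beam model (user-supplied; not shown here)."""
    for v in speeds:
        if natural_frequencies(*build_MGK(v))[0] < 1e-6:
            return v
    return None
```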


Geophysics ◽  
1972 ◽  
Vol 37 (4) ◽  
pp. 669-674 ◽  
Author(s):  
R. C. Hessing ◽  
Henry K. Lee ◽  
Alan Pierce ◽  
Eldon N. Powers

A method is described for using a digital computer to construct contour maps automatically. Contour lines produced by this method have correct relations to given discrete data points regardless of the spatial distribution of these points. The computer‐generated maps are comparable to those drawn manually. The region to be contoured is divided into quadrilaterals whose vertices include the data points. After supplying values at each of the remaining vertices by using a surface‐fitting technique, bicubic functions are constructed on each quadrilateral to form a smooth surface through the data points. Points on a contour line are obtained from these surfaces by solving the resulting cubic equations. The bicubic functions may be used for other calculations consistent with the contour maps, such as interpolation of equally spaced values, calculation of cross‐sections, and volume calculations.
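One ingredient of such a contouring scheme can be sketched: along each quadrilateral edge the surface restricts to a cubic, and contour crossings are found by solving that cubic. The sketch below uses a cubic Hermite interpolant on a unit-parameterized edge (an assumption made for illustration, not the paper's exact bicubic construction) and `numpy.roots`.

```python
import numpy as np

def edge_contour_crossings(f0, f1, d0, d1, level):
    """Parameters t in [0, 1] where the cubic Hermite interpolant along an
    edge equals `level`.

    f0, f1 : function values at the edge endpoints
    d0, d1 : directional derivatives along the edge at the endpoints
    Interpolant: h(t) = f0*(2t^3-3t^2+1) + d0*(t^3-2t^2+t)
                        + f1*(-2t^3+3t^2) + d1*(t^3-t^2)."""
    coeffs = [
        2.0 * f0 + d0 - 2.0 * f1 + d1,          # t^3
        -3.0 * f0 - 2.0 * d0 + 3.0 * f1 - d1,   # t^2
        d0,                                     # t^1
        f0 - level,                             # t^0
    ]
    roots = np.roots(coeffs)
    # Keep real roots that actually lie on the edge.
    real = roots[np.isclose(roots.imag, 0.0)].real
    return np.sort(real[(real >= 0.0) & (real <= 1.0)])

# Example: values 1.0 -> 3.0 across an edge, zero end slopes, contour level 2.0
print(edge_contour_crossings(1.0, 3.0, 0.0, 0.0, 2.0))   # ~[0.5]
```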


2009 ◽  
Vol 9 (21) ◽  
pp. 8617-8638 ◽  
Author(s):  
J.-C. Raut ◽  
P. Chazette

Abstract. We investigate in this study the vertical PM10 distributions from mobile measurements carried out from locations along the Paris Peripherique (the highly trafficked beltway around Paris), examine distinctions in terms of aerosol concentrations between the outlying regions of Paris and the inner city, and eventually discuss the influence of aerosol sources, meteorology, and dynamics on the retrieved PM10 distributions. To achieve these purposes, we combine in situ surface measurements with active remote sensing observations obtained from a great number of research programs in the Paris area since 1999. Two approaches, devoted to the conversion of vertical profiles of lidar-derived extinction coefficients into PM10, have been set up. Very good agreement is found between the theoretical and empirical methods, with a discrepancy of 3%. Hence, specific extinction cross-sections at 355 nm are provided with a reasonable relative uncertainty: lower than 12% for urban (4.5 m² g⁻¹) and periurban (5.9 m² g⁻¹) aerosols, and lower than 26% for rural (7.1 m² g⁻¹), biomass burning (2.6 m² g⁻¹), and dust (1.1 m² g⁻¹) aerosols. The high spatial and temporal resolutions of the mobile lidar (1.5 m and 1 min, respectively) make it possible to follow the spatiotemporal variability of the various layers trapping aerosols in the troposphere. Appropriate specific extinction cross-sections are applied in each layer detected from the vertical heterogeneities in the lidar profiles. The standard deviation (rms) between lidar-derived PM10 at 200 m above ground and surface network station measurements was ~14 μg m⁻³. This difference is particularly ascribed to a decorrelation of mass concentrations in the first meters of the boundary layer, as highlighted through multiangular lidar observations. Lidar signals can be used to follow mass concentrations with an uncertainty lower than 25% above urban areas and provide useful information for forecasting the PM10 peaks that affect air quality.
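The conversion step amounts to dividing the lidar-derived extinction profile by the specific extinction cross-section of the relevant aerosol layer. A minimal sketch follows, using the values quoted in the abstract and assuming extinction is given in m⁻¹; real retrievals apply a different cross-section in each detected layer.

```python
import numpy as np

# Specific extinction cross-sections at 355 nm quoted in the abstract (m^2 g^-1).
SPECIFIC_EXTINCTION = {
    "urban": 4.5,
    "periurban": 5.9,
    "rural": 7.1,
    "biomass_burning": 2.6,
    "dust": 1.1,
}

def extinction_to_pm10(alpha_m1, aerosol_type="urban"):
    """Convert an extinction profile (m^-1) to a PM10 mass concentration
    profile (ug m^-3) using a single specific extinction cross-section.

    Sketch only: the study applies layer-specific cross-sections over the
    vertical profile, whereas one value is used throughout here."""
    sigma = SPECIFIC_EXTINCTION[aerosol_type]      # m^2 g^-1
    mass_g_m3 = np.asarray(alpha_m1) / sigma       # (m^-1) / (m^2 g^-1) = g m^-3
    return mass_g_m3 * 1.0e6                       # g m^-3 -> ug m^-3

# Example: 0.1 km^-1 (= 1e-4 m^-1) of urban aerosol extinction
print(extinction_to_pm10(1.0e-4))                  # ~22 ug m^-3
```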


2001 ◽  
Vol DMTCS Proceedings vol. AA,... (Proceedings) ◽  
Author(s):  
Jérôme Durand-Lose

Cellular automata are mappings over infinite lattices such that each cell is updated according to the states around it and a unique local function. Block permutations are mappings that generalize a given permutation of blocks (finite arrays of fixed size) to a given partition of the lattice into blocks. We prove that any d-dimensional reversible cellular automaton can be expressed as the composition of d+1 block permutations. We build a simulation in linear time of reversible cellular automata by reversible block cellular automata (also known as partitioning CA and CA with the Margolus neighborhood) which is valid for both finite and infinite configurations. This proves a 1990 conjecture by Toffoli and Margolus (Physica D 45), improved by Kari in 1996 (Mathematical System Theory 29).
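The block (partitioning) CA idea can be illustrated with a minimal 2D sketch: the lattice is cut into 2x2 blocks, a permutation of block contents is applied, and on alternating steps the partition is shifted by one cell (the Margolus neighborhood). The choice of rotating each block and the toroidal boundary are assumptions made for the example; any bijection on block contents would do, and its inverse undoes the step, which is what makes such automata reversible.

```python
import numpy as np

def margolus_step(grid: np.ndarray, phase: int) -> np.ndarray:
    """One step of a reversible block CA on a toroidal 2D grid with even
    side lengths. Each 2x2 block is rotated 90 degrees clockwise -- a block
    permutation, hence trivially reversible (the inverse rotates back).

    phase 0 uses the aligned partition, phase 1 the partition shifted by
    (1, 1), i.e. the Margolus neighborhood."""
    g = np.roll(grid, (-phase, -phase), axis=(0, 1))          # shift partition
    h, w = g.shape
    blocks = g.reshape(h // 2, 2, w // 2, 2).swapaxes(1, 2)   # (H/2, W/2, 2, 2)
    blocks = np.rot90(blocks, k=-1, axes=(2, 3))              # permute each block
    g = blocks.swapaxes(1, 2).reshape(h, w)
    return np.roll(g, (phase, phase), axis=(0, 1))            # undo the shift

# Example: alternate phases on a small toroidal grid.
grid = (np.arange(16).reshape(4, 4)) % 2
after = margolus_step(margolus_step(grid, phase=0), phase=1)
```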


2018 ◽  
Vol 7 (4.11) ◽  
pp. 202 ◽  
Author(s):  
Mohd Shahrum Md Guntor ◽  
Rohilak Sahak ◽  
Azlee Zabidi ◽  
Nooritawati Md Tahir ◽  
Ihsan Mohd Yassin ◽  
...  

Biometric identification systems have recently made exponential advancements in terms of complexity and accuracy in recognition for security purposes and a variety of other applications. In this paper, a Convolutional Neural Network (CNN) based gait recognition system using Microsoft Kinect skeletal joint data points is proposed for human identification. A total of 23 subjects were used for the experiments. The subjects were positioned 45 degrees (oblique view) from the Kinect. A CNN based on a modified AlexNet structure was used to fit the different input data size. The results indicate that the training and testing accuracies were 100% and 69.6%, respectively.
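A hedged sketch of the kind of model involved follows; the layer sizes, the Kinect joint count, and the way joint sequences are laid out as a 2D input are assumptions for illustration and are not the modified AlexNet used in the paper.

```python
import torch
import torch.nn as nn

NUM_SUBJECTS = 23     # from the abstract
NUM_JOINTS = 25       # Kinect v2 skeleton joints (an assumption)
SEQ_LEN = 64          # frames per gait sample (an assumption)

class SmallGaitCNN(nn.Module):
    """Toy CNN for gait classification from skeletal joint sequences.

    Input: (batch, 3, SEQ_LEN, NUM_JOINTS), i.e. x/y/z coordinates laid out
    as a 3-channel "image" of frames x joints. Illustrative stand-in only."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(64 * 4 * 4, NUM_SUBJECTS)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Shape check with a dummy batch of 8 gait samples.
model = SmallGaitCNN()
print(model(torch.randn(8, 3, SEQ_LEN, NUM_JOINTS)).shape)   # torch.Size([8, 23])
```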

