Operation and performance of an automatic metaphase finder based on the MRC fast interval processor.

1986 ◽ Vol 34 (10) ◽ pp. 1245-1252 ◽ Author(s): G Shippey, A D Carothers, J Gordon

The Medical Research Council's fast interval processor (FIP) has been adapted for metaphase finding and selection. This article summarizes recent improvements to the hardware, and describes the selection of image features. The system uses a highly simplified but effective clustering procedure to reduce computation time, and incorporates a ranking algorithm based on computed cluster features so that high-quality metaphases can be preferentially selected. Experimental results indicate that the system can detect high-quality metaphases rapidly in "rich" material and a high proportion of the available metaphases in "sparse" material. It can handle a wide range of material with good repeatability of performance.
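
The clustering procedure is only characterized above as highly simplified; as a purely illustrative sketch of that general scheme (grid-based clustering of detected objects followed by feature-based ranking — the grid size, minimum object count, and compactness score below are invented for the sketch, not the FIP's actual values):

```python
import numpy as np

def rank_metaphase_candidates(centroids, cell=64, min_objects=8):
    """Grid-based clustering and ranking sketch.

    centroids : (N, 2) array of detected-object positions on the slide.
    Objects are binned into coarse grid cells (a highly simplified
    clustering step), and each occupied cell is scored so that dense,
    compact clusters -- metaphase-like -- rank first.
    """
    bins = (centroids // cell).astype(int)          # assign each object to a grid cell
    clusters = {}
    for b, p in zip(map(tuple, bins), centroids):
        clusters.setdefault(b, []).append(p)

    ranked = []
    for b, pts in clusters.items():
        pts = np.asarray(pts)
        if len(pts) < min_objects:                  # too sparse to be a metaphase
            continue
        spread = pts.std(axis=0).mean()             # compactness feature
        score = len(pts) / (1.0 + spread)           # dense and compact scores high
        ranked.append((score, b, len(pts)))
    return sorted(ranked, reverse=True)             # best candidates first
```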

2019 ◽ Vol 12 (1) ◽ pp. 129-130 ◽ Author(s): Manlio Della Marca

Starting with this issue, our journal will include a completely redesigned Book Review Section, featuring three to five high-quality reviews by leading and emerging scholars from around the world. As for the selection of the books to be reviewed, even though I am a literary scholar, it is my intention as Review Editor to consider books that engage with the U.S. and the Americas as hemispheric and global phenomena, from a wide range of perspectives and disciplines, including anthropology, art history, and media studies.


Open Physics ◽ 2018 ◽ Vol 16 (1) ◽ pp. 741-750 ◽ Author(s): José Luis Roca, German Rodríguez-Bermúdez, Manuel Fernández-Martínez

Abstract. In this paper we update the state of the art concerning the application of fractal-based techniques to test for fractal patterns in physiological time series. The first half of the work covers selected approaches to calculating the self-similarity exponent of a time series, including broadly used procedures as well as recent advances that improve their accuracy and performance for a wide range of self-similar processes. The second part consists of a detailed review of high-quality studies carried out in the context of electroencephalogram signals; both medical and non-medical applications are reviewed in depth. This work is especially recommended to researchers interested in fractal pattern recognition for physiological time series.
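
Detrended fluctuation analysis (DFA) is one of the broadly used procedures of the kind reviewed in the first half; a minimal sketch in Python (the window scales and the linear detrending order are illustrative choices, not parameters from the paper):

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """Estimate the self-similarity (scaling) exponent of a series by
    detrended fluctuation analysis: integrate the series, split it into
    windows of each scale, remove a linear trend per window, and fit
    log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))                   # integrated (profile) series
    fluctuations = []
    for s in scales:
        n = len(y) // s
        windows = y[:n * s].reshape(n, s)
        t = np.arange(s)
        f2 = []
        for w in windows:
            coeffs = np.polyfit(t, w, 1)            # linear trend in this window
            f2.append(np.mean((w - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # slope of the log-log fit is the scaling exponent alpha
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

# White noise gives alpha ~ 0.5; strongly persistent signals approach 1.
print(dfa_exponent(np.random.default_rng(0).standard_normal(4096)))
```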


2010 ◽ Vol 163-167 ◽ pp. 2641-2646 ◽ Author(s): Rong Shi, Yue Lei He

This paper investigates the rail fastenings of Shanghai Metro Lines 1, 2, 3, and 4 from two main aspects: the distribution of the various fastener types across the lines and a comparison of fastening performance, in order to establish how rail fastenings are actually used on the Shanghai Metro. The popular fastener types on the Shanghai subway, such as the WJ-2, DTIII, DTIII-2, and Type II elastic fasteners, are analyzed, together with the vibration-damping characteristics of the track fasteners in actual service. The investigation concludes that the design and selection of Shanghai Metro rail fastenings meet the actual needs of the lines, and that the fastening series in use is varied and distinctive. It also points out shortcomings: a lack of adjustability, fastener corrosion, insufficient vibration reduction, and performance degradation, all of which complicate management and maintenance across a wide range of conditions. The paper therefore emphasizes investigating and analyzing problems in service, tracking their development, researching their causes, and improving maintenance measures, since keeping rail fastenings in good condition plays a crucial role in the safe operation of the subway.


2019 ◽ Vol 214 ◽ pp. 01051 ◽ Author(s): Julie Kirk

The design and performance of the ATLAS Inner Detector (ID) trigger algorithms running online on the High Level Trigger (HLT) processor farm for 13 TeV LHC collision data with high pileup are discussed. The HLT ID tracking is a vital component of all physics signatures in the ATLAS trigger: it enables the precise selection of the rare or interesting events needed for physics analysis without overwhelming the offline data storage in either size or rate. To cope with the high interaction rates expected in the 13 TeV LHC collisions, the ID trigger was redesigned during the 2013-15 long shutdown. The performance of the ID trigger in Run 2 data from 13 TeV LHC collisions has been excellent and has exceeded expectations, even at the very high interaction multiplicities observed at the end of data-taking in 2017. The detailed efficiencies and resolutions of the ID trigger in a wide range of physics signatures are presented for the Run 2 data. The superb performance of the ID trigger algorithms in these extreme pileup conditions demonstrates how the ID tracking continues to lie at the heart of the trigger performance in enabling the ATLAS physics program, and will continue to do so in the future.


2015 ◽ Vol 2015 ◽ pp. 1-14 ◽ Author(s): E. Castillo, D. P. Morales, A. García, L. Parrilla, E. Todorovich, ...

HDL-level design offers important advantages for applying watermarking to IP cores, but its complexity also requires tools that automate the watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, was employed to evaluate the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for signature distribution has not only extended and eased the applicability of this IPP technique, but has also improved the signature-hosting process itself. Three algorithms were studied in order to develop the automated tool; the selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to be protected. A 1D-DWT core and MD5 and SHA-1 digital signatures were used to illustrate the benefits of the new tool and its optimization of the extraction-logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time, saving designer effort and time.
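
The abstract does not reproduce the paper's cost function or move set, but the simulated-annealing alternative it favors follows the standard accept/reject pattern; a minimal, hypothetical sketch (the host-site representation and the cost function are assumptions of this sketch):

```python
import math, random

def anneal_hosting(candidates, n_bits, cost, t0=1.0, cooling=0.995, steps=5000):
    """Simulated-annealing sketch for choosing which combinational-logic
    sites host the signature bits.

    candidates : list of possible host sites (opaque objects).
    n_bits     : number of signature bits to place.
    cost       : function mapping a list of chosen sites to an
                 area/performance penalty (lower is better).
    """
    rng = random.Random(0)
    current = rng.sample(candidates, n_bits)
    cur_cost = cost(current)
    best, best_cost, t = list(current), cur_cost, t0
    for _ in range(steps):
        neighbor = list(current)
        neighbor[rng.randrange(n_bits)] = rng.choice(candidates)   # move one bit to another site
        delta = cost(neighbor) - cur_cost
        if delta < 0 or rng.random() < math.exp(-delta / t):       # accept downhill always,
            current, cur_cost = neighbor, cur_cost + delta         # uphill with prob e^(-d/T)
            if cur_cost < best_cost:
                best, best_cost = list(current), cur_cost
        t *= cooling                                               # geometric cooling schedule
    return best, best_cost
```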


2021 ◽ Vol 12 (1) ◽ pp. 18-36 ◽ Author(s): Eslam Mohammed Abdelkader

Selection of materials is pivotal for the success of engineering design applications, yet proper selection is a demanding task because of the wide range of material alternatives and performance attributes. This research therefore proposes an integrated multi-criteria decision-making method for evaluating a set of material alternatives. The alternatives are assessed with regard to their surface hardness, core hardness, surface fatigue limit, bending fatigue limit, ultimate tensile strength, and cost. The attribute weights are obtained with the criteria importance through inter-criteria correlation (CRITIC) algorithm. The research then employs six multi-criteria decision-making algorithms to rank the material alternatives, and an average-ranking algorithm is applied to generate a full consensus prioritization. A sensitivity analysis is also carried out to determine the most robust and the most sensitive multi-criteria decision-making algorithms.
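
The CRITIC weighting step has a compact form: each criterion's weight is proportional to its contrast intensity (the standard deviation of its normalized column) multiplied by its total conflict with the other criteria, Σ_k (1 − r_jk). A minimal sketch with hypothetical data (the paper's actual decision matrix would replace the toy one):

```python
import numpy as np

def critic_weights(X, benefit):
    """CRITIC weighting sketch.

    X       : (m alternatives x n criteria) decision matrix.
    benefit : boolean array, True where larger values are better
              (e.g. hardness, fatigue limits) and False where smaller
              values are better (e.g. cost).
    """
    lo, hi = X.min(axis=0), X.max(axis=0)
    R = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))  # min-max normalize
    sigma = R.std(axis=0, ddof=1)                # contrast intensity per criterion
    corr = np.corrcoef(R, rowvar=False)          # inter-criteria correlation matrix
    info = sigma * (1.0 - corr).sum(axis=0)      # information content C_j
    return info / info.sum()                     # weights sum to one

# Hypothetical 4 alternatives x 3 criteria (hardness, fatigue limit, cost)
X = np.array([[60., 500., 1.2],
              [55., 620., 0.9],
              [70., 480., 1.5],
              [65., 550., 1.0]])
print(critic_weights(X, np.array([True, True, False])))
```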


2021 ◽ Author(s): Katrin Hafner, Dave Wilson, Rob Mellors, Pete Davis

The decades-long recordings of high-quality open data from the Global Seismographic Network (GSN) have facilitated studies of Earth structure and earthquake processes, as well as monitoring of earthquakes and explosions worldwide. These data have also enabled a wide range of transformative, cross-disciplinary research that far exceeded the original expectations and design goals of the network, including studies of slow earthquakes, landslides, the Earth's "hum", glacial earthquakes, sea state, climate change, and induced seismicity.

The GSN continues to produce high-quality waveform data, metadata, and multiple data-quality metrics such as timing quality and noise levels. This requires encouraging equipment vendors to develop modern instrumentation, upgrading the stations with new seismic sensors and infrastructure, implementing consistent and well-documented calibrations, and monitoring noise performance. A Design Goals working group is convening to evaluate how well the GSN has met its original 1985 and 2002 goals, as well as how the network should evolve to meet the requirements for enabling new research and monitoring capabilities.

In collaboration with GEOFON and GEOSCOPE, the GSN is also reviewing the current global distribution and performance of the very-broadband and broadband stations that comprise these three networks. We are working to exchange expertise and experience with new technologies and deployment techniques, and to identify regions where we could collaborate to make operations more efficient, where current efforts overlap, or where we have similar needs for relocating stations.


Methodology ◽ 2007 ◽ Vol 3 (1) ◽ pp. 14-23 ◽ Author(s): Juan Ramon Barrada, Julio Olea, Vicente Ponsoda

Abstract. The Sympson-Hetter (1985) method provides a means of controlling the maximum exposure rate of items in computerized adaptive testing. Through a series of simulations, control parameters are set that mark the probability that an item, once selected, is administered. The method presents two main problems: it requires a long computation time for calculating the parameters, and the maximum exposure rate ends up slightly above the fixed limit. Van der Linden (2003) presented two alternatives that appear to solve both problems, but their impact on measurement accuracy had not yet been tested. We show that these methods over-restrict the exposure of some highly discriminating items and thus decrease accuracy. It is also shown that, when the desired maximum exposure rate is near the minimum possible value, these methods yield an empirical maximum exposure rate clearly above the goal. A new method, based on an initial estimation of the probability of administration and the probability of selection of the items with the restricted method (Revuelta & Ponsoda, 1998), is presented in this paper. It can be used with the Sympson-Hetter method and with both van der Linden methods. When used with Sympson-Hetter, this option speeds the convergence of the control parameters without decreasing accuracy.
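
The Sympson-Hetter calibration loop referred to above is compact enough to sketch. The version below is simplified to one administered item per simulated examinee, and select_order is a stand-in for the CAT item-selection rule rather than anything taken from the paper:

```python
import numpy as np

def calibrate_sympson_hetter(select_order, n_items, n_examinees, r_max,
                             n_cycles=25, seed=0):
    """Classic Sympson-Hetter calibration sketch.

    select_order : callable returning item indices for one simulated
                   examinee, most informative first (illustrative stand-in
                   for the CAT item-selection rule).
    Returns the control parameters K_i = P(administered | selected).
    """
    rng = np.random.default_rng(seed)
    K = np.ones(n_items)
    for _ in range(n_cycles):
        selected = np.zeros(n_items)
        for _ in range(n_examinees):
            for item in select_order():
                selected[item] += 1              # selection event for this item
                if rng.random() <= K[item]:      # administer with probability K_i
                    break                        # move on to the next examinee
        p_sel = selected / n_examinees           # empirical selection rates P(S_i)
        # Cap P(A_i) = K_i * P(S_i) at the target exposure rate r_max
        K = np.where(p_sel > r_max, r_max / np.maximum(p_sel, 1e-12), 1.0)
    return K
```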


2020 ◽ pp. 1-12 ◽ Author(s): Wu Xin, Qiu Daping

The inheritance and innovation of ancient architectural decoration art is an important avenue for the development of the construction industry, yet traditional approaches to digitizing this art are relatively backward, which leads to obvious distortion in the digitized results. To improve the digitization of ancient architectural decoration art, this paper combines image features with a neural network to construct a neural-network-based data-system model for the art form, graphically expressing both the static construction mode and the dynamic construction process of a building group. On this basis, three-dimensional model reconstruction and scene-simulation experiments for building groups are carried out. The performance of the proposed system is verified through simulation and performance testing, and the data are visualized with statistical methods. The results show that the proposed digitization of ancient architectural decoration art performs well.

