alignment technique
Recently Published Documents

TOTAL DOCUMENTS: 323 (FIVE YEARS: 50)
H-INDEX: 20 (FIVE YEARS: 3)

2022, pp. 102-108
Author(s): Charles C.J. Rivière, Philippe Cartier, Pascal André Vendittoli, Justin Cobb

2021, Vol 2021, pp. 1-7
Author(s): Weiwei Lin, Reiko Haga

Security ontology can be used to build a shared knowledge model for an application domain and thereby overcome data heterogeneity, but it suffers from a heterogeneity issue of its own. Ontology alignment, i.e., finding identical entities in two ontologies, is a solution. It is important to select an effective similarity measure (SM) to distinguish heterogeneous entities; however, due to the complex semantic relationships among concepts, no single SM is guaranteed to be effective in all alignment tasks. How SMs are aggregated, so that their strengths compensate for one another's weaknesses, directly affects the quality of the resulting alignments. In this work, we formally define this problem, discuss its challenges, and present a problem-specific genetic algorithm (GA) to address it effectively. We experimentally test our approach on bibliographic tracks provided by OAEI and on five pairs of security ontologies. The results show that the GA can effectively address different heterogeneous ontology-alignment tasks and determine high-quality security ontology alignments.
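The core idea, aggregating several similarity measures with GA-tuned weights and scoring the induced alignment against a reference by f-measure, can be sketched as follows. The similarity measures, the toy concept names, the reference alignment, and all GA settings below are illustrative assumptions, not taken from the paper:

```python
import random

random.seed(0)

# Three toy similarity measures (SMs) -- illustrative stand-ins for the
# lexical/structural SMs an ontology matcher would aggregate.
def sm_edit(a, b):
    """Normalized edit-distance similarity (rolling one-row DP)."""
    m, n = len(a), len(b)
    row = list(range(n + 1))
    for i in range(1, m + 1):
        prev, row[0] = row[0], i
        for j in range(1, n + 1):
            prev, row[j] = row[j], min(row[j] + 1, row[j - 1] + 1,
                                       prev + (a[i - 1] != b[j - 1]))
    return 1 - row[n] / max(m, n, 1)

def sm_prefix(a, b):
    """Shared-prefix ratio."""
    k = 0
    while k < min(len(a), len(b)) and a[k] == b[k]:
        k += 1
    return k / max(len(a), len(b), 1)

def sm_token(a, b):
    """Jaccard similarity of underscore-separated tokens."""
    ta, tb = set(a.split('_')), set(b.split('_'))
    return len(ta & tb) / len(ta | tb)

SMS = [sm_edit, sm_prefix, sm_token]

# Toy alignment task: concept names from two security ontologies and the
# gold-standard reference alignment (all made up for this sketch).
ONTO1 = ["access_control", "threat_agent", "security_policy", "asset"]
ONTO2 = ["accesscontrol", "threat_actor", "security_policy", "audit_log"]
REFERENCE = {("access_control", "accesscontrol"),
             ("threat_agent", "threat_actor"),
             ("security_policy", "security_policy")}

def f_measure(weights, threshold=0.6):
    """Fitness: f-measure of the alignment induced by weighted-sum aggregation."""
    total = sum(weights) or 1e-9
    found = {(a, b) for a in ONTO1 for b in ONTO2
             if sum(w * sm(a, b) for w, sm in zip(weights, SMS)) / total >= threshold}
    if not found:
        return 0.0
    p = len(found & REFERENCE) / len(found)
    r = len(found & REFERENCE) / len(REFERENCE)
    return 2 * p * r / (p + r) if p + r else 0.0

def evolve(pop_size=20, generations=30):
    """Minimal GA over SM weight vectors: elitism, truncation selection,
    blend crossover, Gaussian mutation."""
    pop = [[random.random() for _ in SMS] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=f_measure, reverse=True)
        nxt = ranked[:2]                              # elitism
        while len(nxt) < pop_size:
            p1, p2 = random.sample(ranked[:10], 2)    # select from the top half
            child = [(u + v) / 2 for u, v in zip(p1, p2)]
            if random.random() < 0.3:                 # Gaussian mutation
                i = random.randrange(len(child))
                child[i] = max(0.0, child[i] + random.gauss(0, 0.2))
            nxt.append(child)
        pop = nxt
    return max(pop, key=f_measure)

best = evolve()
print(best, f_measure(best))
```

On this toy task, equal weights merge the measures poorly (the edit-distance signal is diluted), while the GA shifts weight toward whichever SMs separate the reference pairs from the cross-pairs, which is the complementarity effect the abstract describes.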


Author(s): A. T. Ringler, R. E. Anthony

As seismologists continue to place more stringent demands on data quality, accurately described metadata are becoming increasingly important. To better constrain the orientation and sensitivities of seismometers deployed in U.S. Geological Survey networks, the Albuquerque Seismological Laboratory (ASL) has recently begun identifying true north with a fiber-optic gyroscope (FOG) and has developed methodologies to constrain mid-band, vertical-component sensitivity levels to better than 1% in a controlled environment. However, questions remain regarding the accuracy of this new alignment technique, as well as whether instrument sensitivities and background noise levels remain stable when the seismometers are installed in different environmental settings. In this study, we examine the stability and repeatability of these parameters by reinstalling two high-quality broadband seismometers (a Streckeisen STS-2.5 and a Nanometrics T-360 Global Seismographic Network (GSN) version) at different locations around the ASL and comparing them to each other and to a reference STS-6 seismometer that remained stationary for the duration of the experiment. We find that even under different environmental conditions, the sensitivities of the two broadband seismometers stayed stable to within 0.1%, and that orientations attained using the FOG are generally accurate to within a degree. However, one installation was off by 5° due to a mistake made by the installation team. These results indicate that while the technology and methodologies are now in place to calibrate and orient a seismometer to within 1°, human error, both during installation and while producing the metadata, is often the limiting factor. Finally, we find that background noise levels at short periods (0.1–1 s) increase when the sensors are emplaced in unconsolidated materials, whereas noise levels at long periods (30–100 s) on the vertical components are not sensitive to local geological structure.
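The practical stakes of a small orientation error can be illustrated with a short sketch: a sensor misoriented by an azimuth error θ records a rotated version of the true north/east motion, the cross-axis leakage scales as sin θ, and once the FOG-measured azimuth is in the metadata the fix is a plane rotation. The function, sign convention, and sample values below are illustrative, not the paper's processing code:

```python
import math

# A seismometer misoriented by azimuth_error_deg records a rotated version
# of the true north/east ground motion; the metadata-driven correction is
# a 2-D rotation. Sign convention and samples are illustrative only.
def correct_orientation(north, east, azimuth_error_deg):
    th = math.radians(azimuth_error_deg)
    n_true = [n * math.cos(th) - e * math.sin(th) for n, e in zip(north, east)]
    e_true = [n * math.sin(th) + e * math.cos(th) for n, e in zip(north, east)]
    return n_true, e_true

# Cross-axis leakage fraction for a given misorientation
for deg in (1, 5):
    print(f"{deg} deg -> {math.sin(math.radians(deg)):.3f} cross-axis leakage")
```

A 1° error leaks about 1.7% of one horizontal component into the other, while the 5° installation error reported above corresponds to roughly 8.7%, which is why the human-error failure mode dominates once the instrumentation itself is good to within a degree.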


Metrologia, 2021
Author(s): Yang Bai, Dawei Wang, Zhengkun Li, Yunfeng Lu, Pengcheng Hu, ...

2021
Author(s): Hamsa Bastani, David Simchi-Levi, Ruihao Zhu

We study the problem of learning shared structure across a sequence of dynamic pricing experiments for related products. We consider a practical formulation in which the unknown demand parameters for each product come from an unknown distribution (prior) that is shared across products. We then propose a meta dynamic pricing algorithm that learns this prior online while solving a sequence of Thompson sampling pricing experiments (each with horizon T) for N different products. Our algorithm addresses two challenges: (i) balancing the need to learn the prior (meta-exploration) with the need to leverage the estimated prior to achieve good performance (meta-exploitation) and (ii) accounting for uncertainty in the estimated prior by appropriately “widening” the estimated prior as a function of its estimation error. We introduce a novel prior alignment technique to analyze the regret of Thompson sampling with a misspecified prior, which may be of independent interest. Unlike prior-independent approaches, our algorithm’s meta regret grows sublinearly in N, demonstrating that the price of an unknown prior in Thompson sampling can be negligible in experiment-rich environments (large N). Numerical experiments on synthetic and real auto loan data demonstrate that our algorithm significantly speeds up learning compared with prior-independent algorithms. This paper was accepted by George J. Shanthikumar for the Management Science Special Issue on Data-Driven Analytics.
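The algorithm's two layers can be sketched in spirit for a toy linear-demand model: each product's demand parameters are drawn from a shared Gaussian prior, each pricing experiment is Thompson sampling via Bayesian linear regression, and the meta layer re-estimates the prior from completed products and widens its covariance to hedge against estimation error. Everything below, the demand model, price grid, horizons, and widening factor, is an illustrative assumption, not the paper's algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear demand d = alpha - beta * p + noise; the true (alpha, beta) of
# each product is drawn from a shared prior the seller does not know.
TRUE_PRIOR_MEAN = np.array([10.0, 1.5])
TRUE_PRIOR_COV = np.diag([1.0, 0.05])
NOISE_SD = 0.5
PRICES = np.linspace(1.0, 6.0, 11)            # candidate price grid

def run_product(theta_true, prior_mean, prior_cov, horizon=200):
    """One Thompson-sampling pricing experiment (Bayesian linear regression)."""
    prec = np.linalg.inv(prior_cov)            # posterior precision
    b = prec @ prior_mean                      # precision-weighted mean
    for _ in range(horizon):
        post_cov = np.linalg.inv(prec)
        theta = rng.multivariate_normal(post_cov @ b, post_cov)  # posterior draw
        p = PRICES[np.argmax(PRICES * (theta[0] - theta[1] * PRICES))]
        d = theta_true[0] - theta_true[1] * p + rng.normal(0, NOISE_SD)
        x = np.array([1.0, -p])                # features so that d = x @ theta
        prec += np.outer(x, x) / NOISE_SD**2
        b += x * d / NOISE_SD**2
    return np.linalg.inv(prec) @ b             # posterior-mean estimate of theta

# Meta layer: estimate the shared prior from finished products and widen the
# covariance by an O(1/sqrt(n)) factor to account for estimation error.
estimates = []
for _ in range(20):
    theta_true = rng.multivariate_normal(TRUE_PRIOR_MEAN, TRUE_PRIOR_COV)
    if len(estimates) < 3:                     # meta-exploration: vague prior
        mean, cov = np.zeros(2), 25.0 * np.eye(2)
    else:                                      # meta-exploitation: learned, widened prior
        arr = np.array(estimates)
        mean = arr.mean(axis=0)
        cov = np.cov(arr.T) * (1.0 + 1.0 / np.sqrt(len(arr))) + 1e-3 * np.eye(2)
    estimates.append(run_product(theta_true, mean, cov))

print(np.array(estimates).mean(axis=0))        # should approach TRUE_PRIOR_MEAN
```

The widening step is the key safeguard: plugging in the raw estimated prior can make Thompson sampling overconfident in a misspecified direction, whereas inflating the covariance by a margin that shrinks as more products finish lets the meta regret grow sublinearly in the number of products, as the abstract claims.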


2021, Vol 11 (17), pp. 8028
Author(s): Dong Wook Shin, Lue Quan, Yuki Shimizu, Hiraku Matsukuma, Yindi Cai, ...

Major modifications are made to the setup and signal processing of a method for in-situ measurement of the pitch of a diffraction grating, which is based on the diffraction angles of an optical frequency comb laser diffracted by the grating. An improvement in the uncertainty of the in-situ pitch measurement can be expected because every mode in the diffracted optical frequency comb laser can be utilized. Instead of employing a Fabry-Pérot etalon to separate the neighboring modes in the group of diffracted laser beams, the weight-of-mass method is introduced to detect the light wavelength in the Littrow configuration. An attempt is also made to reduce the influence of the non-uniform spectrum of the optical comb laser employed in the setup through a normalization operation. In addition, an optical alignment technique employing a retroreflector is introduced for the precise alignment of the optical components in the setup. Furthermore, a mathematical model of the pitch measurement by the proposed method is established, and a theoretical analysis of the uncertainty of the pitch measurement is carried out based on the Guide to the Expression of Uncertainty in Measurement (GUM).
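The geometric core of the measurement is compact enough to sketch. In the Littrow configuration the diffracted beam retraces the incident beam, so 2d sin θ = mλ, giving the pitch d = mλ / (2 sin θ); a first-order GUM propagation then yields (u_d/d)² = (u_λ/λ)² + (u_θ/tan θ)². The numerical values below are illustrative, not the paper's results:

```python
import math

# Littrow condition: the diffracted beam retraces the incident beam, so
# 2 * d * sin(theta) = m * lambda and the pitch is d = m * lambda / (2 sin theta).
def pitch_littrow(wavelength, theta, order=1):
    return order * wavelength / (2.0 * math.sin(theta))

# First-order GUM propagation: since dd/dlambda = d/lambda and
# dd/dtheta = -d/tan(theta),
# (u_d/d)^2 = (u_lambda/lambda)^2 + (u_theta/tan(theta))^2.
def pitch_uncertainty(wavelength, u_wavelength, theta, u_theta, order=1):
    d = pitch_littrow(wavelength, theta, order)
    rel = math.hypot(u_wavelength / wavelength, u_theta / math.tan(theta))
    return d, d * rel

# Illustrative numbers: a comb mode near 1550 nm on a pitch-1-um grating
d, u_d = pitch_uncertainty(1550e-9, 1e-15, math.radians(50.8), 5e-6)
print(f"pitch = {d:.6e} m, u = {u_d:.1e} m")
```

With comb modes, the wavelength term is tiny (the comb's fractional frequency uncertainty is far below the angle term), so in this sketch the pitch uncertainty is dominated by u_θ, which is why the precise optical alignment the abstract describes matters.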


2021
Author(s): Fabio F. de Oliveira, Leonardo A. Dias, Marcelo Fernandes

In bioinformatics, alignment is an essential technique for finding similarities between biological sequences. The alignment is usually performed with the Smith-Waterman (SW) algorithm, a well-known high-precision sequence alignment technique based on dynamic programming. However, given the massive data volume in biological databases and its continuous exponential growth, high-speed data processing is necessary. Therefore, this work proposes a parallel hardware design for the SW algorithm with a systolic array structure to accelerate the forward and backtracking steps. For this purpose, the architecture calculates and stores the paths during the forward stage, pre-organizing the alignment and thereby reducing the complexity of the backtracking stage. The backtracking starts from the maximum-score position in the matrix and generates the optimal SW sequence alignment path. The architecture was validated on a Field-Programmable Gate Array (FPGA), and synthesis analyses have shown that the proposed design reaches up to 79.5 Giga Cell Updates per Second (GCUPS).
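For reference, the forward (score-matrix) and backtracking stages that the systolic array accelerates look as follows in plain software form; the scoring parameters are classic textbook values (match +3, mismatch −3, linear gap −2), not those of the FPGA design:

```python
# Plain-software reference for the forward and backtracking stages of
# Smith-Waterman local alignment. Scoring values are the classic textbook
# parameters (match +3, mismatch -3, linear gap -2), not the paper's.
def smith_waterman(a, b, match=3, mismatch=-3, gap=-2):
    m, n = len(a), len(b)
    H = [[0] * (n + 1) for _ in range(m + 1)]
    best, pos = 0, (0, 0)
    for i in range(1, m + 1):             # forward stage: fill the score matrix
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i - 1][j - 1] + s,
                          H[i - 1][j] + gap, H[i][j - 1] + gap)
            if H[i][j] > best:
                best, pos = H[i][j], (i, j)
    i, j = pos                             # backtracking from the maximum score
    out_a, out_b = [], []
    while i > 0 and j > 0 and H[i][j] > 0:
        s = match if a[i - 1] == b[j - 1] else mismatch
        if H[i][j] == H[i - 1][j - 1] + s:
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif H[i][j] == H[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append('-'); i -= 1
        else:
            out_a.append('-'); out_b.append(b[j - 1]); j -= 1
    return best, ''.join(reversed(out_a)), ''.join(reversed(out_b))

score, x, y = smith_waterman("TGTTACGG", "GGTTGACTA")
print(score, x, y)   # optimal local score for this classic pair is 13
```

The loop nest makes the data dependency explicit: each cell needs only its left, upper, and diagonal neighbors, so all cells on an anti-diagonal are independent, which is exactly the parallelism a systolic array exploits by assigning one processing element per cell.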


Micromachines, 2021, Vol 12 (8), pp. 854
Author(s): Bo Chang, Yuhang Feng, Jialong Jin, Quan Zhou

The capillary self-alignment technique can achieve highly accurate and fast alignment of micro components; it relies on the confinement of liquid droplets at receptor sites, for which hydrophobic–hydrophilic patterns are widely used. This paper reports a low-cost microsecond pulse laser micromachining method for fabricating super-hydrophilic–super-hydrophobic grooves as receptor sites for the capillary self-alignment of microfibers. We investigated the influence of the major manufacturing parameters on groove sizes and wetting properties. The effects of the width (20 µm–100 µm) and depth (8 µm–36 µm) of the groove on the volume of the water droplet contained inside it were also investigated. We show that by altering the scanning speed of a de-focused laser beam, we can modify the wetting properties of the microgrooves from 10° to 120° in terms of contact angle. We demonstrated that different types of microfibers, both natural and artificial, can self-align to size-matching super-hydrophilic–super-hydrophobic microgrooves. The results show that such microgrooves have great potential in microfiber micromanipulation applications such as natural microfiber categorization, fiber-based microsensor construction, and fiber-reinforced material development.

