Terahertz Multilayer Thickness Measurements: Comparison of Optoelectronic Time and Frequency Domain Systems

Author(s):  
Lars Liebermeister ◽  
Simon Nellen ◽  
Robert B. Kohlhaas ◽  
Sebastian Lauck ◽  
Milan Deumer ◽  
...  

Abstract: We compare a state-of-the-art terahertz (THz) time domain spectroscopy (TDS) system and a novel optoelectronic frequency domain spectroscopy (FDS) system with respect to their performance in layer thickness measurements. We use identical sample sets, THz optics, and data evaluation methods for both spectrometers. On single-layer and multi-layer dielectric samples, we found a standard deviation of thickness measurements below 0.2 µm for TDS and below 0.5 µm for FDS. This factor of approximately two between the accuracies of the two systems is reproduced consistently across all samples. Although the TDS system achieves higher accuracy, FDS systems can be a competitive alternative for two reasons. First, the architecture of an FDS system is substantially simpler, and thus its price can be much lower than that of a TDS system. Second, an accuracy below 1 µm is sufficient for many real-world applications. This work may therefore serve as a starting point for a comprehensive cross-comparison of different terahertz systems developed for specific industrial applications.
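The abstract does not spell out the data evaluation method, but single-layer thickness extraction from a TDS trace commonly reduces to the time-of-flight relation between the front- and back-surface echoes: d = c·Δt / (2n) for a double-pass reflection at normal incidence. A minimal sketch (the delay and refractive index values below are hypothetical, not taken from the paper):

```python
# Single-layer thickness from a THz-TDS echo delay (hypothetical values).
# Assumes normal incidence and a known group refractive index n; the
# front- and back-surface reflections are separated by delta_t = 2 n d / c.

C = 299_792_458.0  # speed of light in vacuum, m/s

def thickness_from_delay(delta_t_s: float, n: float) -> float:
    """Layer thickness d = c * delta_t / (2 n) for a double-pass reflection."""
    return C * delta_t_s / (2.0 * n)

# Example: a 1.0 ps echo delay in a material with n = 1.5
d = thickness_from_delay(1.0e-12, 1.5)
print(f"{d * 1e6:.1f} µm")  # prints "99.9 µm"
```

At the sub-micrometre accuracies reported above, the dominant uncertainty is typically the refractive index calibration rather than the delay measurement itself.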

2021 ◽  
Vol 54 (6) ◽  
pp. 1-35
Author(s):  
Ninareh Mehrabi ◽  
Fred Morstatter ◽  
Nripsuta Saxena ◽  
Kristina Lerman ◽  
Aram Galstyan

With the widespread use of artificial intelligence (AI) systems and applications in our everyday lives, accounting for fairness has gained significant importance in the design and engineering of such systems. AI systems can be used in many sensitive environments to make important and life-changing decisions; thus, it is crucial to ensure that these decisions do not reflect discriminatory behavior toward certain groups or populations. More recently, work in traditional machine learning and deep learning has begun to address such challenges in different subdomains. With the commercialization of these systems, researchers are becoming more aware of the biases that these applications can contain and are attempting to address them. In this survey, we investigate different real-world applications that have exhibited biases in various ways, and we list the different sources of bias that can affect AI applications. We then create a taxonomy of the fairness definitions that machine learning researchers have proposed to avoid existing bias in AI systems. In addition, we examine different domains and subdomains in AI, showing what researchers have observed with regard to unfair outcomes in state-of-the-art methods and how they have tried to address them. Many directions and solutions remain open for mitigating the problem of bias in AI systems. We hope that this survey will motivate researchers to tackle these issues in the near future by building on existing work in their respective fields.


Algorithms ◽  
2021 ◽  
Vol 14 (7) ◽  
pp. 197
Author(s):  
Ali Seman ◽  
Azizian Mohd Sapawi

In the conventional k-means framework, seeding is the first step of the optimization, performed before the objects are clustered. Random seeding raises two main issues: the clustering results may be less than optimal, and a different result may be obtained on every run. In real-world applications, optimal and stable clustering is highly desirable. This report introduces a new clustering algorithm, the zero k-approximate modal haplotype (Zk-AMH) algorithm, which uses a simple and novel seeding mechanism based on the zero point of the multidimensional space. Zk-AMH provides cluster optimality and stability, thereby resolving both issues. Notably, across 100 runs the Zk-AMH algorithm yielded mean scores identical to its maximum and minimum scores, i.e., zero standard deviation, demonstrating its stability. Additionally, when applied to eight datasets, Zk-AMH achieved the highest mean scores on four, an approximately equal score on one, and marginally lower scores on the other three. With its optimality and stability, the Zk-AMH algorithm could be a suitable alternative for developing future clustering tools.
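The abstract does not give the details of the zero-point seeding mechanism, so the sketch below illustrates only the underlying point it makes: deterministic seeding makes a Lloyd-style clustering loop reproducible across runs (zero standard deviation), whereas random seeding does not. The evenly spaced initializer here is a hypothetical stand-in, not the Zk-AMH mechanism:

```python
import random

def kmeans(points, k, init, iters=20):
    """Plain Lloyd's algorithm; `init` supplies the starting centroids."""
    centers = init(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest centroid
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # recompute centroids; keep the old one if a cluster emptied
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return sorted(centers)

def random_init(points, k):
    """Classic random seeding: a different result on every run."""
    return [tuple(p) for p in random.sample(points, k)]

def deterministic_init(points, k):
    """Deterministic seeding (hypothetical): k evenly spaced sorted points."""
    pts = sorted(points)
    return [tuple(pts[i * len(pts) // k]) for i in range(k)]

data = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9), (9.0, 0.2), (9.1, 0.0)]
runs = {tuple(kmeans(data, 3, deterministic_init)) for _ in range(10)}
print(len(runs))  # prints 1: a single unique result across all runs
```

With `random_init` instead, the set of distinct results across runs generally has more than one element, which is exactly the instability the abstract describes.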


2021 ◽  
Vol 336 ◽  
pp. 06013
Author(s):  
Jizhaxi Dao ◽  
Zhijie Cai ◽  
Rangzhuoma Cai ◽  
Maocuo San ◽  
Mabao Ban

A corpus is an indispensable ingredient for statistical NLP research and real-world applications, so the corpus construction method has a direct impact on various downstream tasks. This paper proposes a method to construct a Tibetan text classification corpus based on a syllable-level processing technique, which we refer to as TC_TCCNL. Empirical evidence indicates that the algorithm achieves promising performance and may serve as a starting point for future research on Tibetan text classification.


2014 ◽  
pp. 8-20
Author(s):  
Kurosh Madani

In a large number of real-world problems and related applications, the modeling of complex behavior is the central issue. Over the past decades, new approaches based on Artificial Neural Networks (ANN) have been proposed to solve problems related to optimization, modeling, decision making, classification, data mining, and nonlinear function (behavior) approximation. Inspired by biological nervous systems and brain structure, Artificial Neural Networks can be seen as information processing systems that enable the elaboration of many original techniques covering a large field of applications. Among their most appealing properties are their learning and generalization capabilities. The main goal of this paper is to present, through some of the main ANN models and ANN-based techniques, their applicability to real-world industrial problems. Several examples drawn from industrial and real-world applications are presented and discussed.


2020 ◽  
Vol 68 ◽  
pp. 311-364
Author(s):  
Francesco Trovo ◽  
Stefano Paladino ◽  
Marcello Restelli ◽  
Nicola Gatti

Multi-Armed Bandit (MAB) techniques have been successfully applied to many classes of sequential decision problems over the past decades. However, non-stationary settings, although very common in real-world applications, have received little attention so far, and theoretical guarantees on the regret are known only for some frequentist algorithms. In this paper, we propose an algorithm, namely Sliding-Window Thompson Sampling (SW-TS), for non-stationary stochastic MAB settings. Our algorithm is based on Thompson Sampling and exploits a sliding-window approach to tackle, in a unified fashion, two different forms of non-stationarity studied separately so far: abruptly changing and smoothly changing. In the former, the reward distributions are constant during sequences of rounds, and their changes may be arbitrary and happen at unknown rounds; in the latter, the reward distributions smoothly evolve over rounds according to unknown dynamics. Under mild assumptions, we provide upper bounds on the dynamic pseudo-regret of SW-TS for the abruptly changing environment, for the smoothly changing one, and for the setting in which both forms of non-stationarity are present. Furthermore, we empirically show that SW-TS dramatically outperforms state-of-the-art algorithms even when the two forms of non-stationarity are considered separately, as previously studied in the literature.
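The sliding-window idea can be sketched for Bernoulli arms: each arm's Beta posterior is computed only from the most recent window of observations, so stale rewards fall out and the policy can track an abrupt change. This is a simplified sketch under assumed parameters (window size, horizon, change point), not the authors' exact algorithm or analysis:

```python
import random

def sw_thompson(arms_p, horizon=2000, window=200, seed=0):
    """Sliding-Window Thompson Sampling for Bernoulli arms (sketch).
    Each arm's posterior is Beta(1 + wins, 1 + losses) computed over the
    last `window` rounds only, so old observations are forgotten and the
    policy can follow non-stationary reward distributions."""
    rng = random.Random(seed)
    history = []                 # (arm, reward) pairs in the current window
    pulls = [0] * len(arms_p)
    for t in range(horizon):
        stats = [[1, 1] for _ in arms_p]        # Beta(1, 1) priors
        for arm, r in history:                  # refit from the window
            stats[arm][0] += r
            stats[arm][1] += 1 - r
        samples = [rng.betavariate(a, b) for a, b in stats]
        arm = max(range(len(arms_p)), key=samples.__getitem__)
        r = 1 if rng.random() < arms_p[arm](t) else 0
        history.append((arm, r))
        if len(history) > window:
            history.pop(0)                      # slide the window
        pulls[arm] += 1
    return pulls

# Abruptly changing environment: the best arm switches at t = 1000.
arms = [lambda t: 0.9 if t < 1000 else 0.1,
        lambda t: 0.1 if t < 1000 else 0.9]
print(sw_thompson(arms))
```

Because arm probabilities swap halfway through, both arms end up heavily pulled: the window empties of the old winner's successes within roughly `window` rounds of the change, after which the new best arm dominates.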


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 407 ◽  
Author(s):  
Dominik Weikert ◽  
Sebastian Mai ◽  
Sanaz Mostaghim

In this article, we present a new algorithm called Particle Swarm Contour Search (PSCS), a Particle Swarm Optimisation inspired algorithm for finding object contours in 2D environments. Currently, most contour-finding algorithms are based on image processing and require a complete overview of the search space in which the contour is to be found. For real-world applications, however, such complete knowledge of the search space may not always be available. The proposed algorithm removes this requirement and relies only on the local information of the particles to accurately identify a contour. Particles search for the contour of an object and then traverse along it using their known information about positions inside and outside the object. Our experiments show that the proposed PSCS algorithm delivers results comparable to those of the state of the art.
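The key claim, that a particle needs only local inside/outside queries rather than a global view, can be illustrated with a single "particle" bisecting between a known inside and a known outside position. The disc-shaped object and the bisection step are hypothetical illustrations; the full swarm dynamics and contour traversal of PSCS are omitted:

```python
import math

def inside(x, y):
    """Hypothetical object: a disc of radius 2 centred at the origin."""
    return x * x + y * y <= 4.0

def find_contour_point(x_in, y_in, x_out, y_out, tol=1e-6):
    """Locate a contour point by bisecting between a known inside and a
    known outside position, using only local inside/outside queries, in
    the spirit of a PSCS particle refining its position estimate."""
    while math.hypot(x_out - x_in, y_out - y_in) > tol:
        xm, ym = (x_in + x_out) / 2, (y_in + y_out) / 2
        if inside(xm, ym):
            x_in, y_in = xm, ym        # midpoint is inside: move inner point
        else:
            x_out, y_out = xm, ym      # midpoint is outside: move outer point
    return (x_in + x_out) / 2, (y_in + y_out) / 2

x, y = find_contour_point(0.0, 0.0, 5.0, 0.0)
print(round(math.hypot(x, y), 4))  # prints 2.0, the disc radius
```

No image of the whole search space is ever built; every decision is made from point-wise membership tests, which is what lets such a method operate without a complete overview of the environment.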


2008 ◽  
Vol 8 (5-6) ◽  
pp. 545-580 ◽  
Author(s):  
WOLFGANG FABER ◽  
GERALD PFEIFER ◽  
NICOLA LEONE ◽  
TINA DELL'ARMI ◽  
GIUSEPPE IELPA

Abstract: Disjunctive logic programming (DLP) is a very expressive formalism. It allows for expressing every property of finite structures that is decidable in the complexity class Σ₂^P (= NP^NP). Despite this high expressiveness, there are some simple properties, often arising in real-world applications, which cannot be encoded in a simple and natural manner. In particular, properties that require the use of arithmetic operators (like sum, times, or count) on a set or multiset of elements satisfying some conditions cannot be naturally expressed in classic DLP. To overcome this deficiency, we extend DLP by aggregate functions in a conservative way. In particular, we avoid the introduction of constructs with disputed semantics by requiring aggregates to be stratified. We formally define the semantics of the extended language (called ), and illustrate how it can be profitably used for representing knowledge. Furthermore, we analyze the computational complexity of , showing that the addition of aggregates does not bring a higher cost in that respect. Finally, we provide an implementation of in DLV, a state-of-the-art DLP system, and report on experiments which confirm the usefulness of the proposed extension also for the efficiency of computation.


Geophysics ◽  
2014 ◽  
Vol 79 (6) ◽  
pp. E269-E286 ◽  
Author(s):  
Sébastien de la Kethulle de Ryhove ◽  
Rune Mittet

Frequency-domain methods, which are typically applied to 3D magnetotelluric (MT) modeling, require solving a system of linear equations for every frequency of interest, which is both memory-intensive and computationally expensive. We developed a finite-difference time-domain algorithm to perform 3D MT modeling in a marine environment in which Maxwell's equations are solved in a so-called fictitious-wave domain. Boundary conditions are efficiently treated via convolutional perfectly matched layers, for which we evaluated optimized parameter values obtained by testing over a large number of models. In comparison to the typically applied frequency-domain methods, the finite-difference time-domain method has two advantages: (1) it is an explicit, low-memory method that entirely avoids the solution of systems of linear equations, and (2) it allows the computation of the electromagnetic field unknowns at all frequencies of interest in a single simulation. We derive a design criterion for vertical node spacing in a nonuniform grid using dispersion analysis as a starting point. Modeling results obtained using our finite-difference time-domain algorithm are compared with results obtained using an integral equation method, and the agreement is found to be very good. We also discuss a real data inversion example in which MT modeling was done with our algorithm.
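The "explicit, low-memory" character of FDTD can be seen in a generic 1D Yee-style leapfrog update: each time step only reads neighbouring field values, so no linear system is ever assembled or solved. This is a textbook vacuum sketch with assumed normalized units, not the authors' fictitious-wave-domain marine MT scheme:

```python
import math

# Generic 1D vacuum FDTD leapfrog (Yee) update. Each step touches only
# neighbouring grid values, so the method is explicit: memory holds just
# the two field arrays, and no system of linear equations is solved.
N = 200            # grid cells
steps = 400
dz, dt = 1.0, 0.5  # dt <= dz/c (c = 1 here) satisfies the 1D CFL limit

E = [0.0] * N      # electric field samples
H = [0.0] * N      # magnetic field samples, staggered by half a cell

for n in range(steps):
    for i in range(N - 1):                       # H update from the curl of E
        H[i] += dt / dz * (E[i + 1] - E[i])
    for i in range(1, N):                        # E update from the curl of H
        E[i] += dt / dz * (H[i] - H[i - 1])
    E[100] += math.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source pulse

print(f"peak |E| on the grid after {steps} steps: {max(abs(e) for e in E):.2f}")
```

A broadband pulse like this also hints at the paper's second advantage: a single explicit simulation contains all frequencies of interest, which can then be extracted by Fourier transforming the recorded time series.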


2018 ◽  
Author(s):  
Aditi Kathpalia ◽  
Nithin Nagaraj

Causality testing methods are widely used in various disciplines of science. Model-free methods for causality estimation are very useful, as the underlying model generating the data is often unknown. However, existing model-free measures assume separability of cause and effect at the level of individual samples of measurements and, unlike model-based methods, do not perform any intervention to learn causal relationships. These measures can thus only capture causality that manifests as the associational occurrence of 'cause' and 'effect' in well-separated samples. In real-world processes, 'cause' and 'effect' are often inherently inseparable, or become inseparable in the acquired measurements. We propose a novel measure that uses an adaptive interventional scheme to capture causality that is not merely associational. The scheme is based on characterizing the complexities associated with the dynamical evolution of processes on short windows of measurements. The formulated measure, Compression-Complexity Causality, is rigorously tested on simulated and real datasets, and its performance is compared with that of existing measures such as Granger Causality and Transfer Entropy. The proposed measure is robust to the presence of noise, long-term memory, filtering and decimation, low temporal resolution (including aliasing), non-uniform sampling, finite-length signals, and the presence of common driving variables. Our measure outperforms existing state-of-the-art measures, establishing itself as an effective tool for causality testing in real-world applications.
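The measure builds on compression-complexity of short measurement windows. As a loose illustration of that general idea only (using zlib as a crude complexity proxy and a normalised compression distance, not the effort-to-compress measure or the CCC formula from the paper), one can ask how much harder a joint window is to compress than its parts:

```python
import zlib

def C(s: bytes) -> int:
    """Crude complexity proxy: length of the zlib-compressed window.
    (A stand-in for the compression-complexity measures used in CCC.)"""
    return len(zlib.compress(s, 9))

def compression_distance(x: bytes, y: bytes) -> float:
    """Normalised compression distance between two symbol windows: small
    when one window is well predicted (compressed) by the other."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = bytes([i % 7 for i in range(400)])           # highly regular process
b = bytes([(i * i) % 251 for i in range(400)])   # unrelated process

print(compression_distance(a, a) < compression_distance(a, b))  # True
```

A window concatenated with a related window adds little compressed length, while an unrelated window adds nearly its own full complexity; CCC refines this intuition with an interventional, window-wise scheme rather than plain concatenation.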


Processes ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 440 ◽  
Author(s):  
Guanghui Wang ◽  
Qun Liu ◽  
Chuanzhen Wang ◽  
Lulu Dong ◽  
Dan Dai ◽  
...  

Hydrocyclones are widely known as important separation devices used in many industrial fields. However, the usual methods of estimating device performance are time-consuming and costly. The aim of this paper was to investigate blockage diagnosis for a lab-scale hydrocyclone using a vibration-based technique built on wavelet denoising and the discrete-time Fourier transform. The results indicate that the farther the installation location is from the feed inlet, the more regular the frequency becomes; that is, installation planes near the spigot produce a regular frequency distribution. Furthermore, the acceleration amplitude under blockage degrees of 0%, 50% and 100% fluctuates sinusoidally with increasing time, while the vibration frequency of the hydrocyclone rises with increasing throughput. Moreover, the distribution of four dimensional and five non-dimensional time-domain parameters shows that the standard deviation, unlike the others, decreased gradually with increasing blockage degree; the standard deviation was therefore used for online diagnosis of blockage. The frequency-domain distribution under different throughputs reveals characteristic peaks consisting of a faulty frequency, produced by the blockage, and multiple frequencies, produced by the feed pump. Hence, the faulty peak at 16-17 Hz was adopted to judge the real-time blockage of the hydrocyclone: the presence of the characteristic peak marks the blockage, and its value is proportional to the blockage degree. The application of the online monitoring system demonstrates that combining the time domain and the frequency domain can reliably detect the running state and rapidly recognize blockage faults.
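The two indicators the abstract combines, a falling time-domain standard deviation and an emerging 16-17 Hz faulty peak, can be sketched on synthetic signals. The sampling rate, amplitudes, and waveforms below are hypothetical (a pump tone whose level drops under blockage, plus a 16.5 Hz fault component), chosen only to mirror the reported behaviour:

```python
import math
import statistics

def dft_magnitude(signal, fs, f):
    """Magnitude of the single-frequency DFT of `signal` (sampled at fs Hz)
    evaluated at frequency f, i.e. the amplitude of that spectral component."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

fs = 1000                                 # hypothetical sampling rate, Hz
t = [i / fs for i in range(2000)]         # 2 s of vibration data

# Hypothetical signals: blockage lowers the overall vibration level
# (50 Hz pump tone) and introduces a 16.5 Hz fault component.
healthy = [0.8 * math.sin(2 * math.pi * 50.0 * x) for x in t]
clogged = [0.3 * math.sin(2 * math.pi * 50.0 * x)
           + 0.2 * math.sin(2 * math.pi * 16.5 * x) for x in t]

# Time-domain indicator: standard deviation drops under blockage.
print(statistics.stdev(clogged) < statistics.stdev(healthy))   # True
# Frequency-domain indicator: the 16-17 Hz faulty peak appears only when clogged.
print(dft_magnitude(clogged, fs, 16.5) > 5 * dft_magnitude(healthy, fs, 16.5))  # True
```

Monitoring both indicators together, as the paper does, guards against false alarms: a level drop alone could have other causes, but a drop coinciding with the 16-17 Hz peak points specifically at blockage.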

