GFT NMR, a New Approach To Rapidly Obtain Precise High-Dimensional NMR Spectral Information

2003 ◽  
Vol 125 (5) ◽  
pp. 1385-1393 ◽  
Author(s):  
Seho Kim ◽  
Thomas Szyperski

2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Alexander Dementjev ◽  
Burkhard Hensel ◽  
Klaus Kabitzsch ◽  
Bernd Kauschinger ◽  
Steffen Schroeder

Machine tools are important parts of highly complex industrial manufacturing, and end-product quality depends strictly on the accuracy of these machines; however, they are prone to deformation caused by their own heat. This deformation must be compensated in order to ensure accurate production, so an adequate model of the high-dimensional thermal deformation process must be created and its parameters evaluated. Unfortunately, such parameters are often unknown and cannot be calculated a priori. Parameter identification during real experiments is not an option for these models because of the high engineering and machine-time effort involved, and installing additional sensors to measure these parameters directly is uneconomical. Instead, effective calibration of thermal models can be achieved by combining real and virtual measurements on a machine tool during its real operation, without installing additional sensors. In this paper, a new approach for thermal model calibration is presented. The expected results are very promising, and the approach can be recommended as an effective solution for this class of problems.
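The calibration idea of fitting model parameters to measured deformation can be sketched in a few lines. The first-order model below (a gain `k` and time constant `tau`), the numeric values, and the grid-search procedure are all illustrative assumptions, not the paper's actual method:

```python
import numpy as np

# Hypothetical first-order thermal deformation model (not from the paper):
# deformation d(t) = k * (1 - exp(-t / tau)), with unknown gain k and time constant tau.
def model(t, k, tau):
    return k * (1.0 - np.exp(-t / tau))

rng = np.random.default_rng(0)
t = np.linspace(0, 600, 61)                    # machine operating time [s]
true_k, true_tau = 35e-6, 180.0                # "real" machine behaviour (assumed values)
real = model(t, true_k, true_tau) + rng.normal(0, 1e-6, t.size)  # noisy real measurements [m]

# Calibrate by grid search over tau; for each tau, the gain k that best matches
# the "virtual measurement" (model output) follows in closed form by least squares.
best = (np.inf, None, None)
for tau in np.linspace(50, 400, 351):
    basis = 1.0 - np.exp(-t / tau)             # virtual measurement from the model
    k = (basis @ real) / (basis @ basis)       # least-squares gain
    err = np.sum((real - k * basis) ** 2)
    if err < best[0]:
        best = (err, k, tau)

_, k_hat, tau_hat = best
print(f"k ≈ {k_hat:.2e}, tau ≈ {tau_hat:.0f}")
```

The recovered parameters land close to the "true" machine behaviour without any dedicated deformation sensor beyond the calibration measurement itself, which is the spirit of the combined real/virtual approach.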





2019 ◽  
Author(s):  
Sri Harsha Kondapalli ◽  
Shantanu Chakrabartty

In variance-based logic (VBL), information is encoded in the change in the variance of a signal, as opposed to conventional mean-based logic (MBL), where information is encoded in the change in the mean of the signal. In this paper, we compare the fundamental limits on the minimum energy per bit that can be achieved by VBL and MBL representations in a high-dimensional signal space. We show that while for MBL representations the trade-off between energy-per-bit and bit-error-rate (BER) is fundamentally constrained by the classical Shannon limit, with VBL representations it is theoretically possible to achieve arbitrarily small BER while dissipating near-zero energy-per-bit. This surprising result has been verified for additive white Gaussian noise (AWGN) channels using Monte Carlo simulations. We believe that high-dimensional VBL-based encoding could provide a new approach to designing ultra-energy-efficient communication and sensing systems.
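A toy Monte Carlo simulation can illustrate the encoding idea: a bit is carried by the variance of a zero-mean high-dimensional signal, and the receiver decides by thresholding the sample variance. The dimensions, variance levels, and threshold below are assumed for illustration and do not reproduce the paper's energy analysis:

```python
import numpy as np

# Toy variance-based logic (VBL) link over an AWGN channel: the bit is encoded
# in the per-dimension variance of an n-dimensional zero-mean signal.
rng = np.random.default_rng(1)
n = 256                      # signal dimension (assumed)
sigma0, sigma1 = 1.0, 1.5    # per-dimension std for bit 0 / bit 1 (assumed)
noise_std = 0.5              # AWGN channel noise std (assumed)

bits = rng.integers(0, 2, 10_000)
errors = 0
# Received variance is signal variance plus channel noise variance, so the
# decision threshold sits midway between the two hypotheses, noise included.
threshold = (sigma0**2 + sigma1**2) / 2 + noise_std**2
for b in bits:
    x = rng.normal(0, sigma1 if b else sigma0, n)   # transmit: zero mean, variance carries the bit
    y = x + rng.normal(0, noise_std, n)             # AWGN channel
    b_hat = int(np.var(y) > threshold)              # receiver: sample-variance detector
    errors += (b_hat != b)

ber = errors / bits.size
print(f"BER ≈ {ber:.4f}")
```

Because the sample variance of an n-dimensional block concentrates around its true value (its relative spread shrinks like 1/√n), the BER drops rapidly as the dimension grows, which is the high-dimensional effect the abstract appeals to.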



2018 ◽  
Vol 7 (4) ◽  
pp. 605-655 ◽  
Author(s):  
Shirin Jalali ◽  
Arian Maleki

Consider the problem of estimating parameters $X^n \in \mathbb{R}^n$ from $m$ response variables $Y^m = AX^n + Z^m$, under the assumption that the distribution of $X^n$ is known. The lack of computationally feasible algorithms that employ generic prior distributions and provide a good estimate of $X^n$ has limited the set of distributions researchers use to model the data. To address this challenge, this article proposes a new estimation scheme named quantized maximum a posteriori (Q-MAP). The new method has the following properties: (i) in the noiseless setting, it has similarities to maximum a posteriori (MAP) estimation; (ii) in the noiseless setting, when $X_1,\ldots,X_n$ are independent and identically distributed, its required sampling rate ($m/n$) for almost zero-distortion recovery asymptotically approaches the fundamental limit as $n$ grows to infinity; (iii) it scales favorably with the dimensions of the problem and is therefore applicable to high-dimensional setups; (iv) the solution of the Q-MAP optimization can be found via a proposed iterative algorithm that is provably robust to error (noise) in the response variables.
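The measurement model $Y^m = AX^n + Z^m$ is easy to instantiate. Q-MAP itself is not reproduced below; as a much simpler baseline, when the known prior is Gaussian ($X \sim \mathcal{N}(0, \sigma_x^2 I)$, $Z \sim \mathcal{N}(0, \sigma_z^2 I)$), the MAP estimate has the closed form of ridge regression. All dimensions and noise levels are assumed for illustration:

```python
import numpy as np

# Linear measurement model from the abstract: Y = A X + Z, distribution of X known.
# Baseline sketch only: for a Gaussian prior, MAP estimation reduces to ridge regression.
rng = np.random.default_rng(2)
n, m = 200, 120              # ambient dimension and number of responses (m/n < 1, assumed)
s_x, s_z = 1.0, 0.1          # prior std of X and noise std of Z (assumed)

A = rng.normal(0, 1 / np.sqrt(m), (m, n))
x = rng.normal(0, s_x, n)
y = A @ x + rng.normal(0, s_z, m)

lam = (s_z / s_x) ** 2                                        # ridge weight from the prior
x_map = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)   # Gaussian-prior MAP estimate

rel_err = np.linalg.norm(x_map - x) / np.linalg.norm(x)
print(f"relative error ≈ {rel_err:.3f}")
```

With $m < n$ a Gaussian prior cannot achieve the near zero-distortion recovery the abstract describes (the estimate loses the component of $X$ in the null space of $A$); the point of Q-MAP is precisely to exploit more structured priors so that sampling rates near the fundamental limit suffice.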



Information ◽  
2018 ◽  
Vol 9 (11) ◽  
pp. 266 ◽  
Author(s):  
Phillip Santos ◽  
Pedro Ruas ◽  
Julio Neves ◽  
Paula Silva ◽  
Sérgio Dias ◽  
...  

Formal concept analysis (FCA) is widely applied in different areas. However, in some FCA applications the volume of information that needs to be processed can become infeasible to handle, so the demand for new approaches and algorithms that enable processing large amounts of information is increasing substantially. This article presents a new algorithm for extracting proper implications from high-dimensional contexts. The proposed algorithm, called ImplicPBDD, is based on the PropIm algorithm and uses a data structure called a binary decision diagram (BDD) to simplify the representation of the formal context and enhance the extraction of proper implications. To analyze the performance of the ImplicPBDD algorithm, we performed tests using synthetic contexts, varying the number of objects, attributes and context density. The experiments show that ImplicPBDD performs better (up to 80% faster) than the original algorithm, regardless of the number of attributes, objects and densities.
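The underlying notion of an implication over a formal context can be shown on a toy example. The context below is invented, and the naive enumeration does not use BDDs or filter premises down to the minimal ones that make an implication "proper", so it is only a sketch of what ImplicPBDD extracts far more efficiently:

```python
from itertools import combinations

# Tiny hypothetical formal context: objects mapped to their attribute sets.
context = {
    "duck":   {"flies", "swims", "lays_eggs"},
    "eagle":  {"flies", "lays_eggs"},
    "salmon": {"swims", "lays_eggs"},
    "dog":    set(),
}
attributes = {"flies", "swims", "lays_eggs"}

def holds(premise, conclusion):
    """Implication premise -> {conclusion} holds iff every object possessing
    all premise attributes also possesses the conclusion attribute."""
    return all(conclusion in attrs
               for attrs in context.values()
               if premise <= attrs)

# Naive enumeration: single-attribute conclusions, premises of size 1 or 2.
implications = []
for size in range(1, 3):
    for premise in combinations(sorted(attributes), size):
        p = set(premise)
        for b in attributes - p:
            if holds(p, b):
                implications.append((premise, b))

for premise, b in implications:
    print(set(premise), "->", b)
```

Even this brute-force check is exponential in the number of attributes, which is exactly why compact context representations such as BDDs matter for high-dimensional contexts.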



2016 ◽  
Vol 12 (7) ◽  
pp. 625-635 ◽  
Author(s):  
Piraya Kaewsuwan ◽  
Chumpol Yuangyai ◽  
Chen-Yang Cheng ◽  
Udom Janjarassuk

Sausage color usually influences consumers' selection due to perceptions of quality. Extensive studies have applied image processing to capture the characteristics of food products, given the high-dimensional nature of the resulting images. However, color homogeneity (i.e., "within-pack" variation) and uniformity (i.e., "between-pack" variation) have rarely been studied. Therefore, this paper proposes a new framework to detect both types of variation using images. In addition, a new approach has been developed to deal with high-dimensional data involving the colorimetric characteristics L*, a*, b*, hue (h) and chroma (C*). These high-dimensional data are transformed to represent color homogeneity and uniformity, and a Hotelling T² chart is used to detect color abnormalities. Our approach shows that out-of-control items can be identified from the control-chart signals. Nonetheless, the out-of-control signals alone are inadequate for determining the possible causes, so the proposed analysis framework was subsequently applied to identify possible causes of the process deviations. Furthermore, prior to the experiments with sausages, the image inspection device was tested for gauge repeatability and reproducibility.
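The core computation of a Hotelling T² chart is a Mahalanobis-type distance of a multivariate observation from an in-control reference. The sketch below uses only the (L*, a*, b*) channels and invented numbers, not the paper's transformed homogeneity/uniformity variables or its full framework:

```python
import numpy as np

# Hotelling T^2 statistic for multivariate colour measurements (L*, a*, b*).
rng = np.random.default_rng(3)

# Phase I: in-control reference sample of colour measurements (hypothetical values).
ref = rng.normal([55.0, 18.0, 12.0], [1.0, 0.8, 0.6], (100, 3))
mean = ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def t_squared(x):
    """T^2 = (x - mean)' S^{-1} (x - mean), the charted statistic."""
    d = x - mean
    return float(d @ cov_inv @ d)

in_control  = np.array([55.3, 17.8, 12.1])   # typical pack
out_of_ctrl = np.array([58.5, 21.5, 10.0])   # shifted colour, e.g. a discoloured pack

print(t_squared(in_control), t_squared(out_of_ctrl))
```

An observation is flagged when its T² exceeds a control limit (often taken from a chi-squared or F quantile); as the abstract notes, the signal alone localizes *that* something shifted, not *why*, which is what the surrounding analysis framework addresses.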


