Compton scattering tomography for agricultural measurements

2006 ◽  
Vol 26 (1) ◽  
pp. 151-160 ◽  
Author(s):  
Paulo E. Cruvinel ◽  
Fatai A. Balogun

This paper presents a new approach to tomographic instrumentation for agriculture based on Compton scattering, which allows simultaneous measurement of the density and moisture of soil samples. Compton tomography is a technique that can be used to obtain a spatial map of the electronic density of samples. Quantitative results can be obtained by using a reconstruction algorithm that takes into account the absorption of both the incident and the scattered radiation. Results show a linear correlation coefficient better than 0.81 when soil density measurements based on this method are compared with direct transmission tomography. For soil water content, a linear correlation coefficient better than 0.79 was found against measurements obtained by time domain reflectometry (TDR). In addition, a set of Compton scatter images is presented to illustrate the efficacy of this imaging technique, which makes improved spatial variability analysis of pre-established planes possible.
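The validation reported here is a paired comparison between modalities. As a rough illustration only, a minimal sketch of the linear (Pearson) correlation computation, with hypothetical paired soil-density readings standing in for the paper's data:

```python
import numpy as np

def linear_correlation(compton_values, reference_values):
    """Pearson linear correlation between two measurement series,
    e.g. Compton-scatter densities vs. transmission-tomography densities."""
    x = np.asarray(compton_values, dtype=float)
    y = np.asarray(reference_values, dtype=float)
    return np.corrcoef(x, y)[0, 1]

# Hypothetical paired soil-density readings (g/cm^3) from the two modalities.
compton = [1.21, 1.35, 1.48, 1.52, 1.60]
transmission = [1.19, 1.38, 1.45, 1.55, 1.63]
print(f"r = {linear_correlation(compton, transmission):.2f}")
```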

1989 ◽  
Vol 54 (1) ◽  
pp. 117-135
Author(s):  
Oldřich Pytela ◽  
Vítězslav Zima

The method of conjugate deviations, based on regression analysis, has been suggested for the construction of a new nucleophilicity scale. The method has been applied to a set of 28 nucleophiles participating in 47 physical and chemical processes described in the literature. The resulting two-parameter nucleophilicity scale represents, in the parameter denoted ND, the general tendency to form a bond to an electrophile, predominantly on the basis of orbital interaction, and, in the parameter denoted PD, the ability to interact with a centre similar to the proton (basicity). A linear correlation equation involving the ND and PD parameters and the charge proves distinctly better than the most significant relations in use. The correlation dependences have a clear physico-chemical meaning. From the positions of individual nucleophiles in the space of the ND and PD parameters, some general conclusions have been derived about the factors governing the reactivity of nucleophiles.
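The correlation equation is described as linear in ND, PD and the charge. A minimal least-squares sketch of an equation of that general form; the functional form with an intercept is assumed, and all parameter values and reactivities below are hypothetical, not the paper's scale:

```python
import numpy as np

# Hypothetical design matrix: one row per nucleophile/process observation.
# Columns: ND (orbital-interaction term), PD (proton-like basicity term),
# charge, and an intercept. Values are illustrative only.
ND     = np.array([2.1, 3.4, 1.8, 4.0, 2.9])
PD     = np.array([0.5, 1.2, 0.9, 1.5, 0.7])
charge = np.array([0.0, -1.0, 0.0, -1.0, 0.0])
logk   = np.array([1.1, 3.0, 1.4, 3.8, 1.9])   # observed reactivities

X = np.column_stack([ND, PD, charge, np.ones_like(ND)])
coef, *_ = np.linalg.lstsq(X, logk, rcond=None)
pred = X @ coef
r = np.corrcoef(pred, logk)[0, 1]
print(f"coefficients = {coef.round(3)}, r = {r:.3f}")
```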


2020 ◽  
pp. 1-16
Author(s):  
Meriem Khelifa ◽  
Dalila Boughaci ◽  
Esma Aïmeur

The Traveling Tournament Problem (TTP) is concerned with finding a double round-robin tournament schedule that minimizes the total distance traveled by the teams. It has attracted significant interest recently, since a favorable TTP schedule can result in considerable savings for the league. This paper proposes an original evolutionary algorithm for TTP. We first propose a quick and effective constructive algorithm to build a Double Round-Robin Tournament (DRRT) schedule with low travel cost. We then describe an enhanced genetic algorithm with a new crossover operator to improve the travel cost of the generated schedules. A new heuristic for efficiently ordering the scheduled rounds is also proposed, and it leads to a significant improvement in schedule quality. The overall method is evaluated on publicly available standard benchmarks and compared with other techniques for TTP and the Unconstrained Traveling Tournament Problem (UTTP). The computational experiments show that the proposed approach builds solutions comparable to other state-of-the-art approaches, and better than the current best solutions on UTTP. Further, our method provides new valuable solutions to some unsolved UTTP instances and outperforms prior methods on all US National League (NL) instances.
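The paper's constructive algorithm is designed to keep travel cost low; its details are not given in the abstract. As a baseline illustration of what a DRRT constructor produces, a sketch using the standard circle method, which guarantees the double round-robin structure but makes no attempt to minimize travel:

```python
def double_round_robin(teams):
    """Build a double round-robin schedule with the circle method:
    every pair meets twice, once at each team's home venue.
    Returns a list of rounds; each round is a list of (home, away) games."""
    ts = list(teams)
    if len(ts) % 2:                       # odd team count needs a bye
        ts.append(None)
    n = len(ts)
    first_leg = []
    for r in range(n - 1):
        pairs = []
        for i in range(n // 2):
            a, b = ts[i], ts[n - 1 - i]
            if a is not None and b is not None:
                # alternate home advantage between rounds
                pairs.append((a, b) if r % 2 == 0 else (b, a))
        first_leg.append(pairs)
        ts.insert(1, ts.pop())            # rotate all teams but the first
    # second leg mirrors the first with venues swapped
    second_leg = [[(away, home) for home, away in rnd] for rnd in first_leg]
    return first_leg + second_leg

for rnd, games in enumerate(double_round_robin(["A", "B", "C", "D"]), 1):
    print(rnd, games)
```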


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
J. Buijs ◽  
J. van der Gucht ◽  
J. Sprakel

Laser speckle imaging is a powerful imaging technique that visualizes microscopic motion within turbid materials. Currently, two methods are widely used to analyze speckle data: one is fast but qualitative, the other quantitative but computationally expensive. We have developed a new processing algorithm based on the fast Fourier transform, which converts raw speckle patterns into maps of microscopic motion and is both fast and quantitative, providing a dynamic spectrum of the material over a frequency range spanning several decades. In this article we show how to apply this algorithm and how to measure a diffusion coefficient with it. We show that this method is quantitative and several orders of magnitude faster than the existing quantitative method. Finally, we harness the potential of this new approach by constructing a portable laser speckle imaging setup that performs quantitative data processing in real time on a tablet.
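The authors' algorithm itself is not reproduced in the abstract. A minimal sketch of the general idea, computing a per-pixel temporal power spectrum of a speckle movie with the FFT; the frame count, frame rate, and data below are hypothetical:

```python
import numpy as np

def speckle_power_spectrum(frames, fps):
    """Per-pixel temporal power spectrum of a speckle movie via the FFT.
    frames: array of shape (n_frames, height, width); fps: frame rate in Hz.
    Returns (frequencies, spectrum) where spectrum has shape (n_freq, h, w)."""
    stack = np.asarray(frames, dtype=float)
    stack = stack - stack.mean(axis=0)            # remove the static component
    spec = np.abs(np.fft.rfft(stack, axis=0)) ** 2
    freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fps)
    return freqs, spec

# Hypothetical 256-frame, 64x64 speckle movie recorded at 100 Hz.
rng = np.random.default_rng(0)
movie = rng.random((256, 64, 64))
freqs, spec = speckle_power_spectrum(movie, fps=100.0)
print(freqs.shape, spec.shape)   # (129,), (129, 64, 64)
```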


2013 ◽  
Vol 2013 ◽  
pp. 1-19
Author(s):  
Wai-Yuan Tan ◽  
Hong Zhou

To incorporate biologically observed epidemics into multistage models of carcinogenesis, in this paper we develop new stochastic models for human cancers. We further incorporate genetic segregation of cancer genes into these models to derive generalized mixture models for cancer incidence. Based on these models, we develop a generalized Bayesian approach to estimate the parameters and to predict cancer incidence via Gibbs sampling procedures. We apply these models to fit and analyze the SEER data on human eye cancers from NCI/NIH. Our results indicate that the models not only provide a logical avenue for incorporating biological information but also fit the data much better than other models. These models should provide more insight into human cancers as well as useful guidance for their prevention and control and for the prediction of future cancer cases.
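The estimation machinery here is Gibbs sampling. As a generic illustration of the alternating conditional draws involved (not the authors' cancer-incidence model), a minimal sampler for a normal model with unknown mean and precision:

```python
import numpy as np

def gibbs_normal(data, n_iter=5000, seed=0):
    """Minimal Gibbs sampler for a normal model with unknown mean mu and
    precision tau under a Jeffreys-style prior; it illustrates the
    alternating conditional draws that Gibbs sampling uses in general."""
    rng = np.random.default_rng(seed)
    y = np.asarray(data, dtype=float)
    n, ybar = len(y), y.mean()
    mu, tau = ybar, 1.0
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # mu | tau, y  ~  Normal(ybar, 1/(n*tau))
        mu = rng.normal(ybar, 1.0 / np.sqrt(n * tau))
        # tau | mu, y  ~  Gamma(shape n/2, scale 2/sum((y-mu)^2))
        tau = rng.gamma(n / 2.0, 2.0 / np.sum((y - mu) ** 2))
        draws[t] = mu, tau
    return draws

samples = gibbs_normal(np.random.default_rng(1).normal(5.0, 2.0, size=100))
print(samples[1000:].mean(axis=0))   # posterior means of (mu, tau) after burn-in
```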


2011 ◽  
Vol 90-93 ◽  
pp. 2858-2863
Author(s):  
Wei Li ◽  
Xu Wang

The soft and hard threshold functions both have shortcomings that reduce wavelet de-noising performance. To solve this problem, this article proposes a modulus-square approach. The new approach avoids the discontinuity of the hard threshold function and also reduces the fixed bias between the estimated wavelet coefficients and the true wavelet coefficients that the soft-threshold method introduces. Simulation results show better SNR and MSE than the soft and hard thresholds alone, giving a good de-noising effect in deformation monitoring.
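For concreteness, a sketch of the three thresholding rules side by side. The modulus-square function shown assumes one common form, sign(w)*sqrt(w^2 - t^2) above the threshold; the paper's exact function may differ:

```python
import numpy as np

def hard_threshold(w, t):
    """Hard threshold: keep coefficients above t, zero the rest
    (discontinuous at |w| = t)."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Soft threshold: shrink every surviving coefficient by t
    (introduces a fixed bias)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def modulus_square_threshold(w, t):
    """An assumed modulus-square form: continuous at |w| = t like the soft
    threshold, but approaching the identity for large |w| like the hard one,
    so the fixed bias shrinks as coefficients grow."""
    return np.sign(w) * np.sqrt(np.maximum(w ** 2 - t ** 2, 0.0))

w = np.linspace(-3, 3, 7)
print(hard_threshold(w, 1.0))
print(soft_threshold(w, 1.0))
print(modulus_square_threshold(w, 1.0))
```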


2014 ◽  
Vol 1 (4) ◽  
pp. 34-50
Author(s):  
Roee Anuar ◽  
Yossi Bukchin ◽  
Oded Maimon ◽  
Lior Rokach

The task of recommender system evaluation has often been addressed in the literature; however, there is no consensus regarding the best metrics for assessing performance. This research deals with collaborative filtering recommender systems and proposes a new approach for evaluating the quality of neighbor selection. It theorizes that good recommendations emerge from a good selection of neighbors; hence, measuring the quality of the neighborhood may be used to predict recommendation success. Since user neighborhoods in recommender systems are often sparse and differ in their rating range, this paper designs a novel measure to assess neighborhood quality. It first builds the realization-based entropy (RBE), which presents the classical entropy measure from a different angle. It then modifies the RBE to propose the realization-based distance entropy (RBDE), which also handles continuous data. Using the RBDE, it finally develops the consent entropy, which takes into account the absence of rating data. The paper compares the proposed approach with common approaches from the literature, using several recommendation evaluation metrics, and presents offline experiments using the Netflix database. The experimental results confirm that consent entropy performs better than commonly used metrics, particularly with high-sparsity neighborhoods. This research was supported by the Israel Science Foundation, Grant #1362/10, and by NHECD EC, Grant #218639.
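The RBE, RBDE and consent entropy definitions are not given in the abstract. As a baseline only, a sketch of the classical Shannon entropy of a neighborhood's ratings, the measure these refinements start from:

```python
import math
from collections import Counter

def shannon_entropy(ratings):
    """Classical Shannon entropy of a neighborhood's discrete ratings for one
    item; lower entropy means stronger consensus among neighbors. The paper's
    RBE/RBDE/consent entropy refine this baseline (continuous data, missing
    ratings); those refinements are not reproduced here."""
    counts = Counter(ratings)
    n = len(ratings)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical neighbor ratings (1-5 stars) for the same item.
print(shannon_entropy([4, 4, 5, 4, 4]))   # near-consensus -> low entropy
print(shannon_entropy([1, 3, 5, 2, 4]))   # scattered      -> high entropy
```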


2014 ◽  
Vol 26 (2) ◽  
pp. 24003
Author(s):  
古宇飞 Gu Yufei ◽  
闫镔 Yan Bin ◽  
王彪 Wang Biao ◽  
李磊 Li Lei ◽  
韩玉 Han Yu

Biometrika ◽  
2019 ◽  
Vol 106 (4) ◽  
pp. 981-988
Author(s):  
Y Cheng ◽  
Y Zhao

Empirical likelihood is a very powerful nonparametric tool that does not require any distributional assumptions. Lazar (2003) showed that in Bayesian inference, if one replaces the usual likelihood with the empirical likelihood, then posterior inference is still valid when the functional of interest is a smooth function of the posterior mean. However, it is not clear whether similar conclusions can be obtained for parameters defined in terms of $U$-statistics. We propose the so-called Bayesian jackknife empirical likelihood, which replaces the likelihood component with the jackknife empirical likelihood. We show, both theoretically and empirically, the validity of the proposed method as a general tool for Bayesian inference. Empirical analysis shows that the small-sample performance of the proposed method is better than its frequentist counterpart. Analysis of a case-control study for pancreatic cancer is used to illustrate the new approach.
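The jackknife empirical likelihood starts from jackknife pseudo-values of a $U$-statistic, which it treats as approximately independent observations. A minimal sketch of that first step for a degree-2 $U$-statistic; the subsequent empirical-likelihood and posterior computations are not reproduced:

```python
import numpy as np
from itertools import combinations

def u_statistic(x, kernel):
    """Degree-2 U-statistic: average of the kernel over all unordered pairs."""
    return float(np.mean([kernel(a, b) for a, b in combinations(x, 2)]))

def jackknife_pseudo_values(x, kernel):
    """Jackknife pseudo-values V_i = n*U_n - (n-1)*U_{n-1}^{(-i)}, the
    ingredients the jackknife empirical likelihood treats as approximately
    i.i.d.; their mean equals U_n exactly."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    u_full = u_statistic(x, kernel)
    return np.array([
        n * u_full - (n - 1) * u_statistic(np.delete(x, i), kernel)
        for i in range(n)
    ])

# Example: Gini mean difference, whose kernel is |a - b|.
x = np.random.default_rng(0).normal(size=30)
v = jackknife_pseudo_values(x, kernel=lambda a, b: abs(a - b))
print(v.mean(), u_statistic(x, lambda a, b: abs(a - b)))  # the two agree
```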


2019 ◽  
Vol 9 (4) ◽  
pp. 13-22
Author(s):  
Fatima Ardjani ◽  
Djelloul Bouchiha

The ontology alignment process aims at generating a set of correspondences between the entities of two ontologies. It is an important task, notably in semantic web research, because it allows the joint use of resources defined in different ontologies. In this article, the authors develop an ontology alignment system called ABCMap+. It uses an optimization method based on artificial bee colonies (ABC) to solve the problem of optimally aggregating the similarity measures of three matchers (syntactic, linguistic, and structural) into a single similarity measure. To evaluate the ABCMap+ ontology alignment system, the authors considered the OAEI 2012 alignment evaluation campaign. Experiments were carried out to obtain the best ABCMap+ alignment, and a comparative study showed that ABCMap+ performs better than the OAEI 2012 participants in terms of recall and precision.
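Neither the matchers nor the exact ABC parametrization are given in the abstract. As a heavily simplified illustration, a toy artificial bee colony searching over the three aggregation weights, with hypothetical similarity matrices and a toy reference alignment in place of real ontologies:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical similarity matrices from three matchers (syntactic,
# linguistic, structural) over 5 source x 5 target entities, with a toy
# reference alignment matching entity i to entity i. Illustrative only.
syn, lng, struct = (rng.random((5, 5)) for _ in range(3))
for m in (syn, lng, struct):
    m[np.arange(5), np.arange(5)] += 0.5        # bias the true matches

def fitness(w):
    """Fraction of source entities whose best aggregated match is correct."""
    w = np.abs(w) / np.abs(w).sum()             # project weights onto the simplex
    agg = w[0] * syn + w[1] * lng + w[2] * struct
    return float(np.mean(agg.argmax(axis=1) == np.arange(5)))

# Minimal artificial bee colony over the three aggregation weights.
n_bees, limit, iters = 10, 5, 100
food = rng.random((n_bees, 3))
fit = np.array([fitness(f) for f in food])
trials = np.zeros(n_bees, dtype=int)
for _ in range(iters):
    for phase in ("employed", "onlooker"):
        if phase == "employed":
            idx = np.arange(n_bees)
        else:                                   # onlookers favour good sources
            p = (fit + 1e-9) / (fit + 1e-9).sum()
            idx = rng.choice(n_bees, size=n_bees, p=p)
        for i in idx:
            k = (i + rng.integers(1, n_bees)) % n_bees   # a different source
            cand = food[i].copy()
            d = rng.integers(3)
            cand[d] += rng.uniform(-1, 1) * (food[i][d] - food[k][d])
            f = fitness(cand)
            if f > fit[i]:
                food[i], fit[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
    for i in np.where(trials > limit)[0]:       # scouts abandon stale sources
        food[i], trials[i] = rng.random(3), 0
        fit[i] = fitness(food[i])

best = np.abs(food[fit.argmax()]) / np.abs(food[fit.argmax()]).sum()
print("best weights:", best.round(3), "fitness:", fit.max())
```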


2012 ◽  
Vol 32 (1) ◽  
pp. 18-31 ◽  
Author(s):  
Francesc Sidera ◽  
Anna Amadó ◽  
Elisabet Serrat

This paper studies children’s capacity to understand that the emotions displayed in pretend-play contexts do not necessarily correspond to internal emotions, and that pretend emotions may create false beliefs in an observer. A new approach is taken by asking children about pretend emotions in terms of pretence versus reality instead of appearance versus reality. A total of 37 four-year-olds and 33 six-year-olds participated in tasks in which they had to pretend an emotion or were told stories in which the protagonists pretended an emotion. In each task, children were asked (a) whether the pretend emotion was real or just pretended, and (b) whether an observer would think that the emotional expression was real or just pretended. Results showed that four-year-olds are capable of understanding that pretend emotions are not necessarily real. Overall, six-year-olds performed better than the younger children. Furthermore, both age groups had difficulty understanding that pretend emotions might unintentionally mislead an observer. The results are discussed in relation to previous research on children’s ability to understand pretend play and the emotional appearance-reality distinction.

