Permissible Region Extraction Strategies for XLCT: A Comparative Study

2021 ◽  
Vol 2112 (1) ◽  
pp. 012001
Author(s):  
Xiaohang Liu ◽  
Sihao Ma ◽  
Sheng Zhong ◽  
Aocheng Su ◽  
Zhiwei Huang ◽  
...  

Abstract The permissible region (PR) strategy has been used successfully to alleviate the ill-posedness of the X-ray luminescence computed tomography (XLCT) reconstruction problem. Previous research on the strategy shows that it can solve the reconstruction problem efficiently. This paper studies the performance of four types of permissible region extraction strategies: manual extraction, extraction using a priori information on the surface nanophosphor distribution, extraction based on a first-pass reconstruction result, and precise extraction. In addition, some heuristic conclusions are provided for future study. The fast iterative shrinkage-thresholding algorithm (FISTA) is used for reconstruction. Numerical simulation experiments and physical phantom experiments are set up to evaluate and illustrate the performance of the four strategies.
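The abstract names FISTA as the reconstruction solver but gives no equations. As a minimal illustration of the algorithm itself, the sketch below applies FISTA to a generic L1-regularized least-squares problem min ½‖Ax − b‖² + λ‖x‖₁; the matrices, the weight `lam`, and the Lipschitz constant `L` are illustrative choices, not values from the paper.

```python
import math

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def matTvec(A, r):
    # A^T r for A stored row-wise
    return [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(len(A[0]))]

def soft(v, t):
    # soft-thresholding (proximal operator of t * ||.||_1)
    return [math.copysign(max(abs(vi) - t, 0.0), vi) for vi in v]

def fista(A, b, lam, L, iters=200):
    # minimize 0.5*||Ax - b||^2 + lam*||x||_1; L >= largest eigenvalue of A^T A
    n = len(A[0])
    x = [0.0] * n
    y = list(x)
    t = 1.0
    for _ in range(iters):
        r = [ai - bi for ai, bi in zip(matvec(A, y), b)]
        grad = matTvec(A, r)
        x_new = soft([yi - gi / L for yi, gi in zip(y, grad)], lam / L)
        t_new = (1.0 + math.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # momentum extrapolation step that distinguishes FISTA from plain ISTA
        y = [xn + ((t - 1.0) / t_new) * (xn - xo) for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
    return x
```

With `A` the identity and `L = 1`, the solver converges to the soft-thresholded data, which makes the shrinkage behavior easy to verify by hand.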

2012 ◽  
Vol 5 (1) ◽  
pp. 1293-1315
Author(s):  
M. Reuter ◽  
M. Buchwitz ◽  
O. Schneising ◽  
J. Heymann ◽  
S. Guerlet ◽  
...  

Abstract. A simple empirical CO2 model (SECM) is presented to estimate column-average dry-air mole fractions of atmospheric CO2 (XCO2) as well as mixing ratio profiles. SECM is based on a simple equation depending on 17 empirical parameters, latitude, and date. The empirical parameters have been determined by least-squares fitting to NOAA's (National Oceanic and Atmospheric Administration) assimilation system CarbonTracker version 2010 (CT2010). Comparisons with TCCON (Total Carbon Column Observing Network) FTS (Fourier transform spectrometer) measurements show that SECM XCO2 agrees quite well with reality. The synthetic XCO2 values have a standard error of 1.39 ppm and systematic station-to-station biases of 0.46 ppm. Typical column averaging kernels of the TCCON FTS, of a SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY) algorithm, and of two GOSAT (Greenhouse gases Observing SATellite) XCO2 retrieval algorithms have been used to assess the smoothing error introduced by using SECM profiles instead of CT2010 profiles as a priori. The additional smoothing error amounts to 0.17 ppm for a typical SCIAMACHY averaging kernel and is in most cases much smaller for the other instruments (e.g. 0.05 ppm for a typical TCCON FTS averaging kernel). Therefore, SECM is well suited to provide a priori information for state-of-the-art ground-based (FTS) and satellite-based (GOSAT, SCIAMACHY) XCO2 retrievals. Other potential applications are: (i) quick checks for obvious retrieval errors (by monitoring the difference to SECM), (ii) near-real-time processing systems (which cannot make use of models like CT2010 operated in delayed mode), (iii) "CO2 proxy" methods for XCH4 retrievals (as a correction for the XCO2 background), and (iv) observing system simulation experiments, especially for future satellite missions.
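The abstract describes SECM only as "a simple equation depending on 17 empirical parameters, latitude, and date" fitted by least squares, without reproducing the equation. As a hedged illustration of that fitting approach, the sketch below fits a much smaller hypothetical model (a linear trend plus a latitude-scaled seasonal cycle, four parameters rather than 17) by ordinary least squares via the normal equations; `design_row` and its basis functions are assumptions for illustration, not the actual SECM equation.

```python
import math

def design_row(t_years, day_of_year, lat_deg):
    # illustrative basis: offset, linear trend, and a seasonal cycle
    # whose amplitude scales with sin(latitude) -- NOT the real SECM basis
    w = 2.0 * math.pi * day_of_year / 365.25
    s = math.sin(math.radians(lat_deg))
    return [1.0, t_years, s * math.cos(w), s * math.sin(w)]

def lstsq(X, y):
    # solve the normal equations X^T X p = X^T y by Gaussian elimination
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))  # partial pivoting
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    p = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        p[i] = (b[i] - sum(A[i][j] * p[j] for j in range(i + 1, n))) / A[i][i]
    return p
```

Fitting this model to noise-free synthetic data generated from known coefficients recovers those coefficients, which is a convenient sanity check on the normal-equations solver.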


2012 ◽  
Vol 5 (6) ◽  
pp. 1349-1357 ◽  
Author(s):  
M. Reuter ◽  
M. Buchwitz ◽  
O. Schneising ◽  
F. Hase ◽  
J. Heymann ◽  
...  

Abstract. A simple empirical CO2 model (SECM) is presented to estimate column-average dry-air mole fractions of atmospheric CO2 (XCO2) as well as mixing ratio profiles. SECM is based on a simple equation depending on 17 empirical parameters, latitude, and date. The empirical parameters have been determined by least-squares fitting to NOAA's (National Oceanic and Atmospheric Administration) assimilation system CarbonTracker version 2010 (CT2010). Comparisons with TCCON (Total Carbon Column Observing Network) FTS (Fourier transform spectrometer) measurements show that SECM XCO2 agrees quite well with reality. The synthetic XCO2 values have a standard error of 1.39 ppm and systematic station-to-station biases of 0.46 ppm. Typical column averaging kernels of the TCCON FTS, of a SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY) algorithm, and of two GOSAT (Greenhouse gases Observing SATellite) XCO2 retrieval algorithms have been used to assess the smoothing error introduced by using SECM profiles instead of CT2010 profiles as a priori. The additional smoothing error amounts to 0.17 ppm for a typical SCIAMACHY averaging kernel and is in most cases much smaller for the other instruments (e.g. 0.05 ppm for a typical TCCON FTS averaging kernel). Therefore, SECM is well suited to provide a priori information for state-of-the-art ground-based (FTS) and satellite-based (GOSAT, SCIAMACHY) XCO2 retrievals. Other potential applications are: (i) near-real-time processing systems (which cannot make use of models like CT2010 operated in delayed mode), (ii) "CO2 proxy" methods for XCH4 retrievals (as a correction for the XCO2 background), and (iii) observing system simulation experiments, especially for future satellite missions.


2015 ◽  
Author(s):  
Daniel Omar Pérez

One of the central goals of prestack seismic data inversion is to determine contrasts between the physical properties of subsurface rocks from the information contained in the variation, as a function of the angle of incidence, of the amplitudes of seismic waves reflected at geological interfaces. Prestack seismic inversion is an ill-posed and ill-conditioned problem, in the sense that small amounts of noise in the data lead to large instabilities in the estimated solutions. Moreover, because the observed data are noisy, incomplete, and band-limited, the problem of non-uniqueness of the solutions also arises. These problems call for regularization and constraints in order to stabilize the inversion process while promoting solutions with some desired characteristic. Sparse solutions are desirable because they yield well-defined reflectors, thereby overcoming the low resolution observed in solutions obtained with conventional inversion methods. In this thesis we present three new strategies based on different regularizations that stabilize the inversion problem and promote sparse solutions from prestack seismic data. The first strategy estimates sub-optimal solutions of the L<sub>0</sub>-regularized inversion problem using the global optimization algorithm Very Fast Simulated Annealing (VFSA). The second strategy consists of two stages: the L<sub>1</sub>-regularized inversion problem is first solved with an efficient optimization algorithm known as the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA), and the estimated amplitudes are then corrected in a least-squares step. These first two strategies successfully estimate sparse solutions using the two-term Shuey approximation, a model that describes the variation of the seismic reflection coefficients with the angle of incidence. The third strategy uses the L<sub>1,2</sub> norm as regularization, which allows a priori information to be incorporated through covariance or scale matrices. In this case, sparse solutions are estimated for the parameters of the three-term Aki & Richards approximation and, if the available a priori information is adequate, a blocky estimate of the subsurface elastic parameters can also be obtained.
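The second strategy solves the L<sub>1</sub>-regularized problem with FISTA and then corrects the estimated amplitudes by least squares, since L<sub>1</sub> shrinkage biases amplitudes toward zero. Below is a minimal sketch of that corrective (debiasing) step only, assuming a sparse estimate `x_sparse` is already available from the first stage: it keeps the support found by the sparse solver and re-solves the least-squares problem restricted to those columns. The matrices and tolerance are illustrative, not from the thesis.

```python
def lstsq_debias(A, b, x_sparse, tol=1e-8):
    # support identified by the sparse first stage
    S = [j for j, xj in enumerate(x_sparse) if abs(xj) > tol]
    # restrict the columns of A to the support
    As = [[row[j] for j in S] for row in A]
    m = len(S)
    # normal equations on the reduced problem: (As^T As) z = As^T b
    G = [[sum(r[i] * r[j] for r in As) for j in range(m)] for i in range(m)]
    c = [sum(r[i] * bi for r, bi in zip(As, b)) for i in range(m)]
    for k in range(m):  # Gaussian elimination with partial pivoting
        p = max(range(k, m), key=lambda r: abs(G[r][k]))
        G[k], G[p] = G[p], G[k]
        c[k], c[p] = c[p], c[k]
        for r in range(k + 1, m):
            f = G[r][k] / G[k][k]
            for cc in range(k, m):
                G[r][cc] -= f * G[k][cc]
            c[r] -= f * c[k]
    z = [0.0] * m
    for i in reversed(range(m)):
        z[i] = (c[i] - sum(G[i][j] * z[j] for j in range(i + 1, m))) / G[i][i]
    # scatter the corrected amplitudes back into the full-length vector
    x = [0.0] * len(x_sparse)
    for j, zj in zip(S, z):
        x[j] = zj
    return x
```

Given shrunken but correctly located coefficients, this step restores the unbiased least-squares amplitudes on the support while leaving the zeros untouched.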


Author(s):  
Maria A. Milkova

Information now accumulates so rapidly that the usual notion of iterative search needs revision. In a world oversaturated with information, covering and analyzing a problem comprehensively places high demands on search methods. An innovative approach to search should flexibly take into account the large body of already accumulated knowledge as well as a priori requirements on the results. The results, in turn, should immediately provide a roadmap of the field under study, with the possibility of drilling down into as much detail as needed. Search based on topic modeling, so-called topic search, satisfies these requirements and thereby streamlines work with information, increases the efficiency of knowledge production, and helps avoid cognitive biases in the perception of information, which matters at both the micro and the macro level. To demonstrate topic search in practice, the article considers the task of analyzing an import substitution program using patent data. The program includes plans for 22 industries and lists more than 1,500 products and technologies proposed for import substitution. Patent search based on topic modeling makes it possible to query whole blocks of a priori information, namely the terms of the industrial import substitution plans, and to obtain a selection of relevant documents for each industry. This approach provides not only a comprehensive picture of the effectiveness of the program as a whole, but also more detailed information about which groups of products and technologies have actually been patented.
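The article's pipeline, querying patent collections by whole blocks of a priori plan terms, rests on topic modeling that the abstract does not spell out. As a deliberately simplified stand-in, the sketch below ranks documents against term blocks with a plain term-overlap score; a real topic search would replace `score` with similarity in a learned topic space. All names and data here are hypothetical.

```python
import math
from collections import Counter

def score(doc_tokens, block_terms):
    # cosine-like overlap between a document and one block of a priori terms
    tf = Counter(doc_tokens)
    dot = sum(tf[t] for t in block_terms)
    norm = math.sqrt(sum(v * v for v in tf.values())) * math.sqrt(len(block_terms))
    return dot / norm if norm else 0.0

def topic_search(docs, blocks, top_k=3):
    # docs: {doc_id: token list}; blocks: {industry: set of plan terms}
    # returns the top_k most relevant documents per industry block
    out = {}
    for industry, terms in blocks.items():
        ranked = sorted(docs, key=lambda d: score(docs[d], terms), reverse=True)
        out[industry] = ranked[:top_k]
    return out
```

Even this crude scorer reproduces the key workflow property the article highlights: one query per industry block, one ranked selection of documents per industry at the output.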


Author(s):  
Vladimir A. Lapin ◽  
Erken S. Aldakhov ◽  
S. D. Aldakhov ◽  
A. B. Ali

For the first time in Almaty, a complete certification of the city's apartment-building stock was carried out. The structure of the housing stock was identified, with buildings grouped by structural solution and assessed for seismic resistance. Based on the certification results, quantitative estimates of failure probability were obtained for the different building types. Formulas for the quantitative estimation of seismic risk are derived, and the expected number of deaths in the considered earthquake scenarios was estimated. The results of the assessments will be used for practical recommendations to reduce risk and expected losses in possible earthquakes.
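The abstract does not reproduce the risk formulas themselves. A minimal sketch of the kind of expected-loss aggregation it describes, with all structure and numbers hypothetical: for each building type, expected failures are the building count times the failure probability, and monetary losses and fatalities scale from there.

```python
def expected_losses(stock):
    # stock: list of per-building-type tuples
    #   (building_count, failure_probability,
    #    loss_per_failed_building, avg_occupants, fatality_rate)
    total_loss = 0.0
    total_deaths = 0.0
    for count, p_fail, loss, occupants, fatality in stock:
        failed = count * p_fail              # expected number of failed buildings
        total_loss += failed * loss
        total_deaths += failed * occupants * fatality
    return total_loss, total_deaths
```

The per-type breakdown is what certification data like Almaty's enables: each structural group contributes its own failure probability rather than a single citywide average.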


Author(s):  
Jack Corbett ◽  
Wouter Veenendaal

Chapter 1 introduces the main arguments of the book; outlines the approach, method, and data; defines key terms; and provides a chapter outline. Global theories of democratization have systematically excluded small states, which make up roughly 20 per cent of countries. These cases debunk mainstream theories of why democratization succeeds or fails. This book brings small states into the comparative politics fold for the first time. It is organized thematically, with each chapter tackling one of the main theories from the democratization literature. Different types of data are examined—case studies and other documentary evidence, interviews and observation. Following an abductive approach, in addition to examining the veracity of existing theory, each chapter is also used to build an explanation of how democracy is practiced in small states. Specifically, we highlight how small state politics is shaped by personalization and informal politics, rather than formal institutional design.


2021 ◽  
Vol 7 (3) ◽  
pp. 38
Author(s):  
Alexandra Korotaeva ◽  
Danzan Mansorunov ◽  
Natalya Apanovich ◽  
Anna Kuzevanova ◽  
Alexander Karpukhin

Neuroendocrine neoplasms (NENs) are infrequent malignant tumors of neuroendocrine origin that arise in various organs, most frequently the lungs, intestines, stomach, and pancreas. Molecular diagnostics and prognosis of NEN development are highly relevant, and microRNAs (miRNAs) can serve as clinical biomarkers. This work analyzes data on miRNA expression in NENs. For the first time, a search for the specificity, or commonality, of their functional characteristics across different NEN types was carried out, and their properties as biomarkers were analyzed. To date, more than 100 miRNAs have been characterized as differentially expressed and significant for the development of NEN tumors. Only about 10% of the studied miRNAs are expressed in several NEN types; differential expression of the remaining 90% was found only in tumors of specific localizations. A significant number of miRNAs have been identified as potential biomarkers, but only a few have values that characterize their quality as markers. The analysis demonstrates predominantly specific miRNA expression in each studied NEN type, indicating that miRNA functional features are shaped mainly by the tissue in which the tumors arise.

