multicomponent data
Recently Published Documents


TOTAL DOCUMENTS

85
(FIVE YEARS 15)

H-INDEX

13
(FIVE YEARS 2)

BMJ Open ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. e053248
Author(s):  
Sydney Axson ◽  
Michelle M Mello ◽  
Deborah Lincow ◽  
Catherine Yang ◽  
Cary Gross ◽  
...  

Objectives: To examine company characteristics associated with better transparency, and to apply a tool previously used to measure and improve clinical trial transparency among large companies and drugs to smaller companies and biologics.
Design: Cross-sectional descriptive analysis.
Setting and participants: Novel drugs and biologics approved by the Food and Drug Administration (FDA) in 2016 and 2017, and their company sponsors.
Main outcome measures: Using established Good Pharma Scorecard (GPS) measures, companies and products were evaluated on their clinical trial registration, results dissemination and FDA Amendments Act (FDAAA) implementation; companies were ranked using these measures and a multicomponent data sharing measure. Associations between company transparency scores and company size (large vs non-large), location (US vs non-US) and sponsored product type (drug vs biologic) were also examined.
Results: 26% of products (16/62) had publicly available results for all clinical trials supporting their FDA approval, and 67% (39/58) had public results for trials in patients by 6 months after FDA approval; 58% (32/55) were FDAAA compliant. Large companies were significantly more transparent than non-large companies (overall median transparency score of 95% (IQR 91–100) vs 59% (IQR 41–70), p<0.001), attributable to higher FDAAA compliance (median of 100% (IQR 88–100) vs 57% (IQR 0–100), p=0.01) and better data sharing (median of 100% (IQR 80–100) vs 20% (IQR 20–40), p<0.01). No significant differences were observed by company location or product type.
Conclusions: It was feasible to apply the GPS transparency measures and ranking tool to non-large companies and biologics. Large companies are significantly more transparent than non-large companies, driven by better data sharing procedures and implementation of FDAAA trial reporting requirements. Greater research transparency is needed, particularly among non-large companies, to maximise the benefits of research for patient care and scientific innovation.
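The headline percentages in the Results follow directly from the reported counts; a minimal Python check (counts taken from the abstract):

```python
def proportion(numerator, denominator):
    """Return a percentage rounded to the nearest integer."""
    return round(100 * numerator / denominator)

# Counts reported in the abstract.
full_results   = proportion(16, 62)  # public results for all supporting trials
patient_trials = proportion(39, 58)  # public results for trials in patients by 6 months
fdaaa          = proportion(32, 55)  # FDAAA-compliant products

print(full_results, patient_trials, fdaaa)  # 26 67 58
```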


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Xue-Bo Jin ◽  
Jia-Hui Zhang ◽  
Ting-Li Su ◽  
Yu-Ting Bai ◽  
Jian-Lei Kong ◽  
...  

Complex time series data exist widely in real systems, and forecasting them has great practical significance. At the same time, classical linear models cannot achieve satisfactory performance because of the nonlinearity and multicomponent characteristics of such data. Based on a data-driven mechanism, this paper proposes a deep learning method, coupled with Bayesian optimization and built on wavelet decomposition, to model time series data and forecast its trend. First, the data are decomposed by wavelet transform to reduce the complexity of the time series. A Gated Recurrent Unit (GRU) network is then trained as a submodel for each decomposition component. The hyperparameters of the wavelet decomposition and of each submodel are optimized with Bayesian sequential model-based optimization (SMBO) to improve modeling accuracy. Finally, the forecasts of all submodels are summed to obtain the overall forecast. PM2.5 data collected by US air quality monitoring stations are used for the experiments. Comparison with other networks shows that the proposed method performs well in the multistep forecasting task for complex time series.
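The decompose–model–sum pipeline can be sketched in a few lines. A minimal illustration, assuming a one-level Haar split in place of the paper's tuned wavelet decomposition and a naive persistence forecast in place of the trained GRU submodels:

```python
import numpy as np

def haar_split(x):
    """One-level additive Haar decomposition (even-length x): x = approx + detail."""
    m = (x[0::2] + x[1::2]) / 2          # pairwise means (low-pass part)
    d = (x[0::2] - x[1::2]) / 2          # pairwise half-differences (high-pass part)
    approx = np.repeat(m, 2)
    detail = np.empty_like(x, dtype=float)
    detail[0::2] = d
    detail[1::2] = -d
    return approx, detail

def persistence_forecast(component, steps):
    """Stand-in submodel: repeat the last observed value."""
    return np.full(steps, component[-1])

x = np.array([3., 1., 4., 1., 5., 9., 2., 6.])
approx, detail = haar_split(x)
assert np.allclose(approx + detail, x)   # the decomposition is exact

# Forecast each component separately, then sum -- as in the paper's scheme,
# where each component would instead be modelled by a trained GRU.
forecast = persistence_forecast(approx, 3) + persistence_forecast(detail, 3)
print(forecast)  # [6. 6. 6.]
```

In the paper's setup, the persistence submodels would be replaced by GRUs and the decomposition level and network hyperparameters tuned by SMBO; the additive structure of the final forecast is the same.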


Geophysics ◽  
2021 ◽  
pp. 1-64
Author(s):  
David Vargas ◽  
Ivan Vasconcelos ◽  
Yanadet Sripanich ◽  
Matteo Ravasi

Reconstructing the details of subsurface structures deep beneath complex overburden, such as sub-salt, remains a challenge for seismic imaging. Over the past years, the Marchenko redatuming approach has proven able to reliably retrieve full-wavefield information in the presence of complex overburden effects. When used for redatuming, current practical Marchenko schemes cannot make use of a priori subsurface models with sharp contrasts because of their requirements on the initial focusing functions, which for sufficiently complex media can result in redatumed fields with significant waveform inaccuracies. Using a scattering framework, we present an alternative form of the Marchenko representation that aims at retrieving only the unknown perturbations to both the focusing functions and the redatumed fields. From this framework, we propose a practical two-step focusing-based redatuming scheme that first solves an inverse problem for the background focusing functions, which are then used to estimate the perturbations to the focusing functions and redatumed fields. In our scheme, the initial focusing functions differ significantly from those of previous approaches, since they contain complex waveforms encoding the full transmission response of the a priori model. Our goal is to handle not only highly complex media but also realistic data: band-limited, unevenly sampled and contaminated by free-surface multiples. To that end, we combine the versatility of Rayleigh-Marchenko redatuming with the proposed scattering-based scheme, yielding an extended version of the method able to handle single-sided, band-limited multicomponent data. This Scattering-Rayleigh-Marchenko strategy accurately retrieves wavefields while requiring minimal preprocessing of the data. In support of the new methods, we present a comprehensive set of numerical tests using a complex 2D subsalt model.
Our numerical results show that the scattering approaches retrieve accurate redatumed fields that appropriately account for the complexity of the a priori model. We show that the improvements in wavefield retrieval translate into measurable improvements in our subsalt images.
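The core idea of the two-step scheme — invert only for the unknown perturbation around a background solution rather than for the full field — can be illustrated with a toy linear system. This is a deliberately simplified stand-in; the actual Marchenko representations involve integral operators and focusing conditions not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5)) + 5 * np.eye(5)   # well-conditioned toy "operator"
f_true = rng.normal(size=5)                   # true solution (full focusing function)
b = A @ f_true                                # observed right-hand side

# Step 1 stand-in: a background solution derived from the a priori model,
# deliberately inexact.
f0 = f_true + 0.3 * rng.normal(size=5)

# Step 2: invert only for the perturbation df, so that f = f0 + df.
df = np.linalg.solve(A, b - A @ f0)
f = f0 + df
assert np.allclose(f, f_true)
```

In this linear toy the perturbation update recovers the true solution exactly; the point is the structure of the residual-driven second step, not the physics.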


Geophysics ◽  
2021 ◽  
pp. 1-131
Author(s):  
Zheng Wu ◽  
Yuzhu Liu ◽  
Jizhong Yang

High-resolution reconstruction of steeply dipping structures is an important but challenging subject in seismic exploration. Prismatic reflections, which contain information on these structures, are helpful for reconstructing them. Elastic full-waveform inversion (EFWI) is a powerful tool that can accurately estimate subsurface parameters from multicomponent seismic data, providing information useful for characterizing oil and gas reservoirs. We construct the relationship between the forward and inverse problems related to prismatic reflections by considering the multiparameter exact Hessian in realistic elastic media. We numerically analyze the characteristics of the multiparameter exact Hessian and show that when prismatic reflections are apparent in multicomponent data, the multiparameter delta Hessian has a strong influence. We explain this in more detail through forward analysis and demonstrate that the multiparameter delta Hessian not only accounts for the prismatic reflections but also compensates for the primary reflections in multicomponent data. To exploit the prismatic waves, we develop a truncated Newton (TN) method based on a migration/demigration approach in frequency-domain EFWI, whose storage requirements and computational costs are the same as those of the truncated Gauss–Newton (TGN) method. Realistic 2D numerical examples demonstrate that, compared with the TGN method based on the first-order Born approximation, the TN method converges faster and achieves higher accuracy in the reconstruction of steeply dipping structures.
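The difference between the Gauss–Newton Hessian (first-order Born approximation) and the exact Hessian retained by the TN method can be seen in a toy scalar inverse problem. The forward model g(m) = m² below is purely illustrative, not the elastic wave equation:

```python
# Toy scalar inverse problem: predicted data g(m) = m**2,
# misfit 0.5 * (g(m) - d)**2 with observed datum d.
d = 4.0

def residual(m):
    return m**2 - d

def jacobian(m):
    return 2 * m          # dg/dm

def gauss_newton_hessian(m):
    """First-order approximation: J^T J only."""
    return jacobian(m)**2

def exact_hessian(m):
    """Adds the second-order term r * d2g/dm2 that the TN method retains.
    In EFWI this extra term is where prismatic (doubly scattered)
    reflections enter."""
    return jacobian(m)**2 + residual(m) * 2.0

m = 1.0   # far from the solution m = 2, so the residual term matters
print(gauss_newton_hessian(m), exact_hessian(m))  # 4.0 -2.0
```

At the solution the residual vanishes and the two Hessians coincide; away from it the second-order term can change the curvature substantially, which is why the full Newton direction can differ from (and converge faster than) the Gauss–Newton one.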


Geophysics ◽  
2020 ◽  
Vol 85 (5) ◽  
pp. U99-U107
Author(s):  
Matthew P. Griffiths ◽  
André J.-M. Pugin ◽  
Dariush Motazedian

Seismic reflection processing for multicomponent data is very time-consuming. To streamline and shorten this process automatically, a new approach for estimating the local event slope (local static shift) in the time-frequency domain is proposed and tested. The seismic event slope is determined by comparing the local phase content of Stockwell-transformed signals. This calculation allows noninterfering arrivals to be aligned by iteratively correcting trace by trace. Alternatively, the calculation can be used in a velocity-independent imaging framework, with the possibility of exporting the determined times and velocities for each common-midpoint gather, which leads to a more robust moveout correction. Synthetic models are used to test the robustness of the calculation and to compare it directly with an existing method of local slope estimation. Compared with dynamic time warping, our method is more robust to noise but less robust to large time shifts, which limits it to shorter geophone spacings. We apply the calculation to near-surface shear-wave data and compare it directly with semblance/normal-moveout processing. Examples demonstrate that the calculation yields an accurate local slope estimate and can produce sections of better or equal quality to those processed with the conventional approach, with far less user input time. It also serves as a first example of velocity-independent processing applied to near-surface reflection data.
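The paper estimates local shifts from phase content in the Stockwell domain; the trace-by-trace alignment idea itself can be illustrated with a simpler cross-correlation stand-in (a sketch, not the authors' algorithm):

```python
import numpy as np

def local_shift(trace_a, trace_b):
    """Estimate the lag (in samples) such that rolling trace_b by that lag
    best aligns it with trace_a, via the cross-correlation peak."""
    xcorr = np.correlate(trace_a, trace_b, mode="full")
    return np.argmax(xcorr) - (len(trace_b) - 1)

def align(traces):
    """Iteratively align each trace to its already-aligned neighbour."""
    aligned = [traces[0]]
    for tr in traces[1:]:
        lag = local_shift(aligned[-1], tr)
        aligned.append(np.roll(tr, lag))
    return np.array(aligned)

# Synthetic gather: one Gaussian wavelet delayed by 2 samples per trace,
# mimicking a locally linear event slope.
wavelet = np.exp(-0.5 * ((np.arange(64) - 20) / 2.0) ** 2)
gather = np.array([np.roll(wavelet, 2 * i) for i in range(4)])
flat = align(gather)
print(local_shift(gather[0], gather[1]))  # -2
```

After `align`, every trace peaks at the same sample, which is the flattening step a velocity-independent workflow builds on; the phase-based Stockwell estimate plays the role of `local_shift` but yields subsample precision and local (time-varying) slopes.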


2020 ◽  
Author(s):  
Zongbo Xu

Seismic interferometry (SI) is used to recover Green's functions (i.e., impulse responses) from ambient seismic recordings and to estimate surface-wave phase velocities for investigating subsurface structure. The method has been widely used over the last 20 years because it relies only on ambient seismic recordings from seismic stations/sensors and not on traditional seismic sources (e.g., earthquakes or active sources). SI assumes that the ambient seismic wavefield is isotropic, but this assumption is rarely met in practice. We demonstrate that, with linear-array spatial sampling of an anisotropic ambient seismic wavefield, SI provides a better estimate of Rayleigh-wave phase velocities than another commonly used ambient seismic method, the refraction microtremor (ReMi) method. However, even SI fails in some extreme cases, such as when the out-of-line sources are stronger than the inline sources, because the recovered Green's functions and surface-wave phase velocity estimates from SI are biased by the anisotropic wavefield. We therefore propose using multicomponent data to mitigate this bias. The multicomponent data comprise the vertical (Z) and radial (R) components, where the R direction is parallel to the line or great-circle path between two sensors. The multicomponent data can handle the extreme anisotropic-source cases because the R component is more sensitive to the inline sources than to the out-of-line sources, while the Z component has constant sensitivity to sources in all directions. Estimating source distributions (i.e., locations and strengths) can aid correction of the bias in SI results and enable the study of natural ambient seismic sources (e.g., microseisms). We use multicomponent seismic data to estimate ambient seismic source distributions using full-waveform inversion.
We demonstrate that the multicomponent data constrain the inversion better than the Z-component data alone, owing to the different source sensitivities of the Z and R components. For applying the inversion to field data, we propose a general workflow that is applicable across field scales and accommodates both vertical-only and multicomponent data. We demonstrate the workflow with a field data example from the CO2 degassing site at Hartoušov, Czech Republic. We also apply the workflow to seismic recordings from Antarctica during February 2010 and estimate the primary microseism source distributions. The SI results include both direct and coda waves. While the direct waves serve to investigate subsurface structure and estimate source distributions, the coda waves can be used to monitor small changes in the subsurface. The coda waves include multiply scattered body and surface waves; the two types of waves possess different spatial sensitivities to subsurface changes and interact with each other through scattering. We present a Monte Carlo simulation to demonstrate this interaction in an elastic homogeneous medium. In the simulation, we incorporate the scattering process between body and Rayleigh waves and the eigenfunctions of Rayleigh waves. This is a first step towards complete modelling of multiply scattered body and surface waves in elastic media.
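The contrasting azimuthal sensitivities of the Z and R components can be sketched with a simplified cosine model. The cosine dependence is an illustrative assumption standing in for the thesis's full sensitivity analysis:

```python
import numpy as np

def sensitivity(theta_deg, component):
    """Toy azimuthal sensitivity of a sensor-pair cross-correlation to a
    single ambient source; theta = 0 deg is an inline source (on the line
    through the two sensors)."""
    theta = np.radians(theta_deg)
    if component == "Z":
        return 1.0                  # vertical: constant for all azimuths
    if component == "R":
        return abs(np.cos(theta))   # radial: favours inline sources (toy model)
    raise ValueError(component)

for az in (0, 45, 90):
    print(az, sensitivity(az, "Z"), round(sensitivity(az, "R"), 2))
# 0 1.0 1.0
# 45 1.0 0.71
# 90 1.0 0.0
```

An out-of-line source (90 deg) contributes fully to the Z correlation but barely to R, which is why combining the two components helps separate inline signal from out-of-line bias.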


Geophysics ◽  
2020 ◽  
Vol 85 (2) ◽  
pp. V143-V156
Author(s):  
Qiang Zhao ◽  
Qizhen Du ◽  
Qamar Yasin ◽  
Qingqing Li ◽  
Liyun Fu

Multicomponent noise attenuation often presents more severe processing challenges than scalar data, owing to the uncorrelated random noise in each component. Moreover, weak signals buried in the noise are easily degraded by scalar processing workflows, which ignore the complementary information available in the other components. For seismic data preprocessing, transform-based approaches have achieved improved performance in mitigating noise while preserving the signal of interest, especially when using an adaptive basis trained by dictionary-learning methods. We have developed a quaternion-based sparse tight frame (QSTF) with the help of quaternion matrix and tight-frame analyses, which can be used to process vector-valued multicomponent data in a vectorial processing workflow. The QSTF is conveniently trained through iterative sparsity-based regularization and quaternion singular-value decomposition. In the quaternion-based sparse domain, multicomponent signals are orthogonally represented, which preserves the nonlinear relationships among the components to a greater extent than scalar approaches. We test the performance of our method on synthetic and field multicomponent data, using component-wise, concatenated, and long-vector models of multicomponent data for comparison. Our results indicate that more features, particularly the weak signals buried in the noise, are recovered better with our method than with the others.
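The quaternion encoding of three-component samples can be illustrated with plain NumPy. The Hamilton product below shows how a quaternion operation mixes all three components jointly — the coupling that component-wise (scalar) processing discards; the QSTF training itself is not reproduced here:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

# A three-component sample (x, y, z) encoded as a pure quaternion (w = 0).
sample = np.array([0.0, 1.0, 2.0, 3.0])

# Acting on the sample with a unit quaternion r via r * q * conj(r)
# (here a 90-degree rotation about the x-axis) updates all three
# components jointly rather than one at a time.
angle = np.pi / 2
r = np.array([np.cos(angle / 2), np.sin(angle / 2), 0.0, 0.0])
r_conj = r * np.array([1.0, -1.0, -1.0, -1.0])
rotated = qmul(qmul(r, sample), r_conj)
print(np.round(rotated, 6))  # pure quaternion (0, x', y', z') = (0, 1, -3, 2)
```

The vector part transforms exactly as the 3-vector (1, 2, 3) rotated about x, confirming that the pure-quaternion encoding treats a multicomponent sample as one algebraic object.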

