geophysical application
Recently Published Documents


TOTAL DOCUMENTS

43
(FIVE YEARS 8)

H-INDEX

6
(FIVE YEARS 0)

2021 ◽  
Author(s):  
WAN MOHD YAAKOB WAN BEJURI

The services of geophysical applications in obstructed areas can be extremely challenging, especially when the Global Navigation Satellite System (GNSS) signal is unavailable or weak. Usually, integration with the inertial sensors in a smartphone is used to assist navigation. Nonetheless, the particle filter module used to optimize data from the positioning sensors suffers from a low particle sample size, known as the sample impoverishment phenomenon, which eventually increases the positioning error over time. Adaptations of the particle sample size and noise must be made to keep the particle filter intelligent, reliable and robust over long runs. In this paper, we propose a new sequential importance resampling particle filter algorithm that adapts the particle sample size and the sensor noise measurement, so that the filter can counteract different situations. The results show that the proposed solution achieves an average improvement of 24.78%, reducing the RMSE of the state estimation compared to the previous algorithm. In the future, it is expected to contribute to the modernization of geophysical applications.
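For readers unfamiliar with the kind of adaptation the abstract refers to, the sketch below shows one common way a sequential importance resampling step can monitor sample impoverishment (via the effective sample size) and adapt both the particle count and the measurement noise. It is a minimal illustration in Python, not the authors' algorithm; the random-walk motion model, thresholds and growth factors are assumptions.

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i**2); a low value signals sample impoverishment."""
    return 1.0 / np.sum(weights ** 2)

def adaptive_sir_step(particles, weights, measurement, meas_noise,
                      min_particles=200, max_particles=2000):
    """One sequential importance resampling step with an adaptive particle
    count and measurement noise (illustrative values and motion model)."""
    # Propagate with a simple random-walk motion model (an assumption here).
    particles = particles + np.random.normal(0.0, 0.5, size=particles.shape)

    # Re-weight with a Gaussian likelihood of the position measurement.
    weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_noise) ** 2)
    weights = np.maximum(weights, 1e-300)
    weights = weights / np.sum(weights)

    # Adapt: if N_eff collapses, inflate the measurement noise and grow the
    # particle set before resampling; otherwise shrink it to save cost.
    if effective_sample_size(weights) < 0.5 * particles.size:
        meas_noise *= 1.5
        n_new = min(2 * particles.size, max_particles)
    else:
        n_new = max(particles.size // 2, min_particles)

    # Systematic resampling to the adapted sample size.
    u = (np.arange(n_new) + np.random.uniform()) / n_new
    idx = np.minimum(np.searchsorted(np.cumsum(weights), u), particles.size - 1)
    return particles[idx], np.full(n_new, 1.0 / n_new), meas_noise
```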


2021 ◽  
Author(s):  
Chen Wang ◽  
Lili Lei ◽  
Zhe-Min Tan ◽  
Kekuan Chu

One important aspect of successfully implementing an ensemble Kalman filter (EnKF) in a high-dimensional geophysical application is covariance localization. But for satellite radiances, whose vertical locations are not well defined, covariance localization is not straightforward. The global group filter (GGF) is an adaptive localization algorithm that provides adaptively estimated localization parameters, including the localization width and the vertical location of observations, for each channel and satellite platform of radiance data and for different regions and times. This adaptive method is based on sample correlations between ensemble priors of observations and state variables, and aims to minimize the sampling errors of the estimated sample correlations. The adaptively estimated localization parameters are examined here for typhoon Yutu (2018), using the regional model WRF and a cycling EnKF system. The benefits of differentiating the localization parameters for TC and non-TC regions and of varying the localization parameters with time are investigated. Results from the 6-h priors, verified against conventional and radiance observations, show that the adaptively estimated localization parameters generally produce smaller errors than the default Gaspari and Cohn (GC) localization. The adaptively estimated localization parameters better capture the onset of rapid intensification (RI) and yield improved intensity and structure forecasts for typhoon Yutu (2018) compared to the default GC localization. The time-varying localization parameters have slight advantages over the time-constant localization parameters. Further improvements are achieved by differentiating the localization parameters for TC and non-TC regions.
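The default localization mentioned above is the Gaspari and Cohn (1999) fifth-order taper, a standard published function. The Python sketch below reproduces that taper and shows how an adaptively estimated vertical location and width for a radiance channel might be applied to a sample covariance; the level grid, channel location and width are made-up values for illustration, not the GGF estimates.

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari-Cohn (1999) fifth-order piecewise taper.
    c is the localization half-width; the taper reaches zero at 2c."""
    r = np.abs(np.asarray(dist, dtype=float)) / c
    taper = np.zeros_like(r)

    inner = r <= 1.0
    outer = (r > 1.0) & (r < 2.0)

    ri = r[inner]
    taper[inner] = (-0.25 * ri**5 + 0.5 * ri**4 + 0.625 * ri**3
                    - 5.0 / 3.0 * ri**2 + 1.0)

    ro = r[outer]
    taper[outer] = (ro**5 / 12.0 - 0.5 * ro**4 + 0.625 * ro**3
                    + 5.0 / 3.0 * ro**2 - 5.0 * ro + 4.0 - 2.0 / (3.0 * ro))
    return taper

# Illustrative use: taper the sample covariance between a radiance prior
# and a model column, given an estimated vertical location and width for
# one channel (all numbers below are assumptions for the example).
z_model = np.linspace(0.0, 20.0, 41)      # model levels, km
z_obs, c_hat = 8.0, 4.0                   # channel location and half-width, km
raw_sample_cov = np.ones_like(z_model)    # stand-in for an ensemble estimate
localized_cov = gaspari_cohn(z_model - z_obs, c_hat) * raw_sample_cov
```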


2021 ◽  
Author(s):  
Giacomo Fornasari ◽  
Luigi Capozzoli ◽  
Gregory De Martino ◽  
Valeria Giampaolo ◽  
Enzo Rizzo

The growth of metropolises puts stress on urban areas, and intensive planning work is necessary. Therefore, the development of new technologies and methodologies able to explore the subsoil and manage its resources in urban areas becomes an important resource in terms of saving time and money. In the last decade, a new subdiscipline of applied geophysics has emerged: Urban Geophysics (Lapenna, 2017). Urban Geophysics analyzes the contribution, in terms of limits and potentialities, that geophysical methodologies can give in providing useful information about the subsoil, the environment, buildings and civil infrastructures, and in supporting public administrations in planning interventions in urban scenarios.

This work introduces a laboratory test that was performed at the Hydrogeosite CNR-IMAA laboratory of Marsico Nuovo (Basilicata region, Italy). The test consisted of a multisensor geophysical application on an analogue engineering model. Thanks to the possibility of working under laboratory conditions, detailed knowledge of the structure was available, providing great advantages for assessing the capability of geophysical methodologies to analyze engineering issues regarding the characterization of the infrastructural critical zone located at the soil-structure interface. For this purpose, geoelectrical and electromagnetic methodologies, including cross-hole Electrical Resistivity Tomography and Ground Penetrating Radar, were used to characterize the geometry of the foundation structures and the layout of the rebar of the reinforced concrete frame. Finally, new geophysical approaches were applied in order to estimate the corrosion rate of the reinforcement.
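As a pointer for readers outside the GPR community, the depth of a reflector such as a rebar or a footing interface is usually obtained from its two-way travel time and an assumed radar velocity. The few lines below show that standard conversion with invented numbers; they are not part of the laboratory processing described above.

```python
# Illustrative GPR depth conversion (not the laboratory workflow itself):
# radar velocity from relative permittivity, then reflector depth from
# the two-way travel time of a reflection (e.g. a rebar or footing).
C0 = 0.3          # speed of light in vacuum, m/ns
eps_r = 9.0       # assumed relative permittivity of the concrete/soil
twt_ns = 12.0     # assumed two-way travel time of a reflection, ns

v = C0 / eps_r ** 0.5          # ~0.10 m/ns
depth_m = v * twt_ns / 2.0     # ~0.60 m to the reflector
print(f"velocity = {v:.3f} m/ns, reflector depth = {depth_m:.2f} m")
```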


2021 ◽  
Vol 14 (1) ◽  
pp. 91-106
Author(s):  
Bertrand Bessagnet ◽  
Laurent Menut ◽  
Maxime Beauchamp

Abstract. An interpolation programme coded in Fortran for irregular N-dimensional cases is presented and freely available. The need for interpolation procedures over irregular meshes or matrixes with interdependent input data dimensions is frequent in geophysical models. Also, these models often embed look-up tables of physics or chemistry modules. Fortran is a fast, powerful and highly portable language, and it is easy to interface models written in Fortran with each other. Our programme does not need any libraries; it is written in standard Fortran and tested with two widely used compilers. The programme is fast and competitive compared to current Python libraries. A normalization option is provided for cases where the dimensions carry different types of units. Tests and examples are provided and available in the code package. Moreover, a geophysical application embedding this interpolation programme is provided and discussed; it consists of determining back trajectories using chemistry-transport or mesoscale meteorological model outputs from, respectively, the widely used CHIMERE and Weather Research and Forecasting (WRF) models.
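The Fortran code itself is distributed with the paper; as a rough illustration of the same workflow, the Python sketch below interpolates a small look-up table on a rectilinear grid with unevenly spaced axes using SciPy, one of the libraries the programme is benchmarked against. The axis values and the table are invented, and this is not the authors' programme.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# A small look-up table on a rectilinear grid with uneven axis spacing
# (values are placeholders for the example).
temperature = np.array([250.0, 270.0, 285.0, 300.0])   # K
pressure = np.array([200.0, 500.0, 850.0, 1000.0])     # hPa
table = np.random.default_rng(0).random((temperature.size, pressure.size))

# Multilinear interpolation, with extrapolation allowed outside the grid.
interp = RegularGridInterpolator((temperature, pressure), table,
                                 method="linear", bounds_error=False,
                                 fill_value=None)

value = interp([[278.0, 612.5]])   # interpolated table value at one point
```

The normalization option described in the abstract matters when distances mix dimensions with different units; plain multilinear interpolation on a rectilinear grid, as in this sketch, treats each axis independently and does not need it.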


2020 ◽  
Vol 312 ◽  
pp. 112122
Author(s):  
Qiu Wang ◽  
Xiaofang Ren ◽  
Shimin Jiao ◽  
Xiangzhou Lei ◽  
Shaolin Zhang ◽  
...  

2020 ◽  
Vol 29 (07) ◽  
pp. 2050050
Author(s):  
A. V. Gusev ◽  
E. Majorana ◽  
V. N. Rudenko ◽  
V. D. Yushkin

Geophysical applications of large free-mass laser interferometers, which were designed primarily for the detection of gravitational radiation of astrophysical origin, are considered. Despite the suspended mass-mirrors, these interferometers are rather accurate two-coordinate distance meters even at very low frequencies. In this case, the measurement of geodynamic deformations becomes a by-product of the long-term observations dictated by the blind search for gravitational waves (GW) of extraterrestrial origin. Compared to conventional laser strain meters, gravitational interferometers have the advantage of an increased absolute value of the deformation signal due to the 3–4 km baseline. The magnitude of the tidal variations of the baseline is 150–200 microns, which makes it conceivable to observe the fine structure of geodynamic disturbances. This paper presents the results of processing geophysical measurements made on the Virgo interferometer during test (technical) series of observations in 2007–2009. The specific design of the mass-mirror suspensions in the Virgo gravitational interferometer also creates a unique possibility of separating gravitational and deformation perturbations by recording the mutual angular deviations of the suspensions of its central and end mirrors. This gives a measurement of the spatial derivative of the gravity acceleration along the geoid of the Earth. In this mode, the physics of the interferometer is considered, with estimates of the achievable sensitivity in application to the classical problem of registering oscillations of the Earth's inner core.
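The figures quoted above imply a very small strain, which is worth making explicit: a 150–200 micron tidal change of a 3 km arm (Virgo's baseline) corresponds to a strain of a few parts in 10^8. A quick check:

```python
# Back-of-the-envelope strain implied by the numbers in the abstract.
L = 3.0e3                        # Virgo arm length, m
dL = (150e-6, 200e-6)            # tidal baseline variation quoted above, m
print([d / L for d in dL])       # ~[5.0e-08, 6.7e-08] dimensionless strain
```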


Atmosphere ◽  
2020 ◽  
Vol 11 (4) ◽  
pp. 338
Author(s):  
Pinqiang Wang ◽  
Mengbin Zhu ◽  
Yan Chen ◽  
Weimin Zhang

Motivated by the great success of four-dimensional variational (4D-Var) data assimilation methods and the advantages of ensemble methods (e.g., ensemble Kalman filters and particle filters) in numerical weather prediction systems, we introduce the implicit equal-weights particle filter scheme in the weak-constraint 4D-Var framework, which avoids filter degeneracy through implicit sampling in high-dimensional situations. The new variational particle smoother (varPS) method has been tested and explored using the Lorenz96 model with dimensions N_x = 40, 100, 250, and 400. The results show that the new varPS method does not suffer from the curse of dimensionality by construction, and its root mean square error (RMSE) is comparable with the ensemble 4D-Var method. As a combination of the implicit equal-weights particle filter and weak-constraint 4D-Var, the new method improves the RMSE compared with the implicit equal-weights particle filter and LETKF (local ensemble transform Kalman filter) methods and enlarges the ensemble spread compared with the ensemble 4D-Var scheme. To overcome the difficulty of the implicit equal-weights particle filter in real geophysical applications, the posterior error covariance matrix is estimated using a limited ensemble and can be calculated in parallel. In general, the new varPS performs slightly better in ensemble quality (the balance between the RMSE and the ensemble spread) than the ensemble 4D-Var and has the potential to be applied to real geophysical systems.
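The Lorenz96 system used as the testbed has a standard published definition, and the two diagnostics balanced in the abstract (RMSE of the ensemble mean and ensemble spread) are likewise standard. The sketch below gives the model tendency, a fourth-order Runge-Kutta step and those diagnostics in Python; it is not the varPS implementation, and the forcing and time step are the conventional values, not necessarily those used in the paper.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Lorenz96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F,
    with cyclic indexing over the N_x state variables."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """Classical fourth-order Runge-Kutta integration step."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def rmse(ensemble_mean, truth):
    """RMSE of the ensemble mean against a truth run."""
    return np.sqrt(np.mean((ensemble_mean - truth) ** 2))

def spread(ensemble):
    """Ensemble spread: root of the mean ensemble variance per variable."""
    return np.sqrt(np.mean(np.var(ensemble, axis=0, ddof=1)))
```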


Fog Computing ◽  
2018 ◽  
pp. 142-157 ◽  
Author(s):  
Luiz Angelo Steffenel ◽  
Manuele Kirsch Pinheiro ◽  
Lucas Vaz Peres ◽  
Damaris Kirsch Pinheiro

The exponential dissemination of proximity computing devices (smartphones, tablets, nanocomputers, etc.) raises important questions on how to transmit, store and analyze data in networks integrating those devices. New approaches like edge computing aim at delegating part of the work to devices at the “edge” of the network. In this article, the focus is on the use of pervasive grids to implement edge computing and address these challenges, especially the strategies to ensure data proximity and context awareness, two factors that impact the performance of big data analyses in distributed systems. This article discusses the limitations of traditional big data computing platforms and introduces the principles and challenges of implementing edge computing over pervasive grids. Finally, using CloudFIT, a distributed computing platform, the authors illustrate the deployment of a real geophysical application on a pervasive network.

