Shaker tests on downhole seismic tools

Geophysics ◽  
1988 ◽  
Vol 53 (9) ◽  
pp. 1160-1168 ◽  
Author(s):  
Vincent de Montmollin

Shaking a downhole seismic tool in situ is a powerful test procedure able to detect abnormal acoustical behavior. It consists of feeding a swept voltage to a special geophone located inside the tool, close to the triaxial measurement geophones. The special geophone creates a swept force, which excites the tool‐formation system. The resulting motion of the tool is recorded through the triaxial geophones. In this paper, a simple model of the tool‐formation system excited by the shaker is given. Predictions from this model compare nicely with results from a field experiment conducted with a very compact seismic tool. Another field experiment, conducted with a longer tool, shows that tool intrinsic modes are clearly visible in the shaker data. Their frequencies and amplitudes depend upon the geometry of the contact between the tool and the formation. A comparison between shaker data and impulsive shot data shows that moderate modal vibrations do not significantly deteriorate the quality of VSP data, but that large modal vibrations, when present, are visible on the shot data and also correspond to an increased sensitivity to tube waves. Recording one shaker trace at each depth allows routine well‐site quality control of VSP data. Such quality control is especially important for the horizontal axes of a tool and in an open hole, where coupling conditions are not easily controlled.
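The "simple model of the tool-formation system excited by the shaker" can be illustrated with a single-degree-of-freedom mass-spring-damper driven by a linear swept-sine (chirp) force. This is a minimal sketch only: the natural frequency, damping ratio and sweep parameters below are arbitrary assumptions for illustration, not values from the paper.

```python
import math

def chirp_force(t, f0=5.0, f1=200.0, T=2.0, amp=1.0):
    # Linear swept-sine: instantaneous frequency rises from f0 to f1 over T seconds.
    k = (f1 - f0) / T
    return amp * math.sin(2 * math.pi * (f0 * t + 0.5 * k * t * t))

def tool_response(fn=50.0, zeta=0.05, T=2.0, dt=1e-4):
    """Drive a unit-mass 1-DOF oscillator (the idealized tool-on-formation
    system) with the sweep, using semi-implicit Euler integration.
    Returns (time of peak displacement, peak displacement)."""
    wn = 2 * math.pi * fn          # natural angular frequency
    x, v, t = 0.0, 0.0, 0.0
    t_peak, x_peak = 0.0, 0.0
    for _ in range(int(T / dt)):
        a = chirp_force(t) - 2 * zeta * wn * v - wn * wn * x
        v += a * dt
        x += v * dt
        t += dt
        if abs(x) > x_peak:
            x_peak, t_peak = abs(x), t
    return t_peak, x_peak

t_peak, x_peak = tool_response()
# Instantaneous sweep frequency at the moment of peak response:
f_at_peak = 5.0 + (200.0 - 5.0) / 2.0 * t_peak
```

In such a sweep the recorded response peaks near the system's resonance (here 50 Hz, slightly delayed by the finite sweep rate), which is how intrinsic modes of a tool would show up in shaker data.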

1999 ◽  
Vol 2 (02) ◽  
pp. 125-133 ◽  
Author(s):  
M.N. Hashem ◽  
E.C. Thomas ◽  
R.I. McNeil ◽  
Oliver Mullins

Summary
Determination of the type and quality of hydrocarbon fluid that can be produced from a formation prior to construction of production facilities is of equal economic importance to predicting the fluid rate and flowing pressure. We have become adept at making such estimates for formations drilled with water-based muds, using open-hole formation evaluation procedures. However, these standard open-hole methods are somewhat handicapped in wells drilled with synthetic oil-based mud because of the chemical and physical similarity between the synthetic oil-based filtrate and any producible oil that may be present. The prediction is further complicated because in situ hydrocarbons are miscibly displaced away from the wellbore by the invading oil-based mud filtrate, leaving little or no trace of the original hydrocarbon in the invaded zone. Thus, normal methods that sample fluids in the invaded zone are of little use in predicting the in situ type and quality of hydrocarbons deeper in the formation. Only when we can pump a significant volume of filtrate from the invaded zone to reconnect with and sample the virgin fluids are we successful. However, since the in situ oil and filtrate are miscible, diffusion mixes the materials and blurs the interface; as mud filtrate is pumped from the formation into the borehole, the degree of contamination is greater than one might expect, and it is difficult to know when to stop pumping and start sampling. What level of filtrate contamination in the in situ fluid is tolerable? We propose a procedure for enhancing the value of the data derived from a particular open-hole wireline formation tester by quantitatively evaluating, in real time, the quality of the fluid being collected. The approach focuses on displaying the spectroscopic data as a function of time on a more sensitive scale than has been used previously. This enhanced sensitivity allows one to decide confidently when in the pumping cycle to begin the sampling procedure. The study also utilizes laboratory-determined PVT information on collected samples to form a data set that we correlate to the wireline-derived spectroscopic data. The accuracy of these correlations has been verified with subsequent predictions and corroborated with laboratory measurements. Lastly, we provide a guideline for predicting the pump-out time needed to obtain a fluid sample with a predetermined level of contamination when sampling conditions fall within our range of empirical data.

Conclusions
This empirical study validates that PVT-quality hydrocarbon samples can be obtained from boreholes drilled with synthetic oil-based mud using wireline formation testers deployed with downhole pump-out and optical-analyzer modules. The data set for this study has the following boundary conditions: the samples were obtained in the Gulf of Mexico area; the rock formations are unconsolidated to slightly consolidated, clean to slightly shaly sandstones; and the in situ hydrocarbons and the synthetic oil-based mud filtrate have measurable differences in their visible and/or near-infrared spectra. Specifically, this study demonstrates that during the pump-out phase of operations we can use the optical-analyzer response to predict the API gravity and gas/oil ratio of the reservoir hydrocarbons prior to securing a downhole sample. Additionally, we can predict the pump-out time required to obtain a reservoir sample with less than 10% mud-filtrate contamination if we know or can estimate reservoir fluid viscosity and formation permeability. Extension of this method to other formations and locales should be possible using a similar empirical correlation methodology.

Introduction
The high cost of constructing and deploying offshore production facilities requires accurate prediction of hydrocarbon PVT properties prior to fabrication. In the offshore Gulf of Mexico, one method of obtaining a PVT-quality hydrocarbon sample is a cased-hole drill stem test. However, this procedure is usually quite costly because of the need for sand control. Shell has been an advocate of eliminating this costly step by using open-hole wireline test tools to obtain the PVT-quality sample of the reservoir hydrocarbon. The success of this approach depends upon the availability of a wireline tool with a downhole pump that permits removal of the mud-filtrate contamination prior to sampling the reservoir fluids, and a downhole fluid analyzer that can distinguish reservoir fluid from filtrate. One such tool is the Modular Formation Dynamics Tester (MDT).1 The optical fluid analyzer module of the MDT functions by subjecting the fluids being pumped to absorption spectroscopy in the visible and near-infrared (NIR) ranges. Interpretation of these spectra is the subject of this paper. Tool descriptions and the basic theory of operations were presented in an earlier text.2 The concept of using visible and/or NIR spectroscopy to characterize the fluids being sampled while pumping is straightforward when there are measurable differences between the spectra of the mud filtrate and the reservoir hydrocarbons. As shown in Fig. 1, there are well-known areas3,4 of the NIR spectrum (800-2000 nm) that are diagnostic of water and oil. The optical fluid analyzer (OFA) module of the MDT has channels tuned at 10 locations, as indicated in Fig. 1, and thus the response in channels 6, 8, and 9 can be used to discern water from hydrocarbon. Another section of the OFA is designed to detect gas by measuring reflected polarized light from the pumped fluids; we do not discuss its operation further except to say that it is a reliable gas indicator.
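As a rough sketch of how the water/hydrocarbon discrimination described above might be automated, the fragment below compares optical densities in the channels the abstract names (6, 8 and 9). The channel roles and thresholds are illustrative assumptions for the sketch, not OFA tool specifications.

```python
def classify_fluid(od):
    """Classify pumped fluid from optical-density readings of selected channels.

    `od` maps channel number -> optical density. Channels 6, 8 and 9 are the
    ones the abstract says separate water from hydrocarbon; which of them
    responds to water versus oil, and the 0.1 floor below, are hypothetical.
    """
    water_od = od.get(6, 0.0)                          # assumed water channel
    oil_od = 0.5 * (od.get(8, 0.0) + od.get(9, 0.0))   # assumed oil channels
    if water_od < 0.1 and oil_od < 0.1:
        return "transparent/unknown"
    return "water" if water_od > oil_od else "hydrocarbon"
```

In practice such a per-sample label would be tracked over pump-out time: a steady trend toward the reservoir-fluid signature is what tells the operator contamination has fallen far enough to begin sampling.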


2021 ◽  
Vol 8 ◽  
Author(s):  
Tanya L. Maurer ◽  
Joshua N. Plant ◽  
Kenneth S. Johnson

The Southern Ocean Carbon and Climate Observations and Modeling (SOCCOM) project has deployed 194 profiling floats equipped with biogeochemical (BGC) sensors, making it one of the largest contributors to global BGC-Argo. Post-deployment quality control (QC) of float-based oxygen, nitrate, and pH data is a crucial step in the processing and dissemination of such data, as in situ chemical sensors remain in early stages of development. In situ calibration of chemical sensors on profiling floats using atmospheric reanalysis and empirical algorithms can bring accuracy to within 3 μmol O2 kg–1, 0.5 μmol NO3– kg–1, and 0.007 pH units. Routine QC efforts utilizing these methods can be conducted manually through visual inspection of data to assess sensor drifts and offsets, but more automated processes are preferred to support the growing number of BGC floats and reduce subjectivity among delayed-mode operators. Here we present a methodology and accompanying software designed to easily visualize float data against select reference datasets and assess QC adjustments within a quantitative framework. The software is intended for global use and has been used successfully in the post-deployment calibration and QC of over 250 BGC floats, including all floats within the SOCCOM array. Results from validation of the proposed methodology are also presented which help to verify the quality of the data adjustments through time.
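The in situ calibration mentioned above, in which float oxygen is referenced against atmospheric-reanalysis-based values, can be sketched minimally as a multiplicative gain correction. The mean-of-ratios gain below is an illustrative simplification of the actual BGC-Argo procedure, and the numbers in the usage are invented.

```python
def oxygen_gain(float_surface_o2, reference_o2):
    """Mean multiplicative gain between float-measured surface/air oxygen and a
    reference (e.g. reanalysis-derived) value. A simple mean of per-sample
    ratios stands in for the weighted regression a real QC system would use."""
    ratios = [ref / obs for obs, ref in zip(float_surface_o2, reference_o2)]
    return sum(ratios) / len(ratios)

def apply_gain(profile_o2, gain):
    # Apply the calibration gain to every oxygen value in a profile.
    return [gain * v for v in profile_o2]

gain = oxygen_gain([100.0, 102.0], [105.0, 107.1])   # both ratios ~1.05
corrected = apply_gain([200.0, 180.0], gain)
```

Tracking this gain across cycles is also how a delayed-mode operator would detect sensor drift: a stable gain suggests an offset correctable once, while a trending gain indicates drift requiring a time-dependent adjustment.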


2019 ◽  
Author(s):  
André Valente ◽  
Shubha Sathyendranath ◽  
Vanda Brotas ◽  
Steve Groom ◽  
Michael Grant ◽  
...  

Abstract. A global compilation of in situ data is useful to evaluate the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (including, inter alia, MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO) and span the period from 1997 to 2018. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll-a, spectral inherent optical properties, spectral diffuse attenuation coefficients and total suspended matter. The data were from multi-project archives acquired via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were propagated throughout the work and made available in the final table. By making the metadata available, provenance is better documented, and it is also possible to analyse each set of data separately. This paper also describes the changes that were made to the compilation in relation to the previous version (Valente et al., 2016). The compiled data are available at https://doi.org/10.1594/PANGAEA.898188.
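The "averaging of observations that were close in time and space" step described above can be sketched with simple space-time binning. The bin sizes and the flat record layout below are hypothetical stand-ins for whatever matchup criterion the compilation actually used.

```python
from collections import defaultdict

def merge_close_observations(records, deg=0.01, hours=1.0):
    """Average observations falling in the same space-time bin.

    Each record is (lat, lon, time_in_hours, value). The bin widths `deg`
    and `hours` are illustrative; a real compilation would also carry the
    per-record metadata (source, cruise, PI) through the merge.
    """
    bins = defaultdict(list)
    for lat, lon, t, v in records:
        key = (round(lat / deg), round(lon / deg), round(t / hours))
        bins[key].append(v)
    return [sum(vs) / len(vs) for vs in bins.values()]

merged = merge_close_observations([
    (10.001, 20.001, 0.1, 1.0),   # these two fall in the same bin...
    (10.002, 20.002, 0.2, 3.0),
    (50.0, 60.0, 5.0, 7.0),       # ...this one stands alone
])
```

Note that this touches only the averaging: consistent with the abstract, the individual values themselves are left unchanged.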


2021 ◽  
Vol 11 (19) ◽  
pp. 9155
Author(s):  
Masaki Kitazume

The deep mixing method (DMM), an in situ soil stabilization technique, was developed in Japan and the Nordic countries in the 1970s and has gained increasing popularity in many countries. The quality of stabilized soil depends upon many factors, including the soil type and condition, the type and amount of binder, and the production process. Quality control and quality assurance (QC/QA) practices focus on the stabilized soil and comprise laboratory mix tests, field trial tests, monitoring and control of construction parameters, and verification. QC/QA is one of the major concerns for clients and engineers who have little experience with the relevant technologies. In this manuscript, the importance of QC/QA-related activities along the workflow of deep mixing projects is emphasized, based on Japanese experience with mechanical mixing technology using vertical-shaft mixing tools with horizontally rotating circular mixing blades. Current and recent developments in QC/QA are also presented.


Ocean Science ◽  
2013 ◽  
Vol 9 (1) ◽  
pp. 1-18 ◽  
Author(s):  
C. Cabanes ◽  
A. Grouazel ◽  
K. von Schuckmann ◽  
M. Hamon ◽  
V. Turpin ◽  
...  

Abstract. The French program Coriolis, as part of the French operational oceanographic system, produces the COriolis dataset for Re-Analysis (CORA) on a yearly basis. This dataset contains in-situ temperature and salinity profiles from different data types. The latest release CORA3 covers the period 1990 to 2010. Several tests have been developed to ensure a homogeneous quality control of the dataset and to meet the requirements of the physical ocean reanalysis activities (assimilation and validation). Improved tests include some simple tests based on comparison with climatology and a model background check based on a global ocean reanalysis. Visual quality control is performed on all suspicious temperature and salinity profiles identified by the tests, and quality flags are modified in the dataset if necessary. In addition, improved diagnostic tools have been developed – including global ocean indicators – which give information on the quality of the CORA3 dataset and its potential applications. CORA3 is available on request through the MyOcean Service Desk (http://www.myocean.eu/).


2016 ◽  
Author(s):  
A. Valente ◽  
S. Sathyendranath ◽  
V. Brotas ◽  
S. Groom ◽  
M. Grant ◽  
...  

Abstract. A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the period 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll-a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were propagated throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832.


2019 ◽  
Vol 11 (3) ◽  
pp. 1037-1068 ◽  
Author(s):  
André Valente ◽  
Shubha Sathyendranath ◽  
Vanda Brotas ◽  
Steve Groom ◽  
Michael Grant ◽  
...  

Abstract. A global compilation of in situ data is useful to evaluate the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (including, inter alia, MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT and GeP&CO) and span the period from 1997 to 2018. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties, spectral diffuse attenuation coefficients and total suspended matter. The data were from multi-project archives acquired via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenization, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) was propagated throughout the work and made available in the final table. By making the metadata available, provenance is better documented, and it is also possible to analyse each set of data separately. This paper also describes the changes that were made to the compilation in relation to the previous version (Valente et al., 2016). The compiled data are available at https://doi.org/10.1594/PANGAEA.898188 (Valente et al., 2019).


2016 ◽  
Vol 8 (1) ◽  
pp. 235-252 ◽  
Author(s):  
André Valente ◽  
Shubha Sathyendranath ◽  
Vanda Brotas ◽  
Steve Groom ◽  
Michael Grant ◽  
...  

Abstract. A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).


Author(s):  
Xiaoming Sun ◽  
Hui Zhang ◽  
Qian Zhou

As the construction volume of China's landscape industry grows rapidly, the stock of built gardens is becoming ever larger. Post-construction maintenance technology must be continually updated, and management modes and methods continually innovated, to meet the development needs of the maintenance market. The quality of maintenance and management after green-space construction is presented to viewers directly through the quality of the landscape, so improving the quality control of green space is essential to the rapid development of garden enterprises. This paper applies Internet thinking to combine the Internet with traditional maintenance quality management, implementing the key elements of quality control, such as planning, work-trace records, inspection, reporting and customer-service evaluation, on an Internet management-and-control platform. The work plan, combined with the maintenance calendar and environmental data such as climate and soil from across the country, can better guide field work; maintenance trace records provide data feedback through pictures, videos and other media; the patrol function discovers green-space problems in time, recording and feeding back data on basic problems; and the reporting function provides customers with more convenient maintenance services. Customer evaluation closes an important loop for staff management and site quality from the customer's perspective. To some extent, this study addresses the insufficient quality standards and low efficiency of the traditional maintenance management mode, accords with the current service demand for greening maintenance, and helps enhance public recognition of greening personnel.
With the help of Internet-based management and control, the pain points in the development of maintenance enterprises can be better resolved, providing new tools for the maintenance industry and laying a foundation for its growth.


2019 ◽  
Vol 2 (5) ◽  
Author(s):  
Tong Wang

The compaction quality of the subgrade directly affects the service life of the road, and effective control of the subgrade construction process is the key to ensuring it. Real-time, comprehensive, rapid and accurate prediction of compaction quality during construction through informatized detection methods is therefore an important guarantee for speeding up construction progress and ensuring subgrade compaction quality. Based on the functions required of such a system, this paper puts forward the principles and the development mode used in building it, and demonstrates the system operating in real time to achieve whole-process control of subgrade construction quality.

