Basic Testing of the duchamp Source Finder

2012 ◽  
Vol 29 (3) ◽  
pp. 276-295 ◽  
Author(s):  
T. Westmeier ◽  
A. Popping ◽  
P. Serra

Abstract. This paper presents and discusses the results of basic source finding tests in three dimensions (using spectroscopic data cubes) with duchamp, the standard source finder for the Australian Square Kilometre Array Pathfinder. For this purpose, we generated different sets of unresolved and extended H I model sources. These models were then fed into duchamp, using a range of different parameters and methods provided by the software. The main aim of the tests was to study the performance of duchamp on sources with different parameters and morphologies and to assess the accuracy of duchamp's source parametrisation. Overall, we find duchamp to be a powerful source finder capable of reliably detecting sources down to low signal-to-noise ratios and accurately measuring their position and velocity. In the presence of noise in the data, duchamp's measurements of basic source parameters, such as spectral line width and integrated flux, are affected by systematic errors. These errors arise from the effect of noise on the specific algorithms duchamp uses to measure source parameters, combined with the fact that the software only takes into account pixels above a given flux threshold and hence misses part of the flux. In scientific applications of duchamp these systematic errors would have to be corrected for. Alternatively, duchamp could be used as a source finder only, and source parametrisation could be done in a second step using more sophisticated parametrisation algorithms.
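The threshold-induced bias described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not duchamp itself; the Gaussian line shape, unit noise level, and 2σ threshold are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: a Gaussian spectral line with unit noise, "parametrised"
# using only the pixels above a fixed flux threshold.
x = np.arange(200)
true_line = 5.0 * np.exp(-0.5 * ((x - 100) / 8.0) ** 2)  # peak S/N = 5
true_flux = true_line.sum()

biases = []
for _ in range(500):
    spec = true_line + rng.normal(0.0, 1.0, x.size)
    measured_flux = spec[spec > 2.0].sum()   # 2-sigma flux threshold
    biases.append(measured_flux - true_flux)

# Thresholding misses the faint line wings but keeps positive noise
# peaks, so the average error is a systematic offset, not zero.
print(f"mean flux bias over 500 realisations: {np.mean(biases):+.2f}")
```

Because pixel selection correlates with the noise realisation, averaging over many realisations leaves a residual systematic offset rather than zero-mean scatter, which is the kind of error that would have to be corrected for, or avoided with a separate parametrisation step.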

2019 ◽  
Vol 631 ◽  
pp. A159 ◽  
Author(s):  
S. Martín ◽  
J. Martín-Pintado ◽  
C. Blanco-Sánchez ◽  
V. M. Rivilla ◽  
A. Rodríguez-Franco ◽  
...  

Context. The increase in bandwidth and sensitivity of state-of-the-art radio observatories is providing a wealth of molecular data from nearby star-forming regions up to high-z galaxies. Analysing large data sets of spectral cubes requires efficient and user-friendly tools optimised for astronomers with a wide range of backgrounds. Aims. In this paper we present the detailed formalism at the core of Spectral Line Identification and Modelling (SLIM) within the MAdrid Data CUBe Analysis (MADCUBA) package, and their main data-handling functionalities. These tools have been developed to visualise, analyse, and model large spectroscopic data cubes. Methods. We present the highly interactive on-the-fly visualisation and modelling tools of MADCUBA and SLIM, which include a stand-alone spectroscopic database. The parameters stored therein are used to solve the full radiative transfer equation under local thermodynamic equilibrium (LTE). The SLIM package provides tools to generate synthetic LTE model spectra based on input physical parameters of column density, excitation temperature, velocity, line width, and source size. It also provides an automatic fitting algorithm to obtain the physical parameters (with their associated errors) that best fit the observations. Synthetic spectra can be overlaid on the data cubes/spectra to ease the task of multi-molecular line identification and modelling. Results. We present the Java-based MADCUBA package and its internal module SLIM, which provide all the necessary tools for manipulation and analysis of spectroscopic data cubes. We describe in detail the spectroscopic fitting equations and use this tool to explore the breaking conditions and implicit errors of commonly used approximations in the literature. Conclusions. Easy-to-use tools like MADCUBA allow users to derive physical information from spectroscopic data without the need for simplifying approximations. The SLIM tool allows the full radiative transfer equation to be used and lets users interactively explore the space of physical parameters and associated uncertainties from observational data.
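As a worked illustration of the LTE formalism mentioned above, the sketch below evaluates the standard single-line radiative transfer solution T_B = [J_ν(T_ex) − J_ν(T_bg)](1 − e^(−τ)) with a Gaussian opacity profile. This is a generic textbook expression, not SLIM's actual code; the source-size (beam filling) factor is omitted, and all parameter values (a 100 GHz line, T_ex = 50 K, FWHM = 5 km s⁻¹) are arbitrary assumptions:

```python
import numpy as np

H_K = 4.799e-11  # h/k in K s, so that h*nu/k = H_K * nu

def j_nu(T, nu):
    """Planck radiation temperature J_nu(T) in K at frequency nu (Hz)."""
    return (H_K * nu) / np.expm1(H_K * nu / T)

def lte_line(v, nu0=100e9, tau0=1.0, tex=50.0, v0=0.0, fwhm=5.0, tbg=2.73):
    """Single-line LTE model: Gaussian opacity profile in velocity,
    T_B = (J(Tex) - J(Tbg)) * (1 - exp(-tau))."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    tau = tau0 * np.exp(-0.5 * ((v - v0) / sigma) ** 2)
    return (j_nu(tex, nu0) - j_nu(tbg, nu0)) * -np.expm1(-tau)

v = np.linspace(-20.0, 20.0, 401)  # velocity axis in km/s
spec = lte_line(v)                 # brightness temperature in K
# In the optically thin limit T_B is proportional to tau, so halving
# tau0 roughly halves the peak; at large tau0 the line saturates.
```

A fitting algorithm of the kind described would then vary the input parameters of such a model to minimise the residual against the observed spectrum.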


2012 ◽  
Vol 29 (3) ◽  
pp. 244-250 ◽  
Author(s):  
L. Flöer ◽  
B. Winkel

Abstract. Today, image denoising by thresholding of wavelet coefficients is a commonly used tool for 2D image enhancement. Since the data product of spectroscopic imaging surveys has two spatial dimensions and one spectral dimension, denoising techniques have to be adapted to this change in dimensionality. In this paper we review the basic method of denoising data by thresholding wavelet coefficients and implement a 2D–1D wavelet decomposition to obtain an efficient way of denoising spectroscopic data cubes. We conduct different simulations to evaluate the usefulness of the algorithm as part of a source finding pipeline.
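The thresholding idea can be sketched with a minimal one-level Haar decomposition: transform along the two spatial axes, then along the spectral axis, soft-threshold the detail bands, and invert. This is illustrative only; the published 2D–1D scheme uses a more elaborate multi-scale transform, and the threshold value below is an arbitrary assumption:

```python
import numpy as np

def haar_fwd(a, axis):
    """One-level Haar transform along `axis` (even length required):
    returns (approximation, detail) coefficient arrays of half length."""
    a = np.moveaxis(a, axis, 0)
    s = (a[0::2] + a[1::2]) / np.sqrt(2.0)
    d = (a[0::2] - a[1::2]) / np.sqrt(2.0)
    return np.moveaxis(s, 0, axis), np.moveaxis(d, 0, axis)

def haar_inv(s, d, axis):
    """Inverse of haar_fwd: interleave reconstructed pairs."""
    s = np.moveaxis(s, axis, 0)
    d = np.moveaxis(d, axis, 0)
    a = np.empty((2 * s.shape[0],) + s.shape[1:])
    a[0::2] = (s + d) / np.sqrt(2.0)
    a[1::2] = (s - d) / np.sqrt(2.0)
    return np.moveaxis(a, 0, axis)

def soft(c, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise_cube(cube, thresh):
    """One-level sketch of 2D-1D denoising on a (spectral, y, x) cube:
    Haar along both spatial axes, then the spectral axis; shrink every
    band containing detail coefficients and invert."""
    bands = {(): cube}
    for axis in (1, 2, 0):          # spatial y, spatial x, then spectral
        new = {}
        for key, arr in bands.items():
            s, d = haar_fwd(arr, axis)
            new[key + ('s',)] = s
            new[key + ('d',)] = d
        bands = new
    for key in bands:               # keep the pure approximation band
        if 'd' in key:
            bands[key] = soft(bands[key], thresh)
    for axis in (0, 2, 1):          # invert in reverse order
        new = {}
        for key in {k[:-1] for k in bands}:
            new[key] = haar_inv(bands[key + ('s',)], bands[key + ('d',)], axis)
        bands = new
    return bands[()]
```

Since an orthonormal wavelet transform leaves white-noise coefficient amplitudes unchanged while concentrating signal energy in few coefficients, shrinking small coefficients suppresses noise at modest cost to the signal.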


2004 ◽  
Vol 193 ◽  
pp. 275-278
Author(s):  
Malcolm Cropp ◽  
Karen R. Pollard ◽  
Jovan Skuljan

Abstract. Four δ Scuti stars were observed with the HERCULES fibre-fed échelle spectrograph at Mount John University Observatory, New Zealand. The observations were analysed for radial velocity variations, derived with a cross-correlation technique, as well as for spectral line moment variations. These results were compared with published photometric studies of these stars to see whether the modes identified in the photometry were also present in the spectroscopic data.
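The cross-correlation step can be sketched generically as follows. The spectra are synthetic stand-ins, not HERCULES data, and the absorption-line shape and 7-pixel shift are arbitrary assumptions:

```python
import numpy as np

def ccf_shift(spec, template):
    """Pixel shift of `spec` relative to `template`, taken from the
    peak of the cross-correlation of the mean-subtracted spectra."""
    s = spec - spec.mean()
    t = template - template.mean()
    ccf = np.correlate(s, t, mode='full')
    return np.argmax(ccf) - (len(t) - 1)

x = np.arange(400)
# Continuum-normalised absorption line, and the same line shifted by 7 px.
template = 1.0 - 0.6 * np.exp(-0.5 * ((x - 200) / 4.0) ** 2)
shifted = 1.0 - 0.6 * np.exp(-0.5 * ((x - 207) / 4.0) ** 2)

lag = ccf_shift(shifted, template)
# Multiplying lag by the velocity step per pixel of the wavelength
# solution would give the radial-velocity variation.
```

In practice one would fit the CCF peak (e.g. with a parabola or Gaussian) to reach sub-pixel precision rather than taking the integer argmax.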


2018 ◽  
Vol 18 (6) ◽  
pp. 4019-4038 ◽  
Author(s):  
Alejandro Marti ◽  
Arnau Folch

Abstract. Volcanic ash modeling systems are used to simulate the atmospheric dispersion of volcanic ash and to generate forecasts that quantify the impacts from volcanic eruptions on infrastructures, air quality, aviation, and climate. The efficiency of response and mitigation actions is directly associated with the accuracy of the volcanic ash cloud detection and modeling systems. Operational forecasts build on offline coupled modeling systems in which meteorological variables are updated at the specified coupling intervals. Despite the concerns from other communities regarding the accuracy of this strategy, the quantification of the systematic errors and shortcomings associated with offline modeling systems has received no attention. This paper employs the NMMB-MONARCH-ASH model to quantify these errors using different quantitative and categorical evaluation scores. The skills of the offline coupling strategy are compared against those of an online forecast considered to be the best estimate of the true outcome. Case studies are considered for a synthetic eruption with constant eruption source parameters and for two historical events, which suitably illustrate the severely disruptive effects on aviation of the European (2010 Eyjafjallajökull) and South American (2011 Cordón Caulle) volcanic eruptions. Evaluation scores indicate that systematic errors due to the offline modeling are of the same order of magnitude as those associated with the source term uncertainties. In particular, traditional offline forecasts employed in operational model setups can result in significant uncertainties, failing to reproduce, in the worst cases, up to 45–70 % of the ash cloud of an online forecast. These inconsistencies are anticipated to be even more relevant in scenarios in which the meteorological conditions change rapidly in time. The outcome of this paper encourages operational groups responsible for real-time advisories for aviation to consider employing computationally efficient online dispersal models.
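A categorical score of the kind used in such comparisons can be sketched with the figure of merit in space (FMS): the overlap of two ash-cloud exceedance masks divided by their union. The Gaussian column-load fields and the 0.2 g m⁻² threshold below are arbitrary stand-ins, not NMMB-MONARCH-ASH output:

```python
import numpy as np

def fms(pred, ref, thresh):
    """Figure of merit in space: area of the intersection of the two
    exceedance masks divided by the area of their union (1 = perfect)."""
    a = pred >= thresh
    b = ref >= thresh
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

# Hypothetical ash column-load fields (g/m^2): an online "truth" cloud
# and an offline forecast displaced by a few grid cells downwind.
y, x = np.mgrid[0:100, 0:100]
online = 5.0 * np.exp(-((x - 50) ** 2 + (y - 50) ** 2) / 200.0)
offline = 5.0 * np.exp(-((x - 58) ** 2 + (y - 50) ** 2) / 200.0)

score = fms(offline, online, thresh=0.2)
# 1 - score is the fraction of the combined cloud area on which the
# two forecasts disagree at this threshold.
```

Scores like this, computed per threshold and lead time, are how a fraction of an ash cloud "not reproduced" by the offline forecast can be quantified.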


2002 ◽  
Vol 199 ◽  
pp. 3-10
Author(s):  
J. J. Condon

The primary goal of radio source surveys is to generate flux-limited samples. Sources selected at very low frequencies are dominated by unbeamed emission and give the only unbiased view of the parent populations used by “unification” models to account for the diversity of sources seen at high frequencies. Low-frequency surveys favor sources with exceptionally steep spectra. They include radio galaxies at high redshifts, radio halos of nearby galaxies, relic radio sources, diffuse cluster emission, pulsars that may be missed by traditional pulse searches, and a new class of unidentified compact sources. Flux densities from low-frequency surveys extend the spectra of known source populations to frequencies at which free-free and synchrotron absorption become significant and constrain basic source parameters. Finally, telescope fields-of-view scale ∝ λ², so gridded surveys can be more efficient than directed observations of individual targets. This review covers recent and proposed low-frequency source surveys and their astronomical uses.


Geophysics ◽  
2013 ◽  
Vol 78 (1) ◽  
pp. B37-B47 ◽  
Author(s):  
Sherilyn Williams-Stroud ◽  
Chet Ozgen ◽  
Randall L. Billingsley

The effectiveness of hydraulic fracture stimulation in low-permeability reservoirs was evaluated by mapping microseismic events related to rock fracturing. The geometry of stage-by-stage event point sets was used to infer fracture orientation, particularly where events line up along an azimuth or have a planar distribution in three dimensions. Locations of microseismic events may have a higher degree of uncertainty when the signal-to-noise ratio is low (due either to low magnitude or to propagation effects). Low signal-to-noise events are not as accurately located in the reservoir, or may fall below the detectability limit, so that the extent of the fracture-stimulated reservoir may be underestimated. In the Bakken Formation of the Williston Basin, we combined geologic analysis with process-based and stochastic fracture modeling to build multiple possible discrete fracture network (DFN) model realizations. We then integrated the geologic model with production data and numerical simulation to evaluate the impact on estimated ultimate recovery (EUR). We tested the assumptions used to create the DFN model to determine their impact on dynamic calibration of the simulation model and on predictions of EUR. Comparison of simulation results, using fracture flow properties generated from two different calibrated DFN scenarios, showed a 16% difference in the amount of oil ultimately produced from the well. The amount of produced water was strongly impacted by the geometry of the DFN model, and the character of the DFN significantly impacts the relative amounts of fluids produced. Monitoring water cut during production can validate the appropriate DFN scenario and provide critical information on the optimal method for well production. The results indicated that simulating enhanced permeability, with induced microseismicity constraining a fracture flow property model, is an effective way to evaluate the performance of reservoirs stimulated by hydraulic fracture treatments.
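The planar-distribution inference mentioned above can be sketched with a least-squares plane fit via SVD: the singular vector with the smallest singular value of the centred event cloud is the plane normal. This is a generic technique, not the authors' workflow, and the synthetic event geometry below is an arbitrary assumption:

```python
import numpy as np

def fracture_plane_normal(points):
    """Fit a plane to 3-D event locations (N x 3 array of x, y, z) by
    SVD of the centred point cloud; the right singular vector with the
    smallest singular value is the unit normal of the best-fit plane."""
    centred = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[-1]

# Hypothetical microseismic event cloud scattered about a vertical
# plane striking along y (so the true normal points along x), with
# location noise much smaller than the in-plane extent.
rng = np.random.default_rng(7)
events = np.column_stack([
    rng.normal(0.0, 2.0, 300),    # thin across the plane (x)
    rng.normal(0.0, 80.0, 300),   # extended along strike (y)
    rng.normal(0.0, 40.0, 300),   # extended in depth (z)
])
normal = fracture_plane_normal(events)
# The azimuth of `normal` (up to sign) gives the fracture orientation.
```

When location uncertainty grows toward the in-plane extent, the smallest singular value approaches the others and the inferred orientation becomes unreliable, consistent with the caveat about low signal-to-noise events above.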


2020 ◽  
Vol 493 (1) ◽  
pp. 305-319 ◽  
Author(s):  
Stefan Hilbert ◽  
Alexandre Barreira ◽  
Giulio Fabbian ◽  
Pablo Fosalba ◽  
Carlo Giocoli ◽  
...  

Abstract. We investigate the accuracy of weak lensing simulations by comparing the results of five independently developed lensing simulation codes run on the same input N-body simulation. Our comparison focuses on the lensing convergence maps produced by the codes, and in particular on the corresponding PDFs, power spectra, and peak counts. We find that the convergence power spectra of the lensing codes agree to $\lesssim 2{{\ \rm per\ cent}}$ out to scales ℓ ≈ 4000. For lensing peak counts, the agreement is better than $5{{\ \rm per\ cent}}$ for peaks with signal-to-noise ≲ 6. We also discuss the systematic errors due to the Born approximation, line-of-sight discretization, particle noise, and smoothing. The lensing codes tested deal in markedly different ways with these effects, but they nonetheless display a satisfactory level of agreement. Our results thus suggest that systematic errors due to the operation of existing lensing codes should be small. Moreover, their impact on the convergence power spectra of a lensing simulation can be predicted given its numerical details, which may then serve as a validation test.
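Peak counting on a convergence map can be sketched with a simple eight-neighbour local-maximum definition binned in signal-to-noise. This is a generic convention; the codes in the comparison may define peaks, smoothing, and the noise level differently:

```python
import numpy as np

def peak_counts(kappa, sigma, bins):
    """Count local maxima of a convergence map in S/N bins: a pixel is
    a peak if its S/N strictly exceeds all eight neighbours (the map
    border is excluded). Returns a histogram over `bins`."""
    snr = kappa / sigma
    core = snr[1:-1, 1:-1]
    is_peak = np.ones_like(core, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            is_peak &= core > snr[1 + dy:snr.shape[0] - 1 + dy,
                                  1 + dx:snr.shape[1] - 1 + dx]
    return np.histogram(core[is_peak], bins=bins)[0]
```

Comparing such histograms between codes, bin by bin, is one way the peak-count agreement quoted above can be quantified.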


1986 ◽  
Vol 90 ◽  
pp. 234-234
Author(s):  
Dietrich Baade ◽  
Werner W. Weiss

Abstract. Spectral line profiles are computed for nonradially pulsating CP2 stars. For a range of values currently thought to be typical of these stars, the influence of six parameters on the line profiles is considered: the mode degree ℓ and azimuthal order m, the pulsation velocity amplitude, the angle between the rotation and pulsation axes, the angle between the rotation axis and the line of sight, and the rotation phase angle. In view of the expected low signal-to-noise ratio of observational data, it is investigated to what extent easily measurable, simple quantities can still be useful in discriminating between different modes.

